A recent post on a Wired blog by Rhett Allain posed the question, “What kind of science is computational science?” It came to the conclusion that computational science is neither experimental nor theoretical, but not worthy of being a third type of science either (http://www.wired.com/wiredscience/2014/01/what-kind-of-science-is-computational-science/).  The post prompted a rebuttal by Tammy Kolda on a SIAM blog (http://blogs.siam.org/what-kind-of-science-is-computational-science-a-rebuttal/).  Her defense of computational science is earnest and accurate, but it does not quite reach the heart of the philosophical question Allain poses.  Each post contains some fair points, although I find the spirit of Allain’s post to be almost completely off base and his final conclusion quite dubious.  Worse than dubious, it fails to recognize how computational science is something different, and that this difference should be embraced rather than dismissed as second-rate.  Kolda’s post is much closer to the mark, but it misses some key aspects that would give computational science a sterner defense against Allain’s dismissal.

In my opinion, computational science is not a third type or pillar of science, but rather a new way of doing science that builds upon the foundation of theoretical and experimental science.  Rather than being something separate, it is an innovative product of both.  Its validity is utterly dependent upon its foundation in theory and in observation or experiment.  Seeing computational science as something apart implies scarcity, as if it detracts from the traditional scientific approaches.  It is much more accurate to see computational science as a new, different, and complementary approach that only adds to our repertoire of scientific tools.  The entirety of computational science depends upon the invention of the digital computer, and the field developed to exploit this tool.  Kolda’s post captures much of the depth of scientific disciplines necessary to make this approach to science effective, and some of the key areas where it is currently contributing.

As a computational scientist, I think the blame for the miscommunication lies with computational scientists’ collective message.  The value of computational science lies directly in its capacity to force the combination of disciplines, which drives innovation in ways the traditional scientific approach does not.  By opening lines of communication between fields, science is enriched in ways that are difficult to measure.  Such innovation is a key to progress and spurs the generation of new ideas.  Computational science is as much about how science is conducted as about what that science is.  Too often computational science gets stuck in its own little world of computers, algorithms, mathematics, code, and data, and avoids deep dialog with domain science.  This inward-looking emphasis is short-sighted, failing to capitalize on the greatest strength of computational science: its inherently multidisciplinary point of view.

Beyond the difference in approach that computation offers, the importance of modeling on computers is its natural ability to handle complexity under which analytical approaches falter.  This complexity spurs connections between disparate scientific disciplines that ultimately power innovation.  New ideas are usually not new at all, but combinations of different ideas in new ways.  A new technology is the combination and packaging of existing concepts to offer functionality that its base technologies did not.  Rarely are the ideas explored computationally completely new; traditional science supplies most of the concepts explored.  More often, new ideas are a mélange of existing ideas engaged in a fundamentally different context.  Computational science thus provides an engine of discovery merely by providing an effective vehicle for combining disciplines, making it a powerful new “integrating factor” for science.

While I largely disagree with Allain’s assessment, his comments provide a window into the physicist’s mind and suggest some stark conclusions about why computations are so often poorly done.  Doing computational work is difficult and requires deep technical skills usually acquired through years of professional study and practice.  Physicists are responsible for some of the best examples of computational science, but also some of the worst.  I am deeply concerned by an attitude that so thoroughly dismisses the field as a valid scientific endeavor.  To be done well, computational science cannot simply be approached as a hobby, but rather as an equal contributing partner in the scientific enterprise.

Computational science has been an emergent technology over the last 70 years.  In the past 20 years there has been a veritable explosion in capability.  Computation has risen to societal prominence through the Internet, with all the good and bad that brings.  All of this new capability and connectivity will allow new problems to be posed and solved by providing a meaningful path to solutions.  Hype today circles around big data, and will likely settle into some sort of rational equilibrium where big data contributes meaningfully to scientific progress.  Analytical tractability has always been a limitation on meaningful theory.  Today computers offer different paths to tractable solutions.  For a theory to be solvable no longer requires access to analytical solutions, or their near relatives in asymptotic expansions; instead, the conditions are loosened by access to a numerical solution.  Of course, getting the numerical solutions correct is a subtle technical matter requiring immense skill.  Accurate, robust, and convergent numerical approximation can be just as challenging as analytical work, if not more so.  Despite this difficulty, numerical approximation is an improvement provided by computational science and a boon to scientific progress in general.
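A small illustration of this loosening of tractability: the nonlinear pendulum, whose equation of motion has no elementary closed-form solution, yields readily to a standard numerical integrator.  The sketch below (my own example, not drawn from either blog post) uses the classical fourth-order Runge-Kutta method and checks the result against a conserved quantity, since the true dynamics conserve energy.

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
    return [yi + h/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def pendulum(t, y):
    """Nonlinear pendulum theta'' = -sin(theta) as a first-order system."""
    theta, omega = y
    return [omega, -math.sin(theta)]

# Integrate from theta(0) = 1 radian, at rest, over t in [0, 10].
y, t, h = [1.0, 0.0], 0.0, 0.001
while t < 10.0:
    y = rk4_step(pendulum, t, y, h)
    t += h

# Energy E = omega^2/2 - cos(theta) is conserved by the exact dynamics,
# so its drift measures the numerical error of the integration.
energy = y[1]**2 / 2 - math.cos(y[0])
drift = abs(energy - (-math.cos(1.0)))
print(drift)  # energy drift, should be tiny for this step size
```

Nothing here requires a series expansion or a special function; the same few lines handle any right-hand side, which is precisely the tractability that analytical methods cannot offer.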

Computers and computation are now an indispensable part of science and engineering.  Doing computation well is important, and this may be the heart of what is wrong with Allain’s discussion.  His dismissal of computational science as an equal is really about defending the primacy of experiment and theory.  This is old-fashioned and shortsighted.  The key to advancing science is to embrace new tools, perspectives, and approaches in the desire to discover and understand.  Computational science can bring new perspective and allow old problems to be solved anew.  I do worry that Allain’s perspective is commonplace in the physics community and leads to computational practice that is less than completely rigorous.  Just as rigor and professionalism are absolutely necessary in theory and experiment, they are required for computational science to flourish.

The key argument for defining computational science as a new way of doing investigations is the computational experiment that blends complex theoretical models together.  Such blends of models were functionally impossible in the past because their solutions could not be tackled analytically; a numerical, i.e., computational, approach is necessary.  The classical examples of such models come originally from defense science, e.g., nuclear weapons, but the approach rapidly spun off to weather and climate through the efforts of visionaries such as John von Neumann.  These efforts are now central to the tension between science, policy, and politics as their results indicate that the root cause of climate change is human activity.  Before the advent of serious computational power, such a modeling activity would have been impossible.  This is an argument for recognizing computational science as something new.

Science creates knowledge that ultimately comes into common use through engineering.  This is another place where computations are reshaping how work is done and what sorts of activities are possible.  With the new power comes the danger of over-reliance on the virtual experiment over the cruel mastery of nature.  Often the theory that underpins our models is too weak to capture the uncommon events that dominate issues such as safety.  These tail events quite often become the ones that shape history.  Think of 9/11, Fukushima, earthquakes, tsunamis, Katrina, and other massive events that lie in the tails of distributions.  Our mean-field-theory-based science is ill prepared to provide good answers to these issues, much less to engineer a robust technological response.  The important thing computational science brings to the table is the ability to identify the issue more clearly; only then can we begin to address solutions.  Traditional science had not made enough progress in that direction prior to the advent of computational science.

While computational science brings new perspectives to the table, it should remain firmly entrenched in reality.  This is where theoretical and experimental science, in their largely traditional forms, come in.  This is the realm of verification and validation.  Verification ties computations directly to theory: can I show that my code correctly solves the theory I think I am solving?  In the same way, validation is how I show that my model provides a reasonable representation of the observed universe.  Together these techniques tie the three branches of science into a cohesive whole and provide some measure of confidence in the computational view.  Add uncertainty quantification to this and we can start to meaningfully engage with people making decisions.
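One common verification exercise, sketched here as my own illustration rather than anything from the posts, is to refine the discretization systematically and check that the error shrinks at the method’s theoretical rate.  The example below verifies the composite trapezoidal rule, whose error should fall off at second order, against an integral with a known exact value.

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule for the integral of f over [a, b] with n subintervals."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i*h) for i in range(1, n))
    return h * s

# Verification by systematic refinement: the integral of sin over
# [0, pi] is exactly 2, so errors at successive refinements should
# shrink at the trapezoidal rule's theoretical second-order rate.
exact = 2.0
errors = [abs(trapezoid(math.sin, 0.0, math.pi, n) - exact)
          for n in (32, 64, 128)]

# Observed order of convergence from each pair of refinements;
# agreement with the theoretical order of 2 is the verification evidence.
orders = [math.log2(errors[i] / errors[i + 1]) for i in range(2)]
print(orders)
```

If the observed orders drifted away from 2, that would signal a bug in the implementation or a problem outside the method’s assumptions, which is exactly the kind of tie between computation and theory that verification provides.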

A key to progress is the natural tension between theory and experiment.  Sometimes a new theory will drive science to make new observations that may confirm or deny it.  At other times observations are made that push theory to explain them.  This is a useful and important tug-of-war.  Computational science now offers a third mechanism to achieve this important dynamic.  A calculation can sometimes serve as the “experiment” that drives new theory and observation, as with the problem of climate change.  Sometimes it fits more firmly into the theoretical camp, as in turbulent fluid mechanics, where experimental techniques are playing catch-up in measuring the energetic dynamics at small scales.  At other times it plays the role of experimental evidence, as with simulations of the evolution of the large-scale structure of the universe.  The important point is that it plays both roles in a way that pushes knowledge and understanding forward.

In the end we just have science.  Theory is the way we try to conceptually understand the world and equip ourselves to predict what might happen.  Experiments are the way we record phenomena and observe the world.  Experiments and observations are used to confirm or deny theory.  Computation is another path that stands between these two approaches in a different (but perhaps more unified) manner.  It is a new tool with which to examine and understand.  It needs to be used properly, but also respected as a field of meaningful endeavor.  I read the Wired blog post and did not feel a lot of respect; computation was characterized as a bit less important than the traditional approaches.  This is not the way to progress.

Computational science is a new approach to conducting science that enhances the older traditional approaches.  It can offer new solutions to problems and provide a greater chance for success.  It only adds to our knowledge and poses no risk to the tried-and-true approaches to science so cherished by many.  Rather than competing with traditional scientific practice, computational science enriches it, providing new connections and ideas to solve today’s most important challenges.