It doesn’t matter how beautiful your theory is … If it doesn’t agree with experiment, it’s wrong.
― Richard Feynman
Bill’s corollary: It doesn’t matter how massive your calculation is … If it doesn’t agree with experiment, it’s wrong.
The real world is complex, dangerous and expensive. It is also where mystery lives and the source of knowledge. There seems to be some desire to use computers, modeling and simulation to replace our need for dealing with the real world. This is untenable from many perspectives, and it misplaces the proper role of everything computing makes possible. Computing cannot be a replacement for reality; it is simply a tool for dealing with reality better. In the final analysis the real world still needs to be at the center of the frame. Computing should be viewed in this proper context, and that perspective should guide our actions in putting it to use.
Experiment is the sole source of truth. It alone can teach us something new; it alone can give us certainty.
― Henri Poincaré
We see the confluence of many things in our attitudes toward computing. It is a new thing, constantly unveiling new power and new ways of changing our lives. In many ways computing is driving enormous societal change and creating very real stress in the real world. These stresses stoke fears and an irrational desire to control dangers and risks. All of this control is expensive and drives an economy of fear. Fear is very expensive; trust, confidence and surety are cheap and fast. One totally irrational way to control fear is to ignore it, allowing reality to be replaced. For people who don’t deal with reality well, the online world can be a boon, but relief from a painful reality ultimately needs to translate into something physically tangible. We see this in an over-reliance on modeling and simulation in technical fields: we falsely believe that experiments and observations can be replaced, or that the human endeavor of communication can be done away with through electronic means. In the end reality must be respected, and people must be engaged in conversation. Computing only augments, but never replaces, the real world, real people, or real experience. This perspective is a key realization in making the best use of technology.
The real world is where the monsters are.
― Rick Riordan
In science we must always remember that understanding reality is the fundamental objective. Theory acts to explain what we see, but observation always rules supreme in defining the validity of knowledge and understanding. We must always remember that computing is a tool that augments theory. It never replaces theory, nor can it replace experiments or observation. A computational simulation can never be better than the model that theory has provided it. If the theory is lacking (and it always is), more computing cannot rescue it. No amount of computing can fill in the gap between what is and isn’t known. It is a new and powerful tool to be wielded with care and skill, but a tool. These perspectives seem to be lost on so many people who see computing as some sort of silver bullet that transcends these simple truths.
It is sometimes an appropriate response to reality to go insane.
― Philip K. Dick
While computing isn’t a silver bullet for making painful elements of reality go away, it is a powerful tool if wielded properly. Modeling and simulation serve as a powerful means of testing our knowledge and general capability to understand the world around us. When simulations are tested against reality and produce good results (that is, they are validated), we feel that our grasp of the hows and whys of the real world is at hand. Grounded in this understanding, modeling and simulation can aid our ability to examine the world around us. We can optimize our observations, or design experiments to more effectively examine and measure various things. A successful model can serve a wonderful role in focusing our attention on the most important aspects of reality and away from what is not essential.
More than simply assisting the design of better experiments and observations of reality, modeling and simulation can provide a significant flywheel effect. All the models of reality we use are flawed at some level; in a similar vein, our observations of reality are always limited and flawed. In very good models these flaws are subtle and hard to expose, and good experiments need to be designed to expose them and improve the models. We can achieve some stunning synergies if we use the models themselves to design the most stringent tests of them. This is exactly what a well-designed, effectively collaborating program can do. By examining the models we can find the parts of a physical system most sensitive to each part of the model. One way of proactively improving models is to identify where to make measurements, and what to measure, to maximize our ability to prove, disprove or improve a given model. The key, oft-missed point is that the models are always imperfect.
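The idea of using a model to decide where to measure can be sketched concretely. The toy model, its parameters, and the candidate measurement locations below are all invented for illustration; the point is only that parameter sensitivities, computed from the model itself, rank where an observation would most strongly constrain the model.

```python
import math

def model(params, x):
    """Hypothetical toy model: exponential decay with an offset."""
    rate, offset = params
    return math.exp(-rate * x) + offset

def sensitivity(params, x, i, h=1e-6):
    """Central finite-difference sensitivity of the model output at x
    with respect to parameter i. Large magnitudes flag locations where
    a measurement would most strongly test that parameter."""
    lo, hi = list(params), list(params)
    lo[i] -= h
    hi[i] += h
    return (model(hi, x) - model(lo, x)) / (2 * h)

params = (1.0, 0.1)  # (decay rate, offset) -- assumed nominal values

# Rank candidate measurement locations by sensitivity to the decay rate.
candidates = [0.5, 1.0, 2.0, 4.0]
ranked = sorted(candidates, key=lambda x: -abs(sensitivity(params, x, 0)))
```

For this particular model the sensitivity to the rate peaks near x = 1/rate, so the ranking tells us to measure there rather than far out on the tail, where the signal has decayed away. Real programs do the same thing with adjoint or sampling-based sensitivity analysis on far richer models.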
Reality is that which, when you stop believing in it, doesn’t go away.
― Philip K. Dick
These imperfections are rarely acknowledged in the current national dialog on high performance computing. Rather than state this powerful truth, we see a focus on computer power coupled to an unchanging model as the recipe for progress. Attention to improving the models themselves is almost completely absent from the modeling and simulation world. This ignores one of the greatest truths in computing: no amount of computer power can rescue an incorrect model. These truths do little to alter the current approach, though we can be sure we will ultimately pay for the lack of attention to these basics. Reality cannot be ignored forever; it will make itself felt in the end. We could give it more importance now, to our great benefit, but eventually our lack of consideration will demand even more attention.
A more profitable, proactive strategy would benefit everyone. Without such attention, many practitioners end up accommodating a model’s imperfections through heavy use of calibration. Ultimately the calibration hammer is lowered on imperfect models to render them useful and capable of influencing reality. In the wake of heavy-handed calibration we can still achieve a great deal by localizing the modeling issues: in a deep sense, the areas of crude (yet often very effective) calibration are exactly the places with the greatest room for modeling improvement. Typically, calibration ends up merging multiple issues together, so one needs to carefully deconstruct the whole of the effects it is accounting for. For example, one may find a single calibration knob accounting for the combined effects of turbulence, inadequate constitutive relations and mesh resolution. To make progress these effects need to be separated and dealt with independently. This proper decomposition of error allows the improvement of modeling in a principled manner.
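The lumping effect of a single calibration knob can be shown with a deliberately simple, entirely hypothetical example. Here "reality" contains two effects the bare model lacks (a quadratic drag-like term and a constant bias); a single multiplicative knob, fit by least squares, absorbs both at once. The fit improves agreement on the calibration points while telling us nothing about which effect is responsible, which is exactly why such knobs need to be deconstructed.

```python
def truth(x):
    # Stand-in for reality: includes a quadratic effect and a bias
    # that the model below does not represent.
    return 2.0 * x - 0.1 * x * x + 0.5

def bare_model(x):
    return 2.0 * x            # uncalibrated model: misses both effects

def knob_model(x, k):
    return k * 2.0 * x        # one multiplicative knob lumps everything

xs = [1.0, 2.0, 3.0, 4.0]     # hypothetical calibration points

# Least-squares fit of the single knob (closed form, since the
# knob enters linearly).
k = (sum(truth(x) * 2.0 * x for x in xs)
     / sum((2.0 * x) ** 2 for x in xs))

# Sum-of-squares error before and after calibration.
err_bare = sum((truth(x) - bare_model(x)) ** 2 for x in xs)
err_knob = sum((truth(x) - knob_model(x, k)) ** 2 for x in xs)
```

The calibrated model fits the data better (err_knob < err_bare), yet the single number k conflates the quadratic term and the bias. Separating them, in the spirit of the paragraph above, means introducing distinct, physically motivated corrections and constraining each with its own targeted measurements.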
The key to using simulation effectively is recognizing what it can and cannot do. While one can experiment with computations, such experiments can only unveil secrets of the models or of the computations themselves. For an unveiled secret to be meaningful in reality always requires direct comparison with observations of the real world. If the secret seen computationally is also seen in reality, then a true discovery has been made, and in the process the model gains credibility and validity. In these cases simulation and modeling can tell us where to look; if the secret is found, we know the model is valuable and correct. If it is not found, we know the model is deficient and must be improved. The observations may or may not be sufficient for improving the model to the point where its predictions are validated by reality.
Successful modeling and simulation imply a level of understanding that empowers humanity. That understanding bears on our ability to control reality effectively through human action: if reality can be modeled, its effects can be shaped or accommodated through design or mitigation. The definition of success is always validation of the model’s results against observations of the world (including carefully designed experiments). If the model can be demonstrated, via verification, to be solving the equations we believe we are solving, the validation is powerful evidence. One must recognize that the degree of understanding is always relative to the precision of the questions being asked: the more precise the question, the more precise the model needs to be. This useful tension can help drive science forward. Specifically, the improving precision of observations can spur model improvement, and the improving precision of modeling can drive improvements in observation, or at least the necessity of them. In this creative tension, the accuracy of solution of the models and raw computer power play but a small role.
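Verification, as used above, is checkable in code: one confirms that a solver actually converges to the exact solution of its own equations at the theoretical rate. The sketch below uses a deliberately simple stand-in (forward Euler on y' = y with a known exact answer) to measure the observed order of accuracy; real verification exercises do the same with manufactured solutions on much richer models.

```python
import math

def euler_solve(f, y0, t_end, n):
    """Forward Euler for y' = f(t, y): take n steps from t=0 to t_end."""
    h = t_end / n
    t, y = 0.0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

# Problem with a known exact solution: y' = y, y(0) = 1, so y(1) = e.
f = lambda t, y: y
exact = math.e

# Errors at two grid resolutions.
e1 = abs(euler_solve(f, 1.0, 1.0, 100) - exact)
e2 = abs(euler_solve(f, 1.0, 1.0, 200) - exact)

# Observed order of convergence from halving the step size;
# forward Euler is first order, so this should come out near 1.
order = math.log(e1 / e2) / math.log(2.0)
```

Passing such a check says only that the code solves its model correctly; whether that model describes reality is the separate, and harder, question of validation against observations.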
Any physical theory is always provisional, in the sense that it is only a hypothesis: you can never prove it. No matter how many times the results of experiments agree with some theory, you can never be sure that the next time the result will not contradict the theory.
― Stephen Hawking