When the number of factors coming into play in a phenomenological complex is too large scientific method in most cases fails. One need only think of the weather, in which case the prediction even for a few days ahead is impossible.

― Albert Einstein

One of the dirty little secrets of computing in the scientific and engineering worlds is that the vast majority of serious calculations are highly calibrated (and that’s the nice way to say it). In many important cases, the quality of the “prediction” depends on models being calibrated against data. In some cases calling these calibrated objects “models” does modeling a great disservice; the calibration instruments are simply knobs used to tune the calculation. The tuning accounts for serious modeling shortcomings and often allows the simulation to produce results that approximate the fundamental balances of the physical system. Without the calibrated or knobbed “modeling” the entire simulation is often of little use and bears no resemblance to reality. In all cases this essential simulation practice creates a huge issue for proper and accurate uncertainty estimation.
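To make the point concrete, here is a minimal sketch of what knobbed calibration looks like in practice: a deliberately crude drag law F = k·v², where the single knob k is fitted to measurements and silently absorbs everything the model leaves out. The model form, the data, and the knob are hypothetical stand-ins, not drawn from any particular code.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical crude "model": drag force F = k * v**2, where the knob k
# absorbs everything the model ignores (shape, Reynolds-number effects, ...).
def model(v, k):
    return k * v**2

# Stand-in measured data (synthetic here); in practice this is the experiment.
v_data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
f_data = np.array([0.7, 2.9, 6.2, 11.5, 17.8])

k_fit, k_cov = curve_fit(model, v_data, f_data)
print(f"calibrated knob k = {k_fit[0]:.3f} +/- {np.sqrt(k_cov[0, 0]):.3f}")
# The knob makes the model reproduce the data; it says nothing about whether
# F = k*v**2 is the right physics outside the measured range 1 <= v <= 5.
```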

Confidence is ignorance. If you’re feeling cocky, it’s because there’s something you don’t know.

― Eoin Colfer

At some deep level the practice of calibrating simulations against data is entirely unavoidable. Behind this unavoidable reality is a more troubling conclusion: our knowledge of the World is substantially less than we might like to admit to ourselves. By the same token the actual uncertainty in our knowledge is far larger than we are willing to admit. The sort of uncertainty that is present cannot be meaningfully addressed by focusing on more computing hardware (its assessment could be helped, but not solved). It can only be addressed through a systematic effort to improve models and engage in broad experimental and observational science and engineering. If we work hard to actively understand reality better, the knobs can be reduced or even removed as knowledge grows. This sort of work is exactly the sort of risky thing our current research culture eschews as a matter of course.

Do not fear to be eccentric in opinion, for every opinion now accepted was once eccentric.

― Bertrand Russell

This area of modeling and simulation is essential, to varying degrees, to many fields. If we are to advance the use and utility of modeling and simulation with confidence, it must be dealt with in a better and more honest way. It is useful to point to a number of important applications where calibration or knobs are essential to success. For air flow over airplanes or automobiles, turbulence modeling is essential, and turbulence is one of the key areas for calibrated results. Climate and weather modeling is another area where knobs are utterly essential. Plasma physics is yet another area where the modeling is so poor that calibration is absolutely necessary. Inertial and magnetically confined fusion both require knobs for simulations to be useful. In addition to turbulence and mixing, various magnetic or laser physics add to the problems with simulation quality, which can only be dealt with effectively through calibration and knobs.

You couldn’t predict what was going to happen for one simple reason: people.

― Sara Sheridan

The conclusion that I’ve come to is that the uncertainty in calibrated or knobbed calculations has two distinct faces, each of which should be fully articulated by those conducting simulations. One is the best-case scenario, the calibrated uncertainty, which depends on the modeling and its calibration being rather complete and accurate in capturing reality. The second is the pessimistic case where the uncertainty comes from the lack of knowledge that made calibration necessary in the first place. If the simulation is calibrated, the truth is that the calibration depends strongly on the data used, and any guarantee of validity depends on matching the conditions closely associated with that data. Outside the range where the data was collected, the calibration should carry greater uncertainty. The further we move outside the range defined by the data, the greater the uncertainty.

This is most commonly seen in curve fitting using regression. Inside the range of the data, the curve and the data are closely correlated and the standard uncertainties are relatively small. Outside the range of the data, the uncertainty grows much larger. In the assessment of uncertainty in calculations this is rarely taken into account. Generally those using calculations prefer to remain blithely unaware of whether the calibrations they rely on are well within their range of validity. Calibration is also imperfect and carries an error intrinsic to the determination of the settings. The uncertainty associated with the data itself is always an issue, whether one takes the optimistic or the more pessimistic face of uncertainty.
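A minimal sketch of this effect, using ordinary least squares on synthetic data: the standard error of a new prediction follows the textbook formula s·sqrt(1 + 1/n + (x0 − x̄)²/Sxx), which is smallest near the center of the data and grows without bound as x0 leaves its range. All numbers here are made up for illustration.

```python
import numpy as np

# Fit y = a + b*x to synthetic data on [0, 1], then compare the prediction
# standard error inside and outside the range of the data.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 20)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, x.size)   # invented "measurements"

n = x.size
b, a = np.polyfit(x, y, 1)                          # slope, intercept
resid = y - (a + b * x)
s2 = np.sum(resid**2) / (n - 2)                     # residual variance
Sxx = np.sum((x - x.mean())**2)

def pred_se(x0):
    """Standard error of a new observation predicted at x0."""
    return np.sqrt(s2 * (1.0 + 1.0 / n + (x0 - x.mean())**2 / Sxx))

for x0 in (0.5, 1.0, 2.0, 5.0):
    print(f"x0 = {x0:4.1f}  prediction s.e. = {pred_se(x0):.3f}")
# The uncertainty is modest at x0 = 0.5 (center of the data) and balloons
# at x0 = 5.0, far outside the calibrated range.
```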

A potentially more problematic aspect of calibration is using the knobs to account for multiple effects (turbulence, mixing, plasma physics, radiation and numerical resolution are common). In these cases the knobs may account for a multitude of poorly understood physical phenomena, mystery physics and a lack of numerical resolution. This creates a massive opportunity for severe cognitive dissonance, which is reflected in over-confidence in simulation quality. Scientists using simulations like to give those funding their work greater confidence than the work should carry, because the actual uncertainty would trouble those paying for it. Moreover the range of validity for such calculations is not well understood or explicitly stated. A key mark of the calibration being necessary is that the calculation cannot reflect a real World situation without it. The model simply misses key aspects of reality without the knobs (climate modeling is an essential example).
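The identifiability problem behind this can be shown in a few lines. In this hypothetical sketch two knobs, standing in for, say, turbulence and mixing, enter the model only through their sum, so wildly different settings fit the data identically and the calibration says nothing about either effect alone.

```python
import numpy as np

# When knobs cover several effects, the settings are not identifiable: here
# two hypothetical knobs k1 ("turbulence") and k2 ("mixing") enter the model
# only through their sum, so the data cannot separate them.
rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 30)
y = 3.0 * x + rng.normal(0.0, 0.05, x.size)   # data generated with k1 + k2 = 3

# Both knob settings reproduce the data equally well:
for k1, k2 in [(1.0, 2.0), (2.5, 0.5)]:
    rms = np.sqrt(np.mean((y - (k1 + k2) * x) ** 2))
    print(f"k1 = {k1}, k2 = {k2}:  rms misfit = {rms:.3f}")
# Identical misfits: the calibration pins down k1 + k2, not the physics
# behind it, so extrapolating either effect separately is unconstrained.
```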

In cases where the knobs account for numerical resolution, the effect is usually crystal clear: the calibration of the knob settings needs to be redone whenever the numerical resolution changes, such as when a new faster computer becomes available. The problem is that those conducting the calculations rarely make a careful accounting of this effect. They simply recalibrate the calculations and go on without ever making much of it. This often reflects a cavalier attitude toward computational simulation that rarely intersects with high quality. This lack of transparency can border on the delusional. At best this is simply intellectually sloppy; at worst it reflects a core of intellectual dishonesty. In either case a better path is available to us.
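This resolution dependence can be demonstrated on a toy problem. In the sketch below (my own construction, not from any production code), a Gaussian pulse is advected with first-order upwind differencing plus an explicit diffusion knob nu, and the knob is calibrated so the pulse spreads at the true physical rate. Because the upwind scheme carries numerical diffusion proportional to the mesh spacing, the calibrated knob comes out well below the true diffusivity on coarse grids and drifts toward it as the mesh is refined: every change of resolution demands recalibration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

U, NU_TRUE, T, SIGMA0 = 1.0, 0.01, 0.5, 0.05   # speed, true diffusivity, time, width

def simulate_variance(nu_knob, dx):
    """Upwind advection + explicit diffusion knob; returns final pulse variance."""
    x = np.arange(0.0, 2.0, dx)
    q = np.exp(-0.5 * ((x - 0.5) / SIGMA0) ** 2)
    # time step limited by both advective and diffusive stability
    dt = min(0.4 * dx / U, 0.25 * dx**2 / max(nu_knob, 1e-12))
    nsteps = max(1, int(round(T / dt)))
    dt = T / nsteps                                   # land exactly on t = T
    for _ in range(nsteps):
        adv = -U * (q - np.roll(q, 1)) / dx           # first-order upwind (U > 0)
        dif = nu_knob * (np.roll(q, -1) - 2.0 * q + np.roll(q, 1)) / dx**2
        q = q + dt * (adv + dif)
    m = q.sum()
    mu = (x * q).sum() / m
    return ((x - mu) ** 2 * q).sum() / m              # variance of the pulse

target = SIGMA0**2 + 2.0 * NU_TRUE * T                # exact spreading of the truth
for dx in (0.02, 0.01, 0.005):
    res = minimize_scalar(lambda nu: (simulate_variance(nu, dx) - target) ** 2,
                          bounds=(0.0, 0.02), method="bounded")
    print(f"dx = {dx:5.3f}  calibrated nu = {res.x:.5f}   (true nu = {NU_TRUE})")
```

The knob is doing double duty here: it stands in for physical diffusion and for truncation error at once, and the two cannot be separated at any fixed resolution.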

Science is not about making predictions or performing experiments. Science is about explaining.

― Bill Gaede

In essence there are two uncertainties that matter: the calibrated uncertainty, where data is keeping the model reasonable, and the actual predictive uncertainty, which is much larger and reflects the lack of knowledge that makes the calibration necessary in the first place. Another aspect of modeling in the calibrated setting is the proper use of the model for computing quantities. If the quantity coming from the simulation can be tied to the data used for calibration, the calibrated uncertainty is a reasonable thing to use. If the quantity from the simulation is inferred and not directly calibrated, the larger uncertainty is appropriate. Thus we see that the calibrated model has intrinsic limitations, and cannot be used for predictions that go beyond the data’s physical implications. For example, climate modeling is certainly reasonable for examining the mean temperature of the Earth. On the other hand, the data associated with extreme weather events like flooding rains are not calibrated, and the uncertainty regarding their prediction under climate change is more problematic.
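A minimal sketch of the distinction, using a bootstrap on synthetic observations: a quantity tied directly to the data (the mean) is tightly constrained, while an inferred tail quantity (a high percentile, a stand-in for an extreme event) is far less so. The data and numbers are invented for illustration.

```python
import numpy as np

# Bootstrap comparison: a directly constrained quantity (the mean) versus
# an inferred tail quantity (the 99th percentile) of the same data set.
rng = np.random.default_rng(7)
data = rng.normal(15.0, 2.0, size=200)      # stand-in observations

means, tails = [], []
for _ in range(2000):
    sample = rng.choice(data, size=data.size, replace=True)
    means.append(sample.mean())
    tails.append(np.percentile(sample, 99))

print(f"mean:            {np.mean(means):6.2f} +/- {np.std(means):.2f}")
print(f"99th percentile: {np.mean(tails):6.2f} +/- {np.std(tails):.2f}")
# The mean is pinned down tightly; the tail estimate is several times more
# uncertain, and a truly extreme quantile is barely constrained at all.
```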

In modeling and simulation nothing comes for free. If the model needs to be calibrated to accurately simulate a system, the modeling is limited in an essential way. The limitations in the model are uncertainties about aspects of the system tied to the modeling inadequacies. Any predictions of the details associated with these aspects of the model are intrinsically uncertain. The key is acknowledging the limitations associated with calibration. Calibration is needed to deal with uncertainty about modeling, and that lack of knowledge limits the applicability of simulation. A rational practitioner applies the modeling cautiously. Unfortunately people are not rational and tend to put far too much faith in these calibrated models. They engage in wishful thinking, and fail to account for the uncertainty in applying the simulations for prediction.

It is impossible to trap modern physics into predicting anything with perfect determinism because it deals with probabilities from the outset.

― Arthur Stanley Eddington

If we are to improve the science associated with modeling and simulation, the key is uncertainty. We should charter work that addresses the most important uncertainties through well-designed scientific investigations. Many of these mysteries cannot be addressed without adventurous experimentation. Current modeling approaches need to be overthrown and replaced with approaches free of their limitations (e.g., the pervasive mean field models of today). No amount of raw computing power can solve these problems. Our current research programs in high performance computing are operating in complete ignorance of the approach necessary for progress.

All you need in this life is ignorance and confidence, and then success is sure.

― Mark Twain
