An idea that is not dangerous is unworthy of being called an idea at all.

― Oscar Wilde

Uncertainty quantification (UQ) is a truly dangerous activity for modeling and simulation. This might seem an odd thing to say given all the focus and interest in UQ research. The problem with uncertainty is its seeming contradistinction with precision. Computers are seen as a tool that provides precise, well-determined solutions with some degree of repeatability. Examining the uncertainty of the solutions and the models runs counter to the seeming spirit of the field. More deeply, the computer-based solution of models is treated with a degree of suspicion by many. The purveyors of these solutions seek to blunt that suspicion by projecting confidence in their legitimacy. The notion of uncertainty runs counter to the intent of providing confidence in solutions. As such, uncertainty is frequently downplayed and poorly executed in order to buoy confidence. This mentality is a genuine threat to the scientific credibility of modeling and simulation.

One of the prevailing uncomfortable aspects of computed solutions to models is their intrinsically approximate nature. We don’t have the sense of security that analytical solutions provide (even if that security is largely an illusion). Numerical practitioners are both arrogant and slipshod in their approach. They show too much arrogance in providing solutions without the necessary caveats and appropriate care for correctness. The slipshod, scientifically careless approach of proof by overwhelming computing power and colorful graphics is easy to fall back on. Worse, these terrible practices are effective. The approximate solution of models of nature is an intricate and highly technical area of expertise, and there are numerous ways to completely screw it up. Too many users of numerical model solutions are either oblivious to the intricacies or willfully ignore them.

One of the major issues with modeling and simulation is the lack of negative feedback for egregious practices. Either willfully or implicitly, major sources of uncertainty are ignored, usually to the benefit of modeling and simulation. There is a perception that if modelers were truthful and accurate about uncertainty, their results might be treated with less confidence. This might indeed be true, but the lack of disclosure and honesty in this practice is genuinely dangerous. By failing to address the uncertainty of computational models directly, we harm progress in distinct ways. We fail to use the knowledge of what we don’t know to guide our research. Uncertainty is the study of how well we know or don’t know something. In some cases there is a limit to how much we can know; there is a core uncertainty that is irreducible. We need to know this and act accordingly.

All of this discussion is subtext to the observation that we allow ourselves to default to uncertainty estimates of zero. When we know nothing at all, we choose a default uncertainty estimate of zero, the smallest value possible. In other words, we represent a complete lack of knowledge with an estimate that implies exact and complete knowledge. This is patently and utterly absurd. We only get away with this untenable situation because of the exotic and esoteric nature of computational modeling. It is viewed as the purview of experts, deeply technical and complex.
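The absurdity of a default zero uncertainty is easy to demonstrate with even the crudest Monte Carlo sweep over a single poorly known input. A minimal sketch (the toy drag-force model, its parameter values, and the assumed 10% input spread are all hypothetical, chosen purely for illustration):

```python
import random
import statistics

# Hypothetical toy model: drag force F = 0.5 * rho * Cd * A * v^2.
# All names and values here are illustrative, not from any real code.
def drag_force(rho, cd, area, v):
    return 0.5 * rho * cd * area * v * v

# Point estimate: nominal inputs, i.e., an implicit input uncertainty of zero.
nominal = drag_force(rho=1.2, cd=0.9, area=1.0, v=30.0)

# Monte Carlo propagation: admit a +/-10% spread in the drag coefficient
# (a quantity rarely known precisely) and see how the output spreads.
random.seed(1)
samples = [
    drag_force(rho=1.2, cd=random.uniform(0.81, 0.99), area=1.0, v=30.0)
    for _ in range(10_000)
]

mean = statistics.mean(samples)
spread = statistics.stdev(samples)
print(f"nominal = {nominal:.1f} N")
print(f"mean    = {mean:.1f} N +/- {spread:.1f} N (1 sigma)")
```

One uncertain input already produces a visible spread in the output; reporting only the nominal number silently asserts that spread is zero.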

When the depth of a technical field is used to shield the associated work from scrutiny, it is deplorable. For UQ this is the standard way of operating. The modeling associated with typical simulations is quite complex on a number of different levels. The most esoteric and technical aspects of the entire field are the models themselves, usually based on differential equations, which must be solved by mathematically complex numerical methods. Physical reality is then represented by a mesh, along with modeling choices that make the overall simulation tractable. Each of these areas is well understood only by highly specialized scientists, invariably with PhDs, who are actively working on research. The complexity diminishes only slightly on the voyage down to the computing hardware. In each of these areas there are incredibly detailed technical fields that are inaccessible to all but the most educated and specialized scientists. The result is an intricate, interlinked set of activities that must all be executed at a high level to produce competent modeling and simulation. UQ is then yet another complex specialty added to an immensely complex system.

One of the most difficult issues with UQ is its focus on the most technical part of the simulation pipeline. UQ is heavily focused on physics, engineering, and mathematics, often blended together intricately. UQ is demanding in terms of computation, but far more demanding in terms of intellectual labor and the overall flow of work. The hardest aspect of the field is dealing with all the things unknown or barely known; there are swaths of knowledge and information we have no grasp on. With UQ we combine the deeply esoteric with the unknown into a stew of impenetrable complexity. It can also be a great place for people to obfuscate. Within such an esoteric activity, poor and shoddy work can easily pass as high quality, especially with gee-whiz graphics and movies to lend panache to the information. Somebody engaged in this marketing-based approach to work will likely want to underestimate the uncertainty to project a false sense of precision beyond what is justifiable.

Here we get to the crux of the problem with uncertainty. Large uncertainty is almost invariably judged as a problem. With a subject as complex and esoteric as modeling and simulation, the ability to pass poor work off as good, or to outright lie and bullshit about quality, is great. Increasingly, the management and customers for the work are incompetent at judging its quality. As such they are prone to reward work that reports low uncertainty even when the work itself is poor. What we have is the perfect storm for promulgating bad work: complex multi-disciplinary work judged by technically inferior management and customers, with every incentive to skip large portions of the effort needed for excellent work. In addition, high-quality work is expensive, time-consuming, and difficult. Why do it when the customer will accept a shoddy, cheap product? Why do it when the customer cannot tell the difference?

In this view, the failure to conduct UQ completely reduces our ability to determine where progress can most impactfully be made. Uncertainty can be used to guide investments across the wide array of disciplines needed to conduct modeling and simulation. It shows the sources of our lack of knowledge in clear terms, which should dictate where effort toward progress goes. Where the uncertainty cannot be removed because it is intrinsic, we can accommodate it and look to progress elsewhere. In either case, UQ is needed to provide meaningful direction. For example, our current emphasis on providing and using massive computers is not grounded in any necessity rooted in uncertainty. If you look at the program, only the barest lip service is paid to UQ or its parent activities of verification and validation. For the most part, the focus on computing hardware is justified entirely by superficial and naïve arguments divorced from evidence.

We can use this knowledge for good or for ill. If we are good, we can push ourselves to demand excellence by identifying all the uncertainty and either computing, estimating, or bounding it. On the other hand, knowing that good work is not rewarded and often not paid for, we can promote bad work. We have managers and customers who don’t know the difference anyway, so why do the extra work? This would seem to be the spirit of today: why do good work when bad work is just as acceptable?

Everything you’ve learned in school as “obvious” becomes less and less obvious as you begin to study the universe. For example, there are no solids in the universe. There’s not even a suggestion of a solid. There are no absolute continuums. There are no surfaces. There are no straight lines.

― R. Buckminster Fuller