What I cannot create, I do not understand.

– Richard Feynman

We are in deep danger of relying upon science and associated software we do not understand because we have stopped the active creation of knowledge so broadly. I open with one of my favorite quotes by the great physicist Richard Feynman, who also wrote about Cargo Cult Science (https://en.wikipedia.org/wiki/Cargo_cult_science). It is a bold but warranted assertion that much of our scientific work today is taking on the character of Cargo Cult Science. We are not all the way there, but we have moved a long way toward taking on all of the characteristics of this pathology. In this assertion money is the “cargo” that pseudo-scientific processes are chasing. It is no exaggeration to say that getting funding for science has replaced the conduct and value of that science today. This is broadly true, and particularly true in scientific computing where getting something funded has replaced funding what is needed or wise. The truth of the benefit of pursuing computer power above all else is decided upon a priori. The belief was that this sort of program could “make it rain” and produce funding because this sort of marketing had done so in the past. All results in the program must bow to this maxim and support its premise. All evidence to the contrary is rejected because it is politically incorrect and threatens the attainment of the cargo, the funding, the money. A large part of this utterly rotten core of modern science is the ascendency of the science manager as the apex of the enterprise. The accomplished scientist and expert is now merely a useful and necessary detail; the manager reigns as the peak of achievement.

The first principle is that you must not fool yourself — and you are the easiest person to fool.

We’ve learned from experience that the truth will come out. Other experimenters will repeat your experiment and find out whether you were wrong or right. Nature’s phenomena will agree or they’ll disagree with your theory. And, although you may gain some temporary fame and excitement, you will not gain a good reputation as a scientist if you haven’t tried to be very careful in this kind of work. And it’s this type of integrity, this kind of care not to fool yourself, that is missing to a large extent in much of the research in cargo cult science.

– Richard Feynman

If one looks at the scientific computing landscape today, one sees a single force for progress: the creation of a new, more powerful supercomputer that is much faster than anything we have today. The United States, Europe and China are all pursuing this path for advancing scientific computing. It is a continuation of a path we have pursued for the last 25 years, but our future is not remotely like the last 25 years. This approach to progress can be explained simply and marketed to the naïve and untechnical. It works because our National leadership is increasingly naïve, witless and obsessively anti-intellectual, lacking any technical sophistication. We are in the midst of a tide of low-information leadership swayed by sweet-sounding bullshit far more easily than by hard-nosed facts.

The farther backward you can look, the farther forward you are likely to see.

― Winston S. Churchill

In this putrid environment, faster computers seem an obvious benefit to science. They are a benefit and a pathway to progress; this is utterly undeniable. Unfortunately, they are an expensive and inefficient path to progress, and an incredibly bad investment in comparison to the alternatives. The numerous problems with the exascale program are subtle, nuanced, highly technical and pathological. As I’ve pointed out before, the modern age is no place for subtlety or nuance; we live in an age of brutish simplicity where bullshit reigns and facts are optional. In such an age, exascale is an exemplar: it is a brutally simple approach tailor-made for the ignorant and witless. If one is willing to cast away the cloak of ignorance and embrace subtlety and nuance, a host of investments can be described that would benefit scientific computing vastly more than the current program. If we followed a better balance of research, computing would contribute to science far more greatly and scale far greater heights than the current path provides.

Applications that matter to something big would create a great deal of this focus naturally. The demands of doing something real and consequential would breed a necessity to focus progress in an organic way. Last week I opined that such big things are simply not present today in science or in society’s broader narrative. Society is doing nothing big, aspirational or challenging to drive progress forward with genuine purpose. To be more pointed, the push for exascale is not big at all; it is rather an exemplar of the lack of vision and consequence. There is a bit of a chicken-and-egg argument to all this. The bottom line is a general lack of underlying and defining purpose to our efforts in computing. Exascale is what we do when we want to market something as “feeling” big, when it is actually doing something small and inconsequential.

Those who do not move, do not notice their chains.

― Rosa Luxemburg

How can I say such a thing?

In a nutshell, computing speed is one of the least efficient and least effective ways to improve computational science. It has only been an enabler because computing speed came for free with Moore’s law for most of the last half century. That free lunch is over, yet we willfully ignore this reality (http://herbsutter.com/welcome-to-the-jungle/). Even with Moore’s law fully in effect, it was never the leading contributor to progress; progress was paced by numerical methods and algorithmic scaling. Moreover, computing speed cannot fix modeling that is wrong (methods and algorithms don’t fix this either). If a model is wrong, the wrong answer is simply computed much faster. Of course, we know that every model is wrong and the utility of any model is determined via V&V. Issues associated with the use of computing, naïve code users, and the loss of expertise and understanding are simply overlooked, or worse yet made more intractable by inattention.

Each of these advances has been mentioned before in the guise of a full blog post, but it is useful to put things together to see the wealth of unused opportunity.

80% of results come from 20% of effort/time

― Vilfredo Pareto

  1. Modernizing modeling ought to be a constant and consistent emphasis in science, and computational science is no different. For some reason, modeling advances have simply stopped. Our basic models of reality are increasingly fixed and immutable, and ever less fit for future purpose. The models of reality have become embedded in computer codes and ultimately central to the code’s structure in numerous respects. As such we start to embed a framework for modeling whose foundation becomes invariant: we can’t change the model without developing an entirely different code. We reduce our modeling to submodels and closure of existing models while staying within a fixed fundamental modeling framework. This is another area where progress is phenomenally risky to approach and substantially prone to failures and misguided efforts. Without failure, the ability to learn and produce new and improved models is virtually impossible. https://wjrider.wordpress.com/2015/02/02/why-havent-models-of-reality-changed-more/, https://wjrider.wordpress.com/2015/07/03/modeling-issues-for-exascale-computation/, https://wjrider.wordpress.com/2017/07/07/good-validation-practices-are-our-greatest-opportunity-to-advance-modeling-and-simulation/
  2. Modernizing methods is not happening. Since methods are one of the best ways to improve the efficient and effective solution of models, progress is harmed in a manner that cannot easily be recovered by other means. Usually, once a model is decided upon, a method is used to solve the model numerically. The numerical method is only slightly less code-specific and invariant than the model itself. By virtue of this character, the basic numerical method for a model becomes indistinguishable from the code. If we preserve the code base, we preserve old methods, which means no progress. We are stuck using relatively low-order methods with crude stability mechanisms. The ability to use high-order methods with enhanced accuracy and efficiency is not advancing. The research in numerical methods and the practical application of numerical methods are becoming increasingly divorced from one another. The gap has grown into a chasm, and numerical methods research is losing relevance. Part of the problem is related to the standards of success, where methods research finds success on easier problems rather than keeping the problem difficulty fixed. This is yet another place where the inability to accept failure as a necessary element (or even fuel) for success is fatal. https://wjrider.wordpress.com/2016/06/14/an-essential-foundation-for-progress/, https://wjrider.wordpress.com/2016/07/25/a-more-robust-less-fragile-stability-for-numerical-methods/
  3. Algorithmic scaling is the most incredible thing we could achieve in terms of computational performance. The ability to change the scaling exponent on how much work it takes to solve a problem can have a magical impact, and linear algebra is the posterchild for this effect. A breakthrough in scaling can make the impossible problem possible and even routine to solve. The classical naïve scaling for matrix inversion has the work growing with the cube of the problem size, so even modestly sized problems quickly become utterly intractable and almost no amount of computer power can fix this. Change the scaling to quadratic and new problems suddenly become routine; change the scaling to linear and problems that were unimaginable before can be tackled routinely (a small illustrative sketch of this arithmetic appears after the list). We are stuck at linear, although some fields are starting to see sublinear algorithms. Could these breakthroughs be more common and useful? If they could, the impact on computational science would easily overwhelm the capacity of exascale. Today we aren’t even trying to make these advances. In my view, such work is generically risky and prone to failure, and failure is something that has become intolerable; thus success is sacrificed. https://wjrider.wordpress.com/2015/05/29/focusing-on-the-right-scaling-is-essential/
  4. Today supercomputing is completely at odds with the commercial industry. After decades of first pacing advances in computing hardware, then riding along with increases in computing power, supercomputing has become separate. The separation occurred when Moore’s law died at the chip level (in about 2007). The supercomputing world has become increasingly desperate to continue the free lunch, and tied to an outdated model for delivering results. Basically, supercomputing is still tied to the mainframe model of computing that died in the business world long ago. Supercomputing has failed to embrace modern computing with its pervasive and multiscale nature, spanning all the way from mobile to cloud. https://wjrider.wordpress.com/2017/12/15/scientific-computings-future-is-mobile-adaptive-flexible-and-small/
  5. Verification & validation – if scientific computing efforts are to be real scientific endeavors, V&V is essential. Computational modeling is still modeling, and comparison with experiment is the gold standard for modeling, but with computational work the comparison has numerous technical details needing serious attention (a minimal verification sketch appears after the list). In a very complete way V&V is the scientific method in action within the context of modeling and simulation. It energizes a top-to-bottom integration of scientific activities and essential feedback up and down this chain. The process produces actionable evidence of how progress is being made and where the bottlenecks to progress exist. The entirety of the V&V work provides a deep technical discourse on the breadth of computational science, and the whole of computational science can be improved by its proper application. By weakly supporting V&V, current efforts are cutting themselves off from the integration of the full scientific enterprise and from the scientific impact of computation. https://wjrider.wordpress.com/2016/12/22/verification-and-validation-with-uncertainty-quantification-is-the-scientific-method/
  6. Expansive uncertainty quantification – too many uncertainties are ignored rather than considered and addressed. Uncertainty is a big part of V&V, a genuinely hot topic in computational circles, and practiced quite incompletely. Many view uncertainty quantification as only a small set of activities that address a small piece of the uncertainty question. Too much benefit is gained by simply ignoring a real uncertainty, because the value of zero that is implicitly assumed is never challenged (see the quadrature sketch after the list). This is exacerbated significantly by a half-funded and deemphasized V&V effort in scientific computing. Significant progress was made several decades ago, but the signs now point to regression. The result of this often willful ignorance is a lessening of the impact of computing and a limiting of its true benefits. https://wjrider.wordpress.com/2016/04/22/the-default-uncertainty-is-always-zero/
  7. Data integration and analysis – one of the latest hot topics is big data and data analysis. The internet and sensors are creating massive amounts of data, and its use is a huge technical problem. The big data issue is extracting significant and actionable understanding from the oceans of data. A related and perhaps more difficult problem is small data, where there isn’t enough data, or not enough of the data you want. Lots of science and engineering is data-limited to a degree that limits scientific understanding. Modeling and simulation offers a vehicle to augment this data and fill in the gaps. Doing this in a credible manner will be a huge challenge. The way forward with credibility runs through V&V and intensive uncertainty quantification. The proper use of codes and the role of calibration also become critical to success. https://wjrider.wordpress.com/2016/07/10/10-big-things-for-the-future-of-computational-science/
  8. Multidisciplinary, multiscale science – one of the hot topics a quarter century ago was better multiphysics methods to replace the pervasive use of operator splitting in complex codes. This effort has utterly failed; we have made very little progress. Part of the issue is the inability to produce computational algorithms that are efficient enough to compete: a fully coupled method ends up being so expensive that any accuracy gains from the improved coupling are rendered ineffective (a toy sketch of the splitting error appears after the list). A second and perhaps more powerful reason for the lack of progress is the computer codes. Old computer codes are still being used, and most of them use operator splitting. Back in the 1990s a big deal was made of replacing legacy codes with new codes. The codes developed then are still in use, and no one is replacing them. The methods in these old codes are still being used, and now we are told that the codes need to be preserved. The codes, the models, the methods and the algorithms all come along for the ride. We end up having no practical route to advancing the methods. https://wjrider.wordpress.com/2016/09/16/is-coupled-or-unsplit-always-better-than-operator-split/
  9. Complete code refresh – we have produced and are now maintaining a new generation of legacy codes. A code is a repository for vast stores of knowledge in modeling, numerical methods, algorithms, computer science and problem solving. When we fail to replace codes, we fail to replace knowledge. The knowledge comes directly from those who write the code and create the ability to solve useful problems with that code. Much of the methodology for problem solving is complex and problem-specific. Ultimately a useful code becomes something that many people are deeply invested in. In addition, the people who originally wrote the code move on, taking their expertise, history and knowledge with them. The code becomes an artifact for this knowledge, but it is also a deeply imperfect reflection of the knowledge. The code usually contains some techniques that are magical and unexplained. These magic bits of code are often essential for success; if they get changed, the code ceases to be useful. The result of this process is a deep loss of the expertise and knowledge that arises from creating a code that can solve real problems. If a legacy code continues to be used, it also acts to block progress on all the things it contains, starting with the model and its fundamental assumptions. As a result, progress stops because even when there are research advances, they have no practical outlet. This is where we are today. https://wjrider.wordpress.com/2015/10/30/preserve-the-code-base-is-an-awful-reason-for-anything/, https://wjrider.wordpress.com/2016/01/01/are-we-really-modernizing-our-codes/, https://wjrider.wordpress.com/2016/01/14/a-response-to-criticism-are-we-modernizing-our-codes/, https://wjrider.wordpress.com/2014/03/20/legacy-code-is-terrible-in-more-ways-than-advertised/
  10. Democratization of expertise – the manner in which codes are applied has a very large impact on solutions. The overall process is often called a workflow, encapsulating activities starting with problem conception, then meshing, modeling choices, code input, code execution, data analysis, and visualization. One of the problems that has arisen is the use of codes by non-experts. Increasingly, code users are simply not sophisticated and treat codes like black boxes. Many refer to this as the democratization of the simulation capability, which is generally beneficial. On the other hand, we increasingly see calculations conducted by novices who are ignorant of vast swaths of the underlying science. This characteristic is keenly related to a lack of V&V focus and loose standards of acceptance for calculations. Calibration is becoming more prevalent again, and distinctions between calibration and validation are vanishing anew. The creation of broadly available simulation tools must be coupled to first-rate practices and appropriate professional education. In both of these veins the current trends are completely in the wrong direction. V&V practices are in decline and recession. Professional education is systematically getting worse as the educational mission of universities is attacked and diminished, along with the role of elites in society. https://wjrider.wordpress.com/2016/12/02/we-are-ignoring-the-greatest-needs-opportunities-for-improving-computational-science/
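To make the scaling argument in item 3 concrete, here is a minimal sketch with purely illustrative, assumed numbers (an imagined sustained rate of 10^12 solver operations per second and 10^8 unknowns); only the exponents matter, not the particular figures.

```python
# A minimal sketch of why algorithmic scaling dominates raw machine speed.
# All numbers are assumptions chosen only for illustration.
n = 1.0e8                 # hypothetical number of unknowns
ops_per_second = 1.0e12   # hypothetical sustained solver rate

for name, work in [("cubic,     O(n^3)", n**3),
                   ("quadratic, O(n^2)", n**2),
                   ("linear,    O(n)  ", n)]:
    seconds = work / ops_per_second
    print(f"{name}: {seconds:.1e} s (~{seconds / 3.15e7:.1e} years)")

# cubic     : ~1e12 s, tens of thousands of years -- no machine rescues this
# quadratic : ~1e4 s, a few hours
# linear    : ~1e-4 s, effectively free
```

No plausible growth in machine speed closes a gap of sixteen orders of magnitude; only the exponent can.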
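As a flavor of the verification side of item 5, here is a minimal sketch of estimating the observed order of accuracy from solutions on three systematically refined grids; the solution values and refinement ratio are invented for illustration, not taken from any particular code.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed convergence order from solutions on three grids
    with a constant refinement ratio r (coarse -> medium -> fine)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

# Hypothetical solution functionals on grids of spacing h, h/2, h/4:
p = observed_order(1.0400, 1.0100, 1.0025, r=2.0)
print(f"observed order of accuracy ~ {p:.2f}")  # ~2.0, consistent with a second-order scheme
```

A calculation that cannot pass this sort of elementary check has no business being compared with experiment, which is exactly the discipline that is being starved.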
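For item 6, a minimal sketch of what “the default uncertainty is always zero” means in practice, assuming independent uncertainty sources combined in quadrature with invented magnitudes: leaving a source out of the budget is identical to asserting that it is exactly zero.

```python
import math

# Illustrative, assumed uncertainty magnitudes (same units as the prediction):
numerical  = 0.5   # discretization/solution error estimate
parametric = 1.0   # input and parameter uncertainty
model_form = 2.0   # model-form uncertainty -- the piece most often left out

full_budget    = math.sqrt(numerical**2 + parametric**2 + model_form**2)
ignored_budget = math.sqrt(numerical**2 + parametric**2)  # model form silently set to zero

print(f"uncertainty with all sources  : {full_budget:.2f}")    # ~2.29
print(f"uncertainty with one 'zeroed' : {ignored_budget:.2f}")  # ~1.12, roughly half as large
```

The smaller number looks better, which is precisely why the omission is so seductive and so corrosive.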
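For item 8, a small toy sketch (an assumed model problem, not any production code) of why operator splitting caps accuracy: for a linear system with non-commuting operators, first-order Lie splitting carries an O(dt) error no matter how exactly each split piece is advanced.

```python
import numpy as np
from scipy.linalg import expm

# Toy problem: dy/dt = (A + B) y with non-commuting A and B.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [-1.0, 0.0]])
y0 = np.array([1.0, 0.0])
T = 1.0

exact = expm((A + B) * T) @ y0               # the fully coupled ("unsplit") answer
for steps in (10, 20, 40, 80):
    dt = T / steps
    lie_step = expm(B * dt) @ expm(A * dt)   # first-order Lie splitting per step
    y = y0.copy()
    for _ in range(steps):
        y = lie_step @ y
    print(f"dt = {dt:.4f}   splitting error = {np.linalg.norm(y - exact):.3e}")

# Halving dt roughly halves the error: the splitting, not the sub-solvers,
# sets the accuracy floor -- and more flops do not change the exponent.
```

The coupled alternative removes that floor, but only if it can be made cheap enough to compete, which is the algorithmic research we have stopped doing.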


One of the key aspects of this discussion is recognizing that these activities are all present to some small degree in exascale, but all of them are subcritical. The program basically starves all of these valuable activities and only supports them in a fashion that creates a “zombie-like” existence. As a result, the program is turning its back on a host of valuable avenues for progress that could make an exascale computer actually far more useful. Our present path has genuine utility, but it represents an immense opportunity cost if you factor in what could have been accomplished instead with better leadership, vision and technical sophistication. The way we approach science more broadly is permeated with these inefficiencies, meaning our increasingly paltry investments in science are further undermined by our pathetic execution. At the deepest level our broader societal problems revolving around trust, expertise, scandal and the taste for failure may doom any project unless they are addressed. For example, the issues related to the preservation of code bases (i.e., creating new legacy codes) are creating deep problems with advancing on the essential fronts of modeling, methods and algorithms. Everything is woven together into a tapestry whose couplings cannot be ignored. This is exactly the sort of subtlety and nuance our current time finds utterly incomprehensible.

Postscript:

It is sometimes an appropriate response to reality to go insane.

― Philip K. Dick

Healey’s First Law Of Holes: When in one, stop digging.

― Denis Healey

Last week I tried to envision a better path forward for scientific computing. Unfortunately, a true better path flows invariably through a better path for science itself and the Nation as a whole. Ultimately scientific computing, and science more broadly, is dependent on the health of society in the broadest sense. It also depends on leadership and courage, two other attributes we are lacking in almost every respect. Our society is not well; the problems we are confronting are deep and perhaps the most serious crisis since the Civil War. I believe that historians will look back on 2016-2018, and perhaps longer, as the darkest period in American history since the Civil War. We can’t build anything great when the Nation is tearing itself apart. I hope and pray that it will be resolved before we plunge deeper into the abyss in which we find ourselves. We see the forces opposed to knowledge, progress and reason emboldened and running amok. The Nation is presently moving backward and embracing a deeply disturbing and abhorrent philosophy. In such an environment science cannot flourish; it can only survive. We all hope the darkness will lift and we can again move forward toward a better future; one with purpose and meaning where science can be a force for the betterment of society as a whole.

Everything passes, but nothing entirely goes away.

― Jenny Diski
