People don’t want to buy a quarter-inch drill. They want a quarter-inch hole.
― Clayton M. Christensen
The truth in this quote speaks to those who aspire to the greater things a drill can help build. At the same time, we all know people who love their tools and aspire to the greatest toolset money can buy, yet never build anything great. They have the world’s most awesome set of tools in their garage or workroom, yet do nothing but tinker. People who build great things have a similar set of great tools, but focus on what they are building. The whole field of computing risks becoming the hobby enthusiast who loves to show you his awesome toolbox, but never intends to do more than show it off, building nothing in the process. Our supercomputers are the new prize for such enthusiasts. The core of the problem is the lack of anything important to which these powerful tools are applied.
High performance computing has become the central focus of scientific computing, yet it is just a tool. This is a very bad thing and extremely unhealthy for the future of the field. The problems with making it the focal point for progress are manifestly obvious if one thinks about what it takes to make scientific computing work. The root of the problem is the lack of thought going into our current programs, and ultimately a failure to understand that HPC isn’t what we should be focused on; it is a necessary part of how scientific computing is delivered. The important part of what we do is modeling and simulation, which can transform how we do science and engineering. HPC can’t transform anything except electricity into heat, and relies upon modeling and simulation for its value. While HPC is important and essential to the whole enterprise, the other aspects of the proper delivery of scientific computing are so starved for attention that they may simply cease to exist soon.
For all the talk of creating a healthy ecosystem for HPC, the programs as constituted today are woefully inadequate for achieving that end. The focus on computer hardware exists because hardware is tangible: one can point at a big mainframe and say, “I bought that!” But this focus misses the key aspects of the field necessary for success, and even worse, the key value of scientific computing. Scientific computing is a set of tools that allow the efficient manipulation of models of reality, enabling exploration, design and understanding of the World without actually having to experiment with the real thing. Everything valuable about computing resides in that connection to reality, and as I will explain below, the computer hardware is the most distant and least important aspect of the ecosystem that makes scientific computing important, useful and valuable to society.
Effectively we are creating an ecosystem where the apex predators are missing, and this is not a good thing. The models we use in science are the key to everything. They are the translation of our understanding into mathematics that we can solve and manipulate to explore our collective reality. Computers allow us to solve much more elaborate models than otherwise possible, but little else. The core of the value in scientific computing is the capacity of models to explain and examine the physical World we live in. They are the “apex predators” of the scientific computing ecosystem, and to take the analogy further, our models are becoming virtual dinosaurs for which evolution has ceased. The models in our codes are becoming a set of fossilized skeletons, not something alive, evolving and growing.
The process of our models becoming fossilized is a form of living death. Models need to evolve, change, grow or even become extinct for science to be healthy. In the parlance of computing, models are embedded in codes and form the basis of a code’s connection to reality. When a code becomes a legacy code, the model(s) in the code become legacy as well. A healthy ecosystem would allow the models (codes) to confront reality and come away changed in substantive ways, including evolving, adapting and even dying as a result. In many cases the current state of scientific computing, with its focus on HPC, does not serve this purpose. The forces that change codes are diverse and broad. Being able to run codes at scales never before seen should produce outcomes that sometimes lead to a code’s (model’s) demise. The demise or failure of models is an important and healthy part of an ecosystem that is missing today.
People do not seem to understand that faulty models render the entire computing exercise moot. Yes, the computational results may be rendered into exciting and eye-catching pictures suitable for entertaining and enchanting various non-experts, including congressmen, generals, business leaders and the general public. These eye-catching pictures are getting better all the time and now form the basis of many special effects in movies. None of this does anything for how well the models capture reality. The deepest truth is that no amount of computer power, numerical accuracy, mesh refinement, or computational speed can rescue a model that is incorrect. The entire process of validation against observations made in reality must be applied to determine whether models are correct. HPC does little to solve this problem. If the validation provides evidence that the model is wrong and a more complex model is needed, then HPC can provide a tool for solving it.
The modern computing environment, whether seen in a cell phone or a supercomputer, is a marvel of the modern World. It requires a host of technologies working together seamlessly to produce incredible things. We have immensely complex machines that produce important outcomes in the real world through a set of interwoven systems: electrical signals are translated into instructions understood by the computer and by humans, into discrete equations describing the real world, solved by mathematical procedures, and ultimately compared with measured quantities in systems we care about. If we look at our focus today, it falls on the part of the technology that connects very elaborate, complex computers to the instructions understood by both computers and people. This is electrical engineering and computer science. The focus begins to fade in the part of the system where the mathematics, physics and reality come in. These activities form the bond between the computer and reality. Yet they are not a priority, and are conspicuously and significantly diminished by today’s HPC.
HPC today is structured in a manner that eviscerates fields that have been essential to the success of scientific computing. A good example is our applied mathematics programs. In many cases applied mathematics has become little more than scientific programming and code development. Far too little actual mathematics is happening today, and far too much focus is placed on productizing mathematics in software. Many people with training in applied mathematics only do software development today and spend little or no effort on analysis and development away from their keyboards. It isn’t that software development is unimportant; the issue is the lack of balance in the overall ratio of mathematics to software. The power and beauty of applied mathematics must be harnessed to achieve success in modeling and simulation. Today we are simply bypassing this essential part of the problem to focus on delivering software products.
Similar issues are present in applied physics work. A healthy research environment for making progress in HPC would see far greater changes in modeling. A key aspect of modeling is the presence of experiments that challenge the ability of models to produce good, useful representations of reality. Today such experimental evidence is sorely lacking, significantly hampered by an unwillingness to take risks and push the envelope with real things. If we push the envelope on things in the real world, it will expose our understanding to deep scrutiny. Such scrutiny will necessitate changes to our modeling. Without this virtuous cycle the drive to improve is utterly lacking.
Another missing element in the overall mix is the extent to which modeling and simulation supports activities pushing society forward. We seem to be in an era where society is not interested in progressing at anything. Instead we are working toward a risk-free world where everyone is completely safe and entirely insulated from ever failing at anything. In this environment the overall push for better technology is blunted to the degree that nothing of substance ever gets done. The maxim of modern life is that vast amounts of effort will be expended to assure that barely possible things never happen. No possibility of a dire outcome is too small to inhibit the expenditure of vast resources to make it smaller still. This explains so much of what is wrong with the World today. We will bankrupt ourselves achieving many expensive and unimportant outcomes that are completely unnecessary. Taking this view of the World allows us to explain the utter stupidity of the HPC world.
The origin, birth and impact of modeling and simulation arise from its support of activities essential to making societal progress. Without activities working toward societal progress (or at least the scientific-technological aspects of it), modeling and simulation is stranded in stasis. Progress in modeling and simulation is utterly tied to work in areas where big things are happening. High performance computing rose to prominence as a tool allowing modeling and simulation to tackle problems of greater complexity and difficulty. That said, HPC is only one of the tools allowing this to happen. There is a complex and vast chain of tools necessary for modeling and simulation to succeed. It is arguable that HPC isn’t even the most important or linchpin tool in the mix. If one looks at the chain of things that need to work together, the actual computer is the farthest removed from the reality we are aiming to master. If anything along the chain of tools closer to reality breaks, the computer is rendered useless. In other words, the computer can work perfectly and be infinitely fast and efficient, yet still be useless unless the software running on it is correct. Furthermore, in the exercise of modeling and simulation, the software must be based on firm mathematical and physical principles, or it will be similarly useless. This last key step is exactly the part of the overall approach we are putting little or no effort into in our current programs. Despite this, we have made HPC central to success today. Too much focus on a tool of limited importance will swallow resources that could have been expended on more impactful activities. Ultimately, the drivers for progress at a societal level are necessary for any of this work to have actual meaning.
If I’m being honest, modeling and simulation is just a tool as well. But as a tool it is much closer to the building of something great than the computers are. Used properly, it can have a much greater impact on our capacity to produce great things than the computers can. What we miss more than anything is the focus on achieving great things as a society. We are too busy trying to save ourselves from a myriad of minute and vanishing threats to our safety. As long as we are so unremittingly risk averse, we will accomplish nothing. Our focus on big computers over big achievements is just a small reflection of this vast societal ill.
To the man who only has a hammer, everything he encounters begins to look like a nail.
― Abraham H. Maslow