We adhere to the saying, “if it ain’t broke, don’t fix it,” while not really questioning whether “it” is “broke.”

― Clayton M. Christensen

Supercomputing is a trade show masquerading as a scientific conference, and at its core it is big money chasing small ideas. It takes place this week in Austin, and features the slogan “HPC Transforms”. The small idea is that all we need to do for modeling & simulation (and big data) to succeed is build faster computers. This isn’t a wrong idea per se, but rather a naïve and simplistic strategy that is suboptimal in the extreme. It’s what we are doing despite the vacuous thinking behind it. Unfortunately we and other countries are prepared to spend big money on this strategy while overlooking the rather obvious and more balanced path to success. The balanced path is more difficult, challenging and risky, which is part of our unwillingness to pursue it. The tragedy that is unfolding is one of lost opportunity for true sustainable progress and massive societal impact.

“HPC Transforms” isn’t a bad or wrong idea either. The problem is that the transformation happened decades ago, and today HPC runs on the pure inertia of that generation-old progress. It was the 1980s that marked the birth of HPC and its transformative power over science. If we look at HPC today we see a leftover shell, with only massive computing hardware as the focus. The elements of progress and success that fed the original transformative power of HPC have been allowed to wither. The heart and soul of HPC is withering for lack of care and feeding. A once balanced and important effort has become a dismal shell of its former self. We have allowed shallow slogans to replace a once magnificent scientific field’s path to change.

This week brought some insightful commentary about Clayton Christensen’s theory of disruptive innovation (https://hbr.org/2015/12/what-is-disruptive-innovation or the reader’s digest version http://www.businessinsider.com/clay-christensen-defends-his-theory-of-disruption-2015-11), which has become a bit of a hollow mantra and buzzword in many places. For many, like those in HPC, it has become a shallow way of describing the nature of Supercomputing. Instead I’ll submit that the last twenty years have been marked by a disruptive disinnovation. The parallel computing “revolution” has ripped the heart and soul from supercomputing and left a rotting husk behind. The next generation of computing will only accelerate the process that has lobotomized supercomputing and left a veritable zombie behind. The lobotomy is the removal of attention and focus from the two pieces of computing that are most responsible for impacting reality, which I am going to refer to as the heart and soul of HPC. It doesn’t need to be this way; the path we are taking is a conscious choice driven by naïveté and risk aversion.

If you defer investing your time and energy until you see that you need to, chances are it will already be too late.

― Clayton M. Christensen

So what is this opposing concept of disruptive disinnovation that I’m proposing? It is a new technology that you are forced into using and that undermines other important technologies. For supercomputing the concept is relatively easy to see. Computing has quickly transformed into a global economic colossus, but one focused on the mobile market, which derives its value primarily through mobility, connectivity and innovation in applications.

Traditional mainframe-style computing has changed as well, with a distinct lack of drive for raw computing power. Low power that allows long battery life became the design mantra for computer chips, and the easy performance improvements of Moore’s law ended last decade. At the same time we have a mantra that we must have the most powerful computer (measured by some stupid benchmark that is meaningless!). This demand for the fastest computer became some sort of empty national security issue, sold without a scintilla of comprehension of what makes these computers useful in the first place. The speed of the computer is one of the least important aspects of the real transformative power of supercomputing, and the most distant from its capacity to influence the real world.

All of this is done so we can claim to have the fastest computer, which naively is taken to mean we have the best science. In the process of using these new computers we have undermined our modeling, methods and algorithmic work, because just using these new computers was so hard. The quality of the science done with computers is completely and utterly predicated on the modeling used.

There are quarters that like to say that parallel computing was a disruptive innovation, except that it made things worse. In the process we undermined the most important aspect of supercomputing to enable meaningless boasting. The concept is really simple to understand and communicate: it’s the apps, stupid. The true value of computers lies in the applications, not the hardware. If anything should be obvious about the mobile computing era, it is that software determines the value of computing, and we have systematically undermined the value, content and quality of our software. When I say this it is not an SQE question, but a question of the application’s utility to impact reality.

What is this heart and soul of HPC?

Modeling is the heart of high performance computing. Modeling is the process of producing a mathematical model of the real world. HPC provided a path to solving a far greater variety and complexity of models scientifically, and it opened new vistas for scientific exploration and engineering creation. Modeling is a living, breathing entity: it grows when it is critically compared with the reality it is supposed to represent. Some models die and others are born to replace them. Models breed, their genetic material mixing to produce better and more powerful offspring.

Today we have created walls that keep our models from breeding, growing and extending to become better and more relevant to the issues that society is depending upon them to address. The whole modeling aspect of HPC is rather static and simply reflects a fixed point of view about what we should be modeling. More than anything, the current slogan-based approach to HPC simply promulgates models from the past into the future by fiat rather than by explicit choice.

You view the world from within a model.

― Nassim Nicholas Taleb

Perhaps the worst thing about the lack of attention being paid to modeling is the extreme needs that go unmet and the degree of opportunity being lost. The societal impact that supercomputing could be having is being horrendously shortchanged. The leadership is fixated on hardware primarily as a low-risk path to seeming progress (a gravy train that is about to end). A higher-risk path would be the support of work that evolves the utility of supercomputing into the modern world. The risk is higher, but the payoff would be potentially immense and truly transformative. We have deep scientific, engineering and societal questions that will go unanswered, or be answered poorly, due to our risk aversion. For example, how does climate change affect the prevalence of extreme weather events? Our existing models can only infer this rather than simulate it directly. Other questions related to material failure, extremes of response for engineered systems, and numerous scientific challenges will remain beyond our collective grasp. All of this opportunity is missed because we are unwilling to robustly fund risky research that would challenge existing modeling approaches.

Risks must be taken because the greatest hazard in life is to risk nothing.

― Leo Buscaglia

The soul of HPC is methods and algorithms, which together power the results that the computer can produce. We used to invest a great deal in improving methods and algorithms to amplify the good that the computer does. Today we simply take what we have already developed and re-implement it to fit the modern monstrosities we call supercomputers. The drive to continually improve and extend existing methods and algorithms to new heights of quality and performance is gone. We have replaced it with the attitude that these areas are mature and well developed, needing no further attention. Again, we can honestly assess this as a lost opportunity. In the past, methods and algorithms have produced as much gain in performance as the machines; in effect they have been a powerful multiplier to the advances in hardware. Today we deny ourselves this effect to the detriment of the transformation this conference is touting to the world.

All of this reflects a rather fundamental misunderstanding of what HPC is and could be. It is not a fully matured topic, nor is it ready to simply go into a station-keeping mode of operation. It still requires the extreme intellectual efforts and labors that put it in this societally transformative place. If HPC were more mature we might reasonably be more confident in its results. Instead HPC relies upon the bravado of boastful claims that hardly match its true capability. Any focused attention on the credibility of computed results reveals that HPC has a great deal of work to do, and the focus on hardware does little to address it. The greatest depth of work is found in modeling, closely followed by issues associated with methods and algorithms.

Instead of basing a program for making HPC transformative on empirical evidence, we have a program based on unsupported suppositions. Hardware is easily understood by the naïve masses, which include politicians and paper pushers. They see big computers making noise and lots of blinking lights. Models, methods and algorithms don’t have that appeal, yet without them the hardware is completely useless. With an investment in them we could make the hardware vastly more powerful and useful. The problem at hand isn’t that the new hardware is a bad investment; it is a good investment. The problem is how much better the new hardware could be with an appropriately balanced development program that systematically invested in modeling, methods and algorithms too.

People don’t want to buy a quarter-inch drill. They want a quarter-inch hole.

― Clayton M. Christensen

Despite this we have systematically disinvested in the heart and soul of HPC. It is arguable that our actual capacity for solving problems has been harmed by this lack of investment to the tune of 10, 100 or even 1,000 times. We could have HPC that is a thousand times more powerful today if we had simply put our resources into a path that had already been proven for decades. If we had a bolder and more nuanced view of supercomputing, the machines we are buying today could be vastly more powerful and impactful. Instead we clunk along and crow about a transformative capability that largely already happened. There are stunning potential payoffs to society that we are denying ourselves.

Modeling defines what a computer can do, and methods/algorithms define how well it can do it. What our leadership does not seem to realize is that no amount of computing power can improve a model that is not correct. The only answer that improves the ability to impact reality is a newer, better model of reality. The second aspect of supercomputing we miss is the degree to which methods and algorithms provide benefit.

Our computing power today depends more on, and has received greater benefit from, the quality and efficiency of methods and algorithms than from hardware. Despite the clear evidence of their importance we are shunning progress in methods and algorithms in order to focus on hardware. This is a complete and utter abdication of leadership. We are taking a naïve path simply because it is politically saleable and seemingly lower in obvious risk. The risk we push aside is short term; in the long term the risks we are taking on are massive and potentially fatal. Unfortunately we live in a world where our so-called leaders can make these choices without consequence.
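
To put a rough number on that multiplier, here is a back-of-the-envelope sketch of my own (not a benchmark, and the function and numbers below are purely illustrative). It uses the standard textbook asymptotic operation counts for solving a model 2-D Poisson problem on an n-by-n grid with three classic solvers, dropping all constants, so only the ratios between methods mean anything.

```python
# Illustrative sketch: textbook asymptotic operation counts for solving a
# model 2-D Poisson problem on an n-by-n grid (N = n*n unknowns).
# Constants are dropped, so only the ratios between methods are meaningful.

def work_estimates(N):
    """Rough operation counts for one solve with N unknowns."""
    return {
        "banded Gaussian elimination": N**2,     # O(N^2)
        "optimally tuned SOR":         N**1.5,   # O(N^(3/2))
        "multigrid":                   float(N), # O(N)
    }

for n in (100, 1_000, 10_000):
    N = n * n
    estimates = work_estimates(N)
    baseline = estimates["banded Gaussian elimination"]
    print(f"grid {n} x {n}  (N = {N:,} unknowns)")
    for method, ops in estimates.items():
        print(f"  {method:28s} ~{ops:.1e} ops   "
              f"gain over banded GE: {baseline / ops:,.0f}x")
    print()
```

On the largest grid in this sketch the algorithmic gap alone is a factor of roughly a hundred million, and it keeps growing with problem size. That is the kind of multiplier the paragraph above is pointing at, and no hardware roadmap delivers it on its own.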

This is an absolute and complete failure of our leadership. It is a tragedy of epic proportions. It reflects poorly on the intellectual integrity of the field. The choices made today reflect a mindset that erupted at the end of the Cold War and was successful then in keeping the DOE’s national labs alive. We have gotten into the habit of confusing survival with success. Instead of building from this survival strategy into something sustainable, the survival strategy has become the only strategy. If science were actually working properly, the lack of balance in HPC would have become evident. The Supercomputing meeting this week is an annual monument to the folly of our choices in investment in HPC.

I can only hope that saner, more intelligent and braver choices will be made in the not-too-distant future. If they are, we can look forward to a smarter, less naïve and far bolder future in which high performance computing brings the transformative power of modeling and simulation to life. The tragedy of HPC today isn’t what it is doing; it is what isn’t being done and the immense opportunities squandered.

We all die. The goal isn’t to live forever, the goal is to create something that will.

― Chuck Palahniuk

 
