Societies in decline have no use for visionaries.

― Anaïs Nin

So it’s a new year, with all the requisite reflective looks forward and backward. I’ll do both here and posit that perhaps an era is drawing to a close and it’s time for a big change in scientific computing. Even more, I’ll argue that a big change is being thrust upon us, and it’s time to get ahead of it. I’ve laid out the history of scientific computing as a series of eras, each 15-20 years long. Each era is defined by a particular combination of ideas, algorithms, methods, hardware and software, and shifts in that combination are what trigger the transition from one era to the next.

 A man’s shortcomings are taken from his epoch; his virtues and greatness belong to himself.

― Johann Wolfgang von Goethe

I believe that a combination of crises will trigger the change that is upon us. One of these crises has all the headlines, one is barely hidden from view, and a third is silent, but each has a huge role to play. The visible crisis is hardware driven and revolves around the continued viability of Moore’s law in computing and computational performance. We seem to have taken the approach that maintaining Moore’s law is essential and that we are willing to expend vast amounts of money to achieve it, money that could be spent more profitably elsewhere in the enterprise. The second crisis is software driven, associated with the complexity of scientific software and the ponderous character it has taken on. Software is becoming increasingly unsustainable and expensive, threatening to swallow all of the available resources. The third, silent crisis is the dearth of new ideas in scientific computing, or an inability for new ideas to make an impact. This third crisis is driven largely by the combination of hardware that is hard or impossible to use and software complexity that explodes to strangle new ideas in their proverbial cribs. Even when the ideas can be breathed to life, they are starved of the resources and focus necessary to bring them to fruition. Dealing with the first two problems is simply taking all the resources available and leaving nothing for anything else.

History is a Rorschach test, people. What you see when you look at it tells you as much about yourself as it does about the past.

― Jennifer Donnelly

Yesterday I tweeted, “scientific computing was once the grand avenue of computing, now it is a dark alley in a bad neighborhood.” The scientific community once drove computing as its vanguard; now it has to adapt to whatever the market does. It has become a niche activity, economically shackled to a colossus of global reach. A huge marketplace now drives hardware and software innovation, and while scientific computing might benefit from that investment, the direction is far from optimal for us. We must react to directions that benefit the marketplace rather than set the direction for the market ourselves.

Let us study things that are no more. It is necessary to understand them, if only to avoid them.

― Victor Hugo

The politics of the time have an enormous impact on focus and resource availability. Scientific computing was born in the crucible of a World War and matured in the urgency of the Cold War. Nothing exists today to focus the mind and open the pocketbook in the same way. On the other hand, computing has never been as important as it is now, and never have more of society’s resources flowed in its direction. How can we harness this massive creative force for our benefit?

The greatest and most powerful revolutions often start very quietly, hidden in the shadows. Remember that.

― Richelle Mead

I’ve taken the history of scientific computing, broken it up into five distinct eras, and made the leap of defining 2015 as the beginning of a new one. I’ll grant that the dates are rounded up or down by a few years, so maybe we’re already in a new era, or it’s a few years off.

The farther backward you can look, the farther forward you are likely to see.

― Winston S. Churchill

  • Pre-1945 (prehistory): There was no scientific computing because computers were in their infancy, and their use for science was not yet envisioned. In this time the foundations of mathematics and physics were laid by a host of greats, along with numerical methods crafted for hand computation. The combination of computers with the vision of John von Neumann and the necessity of World War 2 brought scientific computing out of this womb and into practice.
  • 1945-1960 (creation): In this time scientific computing largely took place in the most important Labs, on the most important topics, with access to high priority and huge resources. Great innovations were taking place in computers and the practice of computing. Along with refinements in the engineering of computers, the practice of programming began to take shape. The invention of Fortran and its capacity to express methods and algorithms in code was one of the developments that brought this era to a close. In this time, the development of mathematical theory and numerical analysis was key; the invention of the concepts of stability and convergence of numerical methods was one of the great achievements. These provided a platform for systematic development in the 1960s.
  • 1960-1975 (foundations): During this period scientific computing emerged from the shadows into the light. Computers became increasingly available outside the top-secret environment, and computing began to be a valid academic endeavor. With this democratization of computing came extensive application to an ever-widening set of problems. Many of the key methods and algorithms for scientific computing were created in this period. The field of computational fluid dynamics (CFD) came into being and was viewed as an enormous boon to aerospace science. By the time the period drew to a close there was great optimism and hope. Computers were becoming quite capable, more generally available, and indispensable tools for business. The Labs still led the world, especially because they always had access to better hardware and software than anyone else. Moore’s law was defined, and a massive growth in computing power had begun.
  • 1975-1995 (glory years): I’ve described this time as the “golden age” of computational science. For the first time the computers and software were balanced with the methods and models. In many ways Seymour Cray defined the era, first with the CDC 6600 and 7600 computers and then with the machines bearing his name. The vision set forth by von Neumann came into force. Academic scientific computing became completely respectable, with mathematics, physics and engineering all taking part. The first hints of extreme hubris were also witnessed: the “numerical wind tunnel” debacle unfolded in aerospace. The claim that CFD could displace physical wind tunnel testing in design and qualification was a massive over-reach in capability. Great damage was done in the process, and no one seems to have learned from the experience. It foreshadows the developments of the current time with ASC (the Advanced Simulation and Computing program), where “virtual underground testing” was proposed to make up for a ban on actual underground testing.
  • 1995-2015 (mid-life): Then the glory days ended with a bang through a combination of events. The Cold War ended and soon nuclear testing ceased. The Labs would have their own “numerical wind tunnel” moment, but no actual wind tunnel would be available to test against. At the same time the capacity of the supercomputers of the golden era to maintain Moore’s law came to an end. The entire ASC program hinged upon the premise that advances in computational performance would pave the way for predictive simulation. We had the attack of the killer micros and the birth of massively parallel computation to keep hardware performance on the increase. Getting the methods and models of old to work on these computers became an imperative; so did access to more computing power via Moore’s law. At the same time the complexity of the codes was growing by leaps and bounds. New programming paradigms were being ushered into use, with C++ leading the way; its object-oriented principles were thought to be a way to handle the seemingly overwhelming complexity. With more resources flowing into hardware and software, the amount of energy going into methods and models waned. Where efforts in these endeavors had previously yielded gains larger than Moore’s law, such gains simply evaporated during this era.
  • 2015- (mid-life crisis): Now we get to today, and the elements of revolution are falling into place. We have three crises at hand, each of which brewed during the era now ending. Hardware is in crisis, with Moore’s law either already dead or on death’s door. The complexity of the software is beginning to threaten progress. Lack of innovation in methods, algorithms and modeling is killing other sources of improved performance. Let’s look at each crisis in turn and its threat to scientific computing’s progress.

All revolutions are, until they happen, then they are historical inevitabilities.

― David Mitchell

The most evident crisis is the demise of Moore’s law. Given the devotion to computing power as the route to predictive computational science, the loss of growth in computing power would be fatal. There are two worrying signs: the growth in computing power at the processor level has slowed to a crawl, and the ability to use all the power of the massively parallel computers for real problems is missing. At the low end of computing nothing will save Moore’s law, especially as the computing industry has moved on to other priorities; its end there is simply accepted. At the high end we grasp at terrible metrics like weak scaling or LINPACK to hide the problems, but the immensity of the issues becomes clearer every day. In the middle, Moore’s law is clinging to life, but the two ends are converging on it, and when they meet Moore’s law will be dead. There are a host of hopes for extending its life, but the laws of physics are arrayed against the continuation of the trend. With all the effort going into sustaining Moore’s law, what will be left to pick up the pieces?
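To make concrete why weak scaling is such a forgiving yardstick, here is a minimal sketch contrasting the textbook Amdahl (strong scaling) and Gustafson (weak scaling) speedup formulas. The 5% serial fraction and the processor counts are assumptions chosen for illustration, not measurements of any real code.

```python
# Minimal sketch: strong scaling (Amdahl) vs. weak scaling (Gustafson).
# The serial fraction s and processor counts are illustrative assumptions.

def amdahl_speedup(p, s):
    """Strong scaling: fixed total problem size split across p processors."""
    return 1.0 / (s + (1.0 - s) / p)

def gustafson_speedup(p, s):
    """Weak scaling: problem size grows with p; s is the serial fraction."""
    return p - s * (p - 1)

if __name__ == "__main__":
    s = 0.05  # assumed 5% serial fraction
    for p in (16, 256, 4096, 65536):
        print(f"p = {p:6d}   strong speedup = {amdahl_speedup(p, s):8.1f}   "
              f"weak speedup = {gustafson_speedup(p, s):10.1f}")
```

With these assumed numbers the strong-scaled speedup saturates near 1/s = 20 no matter how many processors are added, while the weak-scaled figure keeps growing almost linearly, which is precisely why it is the comfortable number to report.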

 

A second crisis simmering just below the surface is software complexity. The combination of trying to make codes work on cutting-edge computers and the increasing desire for capability in codes is creating a software monstrosity. Earlier in the week I learned from readers of the blog about Gall’s law, which says that a complex system that works invariably evolved from a simple system that worked, while a complex system designed from scratch will not work. We run a great risk of being stuck with massive code bases eroded by mountains of technical debt. These debts threaten to choke all progress and commit us to the fundamental methodology baked into the increasingly unwieldy code. The issue of software, and how we translate our intellectual labor to silicon, has to be dealt with soon or it will strangle progress more surely than the death of Moore’s law.

 

The third and most shadowy crisis is the lack of impact from methods, models and algorithms in the most recent era of scientific computing. As I said earlier, part of the problem is that the twin crises of declining hardware gains and software bloat sap all the energy from the system. Before our infatuation with Moore’s law as the heartbeat of progress, innovation in algorithms, numerical methods and modeling produced more progress than hardware gains. These gains are harder to measure and far subtler than raw computational performance, but just as real. As hardware fades as a source of progress, they are the natural place to turn for advances. The problem is that we have starved this side of scientific computing for nearly 20 years, and major changes are needed to reinvigorate it. As I’ve come to realize, programming languages are themselves a massive algorithmic achievement (the Fortran optimizing compiler is listed among the top ten algorithms of the 20th century!). The intellectual labor of figuring out how to program computers in the future is part of this issue and a necessary element in fixing two of the crises.
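For a sense of the scale of those subtler gains, here is a back-of-the-envelope sketch using the standard textbook asymptotic operation counts for the 2D model Poisson problem, comparing banded Gaussian elimination with full multigrid. The grid sizes and the two-year hardware doubling time are assumptions for illustration, not benchmark data.

```python
import math

# Back-of-the-envelope comparison of algorithmic vs. hardware gains for the
# 2D model Poisson problem on an N x N grid (n = N^2 unknowns).  Asymptotic
# operation counts only (constants dropped); purely illustrative.

def banded_ge_ops(N):
    """Banded Gaussian elimination: ~ n * bandwidth^2 = N^2 * N^2 operations."""
    return float(N) ** 4

def full_multigrid_ops(N):
    """Full multigrid: work proportional to the number of unknowns, ~ N^2."""
    return float(N) ** 2

if __name__ == "__main__":
    for N in (64, 256, 1024):
        gain = banded_ge_ops(N) / full_multigrid_ops(N)
        # Years of Moore's-law doublings needed to match the same factor,
        # assuming one doubling every two years (an illustrative assumption).
        years = 2.0 * math.log2(gain)
        print(f"N = {N:5d}: algorithmic gain ~ {gain:.1e}  "
              f"(~{years:.0f} years of hardware doublings)")
```

Even with the constants dropped, the point stands: a single algorithmic advance of this kind is worth decades of hardware doublings, which is exactly the sort of gain that has been starved of attention.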

 

But I suppose the most revolutionary act one can engage in is… to tell the truth.

― Howard Zinn

 

The question is whether we will answer the call to action that the current day’s developments demand. The situation is so critical that the current state of affairs cannot continue much longer. My greatest concern is the lack of leadership and the lack of appetite for the risk-taking necessary to take on the challenge of the day. If we can find the courage to step forward, new vistas await; it’s simply a matter of coming to terms with reality. If we don’t, the next era of scientific computing could be marked by decline and obsolescence. It need not be this way, but some significant changes are needed in attitudes and approaches to leading the field. Are we up to the challenge? Do we have the leadership we need? It is time to get ahead of the crises now, before they become overwhelming.

 

Those who make peaceful revolution impossible will make violent revolution inevitable.

― John F. Kennedy

 

A revolution is not a bed of roses. A revolution is a struggle between the future and the past.

― Fidel Castro

 
