We are losing the ability to understand anything that’s even vaguely complex.

― Chuck Klosterman

I get asked, “what do you do?” quite often in conversation, and I realize the truth needs to be packaged carefully for most people. One of my issues is that I advertise what I do on my body with some incredibly nerdy tattoos, including an equation that describes one form of the second law of thermodynamics. What I do is complex, highly technical, and full of incredible subtlety. Even when talking with someone from a nearby technical background, the subtlety of approximating physical laws numerically in a manner suitable for computing can be daunting. For someone without a technical background it is positively alien. This character comes into play rather acutely in the design and construction of research programs, where complex, technical, and subtle does not sell. This is especially true in today’s world, where expertise and knowledge are regarded as suspicious, dangerous, and threatening by so many. One of the biggest insults to hurl at someone today is to accuse them of being one of the “elite”. Increasingly it is clear that this isn’t just an American issue, but worldwide in its scope. It is a clear and present threat to a better future.

I’ve written often about the sorry state of high performance computing. Our computing programs are blunt and naïve, constructed to squeeze money out of funding agencies and legislatures rather than get the job done. The brutal simplicity of the arguments used to support funding is breathtaking. Rather than construct programs to be effective and efficient, getting the best from every dollar spent, we construct programs to be marketed at the lowest common denominator. For this reason something subtle, complex, and technical like numerical approximation gets no play. In today’s world subtlety is utterly objectionable and a complete buzz kill. We don’t care that it’s the right thing to do, or that the return is massively greater than simply building giant monstrosities of computing. It would take an expert from the numerical elite to explain it, and those people are untrustworthy nerds, so we will simply get the money to waste on the monstrosities instead. So here I am, an expert and one of the elite, using my knowledge and experience to make recommendations on how to be more effective and efficient. You’ve been warned.

Truth is much too complicated to allow anything but approximations.

— John Von Neumann

If we want to succeed at remaining a high performance computing superpower, we need to change our approach, and fast. Part of what is needed is a greater focus on numerical approximation. This is part of a deeper need to refocus on the more valuable aspects of the scientific computing ecosystem. The first thing to recognize is that our current hardware-first focus is oriented toward the least valuable part of the ecosystem, the computer itself. A computer is necessary, but horribly insufficient, for high performance computing supremacy. The real value for scientific computing lies at the opposite end of the spectrum, where work is grounded in physics, engineering, and applied mathematics.

Although this may seem a paradox, all exact science is dominated by the idea of approximation.

— Bertrand Russell

I’ve made this argument before, and it is instructive to unpack it. The model solved via simulation is the single most important aspect of the simulation. If the model is flawed, no amount of raw computer speed, numerical accuracy, or efficient computer code can rescue the solution and make it better. The model must be changed, improved, or corrected to produce better answers. If a model is correct, the accuracy, robustness, fidelity, and efficiency of its numerical solution are essential. Everything beyond the numerical solution, on toward the computer hardware, is less important. We can move down the chain of activities, all of which are necessary, and see the same effect: the further you get from the model of reality, the less efficient the measures are. This whole thing is referred to as an ecosystem these days, and every bit of it needs to be in place. What also needs to be in place is a sense of the value of each activity, with priority placed on those that have the greatest impact, or the greatest opportunity. Instead of doing this today, we are focused on the thing with the least impact, farthest from reality, and starving the most valuable parts of the ecosystem. One might argue that the hardware is an area of opportunity, but the truth is the opposite. The environment for improving the performance of hardware is at a historical nadir; Moore’s law is dead, dead, dead. Our focus on hardware is throwing money at an opportunity that has passed into history.

I’m a physicist, and we have something called Moore’s Law, which says computer power doubles every 18 months. So every Christmas, we more or less assume that our toys and appliances are more or less twice as powerful as the previous Christmas.

— Michio Kaku

At some point, Moore’s law will break down.

— Seth Lloyd

There is one word to describe this strategy: stupid!

At the core of the argument is a strategy that favors brute force over subtleties understood mainly by experts (or the elite!). Today the brute force argument always takes the lead over anything that might require some level of explanation. In modeling and simulation the esoteric activities, such as the actual modeling and its numerical solution, are quite subtle and technical in detail compared to raw computing power, which can be understood with ease by the layperson. This is the reason computing power gets the lead in the program, not any efficacy in improving the bottom line. As a result our high performance computing world is dominated by meaningless discussions of computing power defined by a meaningless benchmark. The political dynamic is basically a modern-day “missile gap” like we had during the Cold War. It has exactly as much virtue as the original “missile gap”; it is a pure marketing and political tool with absolutely no technical or strategic validity aside from its ability to free up funding.

Each piece, or part, of the whole of nature is always merely an approximation to the complete truth, or the complete truth so far as we know it. In fact, everything we know is only some kind of approximation because we know that we do not know all the laws as yet.

— Richard P. Feynman

Once you have an entire program founded on bullshit arguments, it is hard to work your way back to technical brilliance. It is easier to double down on the bullshit and simply define everything in terms of the original fallacies. A big part of the problem is how modern verification and validation are applied in the process. Both verification and validation are practices for accumulating evidence on the accuracy, correctness, and fidelity of computational simulations. Validation is the comparison of simulation with experiments; in this comparison the relative correctness of models is determined. Verification determines the correctness and accuracy of the numerical solution of the model. Together the two activities should help energize high quality work. In reality most programs consider them nuisances, box-checking exercises to be finished and ignored as soon as possible. Programs like to say they are doing V&V, but don’t want to emphasize or pay for doing it well. V&V is a mark of quality, but the programs want its approval rather than attend to its results. Even worse, if the results are poor or indicate problems, they are likely to be ignored or dismissed as inconvenient. Programs get away with this because the practice of V&V is technical and subtle, and in the modern world highly susceptible to bullshit.
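
To make the verification half of that concrete, here is a minimal sketch in Python of the kind of grid-refinement study verification entails (the toy problem and numbers are mine, not drawn from any particular code): solve the same problem at several resolutions against a known answer and check that the error shrinks at the rate the method claims.

    import numpy as np

    def solve(h):
        # Toy stand-in for a simulation: approximate d/dx of sin(x) at x = 1
        # with a one-sided finite difference of mesh spacing h.
        x = 1.0
        return (np.sin(x + h) - np.sin(x)) / h

    exact = np.cos(1.0)                      # the known (manufactured) answer
    spacings = [0.1, 0.05, 0.025, 0.0125]    # successive mesh refinements
    errors = [abs(solve(h) - exact) for h in spacings]

    # Observed order of accuracy between successive refinements:
    # p = log(e_coarse / e_fine) / log(h_coarse / h_fine); it should approach
    # the theoretical order of the method (here, first order).
    for i in range(1, len(spacings)):
        p = np.log(errors[i-1] / errors[i]) / np.log(spacings[i-1] / spacings[i])
        print(f"h = {spacings[i]:.4f}  error = {errors[i]:.3e}  observed order = {p:.2f}")

If the observed order falls short of the theoretical one, the code or the method has a problem, and that is precisely the sort of inconvenient result programs are tempted to wave away.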

Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question, which can always be made precise.

— John W. Tukey

Numerical methods for solving models are even more technical and subtle. As such they are the focus of suspicion and ignorance. For high performance computing today they are considered yesterday’s work, a largely finished, completed product now simply needing a bigger computer to do better. In a sense this notion is correct: the bigger computer will produce a better result. The issue is that using computer power as the route to improvement is inefficient under the best of circumstances. We are not living under the best of circumstances! Things are far from efficient, as we have been losing the share of computer power advances useful for modeling and simulation for decades now. Let us be clear: we receive an ever-smaller proportion of the maximum computing power with each passing year. Thirty years ago we would commonly get 10, 20, or even 50 percent of the peak performance of the cutting-edge supercomputers. Today even one percent of peak performance is exceptional, and most codes doing real application work achieve significantly less than that. Worse yet, this dismal performance is deteriorating with every passing year. This is one element of the autopsy of Moore’s law that we have been avoiding while its corpse rots before us.
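
The arithmetic behind “percent of peak” is simple enough to show. The numbers below are purely illustrative, not measurements from any real machine or application.

    # Illustrative numbers only: a hypothetical machine with a 10 PF/s
    # theoretical peak running an application that performs 2.0e17
    # floating-point operations in 1000 seconds of wall-clock time.
    peak = 10.0e15              # theoretical peak, FLOP/s
    flops_performed = 2.0e17    # operations the application actually did
    wall_time = 1000.0          # seconds of wall-clock time

    sustained = flops_performed / wall_time   # achieved FLOP/s
    fraction = sustained / peak               # fraction of theoretical peak

    print(f"sustained {sustained:.2e} FLOP/s = {100 * fraction:.1f}% of peak")
    # -> 2.00e+14 FLOP/s = 2.0% of peak, versus the 10-50% common decades ago.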

So we are prioritizing improvement in an area where the payoffs are fleeting and suboptimal. Even these improvements are harder and harder to achieve as computers become ever more parallel and memory access costs become ever more extreme. Simultaneously we are starving more efficient means of improvement of resources and emphasis. Numerical methods and algorithms are two key areas not getting any significant attention or priority. Moreover, support for these areas is actually diminishing so that support for the inefficient hardware path can be increased. Let’s not mince words; we are emphasizing a crude, naïve, and inefficient route to improvement at the cost of a complex and subtle route that is far more efficient and effective.

Numerical approximations and algorithms are complex and highly technical things, poorly understood by non-experts even if they are scientists. The relative merits of one method or algorithm compared to another are difficult to articulate; the comparison is highly technical and subtle. Since progress comes from creating new methods and algorithms, that progress is hard to explain and articulate to non-experts. In some cases new methods and algorithms produce breakthrough results and huge speed-ups. These cases are easy to explain. More generally, a new method or algorithm produces subtle improvements, such as greater robustness, flexibility, or accuracy than the older options. Most of these changes are not obvious, but this progress accumulated over time leads to enormous improvements that swamp the progress made by faster computers.
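
A toy calculation (my own illustrative constants, not data from any particular code) shows why these subtle gains swamp hardware: to reach a given error, a first-order method needs vastly more grid cells, and hence vastly more work, than a second-order one.

    # Illustrative scaling argument: assume the error of a first-order method
    # behaves like C*h and a second-order method like C*h**2 on a unit 3-D box.
    def cells_needed(target_error, order, C=1.0, dim=3):
        h = (target_error / C) ** (1.0 / order)   # spacing that hits the target
        return (1.0 / h) ** dim                   # number of cells required

    for target in (1e-2, 1e-3, 1e-4):
        n1 = cells_needed(target, order=1)
        n2 = cells_needed(target, order=2)
        print(f"error {target:.0e}: first order needs {n1:.0e} cells, "
              f"second order needs {n2:.0e} cells ({n1 / n2:.0f}x fewer)")

A factor of a thousand to a million in work is the kind of improvement no plausible hardware generation delivers, yet it comes packaged in subtleties that are hard to sell.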

An expert is someone who knows some of the worst mistakes that can be made in his subject, and how to avoid them.

― Werner Heisenberg

The huge breakthroughs are few and far between, but they provide much greater value than any hardware advances over similar periods of time. Getting these breakthroughs requires continual investment in research for extended periods. For much of that time the research mostly fails, producing small or non-existent improvements, until suddenly it doesn’t. Without the continual investment, the failures, and the expertise those failures produce, the breakthroughs will not happen. They are mostly serendipitous, the end product of many unsuccessful ideas. Today failure and lack of progress are not supported; we exist in a system with insufficient trust to support the sort of failure needed for progress. The result is the addiction to Moore’s law and its seemingly guaranteed payoff, because it frees us from subtlety.

Often a sign of expertise is noticing what doesn’t happen.

― Malcolm Gladwell

A huge aspect of expertise is a taste for subtlety. Expertise is built upon mistakes and failure, just as basic learning is. Without the trust to allow people to gloriously make professional mistakes and fail in the pursuit of knowledge, we cannot develop expertise or make progress. All of this lands heavily on the most effective and difficult aspects of scientific computing: the modeling and the numerical solution of the models. Progress on these aspects is both highly rewarding in terms of improvement and very risky, being prone to failure. To compound matters, the progress is often highly subjective itself, needing great expertise to explain and be understood. In an environment where the elite are suspect and expertise is not trusted, such work goes unsupported. This is exactly what we see: the most important and effective aspects of high performance computing are being starved in favor of the brutish and naïve aspects, which sell well. The price we pay for our lack of trust is an enormous waste of time, money, and effort.

Wise people understand the need to consult experts; only fools are confident they know everything.

― Ken Poirot

Again, I’ll note that we still have so much to do. Numerical approximations for existing models are inadequate and desperately in need of improvement. We are burdened by theory that is insufficient and heavily challenged by our models. Our models are all flawed, and the proper conduct of science should energize efforts to improve them.

…all models are approximations. Essentially, all models are wrong, but some are useful. However, the approximate nature of the model must always be borne in mind… [Co-author with Norman R. Draper]

— George E.P. Box
