Computers get better faster than anything else ever. A child’s PlayStation today is more powerful than a military supercomputer from 1996.
— Erik Brynjolfsson
For supercomputing to provide the value it promises for simulating phenomena, the methods in the codes must be convergent. The metric of weak scaling is utterly predicated on this being true. Despite its intrinsic importance to the actual relevance of high performance computing, relatively little effort has been applied to making sure codes achieve convergence. Work on supercomputing simply assumes that convergence happens, but does little to assure it; it remains largely an afterthought that receives little attention or effort.
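Checking convergence need not be exotic. Here is a minimal, purely illustrative sketch of the basic discipline: run the same method at a sequence of resolutions, measure the error against a known answer, and confirm that the observed order of accuracy matches the design order. The trapezoidal rule on a problem with a known exact value stands in for a real code; the refinement-ratio logic is the part that carries over.

```python
import math

# Illustrative convergence study: composite trapezoidal rule (a
# second-order method) integrating sin(x) on [0, pi], whose exact
# value is 2.0. Halving the mesh spacing should cut the error by
# roughly 4x, i.e. an observed order near 2.

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n uniform intervals."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

exact = 2.0
errors = []
for n in (16, 32, 64, 128):
    errors.append(abs(trapezoid(math.sin, 0.0, math.pi, n) - exact))

# Observed order from successive refinements: p = log2(e_h / e_{h/2}).
orders = [math.log2(errors[i] / errors[i + 1])
          for i in range(len(errors) - 1)]
print(orders)  # each entry should be close to 2
```

A real verification study replaces the exact answer with Richardson extrapolation when no analytic solution exists, but the test is the same: if the observed order does not approach the design order, the code is not converging, and no amount of added hardware fixes that.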
Don’t confuse symmetry with balance.
― Tom Robbins
Thus the necessary and sufficient conditions are basically ignored. This is one of the simplest examples of the lack of balance I experience every day. In modern computational science the belief that faster supercomputers are better and valuable has become closer to an article of religious faith than a well-crafted scientific endeavor. The sort of balanced, well-rounded efforts that brought scientific computing to maturity have been sacrificed for an orgy of self-importance. China has the world’s fastest computer, and reflexively we think this is a problem.
I am not saying it is utterly useless. It can play video games.
—Unnamed Chinese Academy of Sciences Professor
At least the Chinese have someone who is smart enough to come to an honest conclusion about their computer! It could be a problem, or it might not be a problem at all. Everything that determines whether it’s a problem has little or nothing to do with the actual computer. The important thing is whether we, or they, do the things necessary to assure that the computer is useful.
There is nothing quite so useless, as doing with great efficiency, something that should not be done at all.
― Peter F. Drucker
I know we are doing a generally terrible job of it. I worry a lot more about how much the Chinese are investing in the science going into the computer relative to us. The quote above probably means that they understand how bullshit the “fastest supercomputer” metric actually is. The signs are that they are taking action to fix this. This means much more than the actual computer.
Once upon a time applied mathematics was used to support the practical and effective use of computing. From World War II to the early 1990s, applied math helped make scientific computing effective. It planted the seeds of the faith in faster computers we take for granted today. Over the past twenty or so years, its impact has waned and the field has shrunk. More and more, computing simply works on autopilot to produce more computing power without doing what is important for utilizing that power effectively. Applied math is one of the fields necessary to do this.
Computer science is one of the worst things that ever happened to either computers or to science.
— Neil Gershenfeld
While necessary, applied math isn’t sufficient. Sufficiency is achieved when its elements are applied together with science. The science of computing cannot remain fixed because computing is changing the physical scales we can access, and the fundamental nature of the questions we ask. The codes of twenty years ago can’t simply be used in the same way. It is much more than rewriting them or just refining a mesh. The physics in the codes needs to change to reflect the differences.
a huge simulation of the ‘exact’ equations…may be no more enlightening than the experiments that led to those equations…Solving the equations leads to a deeper understanding of the model itself. Solving is not the same as simulating.
For example, we ought to be getting ensembles of calculations from different initial conditions instead of single well-posed initial value problems. This is just like experiments: no two are really identical, and computations should be the same. In some cases this can lead to huge systematic changes in solutions. Reality produces vastly different outcomes from ostensibly identical initial conditions. This makes people really uncomfortable, but science and simulations could provide immense insight into it. Our current attitudes are holding us back from realizing this.
Single calculations will never be “the right answer” for hard problems.
Right now this makes scientists immensely uncomfortable because the necessary science isn’t in place. Developing understanding of this physically and mathematically is needed for confidence. It is also needed to get the most out of the computers we are buying. Instead we simply value the mere existence of these computers and demonstrate their utility through a sequence of computing stunts of virtually no scientific value.
To me, this is not an information age. It’s an age of networked intelligence, it’s an age of vast promise.
Beyond the science, the whole basis of computing is still grounded in models of computing from twenty or thirty years ago (i.e., mainframes). While computing has undergone a massive transformation and become a transformational social technology, scientific computing has remained stuck in the past. Science is only beginning to touch the possibilities of computing. In many ways the high performance computing world is even further behind than much of the rest of the scientific world in utilizing the potential of computing as it exists today.
All these computers, all these handhelds, all these cell phones, all these laptops, all these servers — what we’re getting out of all these connections is we’re getting one machine. … We’re constructing a single, global machine.
A chief culprit is the combination of the industry and its government partners, who have remained tied to the same stale model for two or three decades. At the core, the cost has been intellectual vitality. The implicit assumption of convergence and the lack of deeper intellectual investment in new ideas have conspired to strand the community in the past. The annual Supercomputing conference is a monument to this self-imposed mediocrity. It’s a trade show through and through, and in terms of technical content a truly terrible meeting (I remember pissing off the Livermore CTO by pointing this out).
You can’t solve a problem with the management of technology with more technology.
The opportunities provided by the modern world of computing are immense. Scientific computing should be at the cutting edge, and instead remains stranded in the past. The reason is the lack of intellectual vitality that a balanced effort would provide. The starting point was a failure to attend to the necessary and sufficient efforts to assure success. Too much effort goes toward making “big iron” function, and too little toward making it useful.
We’ve got 21st century technology and speed colliding head-on with 20th and 19th century institutions, rules and cultures.
— Amory Lovins