Crisis is Good. Crisis is a Messenger.
― Bryant McGill
Computational What? Science? Engineering? Physics? Mathematics?
My last post was about computational science's impending multi-crisis: the loss of Moore's law, exploding software complexity, and a failure to invest in the field's intellectual foundations. A reasonable question is how broadly the issues described there apply to subsets of the field. What about computational physics? What about computational engineering? What about computer science? What about applied mathematics? What are the differences between these notions of computation's broader role in the scientific world? What are the key characteristics that make up the essence of scientific computing?
One of the most important of these disciplines is the one that never assumed its logical name: mathematical engineering.
Computational science is an umbrella for a variety of things that have gradations of difference and don't form a terribly coherent whole. Instead it is a continuum, with one end held down by computer science and the other by computational engineering; or perhaps by the fields that birthed computing, physics and mathematics. The differences between engineering, mathematics and physics show themselves in computation as they do in other fields, but scientific computing should really be something of an amalgam of all of these areas.
We are not creators; only combiners of the created. Invention isn’t about new ingredients, but new recipes. And innovations taste the best.
― Ryan Lilly
The Origin: Physics and Mathematics
To start our discussion it is worth taking a look at the origins of computing, when mathematics and physics combined to create the field. This combination is embodied in John von Neumann, whose vision largely produced the initial instantiation of scientific computing. Scientific computing began in earnest under the aegis of the development of the atomic bomb. The application of computing was engineering analysis done by some of the greatest physicists in the world, most notably Hans Bethe and Richard Feynman, using methods devised by John von Neumann and Rudolf Peierls. Engineering proper was limited to the computer itself. Mathematicians played key roles in using computers well, notably through the efforts of Robert Richtmyer, Nicholas Metropolis and Richard Hamming. As a rule, the overall effort was conducted by a host of geniuses for an application of monumental international impact and importance. Practically speaking, they were exquisitely talented scientists who were also immensely motivated and had every resource available to them.
From this august origin, computing began to grow outside the womb of nuclear weapons work. Again, it was John von Neumann who provided the vision and leadership, this time from the Institute for Advanced Study in Princeton, where the focus was weather prediction and the development of better computers. Again, the application was largely in the realm of physics, with the engineering being applied to the computers themselves. Meanwhile, on the strength of the successes at Los Alamos and Princeton, computing was broadening its appeal and attracting attention from colleagues at universities. Other labs in the United States and the Soviet Union also began exploring the topic. It still remained immature and speculative, especially in a world that scarcely comprehended what a computer was or could do.
Computers are incredibly fast, accurate, and stupid: humans are incredibly slow, inaccurate and brilliant; together they are powerful beyond imagination.
― Albert Einstein
Engineering Joins the Revolution
It wasn't until the 1960s that engineering activities began to include computation. Part of the reason for this lag was that the initial set of methods had been developed by physicists and mathematicians, and that computing was generally unavailable, specifically computing power sufficient to contemplate engineering problems. At first, the engineering uses of computing were exploratory and relegated to research activities, more like the physical or mathematical sciences than engineering. By the 1970s this ended, led by a handful of pioneers in aerospace, mechanical and civil engineering. The growth of engineering's use of computing also led to some bad things, like the hubris of the awful "numerical wind tunnel" affair. In the late 1970s, talk of replacing wind tunnel testing with numerical simulations became an embarrassing episode (a mistake we have naively made all over again). It represented a massive technical overreach and ultimately a setback, driving a wedge between computing and experiments.
Civil engineering made progress by utilizing the finite element method, which was ideally suited to that field's intellectual basis. In mechanical engineering, heat transfer and fluid flow problems, dominated by heat exchanger design, led the way. Together with aerospace engineering, these fields produced the important topic of computational fluid dynamics (CFD), which is the archetype of computational science in general. Nuclear engineering was birthed from physics and had computing at its heart almost from the beginning, especially for the problem of reactor core design. Its methods came directly from the nuclear weapons program, with nuclear power itself being the peaceful outgrowth of that work.
Science is the extraction of underlying principles from given systems, and engineering is the design of systems on the basis of underlying principles.
Mathematics Shrinks from View
Computer Science is no more about computers than astronomy is about telescopes
― Edsger Wybe Dijkstra
All of this was a positive outgrowth of the combination of physics and mathematics. During the same period, the mathematical contributions to scientific computing went in several directions, with pure mathematics birthing computer science and applied mathematics going its own way. Computer science has become increasingly divorced from scientific computing over time, and has failed to provide the sort of inspirational impetus mathematics had previously provided. For several decades applied mathematics filled this vacuum with great contributions to progress. In more recent times applied mathematics has withdrawn from this vital role. The consequence of these twin developments has been a terrible toll, depriving scientific computing of a strong pipeline of mathematical innovation. I will admit that statistics has made recent strides in connecting to scientific computing. While this is a positive development, it hardly makes up for the broader withdrawal of the rest of mathematics from computing.
We see that computation was born from physics and mathematics, with engineering joining after the field had been shaped by those disciplines. Over the past thirty or forty years engineering has come to play an ever larger part in scientific computing, and the physical sciences have continued their part, but mathematics has withdrawn from centrality. Computer science has inherited pure mathematics' mantle of limited practical utility. Applied mathematics leapt to fill this void, but has since withdrawn from providing the full measure of much needed intellectual vitality.
Computer science is one of the worst things that ever happened to either computers or to science.
Computing Becomes Something Monstrous
Part of the reason for this is a change in the cultural consciousness regarding computing. In the beginning, physics and mathematics combined in the imagination of John von Neumann to produce something new, something wonderful and era defining. It gestated in the scientific community for several decades until computing exploded into the public consciousness. It was the combination of maturity in the use of computing and sufficient computing power available to the masses that triggered the transition. Computing was no longer the purview of nerds and geeks; it was now owned by all of humanity. As such, computing became somewhat pedestrian in nature and lost its sheen. This also explains the rise of engineering as an outlet for computing, and the loss of mathematics. In the absence of innovation we substituted raw power. Rather than continue to improve through better thinking, we came to rely upon Moore's law for progress. Where we used to outsmart problems, we now overpower them with unparalleled force.
While scientists and big business owned computing until about 1995, all of a sudden it became public property. Soon it grew to be something that dominated the global economy. Powered by Moore's law, computing became ubiquitous and, ironically, ceased being about computing; computers became about communication. Now everything valuable about computers is communication, not computation. Computation is an essential but minor element in the value proposition. A big part of the reason is that the power of computers is so great that the computational load has become trivial. The Internet gives access to information and data, and connects people in ways never before imaginable. As such, the business possibilities are staggering. Computing is no longer so much about computers as it is about people and their money.
Moore's law also became a cipher for technology and progress. It has infected computational science with its pervasively superficial nature. It isn't that Moore's law is superficial per se; it is the societal interpretation of its implications that is. Moore's law is at death's door, if it hasn't already passed. Its death does not mean progress will die; it just means progress's path will change.
You can’t solve a problem with the management of technology with more technology.
— Bill Joy
Where do we go from here?
What we can conclude from this discussion is that the character of scientific computing has fundamentally changed. Where the use of computing for engineering work should have added to the field and formed a more complete whole, the withdrawal of mathematics has cheated us of that brighter future. Engineering is an essential human activity and the natural outgrowth of our scientific achievements, but it can lack creativity at times. Such creativity is always beneficial, and better still when disciplines are combined. The structure and rigor of mathematics is essential for putting this creativity on the strongest footing. To make progress in the future it will be essential to include engineering, the physical sciences and mathematics with some degree of equality. The rather weak-minded approach of simply relying on Moore's law to drive scientific computing forward must end, both on principle and because that source of progress is itself dying.
I’ve decided to get off the daily writing thing, or more accurately the daily posting. I still need to write every single day, so that’s not changing. I’m a lot better off thinking a bit more about the topic to write about, and working it out over several days. My new goal is two or three blog posts a week.
The Future is Already Here, Every Day.
The future is already here – it’s just not evenly distributed.
― William Gibson