Late last week I tweeted that I thought high performance computing (HPC) was a zombie. I’ll admit that this is an intentionally provocative statement emblematic of the age of Twitter, but it is also worth some deeper thought. So in my first daily blog post let me explain.

 The riskiest thing we can do is just maintain the status quo.

― Bob Iger

A zombie is a fearsome motif of modern horror, the product of post-modern fears of technology, as well as a metaphor for the dehumanizing effect of the modern world. The zombie lives the existence of the undead, driven by an inhuman hunger for flesh (or brains!). It has no purpose but to mindlessly consume the living.

So, why am I calling supercomputing a zombie?

Because it’s mindless, just lurching forward consuming resources without any vision of where it is going or why. We are trying to have the fastest computer simply so that we, and not China, have the fastest computer. Whether the computer is useful for anything is really secondary or tertiary to the purpose of having the fastest computer. The unfortunate side effect of this approach is that the computers are increasingly difficult to solve problems on, and little or no thought is going into how to use them.

At some point around 1990 people got the idea that we should rate supercomputers by their speed instead of by a holistic view of their problem-solving utility. At a deeper level a computer is simply a tool to be utilized. We don’t endeavor to have the world’s largest hammer or wrench; we recognize that these are tools whose function is what matters. We don’t recognize this about computers; for some reason their status as tools has been lost. In this sense supercomputing research has become a bit of a fetish.

There are three classes of men; the retrograde, the stationary and the progressive.

― Johann Kaspar Lavater


A long time ago supercomputing was about solving problems, and it mattered whether the computer was actually useful for that purpose. Originally scientists envisioned solving problems through computation and then built computers to bring this vision to fruition. Speed was always welcome as long as the speed was useful. Along the way supercomputing underwent the transition to massively parallel computing as a matter of course.


Around that same time supercomputing was equated with National Security. The virus that creates the zombie had infected the victim; it was just a matter of time before the zombie began to lurch forward, hungrily devouring resources. The program to develop supercomputers had to be successful, so a new metric of success was adopted: weak scaling. Weak scaling is the ability to get ever-higher performance by growing the problem size along with the number of processors used to solve it. Its validity is predicated on the “bigger is better” point of view.
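To make the metric concrete (this formula is my addition for illustration, not part of the original argument): weak scaling is usually justified by Gustafson’s law for scaled speedup,

\[ S(N) = s + (1 - s)\,N , \]

where \(N\) is the number of processors and \(s\) is the serial fraction of the work. Because the problem is enlarged to keep every processor busy, \(S(N)\) grows almost linearly with \(N\), and the machine looks successful regardless of how slowly any fixed-size problem of actual interest gets solved.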


What’s measured improves

― Peter F. Drucker


The problem with weak scaling is that it allows the computational intensity to drop (which it has) while still declaring success. As a result weak scaling increasingly drives the development of our computers and the algorithms used on them, all the while their actual computational performance is plummeting. None of these measurements has much to do with problem solving; they measure only success at building a faster computer. Any improvement in problem solving is a collateral benefit rather than the central purpose.
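For readers who want the quantitative version (my gloss, using the standard roofline bound rather than anything stated in this post): computational, or arithmetic, intensity is

\[ I = \frac{\text{floating-point operations}}{\text{bytes moved to and from memory}} , \]

and the performance a code can actually attain is roughly

\[ \text{FLOP/s} \le \min\bigl(\text{peak FLOP/s},\; I \times \text{memory bandwidth}\bigr) . \]

When the algorithms favored by weak scaling let \(I\) drop, sustained performance is capped by memory bandwidth, so the fraction of peak actually delivered falls even as the headline speed numbers keep climbing.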


This is the zombie, a pursuit of a mindless goal without regard to any purpose beyond the pursuit itself.


Two reasons why people hate and/or fight change: (1) People fear the unknown; and (2) There are always people profiting from how things are.

― Mokokoma Mokhonoana

