One of the more intriguing recent developments is the appearance of the algorithm in the broader cultural milieu. Despite its inherently esoteric and abstract character, the algorithm is becoming a bit of a celebrity these days. Popular press articles have begun to examine the impact of algorithms on our daily lives and to explore the power and dangers of relying upon them.

 When we change the way we communicate, we change society

― Clay Shirky

Why? What is happening?

Moore’s law is fading and approaching the end of its wonderful reign (for most of the computing world it’s already effectively pushing up daisies). Gone are the halcyon days when we could be assured that waiting a couple of years would buy a new computer with double or more the performance of the old one, extra power that rapidly made the software on the older machine obsolete. Software still turns over rapidly today, but for different reasons; the software now has new ideas in it, new algorithms with new capabilities. From 1975 to 2005 Moore’s law produced a growth in computer power that fueled a rise in computing from a scientific backwater or corporate niche to the centerpiece of the world’s economy. Halfway through this great lurch forward, the Internet became the tie that bound all that power into a whole greater than any of its parts. Computing power around the world was connected together, along with all the people, whose numbers have swelled toward all of humanity as cell phones became magical handheld computers.

As the massive gains from computer power wound down, and the Internet simultaneously grew into a huge web of human connectivity, the value proposition for computing changed. Suddenly the greatest value in all of this power lay in connection: in accessing and sorting information. There were some fitful starts at attacking this key problem, but one solution rose above the rest: Google. Based on the work of a couple of Stanford graduate students and some really cool mathematics, Google took the world by storm. In a decade it had transformed itself into the world’s most powerful company. Google was fundamentally powered by an algorithm that solved the data and connectivity access problem better than anything before it.

Communications tools don’t get socially interesting until they get technologically boring.

― Clay Shirky

Google replaced another computer software company, Microsoft, as the world’s most powerful company. In both cases computer programming was the engineering vehicle. Programming is a technique by which intellectual labor is committed to a form a computer can execute automatically: a method, or algorithm, for solving a problem. Usually a computer program is actually a large collection of methods, algorithms, and heuristics uniquely composed together to solve problems. As these problems grow more difficult and elaborate, the software gains more value.

The bottom line is that all of a sudden the algorithm and its software manifestation had eclipsed the computer hardware as a source of value. This transformation began when Microsoft rushed past IBM, which failed to see that software’s importance was about to eclipse hardware’s, and paid for it. Google then put the algorithm together with the ability to give people access to information and connectivity, and eclipsed Microsoft in turn. The algorithm had moved from a topic of nerdish academic interest to one of the most powerful things in the world. The world’s economy spun on an axis determined by a handful of algorithms.

Change almost never fails because it’s too early. It almost always fails because it’s too late.

― Seth Godin

Meanwhile scientific computing has lost its mind and chosen for itself the very path that led IBM toward disaster. The end of Moore’s law has produced a collective insanity: vast sums of money spent propping up the hardware path in the face of looming disaster, while algorithms are given the cold shoulder. Effort and focus flow into obtaining and building massive computers that are increasingly useless for real science, ignoring the value that algorithms bring. The infatuation with the biggest and fastest computer, as measured by an increasingly meaningless benchmark, only grows with time. All the while, the key to progress stares them in the face every time they do an Internet search: the power of the algorithm.

The easiest way to solve a problem is to deny it exists.

― Isaac Asimov

What the hell is going on?

Part of the problem is the ability to artificially breathe life into the corpse of Moore’s law through increasingly massively parallel computers. This has been done by moving the goalposts significantly. The LINPACK benchmark never had much to do with the core of scientific computing, and that distance has only grown over the past three decades. The benchmark papers over a myriad of vexing issues with the new computers. What was once a gulf of irritating width has widened into a chasm of dangerous proportions, and disaster looms in the not-too-distant future as a result.
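
To make the gap concrete, here is a minimal sketch, in Python with NumPy, of why a LINPACK-style kernel flatters a machine: dense linear algebra is compute-bound and runs near peak flop rates, while the stencil sweeps typical of real scientific codes are memory-bound and run far below peak. The problem size and sweep count are illustrative choices, not part of any official benchmark.

```python
# Minimal sketch (not a rigorous benchmark): dense solve vs. stencil sweep.
import time
import numpy as np

n = 2000
A = np.random.rand(n, n)
b = np.random.rand(n)

t0 = time.perf_counter()
np.linalg.solve(A, b)                  # LINPACK-style kernel: ~(2/3) n^3 flops
t_dense = time.perf_counter() - t0
dense_gflops = (2.0 / 3.0) * n**3 / t_dense / 1e9

u = np.random.rand(n, n)
t0 = time.perf_counter()
for _ in range(100):                   # 5-point stencil: ~4 flops per point
    u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                            u[1:-1, :-2] + u[1:-1, 2:])
t_stencil = time.perf_counter() - t0
stencil_gflops = 100 * 4 * (n - 2)**2 / t_stencil / 1e9

print(f"dense solve:   {dense_gflops:5.1f} GF/s")
print(f"stencil sweep: {stencil_gflops:5.1f} GF/s")
```

On most machines the dense solve will report a far higher flop rate than the stencil, which is the whole point: the benchmark measures the kernel the machine is best at, not the work science actually demands.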

A second bit of goalpost moving is the adoption of “weak scaling”. Scaling is the metric of how well an algorithm or code uses parallel computing to solve problems faster. True (strong) scaling asks how much faster I can solve a fixed problem with more processors; perfect scaling means that with “N” processors I solve the problem “N” times faster. Weak scaling changes this reasonable measure by making the problem “N” times bigger at the same time as the number of processors grows. If the performance of a code is poor on a single processor, weak scaling will successfully hide this fact (most scientific codes in fact suck on single processors, and suck more on many processors). Our codes are performing worse and worse on single processors, and little or nothing has been done about it; weak scaling carries some of the blame by hiding the problem.
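
Here is a minimal sketch of the two measures, assuming the classic serial-fraction models of parallel runtime (Amdahl’s law for strong scaling, Gustafson’s law for weak scaling); the 5% serial fraction is an illustrative assumption, not a measurement of any particular code.

```python
# Minimal sketch: how weak scaling can flatter the very same code.
def strong_speedup(s, n):
    """Amdahl's law: fixed problem size, n processors, serial fraction s."""
    return 1.0 / (s + (1.0 - s) / n)

def weak_speedup(s, n):
    """Gustafson's law: problem grows n-fold along with the n processors."""
    return s + (1.0 - s) * n

serial_fraction = 0.05  # hypothetical: 5% of the work cannot be parallelized
for n in (1, 16, 256, 4096):
    print(f"{n:5d} procs: strong {strong_speedup(serial_fraction, n):7.1f}x, "
          f"weak {weak_speedup(serial_fraction, n):7.1f}x (ideal {n}x)")
```

The same hypothetical code that tops out near a 20x strong-scaling speedup no matter how many processors it is given reports nearly perfect weak-scaling numbers, which is exactly how a poorly performing code can be made to look healthy.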

Scientific computing is fundamentally about problem solving with computers, not about computers unto themselves. The field is being perverted into a fetish in which the computers are the focus and problem solving is secondary. This is where we come back to a necessary focus on algorithms. Algorithms are fundamentally about solving problems, and algorithm research is about better, faster, more efficient problem solving. Everything we do in scientific computing runs through an algorithm instantiated in software; without the algorithms and software, the computers are worthless. Without the model being solved, and its connection to physics and engineering, the value to society is questionable. The combination of algorithm and model is an expression of human intellect and problem solving. It needs a capable computer to allow the solution, but the essence is all human. We have lost sight of the computer’s place as a tool; it should never be an end unto itself. Yet that is what it has become.

Any sufficiently advanced technology is indistinguishable from magic.

― Arthur C. Clarke

At the end of the 20th Century a list of the top ten algorithms was published (Dongarra, Jack, and Francis Sullivan. “Guest Editors’ Introduction: The Top 10 Algorithms.” Computing in Science & Engineering 2.1 (2000): 22-23); a sketch of the first entry follows the list:

1. 1946: The Metropolis Algorithm for Monte Carlo.
2. 1947: Simplex Method for Linear Programming.
3. 1950: Krylov Subspace Iteration Method.
4. 1951: The Decompositional Approach to Matrix Computations.
5. 1957: The Fortran Optimizing Compiler.
6. 1959: QR Algorithm for Computing Eigenvalues.
7. 1962: Quicksort Algorithms for Sorting.
8. 1965: Fast Fourier Transform.
9. 1977: Integer Relation Detection.
10. 1987: Fast Multipole Method.

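As promised, here is a minimal sketch of the first entry, the Metropolis algorithm, sampling a one-dimensional standard normal distribution; the random-walk proposal, its width, and the step count are illustrative choices, not details from the 1946 work.

```python
# Minimal sketch: random-walk Metropolis sampling of a standard normal.
import math
import random

def metropolis(log_density, x0, steps, width=1.0):
    """Random-walk Metropolis sampler for an unnormalized log density."""
    x, samples = x0, []
    for _ in range(steps):
        proposal = x + random.uniform(-width, width)
        # Accept with probability min(1, p(proposal)/p(x)), done in log space.
        if math.log(random.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, steps=100_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"sample mean {mean:+.3f}, variance {var:.3f}  (target: 0, 1)")
```

The whole method lives in the accept/reject line: uphill proposals are always taken, downhill ones with probability equal to the density ratio, and that alone is enough to make the samples follow the target distribution.
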
One can argue for a few differences in the list (finite elements, shock capturing, multigrid, cryptography, …), but the bottom line is that scientific computing dominates it. What about since the turn of the 21st Century? The algorithmic heavy hitters are Google, Facebook, Netflix, encryption, iPhone apps, … If that top ten list were redone now, Google’s PageRank would almost certainly take one of the places. Scientific computing has shrunk from the algorithmic limelight, and commercial interests have leapt to the fore. The intellectual core of scientific computing has committed itself to utilizing these massive computers instead of solving problems better, or smarter. It is a truly tragic loss of leadership, and immensely short-sighted.
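
For flavor, here is a minimal sketch of PageRank by power iteration; the four-page link graph and the 0.85 damping factor are illustrative assumptions, and Google’s production system is of course vastly more elaborate.

```python
# Minimal sketch: PageRank by power iteration on a tiny hypothetical web.
damping = 0.85
links = {          # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate the rank update until it (approximately) settles
    new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page in sorted(rank, key=rank.get, reverse=True):
    print(f"{page}: {rank[page]:.3f}")
```

Each iteration redistributes every page’s rank across its outgoing links; the damping term models a surfer who occasionally jumps to a random page, which also guarantees the iteration converges.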

…invention is a somewhat erratic thing.

― J. Robert Oppenheimer

The key to progress is balance, coupled with faith in the human intellect and its power to create. These creations are wonders, and computers are among them, but they are machines, merely tools. As tools they are only as good as what controls them: the algorithms and the software. I am convinced that breakthroughs are still possible; all that is needed is the focus and resources so that great minds can prevail. The modern world of computing offers vast opportunities for science that remain unexplored. Current leadership only seems to see the same path we have taken in the past. It seems like a low-risk path, but in fact it carries the highest risk possible: the loss of potential. The lesson from commercial computing is there to be seen, plain as day: algorithms rule. All we need to do is pay attention to what is sitting right in front of us.
