What’s the “best” computer? By what criteria should a computer be judged? Best for what? Is it the fastest? Or the easiest to use? Or the most useful?
The most honest answer is probably the computer that is most useful, or most impactful, in how I live my life and work, so I’ll answer in that vein.
Have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.
― Steve Jobs
Details matter, it’s worth waiting to get it right.
― Steve Jobs
If I had to answer honestly, it’s probably the latest computer I bought, my new iPhone 6. It’s an absolute marvel, easy to use and useful all at once. I have a vast array of applications at hand, plus I can communicate with the entire world and access a world’s worth of information. I can consult maps, find a place to eat lunch, take and retrieve notes, find answers to questions, keep up with friends, and make new ones. It also lets me listen to music, either stored or via “radio”. It is so good that I am rarely without it. It helps me work out at the gym with an interval timer that I can program to build unique, tailored workouts. Anything that links to the “cloud” for data is even better, because the data on the iPhone matches what I see on the other platforms I use. The productivity and efficiency I can achieve are now simply stunning. The word awesome doesn’t quite do it justice. If you had given it to me ten years ago, I’d have thought aliens delivered the technology to humans.
We don’t get a chance to do that many things, and every one should be really excellent. Because this is our life.
― Steve Jobs
The fastest computer I have access to isn’t very good, or useful. It is just fast and really hard to use. In all honesty it is a complete horror show. For the most part this really fast computer is only good for crunching a lot of numbers in a terribly inefficient manner. It isn’t merely not a multi-purpose computer; it is a single-purpose computer that is quite poor at delivering that single purpose. Except for its speed, it compares poorly to the supercomputers I used over 20 years ago. I say this noting that I am not prone to nostalgia at all. Generally I favor the modern over the past by a wide margin. That makes the assessment of modern supercomputing all the more damning.
Don’t be trapped by dogma — which is living with the results of other people’s thinking.
― Steve Jobs
Your time is limited, so don’t waste it living someone else’s life.
― Steve Jobs
Unlike the iPhone with its teeming modernity, the modern supercomputer is an ever more monstrous proposition with each passing year. Plans for future supercomputers are sure to create a new breed of monsters (think Godzilla, a good name for one of the machines!) that promise to consume energy like American consumers drunk on demonstrating their God-given right to excess. They also promise to be harder to use, less reliable, and nearly impossible to program. They might just be truly evil monsters in the making. The evil being done is primarily the loss of opportunity to make modeling and simulation match the hype.
Anything worth doing, is worth doing right.
― Hunter S. Thompson
It isn’t that the hyped vision of modeling and simulation as a third way for science is so flawed; it is our approach to achieving this vision that is so counter-productive. The vision is generally sound, provided the steps we take actually lead to such an outcome. The overbearing emphasis on computing speed as the key path to a predictive modeling capability is fatally flawed. It is a path that lacks the sort of checks and balances that science needs to succeed. A faulty model cannot predict reality regardless of how fast it executes on a computer, or how refined the computational “mesh” is. Algorithmic improvements can open new applications, solve unsolved problems, and deliver gains in efficiency that pure computational speed cannot.
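The point about algorithms versus raw speed can be made concrete with a little arithmetic. The sketch below is a toy illustration of my own, not from the original: the operation counts and the 100x hardware speedup are hypothetical, but they show why a better algorithm eventually beats a faster machine running a worse one.

```python
import math

# Toy illustration (hypothetical numbers): compare the operation count of a
# quadratic algorithm on a machine 100x faster against an O(n log n)
# algorithm on the slow machine.

def ops_quadratic(n):
    # Work for an O(n^2) method, e.g. an all-pairs computation.
    return n * n

def ops_nlogn(n):
    # Work for an O(n log n) method, e.g. a fast transform or good sort.
    return n * math.log2(n)

n = 10**6                       # a million unknowns, modest by simulation standards
hardware_speedup = 100.0        # hypothetical faster machine

effective_quadratic = ops_quadratic(n) / hardware_speedup
advantage = effective_quadratic / ops_nlogn(n)
print(round(advantage))         # the better algorithm still wins by roughly 500x
```

Even granting the worse algorithm a hundredfold hardware advantage, the algorithmic improvement dominates at large problem sizes, and the gap only widens as n grows.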
It’s not like I’m all into nostalgia and history, it’s just that I can’t stand the way things are now.
The current fastest computer certainly isn’t the best supercomputer ever built. That crown lies on the head of the Crays of the ’70s, ’80s, and ’90s built by that genius Seymour Cray. In the form of the X-MP, Y-MP, C90, or Cray 2, the supercomputer reached its zenith. In relative terms these Crays were joys to use and program. They were veritable iPhones compared to the rotary phones we produce today. Just as supercomputing hit that apex of functionality and utility, massively parallel computing was born (i.e., the attack of the killer micros), and the measure of a supercomputer became speed above all else. Utility and usefulness be damned. The fully integrated software-hardware solution found in a Cray Y-MP became a relic in the wake of the “need for speed”.
Study the past if you would define the future.
― Confucius
In a sense the modern trajectory of supercomputing is quintessentially American: bigger and faster is better by fiat. Excess and waste are virtues rather than flaws. Except the modern supercomputer is not better, and not just because it doesn’t hold a candle to the old Crays. These computers just suck in so many ways; they are soulless and devoid of character. Moreover, they are already a massive pain in the ass to use, and plans are afoot to make them even worse. The unrelenting priority of speed over utility is crushing. Terrible is the only path to speed, and terrible comes at a tremendous cost too. When a colleague recently quipped that she would like to see us get a computer we actually wanted to use, I’m convinced she had the older generation of Crays firmly in mind.
The future is already here – it’s just not evenly distributed.
― William Gibson
So, who are the geniuses that created this mess?
We have to go back to the mid-1990s and the combination of computing and geopolitical issues that existed then. The path taken by the classic Cray supercomputers appeared to be running out of steam insofar as improving performance. The attack of the killer micros was defined as the path to continued growth in performance. Overall hardware functionality was effectively abandoned in favor of pure performance. That pure performance was only achieved on benchmark problems that had little in common with actual applications. Performance on real applications took a nosedive; a nosedive that the benchmarks conveniently covered up. We still haven’t woken up to this reality.
Remembrance of things past is not necessarily the remembrance of things as they were.
― Marcel Proust
Geopolitically we saw the end of the Cold War, including the cessation of nuclear weapons testing. In the United States a program including high performance computing was sold as the alternative to nuclear testing (the ASCI program, now the ASC program). This program focused on computing power as the sole determinant of success. Every other aspect of computing became a veritable afterthought, supported on a shoestring budget (modeling, methods, algorithms, and V&V). The result has been fast, unusable computers that deliver a pittance of their promised performance and a generation of codes with antiquated models and algorithms (written mostly in C++). We’ve been on this foolish path ever since, to the extent that it has become the politically correct and viable path going forward. We have lost a generation of potential scientific progress at the altar of this vacuous model for progress.
It shocks me how I wish for…what is lost and cannot come back.
― Sue Monk Kidd
Why do we choose this path when other more useful and rational approaches are available?
In the past forty-odd years we have, as a society, lost the ability to take risks even when the opportunity available is huge. The consequence of failure has come to loom larger than the opportunity for success. In computing this trend has been powered by Moore’s law, the exponential growth in computing power over the last 50 years (it’s not a law, just an observation). Under Moore’s law you just have to let time pass and computer performance will grow. It is a low-risk path to success.
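The power of simply letting time pass is easy to quantify. Here is a back-of-the-envelope sketch (my own arithmetic, assuming the commonly quoted doubling period of roughly two years; these figures are not from the original text):

```python
# Moore's "law" as arithmetic: assume transistor counts (and, loosely,
# performance) double roughly every two years. The two-year doubling
# period is the commonly quoted figure, used here as an assumption.
years = 50
doubling_period_years = 2
growth_factor = 2 ** (years // doubling_period_years)  # 2^25
print(growth_factor)  # 33554432, i.e. about 33.5 million-fold growth
```

Against a passive 33-million-fold improvement, it is easy to see why waiting looks safer than betting on a risky new model or algorithm, even when the algorithmic payoff could be larger still.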
When did the future switch from being a promise to being a threat?
― Chuck Palahniuk
Every other aspect of modeling and simulation entails far greater risk, with the opportunity to either fail outright or fail to deliver on a predictable schedule. Innovation in many of the areas critical to modeling and simulation is prone to episodic or quantum leaps in capability (especially modeling and algorithms). These areas of potential innovation are also prone to failures where ideas simply don’t pan out. Without the failures you don’t have the breakthroughs; hence the fatal nature of risk aversion. Integrated over decades of timid, low-risk behavior, we have the makings of a crisis. Our low-risk behavior has already created a vast, immeasurable gulf between what we can do today and what we should be doing today.
You realize that our mistrust of the future makes it hard to give up the past.
― Chuck Palahniuk
An aspirational goal for high performance computing would be the creation of a computing environment that means as much for scientific work as my iPhone means for how I live my life. Today we are very far from that ideal. The key to such an environment isn’t the speed of the hardware, but rather how well the hardware is integrated with the needs of the user. In high performance computing the user needs to produce scientific results, which depend far more on the fundamental character of the modeling than on the speed of the computer.
The future depends on what you do today.
― Mahatma Gandhi