In working on an algorithm (or a code) we are well advised to think carefully about requirements and priorities. In my experience these can be stated clearly as a set of adjectives about the method or code, the set that forms the title of this post. Moreover, the order of these adjectives is the order of the priorities of the users of the code. The priorities of those funding code development are quite often the direct opposite!

There is nothing quite so useless as doing with great efficiency something that should not be done at all.

– Peter Drucker

None of these priorities can be ignored. For example, if the efficiency becomes too poor, the code won’t be used because time is money. A code that is too inaccurate won’t be used no matter how robust it is (the two go together, with accurate and robust being a sort of “holy grail”).

Extraordinary claims require extraordinary evidence.

― Carl Sagan

The problem I’d like to bring to your attention is that the people handing out money to do the work seem to invert these priorities. This creates a distinct problem for making the work useful and impactful. The over-riding concern is the high-performance computing imperative, which is encoded in the call for efficiency. There is an operating assumption that all of the other characteristics are well in hand, and it’s just a matter of getting a computer (big and) fast enough to crush our problems out of sight and out of mind. Ironically, the meeting devoted to this dim-sighted worldview is this week, SC14. Thankfully, I’m not there.

All opinions are not equal. Some are a very great deal more robust, sophisticated and well supported in logic and argument than others.

― Douglas Adams

Robust. A robust code runs to completion. Robustness in both its most refined and its crudest sense is stability. The refined sense of robustness is the numerical stability that is so keenly important, but it is so much more. A robust code gives an answer come hell or high water, even if that answer is complete crap. Nothing upsets your users more than no answer; a bad answer is better than none at all. Making a code robust is hard work, and it is especially difficult if you have morals and standards. It is an imperative.

Physical. Getting physical answers is often thought of as being equal to robustness, and it should be. It isn’t, so this needs to be a separate category. In a sense physical answers are a better source of robustness, that is, an upgrade. Examples: the density of a material must remain positive, velocities must remain sub-luminal, and things like that. A deeper take on physicality involves quantities staying inside imposed bounds, or the satisfaction of the second law of thermodynamics.
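
To make this concrete, here is a minimal sketch of what a physicality guard on a density field can look like; the floor value and the repair strategy are assumptions for illustration, not a prescription from any particular code.

```python
import numpy as np

DENSITY_FLOOR = 1.0e-12  # assumed floor; a real code would tie this to problem scaling

def enforce_positive_density(rho):
    """Clip a density field so it stays positive, reporting violations.

    This is the crude 'test, check, correct' style of physicality
    enforcement discussed here: ugly, but it keeps the code running.
    """
    bad = rho < DENSITY_FLOOR
    n_bad = int(np.count_nonzero(bad))
    rho_fixed = np.where(bad, DENSITY_FLOOR, rho)
    return rho_fixed, n_bad

rho = np.array([1.0, 0.5, -0.01, 2.0])   # one unphysical cell
rho, n_fixed = enforce_positive_density(rho)
print(rho, "cells repaired:", n_fixed)
```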

Robustness and physicality of solutions with a code are ideally delivered by the basic properties of an algorithm. Upwind differencing is a good example where this happens (ideally); see the sketch below. The reality is that achieving these goals usually takes a lot of tests and checks, with corrective actions wherever robustness or physicality is risked or violated. This makes for ugly algorithms and ugly codes. It makes purists very unhappy. Making them happy is the place where the engineering becomes science (or math).
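
For the upwind example, here is a minimal sketch of first-order upwind differencing for linear advection (the scheme is standard; this particular implementation is mine, for illustration): under the CFL condition each update is a convex combination of neighboring values, so the solution stays inside the bounds of the initial data with no extra checks at all.

```python
import numpy as np

def upwind_step(u, cfl):
    """One step of first-order upwind for u_t + a*u_x = 0 with a > 0,
    periodic boundaries. cfl = a*dt/dx must satisfy 0 <= cfl <= 1.

    u_new[i] = (1-cfl)*u[i] + cfl*u[i-1] is a convex combination,
    so no new maxima or minima can appear: robustness and physicality
    are built into the algorithm itself.
    """
    assert 0.0 <= cfl <= 1.0, "CFL violated: scheme no longer bound-preserving"
    return (1.0 - cfl) * u + cfl * np.roll(u, 1)

u = np.where(np.arange(100) < 50, 1.0, 0.0)  # a discontinuity
for _ in range(200):
    u = upwind_step(u, cfl=0.9)
print(u.min(), u.max())  # stays within [0, 1]
```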

A computational study is unlikely to lead to real scientific progress unless the software environment is convenient enough to encourage one to vary parameters, modify the problem, play around.

– Nick Trefethen

Flexible. You can write your code to solve one problem; if the problem is important enough and you solve it well enough, it might be a success. If your code can solve a lot of problems robustly (and physically) you might be even more successful. A code that is a jack-of-all-trades is usually a huge success. This requires a lot of thought, experience and effort to pull off. It usually gets into issues of meshing, material modeling, input, output, and generality in initial and boundary conditions. Usually the code gets messy in the process.
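
One generic pattern for this kind of flexibility (a sketch of a common design, not any particular code’s architecture) is a registry of interchangeable pieces, for instance boundary conditions selected by name from the input:

```python
import numpy as np

# Hypothetical registry of boundary conditions, keyed by input-deck name.
BOUNDARY_CONDITIONS = {}

def boundary_condition(name):
    """Decorator registering a boundary-condition function under a name."""
    def register(fn):
        BOUNDARY_CONDITIONS[name] = fn
        return fn
    return register

@boundary_condition("periodic")
def periodic(u):
    u[0], u[-1] = u[-2], u[1]   # ghost cells copy the opposite interior cell
    return u

@boundary_condition("reflecting")
def reflecting(u):
    u[0], u[-1] = u[1], u[-2]   # ghost cells mirror the interior
    return u

def apply_bc(u, name):
    return BOUNDARY_CONDITIONS[name](u)  # one line of driver code, any BC

u = np.linspace(0.0, 1.0, 8)
print(apply_bc(u.copy(), "periodic"))
print(apply_bc(u.copy(), "reflecting"))
```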

All that it is reasonable to ask for in a scientific calculation is stability, not accuracy.

– Nick Trefethen

Accuracy. This is where things get interesting. For the most part accuracy is in conflict with every single characteristic we discuss. It cuts against both ends of the spectrum, robustness and efficiency alike. Accuracy makes codes more prone to failure (loss of robustness), and the failures have more modes. Accuracy is expensive and makes codes run longer. It can cause more communication, or more complexity (or both). It makes numerical linear algebra harder and far more fragile. It makes boundary conditions and initial conditions harder, and encourages much more simplification of everything else. If it can be achieved, accuracy is fantastic, but it is still a huge challenge to pull off.
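
To make the accuracy–robustness conflict concrete, here is a minimal sketch (mine, for illustration) contrasting the first-order upwind step above with second-order Lax–Wendroff on the same discontinuous data: the more accurate scheme oscillates and leaves the physical bounds [0, 1].

```python
import numpy as np

def lax_wendroff_step(u, cfl):
    """One step of second-order Lax-Wendroff for u_t + a*u_x = 0,
    periodic boundaries. More accurate than upwind on smooth data,
    but it generates oscillations (new extrema) at discontinuities."""
    um, up = np.roll(u, 1), np.roll(u, -1)
    return u - 0.5 * cfl * (up - um) + 0.5 * cfl**2 * (up - 2.0 * u + um)

u = np.where(np.arange(100) < 50, 1.0, 0.0)  # same discontinuity as before
for _ in range(50):
    u = lax_wendroff_step(u, cfl=0.9)
print(u.min(), u.max())  # undershoots below 0 and overshoots above 1
```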

The fundamental law of computer science: As machines become more powerful, the efficiency of algorithms grows more important, not less.

– Nick Trefethen

Efficiency. The code runs fast and uses the computers well. This is always hard to do: a beautiful piece of code that clearly describes an algorithm turns into a giant plate of spaghetti, but runs like the wind. To get performance you end up throwing out that wonderful inheritance hierarchy you were so proud of. To get performance you get rid of all those options that you put into the code. This requirement is also in conflict with everything else. It is also the focus of the funding agencies. Almost no one is thinking productively about how all of this fits (or doesn’t fit) together. We just assume that faster supercomputers are awesome and better.
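
A toy illustration of that trade (a sketch of mine, not a claim about any particular code): the clean object-per-cell abstraction loses badly to flat arrays that say nothing about the physics but run like the wind.

```python
import time
import numpy as np

class Cell:
    """The clean abstraction: one self-describing object per cell."""
    def __init__(self, rho, v):
        self.rho, self.v = rho, v
    def momentum(self):
        return self.rho * self.v

n = 200_000
cells = [Cell(1.0, 2.0) for _ in range(n)]  # readable, slow
rho = np.full(n, 1.0)                        # opaque flat arrays, fast
v = np.full(n, 2.0)

t0 = time.perf_counter()
total_clean = sum(c.momentum() for c in cells)
t1 = time.perf_counter()
total_flat = float(np.dot(rho, v))
t2 = time.perf_counter()

print(f"objects: {t1 - t0:.4f}s, flat arrays: {t2 - t1:.4f}s")
assert abs(total_clean - total_flat) < 1e-6 * total_flat  # same answer
```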

Q.E.D.

God help us.

 
