First, an admission: I thought it was Friday yesterday and that it was time to post.  That pretty much sums up my week, so as penance I'm giving you a bonus post.  I also had a conversation yesterday that really struck a chord with me, and it relates to yesterday's topic, legacy codes.  In a major project area I would like to step into a time machine and see where current decision-making will take us.  The conversation offered me that very opportunity.

A brief aside to introduce the topic: I'm a nuclear engineer by education, and I worked as a reactor safety engineer for three years at Los Alamos.  Lately I've returned to nuclear engineering as part of a large DOE project.  After going into great depth with modern computational science, the return to nuclear engineering has been bracing.  It is like stepping into the past.  I am trying to bring modern concepts of computational science quality to bear on the analysis of nuclear reactors.  To say it is an ill fit is a dramatic understatement; it is a major culture clash.  The nuclear engineering community's idea of quality is so antiquated that almost none of my previous 20 years of experience is helpful, and it is a source of immense frustration.  I constantly have to hold back my disgust at what I see.

A big part of the problem is the code base that nuclear engineering uses.  It is legacy code.  The standard methodology is almost always based on the way things were done in the 1970s, when the codes were written.  You get to see lots of Fortran, lots of really crude approximations, and lots of code coupling done by passing information through the file system.  Nuclear reactor analysis is almost always done with a code and model that are highly calibrated.  They are so calibrated that there isn't any data left over to validate the model.  We have no idea whether the codes are predictive (it is almost assured that they are not).
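To make the calibration-versus-validation point concrete, here is a minimal sketch of the discipline whose absence I am complaining about: hold some experiments back from the calibration so there is independent data left to test whether the code predicts anything.  Everything here is a hypothetical illustration; the commented-out names (tune_parameters, run_model, compare) are placeholders, not functions from any of the legacy codes.

```python
import random

def split_for_validation(experiments, holdout_fraction=0.3, seed=0):
    """Set aside a validation subset that the calibration step never sees."""
    rng = random.Random(seed)
    shuffled = list(experiments)
    rng.shuffle(shuffled)
    n_holdout = max(1, int(len(shuffled) * holdout_fraction))
    # The holdout slice is reserved for an independent predictive check;
    # everything else is fair game for calibration.
    return shuffled[n_holdout:], shuffled[:n_holdout]

# Hypothetical usage -- every name below is a placeholder, not a real code:
# calibration_set, validation_set = split_for_validation(all_experiments)
# params = tune_parameters(model, calibration_set)             # calibration
# error  = compare(run_model(model, params), validation_set)   # predictive test
```

If every measurement is spent tuning the model, as it is in the calibrated standard methodology, there is simply nothing left over to tell us whether the code is predictive.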

The whole thing is a giant steaming pile of crap.  The best part is that this steaming pile of crap is the mandated way of doing the analysis.  The absurd calibrated standards are written into regulations that the industry must follow.  This creates a system where nothing will ever get any better; rather than follow the best scientific approach to the analysis, we do things in the slipshod way they were done in the 1970s.  I am mindful that we didn't know any better back then and had real limitations on the methodology we could apply.  After all, a wristwatch today can beat the biggest supercomputer the world had in the early to mid-1970s.  That is a weak excuse for continuing to do things today the way we did them then, but we do.  We have to.
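For a rough sense of the wristwatch comparison, here is a back-of-the-envelope sketch.  Both figures are assumptions on my part: the Cray-1 number is its commonly quoted peak rate, and the wearable number is only an assumed order of magnitude for a modern watch processor.

```python
# Back-of-the-envelope arithmetic; both figures are rough assumptions,
# not measurements of any particular device.
CRAY_1_PEAK_FLOPS = 160e6        # Cray-1 (1976) peak, roughly 160 MFLOPS
WEARABLE_FLOPS_ASSUMED = 5e9     # assumed order of magnitude for a modern watch SoC

ratio = WEARABLE_FLOPS_ASSUMED / CRAY_1_PEAK_FLOPS
print(f"A modern wristwatch is very roughly {ratio:.0f}x a mid-1970s supercomputer.")
```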

We still use codes today whose active development ended in that era.  Some of the codes have been revamped with modern languages and interfaces, but the legacy intellectual core remains stuck in the mid-1970s (40 years ago!).  The government simply stopped funding the development of new methods and began to mandate the perpetuation of the legacy methodology.  The money for new development dried up and was replaced by maintenance of a legacy capability and a legacy analysis methodology that is unworthy of the task it is set to in the modern world.

Here is the punch line in my mind.  We are setting ourselves on a course to do the same with the analysis of nuclear weapons.  I think looking at the computational analysis of nuclear reactors gives us a "time machine" that shows the path nuclear weapons analysis is on.  We have stopped developing anything new and started to define a set of legacy capabilities that must be perpetuated.  Some want to simply port the existing codes to the next generation of computers without adding anything to the intellectual basis.  Will this create an environment just like reactor safety analysis in 15 years?  Will we be locked into the way we do things now in perpetuity?  I worry that this is where our leaders are taking us.

I believe that three major factors are at play.  One is a deep cultural milieu that is strangling scientific innovation, reducing aggregate funding as well as the emphasis on and capacity for innovation.  The United States simply lacks faith that science can improve our lives, and it acts accordingly.  The other two factors are more psychological.  The second is a belief that we have a massive sunk cost in software and that it must be preserved.  This is the same fallacy that makes people lose all their money in Las Vegas.  It is stupid, but people buy it.  Software can't be preserved; it begins to decay the moment it is written.  More tellingly, the intellectual basis of software must either grow or it begins to die.  We are creating experts in preserving past knowledge, which is very different from creating new knowledge.

Lastly, when the codes began to become useful for analysis, an anchoring bias formed.  A lot of what nuclear engineers analyze can't be seen.  As such, a computer code becomes the picture many of us have of a phenomenon.  Think about radiation transport and what it "looks" like.  We can't see it visually.  Our path to seeing it is computational simulation.  When we do "see" it, it forms a powerful mental image.  I can attest to learning about radiation transport for years and to the power of simulation to put the concept into vivid images.  This image becomes an anchoring bias that is difficult to escape.  It includes both the simulation's picture of reality and the simulation's errors and model deficiencies.  The bias means that an unambiguously better simulation will be rejected because it doesn't "look" right.  It is why legacy codes are so hard to displace.  For reactor safety, the anchoring bias has been written into regulatory law.

This resonates with my assessment of how the United States is managing to destroy its National Laboratory system through systematic mismanagement of the scientific enterprise.  In fact, the two are self-consistent.  The deeper question is why our leaders make decisions like this.  These are two cases where a collective decision has been made to mothball a technology, an important and controversial technology, as everything nuclear is.  Instead of applying the best of modern science to the mothballed technology, we mothball everything about it.

It would seem that the United States will only invest in the analysis of significant technological programs while the technology is being actively developed.  In other words, the computational tools are built only when the thing they analyze is being built too.  We do an awful job of stewardship using computation.  This is true in spades with nuclear reactors, and I fear it may be true with the awkwardly named "stockpile stewardship program."  It turns out that the entirety of the stewardship is grounded in ever-faster computers rather than a holistic, balanced approach.  We aren't making new nuclear weapons, and increasingly we aren't applying new science to their stewardship.  We aren't actually doing our best at this important job.  Instead we are holding fast to a poorly constructed, politically expedient plan laid out 20 years ago.

On the other hand, maybe it's just the United States ceding scientific leadership in yet another field.  We'll just let the Europeans and Chinese have computational science too.
