This is the 10th edition of the HEDLA conference and the first held in Europe. The attendees are a mix of astrophysicists, plasma physicists, particle physicists, experimental physicists, and a handful of nuclear engineers (like me).

I spent this week at a conference generally outside my field. Doing this is a mixed bag: it is great for exposure to new things and new people, but it often leaves me way out of my depth. I had been invited to give a talk about V&V, since calculations are very important in both astrophysics and high energy density experiments. Important as the calculations are, the physicists’ mode of investigation seems almost intrinsically at odds with V&V. As the week of talks unfolded, I increasingly felt like I had gone back in time; the community operates much like the classical Los Alamos physics community I had come to understand while working there. I came away thinking that they need more V&V, but not in the same way applied programs need it. The interactions here are likely to be instructive for subtleties to be found elsewhere.

Physicists are so motivated to study the completeness of their models of reality that virtually no attention is left for verification. Validation in a loose sense is the focus, but it takes on an ad hoc character: models are swapped out wholesale during a computational investigation while the fidelity of the modeling to observations is assessed. These are essentially sensitivity studies, but the accepted practice lacks any systematic structure. More commonly, the whole study embodies a curiosity-driven approach. Perhaps this is generally OK; however, some of the calculations left me feeling very uneasy.
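
To make the contrast concrete, here is a minimal sketch of what a more systematic sensitivity study might look like: vary one uncertain parameter at a time over a stated range and record the effect on a scalar output. The model function, parameter names, and ranges below are hypothetical stand-ins for illustration, not anything from an actual HED code.

```python
import numpy as np

def model(params):
    """Hypothetical stand-in for a code's scalar output of interest
    (say, a shock position) as a function of uncertain model parameters."""
    drive, opacity, eos = params
    return drive**0.6 * (1.0 + 0.2 * opacity) / eos

# Nominal values and ranges for the (hypothetical) uncertain inputs:
# drive energy, opacity multiplier, EOS stiffness.
nominal = np.array([1.0, 0.5, 1.2])
ranges = [(0.8, 1.2), (0.3, 0.7), (1.0, 1.4)]

baseline = model(nominal)
for i, (lo, hi) in enumerate(ranges):
    for value in (lo, hi):
        # One-at-a-time: perturb a single parameter, hold the rest fixed,
        # and record the change in the output relative to the nominal run.
        p = nominal.copy()
        p[i] = value
        print(f"param {i}: value={value:.2f} -> change={model(p) - baseline:+.4f}")
```

The point is not the particular sampling scheme; it is that the parameters, their ranges, and the recorded responses are declared up front, rather than the model being rebuilt mid-study.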

HEDLA encompasses a broad spectrum of experimental work with astrophysical significance. A host of phenomena can be profitably examined using the modern facilities of high energy density physics. These facilities include laser fusion centers (NIF, Rochester, LMJ, …) and pulsed-power centers (MAGPIE and Z). The topics studied include dynamics such as jets and radiating shock waves, material characterization, equations of state, and so on. The goal is to understand the physics in a more controlled environment than the purely observational one astronomy offers. The problem with the approach is the difficulty of measuring quantities in the environment the experiments offer, which is generally very hot and very small. The other opportunity is the more direct validation of the physical models in the computer codes. These codes share a dual role: providing design and analysis for the experiments, and exploring astrophysical theories and concepts.

The issue with combining astrophysics with experimental physics isn’t the quality of the science. The science in this community is strong, exploratory, and interesting. The problem is that the experiments are hard to do, hard to diagnose, and painfully expensive. Under these conditions the curiosity-driven approach to science becomes problematic. Experiments need to be carefully designed, and the quantitative aspects of the work grow in priority. This clashes with the more qualitative mode of investigation that dominates astrophysics, where the key is to understand the basic principles governing observed phenomena. An example is sensitive dependence on initial conditions: the experiments could provide a measure of repeatability, except replicate experiments are never done; they are beaten out by more interesting one-of-a-kind experiments. This is in spite of the fact that replication is what would establish the true error bar for every experiment that is done, as the sketch below illustrates.
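
As a concrete illustration of what replication buys, here is a minimal sketch, with invented numbers, of how a handful of replicate shots would pin down the real error bar on a measured quantity:

```python
import numpy as np

# Invented data: the same shot repeated five times, measuring (say) a
# shock breakout time in nanoseconds. Real replicates would come from
# the facility; none of these numbers are from an actual experiment.
replicates = np.array([12.4, 13.1, 12.8, 12.2, 13.0])

mean = replicates.mean()
scatter = replicates.std(ddof=1)              # shot-to-shot variability
sem = scatter / np.sqrt(len(replicates))      # error bar on the mean

print(f"mean = {mean:.2f} ns, scatter = {scatter:.2f} ns, "
      f"error bar on the mean = {sem:.2f} ns")
```

Without the replicates there is simply no way to separate the shot-to-shot variability from the diagnostic uncertainty, and every error bar quoted for a single shot is an act of faith.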

Take, for example, core collapse supernovae, where computation has played a major role in understanding what is probably happening. Early on, pure hydrodynamic simulations could not recover the behavior apparent from observations (the mixing of elements into the envelope of the exploding star). Adding multiple physical effects has provided a better qualitative picture of what is likely happening. When the simulations added asymmetry in the initial conditions, neutrino transport coupled to the hydrodynamics, magnetic fields, and rotation, everything improved: the character of the simulations suddenly became much more like the observations. The issue of initial conditions comes up in spades here. Supernovae are difficult to make explode, and the question remains how often there are duds that don’t. We only see the supernovae that explode; the duds may happen, but we don’t see them.

The question is whether this successful approach can be used for very expensive experimental design and analysis. I’m not so sure.

Using codes in conjunction with expensive, complex experiments should naturally evoke refined V&V. V&V is natural in the sort of engineering uses of computation that experimental design engenders. Conversely, V&V seems almost unnatural for physics investigations. V&V implies a certain stability of modeling and theory that this field does not have. The careful and complete investigation of a stable model is anathema to open-ended physics investigations. In other words, the places where V&V is well grounded and natural are exactly the areas the physics research community isn’t interested in. So the key is to craft a path forward that provides better quality simulation for high energy density physics without clashing with the sorts of investigations important to the vibrancy of the community.

In a strong sense I think this is a perfect example for the flexible approach to V&V I’ve been advocating. In essence, the idea is to apply V&V in a limited and carefully defined manner, crafted to the needs of the community. The codes should probably have a greater level of foundational V&V in terms of the implementation of the basic numerical methods and physical models. Beyond that, the application-specific V&V should be far more extensive when the codes are applied to experimental design and analysis, to assure that the outcomes of the experimental work have sufficient value. On the other hand, hard-nosed V&V concepts are inappropriate for curiosity-driven astrophysics investigations. It isn’t that they couldn’t be applied, but rather that they would be potentially counterproductive. Once a mechanism is well enough established to transition to an experimental study, more V&V should kick in.
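
For the foundational end of this, here is a minimal sketch of the kind of check I mean: a refinement study that measures the observed order of accuracy of a discretization against its design order. The toy problem (a central-difference derivative of sin x) is mine for illustration; in practice the same logic would be run against exact or manufactured solutions of the code’s actual equations.

```python
import numpy as np

def approx_derivative(f, x, h):
    """Central difference: design order of accuracy is 2."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

exact = np.cos(1.0)  # exact derivative of sin(x) at x = 1
hs = [0.1, 0.05, 0.025, 0.0125]
errors = [abs(approx_derivative(np.sin, 1.0, h) - exact) for h in hs]

# Observed order from successive refinements: p = log(e1/e2) / log(h1/h2).
# Verification passes if the observed order matches the design order.
for (h1, e1), (h2, e2) in zip(zip(hs, errors), zip(hs[1:], errors[1:])):
    p = np.log(e1 / e2) / np.log(h1 / h2)
    print(f"h = {h2:.4f}: observed order = {p:.2f} (design order = 2)")
```

This sort of check is cheap, it is decoupled from any particular physics campaign, and it catches implementation errors before they contaminate experimental design.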

We also visited the French version of NIF, the LMJ, which is a CEA-run facility. We had a wonderful tour, and since I saw NIF a year or so ago, it was useful to compare notes. The facility is mostly similar, but seems more austere and less boastful. It is lower power and probably consciously avoids the word “ignition”. Interestingly, the facility is still being constructed, but overall it looks quite a bit like NIF (minus the landscaping, façade, and other window dressing). The French are much more transparent about the connection of LMJ to their defense work. In addition, the tour was dramatically more technical (although they probably have far fewer visitors).

Overall it was a good experience and gave me lots to think about. V&V should connect all the way from engineering, with a heavy hand, to physics, with a much lighter touch. Wherever codes are used seriously in design and analysis, V&V should play some role, even if it is minor. After my talk I met a blogger attending the meeting (Adam Frank, who blogs at http://www.npr.org/blogs/13.7/). He asked me about V&V and climate change. It was a good question that led to a much longer discussion. In a nutshell, my opinion is that climate science needs a serious discussion of V&V issues, but the atmosphere is so poisonous toward dialog that it will never happen. One should be able to criticize how climate science is done without being labeled a denier. Right now that cannot happen, and we are all poorer for it.
