Multimat2013, or My Biennial Geekfest.

In all honesty, most of you would consider every conference I go to a “Geekfest,” so that label is overdoing it.


Last week I attended a meeting of one of the communities I participate in actively.  This meeting goes by the catchy name of “Multimat,” which is shorthand for multimaterial computational hydrodynamics.  Most of the attendees work at their nation’s nuclear weapons labs, although we are getting broader attendance from countries and labs outside that special community.  This year’s meeting had a really good energy despite the seemingly overwhelming budgetary constraints in both the United States and Europe.

Why was the energy at the meeting so good, when the funding picture is so bleak?  New ideas.

That’s it: the community has new ideas to work on, and that energizes everything.  What new ideas, you ask?  Cell-centered and high-order methods for Lagrangian hydrodynamics, along with concomitant spillover from other areas of science such as optimization.

Let me explain why this might be important, given that cell-centered and high-order methods are commonplace in the aerospace community.  In fact, as I will discuss at length in a future post, these fields were intimately connected at their origins, but the ties have become estranged over the intervening decades.

Using cell-centered methods for Lagrangian hydrodynamics was long thought to be unworkable, with episodic failures over the preceding decades.  Lagrangian hydrodynamics has long followed the approach provided by the combination of John Von Neumann’s staggered mesh method, published in 1944, and the essential artificial viscosity developed by Richtmyer in 1948.*  The staggered mesh has material quantities at cell centers, but velocities (kinematics) at the cell edges.   Everything done within the confines of this community proceeded using this approach for decades (including in France, England, the Soviet Union/Russia, China, and Israel).    All of these methods are also either first- or second-order accurate.  Cell-centered approaches based upon Godunov’s method appear every so often, but are viewed as practical failures (for example, the Caveat code from Los Alamos).
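For readers outside the community, the staggered-mesh scheme is simple enough to sketch.  Here is a minimal 1-D version in Python — entirely my own illustrative code, not any production implementation.  Cell masses are fixed (that’s what makes it Lagrangian), pressure and internal energy live at cell centers, velocity lives at the nodes, and a quadratic artificial viscosity switches on only in cells that are compressing.  The coefficient `c_q` and the ideal-gas closure are assumptions made just for the sketch.

```python
import numpy as np

def vnr_step(x, u, rho, e, m, dt, gamma=1.4, c_q=2.0):
    """One explicit step of 1-D staggered-mesh Lagrangian hydrodynamics.

    x, u      : node positions and velocities, shape (N+1,)
    rho, e, m : cell density, specific internal energy, mass, shape (N,)
    c_q       : dimensionless artificial-viscosity coefficient (assumed)
    """
    p = (gamma - 1.0) * rho * e          # ideal-gas EOS at cell centers
    du = u[1:] - u[:-1]                  # velocity jump across each cell
    # Von Neumann-Richtmyer quadratic viscosity: active only in compression.
    q = np.where(du < 0.0, c_q**2 * rho * du**2, 0.0)

    # Accelerate interior nodes from the pressure + viscosity gradient;
    # each node carries half the mass of its two neighboring cells.
    ptot = p + q
    u = u.copy()
    u[1:-1] -= dt * (ptot[1:] - ptot[:-1]) / (0.5 * (m[1:] + m[:-1]))

    # Move the mesh with the node velocities (the Lagrangian step).
    x = x + dt * u

    # New density from conserved cell mass; energy from p dV work.
    rho_new = m / (x[1:] - x[:-1])
    e = e - ptot * (1.0 / rho_new - 1.0 / rho)
    return x, u, rho_new, e
```

Run on a Sod-type shock tube, this crude sketch captures the shock over a few cells — exactly the behavior the artificial viscosity was invented to provide.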

A second historical footnote is that cell-centered methods started at Los Alamos shortly after the famous Von Neumann-Richtmyer paper appeared in 1950.  By 1952 Peter Lax had introduced a cell-centered finite difference method, which we know as the Lax-Friedrichs method (really a finite volume method, but that term didn’t exist until 1973).  Godunov’s method was developed independently between 1954 and 1956.
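Lax’s scheme is simple enough to state in a couple of lines.  Here is a minimal sketch in Python for a scalar conservation law u_t + f(u)_x = 0 on a periodic grid — illustrative code of my own, not taken from any of the papers:

```python
import numpy as np

def lax_friedrichs(u, flux, dt, dx):
    """One Lax-Friedrichs step for u_t + f(u)_x = 0, periodic boundaries.

    The update averages the two neighbors (the scheme's built-in
    dissipation) and differences the flux across them.
    """
    up = np.roll(u, -1)   # u_{j+1}
    um = np.roll(u, 1)    # u_{j-1}
    return 0.5 * (up + um) - dt / (2.0 * dx) * (flux(up) - flux(um))
```

The neighbor averaging supplies the scheme’s (heavy) built-in dissipation; Godunov’s contribution was to replace that crude average with an upwind flux from a Riemann solution.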

Multimat2013 took place over five days with about 50 talks and 25 posters.  In particular, I thought the first three days were fantastic.   As I noted, a great deal of the positive energy comes from the development of cell-centered Lagrangian methods, starting with the work of Despres and Maire in France.  Similar methods have been developed from that foundation in England and the United States.  Further developments have been made to these methods with high-order approaches, including discontinuous Galerkin and high-order “traditional” finite elements.  This seems to have opened the door to high-order methods, which have been an active area worldwide since the 1980s.

This was in part the inspiration for my talk.  Recently, I attended the JRV symposium, which preceded the AIAA CFD meeting in June.  JRV stands for Jameson, Roe and Van Leer.  Bram Van Leer gave a talk that largely chided the community for referencing classical papers (he has several!) without really reading their content.  I decided to discuss one of Bram’s papers from that perspective (Journal of Computational Physics, Volume 23, 1977).  To make a long story short, the Multimat community has focused on one of the six methods in Bram’s paper.  In fact, the method has been given the name “Van Leer method” in the community of code developers represented at Multimat!  When I met Bram and relayed this, he found it off-putting, even slightly horrifying: this method is the worst of the six from some basic perspectives.  The other methods may gain a second life on new computers, but they require some effort to get up to snuff.  I focused to some degree on the fifth method, which has very nice properties and, unbeknownst to many of the researchers, has been rediscovered without reference to Van Leer’s original work.  Perhaps this method can be the topic of another future post.

Being a multimaterial conference, techniques for evolving material interfaces are of interest.  Again, the conference featured a neat mix of traditional and modern approaches, with some trends.  Part of this included the use of optimization/minimization principles for solving particularly pernicious problems.  There has also been notable improvement in level set techniques in this area.  I’ll note that Jamie Sethian once told me he thought this area provided some of the greatest challenges to level sets (in other words, they are ideally suited to the other problems they are used for).  Nonetheless, progress has been immense over the past 15 years.

Ann Mattsson gave a talk on our joint work on artificial viscosity.  It received mixed reviews, largely due to Ann’s most valuable characteristic: she isn’t one of us, in that she is an accomplished atomic physicist rather than a numerical hydrodynamics expert.  She brought her unique professional perspective to try to build artificial viscosity from the ground up.  She also started from the viewpoint of the less widely known first report on the method, written by Richtmyer in 1948.  These conditions conspire to create a functionally different perspective, and a different method, than the classical viscosity arising from the Von Neumann-Richtmyer paper.  I then took her results and put together an initial implementation of the method (I am probably significantly biased by the classical approach, which has had 65 years of use).  One other notable aspect of the Richtmyer report is that it was classified secret until 1993.   It is nothing but mathematical physics, and its classified status only robbed us of a correct history of the lineage of shock capturing methods.

To be clear, Von Neumann conceived of shock capturing as a concept, but needed Richtmyer’s contribution to make it practical.

I also gave a poster on the goings-on and progress with the code I support.  This included introducing a meme into the proceedings to explain why things are difficult.  It turns out this is a common issue (not surprising at all!).


The last two days seemed a bit less exciting, with more traditional themes taking over.  That might have been simply a function of an over-celebration of my birthday, which fortuitously fell on the night of the banquet (and of the wonderful hospitality of my fellow travelers, which led to less sleep than I normally need).

The meeting has been held in Europe previously (Paris; Oxford; Prague; Pavia, Italy; Arcachon, France), and was very well executed each time.  We Americans had a high bar to meet, and I think the organizers from Lawrence Livermore Lab did very well. The choice of San Francisco was inspired, and did a great deal to help make the meeting successful.  We managed to provide hospitality that didn’t embarrass the United States.**  So hats off to Rob Rieben, Mike Owen, and Doug Miller for a job well done.  They also had wonderful assistance from Darlene Henry and Jenny Kelley, who kept everything humming for the entire week.


Here is a picture of the sunset at the banquet.  Really beautiful (Hans and Vince got in the way).

* I will briefly note that artificial viscosity is an example of a method that regularizes a singularity, and leads to this blog’s name.

** I am often taken aback by the degree to which our European colleagues offer far greater hospitality than we Americans can.  We literally can’t match them.  It is a continual issue with working for the government, and a source of personal embarrassment.  We government contractors are required to be run “like a business” yet we offer hospitality that no business would allow.  Frankly, it is complete bullshit.