Our greatest fear should not be of failure but of succeeding at things in life that don’t really matter.

― Francis Chan

Vacations are a necessary disconnection from the drive and drain of everyday life. They also provide perspective on things that have become too commonplace in the day-to-day. My recent little vacation was no different. I loved both Germany and France, but I came to appreciate the USA more too. Europeans, and particularly Parisians, smoke in tremendous numbers, which makes a beautiful city like Paris a bit less ideal and comfortable. My wife recently had major surgery that limits her mobility, and European accommodations for disabilities are poor in comparison to the USA.

So the lesson is that despite its many issues, the USA is still a great place, and better than Europe in some very real ways. But these observations are not the topic here. The topic is another observation born of vacation time and its numerous benefits.

Bureaucracy destroys initiative. There is little that bureaucrats hate more than innovation, especially innovation that produces better results than the old routines. Improvements always make those at the top of the heap look inept. Who enjoys appearing inept?

― Frank Herbert

Another thing I noted was the irritation I feel when formality trumps substance at work. Formality has its place, but when it stifles initiative, innovation and quality, it does more harm than good. There was an activity at work that I had finished prior to leaving. Anything related to the actual work was utterly complete. It involved reviewing a body of work and noting whether those responsible had completed the work promised (in our parlance, a milestone). They had, and in keeping with current practice for such things, the bar was set so low that they would have had an almost impossible time not succeeding. Despite the relative lack of substance to the entire affair, the old-fashioned memo to management was missing (the new-fashioned memo with electronic signature was finished and delivered). Here function was subservient to form, and effort was expended in a completely meaningless way.

I had committed to taking a real vacation; I was not going to waste Paris doing useless administrative work. I screened my e-mail and told the parties involved that I would deal with it upon my return. Yet people wouldn’t let go of this. They had to have the memo signed in the old-fashioned way. In the end I thought: what if they had put as much effort into doing old-fashioned work? What if, instead of dumbing the work down to make sure of success, the work had reached far and focused on extending the state of practice? Then it might have been worth a bit of extra effort, but given the way we do work today, this administrative flourish was simply insult added to injury.

Management cares about only one thing. Paperwork. They will forgive almost anything else – cost overruns, gross incompetence, criminal indictments – as long as the paperwork’s filled out properly. And in on time.

― Connie Willis

The experience makes it clear: today we value form over function, appearances over substance. It is an apt metaphor for how things work today. We’d rather be successful doing useless work than unsuccessful doing useful work. A pathetic success is more valuable than a noble failure. Any failure induces deep and unrelenting fear. This inability to fail is hampering the success of the broader enterprise to such a degree that it threatens the quality of the entire institution (i.e., the National Lab system).

The issue of how to achieve the formality and process desired without destroying the essence of the Labs’ excellence has not been solved. Giving credit for innovative, risk-taking work, so that the inevitable failures are not penalized, would be a worthy start. In terms of impact on all the good that the Labs do, the loss of the engines of discovery they represent negatively affects national security, the economy, the environment and the general state of human knowledge. Reversing these impacts and putting the Labs firmly in the positive column would be a monumental improvement.

It is hard to fail, but it is worse never to have tried to succeed.

― Theodore Roosevelt

The epitomes of these pathetic successes are milestones. Milestones measure our programs, and seemingly they denote important, critical work. Instead of driving accomplishments, milestones sow the seeds of failure. This is not explicitly recognized failure, but the failure driven by the long-term decay of innovative, aggressive technical work. The reason is the view that milestones cannot fail for any reason, and this knowledge drives any and all risk out of the definition of the work. People simply will not take a chance on anything associated with a milestone. If we are to achieve excellence, this tendency must be reversed. Somehow we need to reward the taking of risks that powers great achievements. We are poorer as a society for allowing the current mindset to become the standard.

Without risk, we systematically accomplish less innovative, important work, and ironically package the accomplishment of relatively pathetic activities as success. To ensure success, good work is rendered pathetic so that the risk of failure is completely removed. It happens all the time, over and over. It is so completely ingrained in the system that people don’t even realize what they are doing. To make matters worse, milestones command that significant resources be committed toward their completion. So we have a multitude of sins revolving around milestones: lots of money going to execute low-risk research masquerading as important work.

Over time these milestones come to define the entire body of work. This approach to managing the work at the Labs is utterly corrosive and has aided the destruction of the Labs as paragons of technical excellence. We would be much better off if a large majority of our milestones failed, and failed because they were so technically aggressive. Instead, all our milestones succeed because the technical work is chosen to be easy. Reversing this trend requires some sophisticated thinking about success. Providing a benefit for conscientious risk-taking could help. We could still rely upon the current risk-averse thinking to provide systematic fallback positions, but we would avoid making the safe, low-risk path the default.

Only those who dare to fail greatly can ever achieve greatly.

― Robert F. Kennedy

One place where formality and substance collide constantly is the world of V&V. The conduct of V&V is replete with formality, and I generally hate that. We have numerous frameworks and guides that define how it should be conducted. Doing V&V is complex and deep, never fully defined or complete. Writing down the process for V&V is important simply for the need to grapple with the broader boundaries of what is required. It is work that I do, and continue to do, but following a framework or guide isn’t the essence of what V&V needs. An honest and forward-looking quality mindset is what V&V is about.

It is a commitment to understanding, curiosity and quality of work. All of these are distinctly lacking in our current culture of formality over substance. People can cross all the t’s and dot all the i’s, yet completely fail to do a good job. Increasingly, good work is being replaced by formality of execution without the “soul” of quality. This is what I see: lots of lip service paid to completing work within the letter of the law, and very little attention to a spirit of excellence. We have created a system that embraces formality instead of excellence as the essence of professionalism. Excellence should remain the central tenet of our professional work, with formality providing structure, but not the measure of it.

Bureaucracies force us to practice nonsense. And if you rehearse nonsense, you may one day find yourself the victim of it.

― Laurence Gonzales

Let’s see how this works in practice within V&V, and where a different perspective and commitment would yield radically different results.

I believe that V&V should first and foremost be cast as the determination of uncertainties in modeling and simulation (and necessarily experimentation as the basis for validation). Other voices speak to the need to define the credibility of the modeling and simulation enterprise, which is an important qualitative setting for V&V work. Both activities combine to provide a deep expression of commitment to excellence and due diligence that should provide a foundation for quality work.

I feel that uncertainties are the correct centering of the work in a scientific context. These uncertainties should always be quantitatively defined; that is, they should never be ZERO, but should always have a finite value. V&V should push people to make concrete quantitative estimates of uncertainty based on technical evidence accumulated through focused work. Sometimes this technical evidence is nothing more than expert judgment or accumulated experience, but most of the time it is much more. The true nature of the work done today, whether purely within the confines of customer work or research shown at conferences or in journals, does not meet these principles. The failure to meet them isn’t a small, quibbling amount, but a profound systematic failure. It isn’t really a broad moral problem, but a consequence of fundamental human nature at work. Good work does not provide a systematic benefit, and in many cases actually provides a measurable harm to those conducting it.

Today, many major sources of uncertainty in modeling, simulation or experimentation are unidentified, unstudied and systematically reported as being identically ZERO. Often this value of ZERO is simply implicit. The work doesn’t state that the uncertainty is “ZERO,” but rather fails to examine the uncertainties at all, leaving them a nonentity. In other words, the benefit of doing no work at all is getting to report a smaller uncertainty. The nature of the true uncertainty is invisible. This is a recipe for absolute disaster.

A basic principle is that doing more work should result in smaller uncertainties. This is like statistical sampling, where gathering more samples systematically produces a smaller statistical error (look at the standard error in frequentist statistics). The same thing applies to modeling or numerical uncertainty. Doing more work should always reduce uncertainty, but the uncertainty is always finite and never identically ZERO. Instead, by doing no work at all, we allow people to report ZERO as the uncertainty, and relative to that baseline, doing more work can only increase the reported uncertainty. If doing more work increases the uncertainty, the proper conclusion is that your initial estimate was too small. The current state of affairs is a huge problem that undermines progress.
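The sampling analogy can be sketched in a few lines of Python. This is my own illustration, not part of any V&V framework: the standard error of the mean shrinks roughly like one over the square root of the sample count as more data is gathered, but it never reaches zero.

```python
import math
import random

def standard_error(samples):
    """Standard error of the mean: s / sqrt(n), where s is the
    sample standard deviation. Finite for any finite n, never zero."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return math.sqrt(var / n)

random.seed(0)
population = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# More samples -> smaller (but never zero) statistical uncertainty.
for n in (10, 100, 1000, 10_000):
    print(f"n = {n:6d}  standard error ~ {standard_error(population[:n]):.4f}")
```

The point of the sketch is the trend, not the numbers: effort buys a smaller honest uncertainty, which is exactly the reward structure the essay argues we have inverted.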

Here is a very common example of how this manifests itself in practice. The vast majority of computations, for all purposes, do nothing to estimate numerical errors, and get away with reporting an effective value of ZERO for the numerical uncertainty. Instead of ZERO, if you have done little or nothing to structurally estimate uncertainties, your estimates should be larger than the truth to account for your lack of knowledge. Less knowledge should never be rewarded with a smaller reported uncertainty.
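As a sketch of what actually doing the work looks like, a standard technique is Richardson extrapolation: run the same problem on a coarse and a fine mesh and use the difference, scaled by the refinement ratio and the order of accuracy, to estimate the discretization error. The numbers below are hypothetical, and the safety factor follows the spirit of Roache's grid convergence index; treat this as a sketch, not a complete procedure.

```python
def richardson_uncertainty(f_coarse, f_fine, r, p, fs=1.25):
    """Estimate numerical uncertainty from two mesh solutions via
    Richardson extrapolation.
    f_coarse, f_fine : solution values on the coarse and fine meshes
    r  : mesh refinement ratio (h_coarse / h_fine), e.g. 2.0
    p  : assumed (or observed) order of accuracy of the scheme
    fs : safety factor inflating the estimate to hedge lack of knowledge
    """
    # Estimated discretization error on the fine mesh.
    err = abs(f_fine - f_coarse) / (r ** p - 1.0)
    return fs * err

# Hypothetical values: some output quantity computed on two meshes.
u = richardson_uncertainty(f_coarse=1.483, f_fine=1.462, r=2.0, p=2.0)
print(f"numerical uncertainty ~ {u:.4f}")  # small but finite, never ZERO
```

The result is always finite, and doing the second mesh is the modest price of being able to report it at all.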

For example, suppose you do some work and find that the numerical uncertainty is larger than your original estimate. The consequence is that your original estimate was too small, and you should learn how to avoid this problem in the future. Next time, doing more work should reward you with a smaller reportable uncertainty. You should also do the mea culpa and admit that your original estimate was overly optimistic. Remember, V&V is really about being honest about the limitations of modeling and simulation. Too often people get hung up on being able to do a complete technical assessment before reporting any uncertainty. If the full technical work cannot be executed, they end up presenting nothing at all, or ZERO.

People get away with not doing numerical error estimation in some funny ways. Here is an example that starts with the creation of the numerical model for a problem. If the model is created so that it uses all the reasonably available computing resources, it can avoid numerical error estimation in a couple of ways. Often these models are built with an emphasis on geometric resolution. Features of the problem are meshed using computational elements that are one element thick (or wide, or tall). As a result, the model cannot be (simply) coarsened to produce a cheaper model that could assist in computational uncertainty estimation. And because it has used all the reasonable resources, refining the model and completing a simulation is impossible without heroic effort. You effectively have only a single mesh resolution to work with, by fiat.
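The single-resolution-by-fiat trap matters because even the simplest verification evidence needs multiple resolutions. With solutions on three systematically refined meshes you can compute the observed order of accuracy, something a one-mesh model can never provide. The function and the manufactured numbers below are my own hypothetical sketch:

```python
import math

def observed_order(f_fine, f_med, f_coarse, r):
    """Observed order of accuracy from solutions on three meshes
    with spacings h, r*h and r**2*h (constant refinement ratio r)."""
    return math.log(abs(f_coarse - f_med) / abs(f_med - f_fine)) / math.log(r)

# Manufactured example of a second-order scheme: f(h) = 1.0 + 0.5*h**2
f_fine, f_med, f_coarse = (1.0 + 0.5 * h * h for h in (0.1, 0.2, 0.4))
p = observed_order(f_fine, f_med, f_coarse, r=2.0)
print(f"observed order of accuracy ~ {p:.2f}")
```

If the observed order agrees with the scheme's theoretical order, the error estimate is credible; a mesh that cannot be coarsened or refined forecloses even this basic check.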

Then they often claim that their numerical errors are really very small, and that any effort to estimate such small errors would be a waste of time. This sort of twisted logic demands a firm, unequivocal response. First, if your numerical error is so small, then why are you using such a computationally demanding model? Couldn’t you get by with a bit more numerical error, since it’s so small as to be negligible? Of course their logic doesn’t go there, because their main idea is to avoid doing anything, not to actually estimate the numerical uncertainty or do anything with the information. In other words, this is a work-avoidance strategy and complete BS, but there is more to worry about here.

It is troubling that people rely upon meshes where a single element defines a length scale in the problem. Almost no numerical phenomena I am aware of are resolved with a single element, with the exception of integral properties such as conservation, and only if that is built into the formulation. Every quantity associated with the single element is simply there for integral effect, and could be accommodated with even less resolution. It is almost certainly not “fully resolved” in any way, shape or form. Despite these rather obvious realities of the numerical modeling of physical phenomena, the practice persists, and in fact flourishes. The credibility of such calculations should be regarded as quite suspect without extensive evidence to the contrary.

In the end we have embraced stupidity and naivety as principles, packaged as formality and process. The form of the work, well planned and executed as advertised, has come to define quality. Our work is delivered with an unwavering over-confidence that is not supported by the quality of the foundational work. We would be far better off looking to intelligence, curiosity and sophistication, with a dose of wariness, as the basis for work. These characteristics form a foundation that naturally yields the best effort possible, rather than a systematic reduction of risk that fails to push the boundaries of knowledge and capability. We need to structure our formal processes to encourage our best, rather than frighten away the very things we depend upon for success as individuals, institutions and a people.