How Useful are Smoothed Operators?

To test a perfect theory with imperfect instruments did not impress the Greek philosophers as a valid way to gain knowledge.

― Isaac Asimov

Note: I got super annoyed with WordPress’s ability to parse LaTeX, so there is a lot of raw math markup in here that I gave up on. Apologies!

Last week I introduced a set of alternatives to the discontinuous intrinsic functions that provide logical functionality in programming. In the process I outlined some of the issues that arise due to discontinuous aspects of computer code operation. The utility of these alternatives applies to common regression testing in codes and the convergence of nonlinear solvers. The issue remains of how useful these alternatives actually are. My intent is to address components of this aspect of the methods this week. Do these advantageous functions provide their benefit without undermining more fundamental properties like the stability and convergence of the numerical methods? Or do we need to modify the implementation of these smoothed functions in some structured manner to assure proper behavior?

The answer to both questions is yes: they are useful, and they need some careful modification to assure correct behavior. The smoothed functions may be used, but the details do matter.

To this end we will introduce several analysis techniques to show these issues concretely. One thing to get out of the way immediately is that this analysis does not change some basic aspects of the functions. For all of the functions we have the property that the original function is recovered in an asymptotic limit; for example \mbox{softsign}(x) = \tanh(n x) becomes the original sign function as n \rightarrow \infty. Our goal is to understand the behavior of these functions within the context of a numerical method away from this limit, where we have obviously deviated substantially from the classical functions. We fundamentally want to assure that the basic approximation properties of methods are not altered in some fatal manner by their use. A big part of the toolkit will be systematic use of Taylor series approximations to make certain that the consistency of the numerical method and the order of accuracy are retained when switching from the classical functions to their smoothed versions. Consistency simply means that the approximations are valid approximations to the original differential equation (meaning the error is ordered).

There was one important detail that I misplaced during last week’s post. If one takes the definition of the sign function we can see an alternative that wasn’t explored. We have \mbox{sign}(x) = \|x\|/x = x/\|x\|. Thus we can easily rearrange this expression to give two very different regularized absolute value expressions, \|x\| = x \mbox{sign}(x) and \|x\| = x /\mbox{sign}(x). When we move to the softened sign function the behavior of the absolute value changes in substantive ways. In particular, in the cases where the softened absolute value was everywhere less than the classical absolute value, it is greater for the second interpretation and vice-versa. As a result functions like \mbox{softsign}(x) = \tanh(n x) now can produce an absolute value, \mbox{softabs}(x)=x/\mbox{softsign}(x), that doesn’t have issues with entropy conditions because the dissipation will be more than the minimum, not less. Next we will examine whether these different views have significance in truncation error.
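To make the difference concrete, here is a minimal sketch (Python, my own illustration rather than anything from last week’s post; the function names are mine) of the two interpretations with the tanh-based softened sign. The product form sits below the classical absolute value, and the quotient form sits above it.

import numpy as np

def softsign(x, n=10.0):
    # tanh-based smoothed sign; larger n gives a sharper transition
    return np.tanh(n * x)

def softabs_under(x, n=10.0):
    # |x| ~ x * softsign(x): always less than the classical |x|
    return x * softsign(x, n)

def softabs_over(x, n=10.0):
    # |x| ~ x / softsign(x): always greater than the classical |x|
    # (avoid x = 0 exactly, where the division is 0/0)
    return x / softsign(x, n)

x = 0.05
print(abs(x), softabs_under(x), softabs_over(x))
# expected ordering: softabs_under(x) <= |x| <= softabs_over(x)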

Our starting point will be the replacement of the sign function or absolute value in upwind approximations to differential equations. For entropy satisfaction and generally stable approximation the upwind approximation is quite fundamental as the foundation of robust numerical methods for fluid dynamics. We can start with the basic upwind approximation with the classical function, the absolute value in this case. We will base the analysis on the semi-discrete version of the scheme using a flux difference, u_t = - f(u)_x \approx -\frac{1}{h} \left[ f(j+1/2) - f(j-1/2) \right]. The basic upwind approximation is f(j+1/2) = \frac{1}{2} \left[ f(j) + f(j+1)\right] - \frac{1}{2} \left|a\right| \left[ u(j+1) - u(j) \right] where a is the characteristic velocity for the flux and provides the upwind dissipation. Taylor series analysis shows the discrete flux difference approximates f(u)_x - \frac{h}{2} \left|a\right| u_{xx} + {\cal O}(h^2), the second term being the upwind dissipation. We find a set of very interesting conclusions can be drawn almost immediately.
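As a concrete illustration, here is a minimal sketch (Python, my own construction; the function names and the linear advection setup are assumptions for the example) of the upwind flux with the classical absolute value and with a smoothed replacement dropped in:

import numpy as np

def softabs(a, n=2.0, h=0.01):
    # quadratically regularized absolute value, |a| ~ sqrt(a^2 + (n h)^2)
    return np.sqrt(a * a + (n * h) ** 2)

def upwind_flux(fj, fjp1, uj, ujp1, a, absfun=abs):
    # f(j+1/2) = 1/2 [f(j) + f(j+1)] - 1/2 |a| [u(j+1) - u(j)]
    return 0.5 * (fj + fjp1) - 0.5 * absfun(a) * (ujp1 - uj)

# linear advection, f(u) = a u, on a uniform mesh with spacing h
a, h = 1.0, 0.01
u = [0.0, 0.2, 0.9, 1.0]
f = [a * v for v in u]
classical = upwind_flux(f[1], f[2], u[1], u[2], a)
smoothed = upwind_flux(f[1], f[2], u[1], u[2], a, absfun=softabs)
print(classical, smoothed)  # nearly identical; this smoothed form gives slightly more dissipation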

All of the smoothed functions introduced are workable as alternatives, although some versions seem to be intrinsically better. In other words, all produce a valid, consistent first-order approximation. The functions based on analytical functions like \tanh or \mbox{erf} are valid approximations, but the amount of dissipation is always less than the classical function, leading to potential entropy violation. They approach the classical absolute value as one would expect and the deviations similarly diminish. Functions such as \mbox{softabs}(x) = x^2 /(\left|x\right| + n) or \mbox{softabs}(x) = x^2 /\sqrt{x^2+ n} result in no change in the leading order truncation error, although similarly the deviations always produce less dissipation than classical upwind. We do find that for both functions we need to modify the form of the regularization to get good behavior, to \mbox{softabs}(x) = x^2 /(\left|x\right| + n h) and \mbox{softabs}(x) = x^2 /\sqrt{x^2+ n^2 h^2}. The classical softmax function based on logarithms and exponentials behaves the same way, except it always produces more dissipation than upwinding rather than less, \mbox{softabs}(a) = \mbox{softmax}(a,-a) \ge \|a\|. This may make this functional basis better for replacing the absolute value for the purpose of upwinding. The downside to the x/\mbox{softsign}(x) form of the absolute value is the regularized sign function’s passage through hard zero, which makes the division problematic.

Let’s look at the functions useful for producing a more entropy-satisfactory result for upwinding. We find that these functions work differently than the original ones. For example the hyperbolic tangent form does not become equivalent to the upwind scheme as quickly as n \gg 0. There is a lingering departure from linearity: \mbox{softsign}(x) = x/ (\|x\| + n h) gives \mbox{softabs}(x) = x/\mbox{softsign}(x) = \|x\| + n h, a deviation proportional to the mesh spacing and n. As a result the quadratic form of the softened sign is best because of its h^2 regularization. Perhaps this is a more widely applicable conclusion, as we will see as we develop the smoothed functions further with limiters.
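To see why, a short expansion (my own, assuming \|x\| \gg n h) compares the two regularized absolute values:

\sqrt{x^2 + n^2 h^2} = \|x\|\sqrt{1 + \frac{n^2 h^2}{x^2}} = \|x\| + \frac{n^2 h^2}{2\|x\|} + {\cal O}(h^4), \qquad \left( \|x\| + n h \right) - \|x\| = n h.

The linear regularization perturbs the absolute value at first order in h and therefore contaminates the leading truncation error, while the quadratic regularization perturbs it only at second order and leaves the first-order upwind dissipation intact.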

Where utility ends and decoration begins is perfection.

― Jack Gardner

Now we can transition to looking at a more complex and subtle subject, limiters. Briefly put, limiters are nonlinear functions applied to differencing schemes to produce non-oscillatory (or monotone) solutions with higher order accuracy. Generally in this context high-order is anything above first order. We have theory that confines linear non-oscillatory methods to first-order accuracy, where upwind differencing is canonical. As a result the basic theory applies to second-order methods where a linear basis is added to the piecewise constant basis the upwind method is built on. The result is the term “slope limiter”, where the linear part, the slope, is modified by a nonlinear function. Peter Sweby produced a diagram to describe what successful limiters look like parametrically. The parameter is the non-dimensional ratio of discrete gradients, r = \frac{u(j+1) - u(j)}{u(j) - u(j-1)}. The smoothed functions described here modify the adherence to this diagram. The classical diagram has a region where second-order accuracy can be expected. It is bounded by the function \mbox{minmod}(1,r) and twice the magnitude of this function.

We can now visualize the impact of the smoothed functions on this diagram. They produce systematic changes in the diagram that lead to deviations from the ideal behavior. Realize that the ideal diagram is always recovered in the limit as the functions recover the classical form. What we see is that the classical curves are converged upon from above or below, which produces wiggles in the overall functional evaluation. My illustrations all show the functions with the regularization chosen to be unrealistically small to exaggerate the impact of the smooth functions. A bigger and more important question is whether the functions impact the order of approximation.

To finish up this discussion I’m going to look at analyzing the truncation error of the methods. Our starting point is the classical scheme’s error, which provides a viewpoint on the nature of the nonlinearity associated with limiters. What is clear about a successful limiter is its ability to produce a valid approximation to a gradient with an ordered error of at least order h = \Delta x. The minmod limiter produces a slope with truncation error \mbox{minmod}(u(j)-u(j-1), u(j+1) - u(j))/h = u_x - \frac{h}{2} \left| \frac{u_{xx}}{u_x} \right| u_x + {\cal O}(h^2). The results with different recipes for the smoothed sign function and its extension to softabs, softmin and softmax are surprising to say the least.
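To see where this expression comes from, here is a short derivation of my own, assuming a smooth region where both one-sided differences carry the sign of u_x. Expanding the differences in Taylor series,

u(j+1) - u(j) = h u_x + \frac{h^2}{2} u_{xx} + {\cal O}(h^3), \qquad u(j) - u(j-1) = h u_x - \frac{h^2}{2} u_{xx} + {\cal O}(h^3),

and minmod selects the argument of smaller magnitude, so

\mbox{minmod}\left( u(j)-u(j-1), u(j+1)-u(j) \right) = h u_x - \frac{h^2}{2} \left| u_{xx} \right| \mbox{sign}(u_x) + {\cal O}(h^3) = h \left[ u_x - \frac{h}{2} \left| \frac{u_{xx}}{u_x} \right| u_x \right] + {\cal O}(h^3).

Dividing by h gives the limited slope quoted above.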

Here is a structured summary of the options as applied to a minmod limiter, \mbox{minmod}(a,b) = \mbox{sign}(a) \max\left[ 0, \min\left( \|a\|, \mbox{sign}(a) b\right) \right] (a code sketch assembling a smoothed version follows the list):

 

  1. \mbox{softsign}(x) = \tanh(n x) and \mbox{softabs}(x) = x \tanh(n x). The gradient approximation is u_x \approx \tanh(n)\tanh(2 n) u_x + \mbox{(giant mess)}\, h u_{xx} + {\cal O}(h^2). The constant in front of the gradient approaches one very quickly as n grows.
  2. \mbox{softsign}(x) = \tanh(n x) and \mbox{softabs}(x) = x/ \tanh(n x). The gradient approximation is u_x \approx \frac{1}{2}\left(2 \coth(2 n)- \frac{1}{n} \right) \tanh(n) u_x + \mbox{(giant mess)}\, h u_{xx} + {\cal O}(h^2). The constant in front of the gradient approaches one very slowly as n grows. This smoothing is unworkable for limiters.
  3. \mbox{softsign}(x) = x/(n+\|x\|) and \mbox{softabs}(x) = x^2/(n+\|x\|). Putting a mesh dependence in the sign function results in an inconsistent gradient approximation. The gradient approximation here is u_x \approx \frac{2}{(n+1)(n+2)} u_x + \frac{3n+2n^2}{(1+n)^2(2+n)^2} h u_{xx} + {\cal O}(h^2). The leading constant goes slowly to one as n\rightarrow 0.
  4. \mbox{softsign}(x) = x/(n+\|x\|) and \mbox{softabs}(x) = n h +\|x\|. The gradient approximation is u_x \approx u_x - n h u_x + {\cal O}(h^2).
  5. \mbox{softsign}(x) = x/\sqrt{x^2 + n^2} and \mbox{softabs}(x)= x^2/\sqrt{x^2 + n^2}. Putting a mesh dependence in the sign results in an inconsistent gradient approximation. The gradient approximation is u_x \approx u_x +\mbox{(giant unordered mess)} + {\cal O}(h). This makes the approximation utterly useless in this context.
  6. \mbox{softsign}(x) = x/\sqrt{x^2 + n^2 h^2} and \mbox{softabs}(x) = \sqrt{x^2 + n^2 h^2}. The gradient approximation is u_x \approx u_x - \left[ u_x\sqrt{n^2 + \left(\frac{u_{xx}}{u_x} \right)^2} \right] h + {\cal O}(h^2).
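Putting the pieces together, here is a minimal sketch (Python, my own assembly rather than code from this post) of a softened minmod built directly from the definition above, using the quadratic h-scaled regularization of option 6:

import numpy as np

def softsign(x, n=2.0, h=0.01):
    # quadratically regularized sign, option 6 above
    return x / np.sqrt(x * x + (n * h) ** 2)

def softabs(x, n=2.0, h=0.01):
    # matching regularized absolute value
    return np.sqrt(x * x + (n * h) ** 2)

def softmin(a, b, n=2.0, h=0.01):
    return 0.5 * (a + b) - 0.5 * softabs(a - b, n, h)

def softmax(a, b, n=2.0, h=0.01):
    return 0.5 * (a + b) + 0.5 * softabs(a - b, n, h)

def softminmod(a, b, n=2.0, h=0.01):
    # minmod(a,b) = sign(a) * max(0, min(|a|, sign(a)*b)), smoothed term by term
    s = softsign(a, n, h)
    return s * softmax(0.0, softmin(softabs(a, n, h), s * b, n, h), n, h)

# close to the classical values minmod(0.3, 0.1) = 0.1 and minmod(0.3, -0.1) = 0,
# but smooth in both arguments
print(softminmod(0.3, 0.1), softminmod(0.3, -0.1))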

Being useful to others is not the same thing as being equal.

― N.K. Jemisin

Sweby, Peter K. “High resolution schemes using flux limiters for hyperbolic conservation laws.” SIAM Journal on Numerical Analysis 21, no. 5 (1984): 995-1011.

Smoothed Operators

Everything that looks too perfect is too perfect to be perfect.

― Dejan Stojanovic

This post is going to delve directly into this blog’s name, how to regularize a singularity, but in this case we are talking about an artificial one. When one is writing a computer program to solve differential equations it is easy to introduce a discontinuity into how the program operates. This is particularly true when you’re implementing an elaborate model or numerical method. In the process of taking care of special cases or making a program robust for general use, logic is employed. Should a circumstance arise that causes the program to fail, it can be detected and avoided by making a logical change in the operation. The most common way to do this uses logic in the program through an “if” statement or some sort of switch. When the “if” triggers on a floating-point value, the impact on the solution can be subtle and creates a host of practical issues.

Ability to find the answers is more important than ability to know the answers.

― Amit Kalantri

As computer program development becomes more rigorous, testing of various sorts becomes important and valuable to the quality of the work. One form of this testing is regression testing. Here the program is run through a series of usually simple problems with well-defined answers. If the program’s solution changes in some way that is unexpected, the testing should pick it up and alert the development team. In addition this testing often runs across a bunch of different computers to make sure the answers are the same, and the program works properly on all of them. For basic quality assessment and control, regression testing is essential. It is one of the hallmarks of serious, professional code development. The logic and if statements introducing discontinuous behavior into the code based on floating point numbers can wreak havoc with the testing! We can end up with a situation where the tests produce very different results because of infinitesimal changes in the numbers at some point in the calculation.

You might be asking how this can happen. This all seems rather disturbing, and it is. It is a simple matter whenever the logical decision is made on the basis of a floating-point number. Consider a simple, but common bit of (pseudo) computer code,

if (value > 0.0) then
  newvalue = value1
else
  newvalue = value2
endif

which is a very simple test that one might see with upwind finite differences. In some cases a logical switch like this might trigger an elaborate mathematical expression, or even call a very different function or subroutine. Consider the case where “value” is very near zero, but not exactly zero. In this case small differences in this nearly zero quantity will trigger completely different evaluations of the logic. For special values (especially zero) this happens all the time. If the solution depends upon a certain sequence of operations, any regression test exercising this logic could differ based on inconsequential numerical differences. As programs become more complex these branches and differences explode. Code development teams relying upon regression testing end up chasing this sort of problem over and over. It becomes a huge drain on productivity and quality.

These problems can be replicated with a set of standard functions. The logic above can be replaced by a single statement using a “sign” function,

newvalue = 0.5*(1.0 + sign(value)) * value1 + 0.5*(1.0 - sign(value)) * value2

which gives exactly the same result as the if test in the previous paragraph. It is also prone to exactly the same problems in practical testing. These issues are the tip of a proverbial iceberg. It isn’t just regression testing that suffers from these issues: if the solution method involves solving a nonlinear equation that passes through the above logic, the solution can stall and stagnate, causing solution accuracy to suffer. The same switches can produce breaks in symmetry or bifurcation of solutions near critical points. Next, I will describe ways of implementing the sign function to alleviate these problems. It turns out that there is a whole family of functions that can replace the discontinuous behavior with something continuous, and the sign function can be used to construct other functions with the same switching behavior built in. I’ve written about some of these functions before in a different context where discontinuous logical functions were replaced by differentiable functions for the purpose of conducting modified equation analysis that relies upon valid Taylor series expansions.

Here are a couple of times I’ve hit upon this topic before: https://wjrider.wordpress.com/2016/06/07/the-marvelous-magical-median/, https://wjrider.wordpress.com/2015/08/17/evolution-equations-for-developing-improved-high-resolution-schemes-part-2/, https://wjrider.wordpress.com/2016/06/22/a-path-to-better-limiters/. This post might be a good postscript to these because the techniques here can cure some of the practical ills remaining for these rather powerful methods. We see issues with solving nonlinear equations where limiters are used in discretizations, various symmetry breaking effects, and extreme sensitivity to initial conditions. As I will touch upon at the very end of this post, Riemann solvers (numerical flux functions) can also benefit from this, but some technicalities must be proactively dealt with.

Slow is smooth; smooth is fast.

― Jack Coughlin

Using the sign function we can systematically remove the switching behavior that plagues regression testing, nonlinear solutions, symmetry preservation, and extreme sensitivity to initial conditions.

For me, the first function to start this was the hyperbolic tangent. The basic behavior of this function acts to create a switch between two states based on the argument and the steepness of the transition, \mbox{softsign}(x) = \tanh(a x); as a becomes larger, the function approaches the idealized step function. It turns out that there are a number of smooth functional representations of the sign function including \mbox{softsign}(x) = \mbox{erf}(a x), \mbox{softsign}(x) = x/\left(a + \|x\| \right), and \mbox{softsign}(x) = x/\sqrt{a + x^2}. There are many others that can be derived as well as other more exotic functions. These functions are used in other fields to remove the discontinuity from a switching function (https://en.wikipedia.org/wiki/Sigmoid_function).
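For concreteness, here is a minimal sketch (Python, my own illustration, with my own function names) of these smoothed sign functions. Note the conventions differ: for the tanh and erf forms a larger a is sharper, while for the two algebraic forms a acts as a regularization and a smaller a is sharper.

import numpy as np
from scipy.special import erf

def softsign_tanh(x, a=10.0):
    return np.tanh(a * x)

def softsign_erf(x, a=10.0):
    return erf(a * x)

def softsign_rational(x, a=0.1):
    return x / (a + np.abs(x))

def softsign_sqrt(x, a=0.01):
    # here a plays the role of a squared length scale under the root
    return x / np.sqrt(a + x * x)

x = np.linspace(-1.0, 1.0, 5)
for f in (softsign_tanh, softsign_erf, softsign_rational, softsign_sqrt):
    print(f.__name__, np.round(f(x), 3))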

These functions provide a flexible foundation to build upon. As an initial example take the definition of the absolute value, \|x\| = \mbox{sign}(x) x (https://en.wikipedia.org/wiki/Sign_function). This can be rearranged in a number of useful forms, \mbox{sign}(x) = x/\|x\| = \|x\|/x. We can see that a simple smoothed version of the absolute value is \mbox{softabs}(x) = \mbox{softsign}(x) x. We can now build an entire family of softened or smoothed functions that can be differentiated (they are C_\infty). The classical versions of these functions cannot be differentiated everywhere and create a host of problems in practical programs. Other common switching functions are “min” and “max”. We can rewrite both functions as \min(a,b) = \frac{1}{2}(a+b) - \frac{1}{2}\|a-b\| and \max(a,b) = \frac{1}{2}(a+b) + \frac{1}{2}\|a-b\|. The modified smooth versions are relatively obvious, \mbox{softmin}(a,b) = \frac{1}{2}(a+b) - \frac{1}{2}\mbox{softabs}(a-b) and \mbox{softmax}(a,b) = \frac{1}{2}(a+b) + \frac{1}{2}\mbox{softabs}(a-b). From this basic set of functions we can build the backbone of the limiters, the minmod function and the marvelous magical median function. What we have removed in the process is the discontinuous switching that can wreak havoc with finite precision arithmetic.
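A minimal sketch (Python, my own) of the family built this way; the softened median uses the standard identity median(p,q,r) = max(min(p,q), min(max(p,q), r)), which is my addition here rather than something spelled out in the post.

import numpy as np

def softsign(x, a=0.01):
    return x / (a + np.abs(x))

def softabs(x, a=0.01):
    # softabs(x) = x * softsign(x)
    return x * softsign(x, a)

def softmin(p, q, a=0.01):
    return 0.5 * (p + q) - 0.5 * softabs(p - q, a)

def softmax(p, q, a=0.01):
    return 0.5 * (p + q) + 0.5 * softabs(p - q, a)

def softmedian(p, q, r, a=0.01):
    # smoothed version of median(p,q,r) = max(min(p,q), min(max(p,q), r))
    return softmax(softmin(p, q, a), softmin(softmax(p, q, a), r, a), a)

print(softmin(1.0, 2.0), softmax(1.0, 2.0), softmedian(3.0, 1.0, 2.0))
# approaches 1, 2 and 2 as the regularization a shrinks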

We note that there is a separate version of the softmin and softmax functions used in some optimization settings (https://www.johndcook.com/blog/2010/01/13/soft-maximum/, https://en.wikipedia.org/wiki/Softmax_function). This uses a combination of exponentials and logarithms to provide a continuously differentiable way to take the maximum (or minimum) of a set of arguments. My naming convention “soft” comes from the blog post where I was introduced to these ideas. This separates the idea from a “hard” max where the arguments switch based on the precision of the floating-point numbers as opposed to being continuous. For completeness the softmax uses the following expression, \mbox{softmax}(a,b) = \log\left(\exp(n a) + \exp(n b) \right)/n, which may be expanded to additional arguments without complications. By the same token we can define a “softmin” as \mbox{softmin}(a,b) = -\log\left( \exp(-n a) + \exp(-n b) \right)/n that can similarly be expanded to more arguments. In both cases the parameter n controls the sharpness of the smoothed version of the standard function; the larger the value the closer the function is to the standard function.
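A minimal sketch (Python, my own) of the exponential forms; shifting by the larger argument before exponentiating is an implementation detail I add to avoid overflow for large n, and it leaves the value unchanged.

import numpy as np

def softmax_exp(a, b, n=20.0):
    # softmax(a,b) = log(exp(n a) + exp(n b)) / n, evaluated in shifted form
    m = np.maximum(a, b)
    return m + np.log(np.exp(n * (a - m)) + np.exp(n * (b - m))) / n

def softmin_exp(a, b, n=20.0):
    # softmin(a,b) = -log(exp(-n a) + exp(-n b)) / n
    return -softmax_exp(-a, -b, n)

print(softmax_exp(1.0, 2.0), softmin_exp(1.0, 2.0))
# both approach max(a,b) = 2 and min(a,b) = 1 as n grows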

Using our previous definitions of “softmax” we can derive a new version of “softabs”. We rearrange \mbox{softmax}(a,b) = \frac{1}{2}(a+b) +\frac{1}{2} \mbox{softabs}(a-b) to derive a \mbox{softabs}(a). We start with the observation that \mbox{softmax}(a,-a) =\frac{1}{2}(a-a) +\frac{1}{2} \mbox{softabs}(a+a), therefore \mbox{softabs}(a) = \mbox{softmax}(a,-a). We find that this version of the absolute value has much different properties than the previous softened version in some important ways. The key thing about this version of the absolute value is that it is always greater than the classical absolute value function. This turns out to be useful for Riemann solvers by not violating the entropy condition. With appropriate wavespeed estimates the entropy condition will be satisfied (wavespeed estimates are out of scope for this post). By the same token this absolute value is not valuable for limiters because of the same property!
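A quick numerical check (Python sketch, my own, reusing the shifted form from the previous sketch) that softmax(a,-a) behaves as an absolute value sitting above the classical one:

import numpy as np

def softmax_exp(a, b, n=20.0):
    m = np.maximum(a, b)
    return m + np.log(np.exp(n * (a - m)) + np.exp(n * (b - m))) / n

def softabs_exp(a, n=20.0):
    # softabs(a) = softmax(a, -a) >= |a|, with softabs(0) = log(2)/n > 0
    return softmax_exp(a, -a, n)

for a in (0.0, 0.05, 1.0):
    print(a, abs(a), softabs_exp(a))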

Ultimately we want to understand whether these functions alter the basic accuracy-consistency or stability properties of the numerical methods based on the classical functions. The answer to this question is subtle, but can be found via analysis and numerical experiment. Not to belabor the details, but we use series expansions and discover that with appropriate regularization of the smoothed functions, we can use them to replace the classical functions and not undermine the accuracy of the discretization. This has been confirmed for the softened version of the “minmod” limiter. A downside of the softened limiters is small deviations from idealized monotonicity-preserving behavior.

Finally, as alluded to earlier in the post, we can also use these functions to modify Riemann solvers. The first code example can form the logical basis for upwind bias with a finite difference by choosing a one-sided difference based upon the sign of the characteristic velocity. When Riemann solvers are examined we see that either “if” statements are used or, when full flux functions are used, absolute values appear (the flux is in a general sense a characteristic quantity multiplied by the characteristic velocity); the absolute value of the characteristic velocity introduces the sign convention the “if” statement provides.
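As a sketch of that idea (Python, my own construction; the stencil values and names are made up for the example), the sign of the characteristic velocity can blend the two one-sided differences, and a smoothed sign makes the selection continuous:

import numpy as np

def softsign(x, a=1.0e-3):
    return x / np.sqrt(x * x + a * a)

def upwind_derivative(um1, u0, up1, vel, h, signfun=np.sign):
    # vel > 0 selects the backward difference, vel < 0 the forward difference
    s = signfun(vel)
    return 0.5 * (1.0 + s) * (u0 - um1) / h + 0.5 * (1.0 - s) * (up1 - u0) / h

h = 0.1
um1, u0, up1 = 0.0, 0.5, 1.5
print(upwind_derivative(um1, u0, up1, 1.0, h))               # backward difference, 5.0
print(upwind_derivative(um1, u0, up1, -1.0, h))              # forward difference, 10.0
print(upwind_derivative(um1, u0, up1, 1.0e-4, h, softsign))  # smooth blend near zero velocity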

The lingering problem with this approach is the concept of entropy-violating approximations. This issue can easily be explained by looking at the smooth sign function compared with the standard form. Since the dissipation in the Riemann solver is proportional to the characteristic velocity, we can see that the smoothed sign function is everywhere less than the standard function, resulting in less dissipation. This is a stability issue analogous to concerns around limiters where these smoothed functions are slightly more permissive. Using the exponential version of “softabs”, where the value is always greater than the standard absolute value, can modulate this permissive nature.

Let us study things that are no more. It is necessary to understand them, if only to avoid them.

― Victor Hugo

Reality can’t be substituted

 

It doesn’t matter how beautiful your theory is … If it doesn’t agree with experiment, it’s wrong.

― Richard Feynman

Bill’s corollary: It doesn’t matter how massive your calculation is … If it doesn’t agree with experiment, it’s wrong.

The real world is complex, dangerous and expensive. It is also where mystery lives and the source of knowledge. There seems to be some desire to use computers, modeling and simulation to replace our need for dealing with the real world. This is untenable from many different perspectives and misplaces the proper role of everything possible via computing. Worse yet, computing cannot be a replacement for reality, but rather is simply a tool for dealing with it better. In the final analysis the real world still needs to be in the center of the frame. Computing needs to be viewed in the proper context and this perspective should guide our actions in its proper use.

Experiment is the sole source of truth. It alone can teach us something new; it alone can give us certainty.

― Henri Poincaré

We see the confluence of many things in our attitudes toward computing. It is a new thing constantly unveiling new power and possible ways of changing our lives. In many ways computing is driving enormous change societally and creating very real stress in the real world. These stresses are stoking fears and lots of irrational desire to control dangers and risks. All of this control is expensive, and drives an economy of fear. Fear is very expensive. Trust, confidence and surety are cheap and fast. One totally irrational way to control fear is to ignore it, allowing reality to be replaced. For people who don’t deal with reality well, the online world can be a boon. Still the relief from a painful reality ultimately needs to translate to something tangible physically. We see this in an over-reliance on modeling and simulation in technical fields. We falsely believe that experiments and observations can be replaced, and that the human endeavor of communication can be done away with through electronic means. In the end reality must be respected, and people must be engaged in conversation. Computing only augments, but never replaces the real world, or real people, or real experience. This perspective is a key realization in making the best use of technology.

The real world is where the monsters are.

― Rick Riordan

In science we must always remember that understanding reality is the fundamental objective. Theory acts to explain what we see, but observation always rules supreme in defining the validity of knowledge and understanding. We must always remember that computing is a tool that augments theory. It never replaces theory, nor can it replace experiments or observation. A computational simulation can never be better than the model that theory has provided it. If the theory is lacking (and it always is), more computing cannot rescue it. No amount of computing can fill in the gap between what is and isn’t known. It is a new and powerful tool to be wielded with care and skill, but a tool. These perspectives seem to be lost on so many people who see computing as some sort of silver bullet that transcends these simple truths.

It is sometimes an appropriate response to reality to go insane.

― Philip K. Dick

While computing isn’t a silver bullet for making painful elements of reality go away, it is a powerful tool if wielded properly. Modeling and simulation serves as a powerful means of testing our knowledge and general capability to understand the world around us. When simulations are tested against reality and produce good results (that is, they are validated), we feel that our grasp of the how’s and why’s of the real world is at hand. If we are grounded in this understanding, the modeling and simulation can aid our ability to examine the World around us. We can optimize our observations or design experiments to more effectively examine and measure various things. A successful model can serve a wonderful role in focusing our attention toward the most important aspects of reality, or ignoring what is not essential.

More than simply assisting the design of better experiments and observations of reality, the use of modeling and simulation can provide a significant flywheel effect. All the models of reality we use are flawed at some level. In a similar vein, our observations of reality are always limited and flawed. In very good models these flaws are subtle and hard to expose. Good experiments need to be designed to expose and improve these models. We can achieve some stunning synergies if we utilize the models to design the most stringent tests of them. This is exactly the thing we can do with a well-designed program that collaborates effectively. If we examine the models we can find the parts of a physical system most sensitive to the impact of parts of the model. One way of proactively improving models is to identify where to make measurements, and what to measure to maximize the ability to prove, disprove or improve a given model. The key point is the oft-missed point that the models are always imperfect.

Reality is that which, when you stop believing in it, doesn’t go away.

― Philip K. Dick

These imperfections are rarely acknowledged in the current National dialog on high performance computing. Rather than state this rather powerful truth, we see a focus on computer power coupled to an unchanging model as the recipe for progress. Focus and attention to improving modeling is almost completely absent in the modeling and simulation world. This ignores one of the greatest truths in computing: no amount of computer power can rescue an incorrect model. These truths do little to alter the approach, although we can be sure that we will ultimately pay for the lack of attention to these basics. Reality cannot be ignored forever; it will make itself felt in the end. We could make it more important now to our great benefit, but eventually our lack of consideration will demand more attention.

A more profitable, proactive strategy would benefit everyone. Without attention many end up accommodating the model’s imperfections through heavy use of calibration. Ultimately the calibration hammer is lowered on imperfect models to render them useful and capable of influencing reality. In the wake of heavy-handed calibration we can achieve a great focus on localizing the modeling issues. In a deep sense the areas for calibration (often crude and very effective) are exactly the places for the greatest modeling improvement. Typically the calibration ends up merging multiple issues together. As a result one needs to carefully deconstruct the whole of the effects being accounted for in calibration. For example one may find a single calibration knob accounting for the effects of turbulence, inadequate constitutive relations and mesh resolution. To make progress these effects need to be separated and dealt with independently. The proper decomposition of error allows the improvement of modeling in a principled manner.

The key to utilizing simulation effectively is the recognition of what it can and cannot do. While one can experiment with computations, these experiments can only unveil secrets of the models or computations themselves. The capacity of such unveiled secrets to be meaningful in reality always involves direct comparison with observations of the real world. If the secret seen computationally is also seen in reality then a true discovery can be made. In the process the model gains credibility and validity as well. In these cases simulation and modeling can tell us where to look, and if the secret is found, we know the model is valuable and correct. If it is not found, we know the model is deficient and must be improved. The observations may or may not be sufficient for improving the model in such a way that its predictions are validated by reality.

Successful modeling and simulation implies a level of understanding that empowers humanity. The implication of understanding goes to our ability to control reality effectively through human action. If reality can be modeled its effects can be affected or accommodated through design or mitigation. The definition of success is always through validation of the model’s results against observations of the world (including carefully designed experiments). If the model can be demonstrated via verification to be solving the model we believe we are using, the validation is powerful evidence. One must recognize that the degree of understanding is always relative to the precision of the questions being asked. The more precise the question being asked is, the more precise the model needs to be. This useful tension can help to drive science forward. Specifically the improving precision of observations can spur model improvement, and the improving precision of modeling can drive observation improvements, or at least the necessity of improvement. In this creative tension the accuracy of solution of models and computer power plays but a small role.

Any physical theory is always provisional, in the sense that it is only a hypothesis: you can never prove it. No matter how many times the results of experiments agree with some theory, you can never be sure that the next time the result will not contradict the theory.

― Stephen Hawking

 

The truth hurts, but it is needed

Sometimes, these tribal affiliations push us to become better versions of ourselves. We take a long-term view, check our selfish impulses and work hard to meet the high standards of those around us.

– Seth Godin

Sometimes you read something that hits you hard. Yesterday was one of those moments while reading Seth Godin’s daily blog post (http://sethgodin.typepad.com/seths_blog/2017/03/the-best-of-us-the-worst-of-us.html). I’ve enjoyed Seth’s books and ideas, finding them easy to grasp and connect to. Like a lot of his writing, the point of the post was simple. Our associations impact us. They can bring out the best or worst in us. When I reflected on this point, the quote above came into sharp focus. Looking at my current work the quote seemed almost cruel. It was completely opposite of everything driving me today. Such a circumstance is ultimately untenable.

Writers like Godin often speak of aspirations for a better World, a better workplace that makes all of us better. My managers read these books all the time (Daniel Pink’s book “Drive” comes to mind). I’ve opined that the distance between the workplace espoused in these books and where I work is vast. The management seems to be actively working to make things worse and worse. On the other hand they are always reading these books or going to the Harvard Business Review for advice. Do they really think that they are applying anything to their actual work? It would seem to me that they are completely delusional if they think their actions follow from any of this advice.

I once worked somewhere that pushed me to be better. It was a wonderful place where I grew professionally every day. The people there were generous with their knowledge. Collaboration was encouraged. It was also a rough and tough place to work. The culture was aggressive and combative. There was plenty of bad behavior and conflict. Nonetheless it was an incubator for me. It changed me for the better and filled me with desire to improve. It was also a place that had run out of time so we systematically destroyed it. Perhaps it was a place that can’t exist in today’s world, but it would be good to create places like it that can. We should all aspire to create places that make us better, that help us grow into the best versions of ourselves.

I rewrote Godin’s quote to reflect how work is changing me (at the bottom of the post). It really says something needs to give. I worry about how many of us feel the same thing. Right now the workplace is making me a shittier version of myself. I feel that self-improvement is a constant struggle against my baser instincts. I’m thankful for a writer like Seth Godin who can push me into a vital and much needed self-reflective “what the fuck”!

Sometimes, these tribal affiliations push us to become worse versions of ourselves. We take a short-term view, give into our selfish impulses and become lazy to meet the low standards of those around us.

We are the Over-Managed and Under-Led

 

Management is doing things right; leadership is doing the right things.

― Peter F. Drucker

It’s a really incredible time to be alive. The world is going through tremendous changes in many respects. Much of the change is driven by technology and scientific breakthroughs of the past century. One might reasonably argue that the upheavals we are witnessing today are the most important since the Renaissance and the Reformation. We are seeing cultural, economic, and political changes of epic proportions across the human world. With the Internet forming a backbone of immense interconnection, and globalization, the transformations to our society are stressing people, resulting in fearful reactions. These are combining with genuine threats to humanity in the form of weapons of mass destruction, environmental damage, mass extinctions and climate change to form the basis of existential danger. We are not living on the cusp of history; we are living through the tidal wave of change. There are massive opportunities available, but the path is never clear or safe. As the news every day testifies, the present mostly kind of sucks. While I’d like to focus on the possibilities of making things better, the scales are tipped toward the negative backlash to all this change. The forces trying to stop the change in its tracks are strong and appear to be growing stronger.

People in any organization are always attached to the obsolete – the things that should have worked but did not, the things that once were productive and no longer are.

― Peter F. Drucker

Many of our institutions are under continual assault by the realities of today. The changes we are experiencing are incompatible with many of our institutional structures, such as the places I work. Increasingly this assault is met with fear. The evidence of the overwhelming fear is all around us. It finds its clearest articulation within the political world where fear-based policies abound with the rise of Nationalist anti-Globalization candidates everywhere. We see the rise of racism, religious tensions and protectionist attitudes all over the World. The religious tensions arise from an increased tendency to embrace traditional values as a hedge against change and the avalanche of social change accompanying technology, globalization and openness. Many embrace restrictions and prejudice as a solution to changes that make them fundamentally uncomfortable. This produces a backlash of racist, sexist, homophobic hatred that counters everything about modernity. In the workplace this mostly translates to a genuinely awful situation of virtual paralysis and creeping bureaucratic over-reach resulting in a workplace that is basically going nowhere fast. For someone like me who prizes true progress above all else, the workplace has become a continually disappointing experience.

All organizations are perfectly designed to get the results they are now getting. If we want different results, we must change the way we do things.

― Tom Northup

One of the most prevalent aspects of today’s world is the focus on appearances as opposed to substance. As we embrace online life and social media, we have gotten supremely fixated on superficial appearances and lost the ability to focus on substance. The way things look has become far more important than the actuality of anything. Having a reality show celebrity as the President seems like a rather emphatic exemplar of this trend. Someone who looks like a leader, but lacks most of the basic qualifications is acceptable to many people. People with actual qualifications are viewed as suspicious. The elite are rejected because they don’t relate to the common man. While this is obvious on a global scale through political upheaval, the same trends are impacting work. The superficial has become a dominant element in managing because the system demands lots of superficial input while losing any taste for anything of enduring depth. Basically, the system as a whole is mirroring society at large.

Management cares about only one thing. Paperwork. They will forgive almost anything else – cost overruns, gross incompetence, criminal indictments – as long as the paperwork’s filled out properly. And in on time.

― Connie Willis

There is nothing so useless as doing efficiently that which should not be done at all.

― Peter F. Drucker

Working within one of our “prized” National institutions has been an interesting, magical and initially empowering experience. Over the past decade or two, these institutions have been dragged down by the broader societal trends into the muck. It is no exaggeration to say that we are being slowly and surely strangled by overwhelming management oversight. The basic recipe for management of the Labs I’ve worked at is making lots and lots of rules to keep people from “fucking up”. The bottom line is that it’s fine if we really don’t accomplish anything as long as people just don’t ever fuck up. The maxim at work is don’t ever fuck up, which is the result of fear being the core motivation for everything. All of our most important institutions are withering under society-wide loss of faith and mistrust. This creates an environment where any scandal can be a direct threat to the future of the institution. This direct threat means that achievement and the very reason for the institution’s existence are lost.

The goal of management is to remove obstacles.

― Paul Orfalea

The prime institutional directive is survival and survival means no fuck ups, ever. We don’t have to do anything as long as no fuck ups happen. We are ruled completely by fear. There is no balance at all between fear-based motivations and the needs for innovation and progress. As a result our core operational principle is compliance above all else. Productivity, innovation, progress and quality all fall by the wayside to empower compliance. Time and time again decisions are made to prize compliance over productivity, innovation, progress, quality, or efficiency. Basically the fear of fuck ups will engender a management action to remove that possibility. No risk is ever allowed. Without risk there can be no reward. Today no reward is sufficient to blunt the destructive power of fear.

Our management has become all about no fuck ups, and appearances. The actual, good, productive management work that should be done is virtually entirely absent. We don’t see managers trying to figure out how to clear barriers or enable people to get work done. We see enforced compliance. We hear lots of things about formality of operations and assurance of results. This all comes down to pervasive lack of trust and fear of failure. Increasingly we can fake progress and results. Increasingly bullshit has taken the place of actual results. Even better, bullshit results are safe and entail far less risk of fuck ups. They are mostly upside without the downside, plus bullshit is in vogue! It has the benefit of sounding better than anything we are likely to achieve, and doesn’t carry the risks of real work. The end result is deep-seated corrosive forces unleashed within our institutions that are eating away at them from the inside.

The over-management is joined at the hip with a lack of leadership and direction. It is the twin force for professional drift and institutional destruction. Working at an under-led institution is like sleepwalking. Every day you go to work basically making great progress at accomplishing absolutely nothing of substance. Everything is make-work and nothing is really substantive, yet you have lots to do because of management oversight and the no fuck up rules. You make up results and produce lots of spin to market the illusion of success, but there is damn little actual success or progress. The utter and complete lack of leadership and vision is understandable if you recognize the prime motivation of fear. To show leadership and vision requires risk, and risk cannot take place without failure, and failure courts scandal. Risk requires trust and trust is one of the things in shortest supply today. Without the trust that allows a fuck up without dire consequences, risks are not taken. Management is now set up to completely control and remove the possibility of failure from the system.

Leadership and learning are indispensable to each other.

― John F. Kennedy

The capacity to achieve rewards and achievement without risk is incompatible with experience. Every day I go to work with the very explicit mandate to do what I’m told. The clear message every day is never ever fuck up. Any fuck ups are punished. The real key is don’t fuck up, don’t point out fuck ups, and help produce lots of “alternative results” or “fake breakthroughs” to help sell our success. We all have lots of training to do so that we make sure that everyone thinks we are serious about all this shit. The one thing that is absolutely crystal clear is that getting our management stuff correct is far more important than ever doing any real work. As long as this climate of fear and oversight is in place, the achievements and breakthroughs that made our institutions famous (or great) will be a thing of the past. Our institutions are all about survival and not about achievement. This trend is replicated across society as a whole; progress is something to be feared because it unleashes unknown forces, potentially scaring everyone. The resulting fear undermines trust, and without trust the whole cycle reinforces itself.

Leaders must be close enough to relate to others, but far enough ahead to motivate them.

― John C. Maxwell

Along with progress, leadership is also sacrificed at the altar of fear. Anything out of the ordinary is completely suppressed in the current environment. The ordinary can be managed and controlled; it is a known quantity. Progress and innovation produce unusual things that might have unintended consequences, making their management difficult. Something unusual is more likely to produce a fuck up and therefore it must be stopped to assure the survival imperative. Of course, innovation, progress, and the unusual can also be wonderful and produce the breakthroughs all of us celebrate. The problem is that this cannot take place without risk and the potential for things to get fucked up. This also holds for people, who must also be ordinary; the unusual people who might lead us in new directions are to be feared and controlled. The unusual is dangerous and feared. Leaders are unusual, so they too are reviled.

Start with the end in mind.

― Stephen R. Covey

A big piece of the puzzle is the role of money in perceived success. Instead of other measures of success, quality and achievement, money has become the one-size-fits-all measure of the goodness of everything. Money serves to provide the driving tool for management to execute its control and achieve broad-based compliance. You only work on exactly what you are supposed to be working on. There is no time to think or act on ideas, learn, or produce anything outside the contract you’ve made with your customers. Money acts like a straitjacket for everyone and serves to constrict any freedom of action. The money serves to control and constrain all efforts. A core truth of the modern environment is that all other principles are ruled by money. Duty to money subjugates all other responsibilities. No amount of commitment to professional duties, excellence, learning, and your fellow man can withstand the pull of money. If push comes to shove, money wins. The peer review issues I’ve written about are testimony to this problem; excellence is always trumped by money.

One of the things most acutely impacted by all of this is the ability for strategic thought, work or action. In the wake of the lack of trust and degree of control, the ability to do big things is almost completely lost. All work becomes unremittingly tactical in nature. Big ideas are lost and people can only envision committing to small things. Big ideas require a level of trust that cannot be summoned or supported. An element in this lack of trust is an obsession with reporting and careful attention to progress by the management. We see rather extensive draws of information from the depths of organizations to check on whether money is being spent properly. The entire management apparatus is engaged in getting information, but nothing is done with it. It is only used to check up on things; the whole of the management is devoted to attending to the trustworthiness of those working. The good that management might do is sacrificed, and leadership is completely absent. Without some symmetry of trust, the whole idea of leadership is vacant.

What the hell is to be done about all of this? How do we recapture progress and reject fear? How do we embrace leadership and harness management as a force for good rather than decline and decay?

I really don’t know the answer to any of these questions, but I can propose a few things that might resist these issues. Hopefully a large number of people will join together in prizing progress enough to reject fear as a prime societal motivator. The desire to live and grow will overthrow the fear of change. The forces of fear have the potential to undo so much of the good of the modern World. Those who prize modernity and the benefits of freedom and progress will reject fear as a motivator. Realizing that fear emboldens hatred and reduces the potential for good is a first step. We must recognize and reject our so-called leaders who utilize fear as a prime motivation. Every time a leader uses fear to further their agenda, we take a step backward. One of the biggest elements in this backwards march is thinking that fear and danger can be managed. Danger can only be pushed back, but never defeated. By controlling it in the explicit manner we attempt today, we only create a darker more fearsome danger in the future that will eventually overwhelm us. Instead we should face our normal fears as a requirement of the risk progress brings. If we want the benefits of modern life, we must accept risk and reject fear. We need actual leaders who encourage us to be bold and brave instead of using fear to control the masses. We need to quit falling for fear-based pitches and hold to our principles. Ultimately our principles need to act as a barrier to fear becoming the prevalent force in our decision-making.

People who don’t take risks generally make about two big mistakes a year. People who do take risks generally make about two big mistakes a year.

― Peter F. Drucker

 

You want quality? You can’t handle the quality!

 

Excellence does not come from believing in excellence, but from constant change, challenge, and improvement.

― Jeffrey Fry

Everyone wants his or her work, or work they pay for, to be high quality. The rub comes when you start to pay for the quality you want. Everyone seems to want high quality for free, and too often believes that low cost quality is a real thing. Time and time again it becomes crystal clear that high quality is extremely expensive to obtain. Quality is full of tedious, detail-oriented work that is very expensive to conduct. More importantly when quality is aggressively pursued, it will expose problems that need to be solved to reach quality. For quality to be achieved these problems must be addressed and rectified. This ends up being the rub, as people often need to stop adding capability or producing results, and focus on fixing the problems. People, customers and those paying for things tend to not want to pay for fixing problems, which is necessary for quality. As a result, it’s quite tempting to not look so hard at quality and simply do more superficial work where quality is largely asserted by fiat or authority.

Trouble cannot be avoided, you either go looking for it or it will come looking for you.

― Constance Friday

The entirety of this issue is manifested in the conduct of verification and validation in modeling and simulation. Doing verification and validation is a means to high quality work in modeling and simulation. Like other forms of quality work, it can be done well, engaging in details and running problems to ground. Thus V&V is expensive and time consuming. These quality measures take time and effort away from results, and worse yet produce doubt in the results. As a consequence the quality mindset and efforts need to have significant focus and commitment, or they will fall by the wayside. For many customers the results are all that matters; they aren’t willing to pay for more. This becomes particularly true if those doing the work are willing to assert quality without doing the work to actually assure it. In other words the customer will take work that is asserted to be high quality based on the word of those doing the work. If those doing the work are trying to do this on the cheap, we produce low or indeterminate quality work, sold as high quality work masking the actual costs.

The reality is that we do have that problem. Work of unknown quality is being asserted as high quality without the evidence of the high quality ever being produced. Generally the evidence is simply the authority of a “trusted” code user.

The largest part of the issue is the confluence of two terrible trends: increasingly naïve customers for modeling and simulation and decreasing commitment to paying for modeling and simulation quality. Part of this comes from customers who believe in modeling and simulation, which is a good thing. The “quality on the cheap” simulations create a false sense of security because it provides them financial resources. Basically we have customers who increasingly have no ability to tell the difference between low and high quality work. The work’s quality is completely dependent upon those doing the work. This is dangerous in the extreme. This is especially dangerous when the modeling and simulation work is not emphasizing quality or paying for its expensive acquisition. We have become too comfortable with the tempting quick and dirty quality. The (color) viewgraph norm, which used to be the quality standard for computational work and had faded in use, is making a comeback. A viewgraph norm version of quality is orders of magnitude cheaper than the detailed quantitative work needed to accumulate evidence. Many customers are perfectly happy with the viewgraph norm and naïvely accept results that simply look good and are asserted as high quality.

Instead of avoiding criticisms, make criticisms work for you.

― Aniekee Tochukwu Ezekiel

Perhaps an even bigger issue is the misguided notion that the pursuit of high quality won’t derail plans. We have gotten into the habit of accepting highly delusional plans for developing capability that do not factor in the cost of quality. We have allowed ourselves to bullshit the customer into believing that quality is simple to achieve. Instead the pursuit of quality will uncover issues that must be dealt with and ultimately change schedules. We can take the practice of verification as an object lesson in how this works out. If done properly verification will uncover numerous and subtle errors in codes such as bugs, incorrect implementations, boundary conditions, or error accumulation mechanisms. Sometimes the issues uncovered are deeply mysterious and solving them requires great effort. Sometimes the problems exposed require research with uncertain or indeterminate outcomes. Other times the issues overthrow basic presumptions about your capability that require significant corrections in large-scale objectives. We increasingly live in a world that cannot tolerate these realities. The current belief is that we can apply project management to the work, and produce high quality results that ignore all of this.

The way that the trip down to “quality hell” starts is the impact of digging into quality. Most customers are paying for capability rather than quality. When we allow quick and dirty means of assuring quality to be used, the door is open for the illusion of quality. For the most part the verification and validation done by most scientists and engineers is the quick, dirty and incomplete variety. We see the use of eyeball or viewgraph norm pervasively in comparing results in both verification and validation. We see no real attempt to grapple with the uncertainties in calculations or measurements to put comparisons in quantitative context. Usually we see people create graphics that have the illusion of good results, and use authority to dictate that these results indicate mastery and quality. For the most part the scientific and engineering community simply gives in to the authoritative claims despite a lack of evidence. The deeper issue with the quick and dirty verification is the mindset of those conducting it; they are working from the presumption that the code is correct instead of assuming there are problems, and collecting evidence to disprove this.

The core issue in quality is the accumulation of evidence pro and con, with the courage to accept what that evidence shows. As usual, the greater the evidence, the better the work. Graphical work is useful and powerful only when it is backed up by quantitative work. The quantitative work is the remedy for the qualitative eyeball, viewgraph, and color-video metrics so often used today. Deep quantitative studies provide the sort of evidence that cannot be ignored. If the results are good, the evidence of quality is strong. If a problem is found, the need for remedy is equally strong. In validation or verification, the creation of an error bar goes a long way to putting any quality discussion in context. The lack of an error bar casts any result adrift and lacking in context. A secondary issue is incomplete work, where full error bars are not pursued, or unfavorable results are not pursued or, worse yet, suppressed.

One key aspect of verification testing is the high degree of technical knowledge needed to conduct it in a fully professional and complete manner. The gap between quick and dirty verification using plot overlays and various forms of the viewgraph norm, and fully executed quantitative verification, is vast. Orders of magnitude of effort stand between the quick check producing a plot overlay and a fully quantitative study including rates of convergence. The plot overlay can be produced without consideration of the deep fields of technical knowledge that verification taps into. Many of the code development communities at our leading institutions are quite accepting of the quick and dirty approach as adequate. These communities produce little or no push back, and the people paying for the code development are too naïve to know the difference. The quick and dirty approach does have the virtue of providing a basic qualitative check on the code and is quite capable of unveiling catastrophic problems. One has checked the ability of the code to get some semblance of a solution to a problem with an analytical solution. The subtle issues that must be solved for high quality have lots of room to hide, and sometimes hide for decades.

At the same time, the full benefit of the analytical solution and verification is clouded by the lack of quantitative rigor. Moreover the full benefit of verification is lost in the sloppy standard of acceptance for quick and dirty verification. Serious bugs in a code can go completely unnoticed where the full quantitative approach would unmask them for removal by the development teams. If acceptance standards are weak, the quick and dirty approach is accepted as giving the appearance of verification in the absence of its professional practice. Validation practices show similar differences in sophistication, effort and cost between quick and careful execution.
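To make the distinction concrete, here is a minimal sketch (mine, not drawn from any particular code project) of the quantitative step that separates a plot overlay from real verification: measuring an observed order of convergence from error norms computed against an analytical solution. The mesh spacings and error values below are hypothetical placeholders.

```python
import numpy as np

def observed_order(h, err):
    """Fit err ~ C * h^p in a least-squares sense and return the observed order p.

    h   : mesh spacings from a systematic refinement study
    err : discrete error norms measured against the exact solution
    """
    p, _ = np.polyfit(np.log(h), np.log(err), 1)
    return p

# Hypothetical errors from refining a nominally second-order scheme.
h = np.array([1/40, 1/80, 1/160, 1/320])
err = np.array([2.1e-3, 5.4e-4, 1.4e-4, 3.5e-5])
print(f"observed order of accuracy: {observed_order(h, err):.2f}")
```

An observed order that matches the design order of the method is the kind of evidence a plot overlay can never supply; an order that falls short is exactly the sort of subtle bug signal that quick and dirty verification lets hide for decades.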

Nothing in this world is harder than speaking the truth, nothing easier than flattery.

― Fyodor Dostoyevsky

What really happens when the feedback from our customers does not accept the steps necessary for quality? We are starting to see this unfold around us. The forces behind "alternative facts" are driven by this lack of willingness to deal with reality. Why deal with truth and the reality of real problems when we can just define them away with more convenient facts? In today's world we are seeing a rise of lies, bullshit and delusion all around us. As a result, we are systematically over-promising and under-delivering on our work. We over-promise to get the money, and then under-deliver because of the realities of doing work: one cannot get maximum capability with maximum quality for discount prices. Increasingly bullshit (propaganda) fills in the space between what we promise and what we deliver. Pairing with this deep dysfunction is a systematic failure of peer review within programs. Peer review has been installed as a backstop against the tendencies outlined above. The problem is that too often peer review does not have free rein. Too often we have conflicts of interest, or controls that deliver an explicit message that the peer review had better be positive, or else.

We bring in external peer reviews filled with experts who have the mantle of legitimacy. The problem is that these experts are hired or drafted by the organizations being reviewed. Being too honest or frank in a peer review is the quickest route to losing that gig and the professional kudos that goes along with it. One bad or negative review will assure that the reviewer is never invited back. I've seen it over and over again. Anyone who provides an honest critique is never seen again. A big part of the issue is that the reviews are viewed as pass-fail tests and problems uncovered are dealt with punitively. Internal peer reviews are even worse. Again any negative review is met with disdain. The person having the audacity and stupidity to be critical is punished. This punishment is meted out with the clear message, "only positive reviews are tolerated." Positive reviews are thus mandated by threat and retribution. We have created the recipe for systemic failure.

Putting the blame on systematic wishful thinking is far too kind. High quality for a discount price is wishful thinking at best. If the drivers for this weren't naïve customers and dishonest programs, it might be forgivable. The problem is that everyone who is competent knows better. The real key to seeing where we are going is the peer review issue. By squashing negative peer review, the truth is exposed. Those doing all this substandard work know the work is poor, and simply want a system that does not expose the truth. We have created a system of rewards and punishments that allows this. Reward is all monetary, and very little positive happens based on quality. We can assert excellence without doing the hard things necessary to achieve it. As long as we allow people to simply declare their excellence without producing evidence of said excellence, quality will languish.

Do Not Lie to Yourself

We have to be honest about what we want and take risks rather than lie to ourselves and make excuses to stay in our comfort zone.

― Roy T. Bennett

 

 

Seek First to Understand

Science is not about making predictions or performing experiments. Science is about explaining.

― Bill Gaede

In the modern dogmatic view of high performance computing, the dominant theme of utility revolves around being predictive. This narrative theme is both appropriate and important, but often fails to recognize the essential prerequisites for predictive science: the need to understand and explain. In scientific computing the ability to predict with confidence is always preceded by the use of simulations to aid and enable understanding and assist in explanation. A powerful use of models is the explanation of the mechanisms leading to what is observed. In some cases simulations allow exquisite testing of models of reality, and when a model matches reality we infer that we understand the mechanisms at work in the World. In other cases we have observations of reality that cannot be explained. With simulations we can test our models or experiment with mechanisms that can explain what we see. In both cases the confidence of the traditional science and engineering community is gained through the process of simulation-based understanding.

Leadership and learning are indispensable to each other.

― John F. Kennedy

Too often in today's world we see a desire to leap over this step and move directly to prediction. This is a foolish thing to attempt, and like fools, this is exactly where we are leaping! The role of understanding in the utility of simulation is vital in building the foundation upon which prediction is laid. This has important technical and cultural imprints that should never be overlooked. The role of building understanding is deep and effective in providing a healthy culture of modeling and simulation excellence. Most essentially it builds deep bonds through satisfying curiosity within the domain science and engineering community. The experimental and test community is absolutely vital to a healthy approach, and needs a collaborative spirit to thrive. When prediction becomes the mantra without first building understanding, simulations often get put into an adversarial position. For example we see simulation touted as a replacement for experiment and observation. Instead of collaboration, simulation becomes an outright threat. This can lead to completely and utterly counter-productive competition where collaboration would serve everyone far better in almost every case.

Understanding as the object of modeling and simulation also works keenly to provide the culture of technical depth necessary for prediction. I see simulation leaping into the predictive fray without the understanding stage as arrogant and naïve. This is ultimately highly counter-productive. Rather than building on the deep trust that the explanatory process provides, any failure on the part of simulation becomes proof of the negative. In the artificially competitive environment we too often produce, the result is destructive rather than constructive. Prediction without first establishing understanding is an act of hubris, and plants the seeds of distrust. In essence, sidestepping the understanding phase of simulation use makes failures absolutely fatal to success instead of stepping-stones to excellence. This is because the understanding phase is far more forgiving. Understanding is learning and can be engaged in with a playful abandon that yields real progress and breakthroughs. It works through a joint investigation of things no one knows, and any missteps are easily and quickly forgiven. This allows competence and knowledge to be built through the acceptance of failure. Without allowing these failures, success in the long run cannot happen.

Raise your quality standards as high as you can live with, avoid wasting your time on routine problems, and always try to work as closely as possible at the boundary of your abilities. Do this, because it is the only way of discovering how that boundary should be moved forward.

― Edsger W. Dijkstra

The essence of the discussion revolves around the sort of incubator that can be created by a collaborative, learning environment focused on understanding. When the focus is understanding something, the dynamic is forgiving and open. No one knows the answer and people are eager to accept failure as long as it is an honest attempt. More importantly, when success comes it has the flavor of discovery and serendipity. The discovery takes the role of an epic win by heroic forces. After the collaboration has worked to provide new understanding and guided the true advance of knowledge, simulation sits in a place where it can be a trusted partner in the scientific or engineering enterprise. Too often in today's world we disallow this sort of organic mode of capability development in favor of an artificial project-based approach.

Our current stockpile stewardship program is a perfect example of how we have systematically screwed all this up. Over time we have created a project management structure with lots of planning, lots of milestones, lots of fear of failure, and managed to completely undermine the natural flow of collaborative science. The accounting structure and funding have grown into a noose that is destroying the ability to build a sustainable success. We divide the simulation work from the experimental or application work in ways that completely undermine any collaborative opportunity. Collaborations become forced teaming with a negative context instead of natural and spontaneous. In fact anything spontaneous or serendipitous is completely antithetical to the entire management approach. Worse yet, the newer programs have all the issues hurting the success of stockpile stewardship and have added a lot of additional program management formality. The biggest inhibition to success is the artificial barriers to multi-disciplinary simulation-experimental collaborations, and the pervasive fear of failure permeating the entire management construct. By leaping over the understanding and learning phase of modeling and simulation we are short-circuiting the very mechanisms for the most glorious successes. We are addicted to managing programs never to fail, which ironically sows the seeds of abject failure.

The problem with the current project milieu is the predetermination of what success looks like. This is then encoded into the project plans and enforced via our prevalent compliance culture. In the process we almost completely destroy the potential for serendipitous discovery. Good discovery science is driven by having rough and loosely defined goals with an acceptance of outcomes that are unknown beforehand, but that generally speaking provide immense value at the end of the projects. Today we have instituted project management that attempts to guide our science toward scheduled breakthroughs and avoid any chance of failure. The bottom line is that breakthroughs are grounded in numerous failures and course corrections that power enhanced understanding and a truly learning environment. Our current risk aversion and fear of failure are paving the road to a less prosperous and knowledgeable future.

A specific area where this dynamic is playing out with maximal dysfunctionality is climate science. Climate modeling codes are not predictive and tend to be highly calibrated to the mesh used. The overall modeling paradigm involves a vast number of submodels to include a plethora of physical processes important within the Earth's climate. In a very real sense the numerical solution of the equations describing the climate will forever be under-resolved with significant numerical error. The system of Earth's climate also involves very intricate and detailed balances between physical processes. The numerical error is generally quite a bit larger than the balance effects determining the climate, so the overall model must be calibrated to be useful.

You couldn’t predict what was going to happen for one simple reason: people.

― Sara Sheridan

In the modern modeling and simulation world this calibration then becomes the basis of very large uncertainties. The combination of numerical error and modeling error means that the true simulation uncertainty is relatively massive. The calibration assures that the actual simulation is quite close to the behavior of the true climate. The models can then be used to study the impact of various factors on climate and raise the level of understanding of climate science. This entire enterprise is highly model-driven and the level of uncertainty is quite large. When we transition over to predictive climate science, the issues become profound. We live in a world where people believe that computing should help to provide quantitative assistance for vexing problems. The magnitude of uncertainty from all sources should give people significant pause and push back against putting simulations in the wrong role. It should not, however, prevent simulation from being a key tool in understanding this incredibly complex problem.

The premier program for high performance computing simply takes all of these issues and amplifies them to an almost ridiculous degree. The entire narrative around the need for exascale computing is predicated on the computers providing predictive calculations. This is counter to the true role of computation as a modeling, learning, explanation, and understanding partner with scientific and engineering domain expertise. While this is wrong at an intrinsic level, the secondary element in the program's spiel is the supposed simplicity of moving existing codes to new, faster computers for better science. Nothing could be further from the truth on either account. Most codes are woefully inadequate for predictive science first and foremost because of their models. All the things that the exascale program ignores are the very things that are necessary for predictivity. At the end of the day this program is likely to only produce more accurately solved wrong models and do little for predictive science. To exacerbate these issues, the exascale program generally does not support the understanding role of simulation in science.

The long-term impact of this lack of support for understanding is profound. It will produce a significant issue with the ability for simulation to elevate itself to a predictive role in science and engineering. The use of computation to help with understanding difficult problems paves the way for a mature predictive future. Removing the understanding is akin to putting someone into an adult role in life without going through a complete childhood. This is a recipe for disaster. The understanding portion of computational collaboration with engineering and science is the incubator for prediction. It allows the modeling and simulation to be very unsuccessful with prediction and still succeed. The success can arise through learning things scientifically through trial and error. These trials, errors and response over time provide a foundation for predictive computation. In a very real way this spirit should always be present in computation. When it is absent, the computational efficacy will become stagnant.

In summary we have yet another case of the marketing of science overwhelming the true narrative. In the search for funding to support computing, the sales pitch has been arranged around prediction as a product. Increasingly, we are told that a faster computer is all we really need. The implied message in this sales pitch is a lack of necessity to support and pursue other aspects of modeling and simulation for predictive success. These issues are plaguing our scientific computing programs. Long-term success of high performance computing is going to be sacrificed, based on this funding-motivated approach. We can add the failure to recognize understanding, explaining and learning as key products of computation for science and engineering.

Any fool can know. The point is to understand.

― Albert Einstein


 

What Makes A Calculation Useful?

 

It is quality rather than quantity that matters.

― Seneca

The utility of calculations and scientific computing is taken to be axiomatic, yet we cannot easily articulate why a calculation is useful. By virtue of this dynamic, we also can't tell you why a calculation isn't useful either. This axiomatic belief underlies the investment by the nation into high performance computing (HPC), yet the lack of clarity on utility clouds any discussion. Worse yet, the clouding of the reasons for utility produces counter-productive investment decisions and suboptimal programs. Perhaps it is high time to wrestle with this issue and try to see our way clear to some greater clarity.

Useful calculations shed light and understanding on issues existing in the real world. The utility of calculation for scientific investigations lies in their ability to study hypotheses or (help to) explain observations. A successful model of reality implies a certain level of understanding that can be comforting, contrasted with an unsuccessful or highly calibrated model that drives new work. With sufficient confidence, the calculation allows one to play "what if" scenarios and study the impact of changes to a physical system. This guides physical studies, measurements and observations, which can yield unequivocal evidence. Computations usually do not provide this, but show the way to finding it. The same thing happens in engineering where calculations are often used in a comparative sense to understand how to optimize designs, or fix problems with existing designs. In other cases, the calculation can help to explain why things don't work, or broke, or behaved in a manner that was unexpected. For calculations to take a valued role in science and engineering, demonstrating the ability to provide these varied aspects of real-world functionality is essential. Once calculations step into the role of advisor, sage, and detective, confidence, trust and credibility follow. This dynamic is never present in any discourse on HPC, and current HPC programs almost callously disregard this legacy. This is dangerous and threatens progress.

When one looks at HPC, the press is drawn to the biggest, most expensive, most time consuming calculations and the science community allows itself to bullshit people on their utility. Plainly put, the giant calculations inhabiting press releases are simply demos at best, stunts at worst and very rarely science of any note. These hero calculations are not useful for science or engineering. As one of my most senior colleagues has quipped, single calculations will never be the right answer for hard problems. These stunts and demos are single, one-off calculations that have no established pedigree and dubious credibility. The problem is that the broader dynamic in HPC is poisoned by the devotion to the myth of utility of hero calculations. At best, these calculations are harbingers of the calculations that might be useful five to ten years from now and little else. These calculations push and pace progress in HPC, but their use for engineering and domain science is minimal.

What we have is another serious case of bullshit hurting our ability to deal with reality. In HPC, the effort and funding is chasing the biggest calculations while the important work done with smaller calculations simply fails to capture the attention and focus of the community. Increasingly the funding follows the bullshit narrative instead of the actual utility narrative, which is undermining the reality of modeling & simulation impact. The danger in the distance between focus and utility is the loss of opportunity to make HPC really matter and produce unambiguous impact. The community has allowed this fiction to persist and color funding decisions for decades. The efficacy of HPC for science and engineering is suffering as a result. The depth of the issue is great, yet clarity is actually easy to grasp.

One of the clearest issues with HPC utility is the prevalent faith that individuals are definitive in establishing credibility. Even today, the analysts conducting a calculation matter more to real world concerns using modeling & simulation than any technical work underpinning utility. The importance of the analyst also overwhelms the importance of the code itself. We persist with this false narrative around the importance of codes. One of the clearest results of this gap is the continuing lack of impact for verification and validation. In fact I see regression instead of progress in the impact of technical work on credibility, and a greater focus on the personal aspect of credibility. In other words, it is more important who does a calculation than how the work is done, although these two items are linked. This was as true 25 years ago with ASCI as it is today. The progress has not happened in large part because we let it slide, and failed to address the core issues while focusing on press releases and funding profiles. We see the truth squashed because it doesn't match rhetoric. Now we see a lack of funding and emphasis on calculation credibility in the Nation's premier program for HPC. We continue to trumpet the fiction that the bigger the calculation and computer, the more valuable a calculation is a priori.

Even today, with vast amounts of computer power, the job of modeling reality is subtle and nuanced. The modeler who conspires to represent reality on the computer still makes the lion's share of the decisions necessary for high fidelity representations of reality. All of the items associated with HPC comprise a relatively small amount of the overall load of analysis credibility. The analyst decides how to model problems in detail, including the selection of sub-models, meshes, boundary conditions, and the details included and neglected. The computer power and the mesh resolution usually end up being an afterthought and minor detail. The true overall modeling uncertainty is dominated by everything in the analyst's power. In other words, the pacing uncertainty in modeling & simulation is not HPC; it is all the decisions made by the analysts. Even with the focus on "mesh resolution" the uncertainty associated with the finite integration of the governing equations is rarely measured or estimated. We are focusing on a small part of the overall modeling & simulation capability to the exclusion of the big stuff that drives utility.
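For what it is worth, that rarely estimated numerical uncertainty is not hard to approximate once a quantity of interest has been computed on a few systematically refined meshes. The sketch below is mine, in the common Richardson-extrapolation / grid-convergence-index style; the values of the quantity of interest are hypothetical placeholders rather than results from any real calculation.

```python
import math

def gci_fine(f_coarse, f_medium, f_fine, r, Fs=1.25):
    """Estimate the observed order, an extrapolated value, and a fine-grid GCI
    from a scalar quantity of interest computed on three meshes with constant
    refinement ratio r, using safety factor Fs on the error estimate."""
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    f_extrap = f_fine + (f_fine - f_medium) / (r**p - 1.0)  # Richardson extrapolation
    gci = Fs * abs((f_medium - f_fine) / f_fine) / (r**p - 1.0)
    return p, f_extrap, gci

# Hypothetical values of a quantity of interest on coarse, medium, and fine meshes.
p, f_extrap, gci = gci_fine(0.512, 0.498, 0.494, r=2.0)
print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_extrap:.4f}, "
      f"numerical uncertainty ~ {100.0 * gci:.2f}% of the fine-grid value")
```

Even a rough estimate like this turns "the mesh is fine enough" from a declaration into a number that can be argued with.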

As usual, the issue is related to the relative sex appeal of the details in modeling & simulation. All the analyst-controlled details are dull and unexciting while HPC is sexy and exciting. The HPC things are easily marketed and receive funding while the analyst details are boring, but essential. The result is a focus on the sexy HPC stuff while the important work done by analysts goes by with meager, haphazard and disparate support. More deeply, the analyst support is defined purely through application work and generally divorced from the HPC work. As such the divide just grows and grows. Moreover the HPC side of the work can dutifully ignore the analyst stuff that matters because the political weight says that the important details matter little. In the HPC work all the glue between the computer-focused HPC work and applications is poorly funded or not funded at all.

One of the core issues in this entire dynamic associated with the utility of computational modeling and simulation is predictivity. Predictive simulations are a sort of "Holy Grail" for computational science. Yet predictive calculations are not necessarily useful. Useful computations can come from sources that are far from predictive, and the utility is far more driven by the flexibility of computational capability combined with the ability of analysts to wield the computational power. The utility, flexibility and understanding cannot come from the lumbering computational behemoths driving funding. If a calculation is predictive, so much the better for its utility. The key to predictivity is that it demands a lot of evidence and a systematic investigation, which is the whole practice of verification and validation (V&V).

Where utility ends and decoration begins is perfection.

― Jack Gardner

One of the single greatest issues is a general failure to measure prediction and modeling & simulation uncertainties in a holistic manner. Generally uncertainty estimation is limited to parametric epistemic uncertainty, which is an important, but small, part of the overall uncertainty budget. Numerical uncertainty is usually not estimated at all; instead a declaration is made regarding the lack of mesh dependence, or the massive size of the calculation is taken to render numerical errors small by fiat. In many cases systems have intrinsic variability that provides an important source of uncertainty (turbulence canonically comes to mind). This is also rarely estimated. Finally we come to the uncertainty directly associated with the analyst's decisions. When this issue has been studied, the uncertainty associated with analyst modeling decisions or analyst assumptions tends to be huge compared to other sources. The easier and common thing to do is to declare the calculation predictive by definition, which avoids any real quantification of the uncertainty.
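As an illustration of what a more holistic budget looks like, here is a minimal sketch, entirely mine and with made-up numbers, that rolls independent contributions together in quadrature in the spirit of common V&V practice. The point is not the arithmetic; it is that the terms most often left unestimated tend to dominate the total.

```python
import math

def total_uncertainty(u_numerical, u_parametric, u_variability, u_model_form):
    """Root-sum-square combination of independent uncertainty contributions,
    each expressed in the units of the quantity of interest."""
    return math.sqrt(u_numerical**2 + u_parametric**2
                     + u_variability**2 + u_model_form**2)

u_num = 0.8   # numerical (mesh/time-step) uncertainty, often never estimated
u_par = 1.5   # parametric epistemic uncertainty, usually the only term quantified
u_var = 2.0   # intrinsic variability, e.g., turbulent realizations
u_mod = 3.0   # analyst/model-form spread, typically the largest and least reported
print(f"total uncertainty ~ {total_uncertainty(u_num, u_par, u_var, u_mod):.2f}")
```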

The current HPC belief system holds that massive computations are predictive and credible solely by virtue of overwhelming computational power. In essence it uses proof by massive computation as the foundation of belief. The problem is that science and engineering do not work this way at all. Belief comes from evidence, and the evidence that matters is measurements and observations of the real World (i.e., this would be validation). Models of reality can be steered and coaxed into agreement via calibration in ways that are anathema to prediction. Part of assuring that this isn't happening is verification. We ultimately want to make sure that the calculations are getting the right answers for the right reasons. Deviations from correctness should be understood at a deep level. Part of putting everything in proper context is uncertainty quantification (UQ). UQ is part of V&V. Unfortunately UQ has replaced V&V in much of the computational science community, and the UQ that is estimated is genuinely incomplete. Now in HPC most of UQ has been replaced by misguided overconfidence.

This issue is another view of the dynamic where we have allowed alternative facts to displace reality. We are paving the road for a reality where bullshit and facts cannot be separated. It is everyone's fault for allowing this to happen. Too many of us simply comply with the need for declarative success when admission of failure would suit progress and truth far better. Too often the emphasis is placed on marketing and spin rather than the truth. In the process we have systematically undermined core principles of quality in every corner of life. Perception has been allowed to become more important than truth and reality. Into this vacuum propaganda quickly becomes the medium of discourse. We may be too far gone to fix this, and reality will bite back in a vicious manner to restore balance. This restoration will probably be very painful to experience.

At the core of the problem with bullshit as a technical medium is a general lack of trust, and an inability to accept outright failure as an outcome. This combination forms the basis for bullshit and alternative facts becoming accepted within society writ large. When people are sure they will be punished for the truth, you get lies, and finely packaged lies are bullshit. If you want the truth you need to accept it, and today the truth can get you skewered. The same principle holds for the acceptance of failure. Failures are viewed as scandals and not accepted. The flipside of this coin is the truth that failures are the fuel for progress. We need to fail to learn; if we are not failing, we are not learning. Instead of hiding, or bullshitting our way through in order to avoid being labeled failures, we avoid learning, and also corrode our foundational principles. We are locked in a tight downward spiral and all our institutions are under siege. Our political, scientific and intellectual elite are not respected because truth is not valued. False success and feeling good are accepted as an alternative to reality. In this environment bullshit reigns supreme and being useful isn't enough to be important.

Raise your quality standards as high as you can live with, avoid wasting your time on routine problems, and always try to work as closely as possible at the boundary of your abilities. Do this, because it is the only way of discovering how that boundary should be moved forward.

― Edsger W. Dijkstra

 

It is High Time to Envision a Better HPC Future

Honest differences are often a healthy sign of progress.

― Mahatma Gandhi

Last week I attended a rather large scientific meeting in Knoxville, Tennessee. It was the kickoff meeting for the Exascale Computing Project. This is a relatively huge program ($250 million/year) and the talent present at the meeting was truly astounding, a veritable who's who in computational science in the United States. This project is the crown jewel of the national strategy to retain (or recapture) pre-eminence in high performance computing. Such a meeting has all the makings for a banquet of inspiration and intellectually thought-provoking discussions along with incredible energy. Simply meeting all of these great scientists, many of whom also happen to be wonderful friends, only added to the potential. While friends abounded and acquaintances were made or rekindled, this was the high point of the week. The wealth of inspiration and intellectual discourse possible was quenched by bureaucratic imperatives, leaving the meeting a barren and lifeless launch of a soulless project.

The telltale signs of worry were all present in the lead-up to the meeting: management of the work took priority over the work itself, many traditional areas of accomplishment were simply ignored, political concerns swamped technical ones, and most damningly there was no aspirational vision. The meeting did nothing to dampen or dispel these signs, and we see a program spiraling toward outright crisis. Among the issues hampering the project is the degree of project management formality being applied, which is appropriate for benign construction projects and completely inappropriate for HPC success. The demands of the management formality were delivered to the audience much like the wasteful prep work for standardized testing in our public schools. It will almost certainly have the same mediocrity-inducing impact as that testing regime, the illusion of progress and success where none actually exists. The misapplication of this management formality is likely to provide a merciful deathblow to this wounded mutant of a program. At some point in the next couple of years we will see the project euthanized in a mercy killing.

There can be no progress without head-on confrontation.

― Christopher Hitchens

The depth of the vision problem in high performance computing (HPC) is massive. For a quarter of a billion dollars a year, one might expect an expressive and expansive vision for the future to be at the helm of the project. Instead the vision is a stale and spent version of the same approach taken in HPC for the past quarter of a century. ECP simply has nothing new to offer. The vision of computing for the future is the vision of the past. A quarter of a century ago the stockpile stewardship program came into being in the United States and the lynchpin of the program was HPC. New massively parallel computers would unleash their power and tame our understanding of reality. All that was needed then was some faster computers and reality would submit to the power of computation. Today's vision is exactly the same except the power of the computers is 1000 times greater than the computers that would unlock the secrets of the universe a quarter of a century ago. Aside from exascale replacing petascale in computing power, the vision of 25 years ago is identical to today's vision. The problem then as now is the incompleteness of the vision and fatal flaws in how it is executed. If one adds a management approach that is seemingly devised by Chinese spies to undermine the program's productivity and morale, the outcome of ECP seems assured: failure. This wouldn't be the glorious failure of putting your best foot forward seeking great things, but failure born of incompetence and almost malicious disregard for the talent at the program's disposal.

The biggest issue with the entire approach to HPC is evident in the room of scientists I sat with last week: the minds and talents of these people are not being engaged. Let's be completely clear, the room was full of immense talent with many members of the National Academies present, yet there was no intellectual engagement to speak of. How can we succeed at something so massive and difficult while the voices of those paid to work on the project are silenced? At the same time we are failing to develop an entire generation of scientists with the holistic set of activities needed for successful HPC. The balance of technical activities needed for a healthy, useful HPC capability is simply unsupported and almost actively discouraged. We are effectively hollowing out an entire generation of applied mathematicians, computational engineers and physicists, pushing them to focus more on software engineering than their primary disciplines. Today someone working in applied mathematics is more likely to focus on object-oriented constructs in C++ than functional analysis. Moreover the software is acting as a straightjacket for the mathematics, slowly suffocating actual mathematical investigations. We see important applied mathematical work avoided because software interfaces and assumptions are incompatible. One of the key aspects of ECP is the drive for everything to be expressed in software, with products as our raison d'être. We've lost the balance of software as a necessary element in checking the utility of mathematics. We now have software in ascendency, and mathematics as a mere afterthought. Seeing this unfold with the arrayed talents on display in Knoxville last week felt absolutely and utterly tragic. Key scientific questions that the vitality of scientific computing absolutely hinges upon are left hanging without attention, and progress on them is almost actively discouraged.

When people don’t express themselves, they die one piece at a time.

― Laurie Halse Anderson

At the core of this tragedy is a fatally flawed vision of where we are going as a community. It was flawed 25 years ago, and we have failed to learn from the plainly obvious lessons. The original vision of computer power uber alles is technically and scientifically flawed, but financially viable. This is the core of the problem as dysfunction; we can get a flawed program funded and that is all we need to go forward. No leadership asserts itself to steer the program toward technical vitality. The flawed vision brings in money and money is all we need to do things. This gets to the core of so many problems as money becomes the sole source of legitimacy, correctness and value. We have lost the ability to lead by principle, and make hard choices. Instead the baser instincts hold sway looking only to provide the support for little empires that rule nothing.

First, we should outline the deep flaws in the current HPC push. The ECP program is about one thing: computer hardware. The issue a quarter of a century ago is the same as it is today; the hardware alone does not solve problems or endow us with capability. It is a single element in our overall ability to solve problems. I've argued many times that it is far from being the most important element, and may be one of the lesser capabilities to support. The items of greatest importance are the models of reality we solve, followed by the methods used to solve these models. Much of the enabling efficiency of solution is found in innovative algorithms. The key to this discussion is the subtext that these three most important elements in the HPC ecosystem are unsupported and minimized in priority by ECP. The focal point on hardware arises from two elements: the easier path to funding, and the fandom of hardware among the HPC cognoscenti.

We would be so much better off if the current programs took a decisive break with the past, and looked to move HPC in a different direction. In a deep and abiding way the computer industry has been transformed in the last decade by the power of mobile computing. We have seen cellphones become the dominant factor in the industry. Innovative applications and pervasive connectivity have become the source of value and power. A vision of HPC that resonates with the direction of the broader industry would benefit from the flywheel effect instead of running counter to it. Instead of building on this base, the HPC world remains tethered to the mainframe era long gone everywhere else. Moreover HPC remains in this mode even as the laws of physics conspire against it, and efforts suffer from terrible side effects of the difficulty of making progress with the outdated approach being taken. The hardware is acting to further tax every effort in HPC, making the already threadbare support untenably shallow.

Instead of focusing on producing another class of outdated, lumbering dinosaur mainframes, the HPC effort could leap onto clear industry trends and seek a bold, resonant path. A combination of cloud-based resources coupled with connectivity could unleash ubiquitous computing and seamless integration with mobile computing forces. The ability to communicate works wonders for combining ideas, and pushing innovation ahead would do more to advance science than almost any amount of computing power conceivable. Mobile computing is focused on general-purpose use and hardly optimized for scientific use, which brings different dynamics. A specific effort to energize science through different computing dynamics could provide boundless progress. Instead of trying something distinct and new, we head back to a mine that has long since borne its greatest ore.

Progress in science is one of the most fertile engines for advancing the state of humanity. The United States with its wealth and diversity has been a leading light in progress globally. A combination of our political climate and innate limits in the American mindset seems to be conspiring to undo this engine of progress. Looking at the ECP program as a microcosm of the American experience is instructive. The overt control of all activities is suggestive of the pervasive lack of trust in our society. This lack of trust is paired with deep fear of scandal and more demands for control. Working almost in unison with these twin engines of destruction is the lack of respect for human capital in general, which is only made more tragic when one realizes the magnitude of the talent being wasted. Instead of trust and faith in the arrayed talent of the individuals being funded by the program, we are going to undermine all their efforts with doubt, fear and marginalization. The active role of bullshit in defining success allows the disregard for talent to go unnoticed (think of bullshit and alternative facts as brothers).

Progress in science should always be an imperative of the highest order for our research. When progress is obviously constrained and defined with strict boundaries, as we are seeing with HPC, the term malpractice should come to mind. One of the clearest elements of HPC is a focus upon management and strict project controls. Instead I see the hallmarks of mismanagement in the failure to engage and harness the talents, capabilities and potential of the human resources available. Proper and able management of the people working on the project would harness and channel their efforts productively. Better yet, it would inspire and enable these talented individuals to innovate and discover new things that might power a brighter future for all of us. Instead we see the rule of fear and limitation governing people's actions. Instead we see an ever-tightening leash placed around people's necks, suffocating their ability to perform at their best. This is the core of the unfolding research tragedy that is doubtlessly playing out across a myriad of programs far beyond the small-scale tragedy unfolding with HPC.

We can only see a short distance ahead, but we can see plenty there that needs to be done.

― Alan Turing

 

 

Fear Makes Us Weak

Fear is the mind-killer.

― Frank Herbert

If one wants to understand fear and how it can destroy competence and achievement, take a look at (American) football. How many times have you seen a team undone during the two-minute drill? A team that has been dominating the other team defensively suddenly becomes porous when it switches to the prevent defense; it is a strategy born out of fear. They stop doing what works because it carries risk and take a safety-first approach instead. It happens over and over, prompting the Madden quip that the only thing the prevent defense prevents is victory. It is a perfect metaphor for how fear plays out in society.

Fear is a rather enormous player in societal decision-making. In playing an oversized role, fear provides a massive drain on everything we do, ultimately costing us more than we can possibly estimate. Fear produces actions that work steadfastly to undermine every single productive bit of work we might do. Fear drives decisions that cause everything we do to be more expensive. Fear costs us time. Fear destroys trust. Fear undermines openness. Fear enslaves us to a pessimistic life always looking for disaster. In the end fear will keep us from succeeding at making the world better. Fear is making the world worse.

Over 80 years ago we had a leader, FDR, who cautioned us against fear, saying "we have nothing to fear but fear itself." Today we have leaders who embrace fear as a prime motivator in almost every single public policy decision. We have the cynical use of fear to gain power used across the globe. Fear is also a really powerful way to free money from governments. Terrorism is a powerful political tool both for those committing the terrorist acts and for the military-police-industrial complexes seeking to retain their control over society. We see the rise of vast police states across the Western world fueled by irrational fears of terrorism.

If you want to control someone, all you have to do is to make them feel afraid.

― Paulo Coelho

Fear also keeps people from taking risks. Many people decide not to travel because of fears associated with terrorism, among other things. Fear plays a more subtle role in work. If failure becomes unacceptable, fear will keep people from taking on difficult work and focus them on easier, low-risk work. This ultimately undermines our ability to achieve great things. If one does not attempt to achieve great things, the great things simply will not happen. We are all poorer for it. Fear is ultimately the victory of small-minded, limited thinking over hope and the abundance of a better future. Instead of attacking the future with gusto and optimism, fear pushes us to retreat to the past and turn our backs on progress.

One of the huge downsides to fear-based decision-making is shutting down communication. Good communication is based on trust. Fear is the absence of trust. People are afraid of ideas, and afraid to share their ideas or information with others. As Google amply demonstrates, knowledge is power. Fear keeps people from sharing information and leads to an overall diminishment in power. Information held closely will produce control, but control of a smaller pie. Free information makes the pie bigger and creates abundance, but people are afraid of this. For example a lot of information is viewed as dangerous and held closely, leading to things like classification. This is necessary, but also prone to horrible abuse.

Power does not corrupt. Fear corrupts… perhaps the fear of a loss of power.

― John Steinbeck

A big part of the abuse is the retention of power, used to enhance the power of those already holding it. The issue with this information control is how it inhibits people from working on things that have the greatest value, or simply allows people to work on things that others already know don't work. It keeps people from building productively on the knowledge that others possess. In this and a myriad of other ways the control and failure to share information leads to a diminished future devoid of the potential freedom offers.

He who has overcome his fears will truly be free.

― Aristotle

There are very few truly unique, new ideas. Instead new things and new ideas arise from combining old ideas in new ways or for new purposes. With more ideas on the table and available, the possibilities and discoveries are greater and more varied. The entirety of human experience and technology is based on the sharing of information, the combination of old, existing ideas over and over. Just as the printing press created the sharing of knowledge and an explosion of creativity, the Internet is doing the same thing today. It can be a force for good and freedom. It can also be a force of evil and chaos, as we have seen unfolding in the events of the World. Our job should be to actively work to make sure information can be harnessed as an agent for good. Fear, when added to the mix, becomes a direct and powerful force pushing us toward evil and chaos.

Another aspect of modern life and the availability of information is the ever-present worry of scandal and the implications of being part of it. Spurring this fear-based environment is the use of scandal as a political tool and the chaos scandal produces. There are fears of audits and unwanted attention driving decision-making and pushing all sorts of costs. All of this is driven by a general lack of trust across society and the rise of fear as a motivating factor. Instead of being focused on progress and achievement, we see fear of loss and terror at the prospect of scandal forming the basis of decisions. This is captured in the oft-heard comment, “I don’t want to see this featured on the front page of the New York Times.” To avoid this possibility we incur massive costs and horrible performance penalties. The bottom line is that fear is inhibiting our ability to create a better, richer and more abundant future.

Most people do not really want freedom, because freedom involves responsibility, and most people are frightened of responsibility.

― Sigmund Freud

Fear is used because fear works. Fear has become a powerful tool that political forces use to push their agenda or attack their enemies. The most evident fear-based vehicle is terrorism, which our governments make much more powerful by channeling the fear to support the creation of a large, pervasive police-surveillance state. Instead of defeating terror, the state amplifies the impact of terror, terrorizes the populace, and becomes a source of terror itself. The greatest weapon against terror is to not be terrorized. Courage and bravery in the face of terror is the cure. Our reaction to terrorism gives it all of its power. By our fearful reaction we ensure that more terror will be bred. This principle is broadly applicable. Our reactions to fear empower the fears and allow them to shape our lives. To overcome fear, we must cease to be afraid. We must be led to not fall prey to fear. Instead we are led to be afraid, and to amplify our fears as a means of subservience.

Without leadership rejecting fear, too many people simply give in to it. Today leaders do not reject fear; they embrace it; they use it for their purposes and amplify their power. It is easy to do because fear engages people's animal core and is prone to cynical manipulation. This fear paralyzes us and makes us weak. Fear is expensive, and slow. Fear is starving the efforts society could be making to create a better future. Progress and the hope of a better future rest squarely on our courage and bravery in the face of fear, and on the rejection of fear as the organizing principle for our civilization.

Our enemy is not terror, it is losing our soul while fighting terror.

― Jeff Lawson

And one has to understand that braveness is not the absence of fear but rather the strength to keep on going forward despite the fear.

― Paulo Coelho