Money Equals Quality and Marketing of Research is Poisoning Science

Money is a great servant but a bad master.

― Francis Bacon

One of the clearest characteristics of our current research environment is the dominance of money. This merely mirrors the role of money in society at large. Money has become the one-size-fits-all measuring stick for science, including how we judge its quality. If something gets a lot of money, it must be good. Quality is defined by budget. This shallow mindset is incredibly corrupting all the way from the sort of Labs where I work to Universities and everything in between. Among the corrupting influences is the tendency for promotion of science to morph into pure marketing. Science is increasingly managed as a marketing problem, and quality becomes equivalent to potential flashiness. In the wake of this attitude is a loss of focus on the basics and fundamentals of managing research quality.

Making money isn’t hard in itself… What’s hard is to earn it doing something worth devoting one’s life to.

― Carlos Ruiz Zafón

Doing science properly becomes an afterthought, and ultimately a lower priority. The basic, fundamental work behind high-quality research does not bring in money and thus becomes optional. More and more the basics simply don’t get done. The core of managing research is talent management and development. It is about hiring, developing and retaining the best people for the work. If one thing is clear about our Universities and Labs, it is that talented people are not important. There are those who might chafe at this, but talent is now the ability to get lots of money, not to do great work. Expertise is something all of these institutions are ceasing to value. Experts are expensive, and complicate things. Marketing is all about keeping things simple, and experts tend to make things hard. Things are hard because they actually are hard. All of this is consistent with the overall diminishing ethics and integrity in public life. Rather than focus on a mission or on high quality, money becomes the emphasis, with mission and quality sacrificed as nuisances and, troublingly, made equivalent to financial measures.

Don’t think money does everything or you are going to end up doing everything for money.

― Voltaire

Money is a tool. Just like a screwdriver, or a pencil, or a gun. We have lost sight of this fact. Money has become a thing unto itself and replaced the value it represents as an objective. Along the way the principles that should be attached to the money have also been scuttled. This entire ethos has infected society from top to bottom, with the moneyed interests at the top lording over those without money. Our research institutions are properly a focused reflection of these societal trends. They have a similar social stratification and general loss of collective purpose and identity. Managers have become the most important thing, superseding science or mission in priority. Our staff are simply necessary details, utterly replaceable, especially with quality being an exercise in messaging. Expertise is a nuisance, and expert knowledge something that only creates problems. This environment is tailored to a recession of science, knowledge and intellect from public life. This is exactly what we see in every corner of our society. In its place reign managers and the money they control. Quality and excellence are meaningless unless they come with dollars attached. This is our value system; everything is for sale.

What’s measured improves

― Peter F. Drucker

The result of the system we have created is research quality in virtual freefall. The technical class has become part of the general underclass whose well-being is not the priority of this social order. Part of the rise of the management elite as the identity of organizations is driven by this focus on money. Managers look down into organizations for glitzy marketing ammo, to help the money flow. The actual quality and meaning of the research is without value unless it comes with lots of money. Send us your slide decks and especially those beautiful colorful graphics and movies. Those things sell the program and get the money in the door. That is what we are all about, selling to the customer. The customer is always right, even when they are wrong, as long as they have the cash. The program’s value is measured in dollars. Truth is measured in dollars, and available for purchase. We are obsessed with metrics, and organizations far and wide work hard to massage them to look good. Things like peer review are to be managed and generally can be politicked into something that makes organizations look good. In the process every bit of ethics and integrity can be squeezed out. These managers have rewritten the rules to make this all kosher. They are clueless about how corrosive and damaging all of this is to the research culture.

Make no mistake, our research culture has been undermined systematically. The people at the top are acting in full accordance with rules designed to make their jobs better and provide them with “ethical” justification. The pay structure and benefits have been systematically slanted to their advantage. Organizations are defined by their management talent rather than their technical talent. Managers are celebrated and held up as the emblems of organizational identity. Gone is the sense that managers are there to serve their organizations and enable the best work. The issue is the low-quality, low-integrity and low-ethics culture instilled at the top. These attitudes are in lock step with the rest of society. Across organizations from industry to academia to government we see one set of rules for the management at the top and another set of rules for the peons laboring below. Ethical lapses and low-integrity actions by peons are swiftly and mercilessly punished while the same actions by managers receive praise. Our management is creating a culture of hypocrisy and privilege and then acting utterly oblivious to the consequences. We are a society where, as the saying goes, “the fish rots from the head.” Our leaders lack ethical fiber and integrity while celebrating incompetence, all while being compensated handsomely. They will all simply claim to be acting within the written rules and avoid any discussion of the moral, ethical and culturally corrosive implications of their actions. The new cultural norm is that the top of society rules with a “do as I say, not as I do” mentality. Our leadership is morally bankrupt and ethically corrupt, yet operating fully within the parameters of the rules or laws.

On the face of it, shareholder value is the dumbest idea in the world.

– Jack Welch

Once upon a time we had incredible research organizations across our society, including industry, academia, and government. We have allowed a number of forces loose to destroy these societal treasures. One of the biggest forces undermining the quality and competence of our research is lack of trust. This lack of trust has manifested itself as an inability to take the risks necessary for research quality. The lack of trust has also produced an immense administrative load that our management class delivers to make society happy. This is only one of the forces undermining research, albeit a powerfully destructive one. The second force is equally harmful, and it is the topic today: the dominance of money in managing and measuring science. Money has become the great measure of what is good and bad. Rich is good, poor is bad. If you are poor, you are a bad person. It is your fault. A big part of this force is related to the dominant business principle of today. Profit is king, and everything is OK if it benefits stockholders. This principle is undermining society as a whole, making life awful for the vast majority of people while enriching the upper class and powering inequality to record levels. The same poisonous principles have been adopted by research institutions almost reflexively. The impact on organizational structure mirrors society. In addition to managing society’s lack of trust, the adoption of “business” principles for research has empowered the management class. Along with these principles has come a redefinition of integrity, ethics and quality to be strongly associated with money. Simply having money makes things high integrity, ethical and high quality. Without money you have the opposite, without regard to other facts (which are optional today anyway). Culture has followed suit.

Free enterprise cannot be justified as being good for business. It can be justified only as being good for society.

–Peter Drucker

This discussion cannot be approached in a rational way without addressing the nature of our highest leadership today. We are not led by people with integrity, ethics or basic competence. The United States has installed a rampant symptom of corruption and incompetence in its highest office. Trump is not the problem; he is the symptom of the issue. He may become a bigger problem if allowed to reign too long, becoming a secondary infection. He exemplifies every single issue we have with ethics, integrity and competence to an almost cartoonish magnitude. Donald Trump is the embodiment of every horrible boss you’ve ever had, amplified to an unimaginable degree. He is completely and utterly unfit for the job of President whether measured by intellect, demeanor, ethics, integrity or philosophy. He is pathologically incurious. He is a rampant narcissist whose only concern is himself. He is lazy and incompetent. He is likely a career white collar criminal who has used money and privilege to escape legal consequences. He is a gifted grifter and conman (whose greatest con is getting this office). He has no governing philosophy or moral compass. He is a racist, bigot and serial abuser of women.

He is a fucking moron.

– Rex Tillerson, Secretary of State under President Trump

In a nutshell, Donald Trump is someone you never want to meet and someone who should never wield the power of his current office. You don’t want him to be your boss; he will make your life miserable and throw you under the bus if it suits him. He is a threat to our future both physically and morally. In the context of this discussion he is the exemplar of what ails the United States, including the organizations that conduct research. He stands as the symbol of what the management class represents. He is decay. He is incompetence. He is a pathological liar. He is worthy of no respect or admiration save his ability to fool millions. He is the supremacy of marketing over substance. He has no idea how completely his mantra “make America great again” is undermined by his every breath. His rise to power is the clearest and most evident example of how our greatness as a nation has been lost, and his every action accelerates our decline. People across the World have lost faith in the United States for good reason. Any country that elected this moronic, unethical con man as leader is completely untrustworthy. No one symbolizes our fall from greatness more completely than Donald Trump as President.

Rank does not confer privilege or give power. It imposes responsibility.

― Peter F. Drucker

The deeper worry is that all of these problems will ultimately result in very real consequences. The signs are all around us, and our leaders at every level do nothing. We cannot violate the fundamentals of competence and quality for so long and not suffer ill effects. Reality will descend upon us and it will not be pretty. Just as research in the United States is falling from its summit, the effects will be felt in other areas of life. The long-term impact could well be catastrophic. We can only fake it for so long before it catches up with us. We can only allow our leadership to demonstrate such radical disregard for those they lead for so long. The lack of integrity, ethics and morality from our leadership, even when approved by society, will create damage that our culture cannot sustain. Even if we measure things through the faulty lens of money, the problems are obvious. Money has been flowing steadily into the pockets of the very rich and the management class and away from societal investment. We have been starving our infrastructure for decades. Our roads are awful, and bridges will collapse. 21st Century infrastructure is a pipe dream. Our investments in research and development have been declining over the same time frame, sacrificed for short-term profit. At the same time the wealth of the rich has grown, and inequality has become profound and historically unprecedented. These trends are completely correlated. The correlation is not incidental; it reflects a change in the priorities of society to favor wealth accumulation. The decline of research is simply another symptom.

Money is not quality, and money is not the objective. Money does not replace ethics and integrity. Reality matters, and marketing does not replace quality and focus on the fundamentals. We need to prize people and prioritize talent and expertise if we want to succeed. Who we choose to lead us matters, as do the values they represent. It is time to choose differently.

Top 15 Things Money Can’t Buy

Time. Happiness. Inner Peace. Integrity. Love. Character. Manners. Health. Respect. Morals. Trust. Patience. Class. Common sense. Dignity.

― Roy T. Bennett


The Essential Problem with Essentially Non-Oscillatory Methods

To understand a science it is necessary to know its history.

― Auguste Comte

After monotonicity-preserving methods came along and revolutionized the numerical solution of hyperbolic conservation laws, people began pursuing follow-on breakthroughs. So far nothing has appeared that qualifies as a real breakthrough, although progress has been made. There are some very good reasons for this, and understanding them helps us see how and where progress might yet be made. As I noted several weeks ago in the blog post about Total Variation Diminishing methods, the breakthrough with monotonicity preservation came in several stages. The methods were invented by practitioners who were solving difficult practical problems. This process drove the innovation in the methods. Once the methods received significant notice as a breakthrough, the math came along to bring the methodology into rigor and explanation. The math produced a series of wonderful connections to theory that gave the results legitimacy, and the theory also connected the methods to the earlier methods dominating the codes at that time. People became very confident about the methods once mathematical theory was present to provide structural explanations. With essentially non-oscillatory (ENO) methods, the math came first. This is the very heart of the problem.

Later I will elaborate on some of the technical challenges with ENO methods, but their first problem was related to their origin. Real progress is made by solving difficult problems in seemingly impossible ways. The methods preceding ENO were created to deal with real problems that could not be successfully solved. The innovation arose to solve the problems, not to create better methods. The solution to the problems was enabled by better methods. This is key. Solving the problem is the thing to focus on, without prejudice toward the means. Today’s research tends to define the means of progress a priori, which results in an unnatural process. In addition, we need to be open to a multitude of means to a solution. Progress and breakthroughs often come via serendipity and from unexpected places. ENO was a solution looking for a problem. This is why it hasn’t met the level of success we had hoped for.

As I noted, the monotonicity-preserving methods came along first, and total variation theory followed to make them feel rigorous and tie them to solid mathematical expectations. Before this the monotonicity-preserving methods felt sort of magical and unreliable. The math solidified the hold of these methods and allowed people to trust the results they were seeing. With ENO, the math came first, with a specific mathematical intent expressed by the methods. The methods were not created to solve hard problems, although they had some advantages for some hard problems. This created a number of issues that these methods could not overcome. First and foremost was fragility, followed by a lack of genuine efficacy. The methods would tend to fail when confronted with real problems and didn’t give better results for the same cost. More deeply, the methods didn’t have the pedigree of doing something amazing that no one had seen before. ENO methods had no pull.

A bit of a deeper dive is needed here. Originally, the monotone methods were low accuracy, but exceedingly reliable (monotonicity is the property of producing physical solutions without unphysical artifacts, i.e. oscillations). These low-order methods had their own artifacts: extreme dissipation, making solutions to every problem essentially laminar and unenergetic. These solutions did not replicate what we see in nature. Conversely, high accuracy methods came with oscillations and unreliability. To solve real problems with high-order methods, seemingly ad hoc devices like artificial viscosity were needed to provide greater reliability. Innovation came along and produced a solution where you could blend the high-order methods with the original monotone low-order methods in an adaptive manner. All of a sudden you could get reliability along with most of the accuracy. Most importantly, the complex energetic flows seen in nature could be simulated practically. Flows that are turbulent suddenly looked and acted turbulent. The results were regarded as almost magical. This magic caught people’s attention and drove almost complete adoption of these methods by the community.
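To make this blending concrete, here is a minimal sketch in Python (my own, not taken from any particular paper) for linear advection: a limited slope switches each cell between the first-order monotone upwind scheme and a second-order reconstruction. Periodic boundaries and forward Euler time stepping keep it as simple as possible.

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter: zero where neighboring slopes disagree in sign,
    otherwise the smaller-magnitude slope (falling back toward first-order upwind)."""
    return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def muscl_step(u, c):
    """One forward-Euler step of limited (MUSCL-type) advection, CFL number 0 < c <= 1."""
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)  # limited slope in each cell
    u_face = u + 0.5 * (1.0 - c) * slope   # upwind-side value at interface i+1/2
    flux = c * u_face                      # upwind flux (scaled by dt/dx) for wave speed > 0
    return u - (flux - np.roll(flux, 1))

# Usage: a square pulse advected once around a periodic domain stays monotone,
# yet far sharper than the purely first-order (zero-slope) result.
N, c = 200, 0.5
x = np.linspace(0.0, 1.0, N, endpoint=False)
u = np.where((x > 0.3) & (x < 0.6), 1.0, 0.0)
for _ in range(int(N / c)):
    u = muscl_step(u, c)
print(u.min(), u.max())   # stays within [0, 1]: no new extrema
```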

Don’t mistake activity with achievement.

― John Wooden

Only after interest in the community came along did the mathematical rigor join the fray. I’ll note that the preceding state of affairs had a good mathematical grounding itself, providing the foundation for progress. Most notably, the barrier theorem by Godunov provided a clear challenge that the innovators needed to overcome. Godunov’s theorem told us that a linear second-order method could not be monotone (non-oscillatory). The key to overcoming the theorem was to move to nonlinear second-order methods where the discrete representation is a function of the solution itself. The new mathematics tied admissibility conditions for solutions together with the new nonlinear methods. We overcame existing mathematical limits by changing the rules, and tied ourselves to modest and minimal requirements for the validity of the results.
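For reference, the standard definitions behind this paragraph, written from memory rather than quoted from Harten’s papers, are the following.

```latex
% Total variation of a discrete solution, and the TVD property
\mathrm{TV}(u^{n}) = \sum_i \bigl|u_{i+1}^{n} - u_i^{n}\bigr|,
\qquad \mathrm{TV}(u^{n+1}) \le \mathrm{TV}(u^{n}).

% Harten's sufficient conditions: a scheme written in incremental form
u_i^{n+1} = u_i^{n} + C_{i+1/2}\,\bigl(u_{i+1}^{n}-u_i^{n}\bigr)
                    - D_{i-1/2}\,\bigl(u_i^{n}-u_{i-1}^{n}\bigr)

% is TVD when the solution-dependent (hence nonlinear) coefficients satisfy
C_{i+1/2} \ge 0, \qquad D_{i+1/2} \ge 0, \qquad C_{i+1/2} + D_{i+1/2} \le 1 .
```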

ENO methods were devised to move the methods ahead. ENO took the adaptive discrete representation to new heights. Aside from the “adaptive” aspect, the new method was a radical departure from those that preceded it. The math itself was mostly notional and fuzzy, lacking a firm connection to the preceding work. If you had invested in TVD methods, the basic machinery you used had to be completely overhauled for ENO. The method also came with very few guarantees of success. Finally, it was expensive, and suffered from numerous frailties. It was a postulated exploration of interesting ideas, but in the mathematical frame, not the application frame. Its development also happened at the time when applied mathematics began to abandon applications in favor of a more abstract and remote connection via packaged software.
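The adaptive stencil choice at the heart of ENO is easy to sketch. The following is my own illustration of the third-order version on a uniform grid (interior cells only, standard reconstruction coefficients); it is meant to show the mechanism, not to serve as a production implementation.

```python
import numpy as np

# Weights giving the value at the right interface x_{i+1/2} from three cell averages,
# for a stencil shifted s cells to the left of cell i (standard third-order values).
ENO3_WEIGHTS = {
    0: ( 1/3,  5/6, -1/6),   # stencil {i,   i+1, i+2}
    1: (-1/6,  5/6,  1/3),   # stencil {i-1, i,   i+1}
    2: ( 1/3, -7/6, 11/6),   # stencil {i-2, i-1, i  }
}

def eno3_interface_value(v, i):
    """Third-order ENO value at x_{i+1/2} for an interior cell i of cell-average array v.

    The stencil grows one cell at a time, always extending toward the side with the
    smaller undivided difference (the 'smoother' side). This adaptivity is what avoids
    interpolating across a discontinuity, and also why a tiny change in the data can
    flip the stencil and change the answer abruptly.
    """
    j = i  # leftmost cell of the current stencil
    for k in (1, 2):
        d_left  = np.diff(v[j - 1 : j + k], n=k)[0]      # candidate extending left
        d_right = np.diff(v[j     : j + k + 1], n=k)[0]  # candidate extending right
        if abs(d_left) < abs(d_right):
            j -= 1
    w = ENO3_WEIGHTS[i - j]
    return w[0] * v[j] + w[1] * v[j + 1] + w[2] * v[j + 2]
```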

80% of results come from 20% of effort/time

― Vilfredo Pareto

In time, the intrinsic weaknesses of ENO were exposed, and the methods have certainly improved. The adaptive stencil selection in the original ENO could produce genuinely pathological results, including instabilities. The answer to this issue has canonically been provided by weighted ENO (WENO) methods, which were constructed to be intrinsically numerically stable. WENO also provided another benefit, albeit only partially. If a solution is sufficiently smooth locally, the domain of dependence for the discrete representation can support a higher-order method, and WENO automatically selects it. This addressed another shortcoming of ENO: the wastefulness of the method’s adaptivity in places where it was unnecessary. The original ENO could also exhibit extreme sensitivity to small changes in the solution; an infinitesimal change in the solution can result in a completely different discrete method, and WENO cured this issue. Nonetheless, WENO was not a complete answer because of its intrinsic expense, and its modification of the high-order stencil where linear and nonlinear stability did not require it. Robustness of solutions could be compromised by unphysical states (often negative densities, pressures or energies). New limiters were devised to provide protection from these problems and improved the methods. In spite of all this progress, for difficult problems WENO was still less accurate and more expensive than high quality second-order methods.
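The weighted construction is worth seeing explicitly. Below is a small sketch, again my own, of the classic fifth-order Jiang–Shu WENO interface value; the smoothness indicators and ideal weights replace ENO’s hard stencil switch with a blend that degrades gracefully near discontinuities.

```python
import numpy as np

def weno5_interface_value(v, i, eps=1e-6):
    """Classic Jiang-Shu WENO5 value at x_{i+1/2} (upwind-biased, from cell averages v).

    Three third-order candidate stencils (the same ones used by third-order ENO) are
    blended with nonlinear weights built from smoothness indicators; near a
    discontinuity the weight of the offending stencil collapses toward zero.
    """
    # Candidate third-order reconstructions.
    q0 =  (1/3)*v[i]   + (5/6)*v[i+1] - (1/6)*v[i+2]
    q1 = -(1/6)*v[i-1] + (5/6)*v[i]   + (1/3)*v[i+1]
    q2 =  (1/3)*v[i-2] - (7/6)*v[i-1] + (11/6)*v[i]

    # Smoothness indicators (Jiang & Shu 1996).
    b0 = (13/12)*(v[i]   - 2*v[i+1] + v[i+2])**2 + (1/4)*(3*v[i] - 4*v[i+1] + v[i+2])**2
    b1 = (13/12)*(v[i-1] - 2*v[i]   + v[i+1])**2 + (1/4)*(v[i-1] - v[i+1])**2
    b2 = (13/12)*(v[i-2] - 2*v[i-1] + v[i])**2   + (1/4)*(v[i-2] - 4*v[i-1] + 3*v[i])**2

    # Ideal (linear) weights recover the full fifth-order stencil in smooth regions.
    d = np.array([3/10, 3/5, 1/10])
    alpha = d / (eps + np.array([b0, b1, b2]))**2
    w = alpha / alpha.sum()
    return w[0]*q0 + w[1]*q1 + w[2]*q2
```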

ENO and WENO methods were advantageous for a narrow class of problems, usually those having a great deal of fine-scale structure. At the same time, they were not a significant (or any) improvement over the second-order accurate methods that dominate the production codes for the broadest class of important application problems. It’s reasonable to ask what might have been done differently to produce a more effective outcome. One of the things that hurt the broader adoption of ENO and WENO methods is increasingly impenetrable codes where large modification is nearly impossible, as we create a new generation of legacy codes (retaining the code base).

Part of the adoption of the monotonicity-preserving methods came from the quantum leap in solution quality. This may not be achievable with other methods, including ENO and WENO. Part of the quantum leap derived from the place the methods came from: innovative application solutions. Part was simply incredibly valuable low-hanging fruit that was harvested in the process of invention. A second part of the rapid adoption was a firm tie to the past, where a hybridization of legacy methods could produce a fantastically more powerful method. ENO and WENO broke from this connection and were expressed as completely different methods that can’t be melded in. On the other hand, if ENO had started as a more incremental evolution from TVD methods, it could have been implemented as an extension of an existing code. This would have made the success of the methods more inevitable than difficult. Perhaps backing away from the path we have been on, and seeking a method that steps incrementally forward, could stir real progress in methods.

Stark truth is seldom met with open arms.

― Justin K. McFarlane Beau

Harten, Ami, Bjorn Engquist, Stanley Osher, and Sukumar R. Chakravarthy. “Uniformly high order accurate essentially non-oscillatory schemes, III.” In Upwind and high-resolution schemes, pp. 218-290. Springer, Berlin, Heidelberg, 1987.

Shu, Chi-Wang. “Numerical experiments on the accuracy of ENO and modified ENO schemes.” Journal of Scientific Computing 5, no. 2 (1990): 127-149.

Liu, Xu-Dong, Stanley Osher, and Tony Chan. “Weighted essentially non-oscillatory schemes.” Journal of Computational Physics 115, no. 1 (1994): 200-212.

Jiang, Guang-Shan, and Chi-Wang Shu. “Efficient implementation of weighted ENO schemes.” Journal of Computational Physics 126, no. 1 (1996): 202-228.

Rider, William J., and Len G. Margolin. “Simple modifications of monotonicity-preserving limiter.” Journal of Computational Physics 174, no. 1 (2001): 473-488.

Zhang, Xiangxiong, and Chi-Wang Shu. “Maximum-principle-satisfying and positivity-preserving high-order schemes for conservation laws: survey and new developments.” In Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences, vol. 467, no. 2134, pp. 2752-2776. The Royal Society, 2011.

Greenough, J. A., and W. J. Rider. “A quantitative comparison of numerical methods for the compressible Euler equations: fifth-order WENO and piecewise-linear Godunov.” Journal of Computational Physics 196, no. 1 (2004): 259-281.

 

The Fall of the Technical Class; The Rise of the Management Class

Study hard what interests you the most in the most undisciplined, irreverent and original manner possible.

― Richard Feynman

When I got my first job out of school, it was in Los Alamos, home of one of the greatest scientific institutions in the World. This Lab birthed the Atomic Age and changed the World. I went there to work, but also to learn and grow in a place where science reigned supreme and technical credibility really and truly mattered. Los Alamos did not disappoint at all. The place lived and breathed science, and I was bathed in knowledge and expertise. I can’t think of a better place to be a young scientist. Little did I know that the era of great science and technical superiority was drawing to a close. The place that welcomed me with so much generosity of spirit was dying. Today it is a mere shell of its former self, along with Laboratories strewn across the country whose former greatness has been replaced by rampant mediocrity, pathetic leadership and a management class that rules over this decline. Money has replaced achievement, integrity and quality as the lifeblood of science. Starting with a quote by Feynman is apt because the spirit he represents so well is the very thing we have completely beaten out of the system.

Don’t think money does everything or you are going to end up doing everything for money.

― Voltaire

If one takes a look at the people who get celebrated by organizations today, it is almost invariably managers. This happens inside organizations, in their external faces, and in alumni recognition by universities. In almost every case the people who are highlighted to represent achievement are managers. One explanation is that managers have a direct connection to money. One of the key characteristics of the modern age is the centrality of money to organizational success. Money is connected to management, and increasingly disconnected from technical achievement. This is true in industry, government and university worlds, the entire scientific universe. This whole post could have replaced “the rise of management” with “the rise of money”. We increasingly look at aggregate budget as coequal with quality. The more money an organization has, the better it is, and the more important it is. A few organizations still struggle to hang on to celebrating technical achievers, Los Alamos among them. These celebrations weaken with each passing year. The real celebration is how much budget the Lab has, and how many employees that can support.

 People who don’t take risks generally make about two big mistakes a year. People who do take risks generally make about two big mistakes a year.

― Peter F. Drucker

The days of technical competence and scientific accomplishment are over. This foundation for American greatness has been overrun by risk aversion, fear and compliance with a spirit of commonness. I use the word “greatness” with gritted teeth because of the perversion of its meaning by the current President. This perversion is acute in the context of science because he represents everything that is destroying the greatness of the United States. Rather than “making America great again” he is accelerating every trend that has been eroding the foundation of American achievement. The management he epitomizes is the very blunt tool bludgeoning American greatness into a bloody pulp. Trump’s pervasive incompetence masquerading as management expertise will surely push numerous American institutions further over the edge into mediocrity. His brand of management is all too prevalent today and utterly toxic to quality and integrity.

In my life, the erosion of American greatness in science is profound, evident and continual. I had a good decade of basking in the greatness of Los Alamos before the forces of mediocrity descended upon the Lab and proceeded to spoil, distort and destroy every bit of greatness in sight. A large part of the destruction was the replacement of technical excellence with management. The management is there to control the “butthead cowboys” and keep them from fucking up. Put differently, the management is there to destroy any individuality and make sure no one ever achieves anything great, because no one can take a risk sufficient to achieve something miraculous. Anyone expressing individuality is a threat and needs to be chained up. We replaced stunning World class technical achievement with controlled staff, copious reporting, milestone setting, project management and compliance, all delivered with mediocrity. This is bad enough by itself, but for an institution responsible for maintaining our nuclear weapons stockpile, the consequences are dire. Los Alamos isn’t remotely alone. Everything in the United States is being assaulted by the arrayed forces of mediocrity. It is reasonable to ask whether the responsibilities the Labs are charged with continue to be competently discharged.

There is nothing so useless as doing efficiently that which should not be done at all.

― Peter F. Drucker

The march of the United States toward a squalid mediocrity had already begun years earlier. Management has led the way at every stage of the transformation. For scientific institutions, the decline began in the 1970s with the Department of Defense Labs. Once these Labs were shining beacons of achievement, but the management unleashed on them put a stop to this. Since then we have seen NASA, Universities, and the DOE Labs all brought under the jackboots of management. All of this management was brought in to enforce a formality of operations, provide a safe or secure workplace, and keep scandals at bay. The Nation has decided that phenomenal success and great achievements aren’t worth the risks or side-effects of being successful. The management is the delivery vehicle for the mediocrity-inducing control. The power and achievement of the technical class is the casualty. Management is necessary, but today the precious balance between control and achievement is completely lost.

The managers aren’t evil, but neither are most of the people who simply carry out the orders of their superiors. Most managers are good people who simply carry out awful things because they are expected to do so. We now put everything except technical achievement first. Doing great technical work is always the last priority; it can always get pushed out by something else. The most important thing is compliance with all the rules and regulations, and management stands there to make sure it all gets done. This involves lots of really horrible training designed to show compliance but teach people almost nothing. We have project management to make sure we are on time and on budget. Since the biggest maxim of our pathetic management culture is never making a mistake, risks are the last thing you can take. It helps a lot when we really aren’t accomplishing anything worthwhile. When the fix is in and technical standards disappear, it doesn’t matter how terrible the work is. All work is World class by definition. Eventually everyone starts to believe the bullshit. The work is great, right? Of course it is.

All of this is now blazoned across the political landscape with an inescapable sense that America’s best days are behind us. The deeply perverse outcome of the latest National election is a president who is a cartoonish version of a successful manager. We have put an abuser, a representative of the class that has undermined our Nation’s true greatness, in the position of restoring that greatness. What a grand farce! Every day produces evidence that the current efforts toward restoring greatness are using the very things undermining it. The level of irony is so great as to defy credulity. The current administration’s efforts are the end point of a process that started over 20 years ago, obliterating professional government service and hollowing out technical expertise in every corner. The management class that has arisen in their place cannot achieve anything but moving money and people. Their ability to create a new and wonderful foundation of technical achievement is absent.

Greatness is a product of hard work, luck and taking appropriate risks. In science it is grounded upon technical achievements arising from intellectual labors, along with a lot of failures, false starts and mistakes. In today’s highly managed World, everything that leads to greatness is undermined. Hard work is taxed by a variety of non-productive actions that compliance demands. Appropriate risks are avoided as a matter of course because risks court failure, and failure of any sort is virtually outlawed. False starts never happen anymore in today’s project-managed reality. Mistakes are fatal for careers. Risk, failure and mistakes are all necessary for learning, and the unique and advanced ideas we ultimately want are the intellectual product of a healthy environment. An environment that cannot tolerate failure and risk is unhealthy. It is stagnant and unproductive. This is exactly where today’s workplace has arrived.

Money is a great servant but a bad master.

― Francis Bacon

With the twin pillars of destruction coming from money’s stranglehold on science and the inability to take risks, peer review has been undermined. Our current standards of peer review lack any integrity whatsoever. Success by definition is the rule of the day. A peer review cannot point out flaws without threatening the reviewers with dire consequences. This has fueled a massive downward spiral in the quality of technical work. Why take the risks necessary for progress, when success can be so much more easily faked? Today peer review is so weak that bullshitting your way to success has become the norm. To point out real shortcomings in work has become unacceptable and courts scandal. It puts money at risk and potentially produces consequences for the work that management cannot accept. In the current environment scientific achievement does not happen because achievement is invariably risk prone. Such risks cannot be taken because of the hostile environment toward any problems or failures. Without failure, we are not learning, and learning at its apex is essentially research. Weak peer review is a large contributor to the decline in technical achievement and the loss of importance for the technical contributor.

Perhaps the greatest blow to science was the end of the Cold War. The Soviet bloc represented a genuine threat to the West and a worthy adversary. Technical and scientific competence and achievement were a key aspect of the defense of the West. Good work couldn’t be faked, and everyone knew that the West needed to bring its “A” game, or risk losing. When the Soviet bloc crumbled, so did a great deal of the unfettered support for science. Society lost its taste for the sorts of risks necessary for high levels of achievement. To some extent, the loss of the ability to take risks and accept failures was already underway, with the end of the Cold War simply providing a hammer blow to support for science. It ended the primacy of true achievement as a route to National security. It might be useful to note that the science behind “Star Wars” was specious from the beginning. In a very real way the bullshit science of Star Wars was a trail blazer for today’s rampant scientific charlatans. Rather than give science free rein to seek breakthroughs along with the inevitable failures, society suddenly sought guaranteed achievement at a reduced cost. In reality it got neither achievement nor economized results. With the flow of money being equated to quality as opposed to results, the combination has poisoned science.

How do you defeat terrorism? Don’t be terrorized.

― Salman Rushdie

This transformation was already bad enough; then the war on terror erupted to further complicate matters. The war on terror was a new cash cow for the broader defense establishment, but it came with all the trappings of guaranteed safety and assured results. It solidified the hold of money as the medium of science. Since terrorists represent no actual threat to society, technical success was unnecessary for victory. The only risk to society from terrorism is the self-inflicted damage we do to ourselves, and we’ve done the terrorists’ work for them masterfully. In most respects the only thing that matters at the Labs is funding. Quality, duty, integrity and virtually anything else is up for sale. Money has become the sole determining factor for quality and the dominant factor in every decision. Since the managers are the gatekeepers for funding, they have uprooted technical achievement and progress as the core of organizational identity. It is no understatement to say that the dominance of financial concerns is tied to the ascendency of management and the decline of technical work. At the same time the desire for assured results produced a legion of charlatans who began to infest the research establishment. This combination has had the corrosive effect of reducing the integrity of the entire system, where money rules and results can be finessed or outright fabricated. Standards are so low now that it doesn’t really matter.

Government has three primary functions. It should provide for military defense of the nation. It should enforce contracts between individuals. It should protect citizens from crimes against themselves or their property. When government – in pursuit of good intentions – tries to rearrange the economy, legislate morality, or help special interests, the costs come in inefficiency, lack of motivation, and loss of freedom. Government should be a referee, not an active player.

― Milton Friedman

One of the key trends impacting our government-funded Labs and research is the languid approach to science by the government. Spearheading this systematic decline in support is the long-term Republican approach to starving government that really took the stage in 1994 with the “Contract with America”. Since that time the funding for science has declined in real dollars, along with a decrease in the support for professionalism by those in government. Over time salaries and the level of professional management have been under siege as part of an overall assault on governing. A compounding effect has been an ever-present squeeze on the rules related to conducting science. On the one hand we are told that the best business practices will be utilized to make science more efficient. Simultaneously, best practices in support of science have been denied us. The result is no efficiency along with no best practices and simply a decline in overall professionalism at the Labs. All of this deeply compounds the overall decline in support for research.

Rank does not confer privilege or give power. It imposes responsibility.

― Peter F. Drucker

What can be done to fix all this?

Sometimes the road back to effective and productive technical work seems so daunting as to defy description. I’d say that a couple of important things are needed to pave the road. Most importantly, the purpose and importance of the work need to be central to the identity of science. Purpose and service need to replace money as the key organizing principle. A high-quality product needs to replace financial interests as the driving force in managing efforts. This step alone would make a huge difference and drive most of the rest of the necessary elements for a return to technical focus. First and foremost among these elements is an embrace of risk. We need to take risks and concomitantly accept failures as an essential element of success. We must let ourselves fail in attempting to achieve great progress through thoughtful risks. Learning, progress and genuine expertise need to become the measure of success and the lifeblood of our scientific and technical worlds. Management needs to shrink into the background, where it becomes a service to technical achievement and an enabler for those producing the work. Organizations need to celebrate science and technical achievements as the zenith of their collective identity. As part of this we need to have enough integrity to hold ourselves to high standards, welcoming and demanding hard-hitting critiques.

In a nutshell we need to do almost the complete opposite of everything we do today.

We are trying to prove ourselves wrong as quickly as possible, because only in that way can we find progress.

― Richard Feynman

 

Curing the Plague of Meetings

If you had to identify, in one word, the reason why the human race has not achieved, and never will achieve, its full potential, that word would be “meetings.”

― Dave Barry

Meetings. Meetings. Meetings. Meetings suck. Meetings are awful. Meetings are soul-sucking time wasters. Meetings are a good way to “work” without actually working. Meetings absolutely deserve the bad rap they get. Most people think that meetings should be abolished. One of the most dreaded workplace events is a day that is completely full of meetings. These days invariably feel like complete losses, draining all productive energy from what ought to be a day full of promise. I say this as an unabashed extrovert, knowing that the introvert is going to feel overwhelmed by the prospect.

Meetings are a symptom of bad organization. The fewer meetings the better.

– Peter Drucker


All of this is true, and yet meetings are important, even essential to a properly functioning workplace. As such, meetings need to be the focus of real effort to fix while minimizing unnecessary time spent there. Meetings are a vital humanizing element in collective, collaborative work. Deep engagement with people is enriching, educational, and necessary for fulfilling work. Making meetings better would produce immense benefits in quality, productivity and satisfaction in work.

Meetings are at the heart of an effective organization, and each meeting is an opportunity to clarify issues, set new directions, sharpen focus, create alignment, and move objectives forward.

― Paul Axtell

If there is one thing that unifies people at work, it is meetings, and how much we despise them. Workplace culture is full of meetings and most of them are genuinely awful. Poorly run meetings are a veritable plague in the workplace. Meetings are also an essential human element in work, and work is a completely human and social endeavor. A large part of the problem is the relative difficulty of running a meeting well, which exceeds the talent and will of most people (managers). It is actually very hard to do this well. We have now gotten to the point where all of us almost reflexively expect a meeting to be awful and plan accordingly. For my own part, I take something to read, or my computer to do actual work, or the old stand-by of passing time (i.e., fucking off) on my handy dandy iPhone. I’ve even resorted to the newest meeting pastime of texting another meeting attendee to talk about how shitty the meeting is. All of this can be avoided by taking meetings more seriously and crafting time that is well spent. If this can’t be done, the meeting should be cancelled until the time can be well spent.

The least productive people are usually the ones who are most in favor of holding meetings.

― Thomas Sowell

There are a few uniform things that can be done to improve the impact of meetings on the workplace. If a meeting is mandatory, it will almost surely suck. It will almost always suck hard. No meeting should ever be mandatory, ever. By forcing people to go to mandatory meetings, those running the meeting have no reason to make the meeting enjoyable, useful or engaging. They are not competing for your time, and this allows your time to be abused. A meeting should always be trying to make you want to be there, and honestly compete for your time. A fundamental notion that makes all meetings better is a strong sense that you know why you are at a meeting, and how you are participating. There is no reason to attend a meeting where you passively absorb information without any active role. If this is the only way to get the information, it highlights deeper problems that are all too common! Everyone should have an active role in the meeting’s life. If someone is not active, they probably don’t need to be there.


Meetings at work present great opportunities to showcase your talent. Do not let them go to waste.

― Abhishek Ratna

There are a lot of types of meetings, and generally speaking all of them are terrible, and they don’t need to be. None of them really has to be awful, but they are. Some of the reasons reflect tremendously deep issues with the modern workplace. It is only a small overreach to say that better meetings would go a huge distance toward improving the average workplace and provide untold benefits in terms of productivity and morale. So, to set the stage, let’s talk about the general types of meetings that most of us encounter:

  • Conferences, Talks and symposiums
  • Informational Meetings
  • Organizational Meetings
  • Project Meetings
  • Reviews
  • Phone, Skype, Video Meetings
  • Working meetings
  • Training Meetings

All of these meetings can stand some serious improvement that would have immense benefits.


Meetings are indispensable when you don’t want to do anything.

–John Kenneth Galbraith

The key common step to a good meeting is planning and attention to the value of people’s time. Part of the planning is a commitment to engagement with the meeting attendees. Do those running the meeting know how to convert the attendees into participants? Part of the meeting is engaging people as social animals and building connections and bonds. The worst thing is a meeting that a person attends solely because they are supposed to be there. Too often our meetings drain energy and make people feel utterly powerless. A person should walk out of a meeting energized and empowered. Instead, meetings are energy- and morale-sucking machines. A large part of a meeting’s benefit should be a feeling of community and bonding with others. Collaborations and connections should arise naturally from a well-run meeting. All of this seems difficult, and it is, but anything less does not honor the time of those attending and the great expense their time represents. In the end, the meeting should be a valuable expenditure of time. More than simply valuable, the meeting should produce something better: a stronger human connection and common purpose among all those attending. If the meeting isn’t a better expenditure of people’s time than what they would otherwise be doing, it probably shouldn’t happen.


A meeting consists of a group of people who have little to say – until after the meeting.

― P.K. Shaw

Conferences, Talks and symposiums. This is a form of meeting that generally works pretty well. The conference has a huge advantage as a form of meeting. Time spent at a conference is almost always time well spent. Even at their worst, a conference should be a banquet of new information and exposure to new ideas. Of course, they can be done very poorly, and the benefits can be undermined by poor execution and lack of attention to detail. Conversely, a conference’s benefits can be magnified by careful and professional planning and execution. One way to augment a conference significantly is to find really great keynote speakers to set the tone, provide energy and engage the audience. A thoughtful and thought-provoking talk delivered by an expert who is a great speaker can propel a conference to new heights and send people away with renewed energy. Conferences can also go to greater lengths to make the format and approach welcoming to greater audience participation, especially getting the audience to ask questions and stay awake and aware. It’s too easy to tune out these days with a phone or laptop. Good timekeeping and attention to the schedule is another way of making a conference work to the greatest benefit. This means staying on time and on schedule. It means paying attention to scheduling so that the best talks don’t compete with each other if there are multiple sessions. It means not letting speakers filibuster through the Q&A period. All of these maxims hold for a talk given during work hours, just on a smaller and more specific scale. There the setting, the time of the talk and the timekeeping all help to make the experience better. Another hugely beneficial aspect of meetings is food and drink. Sharing food or drink at a meeting is a wonderful way for people to bond and seek greater depth of connection. This sort of engagement can help to foster collaboration and greater information exchange. It engages the innate human social element that meetings should foster (I will note that my workplace has mostly outlawed food and drink, helping to make our meetings suck more uniformly). Too often the aspects of a talk or conference that would make the great expense of people’s time worthwhile are skimped on, undermining and diminishing the value.

Highly engaged teams have highly engaged leaders. Leaders must be about presence not productivity. Make meetings a no phone zone.

― Janna Cachola

 

Informational Meetings. The informational meeting is one of the worst abuses of people’s time. Lots of these meetings are mandatory, and force people to waste time witnessing evidence of what kind of shit show they are part of. This is very often a one-way exchange where people are expected to just sit and absorb. The information content is often poorly packaged and ham-handed in delivery. The talks usually are humorless and lack any soul. The sins are all compounded by a general lack of audience engagement. Their greatest feature is being a really good and completely work-appropriate time-wasting exercise. You are at work and not working at all. You aren’t learning much either; it is almost always some sort of management BS delivered in a politically correct manner. Most of the time the best option is to completely eliminate these meetings. If these meetings are held, those conducting them should put some real effort into making them worthwhile and valuable. They should seek a format that engages the audience and encourages genuine participation.

When you kill time, remember that it has no resurrection.

― A.W. Tozer

Organizational Meetings. The informational meeting’s close relative is the organizational meeting. Often this is an informational meeting in disguise. This sort of meeting is called for an organization of some size to get together and hear the management give them some sort of spiel. These meetings happen at various organizational levels and almost all of them are awful. Time-wasting drivel is the norm. Corporate or organizational policies, work milestones, and cheesy awards abound. Since these meetings are more personal than the pure informational meeting, there is some soul and benefit to them. The biggest sin in these meetings is the faux engagement. Do the managers running these meetings really want questions, and are they really listening to the audience? Will they actually do anything with the feedback? More often than not, the questions and answers are handled professionally and then forgotten. The management generally has no interest in really hearing people’s opinions and doing anything with their views; it is mostly a hollow feel-good maneuver. Honest and genuine engagement is needed, and these days management needs to prove that it’s more than just a show.

People who enjoy meetings should not be in charge of anything.

― Thomas Sowell

Project Meetings. In many places this is the most common meeting type. It also tends to be one of the best meeting types, where everyone is active and participating. The meeting involves people working toward common ends and promotes genuine connection between efforts. These can take a variety of forms, such as the stand-up meeting where everyone participates by construction. An important function of the project meeting is active listening. While this form of meeting tends to be good, it still needs planning and effort to keep it positive. If the project meeting is not good, it probably reflects quite fully on the project itself, and some sort of restructuring of the project is the cure. What are the signs that a project meeting is bad? If lots of people are sitting like potted plants and not engaged with the meeting, the project is probably not healthy. The project meeting should be time well spent; if people aren’t engaged, they should be doing something else.

Integrity is telling myself the truth. And honesty is telling the truth to other people.

― Spencer Johnson

Reviews. A review meeting is akin to a project meeting, but has an edge that makes it worse. Reviews often teem with political context and fear. A common form is a project team, reviewers and then stakeholders. The project team presents work to the reviewers, and if things are working well, the reviewers ask lots of questions. The stakeholders sit nervously and watch, rarely participating. The spirit of the review is the thing that determines whether the engagement is positive and productive. The core values around which a review revolves are honesty and trust. If honesty and trust are high, those being reviewed are forthcoming and their work is presented in a way where everyone learns and benefits. If the reviewers are confident in their charge and role, they can ask probing questions and provide value to the project and the stakeholders. Under the best of circumstances, the audience of stakeholders can be profitably engaged in deepening the discussion, and themselves learn greater context for the work. Too often, the environment is so charged that honesty is not encouraged, and the project team tends to hide unpleasant things. If reviewers do not trust the reception for a truly probing and critical review, they will pull their punches and the engagement will be needlessly and harmfully moderated. A sign that neither trust nor honesty is present is an anxious and uninvolved audience.

I think there needs to be a meeting to set an agenda for more meetings about meetings.

― Jonah Goldberg


Phone, Skype, Video Meetings. These meetings are convenient and often encouraged as part of a cost-saving strategy. Because of the nature of the medium, these meetings are often terrible. Most often they turn into a series of monologues, usually best suited for reporting work. Such meetings are rarely good places to hear about work. This comes from two truths: the people on the phone are often disengaged, listening while attending to other things, and it is difficult to participate in any dynamic discussion; it happens, but it is rare. Most of the content is limited to the spoken word, and lacks body language and visual content. The result is much less information being transmitted, along with a low bandwidth of listening. For the most part these meetings should be done away with. If someone has something really interesting and very timely it might be useful, but only if we are sure the audience is paying real attention. Without dynamic participation one cannot be sure that attention is actually being paid.


Working meetings. These are the best meetings, hands down. They are informal, voluntary and dynamic. The people are there because they want to get something done that requires collaboration. If other types of meetings could incorporate the approach and dynamic of a working meeting, all of them would improve dramatically. Quite often these meetings are deep on communication and low on hierarchical transmission. Everyone in the meeting is usually engaged and active. People are rarely passive. They are there because they want to be there, or they need to be there. In many ways all meetings could benefit mightily from examining working meetings, and adopting their characteristics more broadly.

Training Meetings. The use of a meeting to conduct training is common, and these meetings are usually bad. They could be improved greatly by adopting principles from education. Good training is educational. Again, dynamic, engaged meeting attendees are a benefit. If they are viewed as students, good outcomes can be had. Far too often the training is delivered in a hollow, mandatory tone that provides little real value for those receiving it. We have a lot of soulless compliance training that simply pollutes the workplace with wasted time. Compliance is often associated with hot-button issues where the organization has no interest in engaging the employees. They are simply forced to do things because those in power say so. A real discussion of this sort of training is likely to be difficult and cast doubt. The conversations are difficult and likely to be confrontational. It is easier to passively waste people’s time and get it over with. This attitude is some blend of mediocrity and cowardice that has a corrosive impact on the workplace.

One source of frustration in the workplace is the frequent mismatch between what people must do and what people can do. When what they must do exceeds their capabilities, the result is anxiety. When what they must do falls short of their capabilities, the result is boredom. But when the match is just right, the results can be glorious. This is the essence of flow.

― Daniel H. Pink

Better meetings are a mechanism by which our workplaces have an immense ability to improve. A broad principle is that a meeting needs a purpose and a desired outcome that is well known and communicated to all participants. The meeting should engage everyone attending; no one should be a potted plant or otherwise occupied. Everyone's time is valuable and expensive, and the meeting should be structured and executed in a manner fitting its cost. A simple way of testing the waters is people's attitudes toward the meeting, whether positive or negative. Do they want to go? Are they looking forward to it? Do they know why the meeting is happening? Is there an outcome that they are invested in? If these questions are answered honestly, those calling the meeting will learn a lot, and they should act accordingly.

The cure for bad meetings is recognition of their badness, and a commitment to making the effort necessary to improve them. Few things have a greater capacity to make the workplace better, more productive and improve morale.

When employees feel valued, and are more productive and engaged, they create a culture that can truly be a strategic advantage in today’s competitive market.

― Michael Hyatt

Total Variation Diminishing (TVD) Schemes: Their Essential Contribution to Progress in Methods

Mathematics is the door and key to the sciences.

— Roger Bacon

It is time to return to great papers of the past. The past has clear lessons about how progress can be achieved. Here, I will discuss a trio of papers that came at a critical juncture in the history of numerically solving hyperbolic conservation laws. In a sense, these papers contained nothing new, but they provided a systematic explanation and skillful articulation of the progress of that time. In a deep sense these papers represent applied math at its zenith, providing a structural explanation along with proof to accompany progress made by others. These papers helped mark the transition of modern methods from heuristic ideas to broad adoption and common use. Interestingly, the depth of applied mathematics ended up paving the way for broader adoption in the engineering world. This episode also provides a cautionary lesson about what holds higher-order methods back from broader acceptance, and about the relatively limited progress since.

The three papers I will focus on are:

Harten, Ami. “High resolution schemes for hyperbolic conservation laws.” Journal of Computational Physics 49, no. 3 (1983): 357-393.

Harten, Ami. “On a class of high resolution total-variation-stable finite-difference schemes.” SIAM Journal on Numerical Analysis 21, no. 1 (1984): 1-23.

Sweby, Peter K. “High resolution schemes using flux limiters for hyperbolic conservation laws.” SIAM Journal on Numerical Analysis 21, no. 5 (1984): 995-1011.

The first two, by the late Ami Harten, provide a proof of the monotone behavior seen with the heuristic methods existing at that time. The proofs gave many people a confidence that had been lacking from the truly innovative, but largely heuristic, invention of the methods. The third paper, by Peter Sweby, provided a clear narrative and an important graphical tool for understanding these methods and displaying limiters, the nonlinear mechanism that produced the great results. The “Sweby diagram” reduced these complex nonlinear methods to a single nonlinear function. The limiter was then a switch between two commonly used classical methods. The diagram gave a simple way of seeing whether any given limiter would produce second-order non-oscillatory results. Together these three papers paved the way for common adoption of these methods.

Mathematics is the art of giving the same name to different things.

– Henri Poincaré

In the 1970s three researchers principally invented these nonlinear methods: Jay Boris, Bram Van Leer, and Vladimir Kolgan. Of these three, Boris and Van Leer achieved fame and great professional success. The methods were developed heuristically and worked very well. Each explicitly worked to overcome Godunov's barrier theorem, which says a second-order linear method cannot be monotone. Each made the methods nonlinear by adapting the approximation to the local structure of the solution. Interestingly, Boris and Van Leer were physicists, while Kolgan was an engineer (Van Leer went on to work extensively in engineering). Kolgan was a Russian in the Soviet Union and died before his discovery could take its rightful place next to Boris's and Van Leer's (Van Leer has gone to great effort to correct the official record).

[Mathematics] is security. Certainty. Truth. Beauty. Insight. Structure. Architecture. I see mathematics, the part of human knowledge that I call mathematics, as one thing—one great, glorious thing. Whether it is differential topology, or functional analysis, or homological algebra, it is all one thing. … They are intimately interconnected, they are all facets of the same thing. That interconnection, that architecture, is secure truth and is beauty. That’s what mathematics is to me.

― Paul R. Halmos

The problem with all these methods was a lack of mathematical certainty about the quality of results, along with proofs and structured explanations of their success. This made the broader community a bit suspicious of the results. In a flux-corrected transport (FCT, Boris's invention) commemorative volume this suspicion is noted: at conferences, questions were raised about the results that implied the solutions were faked. The breakthrough with these new methods was that good, seemingly too good to be true. Then the explanations came and made a strong connection to theory. The behavior seen in the results had a strong justification in mathematics, and the trust in the methodology grew. Acceptance and widespread adoption came on the heels of this trust.

Harten and others continued to search for even better methods after introducing TVD schemes. The broad category of essentially non-oscillatory (ENO) methods was invented. It has been a research success, but it never experienced the widespread adoption these other methods enjoyed. Broadly speaking, TVD methods are used in virtually every production code for solving hyperbolic conservation laws. In the physics world, many use Van Leer's approach, while engineering broadly uses the Harten-Sweby formalism. FCT is used somewhat in the physics world, but its adoption is far less common. Part of the reason for this disparity comes down to the power of mathematical proof and the faith it gives. The lack of adoption of the follow-on methods stems from the lack of strong theory with its requisite confidence. Faith, confidence and systematic explanation are all provided by well-executed applied mathematics.

What is the TVD theory and how does it work?

(Note: WordPress's LaTeX capability continues to frustrate; I cannot get the equations to typeset, so if you can read TeX they will still make sense.)

In a nutshell, TVD is a way of extending the behavior of monotone methods (upwind, for the purposes of this discussion) to high-order nonlinear methods. Upwind methods have the benefit of positive coefficients in their stencil. If we write this down for the scalar advection equation, $u_t + a u_x = 0$, we get the following form, $u_j^{n+1} = u_j^n - C_{j-1/2} \left( u_j^n - u_{j-1}^n \right) + D_{j+1/2} \left( u_{j+1}^n - u_j^n \right)$. The key for the methods is the positivity of the coefficients, $C_{j-1/2} \ge 0$ and $D_{j+1/2} \ge 0$. For example, an upwind method gives constants for these coefficients, $C_{j-1/2} = a \Delta t/\Delta x = \nu$ and $D_{j+1/2} = 0$ for $a > 0$. The coefficient $\nu$ is the famous CFL (Courant-Friedrichs-Lewy) number. For TVD methods, these coefficients become nonlinear functions of the solution itself, but still satisfy the inequalities. Harten had done other work connecting monotone methods to entropy-satisfying (i.e., physically relevant) solutions, which then implies that TVD methods would be a route to similar results (this would seem to be true, but definitive proofs are lacking). Still, the connections are all there and close enough to provide faith in the methodology. This is where Sweby's work comes in and provides a crucial tool for broad acceptance of this methodology.
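To make the incremental form concrete, here is a minimal Python sketch of the first-order upwind update written in the $C$/$D$ form above, on a periodic grid with $a > 0$. The function and variable names are my own illustrations, not taken from any particular code.

```python
import numpy as np

def upwind_step(u, nu):
    """One explicit upwind step for u_t + a u_x = 0 with a > 0, written in the
    incremental form u_j^{n+1} = u_j - C*(u_j - u_{j-1}) + D*(u_{j+1} - u_j),
    with C = nu (the CFL number) and D = 0. Periodic boundaries via np.roll."""
    C = nu   # nonnegative constant; monotone/TVD for 0 <= nu <= 1
    D = 0.0
    du_minus = u - np.roll(u, 1)     # u_j - u_{j-1}
    du_plus  = np.roll(u, -1) - u    # u_{j+1} - u_j
    return u - C * du_minus + D * du_plus

# Example: advect a square wave one full period; it stays monotone but smears.
x = np.linspace(0.0, 1.0, 200, endpoint=False)
u = np.where((x > 0.3) & (x < 0.6), 1.0, 0.0)
nu = 0.5
for _ in range(int(200 / nu)):
    u = upwind_step(u, nu)
```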

What Sweby did was provide a wonderful narrative description of TVD methods, and a graphical manner of depicting them. In the form Sweby described, TVD methods were a nonlinear combination of classical methods: upwind, Lax-Wendroff and Beam-Warming. The limiter was drawn out of the formulation and parameterized by the ratio of local finite differences. The limiter is a way to take an upwind method and modify it with some part of the selection of second-order methods while satisfying the inequalities needed to be TVD. This technical specification took the following form, $C_{j-1/2} = \nu \left( 1 + \tfrac{1}{2}\nu(1-\nu) \phi\left(r_{j-1/2}\right) \right)$ and $D_{j+1/2} = \tfrac{1}{2}\nu(1-\nu) \phi\left(r_{j+1/2}\right)$ for $a > 0$, with $r_{j-1/2} = \frac{ u_{j}^{n} - u_{j-1}^{n} }{ u_{j-1}^{n} - u_{j-2}^{n} }$. This produced a beautiful and simple diagram that usefully displayed how any given method compared to others. This graphical means was probably the essential step toward broad acceptance (my opinion, but for visual people it was essential, and a lot of technical folks are visual).
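For readers who want to see what a limiter looks like in practice, here is a small Python sketch of three classical limiters that lie in Sweby's second-order TVD region (all satisfy $\phi(1) = 1$ and $0 \le \phi(r) \le \min(2r, 2)$ for $r \ge 0$). The function names and the little check routine are illustrative, not tied to any specific code.

```python
import numpy as np

def minmod(r):
    """The most dissipative second-order TVD limiter."""
    return np.maximum(0.0, np.minimum(1.0, r))

def van_leer(r):
    """Van Leer's smooth limiter."""
    return (r + np.abs(r)) / (1.0 + np.abs(r))

def superbee(r):
    """The least dissipative limiter, on the upper boundary of the TVD region."""
    return np.maximum(0.0, np.maximum(np.minimum(2.0 * r, 1.0),
                                      np.minimum(r, 2.0)))

# Quick check against the Sweby region on a range of slope ratios r >= 0.
r = np.linspace(0.0, 4.0, 401)
for name, phi in [("minmod", minmod), ("van Leer", van_leer), ("superbee", superbee)]:
    values = phi(r)
    in_region = bool(np.all(values >= 0.0) and
                     np.all(values <= np.minimum(2.0 * r, 2.0) + 1e-12))
    print(f"{name}: phi(1) = {float(phi(1.0)):.3f}, inside TVD region: {in_region}")
```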

Beyond the power of applied mathematics, other aspects of the technical problem have contributed to the subsequent lack of progress. The biggest issue is the quantum leap in performance from first- to second-order accuracy. Second-order methods produce results that look turbulent, whereas first-order methods produce a truncation error that laminarizes flows. The second-order methods produce results for complex problems that have the look and feel of real flows (this may also be quantitatively true, but the jury is out). Important flows are turbulent, high energy, with very large Reynolds numbers. First-order schemes cannot produce these realistically at all. Second-order methods can, and for this reason the new schemes unleashed utility upon the World. With these methods, the solutions took on the look, feel and nature of reality. For this reason, these schemes became essential for codes.

The second reason is the robustness of these methods. First-order monotone methods like upwind are terribly robust. These methods produce physically admissible solutions and do not fail often; codes run problems to completion. The reason is their extremely dissipative nature. This makes them very attractive for difficult problems and almost guarantees a completed calculation. The same dissipation also destroys almost every structure in the solution and smears out all the details that matter. You get an answer, but an answer that is fuzzy and inaccurate. These first-order methods end up being extremely expensive when accuracy is desired. Harten's TVD methods provided a systematic connection of the new second-order methods to the old, reliable first-order methods. The new methods were almost as reliable as the first-order methods, but got rid of much of the smearing dissipation that plagued them. Having a structured and expertly produced explanation for the behavior of these methods, with clear connections to things people already knew, produced rapid adoption by practitioners.
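A standard way to see this dissipative character, not spelled out above but consistent with it, is the modified equation for the upwind scheme: to leading order the discrete solution satisfies $u_t + a u_x = \frac{a \Delta x}{2}\left(1 - \nu\right) u_{xx} + O(\Delta x^2)$, an advection-diffusion equation with a built-in numerical viscosity proportional to the mesh spacing. That viscosity is what guarantees the robustness and what smears every feature of the solution; the TVD limiters keep a version of it only where the solution demands it.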

Mathematics is the cheapest science. Unlike physics or chemistry, it does not require any expensive equipment. All one needs for mathematics is a pencil and paper.

― George Pólya

The follow-on efforts with higher than second-order methods have lacked these clear wins. It is clear that going past second order does not provide the same sort of quantum leap in results. The clear connection to, and expectation of, robustness is also lacking. The problems do not stop there. The essentially non-oscillatory methods select the least oscillatory local approximation, which also happens to be quite dissipative by its very nature. Quite often the high-order method is not actually threatened by oscillations, yet a less accurate approximation is chosen, needlessly reducing accuracy. Furthermore, the adaptive approximation selection can preferentially choose unstable approximations in an evolutionary sense, which can result in catastrophe. The tendency to produce the worst of both Worlds has doomed their broad adoption. Who wants dissipative and fragile? No one! No production code would make these choices, ever!

Recent efforts have sought to rectify this shortcoming. Weighted ENO (WENO) methods are far less intrinsically dissipative and also enhance the accuracy. These methods are still relatively dissipative compared to the best TVD methods and invoke their expensive approximations needlessly in regions of the solution where the nonlinear mechanisms are unnecessary. Other efforts have produced positivity-preserving methods that avoid producing inherently unphysical results with high-order methods. These developments are certainly a step in the right direction. However, the current environment of producing new legacy codes is killing any energy to steward these methods into broad adoption. The expense, the overly dissipative nature and the relatively small payoff all stand in the way.

What might help in making progress past second-order methods?

The first thing to note is that TVD methods are mixed in their order of accuracy. They are second-order in a very loose sense and only when one takes the most liberal norm for computations (L1 for you nerds out there). For the worst-case error, TVD methods are still first-order (L-infinity, and in multiple dimensions). This is a pretty grim picture until one also realizes that for nonlinear PDEs with general solutions, first-order accuracy is all you get anyway unless you are willing to track all discontinuities. These same conditions hold for any high-order methods we might like to adopt. The accuracy from the new methods is always quite limited, which puts a severe constraint on the efficiency of the methods and a challenge to development and progress. The effort it takes to get full accuracy for nonlinear problems is quite large, and if this accuracy is not realized, the effort is not worth it. We do know that some basic elements of high-order methods yield substantial benefits, but these benefits are limited (an example is the high-order edge values used in the piecewise parabolic method, PPM).
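The L1-versus-L-infinity distinction is easy to check numerically. Below is a minimal Python sketch (the function names and the error values are my own, purely illustrative) for measuring errors in both norms and estimating the observed order from two grids; for a limited TVD scheme on a smooth profile one typically sees roughly second order in L1 and roughly first order in L-infinity because the limiter clips extrema.

```python
import numpy as np

def error_norms(u, u_exact, dx):
    """Discrete L1 and L-infinity errors for a cell-averaged solution."""
    e = np.abs(u - u_exact)
    return dx * e.sum(), e.max()

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Observed convergence rate p from errors on two grids, assuming E ~ h^p."""
    return np.log(err_coarse / err_fine) / np.log(refinement)

# Hypothetical errors from runs at h and h/2 (illustrative numbers only):
p_L1   = observed_order(4.0e-3, 1.1e-3)   # close to 2
p_Linf = observed_order(2.0e-2, 1.0e-2)   # close to 1
```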

I asked myself, what worked so well for TVD? To me there is a clear and unambiguous connection to what worked in the past. The past was defined by the combination of upwind, Lax-Wendroff and Beam-Warming methods. These methods, along with largely ad hoc stabilization mechanisms, provided the backbone of the production codes preceding the introduction of the new methods. Now TVD schemes form the backbone of production codes. It would seem that new higher-order methods should preserve this sort of connection. ENO and WENO methods did not do this, which partially explains their lack of adoption. My suggestion would be a design of methods where one uses a high-order method that can be shown to be TVD, or the high-order method closest to a chosen TVD scheme. This selection would be high-order accurate by construction, but would also produce oscillations at third order. This is not the design principle that ENO methods use, where the unproven assertion is oscillations at the order of approximation. The tradeoff between these two principles is larger potential oscillations with less dissipation, and a more unambiguous connection to the backbone TVD methods.
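One possible reading of that suggestion, sketched in Python with a hypothetical helper of my own naming: compute the high-order interface value, then accept it only when it stays within the bounds set by the neighboring cell averages, falling back to the nearest admissible value otherwise. This is in the same spirit as the PPM-style edge-value limiting mentioned above, not a statement of a specific published construction.

```python
import numpy as np

def limit_edge_to_bounds(u_left, u_right, edge_high_order):
    """Hypothetical illustration: keep a high-order interface value only if it
    introduces no new extremum between the adjacent cell averages; otherwise
    clip it to the nearest admissible value (a TVD-style fallback)."""
    lo = np.minimum(u_left, u_right)
    hi = np.maximum(u_left, u_right)
    return np.clip(edge_high_order, lo, hi)

# Example: a standard fourth-order edge estimate from four cell averages,
# then clipped toward the bounds set by the two cells sharing the interface.
u = np.array([1.0, 1.0, 0.0, 0.0])                   # cell averages around an interface
edge = (7.0 * (u[1] + u[2]) - (u[0] + u[3])) / 12.0  # 4th-order interface value
edge_limited = limit_edge_to_bounds(u[1], u[2], edge)
```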

1. Everyone is entitled to their opinion about the things they read (or watch, or listen to, or taste, or whatever). They’re also entitled to express them online.

2. Sometimes those opinions will be ones you don’t like.

3. Sometimes those opinions won’t be very nice.

4. The people expressing those may be (but are not always) assholes.

5. However, if your solution to this “problem” is to vex, annoy, threaten or harass them, you are almost certainly a bigger asshole.

6. You may also be twelve.

7. You are not responsible for anyone else’s actions or karma, but you are responsible for your own.

8. So leave them alone and go about your own life.

[Bad Reviews: I Can Handle Them, and So Should You (Blog post, July 17, 2012)]

― John Scalzi

My own connection to this work is a nice way of rounding out this discussion. When I started looking at modern numerical methods, I surveyed the selection of approaches. FCT was the first thing I hit upon and tried. Compared to the classical methods I had been using, it was clearly better, but its lack of theory was deeply unsatisfying. FCT would occasionally do weird things. TVD methods had the theory, and this made them far more appealing to my technically immature mind. After the fact, I tried to project FCT methods onto the TVD theory. I wrote a paper documenting this effort. It was my first paper in the field. Unknowingly, I walked into a veritable minefield and complete shit show. All three of my reviewers were very well-known contributors to the field (I know it is supposed to be anonymous, but the shit show that unveiled itself unveiled the reviewers too).

The end result was that the paper was never published. This decision occurred five years after it was submitted, and I had simply moved on. My first review was from Ami Harten, who basically said the paper was awesome and should be published. He signed the review and sent me some lecture notes on the same topic. I was over the moon, and did call Ami and talk briefly. Six months later my second review came in. It was as different from Ami's as possible. It didn't say this exactly, but in a nutshell it said the paper was a piece of shit. It remains the nastiest and most visceral review I've ever gotten: technically flawless on one hand and thoroughly unprofessional in tone on the other. My third review came a year later and was largely editorial in nature. I revised the paper and resubmitted. While all this unfolded, Ami died, and the journal it was submitted to descended into chaos, partially due to the end of the Cold War and its research largesse. When it emerged from chaos, I decided that publishing the work was largely pointless and not worth the effort.

Some commentary about why this shit show happened is worth giving. It is all related to the holy war between two armed camps that arose with the invention of these methods and over who gets the credit. The paper attempted to bridge the FCT and TVD worlds, and stepped into the bitter fighting around previous publications. In retrospect, it is pretty clear that FCT was first, and others like Kolgan and Van Leer came after. Their methodologies and approaches were also fully independent, and the full similarity was not clear at the time. While the fullness of time shows these approaches to be utterly complementary, at the time of development it was seen as a competition. It was definitely not a collaborative endeavor, and the professional disagreements were bitter. They poisoned the field, and people took sides, viewing the other side with vitriolic fury. A friend and associate editor of the Journal of Computational Physics quipped that this was one of the nastiest sub-communities in the Journal, and asked why I insisted on working in this area. It is also one of the most important areas in computational physics, working on a very difficult problem. The whole field also hinges upon expert judgment and resists a firm quantitative standard of acceptance.

What an introduction to the field; it is genuinely amazing that I continue to work in it at all. Had I not enjoyed the technical content so much, and appreciated the importance of the field, I would have run. Perhaps greater professional success would have followed such a departure. In the long run this resistance and the rule of experts works to halt progress.

If you can’t solve a problem, then there is an easier problem you can solve: find it.

― George Pólya

Kolgan, V. P. “Application of the principle of minimum values of the derivative to the construction of finite-difference schemes for calculating discontinuous gasdynamics solutions.” TsAGI, Uchenye Zapiski 3, no. 6 (1972): 68-77.

Boris, Jay P., and David L. Book. “Flux-corrected transport. I. SHASTA, a fluid transport algorithm that works.” Journal of Computational Physics 11, no. 1 (1973): 38-69.

Van Leer, Bram. “Towards the ultimate conservative difference scheme. II. Monotonicity and conservation combined in a second-order scheme.” Journal of Computational Physics 14, no. 4 (1974): 361-370.

Van Leer, Bram. “Towards the ultimate conservative difference scheme. V. A second-order sequel to Godunov’s method.” Journal of Computational Physics 32, no. 1 (1979): 101-136.

Harten, Ami, Bjorn Engquist, Stanley Osher, and Sukumar R. Chakravarthy. “Uniformly high order accurate essentially non-oscillatory schemes, III.” Journal of Computational Physics 71, no. 2 (1987): 231-303.

Harten, Ami, and Stanley Osher. “Uniformly high-order accurate nonoscillatory schemes. I.” SIAM Journal on Numerical Analysis 24, no. 2 (1987): 279-309.

Harten, Amiram, James M. Hyman, Peter D. Lax, and Barbara Keyfitz. “On finite-difference approximations and entropy conditions for shocks.” Communications on Pure and Applied Mathematics 29, no. 3 (1976): 297-322.

 

10 Better Things for Scientific Computing to focus on in 2018

What I cannot create, I do not understand.

– Richard Feynman

We are in deep danger of relying upon science and associated software we do not understand because we have so broadly stopped the active creation of knowledge. I open with one of my favorite quotes by the great physicist Richard Feynman, who also wrote about Cargo Cult Science (https://en.wikipedia.org/wiki/Cargo_cult_science). It is a bold, but warranted, assertion that much of our scientific work today is taking on the character of Cargo Cult Science. We are not all the way there, but we have moved a long way toward taking on all of the characteristics of this pathology. In this assertion money is the “cargo” that pseudo-scientific processes are chasing. It is no exaggeration to say that getting funding for science has replaced the conduct and value of that science today. This is broadly true, and particularly true in scientific computing, where getting something funded has replaced funding what is needed or wise. The truth of the benefit of pursuing computer power above all else is decided upon a priori. The belief was that this sort of program could “make it rain” and produce funding because this sort of marketing had in the past. All results in the program must bow to this maxim and support its premise. All evidence to the contrary is rejected because it is politically incorrect and threatens the attainment of the cargo, the funding, the money. A large part of this utterly rotten core of modern science is the ascendency of the science manager as the apex of the enterprise. The accomplished scientist and expert is now merely a useful and necessary detail; the manager reigns as the peak of achievement.

The first principle is that you must not fool yourself — and you are the easiest person to fool.

We’ve learned from experience that the truth will come out. Other experimenters will repeat your experiment and find out whether you were wrong or right. Nature’s phenomena will agree or they’ll disagree with your theory. And, although you may gain some temporary fame and excitement, you will not gain a good reputation as a scientist if you haven’t tried to be very careful in this kind of work. And it’s this type of integrity, this kind of care not to fool yourself, that is missing to a large extent in much of the research in cargo cult science.

– Richard Feynman

If one looks at the scientific computing landscape today, one sees a single force for progress: the creation of a new, more powerful supercomputer that is much faster than anything we have today. The United States, Europe and China are all pursuing this path for advancing scientific computing. It is a continuation of a path we have pursued for the last 25 years, but our future is not remotely like the last 25 years. This approach to progress can be explained simply and marketed to the naïve and untechnical. This works because our National leadership is increasingly naïve, witless and obsessively anti-intellectual, lacking any technical sophistication. We are in the midst of a tide of low-information leadership who are swayed by sweet-sounding bullshit far more easily than hard-nosed facts.

The farther backward you can look, the farther forward you are likely to see.

― Winston S. Churchill

In this putrid environment, faster computers seem an obvious benefit to science. They are a benefit and a pathway to progress; this is utterly undeniable. Unfortunately, it is an expensive and inefficient path to progress, and an incredibly bad investment in comparison to the alternatives. The numerous problems with the exascale program are subtle, nuanced, highly technical and pathological. As I've pointed out before, the modern age is no place for subtlety or nuance; we live in an age of brutish simplicity where bullshit reigns and facts are optional. In such an age, exascale is an exemplar: it is a brutally simple approach tailor-made for the ignorant and witless. If one is willing to cast away the cloak of ignorance and embrace subtlety and nuance, a host of investments can be described that would benefit scientific computing vastly more than the current program. If we followed a better balance of research, computing would contribute to science far more greatly and scale far greater heights than the current path provides.

Applications that matter to something big would create a great deal of this focus naturally. The demands of doing something real and consequential would breed a necessity to focus progress in an organic way. Last week I opined that such big things are simply not present today in science or in society's broader narrative. Society is doing nothing big or aspirational or challenging to drive progress forward with genuine purpose. To be more pointed, the push for exascale is not big at all; it is rather an exemplar of the lack of vision and consequence. There is a bit of a chicken-and-egg argument to all this. The bottom line is a general lack of underlying and defining purpose to our efforts in computing. Exascale is what we do when we want to market something as “feeling” big, when it is actually doing something small and inconsequential.

Those who do not move, do not notice their chains.

― Rosa Luxemburg

How can I say such a thing?

In a nutshell, computing speed is one of the least efficient and least effective ways to improve computational science. It has only been an enabler because computing speed came for free with Moore's law for most of the last half century. That free lunch is over and past, yet we willfully ignore this reality (http://herbsutter.com/welcome-to-the-jungle/). Even with Moore's law fully in effect, it was never the leading contributor to progress; progress was paced by numerical methods and algorithmic scaling. Moreover, computing speed cannot fix modeling that is wrong (methods and algorithms don't fix this either). If a model is wrong, the wrong answer is simply computed much faster. Of course, we know that every model is wrong, and the utility of any model is determined via V&V. Issues associated with the use of computing, naïve code users, the loss of expertise, and understanding are simply overlooked, or worse yet made more intractable through inattention.

Each of these advances has been mentioned before in the guise of a full blog post, but it is useful to put things together to see the wealth of unused opportunity.

80% of results come from 20% of effort/time

― Vilfredo Pareto

  1. Modernizing modeling ought to be a constant and consistent emphasis in science, and computational science is no different. For some reason, modeling advances have simply stopped. Our basic models of reality are increasingly fixed and immutable, and ever less fit for future purpose. The models of reality have become embedded in computer codes, and ultimately central to the codes' structure in numerous respects. As such we embed a framework for modeling whose foundation becomes invariant. We can't change the model without developing an entirely different code. We reduce our modeling to submodels and closure of existing models while staying within a fundamental modeling framework. This is another area where progress is phenomenally risky to approach and substantially prone to failures and misguided efforts. Without failure, the ability to learn and produce new and improved models is virtually impossible. https://wjrider.wordpress.com/2015/02/02/why-havent-models-of-reality-changed-more/, https://wjrider.wordpress.com/2015/07/03/modeling-issues-for-exascale-computation/, https://wjrider.wordpress.com/2017/07/07/good-validation-practices-are-our-greatest-opportunity-to-advance-modeling-and-simulation/
  2. Modernizing methods is not happening. Since methods are one of the best ways to improve the efficient and effective solution of models, progress is harmed in a manner that cannot be easily recovered by other means. Usually when a model is decided upon, a method is used to solve the model numerically. The numerical method is only slightly less code-specific and invariant than the model itself. By virtue of this character, the basic numerical method for a model becomes indistinguishable from the code. If we preserve the code base, we preserve old methods, which means no progress. We are stuck using relatively low-order methods with crude stability mechanisms. The ability to use high-order methods with enhanced accuracy and efficiency is not advancing. Research in numerical methods and the practical application of numerical methods are becoming increasingly divorced from one another. The gap has grown into a chasm, and numerical methods research is losing relevance. Part of the problem is related to the standards of success, where methods research allows success to be found on easier problems rather than keeping the problem difficulty fixed. This is yet another place where the inability to accept failure as a necessary element (or even fuel) for success is fatal. https://wjrider.wordpress.com/2016/06/14/an-essential-foundation-for-progress/, https://wjrider.wordpress.com/2016/07/25/a-more-robust-less-fragile-stability-for-numerical-methods/
  3. Algorithmic scaling is the most incredible thing we could achieve in terms of computational performance. The ability to change the scaling exponent on how much work it takes to solve a problem can have a magical impact. Linear algebra is the posterchild for this effect. A breakthrough in scaling can make the impossible problem possible and even routine to solve. The classical naïve scaling for matrix inversion has the work growing with the cube of the problem size. Even small problems quickly become utterly intractable, and almost no amount of computer power can fix this. Change the scaling to quadratic and new problems suddenly become routine; change the scaling to linear and problems that were unimaginable before can be tackled routinely. We are stuck at linear, although some fields are starting to see sublinear algorithms. Could these breakthroughs be more common and useful? If they could, the impact on computational science would easily overwhelm the capacity of exascale (a small illustration of why scaling beats raw speed follows this list). Today we aren't even trying to make these advances. In my view, such work is generically risky and prone to failure, and failure is something that has become intolerable, thus success is sacrificed. https://wjrider.wordpress.com/2015/05/29/focusing-on-the-right-scaling-is-essential/
  4. Today supercomputing is completely at odds with the commercial industry. After decades of first pacing advances in computing hardware, then riding along with increases in computing power, supercomputing has become separate. The separation occurred when Moore's law died at the chip level (in about 2007). The supercomputing world has become increasingly desperate to continue the free lunch, and tied to an outdated model for delivering results. Basically, supercomputing is still tied to the mainframe model of computing that died in the business World long ago. Supercomputing has failed to embrace modern computing with its pervasive and multiscale nature, moving all the way from mobile to cloud. https://wjrider.wordpress.com/2017/12/15/scientific-computings-future-is-mobile-adaptive-flexible-and-small/
  5. Verification & validation – if the scientific computing efforts are to be real scientific endeavors, V&V is essential. Computational modeling is still modeling, and comparison with experiment is the gold standard for modeling, but with computational work the comparison has numerous technical details needing serious attention. In a very complete way, V&V is the scientific method in action within the context of modeling and simulation. This energizes a top-to-bottom integration of scientific activities and essential feedback up and down this chain. The process produces actionable evidence of how progress is being made and where the bottlenecks to progress exist. The entirety of the V&V work provides a deep technical discourse on the breadth of computational science. The whole of computational science can be improved by its proper application. By weakly supporting V&V, current efforts are cutting themselves off from the integration of the full scientific enterprise and from impact on the scientific use of computation. https://wjrider.wordpress.com/2016/12/22/verification-and-validation-with-uncertainty-quantification-is-the-scientific-method/
  6. Expansive uncertainty quantification – too many uncertainties are ignored rather than considered and addressed. Uncertainty is a big part of V&V, a genuinely hot topic in computational circles, and practiced quite incompletely. Many view uncertainty quantification as a small set of activities that only address a small piece of the uncertainty question. Too much benefit is gained by simply ignoring a real uncertainty because the value of zero that is implicitly assumed is never challenged. This is exacerbated significantly by a half-funded and deemphasized V&V effort in scientific computing. Significant progress was made several decades ago, but the signs now point to regression. The result of this often willful ignorance is a lessening of the impact of computing and a limiting of its true benefits. https://wjrider.wordpress.com/2016/04/22/the-default-uncertainty-is-always-zero/
  7. Data integration and analysis – one of the latest hot topics is big data and data analysis. The internet and sensors are creating massive amounts of data, and its use is a huge technical problem. The big data issue is looking for significant and actionable understanding from the oceans of data. A related and perhaps more difficult problem is small data, where there isn't enough data, or not enough of the data you want. Lots of science and engineering is data limited to a degree that scientific understanding is limited. Modeling and simulation offers a vehicle to augment this data and fill in the gaps. Doing this in a manner that is credible will be a huge challenge. The ways forward with credibility use V&V and intensive uncertainty quantification. The proper use of codes and the role of calibration also become critical to success. https://wjrider.wordpress.com/2016/07/10/10-big-things-for-the-future-of-computational-science/
  8. Multidisciplinary, multiscale science – one of the hot topics a quarter century ago was better multiphysics methods to replace the pervasive use of operator splitting in complex codes. This effort has utterly failed; we have made very little progress. Part of the issue is the inability to produce computational algorithms that are efficient enough to compete. A fully coupled method ends up being so expensive that any accuracy increase from the improved coupling is rendered ineffective. A second and perhaps more powerful reason for the lack of progress is the computer codes. Old computer codes are still being used, and most of them use operator splitting. Back in the 1990s a big deal was made about replacing legacy codes with new codes. The codes developed then are still in use, and no one is replacing them. The methods in these old codes are still being used, and now we are told that the codes need to be preserved. The codes, the models, the methods and the algorithms all come along for the ride. We end up having no practical route to advancing the methods. https://wjrider.wordpress.com/2016/09/16/is-coupled-or-unsplit-always-better-than-operator-split/
  9. Complete code refresh – we have produced, and now we are maintaining, a new generation of legacy codes. A code is a repository for vast stores of knowledge in modeling, numerical methods, algorithms, computer science and problem solving. When we fail to replace codes, we fail to replace knowledge. The knowledge comes directly from those who write the code and create the ability to solve useful problems with that code. Much of the methodology for problem solving is complex and problem specific. Ultimately a useful code becomes something that many people are deeply invested in. In addition, the people who originally wrote the code move on, taking their expertise, history and knowledge with them. The code becomes an artifact for this knowledge, but it is also a deeply imperfect reflection of that knowledge. The code usually contains some techniques that are magical and unexplained. These magic bits of code are often essential for success; if they get changed, the code ceases to be useful. The result of this process is a deep loss of the expertise and knowledge that arise from creating a code that can solve real problems. If a legacy code continues to be used, it also acts to block progress on all the things it contains, starting with the model and its fundamental assumptions. As a result, progress stops because even when there are research advances, they have no practical outlet. This is where we are today. https://wjrider.wordpress.com/2015/10/30/preserve-the-code-base-is-an-awful-reason-for-anything/ https://wjrider.wordpress.com/2016/01/01/are-we-really-modernizing-our-codes/ https://wjrider.wordpress.com/2016/01/14/a-response-to-criticism-are-we-modernizing-our-codes/ https://wjrider.wordpress.com/2014/03/20/legacy-code-is-terrible-in-more-ways-than-advertised/
  10. Democratization of expertise – the manner in which codes are applied has a very large impact on solutions. The overall process is often called a workflow, encapsulating activities starting with problem conception, meshing, modeling choices, code input, code execution, data analysis and visualization. One of the problems that has arisen is the use of codes by non-experts. Increasingly, code users are simply not sophisticated and treat codes like black boxes. Many refer to this as the democratization of the simulation capability, which is generally beneficial. On the other hand, we increasingly see calculations conducted by novices who are generally ignorant of vast swaths of the underlying science. This characteristic is keenly related to a lack of V&V focus and loose standards of acceptance for calculations. Calibration is becoming more prevalent again, and distinctions between calibration and validation are vanishing anew. The creation of broadly available simulation tools must be coupled to first-rate practices and appropriate professional education. In both of these veins the current trends are completely in the wrong direction. V&V practices are in decline and recession. Professional education is systematically getting worse as the educational mission of universities is attacked and diminished, along with the role of elites in society. https://wjrider.wordpress.com/2016/12/02/we-are-ignoring-the-greatest-needs-opportunities-for-improving-computational-science/
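As promised in item 3, here is a minimal, purely illustrative Python sketch of why the scaling exponent matters more than raw speed; the function and the numbers are hypothetical, not drawn from any particular solver.

```python
def work_estimate(n, exponent):
    """Rough operation count for a solver whose cost grows as n**exponent."""
    return float(n) ** exponent

# A machine 1000x faster buys only a 10x larger problem for a cubic solver
# (1000**(1/3) == 10), but a 1000x larger problem for a linear-scaling one.
for n in (10**3, 10**6, 10**9):
    ratio = work_estimate(n, 3) / work_estimate(n, 1)
    print(f"n = {n:.0e}: cubic solver needs {ratio:.1e}x the work of a linear one")
```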


One of the key aspects of this discussion is recognizing that these activities are all present to some small degree in exascale, but all of them are subcritical. The program basically starves all of these valuable activities and only supports them in a fashion that creates a “zombie-like” existence. As a result, the program is turning its back on a host of valuable avenues for progress that could make an exascale computer far more useful. Our present path has genuine utility, but represents an immense opportunity cost if you factor in what could have been accomplished instead with better leadership, vision and technical sophistication. The way we approach science more broadly is permeated with these inefficiencies, meaning our increasingly paltry investments in science are further undermined by our pathetic execution. At the deepest level, our broader societal problems revolving around trust, expertise, scandal and taste for failure may doom any project unless they are addressed. For example, the issues related to the preservation of code bases (i.e., creating new legacy codes) are creating deep problems with advancing on the essential fronts of modeling, methods and algorithms. Everything is woven together into a tapestry whose couplings cannot be ignored. This is exactly the sort of subtlety and nuance our current time finds utterly incomprehensible.

Postscript:

It is sometimes an appropriate response to reality to go insane.

― Philip K. Dick

Healey’s First Law Of Holes: When in one, stop digging.

― Denis Healey

Last week I tried to envision a better path forward for scientific computing. Unfortunately, a truly better path flows invariably through a better path for science itself and the Nation as a whole. Ultimately scientific computing, and science more broadly, is dependent on the health of society in the broadest sense. It also depends on leadership and courage, two other attributes we are lacking in almost every respect. Our society is not well; the problems we are confronting are deep and perhaps the most serious crisis since the Civil War. I believe that historians will look back on 2016-2018, and perhaps longer, as the darkest period in American history since the Civil War. We can't build anything great when the Nation is tearing itself apart. I hope and pray that this will be resolved before we plunge deeper into the abyss in which we find ourselves. We see the forces opposed to knowledge, progress and reason emboldened and running amok. The Nation is presently moving backward and embracing a deeply disturbing and abhorrent philosophy. In such an environment science cannot flourish; it can only survive. We all hope the darkness will lift and we can again move forward toward a better future, one with purpose and meaning where science can be a force for the betterment of society as a whole.

Everything passes, but nothing entirely goes away.

― Jenny Diski

Toward a More Useful and Impactful Scientific Computing in 2018?

The purpose of life is not to be happy. It is to be useful, to be honorable, to be compassionate, to have it make some difference that you have lived and lived well.

― Ralph Waldo Emerson

It would really be great to be starting 2018 feeling good about the work I do. Useful work that impacts important things would go a long way toward achieving this. I've put some thought into considering what might constitute work having these properties. This has two parts: what work would be useful and impactful in general, and what would be important to contribute to. A necessary subtext to this conversation is the conclusion that most of the work we are doing in scientific computing today is neither useful nor impactful, and nothing important is at stake. This alone is a rather bold assertion. Simply put, as a Nation and society we are not doing anything aspirational, nothing big. This shows up in the lack of substance in the work we are paid to pursue. More deeply, I believe that if we did something big and aspirational, the utility and impact of our work would simply sort itself out as part of a natural order.

The march of science in the 20th Century was deeply impacted by international events: several World Wars and a Cold (non) War that spurred National interests in supporting science and technology. The twin projects of the atom bomb and the nuclear arms race, along with space exploration, drove the creation of much of the science and technology of today. These conflicts steeled resolve, granted purpose and provided the resources needed for success. They were important enough that efforts were earnest. Risks were taken because risk is necessary for achievement. Today we don't take risks because nothing important is at stake. We can basically fake results and market progress where little or none exists. Since nothing is really that essential, bullshit reigns supreme.

There is only one thing that makes a dream impossible to achieve: the fear of failure.

― Paulo Coelho

One of the keys to these conflicts was the presence of a worthy adversary to steel ourselves for the push forward. Both Nazi Germany and Soviet Russia were worthy enemies whose competence meant putting our best foot forward. In reality and rhetorically, we lack such an adversary today to push us. We needed to fully commit and faithfully execute our endeavors to achieve victory against these enemies. These opponents had the clear capacity to destroy the United States and the West if the resistance was not real. Ironically, the Soviets were ultimately defeated by bullshit. The Strategic Defense Initiative, or Star Wars, bankrupted the Soviets. It was complete bullshit and never had a chance to succeed. This was a brutal harbinger of today's World where reality is optional, and marketing is the coin of the realm. Today American power seems unassailable. This is partially true and partially over-confidence. We are not on our game at all, and far too much of our power is based on bullshit. As a result, we can basically just pretend to try, and actually not execute anything with substance and competence. This is where we are today; we are doing nothing important, and wasting lots of time and money in the process.

How do you defeat terrorism? Don’t be terrorized.

― Salman Rushdie

Again, I freely admit that this is a bold assertion. In scientific computing, we have a National exascale program that underpins National security and economic interests. It contributes to all of these things in massive ways, at least rhetorically. This support for these National goals is pure marketing, or less generously, absolute bullshit. It simply trots out a bunch of tired sales pitches for scientific computing that lack any soul and increasingly lack substance. The Nation has no large objectives to support; the entire system is drifting along on auto-pilot. It is brimming with over-confidence and a feeling of superiority that only needs a worthy opponent to expose our largesse. We have no enemies that are remotely worthy. We have created some chicken-shit paper tigers like Iran, North Korea and the amorphous and largely toothless Islamic fundamentalism. None of these enemies is even the remotest threat to the United States, or the West in general. If they were a worthy threat, then we are in awful shape and far worse off than we actually are. Terrorism is only as much of a threat as we make it. We have stoked fear and let ourselves be terrorized because it is useful for the defense-intelligence industrial complex. It has put trillions of dollars into their coffers, and done little or nothing to build a future. We could simply defeat these enemies by refusing to be terrorized. Some courage and resilience as a Nation would be sufficient to render these pathetic enemies utterly impotent. The greatest damage and threat from these enemies is our response to them, not the actual carnage. Our “leaders” are using them to spread fear among the populace to further their own agendas.

The result of the current model is a research establishment that only goes through the motions and does little or nothing. We make lots of noise and produce little substance. Our Nation deeply needs a greater purpose. There are plenty of worthier National goals. If war-making is needed, Russia and China are still worthy adversaries. For some reason, we have chosen to capitulate to Putin's Russia simply because they are an ally against the non-viable threat of Islamic fundamentalism. This is a completely insane choice that is only rhetorically useful. If we want peaceful goals, there are challenges aplenty. Climate change and weather are worthy problems to tackle, requiring both scientific understanding and societal transformation to conquer. Creating clean and renewable energy that does not create horrible environmental side effects remains unsolved. Solving the international need for food and prosperity for mankind is always there. Scientific exploration, and particularly space, remains an unconquered frontier. Medicine and genetics offer new vistas for scientific exploration. All of these areas could transform the Nation in broad ways, socially and economically. All of these could meet broad societal needs. More to the point of my post, all need scientific computing in one form or another to fully succeed. Computing always works best as a useful tool employed to help achieve objectives in the real World. Real-World problems provide constraints and objectives that spur innovation and keep the enterprise honest.

Reality is that which, when you stop believing in it, doesn’t go away.

― Philip K. Dick

Instead, our scientific computing is being applied as a shallow marketing ploy to shore up a vacuous program. Nothing really important or impactful is at stake. The applications for computing are mostly make-believe and amount to nothing of significance. The marketing will tell you otherwise, but the lack of gravity in the work is clear and poisons it. The result of this lack of gravity is phony goals and objectives that have the look and feel of impact, but contribute nothing toward an objective reality. This lack of contribution comes from the deeper malaise of purpose as a Nation, and of science's role as an engine of progress. With little or nothing at stake, the tools used for success suffer, and scientific computing is no different. The standards of success simply are not real, and lack teeth. Even stockpile stewardship is drifting into the realm of bullshit. It started as a worthy program, but over time it has been allowed to lose its substance. Political and financial goals have replaced science and fact, with the goals of the program losing their connection to objective reality.

Scientific computing came to maturity as an important supporting player for large enterprises. It was originally born in the Cold War as a key tool for science and engineering supporting defense science. Scientific computing spread from this base toward more general science, and more recently into broad use by business and society as a whole. The kernel from which computing sprang was an interwoven set of large National objectives providing the technical foundation that powers our economy today. Computing was a key contributing player in these endeavors. These endeavors also supported a broad phalanx of other technologies and scientific explorations that formed the basis for modernizing the world. Such over-arching goals are breathtakingly missing today. We lack a World with any vision of a better future and limitless progress.

If we could marshal our efforts into some worthy efforts, what would we work on?

We would still be chasing faster computers, but the faster computers would not be the primary focus. We would focus on using computing to solve problems that were important. We would focus on making computers that were useful first and foremost. We would want computers that were faster as long as they enabled progress on problem solving. As a result, efforts would be streamlined toward utility. We would not throw vast amounts of effort into making computers faster just to make them faster (this is what is happening today; there is no rhyme or reason to exascale other than “faster is better, duh!”). Utility means that we would honestly look at what is limiting problem solving and put our efforts into removing those limits. The effect of this dose of reality on our current efforts would be stunning; we would see a wholesale change in our emphasis and a focus away from hardware. Computing hardware would take its proper role as an important tool for scientific computing and no longer be the driving force. The fact that hardware is a driving force for scientific computing is one of the clearest indicators of how unhealthy the field is today.

Thinking something does not make it true. Wanting something does not make it real.

― Michelle Hodkin

If scientific computing were taking its role in a healthy National enterprise, the focus would be entirely different. Invariably we would see a very strong emphasis on modeling. In almost every serious endeavor using computing to get real design and analysis results, the physical modeling is the greatest limiting factor. A faster computer is always welcome, but a faster computer never fixes a faulty model. This maxim seems to be utterly and completely ignored in the current scientific computing narrative. The most effective way to improve modeling is also different from the current emphasis. Better numerical methods and algorithms provide faster and more accurate solutions to models than computing hardware does. This is another area where progress is completely stalled.

Current computing efforts focus only on porting old codes to new computers, a process that keeps old models, methods and algorithms in place. This is one of the most corrosive elements in the current mix. The porting of old codes is the utter abdication of intellectual ownership. These old codes are scientific dinosaurs that act to freeze antiquated models, methods and algorithms in place while squashing progress. Worse yet, the skill sets necessary for improving the most valuable and important parts of modeling and simulation are allowed to languish. This is worse than simply choosing a less efficient road; this is going backwards. When we need to turn our attention to serious, real work, our scientists will not be ready. These choices are dooming an entire generation that could have been making breakthroughs to simply becoming caretakers. To be proper stewards of our science we need to write new codes containing new models using new methods and algorithms. Porting codes turns our scientists into mindless monks simply transcribing sacred texts without any depth of understanding. It is a recipe for transforming our science into magic. It is the recipe for defeat and the passage away from the greatness we once had.

Without Your Opponent, You are no Victor.

― Anajo Black


Saying “NO!” is the key to success

Things which matter most must never be at the mercy of things which matter least.

― Johann Wolfgang von Goethe

My work day is full of useless bullshit. There is so much bullshit that it has choked out the room for inspiration and value. We are not so much managed as controlled. This control comes from a fundamental distrust of each other, to a degree that any independent ideas are viewed as dangerous. This realization has come upon me in the past few years. It has also occurred to me that this could simply be a mid-life crisis manifesting itself, but the evidence seems to indicate that it is something more significant (look at the bigger picture of the constant crisis my Nation is in). My mid-life attitudes are simply much less tolerant of time-wasting activities with little or no redeeming value. You realize that your time and energy are limited; why waste them on useless things?

You and everyone you know are going to be dead soon. And in the short amount of time between here and there, you have a limited amount of fucks to give. Very few, in fact. And if you go around giving a fuck about everything and everyone without conscious thought or choice—well, then you’re going to get fucked.

― Mark Manson

I read a book that had a big impact on my thinking, "The Subtle Art of Not Giving a Fuck" by Mark Manson. In a nutshell, the book says that you have a finite number of fucks to give in life and you should optimize your life by mindfully not giving a fuck about unimportant things. This gives you the time and energy to actually give a fuck about things that actually matter. The book isn't about not caring; it is about caring about the right things and dismissing the wrong things. What I realized is that, increasingly, my work isn't competing for my fucks; my employers just assume that I will spend my limited fucks on complete bullshit out of duty. It is actually extremely disrespectful of me and my limited time and effort. One conclusion is that the "bosses" (the Lab, the Department of Energy) do not give enough of a fuck about me to treat my limited time and energy with respect and make sure my fucks actually matter.

Maturity is what happens when one learns to only give a fuck about what’s truly fuckworthy.

― Mark Manson

I’ve realized recently that a sense of being inspired has departed from work. I’ve felt this building for years with the feeling that my work is useful and important ebbing away. I’ve been blessed for much of my career with work that felt important and useful where an important component of the product was my own added creativity. The work included a distinct element of my own talents and ideas in whatever was produced.

Superficially, the elements of inspiration seem to be present: work with meaning and importance, and a sense of substantial freedom. As I implied, these elements are superficial; the reality is that each of these pieces has eroded away, and it is useful to explore how this has happened. The job I have would be a dream to most people, but conditions are degrading. It isn't just my job; most Americans are experiencing worsening conditions. The exception is the top of the management class, the executives. This is a mirror of broader societal inequalities, logically expressed in the working environment. The key is recognizing that my job used to be much better, and that is something worth exploring in some depth.

At one level, I should be in the midst of a glorious time to be working in computational science and high-performance computing. We have a massive National program focused on achieving "exascale" or at the very least a great advance in computing power. Looking more closely, we can see deep problems that produce an inspiration gap. On the one hand, the technical objectives for the program are obsessively focused on hardware as the engine of progress. We have been on this hardware path for 25 years producing progress, but no transformation in science has actually occurred (the powers that be will say it has, but the truth is that it hasn't). Our computations are still not predictive, and the hardware is not the limiting aspect of computational science. Worse yet, the opportunity for massive hardware advances has passed, and advancing now is fraught with difficulties and roadblocks and will be immensely costly. Aside from hardware, the program is largely focused on low-level software for porting old codes, methods and models (note: the things being ported and not invested in are the actual science!). It is not focused on the more limiting aspects of predictive modeling because they are subtle and risky to work on. They cannot be managed like a construction project using off-the-shelf management practices better suited for low-wage workers and unsuitable for scientists. The hardware path is superficial, easy to explain to the novice, and managed as a project much like building a bridge or road.

This gets to the second problem with the current programs: how they are managed. Science cannot be managed like a big construction project, at least not successfully. The result of this management model is a stifling level of micromanagement. Our management model is defined by overwhelming suspicion and lack of trust, resulting in massive inefficiency. The reporting requirements for this mode of management are massive and without value except to bean-counters. At the same time, there is no appetite for risk, and no capacity to tolerate failure. As a result, the entire program loses the ability to inspire, or to reach for greatness.

If the Apollo Program had been managed in this fashion, we would have never made it to the Moon while spending vastly greater sums of money. If we had managed the Manhattan Project in this way, we would have failed to create the atomic bomb. Without risk, there is no reward. A huge amount of resources and effort is wasted. We do not lack money as much as we lack vision, inspiration and competent management. This is not to say that the United States does not have an issue investing in science and technology; we do. The current level of commitment to science and technology will assure that some other nation becomes the global leader in science and technology. A compounding issue to the lack of investment is how appallingly inefficient our investment is because of how science is managed today. A complementary compounding element is the lack of trust in the scientists and engineers. Without trust, no one will take any risk, and without taking risks nothing great will ever be achieved. If we don't solve these problems, we will not produce greatness, plain and simple; we will create decline and decay into mediocrity.

But until a person can say deeply and honestly, "I am what I am today because of the choices I made yesterday," that person cannot say, "I choose otherwise."

― Stephen R. Covey

None of these problems suddenly appeared. They are the consequence of decades of evolution toward the current completely dysfunctional management approach. Once-great Laboratories have been brought to heel with a combination of constraints, regulations and money. There is more than enough money and people to accomplish massive things. The problem is that the constraints and regulatory environment have destroyed any chance for achievement. With each passing year our scientific programs sound more expansive, but are less capable of achieving anything of substance. Our management approach is undermining achievement at every turn. The focus of the management is not producing results, but producing the appearance of success without regard for reality. The workforce must be compliant and never make any mistakes. The best way to avoid mistakes is low-balling results. You always aim low to avoid the possibility of failing. Each year we aim a little lower, and achieve a little less. This has produced a steady erosion of capability, much like an interest-bearing account in reverse.

If we look at work, it might seem that an inspired workforce would be a benefit worth creating. People would work hard and create wonderful things because of the depth of their commitment to a deeper purpose. An employer would benefit mightily from such an environment, and the employees could flourish, brimming with satisfaction and growth. With all these benefits, we should expect the workplace to naturally create the conditions for inspiration. Yet this is not happening; the conditions are the complete opposite. The reason is that inspired employees are not entirely controlled. Creative people do things that are unexpected and unplanned. The job of managing a workplace like this is much harder. In addition, mistakes and bad things happen. Failure and mistakes are an inevitable consequence of hard-working, inspired people. This is the thing that our workplaces cannot tolerate. The lack of control and unintended consequences are unacceptable. Fundamentally this stems from a complete lack of trust. Our employers do not trust their employees at all. In turn, the employees do not trust the workplace. It is a vicious cycle that drags inspiration under and smothers it. The entire environment is overflowing with micromanagement, control, suspicion and doubt.

In the end that was the choice you made, and it doesn't matter how hard it was to make it. It matters that you did.

― Cassandra Clare

How do we change it?

One clear way of changing this is giving the employees more control over their work. It has become very clear to me that we have little or no power to make choices at work. One of the clearest ways of making a choice is being given the option to say "NO". Many articles are written about the power of saying NO to things because it makes your "YES" more powerful. The problem is that we can't say NO to so many things. I can't begin to elaborate on all the functionally useless things that I don't have the option of skipping. I spend a great deal of effort on mandatory meetings, training, and reporting that has no value whatsoever. None of it is optional, and most of it is completely useless. Each of these useless activities drains away energy from something useful. All of the useless things I do are related to a deep lack of trust in me and my fellow scientists.

Let's take the endless reporting and tracking of work as a key example. There is nothing wrong with planning a project and getting updates on progress. This is not what is happening today. We are seeing a system that does not trust its employees and needs to continually look over their shoulders. A big part of the problem is that the employees are completely uninspired because the programs they work on are terrible. The people see very little of themselves in the work, or much purpose and meaning in the work. Rather than make the work something deeper and more collaborative, the employers increase the micromanagement and control. A big part of the lack of trust is the reporting. Somehow the whole concept of quarterly progress reporting used in business has become part of science, creating immense damage. Lately quarterly progress isn't enough, and we've moved to monthly reporting. All of this says, "we don't trust you," "we need to watch you closely" and "don't fuck up".

The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum….

― Noam Chomsky

If we can't say NO to all this useless stuff, we can't say YES to things either. My work and time budget is completely stuffed with non-optional things that I should say NO to. They are largely useless and produce no value. Because I can't say NO, I can't say YES to something better. My employer is sending a message to me with very clear emphasis: we don't trust you to make decisions. Your ideas are not worth working on. You are expected to implement other people's ideas no matter how bad they are. You have no ability to steer the ideas to be better. Your expertise has absolutely no value. A huge part of this problem is the ascendancy of the management class as the core of organizational value. We are living in the era of the manager; the employee is a cog and not valued. Organizations voice platitudes toward the employees, but they are hollow. The actions of the organization spell out its true intent. Employees are not to be trusted; they are to be controlled, and they need to do what they are told to do. Inspired employees would do things that are not intended, and take organizations in new directions, focused on new things. This would mean losing control and changing plans. More importantly, the value of the organization would move away from the managers and toward the employees. Managers are much happier with employees that are "seen and not heard".

If something is not a "hell, YEAH!", then it's a "no!"

― James Altucher

What should I be saying YES to?

If I could say YES, then I might be able to put my focus into useful, inspired and risky endeavors. I could produce work that might go in directions that I can't anticipate or predict. These risky ideas might be complete failures. From failure I could learn invaluable lessons, and grow my knowledge and expertise. Being risky, these ideas might produce something amazing and create something of real value. None of these outcomes are a sure thing. All of these characteristics are unthinkable today. Our managers want a sure thing and cannot deal with unpredictable outcomes. The biggest thing our managers cannot tolerate is failure. Failure is impossible to take and leads to career-limiting consequences. For this reason, inspired risks are impossible to support. As a result, I can't say NO to anything, no matter how stupid and useless it is. In the process, I see work as an increasingly frustrating waste of my time.

Action expresses priorities.

― Mahatma Gandhi

We all have limits defined by our personal time and effort. Naturally we have 24 hours a day, 7 days a week and 365 days a year, along with our own personal energy budget. If we are managed well, we can expand our abilities and create more. We can be more efficient and work more effectively. If one looks honestly at how we are managed, expanding our abilities and personal growth has almost no priority. Creating an inspiring and exciting place to work is equally low on the list. Given the pathetic level of support for creation and inspiration, attention naturally turns elsewhere. Everyone needs a level of balance in their lives, and we obviously gravitate toward places where a difference can be made.

As Mark Manson writes, we only have so many fucks to give, and my work is doing precious little to earn them. I have always focused on personal growth, and increasingly personal growth is resisted by work rather than supported by it. It has become quite obvious that being the best "me" is not remotely a priority. The priority at work is to be compliant, take no risks, fail at nothing and help produce marketing material for success and achievement. We aren't doing great work anymore, but we pretend we are. My work could simply be awesome, but that would require giving me the freedom to set priorities, take risks, fail often, learn continually and actually produce wonderful things. If this happened, the results would speak for themselves and the marketing would take care of itself. When the Labs I've worked at were actually great, this is how it happened. The Labs were great because they achieved great things. The Labs said NO to a lot of things, so they could say YES to the right things. Today, we simply don't have this freedom.

We are our choices.

― Jean-Paul Sartre

If we could say NO to the bullshit, and give our limited fucks a powerful YES, we might be able to achieve great things. Our Labs could stop trying to convince everyone that they are doing great things and actually do great things. The missing element at work today is trust. If the trust were there, we could produce inspiring work that would generate genuine pride and accomplishment. Computing is a wonderful example of these principles in action. Scientific computing became a force in science and engineering by contributing to genuine endeavors with massive societal goals. Computing helped win the Cold War and put a man on the moon. Weather and climate have been modeled successfully. More broadly, computers have reshaped business and now society at large. All of these endeavors had computing contributing to solutions. Computing focused on computers was not the endeavor itself, as it is today. The modern computing emphasis was originally part of a bigger program of using science to support the nuclear stockpile without testing. It was part of a focused scientific enterprise and objective. Today it is a goal unto itself, not moored to anything larger. If we want to progress and advance science, we should focus on great things for society, not superficially put our effort into mere tools.

Most of us spend too much time on what is urgent and not enough time on what is important.

― Stephen R. Covey

Say no to everything, so you can say yes to the one thing.

― Richie Norton

Verification and Validation's Biggest Hurdle is Honesty

Better to get hurt by the truth than comforted with a lie.

― Khaled Hosseini

Being honest about one's shortcomings is incredibly difficult. This is true whether one is looking at oneself or at a computer model. It's even harder to let someone else be honest with you. This difficulty is the core of many problems with verification and validation (V&V). If done correctly, V&V is a form of radical honesty that many simply cannot tolerate. The reasons are easy to see if our reward systems are considered. Computer modelers want to get great results on the problems they want to solve. Computer modelers are rated on their ability to get seemingly high-quality answers (https://wjrider.wordpress.com/2016/12/22/verification-and-validation-with-uncertainty-quantification-is-the-scientific-method/). As a result, there is significant friction with honest V&V assessments, which cast uncertainty and doubt on the quality of results. The tension between good results and honesty will always favor the results. Thus V&V is done poorly to preserve the ability of modelers to believe their results are better than they really are. If we want V&V to be done well, an additional level of emphasis needs to be placed on honesty.

If you do not tell the truth about yourself you cannot tell it about other people.

― Virginia Woolf

V&V is about assessing capability. It is not about getting great answers. This distinction is essential to recognize. V&V is about collecting highly credible evidence about the nature of a modeling capability. By its very nature, the credibility of the evidence means that the results are whatever the results happen to be. If the results are good, the evidence will show this persuasively. If the results are poor, the evidence will indicate their quality (https://wjrider.wordpress.com/2017/09/22/testing-the-limits-of-our-knowledge/). The utility of V&V is providing a path to improvement along with evidence to support that path; the improved results would then themselves be supported by V&V assessments. This entire process is predicated on the honesty of those conducting the work, but the management of these efforts is a problem. Management is continually trying to promote the narrative of great modeling results. Unless the results are actually great, this promotion pushes toward lower-quality V&V. In the process, honesty and evidence are typically sacrificed.


The ASME V&V Standards Committee in Computational Modeling and Simulation provides procedures for assessing and quantifying the accuracy and credibility of computational modeling and simulation through its standards subcommittees: V&V-10, Verification and Validation in Computational Solid Mechanics; V&V-20, Verification and Validation in Computational Fluid Dynamics and Heat Transfer; V&V-30, Verification and Validation in Computational Simulation of Nuclear System Thermal Fluids Behavior; and V&V-40, Verification and Validation in Computational Modeling of Medical Devices.

If we want to do V&V properly, something in this value system needs to change. Fundamentally, honesty and a true understanding of the basis of computational modeling must surpass the desire to show great capability. The trends in the management of science are firmly arrayed against honestly assessing capability. The prevalence of management by press release and of marketing-based sales pitches for science money both promote a basic lack of honesty and undermine disclosure of problems. V&V provides firm evidence of what we know and what we don't know. The quantitative and qualitative aspects of V&V can produce exceptionally useful evidence of where modeling needs to improve. These characteristics conflict directly with the narrative that modeling has already brought reality to heel. Program after program is sold on the basis that modeling can produce predictions of what will be seen in reality. Computational modeling is seen as an alternative to expensive and dangerous experiments and testing. It can provide reduced costs and cycle times for engineering. All of this can be a real benefit, but the degree of current mastery is seriously oversold.

Doing V&V properly can unmask this deception (I do mean deception, even if the deceivers are largely innocent of outright graft). The deception is more the product of massive amounts of wishful thinking and harmful groupthink focused on showing good results rather than honest results. Sometimes this means willfully ignoring evidence that does not support the mastery. In other cases, the results are based on heavy-handed calibrations, and the modeling is far from predictive. In the naïve view, the non-predictive modeling will be presented as prediction and hailed as a great achievement. Those who manage modeling are largely responsible for this state of affairs. They reward the results that show how good the models are and punish honest assessment. Since V&V is the vehicle for honest assessment, it suffers. Modelers will either avoid V&V entirely, or thwart any effort to apply it properly. Usually the results are given without any firm breakdown of uncertainties, simply asserting that the "agreement is good" or the "agreement is excellent" without any evidentiary basis save plots that display data points and simulation values being "close".
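As a contrast to bare "agreement is good" claims, here is a minimal sketch in Python, loosely in the spirit of the ASME V&V 20 approach, of a quantitative comparison: a comparison error judged against a validation uncertainty built from its known contributors. All numbers are invented for illustration.

```python
import math

# Replace "agreement is good" with a comparison error E = S - D judged against
# a validation uncertainty assembled from numerical, input, and data uncertainties.
S, D = 103.0, 100.0          # simulation result and experimental datum (made up)
u_num   = 1.5                # estimated numerical (discretization) uncertainty
u_input = 2.0                # uncertainty propagated from model inputs
u_D     = 1.0                # experimental measurement uncertainty

E = S - D
u_val = math.sqrt(u_num**2 + u_input**2 + u_D**2)

print(f"comparison error E = {E:+.1f}, validation uncertainty u_val = {u_val:.1f}")
if abs(E) <= u_val:
    print("the discrepancy lies within the estimated uncertainties")
else:
    print(f"evidence of model error of roughly {abs(E) - u_val:.1f} or more")
```

The point of such a breakdown is not the particular formula but that every claim of agreement is accompanied by an explicit accounting of the uncertainties behind it.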

If you truly have faith in your convictions, then your convictions should be able to stand criticism and testing.

― DaShanne Stokes

This situation can be made better by changing the narrative about what constitutes good results. If we value knowledge and evidence of mastery as objectives instead of predictive power, we tilt the scales toward honesty. One of the clearest invitations to hedge toward dishonesty is the demand for "predictive modeling". Predictive modeling has become a mantra and sales pitch instead of an objective. Vast sums of money are allotted to purchase computers and place modeling software on these computers with the promise of prediction. We are told that we can predict how our nuclear weapons work so that we don't have to test them. The new, slightly faster computer is touted as the key to doing this (faster computers always help, but they are never the lynchpin). We can predict the effects of human activity on climate to be proactive about stemming them. We can predict weather and hurricanes with increasing precision. We can predict all sorts of consequences and effect better designs of our products. All of these predictive capabilities are real, and all have been massively oversold. We have lost our ability to look at challenges as good things and muster the will to overcome them. We need to tilt ourselves to be honest about how predictive we are, and understand where our efforts can make modeling better. Just as important, we need to unveil the real limits on our ability to predict.

A large part of the conduct of V&V is unmasking the detailed nature of uncertainty. Some of this uncertainty comes from our lack of knowledge of nature, or flaws in our fundamental models (epistemic uncertainty). Other uncertainty is simply intrinsic to our reality (aleatory uncertainty): phenomena that vary even with seemingly identical starting points. Separating these types of uncertainty and defining their magnitudes is greatly in the service of science. For the uncertainties that we can reduce through greater knowledge, we can array efforts to effect this reduction. This must be coupled to the opportunity for experiment and theory to improve matters. On the other hand, if uncertainty is irreducible, it is important to factor it into decisions and accommodate its presence. By ignoring uncertainty through the default practice of ZERO uncertainty (https://wjrider.wordpress.com/2016/04/22/the-default-uncertainty-is-always-zero/), we become powerless to assert any authority over it, or to practically react to it.
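A toy sketch of the separation just described might look like the following (Python, with an entirely hypothetical model and made-up distributions): aleatory variability is sampled, while epistemic ignorance is represented by candidate parameter values we have not yet pinned down.

```python
import random, statistics

random.seed(1)

def model(x, k):
    """Hypothetical response model used only for illustration."""
    return k * x**2

aleatory_x = [random.gauss(1.0, 0.1) for _ in range(10_000)]  # irreducible variability
epistemic_k = [0.9, 1.0, 1.1]                                  # candidate values we don't yet know

means = []
for k in epistemic_k:
    outputs = [model(x, k) for x in aleatory_x]
    mean = statistics.fmean(outputs)
    means.append(mean)
    print(f"k={k}: mean={mean:.3f}, aleatory stdev={statistics.stdev(outputs):.3f}")

print(f"epistemic spread of the mean: {max(means) - min(means):.3f}")
# Better knowledge (experiment, theory) can shrink the epistemic spread;
# the aleatory stdev remains and must be accommodated in decisions.
```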

In the conduct of predictive science, we should look to uncertainty as one of our primary outcomes. When V&V is conducted to high professional standards, uncertainty is unveiled and its magnitude estimated. With our heavily over-promised mantra of predictive modeling enabled by high-performance computing, uncertainty is almost always viewed negatively. This creates an environment where willful or casual ignorance of uncertainty is tolerated and even encouraged. Incomplete and haphazard V&V practice becomes accepted because it serves the narrative of predictive science. The truth and the actual uncertainty are treated as bad news and greeted with scorn instead of praise. It is simply so much easier to accept the comfort that the modeling has achieved a level of mastery. This comfort is usually offered without evidence.

The trouble with most of us is that we’d rather be ruined by praise than saved by criticism.

― Norman Vincent Peale

Somehow a different narrative and value system needs to be promoted for science to flourish. A starting point would be a recognition of the value of highly professional V&V work and the desire for completeness and disclosure. A second element of the value system would be valuing progress in science. In keeping with the value on progress would be a recognition that detailed knowledge of uncertainty provides direct and useful evidence to steer science productively. We can also use uncertainty to act proactively in making decisions based on actual predictive power. Furthermore, we may choose not to use modeling at all if the uncertainties are too large to inform decisions. The general march forward of scientific knowledge and capability is greatly aided by V&V. If we have a firm accounting of our current state of knowledge and capability, we can mindfully choose where to put emphasis for progress.

This last point gets at the problems with implementing a more professional V&V practice. If V&V finds that uncertainties are too large, the rational choice may be to not use modeling at all. This runs the risk of being politically unpalatable. Programs are sold on predictive modeling, and the money might look like a waste! We might find that the uncertainties from numerical error are much smaller than other uncertainties, and the new, super-expensive, super-fast computer will not help make things any better. In other cases, we might find out that the model is not converging toward a (correct) solution. Again, the computer is not going to help. Actual V&V is likely to produce results that require changing programs and investments in response. Current management often views this as a negative and worries that the feedback will reflect poorly on previous investments. There is a deep-seated lack of trust between the source of the money and the work. The lack of trust is driving a lack of honesty in science. Any money spent on fruitless endeavors is viewed as a potential scandal. The money will simply be withdrawn instead of redirected more productively. No one trusts the scientific process to work effectively. The result is an unwillingness to engage in a frank and accurate dialog about how predictive we actually are.
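Both findings mentioned above, numerical error that is already small or a calculation that is not converging, fall out of a basic grid-refinement study. Here is a minimal sketch in Python, with illustrative numbers rather than output from any real code, of estimating the observed order of convergence and a Richardson-style error estimate.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed convergence order p from three solutions on grids with constant refinement ratio r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

# Illustrative results from coarse, medium, and fine grids (made up).
f3, f2, f1 = 0.9700, 0.9925, 0.9981
p = observed_order(f3, f2, f1)
error_estimate = abs(f1 - f2) / (2.0**p - 1.0)   # rough Richardson estimate of fine-grid error

print(f"observed order ~ {p:.2f}, estimated fine-grid error ~ {error_estimate:.4f}")
# If p is near zero or wildly different from the method's formal order,
# the calculation is not converging, and a faster computer will not fix it.
```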

It’s discouraging to think how many people are shocked by honesty and how few by deceit.

― Noël Coward

It wouldn't be too much of a stretch to say that technical matters are a minor aspect of improving V&V. This does not make light of, nor minimize, the immense technical challenges in conducting V&V. The problem is that the current culture of science is utterly toxic to technical progress. We need a couple of elements of that culture to change in order to make progress. The first one is trust. The lack of trust is pervasive and utterly incapacitating (https://wjrider.wordpress.com/2013/11/27/trust/, https://wjrider.wordpress.com/2016/04/01/our-collective-lack-of-trust-and-its-massive-costs/, https://wjrider.wordpress.com/2014/12/11/trust-and-truth-in-management/). Because of the underlying lack of trust, scientists and engineers cannot provide honest results or honest feedback on results. They do not feel safe and secure enough to do either. This is a core element of the issues with peer review (https://wjrider.wordpress.com/2016/07/16/the-death-of-peer-review/). In an environment of compromised trust, peer review cannot flourish because honesty is fatal.

Nothing in this world is harder than speaking the truth, nothing easier than flattery.

― Fyodor Dostoyevsky

The second is a value on honesty. Today's World is full of examples where honesty is punished rather than rewarded. Speaking truth to power is a great way to get fired. Those of us who want to be honest are left in a precarious position: choose safety and security while compromising our core principles, or stay true to our principles and risk everything. Over time, the forces of compromised integrity, marketing and bullshit over substance wear us down. Today the liars and charlatans are winning. Being someone of integrity is painful and overwhelmingly difficult. The system seems to be stacked against honest discourse and disclosure. Of course, honesty and trust are completely coupled; both need to be supported and rewarded. V&V is simply one area where these trends play out and distort work.

It is both jarring and hopeful that the elements holding science back are evident in the wider world. Current political discourse is full of issues tied to trust and honesty. The degree to which we lack trust and honesty in the public sphere is completely disheartening. The entire system seems to be spiraling out of control. It does not seem that the system can continue on this path much longer (https://wjrider.wordpress.com/2017/10/20/our-silence-is-their-real-power/). Perhaps we have hit bottom and things will get better. How much worse can things get? The time for things to start getting better has already passed. This is true in the broader public World as well as in science. In both cases trust for each other, and a spirit of honesty, would go a long way toward providing a foundation for progress. The forces of stagnation and opposition to progress have won too much ground.

Integrity is telling myself the truth. And honesty is telling the truth to other people.

― Spencer Johnson

 

Nothing is so difficult as not deceiving oneself.

― Ludwig Wittgenstein

 

Scientific Computing’s Future Is Mobile, Adaptive, Flexible and Small

Without deviation from the norm, progress is not possible.

― Frank Zappa

There is something seriously off about working in scientific computing today. Once upon a time it felt like working in the future, where the technology and the work were amazingly advanced and forward-looking. Over the past decade this feeling has changed dramatically. Working in scientific computing is starting to feel worn-out, old and backwards. It has lost a lot of its sheen, and it's no longer sexy and fresh. If I look back 10 years, everything we then had was top of the line and right at the "bleeding" edge. Now we seem to be living in the past; the current advances driving computing are absent from our work lives. We are slaving away in a totally reactive mode. Scientific computing is staid, immobile and static, where modern computing is dynamic, mobile and adaptive. If I want to step into the modern world, I now have to leave work. Work is a glimpse into the past instead of a window to the future. It is not simply the technology, but the management systems that come along with our approach. We are being left behind, and our leadership seems oblivious to the problem.

For most of the history of computing in the 20th and into the 21st Century, scientific computing was at the forefront of technology. That is starting to change. Even today scientific computing remains exotic in terms of hardware and some aspects of software, but it also feels antiquated and antique. We get to use cutting-edge computer chips and networking hardware that demand we live on the ragged edge technologically. This is only half the story. We also remain firmly entrenched in the "mainframe" era, with corporate computing divisions that seem more "Mad Men" and less "Star Trek" than ever. The computers we use to execute our leading-edge scientific investigations and the computing in our offices and personal lives are diverging at warp speed. It has become hopelessly ironic in many ways. Worse than ironic, the current state of things is unhealthy and lessens the impact of scientific computing on today's World.

Even worse than the irony is the price this approach is exacting on scientific computing. For example, the computing industry used to beat a path to scientific computing's door, and now we have to basically bribe the industry to pay attention to us. A fair accounting of the government's role in computing today is some combination of a purely niche market and partly pork-barrel spending. Scientific computing used to be a driving force in the industry, and now it lies in a cul-de-sac, or even a pocket universe, divorced from the day-to-day reality of computing. Scientific computing is now a tiny and unimportant market to an industry that dominates the modern World. In the process, scientific computing has allowed itself to become disconnected from modernity, and hopelessly imbalanced. Rather than leverage the modern World and its technological wonders, many of which are grounded in information science, it resists and fails to make best use of the opportunity. This robs scientific computing of impact in the broader World, and diminishes the draw of new talent to the field.

It is worth elaborating on the nature of the opportunities, and the cost of the present imbalances. If one looks at the modern computing industry and its ascension to the top of the economic food chain, two things come to mind: mobile computing (cell phones) and the Internet. Mobile computing made connectivity and access ubiquitous, with massive penetration into our lives. Networks and apps began to create new social connections in the real world and lubricated communication between people in a myriad of ways. The Internet became a huge repository of information and commerce, but also an engine of social connection. In short order, the adoption and use of the Internet and computing in the broader human World overtook and surpassed their use by scientists and business. Where once scientists used and knew computers better than anyone, now the World is full of people for whom computing is far more important than it is for science. Scientists were once in the lead, and now they are behind. Worse yet, science is not adapting to this new reality.

Those who do not move, do not notice their chains.

― Rosa Luxemburg

The core of the problem with scientific computing is its failure to adapt and take advantage of the opportunity defined by this ascendancy of computing. Central to science's issue with computing is the lost sense that computers are merely a tool. Computers are a tool that may be used to do science. Instead of following this maxim, we simply hold to the older, antiquated model of scientific computing firmly grounded in the mainframe era. Our mindset has not evolved with the rest of the World. One of the clear consequences of this mindset is a creeping degree of gluttony and intellectual laziness with high-performance computing. All problems reduce to simply creating faster computers and making problems submit to the raw power of virtually limitless computations. We have lost sight of the inefficiency of this approach. A renewed focus on issues of modeling, methods and algorithms could be deeply enlivened by the constraints imposed by limited computing resources. Moreover, the benefits of solving problems more efficiently with smaller computing resources would yield innumerable benefits in the setting of big iron. This could be achieved without the very real limitations of having big iron be the sole focus of our efforts.


Scientific computing could be arranged to leverage the technology that is advancing the World today. We could look at a mobile, adaptive platform for modeling, simulation and data analysis that harnessed the best of technology. We could move through the cloud using technology in an adaptive, multiscale manner. One of the biggest challenges is letting go of the power dynamic that drives thinking today. Scientific computing has been addicted to Moore’s law for too long. The current exascale push is symptomatic of this addiction. Like any addiction it is unhealthy and causes the subject to avoid real cures for their problem. We see progress as equivalent to raw power with a single computer. The huge stunt calculation as a vehicle for science is a manifestation of this addiction. Science is done with many calculations along with an adaptive examination of problems or mindful interrogation of results. Power can also be achieved through mobility, ubiquity and flexibility. The big iron we pursue has become tantamount to progress because it’s the only route we can envision. The problem is that technology, and the arc of progress is working against us instead of with us. It is past time to change our vision of what the future can be. The future needs to be different by embracing a different technological path. On one hand, we won’t be swimming against the current of computing technology, but on the other hand we will need to invest in different solutions to make it work.
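To illustrate the "many calculations" mode of doing science described above, here is a small, hedged sketch in Python; the model, parameters and ranking criterion are all hypothetical. It runs an embarrassingly parallel sweep of cheap, reduced-fidelity calculations on modest hardware, after which a few expensive, high-fidelity runs can be targeted where they matter.

```python
from concurrent.futures import ProcessPoolExecutor
import itertools

def cheap_model(params):
    """Placeholder for a fast, reduced-fidelity simulation (hypothetical)."""
    a, b = params
    return {"a": a, "b": b, "result": a * b - 0.5 * a**2}

if __name__ == "__main__":
    # Sweep a small design space in parallel on whatever cores are available.
    sweep = list(itertools.product([0.5, 1.0, 1.5], [10, 20, 40]))
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(cheap_model, sweep))

    # Rank the design space, then spend the expensive calculations only on the leaders.
    for r in sorted(results, key=lambda r: r["result"], reverse=True)[:3]:
        print(r)
```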

Flexibility is an art of creating way outs within the cul-de-sacs!
― Mehmet Murat ildan

Mobility is power, and it has made computing ubiquitous. When the broader computing industry embraced the death of Moore’s law, it switched its attention to cell phones. Instead of simply being phones, they became mobile computers and mobile extensions of the Internet. In doing so we unleashed a torrent of creativity and connection. All of a sudden, we saw computers enable the level of social connection that the Internet always had promised, but never delivered. The mobile computing revolution has reshaped the World in a decade. In the process, the mobile market overwhelmed the entire computing industry and created economic dominance on an unparalleled scale. The killer piece of technology was the iPhone. It combined a focus on user interface along with software that enabled everything. We also need to recognize that each phone is more powerful than the fastest computer in the World 25 years ago. We have tremendous power at our fingertips.

One of the really clear messages of the recent era in computing is a change in the nature of value and power. For a long time, power was measured by hardware gains in speed, memory and capability, but now application innovation and flexibility rule. Hardware is largely a fixed and slowly changing commodity and represents a level playing field. The software in the applications and the user interface are far more important. Algorithms that direct information and attention dominate success in computing. Providing the basis of connection and adaptation to the needs of users has become the medium for creating new markets. At the same time, these algorithms have come under fire for how they manipulate people and data. These mobile computers have become a massive issue for society, creating brand-new social problems and side effects we need to solve effectively. The impact of this revolution in computing on society as a whole has been incredible.

A whole cadre of experts is fading from the field of play in computing. In taking the tack of focusing on mainframe-style computing, scientific computing is sidelining itself. Instead of this enormously talented group of people playing in the arena that means the most to society, they are focused on a cul-de-sac grounded in old and outdated models of success. Our society would benefit from engaging these experts in making mobile computing more effective at delivering value in new, innovative ways. We could be contributing to solving some of the greatest problems facing us rather than seeing our computing as a special niche serving a relatively small segment of society's needs. In the past, scientific computing has provided innovative and dynamic solutions that ultimately made their way into general computing. A perfect example is Google. The problem that Google solved is firmly grounded in scientific computing and applied mathematics. It is easy to see how massive the impact of this solution is. Today we in scientific computing are getting further and further from relevance to society. This niche does scientific computing little good because it is swimming against a tide that is more like a tsunami. The result is a horribly expensive and marginally effective effort that will fail needlessly where it has the potential to provide phenomenal value.

You never change things by fighting the existing reality.

To change something, build a new model that makes the existing model obsolete.

― R. Buckminster Fuller

We are long past the time to make a change in scientific computing's direction and strategy. Almost everywhere else the mainframe era died decades ago. Why is scientific computing tied to this model? Why are scientists resisting conclusions so nakedly obvious? In today's risk-averse environment, making a change to the underlying model of this branch of science is virtually impossible. Even when the change is dramatically needed and overdue by years, the resistance is strong. The status quo is safe and firmly entrenched. In a time when success can be simply asserted and largely manufactured, this unacceptable state of affairs will persist far longer than it should. Sooner or later someone will take the plunge, and success will follow them. They will have the winds of progress at their backs, easily solving most of the problems we now throw billions of dollars at with meager success.

The measure of intelligence is the ability to change.

― Albert Einstein