Of course, it’s not really that simple, but yes, the code is part of the model. Anyone claiming otherwise has to meet a substantial burden of proof.
We have no idea about the ‘real’ nature of things … The function of modeling is to arrive at descriptions which are useful.
– Richard Bandler and John Grinder
Ideally, it should not be, but proving that ideal is a very high bar that is almost never met. A great deal of compelling evidence is needed to support an assertion that the code is not part of the model. The real difficulty is that the more complex the modeling problem, the more definitely and irreducibly the code is part of the model. These complex models are the most important uses of modeling and simulation. Complex models of engineered things or important physical systems have many submodels, each essential to successful modeling. The code is often designed quite specifically to model a class of problems, and it then becomes a clear part of the definition of the problem. Even in the simplest cases, the code includes the recipe for the numerical solution of a model. That numerical solution leaves its fingerprints all over the solution of the model: it is imperfect and contains errors that influence the results. A code also carries the mesh and geometric description plus boundary conditions, not to mention the various modeling options employed. Removing the specific details of the code’s implementation of the model from consideration as part of the model becomes increasingly intractable.
The word model is used as a noun, adjective, and verb, and in each instance it has a slightly different connotation. As a noun “model” is a representation in the sense in which an architect constructs a small-scale model of a building or a physicist a large-scale model of an atom. As an adjective “model” implies a degree of perfection or idealization, as in reference to a model home, a model student, or a model husband. As a verb “to model” means to demonstrate, to reveal, to show what a thing is like.
– Russell L. Ackoff
The word model itself is deeply problematic. Model is one of those words that can mean many different things whether it’s used as a noun or a verb (I’ll note in passing that, much like the curse word “fuck,” it is so flexible as to be wonderful and confusing all at once). Its application in a scientific and engineering context is common and pervasive. As such, we need to inject some precision into how it is being used. For this reason, some discourage the use of “model” in discussion. On the other hand, models and modeling are so central to the conduct of science and engineering that they should be dealt with head on. They aren’t going away. We model our reality when we want to make sure we understand it. We engage in modeling when we have something in the Real World we want to demonstrate an understanding of. Sometimes this is for the purpose of understanding, but ultimately understanding gives way to manipulation, the essence of engineering. The Real World is complex, and effective models are usually immune to analytical solution.
Essentially, all models are wrong, but some are useful.
– George E. P. Box, Norman R. Draper
You view the world from within a model.
― Nassim Nicholas Taleb
Computational science comes to the rescue and opens the door to solving these complex models via numerical approximations. It is a marvelous advance, but it brings new challenges because the solutions are imperfect. This adds a new layer of imperfection to modeling. We should already recognize that models are generically approximate versions of reality (i.e., wrong), and necessarily imperfect mathematical representations of the Real World. Solving this imperfect model imperfectly, via an approximate method, makes the modeling issue even more fraught. Invariably, for any model with complexity, the numerical solution of the model and its detailed description are implemented in computer code, or “a computer code”. The details and correctness of the implementation become inseparable from the model itself. It becomes quite difficult to extract the model as any sort of pure mathematical construct; the code is intimately part of it.
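This layered imperfection is visible even in a toy setting. The sketch below (plain Python, purely illustrative) approximates the same derivative two different ways; the numerical recipe leaves its fingerprints on the answer, with an error that depends entirely on the method chosen:

```python
import math

def forward_diff(f, x, h):
    # First-order forward difference: error shrinks like O(h)
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    # Second-order central difference: error shrinks like O(h^2)
    return (f(x + h) - f(x - h)) / (2.0 * h)

exact = math.cos(1.0)  # d/dx sin(x) at x = 1
for h in (1e-1, 1e-2, 1e-3):
    e_fwd = abs(forward_diff(math.sin, 1.0, h) - exact)
    e_cen = abs(central_diff(math.sin, 1.0, h) - exact)
    print(f"h={h:g}  forward error={e_fwd:.2e}  central error={e_cen:.2e}")
```

Both schemes solve the same mathematical problem, yet they produce different answers with different error behavior. The “model” a user actually gets depends on which recipe the code implements.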
Evidence of the model’s nature and correctness is produced in the basic conduct of verification and validation with uncertainty quantification. A full accounting of the credibility of the modeling, including the pedigree of the model, will not help to exclude the code from the model; it will simply define the extent of the connection. Properly speaking, the code is always part of the model, but the extent or magnitude of its impact can be small, or even considered minor or negligible. This evidence is contained within the full assessment of the predictive quality of the simulation, including a quantitative assessment. Among these activities, verification is the most important for the question at hand. Do we have evidence that the desired mathematical model is correctly solved? Do we have evidence that the numerical errors in the solution are small? Can all aspects of the model be well described by clearly articulated mathematics?
Any physical theory is always provisional, in the sense that it is only a hypothesis: you can never prove it. No matter how many times the results of experiments agree with some theory, you can never be sure that the next time the result will not contradict the theory.
― Stephen Hawking
A model is not the operating system for the universe. Reality is not determined by these mathematical abstractions; the mathematics is designed to describe what we observe. As such, models are always flawed and imperfect representations to some level. Determining the flaws and the quantitative level of imperfection is difficult work requiring detailed verification and validation. A model is an abstraction and representation of the processes we believe produce observable physical effects. We theorize that the model explains how these effects are produced. Some models are not remotely this high minded; they are nothing but crude empirical engines for reproducing what we observe. Unfortunately, as phenomena become more complex, these crude models become increasingly essential to modeling. They may not play a central role in the modeling, but they still provide physical effects necessary for utility. As problems enter the engineering realm, the submodels needed to produce realistic simulations become ever more prone to include these crude empirical engines. As the reality of interest becomes more complicated, the modeling becomes elaborate and complex, a deep chain of efforts to grapple with these details.
It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.
― Arthur Conan Doyle
Validation of a model occurs when we take the results of solving the model and compare them directly with observations from the Real World. A key aspect of the validation exercise is characterizing the uncertainty in both the observations and the model. When all this assessment is in hand, we can render a judgment of whether the model represents the observed reality well enough for the purposes we intend. This use is defined by a question we want to answer with the modeling. The answer needs a certain fidelity and certainty, which lends the exercise its notion of precision. The certainty of the observations defines the degree of agreement that can be demanded. The model’s uncertainties define the model’s precision, but they include the impact of numerical approximation. The numerical uncertainty needs to be accounted for to isolate the model; it defines the level of approximation in the solution of the model, and the deviation from the mathematical idealization the model represents. In practice, we see a stunning lack of this essential step in the validation work presented. Another big part of validation is recognizing the subtle differences between calibrated results and predictive simulation. Again, calibration is rarely elaborated in validation to the degree that it should be.
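A minimal sketch of this accounting, with entirely hypothetical numbers, might look like the following. The point is structural: the numerical uncertainty enters the combined uncertainty alongside the experimental and model-input contributions, and only then is the discrepancy judged:

```python
import math

# All values below are hypothetical, for illustration only.
observation = 3.10   # measured quantity of interest
u_obs = 0.05         # experimental uncertainty
prediction = 3.22    # model's computed value
u_model = 0.08       # uncertainty from model inputs/parameters
u_num = 0.03         # estimated numerical (discretization/solver) error

# Combine independent uncertainties in quadrature; omitting the
# numerical contribution misattributes solver error to the model.
u_total = math.sqrt(u_obs**2 + u_model**2 + u_num**2)

discrepancy = abs(prediction - observation)
print(f"discrepancy = {discrepancy:.3f}, combined uncertainty = {u_total:.3f}")
print("consistent" if discrepancy <= 2.0 * u_total else "inconsistent")
```

Skipping the `u_num` term, as so much published validation work effectively does, silently folds the code’s numerical error into the judgment of the model.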
We should always expect the model to deviate from observations to some degree. If we are capable of producing more accurate observations of reality, we can more accurately determine how wrong the model is. In a sense, we can view this as a competitive race. If our model is quite precise, we are challenged to observe nature well enough to expose its innate flaws. Conversely, if we can observe nature with extreme precision, we can define the model’s imperfections clearly. Progress can be made by using this tension to push one or the other. The modeling uncertainty is compounded by the approximate numerical solution implemented in a computer code (including the correctness of the code). Verification and validation activities are a systematic means of collecting evidence so that the comparison can be made in a complete and compelling manner.
Computer codes serve two very important roles in modeling: containing the model, including geometry, boundary conditions, and a host of ancillary models for complex situations, and solving the model numerically. Both roles are essential in the conduct of modeling, but numerical solutions are far more subtle and complex. Many people using codes for modeling do not have a background sufficient to understand the subtleties of numerical methods and their impact on solutions. Moreover, the fiction that numerical methods and codes are so reliable that detailed understanding is not essential persists and grows. Our high performance computing programs work to fuel this fiction. The most obvious aspect of the numerical solution is the meshing and the time integration, with the error proportional to these details. Evidence of correctness and error characteristics is produced through verification. In addition, most advanced codes solve linear and nonlinear equations in an iterative manner. Iterative solutions have a finite tolerance, which can impact solutions. This is particularly true for nonlinear equation solvers, where the error tolerance achievable by some popular solvers is extremely loose. This looseness can produce significant physical effects in solutions. Most verification work does not examine these aspects closely, although it should. Again, the code and its capabilities and methods are extremely important, if not essential, to the model produced. In many cases fantastic modeling work is polluted by naïve numerical methods; a wonderful model can be wiped out by a terrible code.
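The effect of loose iterative tolerances is easy to demonstrate on even the simplest model problem. The sketch below (plain Python, illustrative only) solves a one-dimensional Poisson problem with Jacobi iteration; stopping at a loose tolerance leaves a visible error in the “converged” solution:

```python
def solve_poisson_jacobi(n, tol, max_iter=100000):
    """Jacobi iteration for -u'' = 1 on (0,1) with u(0) = u(1) = 0,
    discretized with n interior points. Stops when the max update
    falls below tol -- mimicking a typical iterative solver."""
    h = 1.0 / (n + 1)
    u = [0.0] * (n + 2)  # includes the two boundary values
    for _ in range(max_iter):
        new = u[:]
        for i in range(1, n + 1):
            new[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h)
        change = max(abs(a - b) for a, b in zip(new, u))
        u = new
        if change < tol:
            break
    return u

n = 31
loose = solve_poisson_jacobi(n, tol=1e-4)
tight = solve_poisson_jacobi(n, tol=1e-10)
mid = (n + 2) // 2  # index nearest x = 0.5; exact u(0.5) = 0.125
print(f"loose tolerance: u(0.5) ~ {loose[mid]:.4f}")
print(f"tight tolerance: u(0.5) ~ {tight[mid]:.4f}")
```

The loose stopping criterion looks innocuous, yet the remaining iteration error is far larger than the tolerance itself, because slowly converging modes fool update-based stopping tests. A user who never examines the solver settings inherits this error as part of “the model.”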
You’ve baked a really lovely cake, but then you’ve used dog shit for frosting.
― Steve Jobs
So, when can we exclude the code? The big thing to focus on in this question is verification evidence. Code verification is necessary to be confident that the intended mathematical model is provably present in the code. It asks whether the mathematical abstraction the model is based on is correctly solved by the code. Code verification can be completely satisfactory and successful, and the code can still be important. Code verification does not say that the numerical error is small; it says that the numerical error is ordered and that the model equations we desire to solve are indeed solved. The second half of verification, solution (calculation) verification, determines the errors in solving the model. The question is how large (or small) the numerical errors in the solution of the model are. Ultimately, these errors are a strong function of the discretization and solver used in the code. The question of whether the code matters comes down to asking whether another code used skillfully would produce a significantly different result. This is rarely, if ever, the case. To make matters worse, verification evidence tends to be flimsy and half-assed. Even if we could make this call and ignore the code, we rarely have evidence that doing so is a valid and defensible decision.
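Solution verification can be made concrete with the classic grid-convergence calculation. The sketch below (plain Python, with synthetic data standing in for three grid solutions) computes the observed order of accuracy and a Richardson-extrapolated estimate of the zero-mesh-size answer:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed convergence order from solutions on three grids
    with a constant refinement ratio r (Richardson-style estimate)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
    """Estimate the zero-mesh-size solution from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

# Synthetic grid-convergence data: f(h) = 1.0 + 0.5*h^2 (hypothetical)
f1 = 1.0 + 0.5 * 0.1**2     # coarse grid
f2 = 1.0 + 0.5 * 0.05**2    # medium grid
f3 = 1.0 + 0.5 * 0.025**2   # fine grid

p = observed_order(f1, f2, f3)
print(f"observed order = {p:.3f}")                             # -> 2.000
print(f"extrapolated value = {richardson_extrapolate(f2, f3, p):.6f}")  # -> 1.000000
```

If the observed order matches the method’s formal order, the difference between the fine-grid and extrapolated values gives a defensible numerical error estimate, exactly the evidence needed before anyone argues the code can be set aside.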
Truth can only be found in one place: the code.
― Robert C. Martin
In closing, the code IS part of the model unless evidence can be found otherwise. This can happen more easily where the model is simple. In general, the exclusion of the code is an ideal that cannot be reached. As models become complex, detaching the model from the code becomes nearly intractable and indefensible. Evidence will almost invariably point to the code being an important contributor to the model’s picture of reality.
For the scientist a model is also a way in which the human thought processes can be amplified. This method often takes the form of models that can be programmed into computers. At no point, however, does the scientist intend to lose control of the situation because the computer does some of his thinking for him. The scientist controls the basic assumptions and the computer only derives some of the more complicated implications.
– C. West Churchman