“Colorful fluid dynamics” and overconfidence in global climate models

From Climate Etc.

by David Young

This post lays out in fairly complete detail some basic facts about Computational Fluid Dynamics (CFD) modeling. This technology is the core of all general circulation models of the atmosphere and oceans, and hence global climate models (GCMs).  I discuss some common misconceptions about these models, which lead to overconfidence in these simulations. This situation is related to the replication crisis in science generally, whereby much of the literature is affected by selection and positive results bias.

A full-length version of this article can be found at [ lawsofphysics1 ], including voluminous references. See also this publication [ onera ].

1        Background

Numerical simulation over the last 60 years has come to play a larger and larger role in engineering design and scientific investigations. The level of detail and physical modeling varies greatly, as do the accuracy requirements. For aerodynamic simulations, accurate drag increments between configurations have high value. In climate simulations, a widely used target variable is temperature anomaly. Both drag increments and temperature anomalies are particularly difficult to compute accurately. The reason is simple: both output quantities are several orders of magnitude smaller than the overall absolute levels of momentum for drag or energy for temperature anomalies. This means that without tremendous effort, the output quantity is smaller than the numerical truncation error. Great care can sometimes provide accurate results, but careful numerical control over all aspects of complex simulations is required.
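A rough order-of-magnitude illustration of this scale separation, using representative values assumed here purely for illustration: a global mean surface temperature of about 288 K versus a decadal anomaly change of about 0.2 K, and a total wing drag coefficient of roughly 250 counts versus design increments of a few counts,

\[
\frac{\Delta T_{\text{anomaly}}}{T_{\text{abs}}} \sim \frac{0.2\ \text{K}}{288\ \text{K}} \approx 7\times 10^{-4},
\qquad
\frac{\Delta C_D}{C_{D,\text{total}}} \sim \frac{1\ \text{count}}{250\ \text{counts}} = 4\times 10^{-3},
\]

and the drag itself is already a small residual of much larger momentum fluxes. Truncation errors that are harmless for the absolute fields can therefore easily swamp the quantity actually being sought.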

Contrast this with some fields of science where only general understanding is sought. In this case qualitatively interesting results can be easier to provide. This is known in the parlance of the field as “Colorful Fluid Dynamics.” While this is somewhat pejorative, these simulations do have their place. It cannot be stressed too strongly, however, that even the broad “patterns” can be quite wrong. Only after extensive validation can such simulations be trusted qualitatively, and even then only for the class of problems used in the validation. Such a validation process for one aeronautical CFD code consumed perhaps 50-100 man-years of effort in a setting where high quality data was generally available. What is all too common among non-specialists is to conflate the two usage regimes (colorful versus validated) or to assume that realistic-looking results imply quantitatively meaningful results.

The first point is that some fields of numerical simulation are very well founded on rigorous mathematical theory. Two that come to mind are electromagnetic scattering and linear structural dynamics. Electromagnetic scattering is governed by Maxwell’s equations which are linear. The theory is well understood, and very good numerical simulations are available. Generally, it is possible to develop accurate methods that provide high quality quantitative results.  Structural modeling in the linear elasticity range is also governed by well posed elliptic partial differential equations.

2        Computational Fluid Dynamics

The Earth system with its atmosphere and oceans is much more complex than most engineering simulations and thus the models are far more complex. However, the heart of any General Circulation Model (GCM) is a “dynamic core” that embodies the Navier-Stokes equations. Primarily, the added complexity is manifested in many subgrid models of high complexity. However, at some fundamental level a GCM is computational fluid dynamics. In fact GCMs were among the first efforts to solve the Navier-Stokes equations, and many of the initial problems, such as the removal of sound waves, were solved by the pioneers in the field. There is a positive feature of this history in that the methods and codes tend to be optimized quite well within the universe of methods and computers currently used. The downside is that there can be a very high cost to building a new code or inserting a new method into an existing code. In any such effort, even real improvements will at first appear to be inferior to the existing technology. This is a huge impediment to progress and to the penetration of more modern methods into the codes.

The best technical argument I have heard in defense of GCMs is that Rossby waves are vastly easier to model than aeronautical flows, where the pressure gradients and forcing can be a lot higher. There is some truth in this argument. The large-scale vortex evolution in the atmosphere on shorter time scales is relatively unaffected by turbulence and viscous effects, even though at finer scales the problem is ill-posed. However, there are many other at least equally important components of the earth system. An important one is tropical convection, a classical ill-posed problem because of the large-scale turbulent interfaces and shear layers. While usually neglected in aeronautical calculations, free-air turbulence is in many cases very large in the atmosphere. However, it is typically neglected outside the boundary layer in GCMs. And of course there are clouds, convection and precipitation, which have a very significant effect on overall energy balance. One must also bear in mind that aeronautical vehicles are designed to be stable and to minimize the effects of ill-posedness, in that pathological nonlinear behaviors are avoided. In this sense aeronautical flows may actually be easier to model than the atmosphere. In any case aeronautical simulations are greatly simplified by a number of assumptions, for example that the onset flow is steady and essentially free of atmospheric turbulence. Aeronautical flows can often be assumed to be essentially isentropic outside the boundary layer.

As will be argued below, the CFD literature is affected by positive results and selection bias. In the last 20 years, there has been increasing consciousness of and documentation of the strong influence that biased work can have on the scientific literature. It is perhaps best documented in the medical literature where the scientific communities are very large and diverse. These biases must be acknowledged by the community before they can be addressed. Of course, there are strong structural problems in modern science that make this a difficult thing to achieve.

Fluid dynamics is a much more difficult problem than electromagnetic scattering or linear structures. First, many of the problems are ill-posed or nearly so. As is perhaps to be expected with nonlinear systems, there are also often multiple solutions. Even in steady RANS (Reynolds Averaged Navier-Stokes) simulations there can be sensitivity to initial conditions, numerical details, or gridding. The AIAA Drag Prediction Workshop Series has shown the high levels of variability in CFD simulations even in attached, mildly transonic and subsonic flows. These problems are far more common than reported in the literature.

Another problem associated with nonlinearity in the equations is turbulence, basically defined as small-scale fluctuations that have random statistical properties. There is still some debate about whether turbulence is completely represented by accurate solutions to the Navier-Stokes equations, even though most experts believe that it is. But the most critical difficulty is the fact that in most real-life applications the Reynolds number is high or very high. The Reynolds number represents roughly the ratio of inertial forces to viscous forces. One might think that if the viscous forcing were 4 to 7 orders of magnitude smaller than the inertial forcing (as it is, for example, in many aircraft and atmospheric simulations), it could be neglected. Nothing could be further from the truth. The inclusion of these viscous forces often results in an O(1) change in even total forces. Certainly, the effect on smaller quantities like drag is large and critical to successful simulations in most situations. Thus, most CFD simulations are inherently numerically difficult, and simplifications and approximations are required. There is a vast literature on these subjects going back to the introduction of the digital computer; John von Neumann made some of the first forays into understanding the behaviour of discrete approximations.
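For reference, the standard definition, with representative values that are assumed here only to fix the orders of magnitude:

\[
\mathrm{Re} = \frac{\rho U L}{\mu} = \frac{U L}{\nu}.
\]

For a transport-aircraft wing at cruise (U of roughly 230 m/s, L of roughly 5 m, kinematic viscosity of roughly 3.5e-5 m²/s) this gives Re of order 10^7; for large-scale atmospheric motions (U of roughly 10 m/s, L of roughly 10^6 m, kinematic viscosity of roughly 1.5e-5 m²/s) it is of order 10^11 or more, so the viscous forces are indeed many orders of magnitude smaller than the inertial forces and yet cannot be dropped.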

The discrete problem sizes required for modeling fluid flows by resolving all the relevant scales grow as Reynolds number to the power 9/4 in the general case, assuming second-order numerical discretizations. Computational effort grows at least linearly with discrete problem size multiplied by the number of time steps. Time steps must also decrease as the spatial grid is refined because of the stability requirements of the Courant-Friedrichs-Lewy condition as well as to control time discretization errors. The number of time steps grows as Reynolds number to the power 3/4, so overall computational effort grows with Reynolds number to the power 3. Thus, for almost all problems of practical interest, it is computationally impossible (and will be for the foreseeable future) to resolve all the important scales of the flow, and so one must resort to subgrid models of fluctuations not resolved by the grid. For many idealized engineering problems, turbulence is the primary effect that must be so modeled. In GCMs there are many more, such as clouds. References are given in the full paper for some other views that may not fully agree with the one presented here in order to give people a feel for the range of opinion in the field.
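The scaling above is easy to tabulate. A minimal sketch, in which the exponents are those quoted in the text and the reference problem size is an arbitrary illustrative assumption rather than a real benchmark:

```python
# Illustrative scaling of fully resolved simulation cost with Reynolds number.
# Exponents as quoted above: points ~ Re^(9/4), time steps ~ Re^(3/4), work ~ Re^3.
# The reference problem (ref_re, ref_points, ref_steps) is an arbitrary assumption.

def resolved_cost(re, ref_re=1.0e4, ref_points=1.0e8, ref_steps=1.0e4):
    """Scale an assumed reference problem at ref_re up to Reynolds number re."""
    points = ref_points * (re / ref_re) ** (9.0 / 4.0)   # spatial degrees of freedom
    steps = ref_steps * (re / ref_re) ** (3.0 / 4.0)     # CFL-limited time steps
    return points, steps, points * steps                  # total work ~ Re^3

for re in (1.0e4, 1.0e6, 1.0e8):
    points, steps, work = resolved_cost(re)
    print(f"Re = {re:.0e}: points ~ {points:.1e}, steps ~ {steps:.1e}, work ~ {work:.1e}")
```

Each factor of 100 in Reynolds number multiplies the total work by roughly a million, which is why resolving all scales is out of reach for the foreseeable future.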

For modeling the atmosphere, the difficulties are immense. The Reynolds numbers are high and the turbulence levels are large but highly variable. Many of the supposedly small effects must be neglected based on scientific judgment. There are also large energy flows, evaporation, precipitation, and clouds, all of which are ignored in virtually all aerodynamic simulations. Ocean models require different methods as the oceans are essentially incompressible. This in some sense simplifies the underlying Navier-Stokes equations but adds mathematical difficulties.

2.1       The Role of Numerical Errors in CFD

Generally, the results of many steady-state aeronautical CFD simulations are reproducible and reliable for thin boundary layer and shear layer dominated flows, provided there is little flow separation and the flow is subsonic. There are now a few codes that are capable of demonstrating grid convergence for the simpler geometries or lower Reynolds numbers. However, many of these simulations make many simplifying assumptions, and uncertainty is much larger for separated or transonic flows.

The contrast with climate models speaks for itself. Typical grid spacings in climate models often exceed 100 km, and their vertical grid resolution is almost certainly inadequate. Further, many of the models use spectral methods that are not fully stable; various forms of filtering are used to remove undesirable oscillations. In addition, the many subgrid models are solved sequentially, adding another source of numerical errors and making tuning problematic.
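One way to see why the sequential (operator-split) treatment of subgrid processes matters is the textbook splitting estimate; this is the generic result for the simplest (Lie) splitting of two linear operators, not a statement about any particular GCM's implementation:

\[
\frac{dU}{dt} = (A + B)\,U, \qquad
e^{\Delta t B}\,e^{\Delta t A} \;=\; e^{\Delta t (A+B)} \;+\; \frac{\Delta t^{2}}{2}\,[B,A] \;+\; O(\Delta t^{3}),
\]

so each step commits an error proportional to the commutator \([B,A]\), giving a first-order-in-\(\Delta t\) global error whose size and sign depend on the order in which the processes are applied whenever they do not commute. Solving the processes in a coupled fashion removes this particular order dependence, which is part of why sequential solution makes tuning problematic.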

2.2       The Role of Turbulence and Chaos in Fluid Mechanics

In this section I describe some well-verified science from fluid mechanics that governs all Navier-Stokes simulations and that must inform any non-trivial discussion of weather or climate models. One of the problems in climate science is a lack of fundamental understanding of these basic conclusions of fluid mechanics or (as perhaps the case may be for some) a reluctance to discuss the consequences of this science.

Turbulence models have advanced tremendously in the last 50 years and climate models do not use the latest of these models, so far as I can tell. Further, for large-scale vortical 3D flow, turbulence models are quite inadequate. Nonetheless, proper modeling of turbulence by solving auxiliary differential equations is critical to achieving reasonable accuracy.

Just to give one fundamental problem that is a showstopper at the moment: how to control numerical error in any time accurate eddy resolving simulation. Classical methods fail. How can one tune such a model? You can tune it for a given grid and initial condition, but that tuning might fail on a finer grid or with different initial conditions. This problem is just now beginning to be explored and is of critical importance for predicting climate or any other chaotic flow.

When truncation errors are significant (as they are in most practical fluid dynamics simulations, particularly climate simulations), there is a constant danger of “overtuning” subgrid models, discretization parameters, or the hundreds of other parameters. The problem here is that tuning a simulation too accurately for a few particular cases is really just getting large errors to cancel for those cases. Skill will then actually be worse for cases outside the tuning set. In climate models the truncation errors are particularly large and the computational costs are too high to permit systematic study of the sizes of the various errors. Thus tuning is problematic.

2.3       Time Accurate Calculations – A Panacea?

All turbulent flows are time dependent and there is no true steady state. However, using Reynolds averaging, one can separate the flow field into a steady component and a hopefully small component consisting of the unsteady fluctuations. The unsteady component can then be modeled in various ways. The larger the truly unsteady component is, the more challenging the modeling problem becomes.
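In symbols, the Reynolds decomposition writes each field as a mean plus a fluctuation,

\[
u_i(\mathbf{x},t) = \overline{u}_i(\mathbf{x}) + u_i'(\mathbf{x},t), \qquad \overline{u_i'} = 0,
\]

and averaging the momentum equation leaves an unclosed Reynolds-stress term \(-\rho\,\overline{u_i' u_j'}\); supplying that term is exactly the job of the turbulence model, and the larger the genuinely unsteady component, the more the result depends on that modeling choice.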

One might be tempted to always treat the problem as a time dependent problem. This has several challenges, however. At least in principle (but not always in practice) one should be able to use conventional numerical consistency checks in the steady state case. For example, one can check grid convergence, calculate sensitivities for parameters cheaply using linearizations, and use the residual as a measure of reliability. For the Navier-Stokes equations, there is no rigorous proof that the infinite grid limit exists or is unique. In fact, there is strong evidence for multiple solutions, some corresponding to states seen in testing, and others not. All these conveniences are either inapplicable to time accurate simulations or are much more difficult to assess.
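As a concrete example of the kind of steady-state consistency check referred to here, a minimal sketch of the classical three-grid convergence test (the drag values are hypothetical and only illustrate the mechanics):

```python
import math

# Observed order of accuracy and Richardson extrapolation from three
# systematically refined grids (refinement ratio r). Standard textbook check;
# the drag coefficients below are hypothetical, illustrative numbers.

def observed_order(f_fine, f_medium, f_coarse, r=2.0):
    """Return the observed convergence order p and an extrapolated estimate."""
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    f_extrapolated = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
    return p, f_extrapolated

p, f_inf = observed_order(f_fine=0.02710, f_medium=0.02742, f_coarse=0.02810)
print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_inf:.5f}")
```

For a chaotic, time accurate simulation there is no analogous routine check: there is no steady residual to drive to zero, and successive grids need not even converge to the same statistics.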

Time accurate simulations are also challenging because the numerical errors are in some sense cumulative, i.e., an error at a given time step will be propagated to all subsequent time steps. Generally, some kind of stability of the underlying continuous problem is required to achieve convergence. Likewise a stable numerical scheme is helpful.
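The textbook way to express this cumulative behavior: if \(e_n\) is the accumulated error at step \(n\), \(\tau\) a bound on the error committed in a single step, and the discrete evolution satisfies a Lipschitz-type stability bound with constant \(L\), then

\[
e_{n+1} \le (1 + L\,\Delta t)\,e_n + \tau
\quad\Longrightarrow\quad
e_n \le \frac{\tau}{L\,\Delta t}\left(e^{L\,t_n} - 1\right),
\]

so the per-step errors compound and can grow exponentially with the length of the simulated interval; it is precisely stability (contractivity) of the underlying problem that keeps this growth in check.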

For any chaotic time accurate simulation, classical methods of numerical error control fail. Because the initial value problem is ill-posed, the adjoint diverges. This is a truly daunting problem. We know numerical errors are cumulative and can grow nonlinearly, but our usual methods are completely inapplicable.

For chaotic systems, the main argument that I have heard for time accurate simulations being meaningful is “at least there is an attractor.” The thinking is that if the attractor is sufficiently attractive, then errors in the solution will die off or at least remain bounded and not materially affect the time average solution or even the “climate” of the solution. The solution at any given time may be wildly inaccurate in detail, as Lorenz discovered, but the climate will (according to this argument) be correct. At least this is an argument that can be developed and eventually quantified and proven or disproven. Paul Williams has a nice example of the large effect of the time step on the climate of the Lorenz system. Evidence is emerging of a similar effect due to spatial grid resolution for time accurate Large Eddy Simulations, and a disturbing lack of grid convergence. Further, the attractor may be only slightly attractive, and there will be bifurcation points and saddle points as well. And the attractor can be of very high dimension, meaning that tracing out all its parts could be computationally a monumental if not impossible task. So far, the bounds on attractor dimension are very large.

My suggestion would be to develop and fund a large long-term research effort in this area with the best minds in the field of nonlinear theory. Theoretical understanding may not be adequate at the present time to address it computationally. There is some interesting work by Wang at MIT on shadowing that may eventually be computationally feasible and that could address some of the stability issues for the long-term climate of the attractor. For the special case of periodic or nearly periodic flows, another approach that is more computationally tractable is windowing. This problem of time accurate simulations of chaotic systems seems to me to be a very important unsolved question in fundamental science and mathematics, and one with tremendous potential impact across many fields.
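As a self-contained toy illustration of the time-step sensitivity Williams describes, here is a rough sketch using the Lorenz-63 system and a deliberately low-order integrator; it is only meant to show that the long-run statistics can drift with the time step, not to reproduce his results:

```python
import numpy as np

# Lorenz-63 with forward Euler: how the long-run ("climate") mean of z
# shifts with the time step. Illustrative only; very coarse steps may go unstable.

def lorenz_rhs(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def long_run_mean_z(dt, t_end=200.0, t_spinup=20.0):
    """Integrate with forward Euler and return the time-mean of z after spin-up."""
    v = np.array([1.0, 1.0, 1.0])
    n_total, n_spin = int(t_end / dt), int(t_spinup / dt)
    z_sum, z_count = 0.0, 0
    for n in range(n_total):
        v = v + dt * lorenz_rhs(v)      # forward Euler step
        if n >= n_spin:
            z_sum += v[2]
            z_count += 1
    return z_sum / z_count

for dt in (0.001, 0.004, 0.01):
    print(f"dt = {dt}: long-run mean of z ~ {long_run_mean_z(dt):.2f}")
```

The instantaneous trajectories at the different time steps diverge completely, which is expected; the point is that even the time-averaged statistics need not agree.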

While the 2019 short perspective note by climate modelers Palmer and Stevens (see the full paper for the reference) is an excellent contribution by two unusually honest scientists, there is in my opinion reason for skepticism about their proposal to make climate models into eddy resolving simulations. Their assessment of climate models is in my view mostly correct and agrees with the thrust of this post, but there are a host of theoretical issues to be resolved before casting our lot with largely unexplored simulation methods that face serious theoretical challenges. Dramatic increases in resolution are obviously sorely needed in climate models, and dramatic improvements may be possible in subgrid models once resolution is improved. Just as an example, modern PDE-based models may make a significant difference. I don’t think anyone knows the outcomes of these various steps toward improvement.

3        The “Laws of Physics”

The “laws of physics” are usually thought of as conservation laws, the most important being conservation of mass, momentum, and energy. The conservation laws with appropriate source terms for fluids are the Navier-Stokes equations. These equations correctly represent the local conservation laws and offer the possibility of numerical simulations. This is expanded on in the full paper.
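For reference, the conservation-law (Navier-Stokes) system referred to here, written compactly with \(\boldsymbol{\tau}\) the viscous stress tensor, \(\mathbf{q}\) the heat flux, and \(S\) any additional sources (e.g., radiative heating in the climate context):

\[
\frac{\partial \rho}{\partial t} + \nabla\!\cdot(\rho \mathbf{u}) = 0, \qquad
\frac{\partial (\rho \mathbf{u})}{\partial t} + \nabla\!\cdot(\rho \mathbf{u}\otimes\mathbf{u} + p\,\mathbf{I}) = \nabla\!\cdot\boldsymbol{\tau} + \rho \mathbf{g},
\]
\[
\frac{\partial (\rho E)}{\partial t} + \nabla\!\cdot\big((\rho E + p)\,\mathbf{u}\big) = \nabla\!\cdot(\boldsymbol{\tau}\,\mathbf{u} - \mathbf{q}) + \rho\,\mathbf{g}\cdot\mathbf{u} + S.
\]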

3.1       Initial Value Problem or Boundary Value Problem?

One often hears that “the climate of the attractor is a boundary value problem” and therefore it is predictable. This is nothing but an assertion with little to back it up. And of course, even assuming that the attractor is regular enough to be predictable, there is the separate question of whether it is computable with finite computing time. It is similar to the folk doctrine that turbulence models convert an ill-posed time dependent problem into a well posed steady state one. This doctrine has been proven to be wrong – as the prevalence of multiple solutions discussed above shows. However, those who are engaged in selling CFD have found it attractive despite its unscientific and effectively unverifiable nature.

A simple analogy for the climate system might be a wing as Nick Stokes has suggested. As pointed out above, the drag for a well-designed wing is in some ways a good analogy for the temperature anomaly of the climate system. The climate may respond linearly to changes in forcings over a narrow range. But that tells us little. To be useful, one must know the rate of response and the value (the value of temperature is important for example for ice sheet response). These are strongly dependent on details of the dynamics of the climate system through nonlinear feedbacks.

Many use this analogy to try to transfer the (not fully deserved) credibility of CFD simulations of simple systems to climate models or other complex separated-flow simulations. This transfer is not justified. In any case, even simple aeronautical simulations can have very high uncertainty when used to simulate challenging flows.

3.2       Turbulence and SubGrid Models

Subgrid turbulence models have advanced tremendously over the last 50 years. The subgrid models must modify the Navier-Stokes equations if they are to have the needed effect. Turbulence models typically modify the true fluid viscosity by dramatically increasing it in certain parts of the flow, e.g., a boundary layer. The problem here is that these changes are not really based on the “laws of physics”, and certainly not on the conservation laws. The models are typically based on assumed relationships that are suggested by limited sets of test data or by simply fitting available test data. They tend to be very highly nonlinear and typically make an O(1) difference in the total forces. As one might guess, this area is one where controversy is rife. Most would characterize this as a very challenging problem, in fact one that will probably never be completely solved, so continued research and debate are a good thing.
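One widely used closure, the Boussinesq eddy-viscosity hypothesis, illustrates the point: the unclosed Reynolds stress is replaced by

\[
-\overline{u_i' u_j'} = \nu_t\left(\frac{\partial \overline{u}_i}{\partial x_j} + \frac{\partial \overline{u}_j}{\partial x_i}\right) - \frac{2}{3}\,k\,\delta_{ij},
\qquad \nu_{\text{eff}} = \nu + \nu_t,
\]

where the eddy viscosity \(\nu_t\) comes from auxiliary transport equations (one in Spalart-Allmaras, two in k-omega SST) and is often orders of magnitude larger than the molecular viscosity inside a boundary layer. Nothing in the conservation laws dictates the form of \(\nu_t\); it is a calibrated modeling assumption.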

Negative results about subgrid models have begun to appear. One recent paper shows that cloud microphysics models have parameters that are not well constrained by data. Using plausible values, ECS (equilibrium climate sensitivity) can be “engineered” over a significant range. Another interesting result shows that model results can depend strongly on the order chosen to solve the numerous subgrid models in a given cell. In fact, the subgrid models should be solved simultaneously so that any tuning is more independent of numerical details of the methods used. This is a fundamental principle of using such models and is the only way to ensure that tuning is meaningful. Indeed, many metrics for skill are poorly replicated by current generation climate models, particularly regional precipitation changes, cloud fraction as a function of latitude, Total Lower Troposphere temperature changes compared to radiosondes and satellite derived values, tropical convection aggregation and Sea Surface Temperature changes, just to name a few. This lack of skill for SST changes seems to be a reason why GCM model-derived ECS is inconsistent with observationally constrained energy balance methods.

Given the large grid spacings used in climate models, this is not surprising. Truncation errors are almost certainly larger than the changes in energy flows that are being modeled.  In this situation, skill is to be expected only on those metrics involved in tuning (either conscious or subconscious) or metrics closely associated with them. In layman’s terms, those metrics used in tuning come into alignment with the data only because of cancellation of errors.

One can make a plausible argument for why models do a reasonable job of replicating the global average surface temperature anomaly. The models are mostly tuned to match top of atmosphere radiation balance. If their ocean heat uptake is also consistent with reality (and it seems to be pretty close) and if the models conserve energy, one would expect the average temperature to be roughly right even if it is not explicitly used for tuning. However, this apparent skill does not mean that other outputs will also be skillful.
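A one-line version of this argument, in the usual zero-dimensional energy-budget notation (a deliberately crude sketch, not a description of how any particular model is tuned): with \(F\) the forcing, \(\lambda\) the net feedback parameter, and \(N\) the top-of-atmosphere imbalance (realized mostly as ocean heat uptake),

\[
N \approx F - \lambda\,\Delta T \quad\Longrightarrow\quad \Delta T \approx \frac{F - N}{\lambda},
\]

which is the sense in which matching the top-of-atmosphere balance and the ocean heat uptake largely constrains the global mean temperature response even when temperature is not an explicit tuning target.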

This problem of inadequate tuning and unconscious bias plagues all application areas of CFD. A typical situation involves a decades long campaign of attempts to apply a customer’s favorite code to an application problem (or small class of problems). Over the course of this campaign many, many combinations of gridding and other parameters are “tried” until an acceptable result is achieved. The more challenging issue of establishing the limitations of this acceptable “accuracy” for different types of flows is often neglected because of lack of resources. Thus, the cancellation of large numerical errors is never quantified and remains hidden, waiting to emerge when a more challenging problem is attempted.

3.3       Overconfidence and Bias

As time passes, the seriousness of the bias issue in science continues to be better documented and understood. One recent example quotes one researcher as saying “Loose scientific methods are leading to a massive false positive bias in the literature.” Another study states:

“Poor research design and data analysis encourage false-positive findings. Such poor methods persist despite perennial calls for improvement, suggesting that they result from something more than just misunderstanding. The persistence of poor methods results partly from incentives that favour them, leading to the natural selection of bad science.”

In less scholarly settings, these results are typically met with various forms of rationalization. Often we are told that “the fundamentals are secure” or “my field is different” or “this affects only the medical fields.” To those in the field, however, it is obvious that strong positive bias affects the Computational Fluid Dynamics literature for the reasons described above and that practitioners are often overconfident.

This overconfidence in the codes and methods suits the perceived self-interest of those applying the codes (and for a while suited the interests of the code developers and researchers), as it provides funding to continue development and application of the models to ever more challenging problems. Recently, this confluence of interests has been altered by an unforeseen consequence, namely laymen who determine funding have come to believe that CFD is a solved problem and hence have dramatically reduced the funding stream for fundamental development of new methods and also for new theoretical research. This conclusion is an easy one for outsiders to reach given the CFD literature, where positive results predominate even though we know the models are just wrong both locally and globally for large classes of flows, for example strongly separated flows. Unfortunately, this problem of bias is not limited to CFD, but I believe is common in many other fields that use CFD modeling as well.

Another rationalization used to justify confidence in models is the appeal to the “laws of physics” discussed above. These appeals, however, omit a very important source of uncertainty and seem to provide a patina of certainty covering a far more complex reality.

Another corollary of the doctrine of the “laws of physics” is the idea that “more physics” must be better. Thus, simple models that ignore some feedbacks or terms in the equations are often maligned. This doctrine also suits the interest of some in the community, i.e., those working on more complex and costly simulations. It is also a favored tactic of Colorful Fluid Dynamics to portray the ultimately accurate simulation as just around the corner if we get all the “physics” included and use a sufficiently massive parallel computer. This view is not an obvious one when critically examined. It is widely held however among both people who run and use CFD results and those who fund CFD.

3.4       Further Research

So what is the future of such simulations and GCMs? As attempts are made to use them in areas where public health and safety are at stake, estimating uncertainty will become increasingly important. Items deserving attention in my opinion are discussed in some detail in the full paper, posted here on Climate Etc. I would argue that the most important elements needing attention, both in CFD and in climate and weather modeling, are new theoretical work and insights and the development of more accurate data. The latter work is not glamorous and the former can entail career risks. These are hard problems, and in many cases a particular line of enquiry will not yield anything really new.

The dangers to be combatted include:

  • It is critical to realize that the literature is biased and that replication failures are often not published.
  • We really need to escape from the elliptic boundary value problem (well posed) mental model that is held by so many with a passing familiarity with the issues. A variant of this mental model one encounters in the climate world is the doctrine of “converting an initial value problem to a boundary value problem.” This just confuses the issue, which is really about the attractor and its properties. The methods developed for well-posed elliptic problems have been pursued about as far as they will take us, and this mental model can result in dramatic overconfidence in CFD models.
  • A corollary of the “boundary value problem” misnomer is the “If I run the model right, the answer will be right” mental model. This is patently false and even dangerous; however, it gratifies egos and aids in marketing.

4        Conclusion

I have tried to lay out in summary form some of the issues with high Reynolds number fluid simulations and to highlight the problem of overconfidence as well as some avenues to try to fundamentally advance our understanding. Laymen need to be aware of the typical tactics of the dark arts of “Colorful Fluid Dynamics” and “science communication.” It is critical to realize that much of the literature is affected by selection and positive results bias. This is something that most will admit privately, but is almost never publicly discussed.

How does this bias come about? An all too common scenario is for a researcher to have developed a new code or a new feature of an old code, or to be trying to apply an existing code or method to a particular test case of interest to a customer. The first step is to find some data that is publicly available or obtain customer-supplied data. Many of the older and well-documented experiments involve flows that are not tremendously challenging. One then runs the code or model (adjusting grid strategies, discretization and solver methodologies, and turbulence model parameters or methods) until the results match the data reasonably well. Then the work often stops (in many cases because of lack of funding or lack of incentives to draw more scientifically balanced conclusions) and is published. The often large number of runs with different parameters that provided less convincing results are explained away as due to “bad gridding,” “inadequate parameter tuning,” “my inexperience in running the code,” etc. The supply of witches to be burned is seemingly endless. These rationalizations are usually quite honest and sincerely believed, but biased. They are based on a cultural bias that if the model is “run right” then the results will be right, if not quantitatively, then at least qualitatively. As we saw above, those who develop the models themselves know this to be incorrect, as do those responsible for using the simulations where public safety is at stake.

As a last resort one can always point to any deficiencies in the data or, for the more brazen, simply claim the data is wrong since it disagrees with the simulation. The far more interesting and valuable questions about robustness and uncertainty or even structural instability in the results are often neglected. One logical conclusion to be drawn from the perspective by Palmer and Stevens calling for eddy resolving climate models is that the world of GCMs is little better. However, their paper is a hopeful sign of a desire to improve and is to be strongly commended.

This may seem a cynical view, but it is unfortunately based on practices in the pressure filled research environment that are all too common. There is tremendous pressure to produce “good” results to keep the funding stream alive, as those in the field well know. Just as reported in medically related fields, replication efforts for CFD have often been unsuccessful, but almost always go unpublished because of the lack of incentives to do so. It is sad to have to add that in some cases, senior people in the field can suppress negative results. Some way needs to be found to provide incentives for honest and objective replication efforts and publishing those findings regardless of the opinions of the authors of the method. Priorities somehow need to be realigned toward more scientifically valuable information about robustness and stability of results and addressing uncertainty.

However, I see some promising signs of progress in science. In medicine, recent work shows that reforms can have dramatic effects in improving the quality of the literature. There is a growing recognition of the replication crisis generally and the need to take action to prevent science’s reputation with the public from being irreparably damaged. As simulations move into the arena affecting public safety and health, there will be hopefully increasing scrutiny, healthy skepticism, and more honesty. Palmer and Stevens’ recent paper is an important (and difficult in the politically charged climate field) step forward on a long and difficult road to improved science.

In my opinion those who retard progress in CFD are often involved in “science communication” and “Colorful Fluid Dynamics.” They sometimes view their job as justifying political outcomes by whitewashing high levels of uncertainty and bias or making the story good click bait by exaggerating. Worse still, many act as apologists for “science” or senior researchers and tend to minimize any problems. Nothing could be more effective in producing the exact opposite of the desired outcome, viz., a cynical and disillusioned public already tired of the seemingly endless scary stories about dire consequences often based on nothing more than the pseudo-science of “science communication” of politically motivated narratives. This effect has already played out in medicine where the public and many physicians are already quite skeptical of health advice based on retrospective studies, biased reporting, or slick advertising claiming vague but huge benefits for products or procedures. Unfortunately, bad medical science continues to affect the health of millions and wastes untold billions of dollars. The mechanisms for quantifying the state of the science on any topic, and particularly estimating the often high uncertainties, are very weak. As always in human affairs, complete honesty and directness is the best long term strategy. Particularly for science, which tends to hold itself up as having high authority, the danger is in my view worth addressing urgently. This response is demanded not just by concerns about public perceptions, but also by ethical considerations and simple honesty as well as a regard for the lives and well-being of the consumers of our work who deserve the best information available.

Biosketch: David Young received a PhD in mathematics in 1979 from the University of Colorado-Boulder. After completing graduate school, Dr. Young joined the Boeing Company and has worked on a wide variety of projects involving computational physics, computer programming, and numerical analysis. His work has been focused on the application areas of aerodynamics, aeroelastics, computational fluid dynamics, airframe design, flutter, acoustics, and electromagnetics. To address these applications, he has done original theoretical work in high performance computing, linear potential flow and boundary integral equations, nonlinear potential flow, discretizations for the Navier-Stokes equations, partial differential equations and the finite element method, preconditioning methods for large linear systems, Krylov subspace methods for very large nonlinear systems, design and optimization methods, and iterative methods for highly nonlinear systems.

Moderation note: This is a technical thread, and comments will be ruthlessly moderated for relevance and civility.

December 2, 2022 1:33 pm

A fundamental error with climate models is that they are initialised on the basis that there was an energy balance around 1850, so that the climate was steady due to a long-run energy balance.

That is so far from reality that it makes any resulting output next to useless. It takes thousands of years for deep oceans to respond to changing heat input. It takes tens of thousands of years for land to store and discharge ice through a cycle of glaciation.

It does not matter how good the code for fluid flows is: if you apply the wrong drivers such as CO2 forcing and back radiation, and no meaningful cloud-forming component based on the physics of water solidification and condensation, then there is no hope of getting anything meaningful out. The most critical atmospheric control on earth’s energy balance is deep convection. I have not seen any reasonable academic study of this process. It is the major factor in earth having a habitable climate. Without deep convection pumping up the atmosphere, earth would descend into a snowball.

So called “global warming” has nothing to do with CO2. It is a function of the shifting solar intensity primarily due to the precession cycle. There was no energy balance in 1850 like there is no energy balance now. Earth has just started the climb back into glaciation as the northern oceans warm up. No climate model is predicting the accumulation of ice on the northern land masses but it is already occurring on Greenland. It takes a lot of energy to liberate water from the oceans and deposit it on elevated land.

No good solution to the Navier-Stokes equations is going to tell you much about the rate of accumulation of ice on land. That ice will play a dramatic role in changing the elevation of the land relative to the oceans and then eventually cool the oceans once the calving gets serious. These factors have a huge impact on climate.

None of these things can be solved by simplistic models that might be able to model fluid flow accurately.

I get involved in boat design. It is reasonably easy to numerically model the components of drag for a hull and the performance of sails – in calm water. Throw in the complexity of waves and there is a whole new level of complexity, beyond any CFD, due to the ensuing pitching.

ClimateBear
Reply to  RickWill
December 2, 2022 6:19 pm

I was the first to use CFD software in the mid 90’s at the tertiary institution I lectured part time at, albeit on a commercial task of my own. The college had just bought the code and it had been installed on a Sun workstation. I was (and still am) a welded-on Mac user so had to regress back to a command line interface as well as learn how to use the code. Finally set up a model and pressed ‘solve’ and hey presto got results with sexy colour graphics showing the wave profile around the hull at service speed etc. Hmmm, that bow wave looks a bit big! The vessel was in service (we were looking at adding bulbous bows, hence the exercise) and I had a photo on my office wall of same, and the actual wave was only about half as high. ??? The problem? My mesh was too coarse in that region of the hull and the program had converged to a false solution.

Now remember this was just modelling frictionless flow around the hull, ‘potential flow’ in the lingo, and luckily I had real-world evidence that the ‘solution’ was incorrect.

Imagine applying the Navier-Stokes equations to a mesh model covering the planet involving both sea and atmospheric dynamics, allowing for the tides and various ocean oscillations / dipoles etc, Rossby waves etc, and modelling the local high intensity elements such as storms, let alone hurricanes/typhoons/cyclones etc. What might go awry if, say, in order to keep solution times below years or decades, the mesh size was compromised and certain fudge factors were applied to cells in certain parts of the globe?

First issue, when does a ‘model’ cease to be a ‘model’ using first principles physics and become a partial automaton?

Second issue, how do ‘science communicators’ let alone journalists understand what is actually represented by the sexy coloured graphics?

The Real Engineer
Reply to  RickWill
December 3, 2022 4:46 am

But Rick, you can make a model which gives reasonable results, even with your unknowns: if you model and measure a number of cases with a modelled and a real boat, then you have plenty of scope to make the model work sufficiently well to use it for prediction. It is the prediction feature which current models miss. Let’s say I start at 1900 and feed in the data for that date. If it isn’t predicting the known values for 1950 it is obviously useless. It doesn’t matter about energy balance in 1850, or any other random piece of input; unless the 1950 temperatures (the result variable) match the measurement, the model is hopelessly defective!
It doesn’t matter what the basis of the model is, whether GCM or something else; the only important factor is the result! The method is an academic abstraction.
The physics of climate is very complex and significantly chaotic. OK, but this cannot be used as an excuse. If the model doesn’t work, you look for more known parameters (e.g. clouds) and find some equation that gives some result matching data for their effect. Then you do the process again and again until the result matches the data! This is called the Scientific Method, and leads to actual discovery of new science.

December 2, 2022 1:57 pm

“A simple analogy for the climate system might be a wing as Nick Stokes has suggested. As pointed out above, the drag for a well-designed wing is in some ways a good analogy for the temperature anomaly of the climate system.”

I think what Nick never really appreciates is that his analogy is simply wrong. Calculating drag for a well designed wing is analogous to a weather model, not a climate model.

For the analogy to work for climate, the wing would have to be slowly changing in response to its very precisely calculated drag. And then weather resulting from climate change is analogous to the drag against a future unknown wing.

I think the article is a good one and the solutions to CFD can be discussed ad infinitum. But IMO the main problem with GCMs is that they have a fitted component that is clouds (as well as many other not-based-in-physics influences) and consequently they’re not physics based at all.

If you add fitted clouds to physics (even if you say simplified physics is still valid physics) then you end up with a fit.

And consequently GCMs are expressions of expected climate change, not calculated climate change.

Nick Stokes
Reply to  TimTheToolMan
December 2, 2022 3:47 pm

“I think what Nick never really appreciates is that that his analogy is simply wrong. Calculating drag for a well designed wing is analogous to a weather model, not a climate model.”
I actually don’t propose flow over a wing as a direct analogue of climate. I mention it as a familiar problem, studied with wind tunnel too, to emphasise what you want to get from such a CFD problem, and what matters and doesn’t matter.

On this point it is analogous to a climate problem, not weather. If it depended on weather (eddies etc) then it wouldn’t work, because you’d never know the weather where your solutions would be used. 

Typically important things you want to know are lift and drag. These are numbers describing a notional steady state. The flow may never be truly steady, but you still want those useful numbers. They pertain to the attractor of the chaotic flow. 

I mention the wing to illustrate the relevance of initial conditions. Transient features of the CFD solution are sensitive to those, and are not reproducibly calculated. But the things you want to know emerge regardless. They don’t depend on the initial conditions, and just as well, because you never know them in real flight.

In both a CFD wing calculation and a GCM, you only care about initial conditions in terms of the trouble some initial unphysical feature might cause you. These sort themselves out over time, so you just allow a period for that. With GCMs, that can be called winding back, since you have a particular time period in mind. You wind back to 1850 or so, not because you have particular knowledge of that time, but because effects of your ignorance will have time to dissipate.

There is another interesting relevance wrt neglecting viscosity. For a long time before computers, the standard starting point for wing calculation was potential flow (Joukowski etc). That has no viscosity, and it does give useful results. The lift of a wing basically depends on the degree to which downward momentum is imparted to the oncoming flow. Viscosity has a minor role there, as also in many flow processes in the atmosphere.

Reply to  Nick Stokes
December 2, 2022 4:14 pm

I mention it as a familiar problem, studied with wind tunnel too, to emphasise what you want to get from such a CFD problem, and what matters and doesn’t matter. On this point it is analogous to a climate problem, not weather.

Typically important things you want to know are lift and drag. These are numbers describing a notional steady state.

Lift and drag are immediate effects (like today’s weather) and don’t relate to climate change, which is changing weather by definition.

I find the notion that modelled steady state flow as representing an attractor as being a very weak argument as relates to understanding (climate) change.

Reply to  Nick Stokes
December 2, 2022 6:14 pm

You wind back to 1850 or so, not because you have particular knowledge of that time, but because effects of your ignorance will have time to dissipate.

Your ignorance will not be sorted from 1850 to the present time by running any simplistic climate model. The climate system has far longer time scales. It takes thousands of years for the cool water currently sinking in the Southern Ocean to find its way to the tropics. 

The northern hemisphere is only showing the signs of snow accumulation again. It will be a century or more before it is obvious that it is accumulating. And then more centuries before the summer temperatures over that ice does not get much above 0C and has a dramatic impact on cooling. 

Once the ice builds, there are good reconstructions that show the last two cycles hung around for 4 precession cycles. The northern land masses that store the ice will be 600m higher elevation of land over ocean than present.

The solar conditions that caused the ice to melt were the same as those that caused the ice to build. The difference was glacier calving, which has time frames of tens of thousands of years to build up. 

No climate model predicted the cooling ocean water south of Greenland. That means they are not taking glacier calving into account. A good solution to atmospheric and ocean fluid flows is not going to help with glacier calving. 

So a belief that two centuries is enough to equilibrate is fanciful thinking. If the climate model cannot reproduce the last four cycles of glaciation, it is not a climate model. 

The acceleration of the Northern Hemisphere warming that is bringing the modern interglacial to an end is breathtaking, but it will take centuries for the ice accumulation to have serious impacts. Winter snowfall will increase by 50% from the present level in just 150 years; rapid changes compared with recent trends, but only an early stage of acceleration into glaciation.

David Young
Reply to  Nick Stokes
December 2, 2022 6:15 pm

Thanks for commenting Nick. I do disagree though; viscosity plays a critical role in wing design and in the atmosphere. It makes a huge difference, as I discuss in detail in the referenced paper.

Nick Stokes
Reply to  David Young
December 2, 2022 9:43 pm

Sorry I haven’t had time to comment more today. But I have downloaded your paper and will read it carefully. Thanks.

ClimateBear
Reply to  Nick Stokes
December 4, 2022 2:51 pm

Nick,
regarding the wing / ship hull analogy to a climate model vis a vis CFD analysis:
a) the wing/hull analogy is orders of magnitude less complex than earth’s climate
b) the viscosity issue is readily handled separately by physical model testing and then added back to the CFD potential flow analysis results
c) the viscosity/turbulence components of the climate range from local storm cells to hurricanes/cyclones/typhoons to the various oceanic oscillations (La Nina, IOD etc) and their interactions with each other (e.g. Australia is just at the tail end of a triple La Nina – negative IOD – Southern Ocean oscillation phase, some anomalous jetstream behaviour and net easterly (onshore) east coast winds producing severe to extreme flooding).

It is the simplistic treatment of so-called climate change that is the real problem, and pretending simplistic analysis is the same as actual modelling.

Reply to  ClimateBear
December 5, 2022 6:16 am

“…simplistic analysis is the same as actual modelling.”

It’s not really even simplistic analysis. It is curve fitting to match an assumed result. It’s backwards. Results first, calculation second using parameterized values.

David Young
Reply to  TimTheToolMan
December 2, 2022 6:38 pm

I think Nick’s analogy is quite apt. In the case of a wing there is a forcing, viz., the angle of attack, and the “climate” is the flow about the wing. As the forcing is varied, both the climate and the lift of the wing vary linearly.

Reply to  David Young
December 2, 2022 6:47 pm

And I don’t find it apt because there is no relation between the angle of attack (and consequent flow) and the previous angle of attack (and its consequent flow). In this analogy GCMs model how the angle of attack is expected to change given the current lift and drag.

If you ignore that, then all you’re saying is that one climate state is represented by one angle of attack and another climate state is represented by a different angle of attack, and that is not a useful analogy in terms of understanding how CFD calculations help us understand a changing climate.

David Young
Reply to  David Young
December 2, 2022 6:50 pm

Sorry Tim but you are wrong about this. Near stall there are multiple solutions for the wing lift and the result you get does depend strongly on the flow field you start with.

Reply to  David Young
December 2, 2022 7:00 pm

Who said anything about stalling?

That’s a strawman.

Reply to  David Young
December 2, 2022 7:29 pm

I should probably give you some context here. As far as I can tell from my historical dealings with him, Nick is essentially of the opinion that climate change is primarily a problem of TCS and that ECS isn’t really on his radar.

For a very near term TCS calculation, the current forcing is all that matters and that is more or less the forcing from current CO2 levels. We may as well be simply calculating the weather.

But for ECS, the current forcing and feedbacks determine the amount of energy the earth accumulates which in turn determines those future climate states and IMO are really what we should care about.

If there is a lot of energy accumulating then the sea level can rise more quickly for example. But only with precise knowledge of how the accumulated energy changes from year to year and how that impacts the CFD calculations, can we understand the actual longer term climate change.

So when Nick (and you) argue that CFD is represented by the flow from one angle of attack (ie forcing) and then CFD is represented by an increased angle of attack (ie bigger forcing) then you’re limiting the analogy to instantaneous and unconnected forcings (ie angles of attack)

And I find that not a useful way to look at GCMs.

David Young
Reply to  David Young
December 2, 2022 9:03 pm

It is common in CFD for the current solution at a changed forcing to depend on the past state of the system. It seems to me that you are just reacting to Nick because you have disagreed with him in the past but have essentially no knowledge of the subject matter here.

Reply to  David Young
December 2, 2022 9:37 pm

It is common in CFD for the current solution at a changed forcing to depend on the past state of the system.

But not if you set the angle of attack as the analogous forcing with no reference to the past state of the system, and the analogy doesn’t allow for that. Hence IMO it’s not a good analogy for climate, but is OK for weather where the forcing doesn’t change.

David Young
Reply to  David Young
December 2, 2022 9:37 pm

Another misconception is that flow about a wing is “steady state”. This is untrue as all turbulent flows have no true “steady state.” Even in wind tunnel or flight testing data is time averaged. Turbulence models ideally mimic this averaging process (in practice they do this very inexactly). This is of course exactly the situation with climate which is an average of a time dependent flow field.

Reply to  David Young
December 2, 2022 9:40 pm

This is of course exactly the situation with climate which is an average of a time dependent flow field.

Nope. Without a change in the wing or airflow, this is weather.

David Young
Reply to  David Young
December 3, 2022 12:57 pm

I’m not sure this distinction has much relevance to the fact that CFD and weather and climate models are based on the same equations and the same methods.

You need to bear in mind that even in the wing case, the adjustment to a changed forcing is not instantaneous and in some cases there are even multiple solutions. Of course in the real world, turbulence and larger eddies in the onset flow will guarantee that there is no such thing as a single “angle of attack.” There is also large scale unsteadiness that varies with time. In this case, it can take quite a while for the final state to be arrived at.

Reply to  David Young
December 3, 2022 5:46 pm

I’m not sure this distinction has much relevance to the fact that CFD and weather and climate models are based on the same equations and the same methods.

Useful analogies need to capture the important features and a climate model is fundamentally different from a weather model in that a GCM accounts for changing weather due to some forcing.

At best the wing analogy works for a GCM control run or initialisation.

December 2, 2022 2:04 pm

wow, complicated stuff– all the more reason I think people who spit out the idea that climate change is obviously due to human caused “carbon pollution” have their frontal lobes disconnected

Rud Istvan
Reply to  Joseph Zorzin
December 2, 2022 2:27 pm

See my longish comment over at Judith’s. You are correct. But there are simpler, still accurate, explanations.

son of mulder
December 2, 2022 2:11 pm

I hate to be negative, but back in the seventies I lost faith in known mathematics to deal with long term modelling of fluid mechanics or my chosen area of research, General Relativity. Underlying both disciplines is calculus. It works on the small scale as a reasonable approximation for system evolution, e.g. weather forecasting or precession of the perihelion of Mercury, but on the large scale and over long time periods you have to remember that the physical system in the case of fluids is molecules rolling around, i.e. finite particles, not a differentiable continuum. Similarly in General Relativity the Planck length somewhat limits the power of calculus for modelling. Add to that the accumulating, iterative errors of computer modelling and forced roundings, where often the system is chaotic, and who knows what imaginary, non-physical rabbit hole it leads one down.

Even if all that wasn’t an issue, just trying to impose a suitable set of dynamic initial conditions and sensibly coupling the oceans and the atmosphere seems to me to be beyond the capability of our technology. There is only one computer capable of solving this stuff and that is the earth system itself. Requires a bit of a wait though. It seems to me we might as well adapt if we see bad stuff starting to happen, a bit like we’ve always done.

Beta Blocker
Reply to  son of mulder
December 2, 2022 3:56 pm

son of mulder: “There is only one computer capable of solving this stuff and that is the earth system itself.”

Twelve years ago, I made a similar comment on WUWT. If I remember correctly, it was in response to an article by Demetris Koutsoyiannis concerning the supposed water vapor feedback mechanism operating inside the earth’s climate system.

I remarked that rather than using digital simulations, we could learn a lot more about these supposed feedback processes simply by looking directly at what was actually happening inside the real atmosphere, doing so in real time.

Koutsoyiannis’ response, offered more than a decade ago, was that it is not possible at the current state of science to observe these alleged feedback processes operating in real time inside the earth’s real atmosphere. Their presence or absence must be inferred from other kinds of observations using other kinds of analysis.

In this way, the climate calculation machine which is the earth’s actual climate system is in some ways a black box which keeps its methods and means under wraps.

Rud Istvan
Reply to  son of mulder
December 2, 2022 3:57 pm

An observation from a former calculus model practitioner. The whole thing is based on the idea of an infinitesimal on a smooth function interval. The problem is that with nonlinear dynamic systems (the definition of chaotic), there are abrupt bifurcations so that in certain regions no smooth function exists.

Rud Istvan
December 2, 2022 2:22 pm

Made a long substantive comment over at Judith’s. Only a short synopsis here.
Good post by an SME, but perhaps overly technical for lay discourse.

There are simpler ways to explain why the models cannot be right (the attribution problem illustrated by AR4 WG1 SPM fig. 4 and my previous models guest posts here), simpler proofs that they are observationally wrong (Christy’s absence of a tropical troposphere hotspot being only one of three simple ones), and simpler evidence that past reliance on them was wrong (sea level rise has not accelerated, summer Arctic sea ice has not disappeared, Glacier National Park still has glaciers).

That impossible modeling continues despite theoretical and observational evidence of abject failure can be attributed to just two things: the alarmists have not got anything else, and lots of climate modeling money is still available despite the ‘settled science’.

December 2, 2022 2:30 pm

So, David, have you spoken with Gavin Schmidt?

Also, I suspect Jerry Browning’s work is directly relevant.

The unique, well posed reduced system for atmospheric flows: Robustness in the presence of small scale surface irregularities. Dynamics of Atmospheres and Oceans. 2020;91:101143. doi: 10.1016/j.dynatmoce.2020.101143

David Young
Reply to  Pat Frank
December 2, 2022 3:47 pm

I sent Gavin some papers perhaps 8 years ago. He didn’t get back to me. My discussion with him at RealClimate ended with his assertion that “Every time I run the model, I get a reasonable looking climate.” I understand why he didn’t get back to me. These models are massive pieces of software that have been built up for many decades by accretion. It takes a massive effort to put newer and more modern methods into them and funding is absent to build a new one. In the old days of CFD in the 1970’s it was possible for one person to build their own code because the CFD problems were vastly simpler idealizations of real world problems. As people started to look at real world problems, the codes became much more complex.

I think Browning’s work would improve weather modeling of Rossby waves. Not sure it would help much in the tropics. Convection is an ill-posed problem.

Reply to  Pat Frank
December 3, 2022 4:32 am

I am surprised Gavin Schmidt et al didn’t come back and tell David that because he is not a climate scientist he isn’t allowed to give climate scientists advice.

David Young
Reply to  Pat Frank
December 3, 2022 8:36 am

Over the years, I’ve suffered a lot of abuse online mostly from anonymous non-scientist trolls. Gavin was professional in his responses to my comments. At that time, I was just starting to read and think about these issues and my understanding was incomplete.

Reply to  David Young
December 3, 2022 10:02 am

I was drawn into a debate with Gavin on RealClimate in 2008 after publishing
A Climate of Belief in Skeptic magazine. He was reasonable in debate then, too.

His only censorious misstep was to delete four citations I’d provided, as “contrarian noise.”

At the end, Gavin didn’t dent the article.

December 2, 2022 3:12 pm

Thank you David Young. Greatly appreciated.

“Colorful Fluid Dynamics” – for climate system visualization, what if a system of sensors provides high-resolution radiance data originating from the flow field at short time intervals? Apologies to those here at WUWT who have seen this before. The point is that by viewing the animated series of short-time-interval images, one can readily appreciate the inability to reliably diagnose or predict the atmospheric and surface response to GHG “forcings” using the large-grid, discrete-layer, step-iterated, parameter-tuned GCMs.

This link is for the GOES East geostationary satellite, Band 16. Set for 8 hours of images for the full disk view, i.e. the most recent 8 hours from clicking the link.

https://www.star.nesdis.noaa.gov/GOES/fulldisk_band.php?sat=G16&band=16&length=48

I marvel at the huge volume of data being accumulated by these satellites and the supporting systems. I realize it’s not the whole planet, but maybe full coverage with a dozen or so more satellites and some wider-spectrum sensing would be a reasonable use of funding.

taxed
December 2, 2022 3:23 pm

I don’t know about climate models, but current weather models tell me that the jet stream is currently set up for climate cooling over the NH.
The large increase in jet stream activity within the Arctic circle, along with the breakdown of the zonal flow of the jet stream across the NH, allows the static weather patterning and the large movement of air mass in and out of the Arctic circle that is needed to cause climate cooling across the NH, should these conditions last over the long term.

December 2, 2022 4:21 pm

I’ve tried to read through this with understanding.

My gut tells me that CFD on airplane wings is simple compared to climate. I envision a wind tunnel with fans above, below, to the sides, in front and in back, all of them varying with different periods so you don’t have static conditions at any point in time. In addition, the wing leading edge would be part knife edge, part concave, part convex, with bumps all over the wing surface that also change in time with different periods. Trying to model this with CFD would be nigh unto impossible.

I’ve said this before and I’ll reiterate here. Using temperature as a proxy for radiation, or as the input to a flow analysis, is fruitless. It is the wrong proxy to use as an input and the wrong choice of output when dealing with fluid flow analysis.

DY points out that:

“Time accurate simulations are also challenging because the numerical errors are in some sense cumulative, i.e., an error at a given time step will be propagated to all subsequent time steps.”

Dr. Pat Frank pointed this out some time ago. The uncertainties grow to a point where the uncertainty interval is so large that the outputs can not be judged to be accurate.
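For what it is worth, here is a back-of-the-envelope Python sketch (the per-step error sizes are made-up assumptions) of why cumulative per-step errors matter: a systematic error grows linearly with the number of steps, while independent random errors grow roughly as the square root of the number of steps.

```python
# Made-up illustration of cumulative per-step error in a long time integration.
# The per-step error magnitudes below are arbitrary assumptions.
import random

N = 10_000          # number of time steps
bias = 1e-4         # hypothetical systematic error added every step
sigma = 1e-4        # hypothetical random error per step (standard deviation)

systematic_drift = N * bias                       # grows linearly with N
random_walk = sum(random.gauss(0.0, sigma) for _ in range(N))

print(f"systematic drift after {N} steps: {systematic_drift:.3f}")
print(f"one random-walk realisation:      {random_walk:+.3f}")
print(f"expected random spread (sqrt(N)*sigma): {(N ** 0.5) * sigma:.3f}")
```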

The author goes on to say that:

“The level of detail and physical modeling varies greatly, as do the accuracy requirements. For aerodynamic simulations, accurate drag increments between configurations have high value. In climate simulations, a widely used target variable is temperature anomaly. Both drag increments and temperature anomalies are particularly difficult to compute accurately. The reason is simple: both output quantities are several orders of magnitude smaller than the overall absolute levels of momentum for drag or energy for temperature anomalies. This means that without tremendous effort, the output quantity is smaller than the numerical truncation error.”

Jeez, “… output quantities are several orders of magnitude smaller than the overall absolute levels … energy for temperature anomalies.” This is what many of us have been trying to say for quite some time. Trying to compute anomalies to the one-thousandths place when much of the past data has only integer resolution is folly. It is counting angels on the head of a pin and saying “trust me, it is accurate.”
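As a crude illustration of the small-anomaly-on-a-large-level problem (toy numbers of my own, nothing to do with any particular model), subtracting two nearly equal numbers in single precision loses most of the significant digits of the anomaly:

```python
# Toy precision experiment: a 0.001 K anomaly riding on an absolute level of
# ~288 K is close to the single-precision rounding granularity at that level.
import numpy as np

level = np.float32(288.0)       # nominal absolute temperature, K
anomaly = np.float32(0.001)     # a 0.001 K anomaly

recovered32 = (level + anomaly) - level          # single precision
recovered64 = (288.0 + 0.001) - 288.0            # double precision

print(recovered32)   # not exactly 0.001: float32 spacing near 288 is ~3e-5 K
print(recovered64)   # double precision recovers the anomaly far more closely
```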

Reply to  Jim Gorman
December 2, 2022 5:48 pm

Dr. Pat Frank pointed this out some time ago.

I brought this point up at Judith’s website. David is not a fan of Pat’s work. You might want to go see his comment as it appears he isn’t going to respond to you here.

Reply to  Clyde Spencer
December 3, 2022 10:27 am

Thanks for the heads-up, Clyde. I’ve gone to Judith’s and replied to David. He’s evidently seeing uncertainty and thinking error.

The latter is bounded; the former, not.

Reply to  Pat Frank
December 8, 2022 4:13 pm

The conversation with David has ended in an impasse.

sherro01
December 2, 2022 5:01 pm

Over the last couple of decades of climate-related blog comments, my single dominant theme has been the poor scientific treatment of measurement uncertainty. A consequential theme has been the willingness of poor scientists to cover up uncertainty problems by various mental contortions. Only yesterday I posted about authors of dominant heatwave literature using questionable cherry-picked start dates, self-invented definitions of variables and homogenised temperature data.
This was before I read the present informative article by David Young, which has many points in common. David is far ahead of me in ability, so it is pleasant to read his similar conclusions. However, the aim is not to be pleased by an outcome. The aim is to be correct.
Poor standards and practices in climate research have dragged down the public view of science in ways that will take generations to restore. We could start now, with influential climate researchers doing a mass mea culpa about their intellectual honesty in a special edition of “Nature”. They are all bright enough to know when they have strayed and of the need to confess. Geoff S

David Young
Reply to  sherro01
December 4, 2022 10:58 am

Thanks sherro. I wouldn’t hold my breath on the mass mea culpa. The science machine is well oiled and self-reinforcing. I have a follow-up for Climate Etc. that will appear next year, looking at how the pandemic changed science for the worse and giving the evidence that the public has lost faith in the science establishment. We need alternatives to the corporate media, big tech tyrants, and “the science” establishment. There are some promising signs, but we all need to stay involved. Support places like Substack, Rumble, and, now that Musk has drained the swamp, Twitter. Matt Taibbi and Glenn Greenwald are interestingly the best and most honest of the lot. Both can be supported at Substack.

Geoff Sherrington
December 2, 2022 7:07 pm

Colorful fluid dynamics.
Here is 10 minutes of keyboard fiddling with the bottom diagram (from the web) to produce the colourful one at the top, done in Corel.
Is the field as easy as this? Geoff S
http://www.geoffstuff.com/cfd.jpg

December 2, 2022 8:06 pm

A prominent mistake in GCMs is failure to account for measured average global water vapor. Water vapor molecules have been increasing 7 times faster than CO2 molecules. Water vapor has been increasing substantially more than possible from just feedback. Water vapor increase can account for all climate change attributable to humanity. http://globalclimatedrivers2.blogspot.com

JDaniel
December 2, 2022 9:38 pm

My PhD work back in the 1990s involved computer modeling of plasma physics. One of the primary sanity checks of the computer model was that it conserved energy. When I read about climate computer models predicting extreme warming, I thought the entire thing absurd. During my PhD work, my advisor had warned me that computer models were regarded with great suspicion by most physicists, and most claims had to be backed up with a lot more than GIGO computer logic. The notion that a computer model that explicitly did not conserve energy could predict exactly how much the energy of the system increased or decreased was an absurd assertion at the time, and remains an absurd assertion today.

David Young
Reply to  JDaniel
December 3, 2022 8:39 am

Yes, one of the unspoken measures of merit for CFD codes is that they discretely conserve momentum, mass, and energy. GCMs are so complex that it is opaque whether or not they do this. In the past they did not, but I think they have improved in the last 20 years.
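A minimal sketch of what discrete conservation means in practice (my own toy example, not code from any CFD solver or GCM): in a flux-form finite-volume update, whatever leaves one cell enters its neighbour, so the domain total is preserved to round-off regardless of how accurate the individual fluxes are.

```python
# Discrete conservation in a flux-form update: the flux differences telescope,
# so the total of q is unchanged to round-off on a closed domain.
import numpy as np

n = 100
q = np.random.rand(n)            # cell averages of some conserved quantity
flux = np.random.rand(n + 1)     # (made-up) fluxes at the n+1 cell faces
flux[0] = flux[-1] = 0.0         # closed domain: nothing in, nothing out
dt_over_dx = 0.1

total_before = q.sum()
q = q - dt_over_dx * (flux[1:] - flux[:-1])
total_after = q.sum()

print(total_before - total_after)   # zero to round-off, however wrong the fluxes
```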

rxc6422
Reply to  JDaniel
December 5, 2022 11:37 am

One of my functions when I was working was to coordinate experiments to test computer codes for nuclear plants. Most of the experimental facilities were scaled models, not using nuclear heat, and there were many discussions about the disagreements of codes and experimental data. We used to joke that no one believed any of the code calculations, except the guys who ran the model, while everyone believed the data from the experimental facility, except the guys who ran the experiment.

This sort of stuff is very hard to do. When you scale it up to the size of a planet, the number of phenomena that have to be accounted for is, literally, phenomenal, and the size of the grid that you would have to use to get meaningful results cannot be simulated on any computer.

It is a fantasy. I tell friends who ask me about global warming that meteorologists using extremely powerful computers and enormous amounts of real-time, measured data cannot predict the high and low temperature 10 days from now on the National Mall in Washington to an accuracy of 1 °C, yet climate scientists, using essentially simplified models of the same Navier-Stokes equations, say they can “project” the average temperature of the entire planet to an accuracy of 0.1 °C for the next 100 years. It does not make any sense at all.

The Real Engineer
December 3, 2022 4:25 am

The basic problem with climate models is that the predictions from previous runs are not used as a check that the models correctly predict what is known to have happened! Either the people using the models have lost their starting point (quite likely) or they are so wedded to the results they have obtained, although obviously erroneous, that they don’t care.
Engineering models, as the author says, have had huge feedback from physical results to make the models very accurate and suitable for prediction and design. Electronic models are a case in point: circuits behave as the models say; they PREDICT correctly.
However, climate models have been wildly wrong now for at least 40 years, because the wanted result is not accuracy but wildly rising temperatures. They do not even give the known result from older starting dates; again, much too hot. How much longer can this continue? In my view the modelers should be sacked, and then the subject is closed. The alternative is to employ a few engineers to make the models work, but of course that is much too simple a solution!

Reply to  The Real Engineer
December 3, 2022 10:31 am

None of the GCMs have gone through formal Validation & Verification.

Set up a couple or three V&V projects employing unbiased engineers and one would rapidly see GCMs falsified as a class. And properly so.

Geoff Sherrington
Reply to  Pat Frank
December 3, 2022 11:14 pm

Pat,
Here is some relevant discussion.
Geoff S
http://www.geoffstuff.com/rba.docx

Reply to  Geoff Sherrington
December 4, 2022 8:17 am

Hi Geoff — their reply seems about equivalent to, ‘Thank-you for your interest.’

Evidently, determining the validity of data used for investments is considered unnecessary.

Richard M
December 3, 2022 7:10 am

Some models can be useful. GCMs just don’t happen to be among them. All they do is produce the biased views of the programmers. They are propaganda tools and nothing else.

For example, how does climate sensitivity get simulated? They certainly aren’t simulating added CO2 molecules. They are simulating what they think will happen. They are guessing. That is fine for research projects but should never be considered as having anything to do with planetary climates.

BTW, it is precisely what happens at the molecular level that prevents added CO2 from warming the boundary layer. GCMs will never see this.

Curious George
December 3, 2022 7:52 am

From a pedestrian perspective: I distrust state-of-the-art computational fluid dynamics codes. The Diablo Canyon nuclear plant has a problem with rapidly corroding heat exchangers. They ordered new heat exchangers, but they were not much better.

I assume that the fluid there was supercritical water, but if they could not handle the simulation, the honest thing would have been to admit that they could not do it.

Reply to  Curious George
December 3, 2022 8:00 am

Admittedly tangential to your point, but does this mean that Gavin should not have extended its operating life? Otherwise, IMO, if they could use fuel already prepped for that other plant now on the chopping block, I was cool with it. As an old neighbor who raised his kids nearby and left one to graduate from CPSLO, I thought the safety risk was way overblown.

David Young
December 4, 2022 12:59 pm

Since I was asked about it here, I will extend my comments on the toxic atmosphere in climate blogs, especially “mainstream” blogs.

I think this is changing a little this year, but for over a decade the comments sections, and sometimes the posts themselves, were filled with insults, slanders and pseudo-science. In the comments it was almost exclusively anonymous non-scientist trolls or professional obfuscators who used any fallacious nonsense to sow doubt: exactly the definition of disinformers put forward at Skeptical Science. This noise was almost exclusively politically motivated, and it was shameful that the moderators at RealClimate or Annan’s blog winked at it and allowed it.

The only climate scientist I would call out personally is Andrew Dessler. He and I had a useful exchange at Science of Doom. Of course, he ignored my litany of studies on how much of science is pseudo-science and insisted that he knew the truth about clouds. I pointed out what this post shows, viz., that GCMs are untrustworthy and that any skill is likely due to cancellation of large errors. And there is Zhou et al. He then later went on Twitter and smeared me by calling me a denier.

In fact, I would argue that climate science pioneered the activist, politically motivated scientist. The reason I suspect things may be about to change is that the pandemic proved the bankruptcy of the science establishment. It got very personal, nasty, and in some cases dangerous for dissenters. Ioannidis was caned by an online mob of cynical and disrespectful people. Some fellow scientists joined the mob. It is hard to imagine a more apolitical and well respected scientist. Cliff Mass has also suffered quite significant damage to his career simply for being a voice of reason and not knuckling under to the woke mob.

The result is that a large portion of the public distrusts the “mainstream” media and the science establishment, particularly the CDC and the NIH who were caught lying time and time again. They were a perfect analogy to what climate scientists have been doing for 20 years. Now the cat is out of the bag. Regardless of party affiliation, people are so done with masks, restrictions, school closures, science denying “gender” theory, massive printing of money, and just plain politically motivated lies.

Elites beware, winter and Elon are here.

Piteo
December 5, 2022 2:21 am

In the mid 90s I was using Navier-Stokes and CFD to work on an engineering problem (mechanical face seals, to be exact). We made some assumptions (e.g. one liquid with a constant viscosity, no cavitation, no movement of the liquid in the z-direction, no heat generation, no heat transfer, smooth surfaces, …). This made the NS equations simple enough that, with the knowledge of partial differential equations and finite element methods at that time, it was possible to put it all into computer code and get some results out of the models.

If CFD and NS are used for climate modeling, such simplifications cannot be made. On the contrary, I am pretty sure that the model needs to be extended to capture the complexity of the problem. I also do not think that our ability to convert mathematical equations into computer code has improved so much that it can handle the complexity of an accurate climate model. (However, I am open to hearing if you think differently.)

I therefore do not trust the outcomes of these models.

What I also learned back then was that one small typo in the code, a minus sign instead of a plus sign, can make a huge difference in the outcome. Have we developed methods to prevent these human errors from happening?

Reply to  Piteo
December 5, 2022 6:39 am

“Have we developed methods to prevent these human errors from happening?”

Nope. The method I learned back in the late 60’s was to have two different people type the same program onto different punch cards and then compare the runs of the two different punch card sets. If they were different it was probably a typo somewhere in one of them. I sincerely doubt that is done with the climate model code.

Reply to  Tim Gorman
December 5, 2022 9:04 am

At the Missouri School of Mines we swapped out the formatting cards. Just another indicator of engineering laziness. After all, who else goes to college for at least 4 years to learn to do things the easy way? Anyhow, we’re both giving away our vintages…

BTW, I hope you didn’t confuse an old MGC diss of your school with me. If you paid attention, your school was just fine. You were certainly taught Engineering Statistics, but now refute it. I wonder why…..

David Young
Reply to  Piteo
December 5, 2022 10:41 am

In the old days when we had a large team, we would assign two people to each subroutine so that there were two assessments of the accuracy of the code. But yes, coding errors are quite common, and catching them requires building up a set of test cases that isolate parts of the code.
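As a small example of the kind of isolated test case I mean (the routine and the tolerance here are just an illustration), checking a small numerical kernel against a known analytic answer makes a stray sign flip fail loudly instead of silently contaminating the full simulation:

```python
# A tiny isolated test: verify a central-difference derivative against an
# analytic derivative. A '+' typed as '-' in the kernel would trip the assert.
import math

def central_difference(f, x, h=1e-5):
    return (f(x + h) - f(x - h)) / (2.0 * h)

def test_central_difference():
    x = 0.7
    approx = central_difference(math.sin, x)   # exact answer is cos(x)
    assert abs(approx - math.cos(x)) < 1e-8, "derivative kernel is wrong"

test_central_difference()
print("derivative test passed")
```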

rxc6422
December 5, 2022 11:17 am

The Nuclear Regulatory Commission developed a methodology for evaluating thermal-hydraulic models used to analyze the behavior of nuclear power plants back in the 1980s. It is called the Code Scaling, Applicability and Uncertainty Evaluation Methodology (CSAU) – https://inis.iaea.org/search/search.aspx?orig_q=RN:24012437

I used to be in charge of evaluating codes like this at the NRC, and this method was the gold standard, accepted around the world for this process. It requires, first, the establishment of a Phenomena Identification and Ranking Table (PIRT), to identify all of the relevant and important phenomena that affect the system you want to model. Then there are comparisons of the various parts of the code against real, measured, reproducible experiments or observations, to establish the uncertainty of the individual parts. This is followed by comparisons of the overall model against real data in test facilities, or in real operating reactors, to establish the overall uncertainty.

A few years ago I tried to find a similar methodology for climate models, and discovered the “Good Practice Guidance Paper on Assessing and Combining Multi Model Climate Projections”, developed by an expert meeting of Working Group 1 of the IPCC, in 2010. It includes a number of similar concepts to the CSAU methodology, but also some very strange concepts, including especially the averaging of the output from different runs from different computer models (an “ensemble”). 

“Multi-model mean (un-weighted): An average of simulations in a multi-model ensemble, treating all models equally. Depending on the application, if more than one realization from a given model is available (differing only in initial conditions), all realizations for a given model might be averaged together before averaging with other models.” There is also a weighted multi-model mean. 
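For clarity, here is roughly what that quoted recipe amounts to in code (a sketch with invented numbers and invented weights, not real model output): realizations are averaged within each model first, then the per-model means are averaged, with or without weights.

```python
# Sketch of the quoted multi-model mean recipe, using made-up numbers.
import numpy as np

# hypothetical projections (say, warming in K) from three models, each with
# one or more realizations differing only in initial conditions
runs = {
    "model_A": [2.1, 2.3],
    "model_B": [3.0, 2.8, 3.2],
    "model_C": [4.1],
}

per_model = {m: float(np.mean(v)) for m, v in runs.items()}  # average realizations first
unweighted_mean = float(np.mean(list(per_model.values())))   # treat models equally

weights = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}   # hypothetical weights
weighted_mean = sum(weights[m] * per_model[m] for m in per_model)

print(per_model)
print(unweighted_mean, weighted_mean)
```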

The terms it uses are similarly confusing; the definition of a “performance metric” is particularly so. The report also provides guidance about using “narratives”, such as “(storylines, quantitative or qualitative descriptions of illustrative possible realizations of climate change)”, to fill in for a lack of data or for uncertainty. It reads more like a social science report than a physical science report.

I cannot imagine that anyone using CFD codes for engineering applications ever uses averages of different analyses performed with different initial conditions or different codes. It is common to try different CFD codes and inputs to establish an envelope for design purposes, because of uncertainties, but the IPCC is supposed to provide what is referred to in the nuclear business as “best estimate” analyses. Uncertainties are still established, and margins are still included in designs to account for them. I can’t say I have ever seen any error bars on any climate temperature “projections”.

Finally, I note that there is not even any attempt in the IPCC models to predict clouds. Condensation phenomena in multiphase systems are notoriously difficult to model from first principles, which is why engineers do not waste time trying to do so. It is much easier to do testing in a similar, representative system and apply those measured results to the system at hand, which you cannot do when the system is an entire planet with oceans, ice fields, many different landforms and biologicals, including human beings. On this basis alone (they cannot model clouds) the IPCC codes fail the key test of fitness for service, because clouds have many profound effects on planetary temperature.

The codes that seem to have been developed for use by the IPCC would not be approved for nuclear work. The only comparisons I have seen of the results against real data show the codes diverging. Maybe one of the codes is better than the others, but I would not call it agreement with the data.  

Thanks to Mr. Young for this technical dissection of the IPCC codes. It is about time that someone stepped up and pointed out these problems. I expect that only a few CFD engineers will understand the technical details, but the overall message is clear: they are not fit for purpose, and no policy decisions should be made based on their results.

Reply to  rxc6422
December 5, 2022 12:37 pm

“Multi-model mean (un-weighted): An average of simulations in a multi-model ensemble, treating all models equally. “

Which really says nothing. The mean of an ensemble whose members are all wrong will also be wrong. Two wrongs do *not* make a right.

David Young
December 5, 2022 12:48 pm

I posted this at Climate Etc. as well.

I really don’t want to relitigate Pat’s analysis. I did find this response persuasive. The point that I think is the most important is that a model’s uncertainty is related to the uncertainty in the RATE of response to changes in forcing, not to the uncertainty in the current forcing.

In aeronautical CFD, the angle of attack (the forcing) is very difficult to measure accurately. But turbulence models are tuned to match the rate of change of the response with respect to changes in the forcing. Another method is simply to match the lift, or some other global force, with the angle of attack as a free variable. Using this method, CFD models are in better agreement. This method has proved very successful, and results are really very good for attached subsonic flows.
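A toy numerical sketch of that point (invented numbers, not my actual calibration procedure): if the measured angle of attack carries an unknown constant bias, a calibration based on the absolute lift inherits the bias, while one based on the increment, i.e. the rate of change of lift with angle of attack, cancels it.

```python
# Toy illustration: tuning to the rate of response cancels a constant bias in
# the forcing, while tuning to the absolute level does not. Numbers invented.

true_slope = 6.0        # toy lift-curve slope, per radian
alpha_bias = 0.002      # unknown offset in the measured angle of attack, rad

def measured_lift(alpha_set):
    # the tunnel believes the angle is alpha_set; the flow sees alpha_set + bias
    return true_slope * (alpha_set + alpha_bias)

a1, a2 = 0.02, 0.06     # two nominal angle-of-attack settings

slope_from_level = measured_lift(a1) / a1                                   # biased
slope_from_increment = (measured_lift(a2) - measured_lift(a1)) / (a2 - a1)  # exact

print(slope_from_level, slope_from_increment)   # 6.6 vs 6.0
```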

https://patricktbrown.org/2017/01/25/do-propagation-of-error-calculations-invalidate-climate-model-projections-of-global-warming/

This is all I think I want to say on the subject. I’d rather stick to rigorous math and to papers by colleagues that I know to be truthful and free of bias. This is, I think, a much firmer basis on which to reject climate model ECS. The lack of skill is actually quite well documented in the literature.