The Global Climate Model clique feedback loop

Elevated from a WUWT comment by Dr. Robert G. Brown, Duke University

Frank K. says: You are spot on with your assessment of ECIMs/GCMs. Unfortunately, those who believe in their ability to predict future climate really don’t want to talk about the differential equations, numerical methods or initial/boundary conditions which comprise these codes. That’s where the real problems are…

Well, let’s be careful how you state this. Those who believe in their ability to predict future climate but aren’t in the business don’t want to talk about all of this, and those in the business who aren’t expert in predictive modeling and statistics in general would in many cases prefer not to have a detailed discussion of the difficulty of properly validating a predictive model — a process which basically never ends as new data comes in.

However, most of the GCMs and ECIMs are well, and reasonably publicly, documented. It’s just that unless you have a Ph.D. in (say) physics, a knowledge of general mathematics and statistics and computer science and numerical computing that would suffice to earn you at least a master’s degree in each of those subjects if acquired in the context of an academic program, plus substantial subspecialization knowledge in the general fields of computational fluid dynamics and climate science, you don’t know enough to intelligently comment on the code itself. You can only comment on it as a black box, or comment on one tiny fragment of the code, or physics, or initialization, or methods, or the ODE solvers, or the dynamical engines, or the averaging, or the spatiotemporal resolution, or…

Look, I actually have a Ph.D. in theoretical physics. I’ve completed something like six graduate level math classes (mostly as an undergraduate, but a couple as a physics grad student). I’ve taught (and written a textbook on) graduate level electrodynamics, which is basically a thinly disguised course in elliptic and hyperbolic PDEs. I’ve written a book on large scale cluster computing that people still use when setting up compute clusters, have several gigabytes worth of code in my personal subversion tree, and cannot keep count of how many languages I either know well or have written at least one program in, dating back to code written on paper tape. I’ve co-founded two companies on advanced predictive modelling on the basis of code I’ve written and a process for doing indirect Bayesian inference across privacy or other data boundaries that was for a long time patent pending, before trying to defend a method patent grew too expensive and cumbersome to continue; the second company is still extant and making substantial progress towards perhaps one day making me rich. I did advanced importance-sampling Monte Carlo simulation as my primary research for around 15 years before quitting that as well. I’ve learned a fair bit of climate science. Of the list above, the only area where I lack detailed knowledge and direct experience is computational fluid dynamics (and I understand the concepts there pretty well, but that isn’t the same thing as direct experience), and I still have a hard time working through e.g. the CAM 3.1 documentation, and an even harder time working through the open source code, partly because the code is terribly organized and poorly internally documented, to the point where just getting it to build correctly requires dedication and a week or two of effort.

Oh, and did I mention that I’m also an experienced systems/network programmer and administrator? So I actually understand the underlying tools REQUIRED for it to build pretty well…

If I have a hard time getting to where I can — for example — simply build an openly published code base and run it on a personal multicore system to watch the whole thing actually run through to a conclusion, let alone start to reorganize the code, replace underlying components such as its absurd lat/long gridding on the surface of a sphere with rescalable symmetric tessellations to make the code adaptive, isolate the various contributing physics subsystems so that they can be easily modified or replaced without affecting other parts of the computation, and so on, you can bet that there aren’t but a handful of people worldwide who are going to be able to do this and willing to do this without a paycheck and substantial support. How does one get the paycheck, the support, the access to supercomputing-scale resources to enable the process? By writing grants (and having enough time to do the work, in an environment capable of providing the required support in exchange for indirect cost money at fixed rates, with the implicit support of the department you work for) and getting grant money to do so.

And who controls who, of the tiny handful of people broadly enough competent in the list above to have a good chance of being able to manage the whole project on the basis of their own directly implemented knowledge and skills AND who has the time and indirect support etc, gets funded? Who reviews the grants?

Why, the very people you would be competing with, who all have a number of vested interests in there being an emergency, because without an emergency the US government might fund two or even three distinct efforts to write a functioning climate model, but they’d never fund forty or fifty such efforts. It is in nobody’s best interests in this group to admit outsiders — all of those groups have grad students they need to place, jobs they need to have materialize for the ones that won’t continue in research, and themselves depend on not antagonizing their friends and colleagues. As AR5 directly remarks — of the 36 or so named components of CMIP5, there aren’t anything LIKE 36 independent models — the models, data, methods, code are all variants of a mere handful of “memetic” code lines, split off on precisely the basis of grad student X starting his or her own version of the code they used in school as part of a newly funded program at a new school or institution.

IMO, solving the problem the GCMs are trying to solve is a grand challenge problem in computer science. It isn’t at all surprising that the solutions so far don’t work very well. It would rather be surprising if they did. We don’t even have the data needed to intelligently initialize the models we have got, and those models almost certainly have a completely inadequate spatiotemporal resolution on an insanely stupid, non-rescalable gridding of a sphere. So the programs literally cannot be made to run at a finer resolution without basically rewriting the whole thing, and any such rewrite would only make the problem at the poles worse — quadrature on a spherical surface using a rectilinear lat/long grid is long known to be enormously difficult and to give rise to artifacts and nearly uncontrollable error estimates.
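To see why the pole problem is so nasty, here is a minimal back-of-envelope sketch (illustrative numbers only, not taken from any GCM): on a uniform one-degree lat/long grid the east-west width of a cell shrinks as the cosine of latitude, so the quadrature weights collapse near the poles and the stable time step of any explicit scheme collapses right along with them.

```python
import numpy as np

# Illustrative only: cell width, cell area (the quadrature weight), and the
# advective CFL-limited time step on a uniform 1-degree lat/long grid.
R = 6.371e6                       # Earth radius, m
dlat = dlon = np.deg2rad(1.0)     # 1-degree grid spacing
u = 50.0                          # assumed jet-level wind speed, m/s

for lat_deg in (0.0, 60.0, 89.0):
    lat = np.deg2rad(lat_deg)
    dx_ew = R * np.cos(lat) * dlon            # east-west cell width, m
    area = R**2 * np.cos(lat) * dlat * dlon   # cell area ~ quadrature weight, m^2
    dt_max = dx_ew / u                        # explicit advective CFL limit, s
    print(f"lat {lat_deg:4.0f}: width {dx_ew/1e3:7.1f} km, "
          f"area {area/1e6:9.0f} km^2, max dt {dt_max:7.0f} s")
```

The roughly fifty-fold collapse of the stable time step between the equator and 89 degrees latitude is one reason lat/long codes resort to polar filtering and other special-case handling near the poles.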

But until the people doing “statistics” on the output of the GCMs come to their senses and stop treating each GCM as if it is an independent and identically distributed sample drawn from a distribution of perfectly written GCM codes plus unknown but unbiased internal errors — which is precisely what AR5 does, as is explicitly acknowledged in section 9.2 in two paragraphs hidden neatly in the middle that more or less add up to “all of the ‘confidence’ given the estimates listed at the beginning of chapter 9 is basically human opinion bullshit, not something that can be backed up by any sort of axiomatically correct statistical analysis” — the public will be safely protected from any “dangerous” knowledge of the ongoing failure of the GCMs to actually predict or hindcast anything at all particularly accurately outside of the reference interval.
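A toy sketch of that statistical point (the error magnitudes are assumed purely for illustration; this is not AR5's actual procedure): if the ensemble members share a common error inherited from a handful of ancestral code lines, averaging them does not shrink the uncertainty of the ensemble mean anywhere near the way it would for genuinely independent, identically distributed models.

```python
import numpy as np

# Toy ensemble: each "model" error = shared lineage error + independent error.
# Magnitudes (0.5 C each) are assumptions chosen purely for illustration.
rng = np.random.default_rng(0)
n_trials, n_models = 10_000, 36
shared_sd, indep_sd = 0.5, 0.5

shared = rng.normal(0.0, shared_sd, (n_trials, 1))        # common to all members
indep = rng.normal(0.0, indep_sd, (n_trials, n_models))   # member-specific
mean_err = (shared + indep).mean(axis=1)                  # error of the ensemble mean

print("std of ensemble-mean error, correlated models: %.3f C" % mean_err.std())
print("std if the models really were iid:             %.3f C"
      % (np.hypot(shared_sd, indep_sd) / np.sqrt(n_models)))
```

The correlated ensemble's mean error stays near the size of the shared error, while the iid assumption would claim a six-fold reduction; quoting the latter as "confidence" is exactly the sleight of hand being objected to.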

May 8, 2014 1:33 pm

davidmhoffer:

That said, I agree with richardscourtney. The central takeaway for readers of this thread should be the devastating critiques by RGB.

Me too. And if I were awarding grants I’d look kindly on Brown’s generic software component for ‘stable, rescalable quadrature … over the tessera’ – to allow much more flexible global climate models and all kinds of other goodies. For these highly educational posts haven’t primarily been destructive (of current GCMs) but constructive. One can imagine an immensely powerful open source community in the future with software objects like this available to them. Even if, as Brown, Essex and many of us suspect, we discover through them that we can’t ever model the spatio-temporal chaos of climate much better.

Mark
May 8, 2014 3:11 pm

David A says:
The GCMs are informative! They are extremely informative. They all run wrong, (not very close to the observations) in the SAME direction! (To warm)
This would be less remarkable if there were actually only a very small number of independent GCMs.
Which is something the OP was claiming.
A bit like the way you can have lots of “brands” of a product in a supermarket, but on close examination you find many have things in common.

Mark
May 8, 2014 3:22 pm

beng says:
It takes the most sophisticated models running on supercomputers to “model” a modern aircraft. Those presumably actually work.
Would Boeing, EADS, etc actually build an aircraft purely on the basis of such models or would they still put physical models into wind tunnels at some point?

john robertson
May 8, 2014 4:52 pm

Another fine comment.
Thank you Robert Brown.
So climatology by computer model is a faith based racket.
The shaman and High Priest types who fell from civic dominance with the Reformation and acceptance of the scientific method are back.
And still lusting to lord it over all.
The climate models are a modern substitute for the incantation and gobbledygook of the dominant state religion.
A nastier state religion would be hard to imagine.
Attacks productive citizens.
Rewards parasitic activity.
Attacks the foundations of civil society.
Seeks active destruction of poor brown persons.
Robs the many to reward the criminal few.
Sure to end well?
These schemes and activities have always been social intelligence tests.
But it can be a mistake to pass the test if the fools and bandits have the guns.
I keep running into this thought: is it a form of insanity to need to feel yourself morally superior to other people and insist you alone are fit to rule?
Of the nature of: “No, I am Napoleon”?
Is this the nature of the persons who gravitate toward the UN?

Nick Stokes
May 8, 2014 5:14 pm

Mark says:May 8, 2014 at 3:22 pm
“Would Boeing, EADS, etc actually build an aircraft purely on the basis of such models or would they still put physical models into wind tunnels at some point?”

Both, but CFD, which solves the same Navier-Stokes equations that Essex says are impossible, is a big part. Here is an interesting Boeing presentation on CFD and the 787.

May 8, 2014 5:26 pm

I think Frank K has been correct in his comments responding to others who attempted to explain various aspects of the numerical solution methods applied to the discrete approximations to some of the PDEs used in GCMs.
Generally, an initial analysis objective is to determine the type of the PDEs: parabolic, hyperbolic, or elliptic. This analysis is frequently conducted based on the quasi-linear approximations of the PDEs. The type of equation is determined by the roots of the characteristic equation that is given by the determinant of the Jacobian of the equation system. The characteristics determine the locations and content of the information that must be specified at the boundaries of the solution domain of the independent variables.
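For a single second-order quasi-linear PDE in two independent variables, A u_xx + B u_xy + C u_yy + (lower-order terms) = 0, the classification reduces to the textbook discriminant rule; the systems case works through the eigenvalues of the Jacobian, as described above. A minimal sketch of the scalar rule:

```python
def classify_second_order_pde(A, B, C):
    """Classify A*u_xx + B*u_xy + C*u_yy + (lower-order terms) = 0
    by the sign of the discriminant B**2 - 4*A*C."""
    disc = B * B - 4.0 * A * C
    if disc > 0:
        return "hyperbolic"   # two real characteristic families (wave equation)
    if disc == 0:
        return "parabolic"    # one repeated characteristic (heat equation)
    return "elliptic"         # no real characteristics (Laplace equation)

print(classify_second_order_pde(1.0, 0.0, -1.0))  # u_tt - u_xx = 0 -> hyperbolic
print(classify_second_order_pde(0.0, 0.0, 1.0))   # u_t = u_xx (no u_tt term) -> parabolic
print(classify_second_order_pde(1.0, 0.0, 1.0))   # u_xx + u_yy = 0 -> elliptic
```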
Stability properties of the finite-difference approximations to the PDEs are generally based on completely linearized forms of the equation system. Additionally, the analyses generally require that the initial state of the equations be a uniform state with an absence of all temporal and spatial gradients. The stability properties depend on the time level at which each of the terms in the approximations is evaluated. Stability properties are usually summarized in a relationship between the temporal and spatial increments plus geometric descriptions of the discrete grid.
Numerical solution methods can be unconditionally unstable, conditionally stable, or unconditionally stable. The last of these usually arises when ‘fully implicit’ solution methods are used, and is generally interpreted to mean that large discrete temporal increments can be used in applications.
One of the more familiar limitations for conditionally stable methods is the Courant-Friedrichs-Lewy (CFL) condition, which relates to the time required for a signal to traverse a single spatial increment. The condition is usually encountered in the case of numerical solution of hyperbolic PDEs. The signal can be either the pressure or the transport of a convected dependent variable.
If the pressure is treated in an explicit manner, i.e. evaluated at the previous time level, the CFL condition leads to the necessity of using small discrete temporal increments, because the speed of sound is large.
The transport of temperature or energy, when treated explicitly, leads to conditional stability relating to the time required for the bulk motion to traverse a discrete grid increment.
Implicit handling of the pressure and transported quantity will eliminate the conditional stability and lead to unconditional stability.
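A toy numerical experiment (illustrative only, no relation to any GCM code) showing the conditional/unconditional distinction for the 1-D diffusion equation u_t = nu*u_xx: forward (explicit) Euler is stable only when r = nu*dt/dx^2 <= 1/2, while backward (implicit) Euler stays bounded for any r.

```python
import numpy as np

n = 64
j = np.arange(n)
u0 = np.sin(2 * np.pi * j / n) + 0.1 * (-1.0) ** j   # smooth mode + grid-scale wiggle

I = np.eye(n)
L = np.roll(I, 1, axis=1) + np.roll(I, -1, axis=1) - 2 * I   # periodic second difference

for r in (0.4, 0.6):                       # just below / just above the explicit limit
    u_exp, u_imp = u0.copy(), u0.copy()
    B = np.linalg.inv(I - r * L)           # backward-Euler update matrix
    for _ in range(200):
        u_exp = u_exp + r * (L @ u_exp)    # forward Euler: u^{n+1} = (I + rL) u^n
        u_imp = B @ u_imp                  # backward Euler: (I - rL) u^{n+1} = u^n
    print(f"r = {r}: explicit max|u| = {np.abs(u_exp).max():10.3g}, "
          f"implicit max|u| = {np.abs(u_imp).max():10.3g}")
```

At r = 0.4 both schemes decay toward zero; at r = 0.6 the explicit scheme's grid-scale mode is amplified every step and the solution blows up, while the implicit scheme remains smooth. That is the content of the conditional vs unconditional stability statements above.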
It is very important to note the following:
1. Stability is only one aspect of numerical solutions of the systems of algebraic equations that arise from discrete approximations to PDEs. The discrete approximations are also required to be consistent with the PDEs. Stability and consistency together give convergence (the Lax equivalence theorem).
2. Stability analyses are also usually conducted with linearized forms of the algebraic approximations. Additionally, as in the case of characteristics analyses, a uniform initial state for the complete system is assumed. As Frank K noted above, the model equations used in GCMs are partial differential equations containing algebraic expressions for mass, momentum, and energy exchanges at the boundaries of the constituents. These exchanges must be assumed to be zero in order for uniform initial states to be obtained.
Thus the algebraic coupling terms do not enter the stability analyses. The time constants for some of these terms can easily be more restrictive than the CFL criterion, or other stability criteria associated with the numerical approach. Generally, all possible initial states that could be realized when the interaction terms are included cannot be covered in stability analyses.
In general, all aspects of all the equations in the total model must be investigated for stability requirements.
3. The model equations used in GCMs are generally not hyperbolic; second-order diffusion terms in the momentum balances, for example, lead to parabolic PDEs for the transient case, and elliptic systems for the steady-state case. Additionally, some momentum model equations have been modified to include fourth-order derivatives beyond the second-order physical diffusion terms.
4. The CFL criterion is not a ‘guideline’. It is instead a hard and fast limit. If a non-linear analysis is conducted, the effects of the terms not completely handled in the linear analyses can be determined. The same holds for the algebraic coupling terms, although that analysis is especially messy.
5. Characteristics and stability analyses are math problems. One does not need to have actually worked with CFD codes to understand the ramifications of the analyses.
In summary, throwing out simplistic concepts which are very likely to be completely unrelated to any actual GCM does not lead to understanding of the numerical solution aspects of the equation systems used in GCMs.

May 8, 2014 7:28 pm

Dr. Page:
“The whole UNFCCC travelling circus has no empirical basis for its operations and indeed for its existence, depending as it does on the predictions of the inherently useless climate models. The climate is much too complex to model but can be predicted by simply knowing where we are in the natural quasi-cycles.”
Of course the UNFCCC has no empirical basis for its operations. It was established as a political entity from the get go, with political aims, and a totally political organization. It was sold in the UN as a method of extracting taxes from more developed countries to be paid as damages to less developed countries. Human caused climate change was a basic assumption. Global warming with negative effects was simply the first iteration. It also assumed that all other weather, climate, and environmental changes were human caused with negative consequences, and that they were the fault of the more developed countries.
No number of scientific arguments, models, or improvements will have any effect. It takes political operations to change a political organization. We are starting to see some of that now that the UNFCCC’s and IPCC’s failings at useful analysis of the climate and useful prediction, combined with astronomical recommended remediation costs that won’t remediate anything, are becoming more obvious. Even just a few more years of cooling will very likely put an end to global warming and hopefully climate change as useful political tools.

May 8, 2014 8:00 pm

Calculation limits, just a general observation- the Boeing presentation is a slick marketing tool, but what the heck is a “Boise Sled” on the landing gear? Maybe for landing during a snowstorm in Boise, Idaho?
A good friend of mine has made a career of using various CFD programs in aviation. His bottom line observations are that they can be very useful in designing the wind tunnel models used for validation, and all Boeing’s claims, despite the blooper, are true. Current CFD is more than adequate for significantly improving aerodynamics. But he also pointed out that they all have limits where they simply fall apart, and figuring out that the CFD has failed, what exactly is causing the failure, and whether it means anything for what you are trying to do is not always a simple matter. He gave an example where a friend of his lost his life in a plane crash caused by pushing the limits of laminar flow on a critical airfoil. CFD couldn’t give an accurate prediction of the effects of off center gusts on the airflow. A gust well within general aviation limits caused one wing to stall abruptly and flip the plane into the ground. This was, as usual, a cascade of failures that could have been avoided by more careful procedures.
You’ll notice that Boeing is probably using the most advanced software available, but they still include two stages of wind tunnel testing to verify the results, and some features undoubtedly undergo even more WT tests. Verification is almost totally absent from climate modelling. I wonder if any of the climate modelers have rigorously tested the limits of the models, and what the consequences would be of discovering major failures in the models long after the unaffordable remediations required by their forecasts have been started?

May 8, 2014 9:14 pm

Dan Hughes says: May 8, 2014 at 5:26 pm
” The model equations used in GCMs are generally not hyperbolic”

As I demonstrated to Frank K above, the Navier-Stokes equations have the acoustic wave equation embedded within them. All sound satisfies the Navier-Stokes equations (it must), and sound waves are always a possible solution (and you get them).
“The CFL criterion is not a ‘guideline’. It is instead a hard and fast limit.”
It is the limit, and I’m very familiar with what happens when you get close. Waves of the maximum frequency that the grid can represent (Nyquist) start to switch to a growing mode. Checkerboard instabilities etc. You can do things to prevent that growth, so in that sense the limit is not hard. However, as you go beyond, there are more and more possible modes, so it is an effective limit.
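A toy run (illustrative only) of what that looks like in practice, using first-order upwind advection on a periodic grid with a deliberately seeded grid-scale perturbation: just below the CFL limit the perturbation dies; just above it, it flips sign each step and grows without bound.

```python
import numpy as np

# u_t + a*u_x = 0, first-order upwind, periodic grid; Courant number C = a*dt/dx.
n = 100
j = np.arange(n)
u0 = np.exp(-0.5 * ((j - n / 2) / 5.0) ** 2) + 1e-6 * (-1.0) ** j  # pulse + tiny checkerboard

for C in (0.9, 1.1):                        # just below / just above the CFL limit
    u = u0.copy()
    for _ in range(200):
        u = u - C * (u - np.roll(u, 1))     # upwind update
    checker = abs(np.sum(u * (-1.0) ** j)) / n   # amplitude of the Nyquist mode
    print(f"C = {C}: max|u| = {np.abs(u).max():10.3g}, "
          f"Nyquist-mode amplitude = {checker:10.3g}")
```

For the Nyquist mode the upwind amplification factor is 1 - 2C, so its magnitude crosses 1 exactly at C = 1; at C = 1.1 the seeded 1e-6 checkerboard grows to dominate the solution within a couple hundred steps.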
“One does not need to have actually worked with CFD codes to understand the ramifications of the analyses.”
It helps. At least you don’t forget that with all the theoretical handwringing, they do actually work, and the results are used in high stakes applications (not just GCMs).

Martin A
May 9, 2014 12:59 am

Are computer models reliable?
Yes. Computer models are an essential tool in understanding how the climate will respond to changes in greenhouse gas concentrations, and other external effects, such as solar output and volcanoes.
Computer models are the only reliable way to predict changes in climate. Their reliability is tested by seeing if they are able to reproduce the past climate, which gives scientists confidence that they can also predict the future.
But computer models cannot predict the future exactly. They depend, for example, on assumptions made about the levels of future greenhouse gas emissions.

UK Met Office publication “Warming A guide to climate change”, 2011

tty
May 9, 2014 3:06 am

Nick Stokes says:
“It isn’t clear what caused the early 20C warming”
So what is the probability that this “X” factor caused the late 20C warming as well?

Nick Stokes
May 9, 2014 3:58 am

tty says: May 9, 2014 at 3:06 am
“So what is the probability that this “X” factor caused the late 20C warming as well?”

I simply said it isn’t clear. GHGs did rise significantly.
There’s every reason to expect the Earth to warm in response to forcing. But other things happen too, as they do in model runs. And there’s no reason to expect model runs to synchronise with those unforced changes on Earth. Or with each other (unforced), for that matter.

E.M.Smith
Editor
May 9, 2014 11:07 am

Very well put.
I got GISTemp to run. Royal PITA. It’s a hodgepodge of code written over several decades by different hands in a couple of different computer languages, with no source code control system and no versioning. It looks like Topsy. “It just growed”…
In just one example, I found an F to C conversion done the “simple but wrong” way that introduced a 1/10 C warming in about 1/10th of the records. Based on just ONE obvious line of code in just ONE program of the set. How many more such? Who knows. Subtle faults of the same order are very hard to spot, and near as I can tell nobody but me even bothered to look.
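A hypothetical illustration of that class of error (a reconstruction of the kind of mistake, not the actual GISTemp line): if whole-degree Fahrenheit readings are converted to tenth-degree Celsius by truncating instead of rounding, a large fraction of the records shift by a full 0.1 C and the data set picks up a systematic offset. Whether the shift comes out warm or cold depends on exactly where in the pipeline the shortcut sits.

```python
import numpy as np

# Hypothetical "simple but wrong" conversion: truncate at tenths of a degree C
# instead of rounding, applied to whole-degree F station readings.
f_whole = np.arange(33, 100)                 # above-freezing whole-degree F readings
c_exact = (f_whole - 32) * 5.0 / 9.0

c_careful = np.round(c_exact, 1)             # convert, then round to the nearest tenth
c_sloppy = np.trunc(c_exact * 10) / 10       # convert, then truncate at the tenths digit

diff = c_sloppy - c_careful
print("mean offset introduced: %+.3f C" % diff.mean())
print("fraction of readings shifted by a full 0.1 C: %.2f"
      % np.mean(np.abs(diff) > 0.049))
```

Here roughly four readings in nine land on a fractional tenth that rounds up but truncates down, so about 45% of the converted values are shifted by 0.1 C and the whole record is offset by several hundredths of a degree, purely from one line of conversion code.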
Oh, and they regularly ignore things like the simple fact that an average of temperatures is NOT a temperature. (NO, they do not convert to anomalies first, then do all the math. The temperatures are carried AS temperatures until the very last step, then converted to grid-cell anomalies…)
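And a toy demonstration (two made-up stations, for illustration only) of why that ordering matters: with a changing station mix, averaging raw temperatures manufactures a trend that averaging anomalies does not.

```python
import numpy as np

years = np.arange(1950, 2011)
warm = np.full(years.shape, 15.0)        # e.g. a valley station, deg C, zero trend
cold = np.full(years.shape, 5.0)         # e.g. a mountain station, zero trend
cold_reports = years < 1980              # the cold station stops reporting in 1980

# "Average of temperatures": mean of whichever stations report each year
raw_mean = np.where(cold_reports, (warm + cold) / 2.0, warm)

# Anomaly method: subtract each station's own 1950-1979 baseline, then average
base = years < 1980
warm_a, cold_a = warm - warm[base].mean(), cold - cold[base].mean()
anom_mean = np.where(cold_reports, (warm_a + cold_a) / 2.0, warm_a)

print("raw-mean change 1950->2010:     %+.1f C  (spurious)" % (raw_mean[-1] - raw_mean[0]))
print("anomaly-mean change 1950->2010: %+.1f C" % (anom_mean[-1] - anom_mean[0]))
```

Neither station warmed at all, yet the raw average jumps 5 C when the cold station drops out; the anomaly average correctly reports zero change. Carrying temperatures as temperatures until the last step leaves the result exposed to exactly this kind of artifact.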
The whole thing is just a computer fantasy run wild, IMHO. Dancing in the error bands of the data and doing it badly. That’s my opinion as someone who has been a professional programmer / DBA / sysadmin / computer project manager / Director of IT etc. for about 36 years.
I’ve also looked at the GCMs code. Somewhat better, but substantially as described above. Not gotten one to run yet. It’s on my “someday list”… Maybe when I get a grant /sarc?

Dagfinn
May 9, 2014 11:21 am

Martin A says:
May 7, 2014 at 2:13 pm
“Climate models have a sound physical basis and mature, domain-specific software development processes”
(“Engineering the Software for Understanding Climate Change”, Steve M. Easterbrook and Timothy C. Johns)
———————–
E.M.Smith says:
May 9, 2014 at 11:07 am
Very well put.
I got GISTemp to run. Royal PITA. It’s a hodgepodge of code written over several decades by different hands in a couple of different computer languages, with no source code control system and no versioning. It looks like Topsy. “It just growed”…
——————-
The Easterbrook and Johns paper has one occurrence of the word quality: “Overall code quality is hard to assess.”
http://www.cs.toronto.edu/~sme/papers/2008/Easterbrook-Johns-2008.pdf

May 9, 2014 1:41 pm

As I demonstrated to Frank K above, the Navier Stokes equations have the acoustic wave equation embedded within. All sound satisfies the Navier-Stokes equations (it must), and sound waves are always a possible solution (and you get them).
The important issues are not associated with the essentially un-countable number of fluid flows that are captured by the continuous formulation of the Navier-Stokes equations. That would be all the flows that meet the continuum requirement, for all the fluids that obey the linear rate-of-strain / stress model. There is absolutely nothing useful in pointing out just a single example of these flows. Frank K is well aware of this aspect of these equations.
The critically important issues are those associated with (1) the modifications and limitations of the continuous formulation of the model equation systems used in GCMs ( generally the fluid-flow model equations are not the Navier-Stokes equations ), and this applies to all the equations used in the GCM, (2) the exact transformation of all the continuous equation formulations into discrete approximations, (3) the critically important properties and characteristics of the numerical solution methods used to solve the discrete approximations, (4) the limitations introduced at run time for each type of application and the effects of these on the response functions of interest for the application, and (5) the expertise and experience of users of the GCMs for each application area.
Discussions of specific aspects of the above issues cannot be usefully conducted until the details are known and the issues at hand identified in the code documentation. All else is hand waving.
The CFD situation is in no way an analogy for the GCM situation. Many CFD applications, those that do not resolve the fundamental scales of the un-altered Navier-Stokes equations, involve a few parameters. Additionally, some of these parameters have a basis in fundamental aspects of fluid flows. The GCMs, on the other hand, involve a multitude of parameters for phenomena and processes that occur at temporal and spatial scales that are orders of magnitude smaller than the scale resolved with the discrete numerical-methods grid. It is the parameterizations that carry the heavy load of the degree of fidelity between the model results and the real-world physical phenomena and processes. The parameterizations are descriptions of states that the materials of interest have previously attained. They are not properties of the materials.
Generally, CFD applications that potentially involve the health and safety of the public are subjected to a degree of Verification, Validation, and Uncertainty Quantification that the GCMs can only dream about. Decades of detailed experimental testing and associated model/code/calculational-procedure Validation for each response function of interest continues to be carried out for such applications.
There is no correspondence to the GCM case at all. Absolutely. None. Whatsoever.
It is getting to be a joke whenever some fundamental approaches to descriptions of material behaviors are invoked as analogies to the status of Climate Science. That reminds me of the good ol’ days when “realizations” using GCMs were equated to actual realizations of the Navier-Stokes equations to investigate the basic nature of turbulence. And the times that statistical mechanics is similarly invoked.
There is a singular, critically important difference between those proven descriptions of materials and the status of Climate Science. The proven fundamental laws will not ever, as in never, incorporate descriptions of states that the materials have previously attained. Never. Instead, the proven fundamental laws will always solely contain descriptions of properties of the materials.
The GCMs are based on approximate models of some parts of some fundamental equations, plus a multitude of empirical descriptions of states that the materials in the system have previously attained. Even some of the approximate models will contain parameters that represent previous states of the materials, and are not material properties. Many of the empirical descriptions are somewhat, or completely, ad hoc ( for this case only ). The multitude of parameterizations do all the heavy lifting relative to the fidelity of the results of the model to physical reality.
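As a schematic contrast (illustrative values only, not code from any particular GCM): the specific heat of dry air below is a measured property of the material, while the Sundqvist-style bulk cloud-fraction formula is an empirical process description whose critical relative humidity is a tuning knob fitted to previously observed states.

```python
import numpy as np

CP_DRY_AIR = 1004.0   # J/(kg K): a property of the material itself

def cloud_fraction(rh, rh_crit=0.8):
    """Grid-box cloud fraction as a function of relative humidity.
    rh_crit is not a property of air; it is tuned so that model output
    matches previously observed states, and different models pick
    different values (0.8 here is an assumed, illustrative choice)."""
    rh = np.clip(rh, 0.0, 1.0)
    frac = 1.0 - np.sqrt(np.clip((1.0 - rh) / (1.0 - rh_crit), 0.0, 1.0))
    return np.where(rh > rh_crit, frac, 0.0)

print(cloud_fraction(np.array([0.5, 0.85, 0.95, 1.0])))  # approx. [0, 0.13, 0.5, 1.0]
```

No laboratory measurement of air will ever hand you rh_crit; it exists only because the grid box is far too coarse to resolve the clouds themselves, which is exactly the distinction being drawn above.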
A “realization” by a GCM is a “realization” of the processes captured by the descriptions of the previous states. Such “realizations” are not in any way actual realizations of the materials that make up the Earth’s climate systems.
The distinctions between descriptions based on material properties and empirical estimates of previous states (the latter are characterized as ‘process models’) are so fundamental that they must always be kept in mind. Climate Science seems to completely ignore these distinctions and continues to invoke false analogies.

David Young
May 14, 2014 11:00 pm

Nick Stokes brings up the problem of sound waves for Navier-Stokes solvers. This is indeed a problem, since the speed of sound is orders of magnitude greater than the speeds of the weather-scale motions that are of interest. Indeed, I recall when I was in graduate school a seminar at NCAR by Gerry Browning on the method he and Kreiss devised to filter out sound waves from weather models to allow much larger time steps. So for Nick, sound waves are in fact an unwanted feature of the Navier-Stokes equations that must be filtered out for effective weather simulations.
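Rough arithmetic (assumed grid spacings and speeds, purely for illustration) shows why:

```python
# Explicit time-step limits dt <= (grid spacing) / (signal speed)
dx = 100e3        # assumed horizontal grid spacing, m (~1 degree)
dz = 500.0        # assumed vertical grid spacing near the surface, m
c_sound = 340.0   # speed of sound, m/s
u_wind = 50.0     # strong jet-level wind, m/s

print("horizontal acoustic limit, dx/c: %6.0f s" % (dx / c_sound))
print("vertical acoustic limit,   dz/c: %6.1f s" % (dz / c_sound))
print("advective limit,           dx/u: %6.0f s" % (dx / u_wind))
```

With sound waves left in and treated explicitly, the vertical acoustic limit of a second or two controls the whole calculation; filter the sound waves out (or treat them implicitly) and the advective limit of half an hour or so is what remains, which is the point of the filtering Browning described.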
I also think the idea that “weather models work, therefore your criticism of climate models is wrong” is just silly. Weather models just barely “work,” as Browning and others have revealed. Numerically, Navier-Stokes simulations are very difficult and subject to all the problems associated with nonlinear systems.
The real question here is why one would expect a weather model run on a very coarse grid with a huge time step to yield anything meaningful whatsoever, given that it is terrible as a weather model even in the short term. As climate of doom asserted in another venue, the answer climate modelers give is “every time I run the model, I get a reasonable climate.” That is just colorful fluid dynamics and not a scientific argument.

David Young
May 15, 2014 6:11 pm

Nick Stokes references a presentation on CFD and the 787. Be very careful here. Boeing people can also be CFD salesmen, and that is what we have here. It’s an internal advocate. More accurate is an excellent paper in the Journal of Math in Industry (Dec 2012, I believe) by two Airbus specialists. CFD is postdictive and only occasionally predictive, especially outside the vast range of past testing.

May 16, 2014 5:50 am

Use of CFD as an analog for GCMs is not valid so long as all the issues discussed in the citations listed below have not been addressed for each code model, numerical solution method, code software, application procedure, system response function, and user.
Additionally, GCMs as process models must also address the issues.
There is now a very large, robust, and directly applicable literature that has been accepted by many organizations that develop engineering and scientific models, methods, and software for a multitude of diverse applications. The Climate Science community remains the singular exception. It is now an undeniable fact that the Climate Science community continues to avoid even the mention of these matters, preferring instead hand-waving dismissal of the existence of the matters and those who dare to mention them.
Fundamentals of Verification and Validation by Patrick J. Roache
Verification and Validation in Computational Science and Engineering by Patrick J. Roache
Verification and Validation in Scientific Computing by William L. Oberkampf and Christopher J. Roy
Additional references are listed here

David Young
May 17, 2014 10:24 am

Dan Hughes, I basically agree with you. Navier-Stokes is infinitely easy compared to climate. Even NS, though, has serious issues which are often swept under the rug.
