Elevated from a WUWT comment by Dr. Robert G. Brown, Duke University
Frank K. says: You are spot on with your assessment of ECIMs/GCMs. Unfortunately, those who believe in their ability to predict future climate really don't want to talk about the differential equations, numerical methods, or initial/boundary conditions that make up these codes. That's where the real problems are…
Well, let's be careful how you state this. Those outside the business who believe in their ability to predict future climate don't want to talk about any of this, and those inside the business who aren't expert in predictive modeling and statistics would in many cases prefer not to have a detailed discussion of the difficulty of properly validating a predictive model — a process which basically never ends as new data come in.
However, most of the GCMs and ECIMs are well, and reasonably publicly, documented. It's just that unless you have a Ph.D. in (say) physics, a knowledge of general mathematics and statistics and computer science and numerical computing that would suffice to earn you at least a master's degree in each of those subjects if acquired in the context of an academic program, plus substantial subspecialization knowledge in the general fields of computational fluid dynamics and climate science, you don't know enough to intelligently comment on the code itself. You can only comment on it as a black box, or comment on one tiny fragment of the code, or physics, or initialization, or methods, or the ode solvers, or the dynamical engines, or the averaging, or the spatiotemporal resolution, or…
Look, I actually have a Ph.D. in theoretical physics. I've completed something like six graduate-level math classes (mostly as an undergraduate, but a couple as a physics grad student). I've taught (and written a textbook on) graduate-level electrodynamics, which is basically a thinly disguised course in elliptic and hyperbolic PDEs. I've written a book on large-scale cluster computing that people still use when setting up compute clusters, I have several gigabytes' worth of code in my personal subversion tree, and I cannot keep count of how many languages I either know well or have written at least one program in, dating back to code written on paper tape. I've co-founded two companies on advanced predictive modeling on the basis of code I've written and a process for doing indirect Bayesian inference across privacy or other data boundaries that was for a long time patent pending, before defending a method patent grew too expensive and cumbersome to continue; the second company is still extant and making substantial progress towards perhaps one day making me rich. I did advanced importance-sampling Monte Carlo simulation as my primary research for around 15 years before quitting that as well. I've learned a fair bit of climate science. Of the list above, I basically lack detailed knowledge and experience only in computational fluid dynamics (and I understand the concepts there pretty well, but that isn't the same thing as direct experience), and I still have a hard time working through e.g. the CAM 3.1 documentation, and an even harder time working through the open source code, partly because the code is terribly organized and poorly internally documented, to the point where just getting it to build correctly requires dedication and a week or two of effort.
Oh, and did I mention that I’m also an experienced systems/network programmer and administrator? So I actually understand the underlying tools REQUIRED for it to build pretty well…
If I have a hard time getting to where I can — for example — simply build an openly published code base and run it on a personal multicore system to watch the whole thing actually run through to a conclusion, let alone start to reorganize the code, replace underlying components such as its absurd lat/long gridding on the surface of a sphere with rescalable symmetric tessellations to make the code adaptive, or isolate the various contributing physics subsystems so that they can be easily modified or replaced without affecting other parts of the computation, then you can bet that there aren't but a handful of people worldwide who are going to be able to do this, and willing to do this, without a paycheck and substantial support. How does one get the paycheck, the support, the access to supercomputing-scale resources to enable the process? By writing grants (and having enough time to do the work, in an environment capable of providing the required support in exchange for indirect-cost money at fixed rates, with the implicit support of the department you work for) and getting grant money to do so.
And of the tiny handful of people broadly enough competent in the list above to have a good chance of being able to manage the whole project on the basis of their own directly implemented knowledge and skills, AND who have the time and indirect support and so on: who controls who gets funded? Who reviews the grants?
Why, the very people you would be competing with, who all have a number of vested interests in there being an emergency, because without an emergency the US government might fund two or even three distinct efforts to write a functioning climate model, but it would never fund forty or fifty such efforts. It is in nobody's best interest in this group to admit outsiders — all of these groups have grad students they need to place, jobs they need to have materialize for the ones that won't continue in research, and they themselves depend on not antagonizing their friends and colleagues. As AR5 directly remarks, of the 36 or so named components of CMIP5, there aren't anything LIKE 36 independent models — the models, data, methods, and code are all variants of a mere handful of "memetic" code lines, split off on precisely the basis of grad student X starting his or her own version of the code they used in school as part of a newly funded program at a new school or institution.
IMO, solving the problem the GCMs are trying to solve is a grand challenge problem in computer science. It isn’t at all surprising that the solutions so far don’t work very well. It would rather be surprising if they did. We don’t even have the data needed to intelligently initialize the models we have got, and those models almost certainly have a completely inadequate spatiotemporal resolution on an insanely stupid, non-rescalable gridding of a sphere. So the programs literally cannot be made to run at a finer resolution without basically rewriting the whole thing, and any such rewrite would only make the problem at the poles worse — quadrature on a spherical surface using a rectilinear lat/long grid is long known to be enormously difficult and to give rise to artifacts and nearly uncontrollable error estimates.
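To make the gridding complaint concrete, here is a minimal numerical sketch (assuming a uniform 1-degree lat/long grid; nothing here is taken from any GCM's actual code). Cell areas on such a grid scale with cos(latitude), so cells collapse toward the poles; that collapse is exactly what makes quadrature weights and the stable time step so hard to control there:

```python
import numpy as np

# Toy illustration: cell areas on a uniform 1-degree lat/long grid of
# the unit sphere. Area per cell is proportional to cos(latitude), so
# cells shrink drastically toward the poles -- the root of the "pole
# problem" and of the collapsing CFL time step there.
dlat = dlon = np.deg2rad(1.0)
lat_centers = np.deg2rad(np.arange(-89.5, 90.0, 1.0))
cell_area = np.cos(lat_centers) * dlat * dlon   # steradians per cell

print(f"equatorial cell area: {cell_area.max():.3e} sr")
print(f"polar cell area:      {cell_area.min():.3e} sr")
print(f"equator/pole ratio:   {cell_area.max() / cell_area.min():.0f}x")
# Sanity check: summing over all 180 x 360 cells recovers 4*pi.
print(f"sum over all cells:   {cell_area.sum() * 360:.4f}  (4*pi = {4 * np.pi:.4f})")
```

Even at this coarse resolution the equator-to-pole cell-area ratio is over a hundred to one, which is why rescaling such a grid to finer resolution only makes the polar artifacts worse.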
But until the people doing "statistics" on the output of the GCMs come to their senses and stop treating each GCM as if it were an independent and identically distributed sample drawn from a distribution of perfectly written GCM codes plus unknown but unbiased internal errors — which is precisely what AR5 does, as is explicitly acknowledged in section 9.2 in just two paragraphs hidden neatly in the middle that more or less add up to "all of the 'confidence' given the estimates listed at the beginning of chapter 9 is basically human opinion bullshit, not something that can be backed up by any sort of axiomatically correct statistical analysis" — the public will be safely protected from any "dangerous" knowledge of the ongoing failure of the GCMs to actually predict or hindcast anything at all particularly accurately outside of the reference interval.
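To see why the iid assumption matters, here is a toy sketch; every number in it is invented for illustration, not taken from AR5 or CMIP5. If 36 "models" really descend from a handful of shared code lineages, the naive iid standard error of the ensemble mean badly understates the true run-to-run spread:

```python
import numpy as np

# Toy sketch: 36 "models" drawn from 5 code lineages, each lineage
# carrying its own shared systematic error. We compare the naive iid
# standard error of the ensemble mean with its actual spread across
# many repeated realizations of the whole ensemble.
rng = np.random.default_rng(42)
n_models, n_lineages, n_trials = 36, 5, 10_000

means, naive_ses = [], []
for _ in range(n_trials):
    lineage_bias = rng.normal(0.0, 0.5, n_lineages)   # shared per-lineage error
    lineage = rng.integers(0, n_lineages, n_models)   # assign each model a lineage
    output = lineage_bias[lineage] + rng.normal(0.0, 0.1, n_models)
    means.append(output.mean())
    naive_ses.append(output.std(ddof=1) / np.sqrt(n_models))

print(f"typical naive iid standard error of the mean: {np.mean(naive_ses):.3f}")
print(f"actual run-to-run spread of ensemble means:   {np.std(means):.3f}")
```

On these invented numbers the naive error bar comes out several times too small: 36 correlated models contribute far fewer than 36 independent samples.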
Here’s a relevant quotation:
Dr. Brown
This post I did with respect to modeling might be useful for this discussion
What Are Climate Models? What Do They Do?
http://pielkeclimatesci.wordpress.com/2005/07/15/what-are-climate-models-what-do-they-do/
On the serious flaws with respect to these models when run for multi-decadal climate predictions (projections) even for current climate [in hindcasts where they can actually be tested against real world data], see, for example, the Preface http://pielkeclimatesci.files.wordpress.com/2013/05/b-18preface.pdf
to
Pielke Sr., R.A., Editor-in-Chief, 2013: Climate Vulnerability: Understanding and Addressing Threats to Essential Resources, 1st Edition. J. Adegoke, F. Hossain, G. Kallos, D. Niyogi, T. Seastedt, K. Suding, C. Wright, Eds., Academic Press, 1570 pp.
Best Regards
Roger Sr.
Dr Brown
Isn’t it more likely that climate models are predicting what climate modelers BELIEVE the future climate will be, rather than predicting future climate?
Here is my reasoning: since we don't know the future climate, it is not possible to validate the models, except against what we believe the future will be. Given two models that hindcast with approximately the same level of accuracy, the model builder will subconsciously select the model that forecasts closer to their own belief, believing this model to be "more correct".
As every non-trivial piece of code has "bugs", this process will over time select for code in which the errors skew the results in the direction of the developers' beliefs as to what the correct answer is. In the case of a community of developers, the model will also be selected based on the community's belief system, especially where grant money is involved; models that do not conform will not get funding and will be eliminated.
In effect, climate models are undergoing "survival of the fittest", in which the fittest model is the one that the climate community believes is delivering the "best" answer, regardless of how accurately it is able to actually predict the future. Over time, the "best" answer is the answer that attracts the most funding, not the answer that delivers the most accuracy.
Oops, thanks Ken Hall for the ECIM definition.
And Dr. Brown, this is one of the best reads on WUWT that I have seen. Even though I had only one year of college physics, I do get the gist of it. Great post…
Many thanks to Dr. Robert G. Brown for these insights into the GCMs. For me, the final analysis is in the differing results of the various models. It seems a truism that the model does what the modeler wants it to do.
@Nick Stokes
“Basically, sound waves are how dynamic pressure is transmitted”
Could you please answer my question about this statement? I would like to know where I can find out more information about how dynamic pressure is transmitted by sound waves. Thanks.
jaffa – “space invaders” – LMAO!
So the code is poorly documented, the equations suspect, and it apparently lacks modular design for testing. Is this an example of featherbedding or the need to assign the coding to an independent group?
“all of the `confidence’ given the estimates listed at the beginning of chapter 9 is basically human opinion bullshit, not something that can be backed up by any sort of axiomatically correct statistical analysis”
If memory serves, Lord May refers to the scientific opinions as consensus regarding educated guesses.
Robert G. Brown,
A stimulating read. Thank you.
I think the modelers will just shrug their shoulders at their models' non-real behavior and exclaim with non-scientific self-righteousness something like => Hey, our climate models are a work in progress, and to be on the safe side for humanity (and our grandchildren) we purposely made them dangerously warm until (sometime in the far future) we get our models working right.
I see no other position they can take, except that they warmed the models out of the precautionary principle. They aren't doing science.
The question is: is what they did lying?
John
I too am a physicist with considerable experience collecting data to initialize and validate models. It occurs to me that one of the biggest challenges for saving the planet from this warming trend is that organizations funded by hydrocarbon billionaires are confusing the public by spreading disinformation on climate change. I think the solution is obvious. Now that 99.999 percent of the scientists agree, our climate model-builders should turn their massive skills to predicting financial markets. We should be able to quickly amass huge amounts of cash to spend on lobbyists, political campaigns, and public education. Then we can buy up all the carbon-capture patents and keep that technology secret, and go on to destroy the evil coal, oil and natural gas industries. Problem solved.
Edward de Bono wrote about how intelligent people lead themselves astray and won't retrace their steps. He called it "the intelligence trap." (Google for it.)
Sir:
Can you at least tell us what the price of Tesla Motors stock will be a year from now?
Thank you in advance for your reply…
Sun et al. (2012) found that
"[I]n global climate models, [t]he radiation sampling error due to infrequent radiation calculations is investigated…. It is found that… errors are very large, exceeding 800 W m⁻² at many non-radiation time steps due to ignoring the effects of clouds…"
===================================
Watts a silly little watt between friends.
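A toy sketch of the sampling error Sun et al. describe; all numbers are invented, and none of this comes from their paper or from any GCM. Clouds change every dynamics step, but the radiative flux is recomputed only every k-th step and held frozen in between, so non-radiation steps can see the wrong sky entirely:

```python
import numpy as np

# Toy illustration of infrequent radiation calls: a cloud state that
# flips every dynamics step, versus a radiative flux recomputed only
# every k-th step and held constant between calls.
rng = np.random.default_rng(1)
n_steps, k = 288, 6                         # radiation call every 6th step
cloudy = rng.random(n_steps) < 0.5          # toy cloud state each step
true_flux = np.where(cloudy, 300.0, 800.0)  # W/m^2 under cloud vs clear sky
held_flux = np.repeat(true_flux[::k], k)[:n_steps]  # frozen between calls

err = np.abs(held_flux - true_flux)
off_steps = np.arange(n_steps) % k != 0
print(f"worst error between radiation calls: {err.max():.0f} W/m^2")
print(f"mean error at non-radiation steps:   {err[off_steps].mean():.0f} W/m^2")
```

Even this crude toy produces errors of hundreds of W/m² at non-radiation steps; the real effect Sun et al. report exceeds 800 W/m².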
Modern climatology doesn't even remember that the atmosphere's temperature is governed by the Ideal Gas Law today, just as it was when James Hansen began weighting atmospheric temperature according to trace gas composition.
It's fairy physics. The atmosphere obeys the Ideal Gas Law. If your model doesn't explicitly obey it, then your model's wrong.
NO modern GCM, from the time James Hansen tried to re-invent computer modeling of the atmosphere (after NASA had already modeled it properly, using the proper law, for the Mercury and Apollo orbital calculations) to today, has the atmosphere obeying the Ideal Gas Law.
That's why those of you who swear you're as smart as anybody in the room regarding this are laughingstocks to the real working scientists, who've laughed in all your faces for the past ten to fifteen years as you've sworn you believe trace gas controls atmospheric temperature.
It's as ridiculous today as it was when Hansen was lying to Congress about it even being potentially real. Just because you believed it doesn't mean it was maybe right. Just because you've been wrong for going on twenty years, as we in the actual, working scientific world have laughed you to shame, doesn't mean there might have actually been something to it after all.
It's only the people who don't go out and actually make things fly and land and communicate who ever believed it. Among the working scientists who actually make the solar system probes, the rovers, and the satellite communications networks, and among the avionics people and the submariners, nobody ever believed Hansen's lunacy. You who believed it have had 40 NASA retirees telling you for the past several years that they're ashamed of Hansen and his sorry science.
You wouldn't listen to them either.
But there's one thing the working scientists of this world have been telling you – all of you who believed it – for nearly a quarter of a century.
There's no such thing as a Greenhouse Gas Law governing atmospheric temperature.
There is such a thing as the Ideal Gas Law governing atmospheric temperature.
Dr Brown, you have good company.
“If you live with models for 10 to 20 years, you start to believe in them…A model is such a fascinating toy that you fall in love with your creation.”
– Freeman Dyson (At the age of five he calculated the number of atoms in the sun.)
I read somewhere that just the radiative-interactions part of the NASA GCM has N thousand lines of code (not sure of the value of N). This code is being maintained and updated by a series of grad students and postdocs. Anyone who knows about such monsters KNOWS that many bugs and errors will exist.
Formal validation should be presented for any GCM used in climate research that has policy implications.
Alex, I have no idea what “isothermal assumption” you think I have attached to my name. Also, your points are so riddled with misunderstanding, I don’t even know where to start.
RGB says: “…and an even harder time working through the open source code, partly because the code is terribly organized and poorly internally documented…”
I just wonder what his reaction was to the Harry_read_me code from CG1 – which I doubt could be called ‘open source’.
Frank K. says: May 7, 2014 at 7:24 am
“Could you please answer my question about this statement? I would like to know where I can find out more information about how dynamic pressure is transmitted by sound waves. Thanks.”
OK. But first a correction – it's late here. I said that they run at a Courant number less than 1. In the example I cited (a 5 min step vs the 10–20 min actually used), they are running at >1. That's possible to a point, using devices like you mentioned.
Now on sound and pressure, it’s there in the Navier-Stokes equations. I’ve given more details here. But you have the momentum equation
ρ Dv/Dt = -∇P + …
and the conservation of mass equation:
∂ρ/∂t = -ρ∇·v + …
and a state equation: dP/dρ = c²
Put those together and you have an acoustic wave equation buried in the Navier-Stokes equations. And you're limited by its Courant number.
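Filling in the step Nick sketches (standard acoustics, nothing specific to his code): linearize about a rest state with ρ = ρ₀ + ρ′ and P = P₀ + c²ρ′; taking ∂/∂t of the mass equation and the divergence of the momentum equation, then combining them, gives ∂²ρ′/∂t² = c²∇²ρ′, the acoustic wave equation. A back-of-envelope check of the Courant numbers he cites, with an assumed 100 km horizontal grid spacing (my number, not his):

```python
# Courant number C = c*dt/dx for the acoustic limit described above.
# Illustrative values only: c ~ 340 m/s near-surface sound speed, an
# assumed 100 km horizontal grid spacing, dt values from the comment.
c = 340.0       # m/s, speed of sound
dx = 100e3      # m, nominal horizontal grid spacing
for dt_minutes in (5, 10, 20):
    dt = dt_minutes * 60.0
    print(f"dt = {dt_minutes:2d} min -> C = c*dt/dx = {c * dt / dx:.2f}")
```

On these assumed numbers a 5-minute step sits right at C ≈ 1, while 10–20 minute steps run well above it, consistent with the correction above.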
Glimpse into Type B Errors
Dr. Robert Brown concisely observes:
Re: "IMO, solving the problem the GCMs are trying to solve is a grand challenge problem in computer science. It isn't at all surprising that the solutions so far don't work very well. It would rather be surprising if they did. We don't even have the data needed to intelligently initialize the models we have got, and those models almost certainly have a completely inadequate spatiotemporal resolution on an insanely stupid, non-rescalable gridding of a sphere. … the ongoing failure of the GCMs to actually predict or hindcast anything at all particularly accurately outside of the reference interval."
That is a brief glimpse into the contributions to major "Type B errors" – which the IPCC totally ignores, and yet which are so obvious, with 96% of the GCMs' 34-year projections exceeding observed temperatures.
See BIPM's GUM: Guide to the Expression of Uncertainty in Measurement (2008) and NIST's Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results (1994).
@RGB
Dr. Brown, your post at top was delicious. But I take small issue with the blockquote above. You make it sound as though, if you are not some post-doc octopus, you are not worthy to comment on the GCM code. Quite the contrary: only a post-doc octopus may be capable of writing such a model, but normal one-fisted experts ought to be able to point out flaws in any part of the process.
Building a house requires the skills of soils and foundation engineering, masonry, structural engineering, carpentry, HVAC, electrical, plumbing, roofing, insulation, cabinetry… etc. Any single element of that endeavor should be, and usually is, reviewed and critiqued by independent masters of the craft. That a plumbing inspector might not know roofing doesn't disqualify him from red-tagging a drain pipe with insufficient gradient.
Your point is that GCMs are horribly complex dynamic processes where ALL the skills are necessary to write a good one. All true. But since all these processes must work correctly both in isolation and in dynamic cooperation, EACH process can and should be able to withstand criticism by any expert, indeed anyone with knowledge, in that field. The burden of complexity is on the authors of the GCM, not the peer reviewers.
The human body is horribly complex, but you don’t need the skills of a neurosurgeon to recognize the patient has a broken leg, dislocated shoulder, or is dehydrated.
I am sure you agree with this in principle, because you close with the hope that the people doing "statistics" on the output of the GCMs come to their senses. For here our ordinary statisticians need to make inferences about, and question, not just one GCM, but the entire genus of GCMs.
“ode solvers” in the article appears to be a typo.
Unless it’s something like fixing problems dealing with the History Channel’s “Vikings” series take on Ragnar Lothbrok.
@David L. Hagen at 8:23 am
I am familiar with Type 1 and Type 2 error. I’m familiar with false positives and false negatives.
Thanks to Lord Monckton, I am familiar with a dozen styles of invalid reasoning.
But I am not familiar with “Type B Error”, at least by that name. What is it? And what is Type A Error?
(Google wasn’t much help: http://www.west.net/~ger/math_errors.html, not bad, but not on point.)
Nick Stokes
The unsolved Navier-Stokes problem concerns, variously, the existence or the breakdown of solutions of these equations in three dimensions. The point is that so little is known.
More generally, how far do numerical solutions in GCMs reproduce the dynamics of the actual PDEs? Stephen Wolfram knows much more about this than I (or most others, I suspect). He observes: "…it has become increasingly common to see numerical results given far into the turbulent regime – leading sometimes to the assumption that turbulence has somehow been derived from the Navier-Stokes equations. But just what such numerical results actually have to do with detailed solutions to the Navier-Stokes equations is not clear."
This also goes to the often-cited issue of chaotic climate dynamics. It is likely that numerical solutions will exhibit chaotic features, but does this represent the so-called "fundamental physics"? Reference to climate being a chaotic system often seems to rely on the very simplified version of N-S studied by Lorenz (a toy illustration follows below). Of this Wolfram comments: "The Lorenz equations represent a first-order approximation to certain Navier-Stokes-like equations, in which viscosity is ignored. And when one goes to higher orders progressively more account is taken of viscosity, but the chaos phenomenon becomes progressively weaker. I suspect that in the limit where viscosity is fully included most details of initial conditions will simply be damped out, as physical intuition suggests." It seems a reasonable conjecture that these observations may also apply to GCM solutions.
Anyway, it would seem prudent to know more about the ‘fine print’ of these models before tasking them with ‘saving mankind’?
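As a toy illustration of the simplified system in question, here are the Lorenz-63 equations with their textbook parameters (σ = 10, ρ = 28, β = 8/3); nothing here is taken from any GCM. Two trajectories started 10⁻⁹ apart diverge to order-one separation within a few dozen model time units:

```python
import numpy as np

# Toy illustration: sensitive dependence on initial conditions in the
# Lorenz-63 system, the "very simplified version of N-S" cited above.
def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])  # forward Euler, small dt

dt, n_steps = 1e-3, 40_000
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])  # perturbed twin trajectory
for i in range(n_steps):
    a, b = lorenz_step(a, dt), lorenz_step(b, dt)
    if i % 10_000 == 0:
        print(f"t = {i * dt:5.1f}  separation = {np.linalg.norm(a - b):.3e}")
```

Whether this kind of sensitivity survives when viscosity is fully included, as Wolfram doubts, is exactly the open question raised above.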
Nick Stokes says: "So there is generally no attempt to make the models predict from, say, 2014 conditions. They couldn't predict our current run of La Niñas, because they were not given information that would allow them to do so. Attempting to do so will introduce artefacts. They have ENSO behavior…"
So if ENSO is an emergent phenomenon in the models, one might expect AMO and PDO features to also be emergent. Show us the models that produce these.