The Global Climate Model clique feedback loop

Elevated from a WUWT comment by Dr. Robert G. Brown, Duke University

Frank K. says: You are spot on with your assessment of ECIMs/GCMs. Unfortunately, those who believe in their ability to predict future climate really don’t want to talk about the differential equations, numerical methods or initial/boundary conditions which comprise these codes. That’s where the real problems are…

Well, let’s be careful how you state this. Those who believe in their ability to predict future climate but who aren’t in the business don’t want to talk about all of this, and those in the business who aren’t expert in predictive modeling and statistics in general would prefer, in many cases, not to have a detailed discussion of the difficulty of properly validating a predictive model — a process which basically never ends as new data comes in.

However, most of the GCMs and ECIMs are well, and reasonably publicly, documented. It’s just that unless you have a Ph.D. in (say) physics, a knowledge of general mathematics and statistics and computer science and numerical computing that would suffice to earn you at least a master’s degree in each of those subjects if acquired in the context of an academic program, plus substantial subspecialization knowledge in the general fields of computational fluid dynamics and climate science, you don’t know enough to intelligently comment on the code itself. You can only comment on it as a black box, or comment on one tiny fragment of the code, or physics, or initialization, or methods, or the ODE solvers, or the dynamical engines, or the averaging, or the spatiotemporal resolution, or…

Look, I actually have a Ph.D. in theoretical physics. I’ve completed something like six graduate level math classes (mostly as an undergraduate, but a couple as a physics grad student). I’ve taught (and written a textbook on) graduate level electrodynamics, which is basically a thinly disguised course in elliptical and hyperbolic PDEs. I’ve written a book on large scale cluster computing that people still use when setting up compute clusters, and have several gigabytes worth of code in my personal subversion tree and cannot keep count of how many languages I either know well or have written at least one program in, dating back to code written on paper tape. I’ve co-founded two companies on advanced predictive modelling on the basis of code I’ve written and a process for doing indirect Bayesian inference across privacy or other data boundaries that was for a long time patent pending before trying to defend a method patent grew too expensive and cumbersome to continue; the second company is still extant and making substantial progress towards perhaps one day making me rich. I did advanced importance-sampling Monte Carlo simulation as my primary research for around 15 years before quitting that as well. I’ve learned a fair bit of climate science. I basically lack a detailed knowledge and experience of only computational fluid dynamics in the list above (and understand the concepts there pretty well, but that isn’t the same thing as direct experience) and I still have a hard time working through e.g. the CAM 3.1 documentation, and an even harder time working through the open source code, partly because the code is terribly organized and poorly internally documented to the point where just getting it to build correctly requires dedication and a week or two of effort.

Oh, and did I mention that I’m also an experienced systems/network programmer and administrator? So I actually understand the underlying tools REQUIRED for it to build pretty well…

If I have a hard time getting to where I can — for example — simply build an openly published code base and run it on a personal multicore system to watch the whole thing actually run through to a conclusion, let alone start to reorganize the code, replace underlying components such as its absurd lat/long gridding on the surface of a sphere with rescalable symmetric tessellations to make the code adaptive, isolate the various contributing physics subsystems so that they can be easily modified or replaced without affecting other parts of the computation, and so on, you can bet that there aren’t but a handful of people worldwide who are going to be able to do this and willing to do this without a paycheck and substantial support. How does one get the paycheck, the support, the access to supercomputing-scale resources to enable the process? By writing grants (and having enough time to do the work, in an environment capable of providing the required support in exchange for indirect cost money at fixed rates, with the implicit support of the department you work for) and getting grant money to do so.

And who controls who, of the tiny handful of people broadly enough competent in the list above to have a good chance of being able to manage the whole project on the basis of their own directly implemented knowledge and skills AND who has the time and indirect support etc, gets funded? Who reviews the grants?

Why, the very people you would be competing with, who all have a number of vested interests in there being an emergency, because without an emergency the US government might fund two or even three distinct efforts to write a functioning climate model, but they’d never fund forty or fifty such efforts. It is in nobody’s best interests in this group to admit outsiders — all of those groups have grad students they need to place, jobs they need to have materialize for the ones that won’t continue in research, and themselves depend on not antagonizing their friends and colleagues. As AR5 directly remarks — of the 36 or so named components of CMIP5, there aren’t anything LIKE 36 independent models — the models, data, methods, code are all variants of a mere handful of “memetic” code lines, split off on precisely the basis of grad student X starting his or her own version of the code they used in school as part of a newly funded program at a new school or institution.

IMO, solving the problem the GCMs are trying to solve is a grand challenge problem in computer science. It isn’t at all surprising that the solutions so far don’t work very well. It would rather be surprising if they did. We don’t even have the data needed to intelligently initialize the models we have got, and those models almost certainly have a completely inadequate spatiotemporal resolution on an insanely stupid, non-rescalable gridding of a sphere. So the programs literally cannot be made to run at a finer resolution without basically rewriting the whole thing, and any such rewrite would only make the problem at the poles worse — quadrature on a spherical surface using a rectilinear lat/long grid is long known to be enormously difficult and to give rise to artifacts and nearly uncontrollable error estimates.
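To make the complaint about the gridding concrete, here is a minimal sketch in plain Python (nothing taken from any actual GCM; the 1-degree spacing is an assumption purely for illustration) of what a uniform lat/long grid does to cell geometry. The cells collapse toward the poles, so the quadrature weights span roughly two orders of magnitude and any explicit time step is held hostage by the smallest polar cell.

```python
# Minimal sketch (not from any GCM): cell geometry of a uniform lat/long grid.
# It shows how zonal cell width and cell area collapse toward the poles, which
# is what makes naive rectilinear gridding of a sphere awkward for quadrature
# and for explicit time stepping (the CFL limit follows the smallest cell).
import numpy as np

R = 6.371e6          # Earth radius in metres
dlat = dlon = 1.0    # assumed 1-degree spacing, purely for illustration

lat_edges = np.arange(-90.0, 90.0 + dlat, dlat)
lat_centres = 0.5 * (lat_edges[:-1] + lat_edges[1:])
mid = len(lat_centres) // 2   # index of a near-equatorial band

# Zonal (east-west) width of a cell at each latitude band
zonal_width_km = R * np.radians(dlon) * np.cos(np.radians(lat_centres)) / 1e3

# Exact spherical cell area: R^2 * dlon * (sin(lat_top) - sin(lat_bottom))
areas_km2 = R**2 * np.radians(dlon) * (
    np.sin(np.radians(lat_edges[1:])) - np.sin(np.radians(lat_edges[:-1]))) / 1e6

print(f"equatorial cell: {zonal_width_km[mid]:.1f} km wide, {areas_km2[mid]:.0f} km^2")
print(f"polar cell:      {zonal_width_km[0]:.2f} km wide, {areas_km2[0]:.0f} km^2")
print(f"area ratio, equator to pole: {areas_km2[mid] / areas_km2[0]:.0f}x")
```

Running it gives an equatorial cell more than a hundred times the area of a polar one, which is exactly the sort of anisotropy that rescalable tessellations (icosahedral or cubed-sphere grids, for example) are meant to avoid.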

But until the people doing “statistics” on the output of the GCMs come to their senses and stop treating each GCM as if it is an independent and identically distributed sample drawn from a distribution of perfectly written GCM codes plus unknown but unbiased internal errors — which is precisely what AR5 does, as is explicitly acknowledged in section 9.2 in precisely two paragraphs hidden neatly in the middle that more or less add up to “all of the `confidence’ given the estimates listed at the beginning of chapter 9 is basically human opinion bullshit, not something that can be backed up by any sort of axiomatically correct statistical analysis” — the public will be safely protected from any “dangerous” knowledge of the ongoing failure of the GCMs to actually predict or hindcast anything at all particularly accurately outside of the reference interval.
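As a toy illustration of why that matters (my own sketch, not AR5’s actual procedure; the ensemble size, the shared correlation and the error scale are all assumed numbers), averaging N correlated “models” does not buy you anywhere near the σ/√N reduction in uncertainty that an independent-and-identically-distributed treatment would promise:

```python
# Toy sketch (my own illustration, not AR5's method): the mean of N correlated
# "model" errors does not get the sigma/sqrt(N) uncertainty reduction that an
# i.i.d. assumption promises. All numbers below are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
N, rho, sigma, trials = 36, 0.7, 1.0, 20_000

# Covariance matrix in which every pair of model errors shares correlation rho
cov = sigma**2 * ((1 - rho) * np.eye(N) + rho * np.ones((N, N)))
errors = rng.multivariate_normal(np.zeros(N), cov, size=trials)
ensemble_mean_err = errors.mean(axis=1)

print(f"i.i.d. hope for the std of the ensemble-mean error: {sigma / np.sqrt(N):.3f}")
print(f"actual std with rho = {rho}:                        {ensemble_mean_err.std():.3f}")
# Theory: var = sigma^2 * (1 + (N - 1) * rho) / N, which hardly shrinks at all
print(f"theoretical std:                                    {sigma * np.sqrt((1 + (N - 1) * rho) / N):.3f}")
```

With an inter-model error correlation of 0.7, the ensemble-mean error barely shrinks at all, no matter how many members are added.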

219 Comments
Charles Nelson
May 7, 2014 3:41 am

Warmists often ask me just how such a giant conspiracy could exist amongst so many scientists…this is one very good particular example of the kind of structures that sustain CAGW.
Thanks for the insight!

Roy Spencer
May 7, 2014 3:54 am

I exchanged a few emails with mathematician Chris Essex recently, who claimed (I hope I’m translating this correctly) that climate models are doomed to failure because you can’t use finite difference approximations in long time-scale integrations without destroying the underlying physics. Mass and energy don’t get conserved. Then they try to fix the problem with energy “flux adjustments”, which is just a band-aid covering up the problem.
We spent many months trying to run the ARPS cloud-resolving model in climate mode, and it has precisely these problems.
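A toy illustration of the conservation point, in the simplest setting possible (a frictionless oscillator, not a climate model; the step size and run length are arbitrary choices): an explicit forward-Euler integration manufactures energy step after step over a long run, while a symplectic step keeps it bounded. Flux adjustments are, in effect, an after-the-fact patch for exactly this flavour of drift.

```python
# Toy illustration of the conservation problem described above, in the simplest
# possible setting (a frictionless oscillator, not a climate model). A plain
# forward-Euler step manufactures energy on every iteration of a long run; a
# symplectic (semi-implicit) Euler step keeps the energy bounded instead.
# The step size and run length are arbitrary choices.

def energy(x, v):
    return 0.5 * v**2 + 0.5 * x**2   # true conserved value is 0.5 for x=1, v=0

dt, steps = 0.01, 200_000            # deliberately long integration

x, v = 1.0, 0.0                      # forward Euler
for _ in range(steps):
    x, v = x + dt * v, v - dt * x
euler_ratio = energy(x, v) / 0.5

x, v = 1.0, 0.0                      # symplectic Euler: update v, then x with new v
for _ in range(steps):
    v = v - dt * x
    x = x + dt * v
symplectic_ratio = energy(x, v) / 0.5

print(f"forward Euler energy after {steps} steps: {euler_ratio:.3g}x the true value")
print(f"symplectic Euler energy:                  {symplectic_ratio:.3g}x the true value")
```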

dearieme
May 7, 2014 3:56 am

“inadequate spatiotemporal resolution on an insanely stupid, non-rescalable gridding of a sphere”: that reminds me vividly of my first reaction, years ago, when I started to look into “Global Warming”. It was apparent to me that much of the modelling was “insanely stupid” – that it was being done by people who were, by the general standards of the physical sciences, duds.
My qualification to make such a judgement included considerable experience of the modelling of physicochemical systems, starting in 1967. And unlike many of the “climate scientists” I also had plenty of experience in temperature measurement.
My analysis has long been that they started off with hubris and incompetence; the lying started later as they desperately tried to defend their rubbish.

Man Bearpig
May 7, 2014 4:00 am

Well said that man.
Even the most basic-sounding computer models very soon become complex. I would in no way be described as a computer programmer, but I did a course many years ago in VB. Part of it involved writing a computer ‘model’, or simulation, of a scenario that went something like this.
In your city you have 10,000 parking machines, on average 500 break down every day. it takes on average 30 minutes to fix each one and 20 minutes to drive to the location. How many engineers do you need?
Simple? Not on your nelly!
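A back-of-envelope sketch of that exercise (with my own assumed numbers for the shift length and the utilisation targets, neither of which is in the original problem statement) already shows why the answer is not obvious: you have to decide how much headroom to leave before the queue of broken machines blows up.

```python
# Back-of-envelope sketch of the parking-machine exercise. The 8-hour shift and
# the utilisation targets are my own assumptions, not part of the original problem.
failures_per_day = 500
minutes_per_job = 20 + 30          # drive + fix, as stated
shift_minutes = 8 * 60             # assumed working day

workload = failures_per_day * minutes_per_job     # engineer-minutes of work per day

for utilisation in (1.0, 0.8, 0.6):               # headroom so the backlog stays bounded
    engineers = workload / (shift_minutes * utilisation)
    print(f"target utilisation {utilisation:.0%}: about {engineers:.0f} engineers")
```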

May 7, 2014 4:01 am

Thank you Dr Robert G Brown. You have written exactly what I suspected all along. To put it bluntly the climate models cannot be relied upon. The input data being organised on lat/long grids “gives rise to nearly uncontrollable errors”. I have asked actuaries who have been using this data similar questions about its inherent biases but they didn’t reply. I will be very interested in what Bob Tisdale has to say in this post.
I seem to have a similar history to you, as I started programming almost 50 years ago, initially using plug-board machines, and have kept it up as a skill peripheral to my current job as an actuary. I figured out the failings of climate models a long time ago but lacked your ability and experience to express it so eloquently.

Man Bearpig
May 7, 2014 4:05 am

“Charles Nelson says:
May 7, 2014 at 3:41 am
Warmists often ask me just how such a giant conspiracy could exist amongst so many scientists…this is one very good particular example of the kind of structures that sustain CAGW.
Thanks for the insight!”

I think that it is not that there is a conspiracy, it is more that they do not understand basic scientific principles and ‘believe’ that what they are doing is for the good of mankind (and their pockets).

Jos
May 7, 2014 4:07 am

There are actually people doing research on how independent GCMs really are.
An example is this paper by Pennell and Reichler [2011], but there are a few more.
http://journals.ametsoc.org/doi/abs/10.1175/2010JCLI3814.1
http://www.inscc.utah.edu/~reichler/publications/papers/Pennell_Reichler_JC_2010.pdf
They conclude, among other things, that “For the full 24 member ensemble, this leads to an Meff that, depending on method, lies only between 7.5 and 9 when we control for the multi-model error (MME). These results are quantitatively consistent with that from Jun et al. (2008a, 2008b), who also found that CMIP3 cannot be treated as a collection of independent models.”
(Meff = effective number of models)
So, according to them, to get an estimate of how many independent models there are, one should divide the actual number of models by three…
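For what it’s worth, here is one simple way to turn inter-model error correlations into an “effective number of models”. It is only a sketch with made-up data, not Pennell and Reichler’s estimator, and the shared-error fraction is deliberately chosen so the toy answer lands near their quoted 7.5–9 range:

```python
# Sketch only: one simple way to estimate an "effective number of models" from
# inter-model error correlations. This is not Pennell and Reichler's estimator;
# the data are simulated, and shared_fraction is chosen so the toy result lands
# near their quoted 7.5-9 range.
import numpy as np

rng = np.random.default_rng(1)
n_models, n_gridpoints, shared_fraction = 24, 5000, 0.09   # assumed values

# Simulate model errors that all share a common component (common code heritage)
common = rng.standard_normal(n_gridpoints)
errors = (np.sqrt(shared_fraction) * common
          + np.sqrt(1 - shared_fraction) * rng.standard_normal((n_models, n_gridpoints)))

corr = np.corrcoef(errors)
mean_rho = (corr.sum() - n_models) / (n_models * (n_models - 1))  # mean off-diagonal correlation
m_eff = n_models / (1 + (n_models - 1) * mean_rho)                # equicorrelation shortcut

print(f"mean inter-model error correlation: {mean_rho:.2f}")
print(f"effective number of independent models: {m_eff:.1f} out of {n_models}")
```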

May 7, 2014 4:09 am

Are they designed to actually be predictive in the hard science tradition, or are these models just artefacts illustrating a social science theory of changing practices in political structures, social governance, economic planning and human behavior? My reading of USGCRP, especially the explicit declaration to use K-12 education to force belief in the models and the inculcation of new values to act on the models, is that this is social science, or the related systems science of the type Ervin Laszlo has pushed for decades.
It’s no different from the Club of Rome Limits to Growth or World Order Models Projects of the 70s that admitted in fine print that they were not modelling physical reality. They were simply trying to gain the political power and taxpayer money to alter physical reality. That is also my reading of the Chapter in yesterday’s report on Adaptation.
I keep hyping this same point because by keeping the focus just on the lack of congruence with the hard science facts, we are missing where the real threats from these reports are coming from. They justify government action to remake social rules without any kind of vote, notice, or citizen consent. It makes us the governed in a despot’s dream utopia that will not end well.

Bloke down the pub
May 7, 2014 4:11 am

Now doesn’t it feel better with that off your chest? By the way, when you say ‘It’s just that unless you have a Ph.D. in (say) physics, a knowledge of general mathematics and statistics and computer science and numerical computing that would suffice to earn you at least a master’s degree in each of those subjects if acquired in the context of an academic program, plus substantial subspecialization knowledge in the general fields of computational fluid dynamics and climate science, you don’t know enough to intelligently comment on the code itself’, you weren’t thinking of someone who can’t use Excel, like Phil Jones, were you?

Bloke down the pub
May 7, 2014 4:15 am

Man Bearpig says:
May 7, 2014 at 4:00 am
The answer is one. The sooner they are all broken the better.

May 7, 2014 4:22 am

Dr. Brown is looking at a real solution. But the modelers are not. They cannot be. They use garbage as input which means regardless of the model, the output will always be garbage. Until they stop monkeying with past temperatures, they have no hope of modeling future climate.

emsnews
May 7, 2014 4:29 am

The real battle is over energy. Note how almost all our wars are against fossil fuel energy exporting nations. Today’s target is Russia.
Unlike previous victims, Russia has a large nuclear armed force that can destroy the US!
The US is pushing for a pipeline for ‘dirty oil’ from Canada while at the same time telling citizens that oil is evil and the US has lots of coal and saying coal is evil.
The rulers want to do this because they know these things are LIMITED. Yes, we are at the Hubbert Oil Peak which isn’t one year or ten years but it is real. Energy is harder to find and more expensive to process.
To preserve this bounty of fossil fuels, the government has to price it out of the reach of the peasants, who are the vast bulk of the population. This is why our media are all amazed by and in love with tiny houses while the elites build bigger and bigger palaces.

HomeBrewer
May 7, 2014 4:30 am

“have several gigabytes worth of code in my personal subversion tree”
I hope it’s not several gigabytes of your own code only, or else you are a really bad programmer (which often is the case, since everybody who’s taken some kind of university course considers themselves fit for computer programming).

May 7, 2014 4:53 am

Very good exposé; it tallies nicely with the statement that GCMs are complex repackaged opinions.

Doug Huffman
May 7, 2014 4:53 am

While clique may be appropriate, CLAQUE is then more appropriate – a paid clique. The Magliozzi grace themselves by not being truthful – Clique and Claque, the Racket Brothers.

Tom in Florida
May 7, 2014 4:54 am

Man Bearpig says:
May 7, 2014 at 4:00 am
“In your city you have 10,000 parking machines, on average 500 break down every day. it takes on average 30 minutes to fix each one and 20 minutes to drive to the location. How many engineers do you need?”
It doesn’t matter how many you NEED, it’s government work so figure how many they hire.
Don’t forget that each engineer needs a driver (union rules apply) plus you will need dispatchers, administrative personnel and auditors. Each of those sections would need supervisors who will need supervisors themselves. The agency would end up costing more than the income from the parking meters so they will then need to be removed, most likely at a cost that is higher than the installation cost, most likely from the same contractor who installed them and who is no doubt related to someone in the administration. All in all just a typical government operation.
But back to the subject matter by Dr Brown. Most of the people to whom I speak about this haven’t the faintest clue that models are even involved. They believe that it is all based on proven science with hard data. When models are discussed the answer is very often “they must know what they are doing or the government wouldn’t fund them”.
It’s still going to be a long, uphill battle against a very good propaganda machine.

sleeping bear dunes
May 7, 2014 4:55 am

I always, absolutely always enjoy Dr. Brown’s comments. There are times when I have doubts about my skepticism. And then I read some of his perspectives and go to sleep that night feeling that I am not all that dumb after all. It is comforting that there are some real pros out there thinking about this stuff.

kadaka (KD Knoebel)
May 7, 2014 4:59 am

From Man Bearpig on May 7, 2014 at 4:00 am:

In your city you have 10,000 parking machines, on average 500 break down every day. it takes on average 30 minutes to fix each one and 20 minutes to drive to the location. How many engineers do you need?

None. Engineers don’t fix parking meters, repair techs do.
And at that failure rate, it’s moronic to field repair. Have about an extra thousand working units on hand, drive around in sweeps and swap out the nonworking heads, then repair the broken ones centrally in batches in steps (empty all coin boxes, then open all cases, then check all…). Which throws out your initial assumptions on times.
However at an average 10,000 broken units every 20 days, you might need an engineer or three. As expert witnesses when you sue the supplier and/or manufacturer who stuck you with such obviously mal-engineered inherently inadequate equipment. With an average failure rate of eighteen times a year per unit, anyone selling those deserves to be sued.
Isn’t it wonderful how models and assumptions fall apart from a simple introductory reality check?
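For anyone who wants to check the arithmetic behind that:

```python
# Quick check of the failure arithmetic above, using exactly the stated numbers.
units, failures_per_day = 10_000, 500
print(units / failures_per_day, "days for the failures to add up to the whole fleet")  # 20.0
print(failures_per_day * 365 / units, "failures per unit per year")                    # 18.25
```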

May 7, 2014 4:59 am

“My analysis has long been that they started off with hubris and incompetence; the lying started later as they desperately tried to defend their rubbish.”
It has always been interesting witnessing how much ‘faith’ the AGW hypothesists, dreamers and theorists placed in those models, knowing full well just how defective they really were and are.

May 7, 2014 5:01 am

It’s good to hear Roy Spencer mention Chris Essex’s critique of climate models, as I don’t think anyone has uncovered the reality of the GCM challenge – technical and political – for me as much as Robert Brown, since I read Essex and McKitrick’s Taken by Storm way back.
Just one word of warning to commenters like dearieme: the ‘insanely stupid’ referred to something in what we tend to call the architecture of a software system. The current GCMs don’t have the ability to scale the grid size and that’s insane. The people chosen to program the next variant of such a flawed base system are, as Brown says, taken from “the tiny handful of people broadly enough competent in the list”. They aren’t stupid by any conventional measure but the overall system, IPCC and all, most certainly is. Thanks Dr Brown – this seems to me one of the most important posts I’ve ever seen on WUWT.

Nick Stokes
May 7, 2014 5:02 am

To understand GCMs and how they fit in, you first need to understand and acknowledge their origin in numerical weather prediction. That’s where the program structure originates. Codes like GFDL are really dual-use.
Numerical Weather Prediction is big. It has been around for forty or so years, and has high-stakes uses. Performance is heavily scrutinised, and it works. It would be unwise for GCM writers to get too far away from their structure. We know there is a core of dynamics that works.
NWP has limited time range because of what people call chaos. Many things are imperfectly known, and stuff just gets out of phase. The weather predicted increasingly does not correspond to the initial state. It is in effect random weather but still responding to climate forcings. GCMs are NWPs run beyond their predictive range.
GCMs are traditionally run with a long wind-back period. That is to ensure that there is indeed no dependence on initial conditions. The reason is that these are not only imperfectly known, especially way back, but are likely (because of that) to cause strange initial behaviour, which needs time to work its way out of the system.
So there is generally no attempt to make the models predict from, say, 2014 conditions. They couldn’t predict our current run of La Niñas, because they were not given information that would allow them to do so. Attempting to do so will introduce artefacts. They have ENSO behaviour, but they can’t expect to be in phase with any particular realisation.
Their purpose is not to predict decadal weather but to generate climate statistics reflecting forcing. That’s why when they do ensembles, they aren’t looking for the model that does best on say a decadal basis. That would be like investing in the fund that did best last month. It’s just chance.
I understand there are now hybrid models that do try and synchronise with current states to get a shorter term predictive capability. This is related to the development of reanalysis programs, which are also a sort of hybrid. I haven’t kept up with this very well.
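A toy version of that weather-versus-climate distinction, using Lorenz’s three-variable system rather than a real GCM (the step size, run length and perturbation size are all arbitrary choices): two nearly identical initial states stop tracking each other almost immediately, yet their long-run statistics come out essentially the same.

```python
# Toy version of the weather/climate distinction above, using the Lorenz
# three-variable system instead of a real GCM. The step size, run length and
# perturbation are arbitrary. Two almost identical initial states quickly stop
# tracking each other (no forecast skill), yet their long-run statistics agree.
import numpy as np

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

def run(initial, steps=200_000):
    state, zs = np.array(initial, dtype=float), []
    for _ in range(steps):
        state = lorenz_step(state)
        zs.append(state[2])
    return np.array(zs[steps // 10:])    # discard spin-up, as GCM runs do

z_a = run([1.0, 1.0, 1.0])
z_b = run([1.0, 1.0, 1.0 + 1e-9])        # a one-part-in-a-billion nudge

print(f"mean |z_a - z_b| step by step: {np.mean(np.abs(z_a - z_b)):.1f}   (no forecast skill)")
print(f"long-run mean of z: {z_a.mean():.2f} vs {z_b.mean():.2f}   (the statistics agree to within sampling error)")
```

Whether real GCMs get those statistics right is, of course, the question the rest of this thread is arguing about.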

jaffa
May 7, 2014 5:02 am

I have a computer model that suggests the planet will be attacked by wave after wave of space aliens and it’s hopeless because every time we destroy one incoming wave the next is stronger and faster, the consequences are inevitable – we will lose and we will all be killed – the computer model proves it. Oh wait – I’ve just realised I was playing space invaders. Panic over.

May 7, 2014 5:03 am

Thank you, Dr Brown, for an excellent and informative comment. For me, the GCMs are clearly the worst bit of science in the game, mainly because they are so difficult for anyone to understand and hence relatively easy to defend in detail. Of course, once the black box results are examined their uselessness is easy to see, but their supporters continue to lap up tales of their ‘sophistication’.

Bill Marsh
Editor
May 7, 2014 5:05 am

Dr Brown,
Thank you for this explanation. It is interesting to note that the scenario you lay out means that there is no over-arching ‘conspiracy’ among the modelers. Their efforts are a natural social phenomenon, sort of like a ‘self-organizing’ organic compound. I think there are many examples of this sort of ‘group dynamic’ in other fields.
I do have some (limited) experience in the construction of feedback systems from when I started preliminary coursework towards a Ph.D. I quickly became frustrated with the economic models, as the first thing that was always stated was, “in order to make this model mathematically tractable, the following is assumed” – the assumptions that followed essentially assumed away the problem. I suspect it is the same with the GCMs.
Could you explain to the uninitiated why the use of “non-rescalable gridding of a sphere” is “insanely stupid”? I have no doubt you are correct, but, when talking with my AGW friends I would like to be able to offer an explanation rather than simply make the statement that it is ‘stupid’.
Thanks.

R2Dtoo
May 7, 2014 5:06 am

An interesting spinoff from this post would be an analysis of just how big this system has grown over the last three or four decades. I did a cursory search a while back and came up with 39 Earth System Models in 21 centres in 12 countries, but got lost in the details and inter-connections of the massive network. I did my PhD at Boulder from 1967-70. At that time NCAR was separate from the campus and virtually nothing trickled down through most programs. I now see many new offshoots, institutes and programs that didn’t exist 45 years ago. Many of these have to be on “soft money”, with their continuation ensured only through grants. The university/government interface has to be a web of state/federal agreements so complex as to be difficult to decipher. The university logically extracts huge sums of money from the grants as “overhead”, monies that become essential to the campus for survival. Perhaps such a study has been done. If so, it would be instructive to see just how many “scientists” and programs actually do survive only on federal money, and are solely dependent for continuation on buying into what government wants. The fact that the academies worldwide also espouse global warming/climate change shows that they recognize that their members are dependent on the government. It is a sad but deeply entrenched system. Can you imagine the scramble if senior governments decided to reduce the dispersed network down to a half-dozen major locations, keep the best, and shed the rest?
