Under the radar – the NAS Report

Guest Post by Willis Eschenbach

Under the radar, and unnoticed by many climate scientists, there was a recent study by the National Academy of Sciences (NAS), commissioned by the US Government, regarding climate change. Here is the remit under which they were supposed to operate:

Specifically, our charge was

1. To identify the principal premises on which our current understanding of the question [of the climate effects of CO2] is based,

2. To assess quantitatively the adequacy and uncertainty of our knowledge of these factors and processes, and

3. To summarize in concise and objective terms our best present understanding of the carbon dioxide/climate issue for the benefit of policymakers.

Now, that all sounds quite reasonable. In fact, if we knew the answers to those questions, we’d be a long way ahead of where we are now.

Figure 1. The new Cray supercomputer called “Gaea”, which was recently installed at the National Oceanic and Atmospheric Administration. It will be used to run climate models.

But as it turned out, being AGW-supporting climate scientists, the NAS study group decided that they knew better. They decided that answering the actual question they had been asked would be too difficult, that it would take too long.

Now that’s OK. Sometimes scientists are asked for stuff that might take a decade to figure out. And that’s just what they should have told their political masters: can’t do it, takes too long. But noooo … they knew better, so they decided that they should answer a different question entirely. After listing the reasons that it was too hard to answer the questions they were actually asked, they say (emphasis mine):

A complete assessment of all the issues will be a long and difficult task.

It seemed feasible, however, to start with a single basic question:  If we were indeed certain that atmospheric carbon dioxide would increase on a known schedule, how well could we project the climatic consequences?

Oooookaaaay … I guess that’s now the modern post-normal science method. First, you assume that there will be “climatic consequences” from increasing CO2. Then you see if you can “project the consequences”.

They are right that it is easier to do that than to actually establish IF there will be climatic consequences. It makes it so much simpler if you just assume that CO2 drives the climate. Once you have the answer, the questions get much easier …

However, they did at least try to answer their own question. And what are their findings? Well, they started out with this:

We estimate the most probable global warming for a doubling of CO2 to be near 3°C with a probable error of ± 1.5°C.

No surprise there. They point out that this estimate, of course, comes from climate models. Surprisingly, however, they are in no doubt at all about whether climate models are tuned. They say (emphasis mine):

Since individual clouds are below the grid scale of the general circulation models, ways must be found to relate the total cloud amount in a grid box to the grid-point variables. Existing parameterizations of cloud amounts in general circulation models are physically very crude. When empirical adjustments of parameters are made to achieve verisimilitude, the model may appear to be validated against the present climate. But such tuning by itself does not guarantee that the response of clouds to a change in the CO2 concentration is also tuned. It must thus be emphasized that the modeling of clouds is one of the weakest links in the general circulation modeling efforts.

Modeling of clouds is one of the weakest links … can’t disagree with that.
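To make concrete what “tuning to verisimilitude” means, here is a toy sketch in Python. It is entirely my own construction, not anything from the report or from a real GCM: two “models” with opposite cloud feedbacks are each tuned to reproduce today’s temperature, so both appear “validated” against the present climate, yet they disagree about doubled CO2. The 0.30 albedo, the ±0.003 cloud slopes, and the 3.7 W/m² forcing are illustrative placeholder numbers, nothing more.

```python
# Toy sketch only: NOT a real GCM, and not from the Charney Report.
# Two "models" with opposite cloud feedbacks are each tuned to hit today's
# temperature, so both look "validated" -- but they disagree under 2xCO2.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S0    = 1361.0    # solar constant, W/m^2
T_OBS = 288.0     # observed global mean surface temperature, K
F2X   = 3.7       # canonical radiative forcing for doubled CO2, W/m^2

def make_model(cloud_slope):
    """cloud_slope = d(albedo)/dT: how cloud albedo responds to warming.
    The greenhouse term G is *tuned* so the model reproduces T_OBS today."""
    def albedo(T):
        return 0.30 + cloud_slope * (T - T_OBS)
    # The tuning step: pick G so the present climate is matched exactly.
    G = SIGMA * T_OBS**4 - S0 / 4.0 * (1.0 - albedo(T_OBS))
    def equilibrium(forcing=0.0):
        T = T_OBS
        for _ in range(500):  # crude fixed-point iteration to equilibrium
            T = ((S0 / 4.0 * (1.0 - albedo(T)) + G + forcing) / SIGMA) ** 0.25
        return T
    return equilibrium

amplifying = make_model(-0.003)  # clouds thin as it warms (positive feedback)
damping    = make_model(+0.003)  # clouds thicken as it warms (negative feedback)

for name, model in (("amplifying", amplifying), ("damping", damping)):
    print(f"{name}: present = {model(0.0):.1f} K, "
          f"2xCO2 warming = {model(F2X) - model(0.0):.2f} K")
# Both print "present = 288.0 K" -- tuned to verisimilitude -- but their
# doubled-CO2 answers differ.
```

Both runs reproduce the present climate perfectly, by construction. Nothing in that “validation” tells you which cloud response is right, which is exactly the report’s caveat.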

So what is the current state of play regarding the climate feedback? The authors say that the positive water vapor feedback overrules any possible negative feedbacks:

We have examined with care all known negative feedback mechanisms, such as increases in low or middle cloud amount, and have concluded that the oversimplifications and inaccuracies in the models are not likely to have vitiated the principal conclusion that there will be appreciable warming. The known negative feedback mechanisms can reduce the warming, but they do not appear to be so strong as the positive moisture feedback.

However, as has been the case for years, when you get to the actual section of the report where they discuss the clouds (the main negative feedback), the report merely reiterates that the clouds are poorly understood and poorly represented. How does that work? They are sure the net feedback is positive, yet they admit that they don’t understand, and can only poorly represent, the negative feedbacks. They say, for example:

How important the overall cloud effects are is, however, an extremely difficult question to answer. The cloud distribution is a product of the entire climate system, in which many other feedbacks are involved. Trustworthy answers can be obtained only through comprehensive numerical modeling of the general circulations of the atmosphere and oceans together with validation by comparison of the observed with the model-produced cloud types and amounts.

In other words, they don’t know but they’re sure the net is positive.
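For anyone who wants the bookkeeping behind that claim, the standard linear-feedback algebra is dT = dT0 / (1 − sum of f_i), where dT0 is the no-feedback warming and the f_i are feedback factors. Here’s a quick sketch; the individual feedback values are illustrative round numbers of my choosing, in the commonly quoted ballpark, not figures from the report:

```python
# The standard linear-feedback bookkeeping: a no-feedback warming dT0 is
# amplified (or damped) by the sum of the feedback factors f_i:
#     dT = dT0 / (1 - sum(f_i))
# All numbers below are illustrative only.
dT0 = 1.2  # no-feedback warming for doubled CO2, K (commonly quoted ballpark)

feedbacks = {
    "water vapor (positive)":  +0.5,
    "lapse rate (negative)":   -0.1,
    "ice-albedo (positive)":   +0.1,
    "clouds (sign uncertain)":  0.0,   # set to taste; this is the weak link
}

for cloud_f in (-0.3, 0.0, +0.3):
    feedbacks["clouds (sign uncertain)"] = cloud_f
    f_sum = sum(feedbacks.values())
    dT = dT0 / (1.0 - f_sum)
    print(f"cloud feedback {cloud_f:+.1f} -> net f = {f_sum:+.1f}, "
          f"2xCO2 warming = {dT:.1f} K")
```

Swing the cloud term from −0.3 to +0.3 and the answer runs from about 1.5°C to 6°C. The entire conclusion rides on the one term the report itself calls the weakest link.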

Regarding whether the models are able to accurately replicate regional climates, the report says:

At present, we cannot simulate accurately the details of regional climate and thus cannot predict the locations and intensities of regional climate changes with confidence. This situation may be expected to improve gradually as greater scientific understanding is acquired and faster computers are built.

So there you have it, folks. The climate sensitivity is 3°C per doubling of CO2, with an error of about ± 1.5°C. Net feedback is positive, although we don’t understand the clouds. The models are not yet able to simulate regional climates. No surprises in any of that. It’s just what you’d expect a NAS panel to say.

Now, before going forwards, since the NAS report is based on computer models, let me take a slight diversion to list a few facts about computers, which are a long-time fascination of mine. As long as I can remember, I wanted a computer of my own. When I was a little kid I dreamed about having one. I speak a half dozen computer languages reasonably well, and there are more that I’ve forgotten. I wrote my first computer program in 1963.

Watching the changes in computer power has been astounding. In 1979, the fastest computer in the world was the Cray-1 supercomputer, a machine far beyond anything that most scientists might have dreamed of having. It had 8 MB of memory, 10 GB of hard disk space, and ran at 100 MFLOPS (million floating-point operations per second). The computer I’m writing this on has a thousand times the memory, fifty times the disk space, and two hundred times the speed of the Cray-1.

And that’s just my desktop computer. The new NOAA climate supercomputer “Gaea” shown in Figure 1 runs two and a half million times as fast as a Cray-1. This means that a one-day run on “Gaea” would take a Cray-1 about seven thousand years to complete …
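Here’s the back-of-envelope arithmetic behind that comparison, for anyone who wants to check it; it just restates the numbers above:

```python
# Back-of-envelope check on the Cray-1 vs. "Gaea" comparison above.
cray1_flops = 100e6   # Cray-1: 100 MFLOPS, as quoted above
speedup     = 2.5e6   # "Gaea" taken as ~2.5 million times faster

print(f"Implied Gaea speed: {cray1_flops * speedup / 1e12:.0f} TFLOPS")
print(f"One Gaea-day of work on a Cray-1: {speedup / 365.25:,.0f} years")
# -> about 250 TFLOPS, and close to seven thousand years
```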

Now, why is the speed of a Cray-1 computer relevant to the NAS report I quoted from above?

It is relevant because, as some of you may have realized, the NAS report I quoted from above is called the “Charney Report”. As far as I know, it was the first official National Academy of Sciences statement on the CO2 question. And when I said it was a “recent report”, I was thinking about it in historical terms. It was published in 1979.

Here’s the bizarre part, the elephant in the climate science room. The Charney Report could have been written yesterday. AGW supporters are still making exactly the same claims, as if no time had passed at all. For example, AGW supporters are still saying the same thing about the clouds now as they were back in 1979—they admit they don’t understand them, that it’s the biggest problem in the models, but all the same they’re sure the net feedback is positive. I’m not clear how that works, but it’s been that way since 1979.

That’s the oddity to me—when you read the Charney Report, it is obvious that almost nothing of significance has changed in the field since 1979. There have been no scientific breakthroughs, no new deep understandings. People are still making the same claims about climate sensitivity, with almost no change in the huge error limits. The range still varies by a factor of three, from about 1.5 to about 4.5°C per doubling of CO2.

Meanwhile, the computer horsepower has increased beyond anyone’s wildest expectations. The size of the climate models has done the same. The climate models of 1979 were thousands of lines of code. The modern models are more like millions of lines of code. Back then it was atmosphere only models with a few layers and large gridcells. Now we have fully coupled ocean-atmosphere-cryosphere-biosphere-lithosphere models, with much smaller gridcells and dozens of both oceanic and atmospheric layers.

And since 1979, an entire climate industry has grown up that has spent millions of human-hours applying that constantly increasing computer horsepower to studying the climate.

And after the millions of hours of human effort, after the millions and millions of dollars gone into research, after all of those million-fold increases in computer speed and size, and after the phenomenal increase in model sophistication and detail … the guesstimated range of climate sensitivity hasn’t narrowed in any significant fashion. It’s still right around 3 ± 1.5°C per doubling of CO2, just like it was in 1979.

And the same thing is true on most fronts in climate science. We still don’t understand the things that were mysteries a third of a century ago.  After all of the gigantic advances in model speed, size, and detail, we still can say nothing definitive about the clouds. We still don’t have a handle on the net feedback. It’s like the whole realm of climate science got stuck in a 1979 time warp, and has basically gone nowhere since then. The models are thousands of times bigger, and thousands of times faster, and thousands of times more complex, but they are still useless for regional predictions.

How can we understand this stupendous lack of progress, a third of a century of intensive work with very little to show for it?

For me, there is only one answer. The lack of progress means that there is some fundamental misunderstanding at the very base of the modern climate edifice. It means that the underlying paradigm that the whole field is built on must contain some basic and far-reaching theoretical error.

Now we can debate what that fundamental misunderstanding might be.

But I see no other explanation that makes sense. Every other field of science has seen huge advances since 1979. New fields have opened up, old fields have moved ahead. Genomics and nanotechnology and proteomics and optics and carbon chemistry and all the rest, everyone has ridden the computer revolution to heights undreamed of … except climate science.

That’s the elephant in the room—the incredible lack of progress in the field despite a third of a century of intense study.

Now me, I think the fundamental misunderstanding is the idea that the surface air temperature is a linear function of forcing. That’s why it was lethal for the Charney folks to answer the wrong question. They started with the assumption that a change in forcing would change the temperature, and wondered “how well could we project the climatic consequences?”

Once you’ve done that, once you’ve assumed that CO2 is the culprit, you’ve ruled out the understanding of the climate as a heat engine.

Once you’ve done that, you’ve ruled out the idea that like all flow systems, the climate has preferential states, and that it evolves to maximize entropy.

Once you’ve done that, you’ve ruled out all of the various thermostatic and homeostatic climate mechanisms that are operating at a host of spatial and temporal scales.

And as it turns out, once you’ve done that, once you make the assumption that surface temperature is a linear function of forcing, you’ve ruled out any progress in the field until that error is rectified.

But that’s just me. You may have some other explanation for the almost total lack of progress in climate science in the last third of a century, and if so, all cordial comments gladly accepted. Allow me to recommend that your comments be brief, clear and interesting.

w.

PS—Please do not compare this to the lack of progress in something like achieving nuclear fusion. Unlike climate science, that is a practical problem, and a devilishly complex one. The challenge there is to build something never seen in nature—a bottle that can contain the sun here on earth.

Climate, on the other hand, is a theoretical question, not a building challenge.

PPS—Please don’t come in and start off with version number 45,122,164 of the “Willis, you’re an ignorant jerk” meme. I know that. I was born yesterday, and my background music is Tom o’Bedlam’s song:

By a host of furious fancies

Whereof I am commander

With a sword of fire, and a steed of air

Through the universe I wander.

By a ghost of rags and shadows

I summoned am to tourney

Ten leagues beyond the wild world's end

Methinks it is no journey.

So let’s just take my ignorance and my non compos mentation and my general jerkitude as established facts, consider them read into the record, and stick to the science, OK?


272 Comments
Disko Troop
March 8, 2012 2:07 am

The NAS report is a climate version of the TV show “Jeopardy”. We know the answer, now ask us a question that fits it!

Markus Fitzhenry
March 8, 2012 2:09 am

The hypothesis of how radiation is treated by an atmosphere under the principle of greenhouse seems peculiar. There is a conflict in the observations of black bodies that feed into the Stefan-Boltzmann equations. The observations that lead to the accepted paradigm of black body theory maintained the laws of conservation of energy and recognized the exponential dependency upon energy and temperature.
The conflict manifests in the treatment of the energy balance (budget) of the atmosphere. The saturated adiabatic lapse rate has been reverse-engineered from observations of Earth’s LWR surface emissions. That is, the equation used for the mean atmospheric emission surface is inefficient as a measure of radiative forcing. It is necessary to wholly accept or reject ideal black bodies in natural physics. You can’t have a bob each way; it’s one or the other. The existence of the universe depends on it.
The Stefan-Boltzmann equation was theorized around a black body; it is an abuse of the equation to fit it to the natural laws of a gaseous atmosphere. I mean, who would have thought that an atmospheric body would not mimic a uniform thermal distribution regardless of its S-B emission.
The equation ( Ts ~ Te + τH ) used in modelling is flawed compared to observations. The theory faces a double jeopardy; it commits the conclusion from both the mean radiative surface and the TSI.
The Earth grew its atmosphere into its composition with the enhancement of the Sun and cosmic rays. In effect it matured to have the physical nature of an astrophysical body. It returns to space the energy it receives. Who knows what the natural force that drives this phenomenon is. Without it the universe would not have our Earth amongst its jewels. Our atmosphere supports life because the atmosphere has evolved to maintain equilibrium. But for dynamics we would be a perfect black body.
We don’t even know the answers to gravity. It is presumptuous to think we have the mastery to devise new theory by a virtual model. The observations and experiments do not achieve, well enough, a reasonable certainty of its premises for a modelling of global circulation to be successful.
The lower temperatures of gravity split the strong and electromagnetic forces, rendering the theory of ‘back-radiation’ in natural physics something of an anomaly.
Any wonder the models have got the climate upside down lately.

H.R.
March 8, 2012 2:11 am

And … 30 years of control of the temperature record, and the modelers can’t match unreality either.

thingadonta
March 8, 2012 2:19 am

“… It’s still right around 3 ± 1.5°C per doubling of CO2, just like it was in 1979.”
They were lucky the climate warmed a little between then and now (in my opinion due to a positive PDO and a ~20-year+ lag effect from the increase in TSI between about 1750 and 1985). If it hadn’t, nobody would have believed them, and there would be little research funding.

thingadonta
March 8, 2012 2:24 am

Regarding clouds: the Bureau of Meteorology in Australia are saying the last few years in Australia have been cooler partly due to an increase in cloud cover and the breaking of the drought. But during the seven-year drought from ~2002-2009 they didn’t say it was slightly warmer because of the coincident lack of cloud cover; they of course blamed CO2. So when it’s cooler it’s because of clouds, when it’s warmer it’s because of CO2 …

Davy12
March 8, 2012 2:25 am

Brilliant article.
If you don’t understand the physics, then all you are doing is playing with numbers.

thingadonta
March 8, 2012 2:29 am

I agree with your reasoning regarding the lack of progress in climate science, partly or mostly coming down to basic assumptions of linearity.
But I would add one more thing: the Galileo phenomenon. They don’t want the Sun to be involved because they can’t do anything about it.

Kev-in-Uk
March 8, 2012 2:34 am

Surely this just proves that governments haven’t spent/invested nearly enough money in the CAGW ‘research’ ?
We need more taxation, more green energy, more AGW based research grants, more consenting warmist-leaning scientists producing models, etc, etc……….
My goodness, perhaps I’m putting ideas in the warmista heads!
(I really don’t need to put /sarc, do I?)

March 8, 2012 2:39 am

There is much discussion of clouds herein, so I trust this is considered on topic. It is also an issue I have discussed in detail in my peer-reviewed paper, which uses physics to show that there can be no warming effect from clouds or carbon dioxide. And to understand why, you have to think outside the square, but I shall not release the details prior to publication next week.
But, inside “the square” of the atmosphere, the fact most often overlooked (or avoided perhaps?) is that water vapour absorbs a significant amount of incident IR radiation from the Sun and the energy returns to space by (upward) backradiation, thus having a cooling effect. Of course the whole atmosphere also has a cooling effect in daylight hours.

WillieB
March 8, 2012 2:39 am

Willis, as always, enjoyed your post. Without fail, I always learn something new and gain greater insight. You have a great knack for being able to wade through the chaff and get to the crux of an issue.
PS–Let me thank you by simply stating: “Willis, you’re an ignorant jerk”. Cha-ching. 45,122,165. Next stop 50 million! (LOL)
PPS–Can’t even imagine where Anthony must rank on the “ignorant jerk” comment list.
PPPS–Reminds me of the old SNL comedy skit “Point/Counterpoint” where Dan Aykroyd turns to Jane Curtin and retorts, “Jane, you ignorant slut”. 😉

Eric (skeptic)
March 8, 2012 2:40 am

Models cannot be validated simply because they cannot model anything of consequence. Want to know what the weather will be like in Peoria next week, next year, next decade, next century? Forget about it, no regional characterization of weather is possible (despite a few claims that have almost always been wrong). Instead the modelers say they do not need to reproduce any regional patterns, they can simply make multiple runs with various kinds of weather and average them to get the trend which they call climate.
The problem is that without properly modeling (NOT “predicting”) the weather, the modelers will never get an accurate trend. If rising CO2 changes the small scale day to day clouds and precipitation, the modelers will never know what that change will be.
The models will also never predict ENSO and again that’s not my concern. It’s that the models will never determine the changing statistics of ENSO in a changing climate. The modelers will thus never know cloud and albedo changes, water cycle changes or how much warmth will be permanently sequestered in the deep ocean.

Dave Wendt
March 8, 2012 2:44 am

Willis
I think you’re being much too generous in claiming that they have made no progress in a third of a century. In my view the current state of the art is barely distinguishable from what was laid down by old Svante Arrhenius back at the turn of the previous century, when quantum mechanics and photons were barely beyond speculation. When you find yourself in argument with one of the true believers, if you can push past the point where they’re calling you a knuckle-dragging moron for having the temerity to disagree with your intellectual superiors, and actually get them to offer evidence in support of their position, even now their favorite fallback is old Svante. BTW has anyone else noticed that in many of their written works when making this cite they use Arrhenius (1896)? Are they all that ignorant, or are they just trying to conceal the fact that he revisited the topic a decade later, reducing his estimate of the GHE by two thirds to three quarters?

Peter Miller
March 8, 2012 2:45 am

This just reconfirms that the cornerstones of current ‘climate science’ are dependent on four factors:
1. Total rejection of the influence and concept of natural climate cycles – something totally self-evident to anyone other than a CAGW fanatic.
2. The use of increasingly complicated computer models, which are still programmed to produce the same pre-determined results. Any argument here from anyone?
3. Exaggeration of the positive feedback effect of clouds, when this effect is very little understood and increasingly looks like it is a mildly negative effect.
4. Repetition of the same tired old mantra that “carbon dioxide is an evil gas and any increase is to be avoided at all costs”. In reality, the impact of rising carbon dioxide levels seems to be largely beneficial – e.g. it is a natural fertiliser, so crops grow quicker etc.
‘Climate science’ is now a huge industry employing a vast army of bureaucrats and pseudo-scientists. It is hugely expensive, produces distorted, highly questionable and dubious research, and is solely interested in its own self-perpetuation, as witnessed by the myriad unfounded scare stories it generates.

John Marshall
March 8, 2012 2:47 am

First define, scientifically, Climate Consequences.
The alarmists, it seems to me, consider the ‘normal’ GST to be a constant. Error! It never has been. Climates change all the time, which means surface temperature changes all the time. There is no ‘normal’ surface temperature. In fact, if the last 800 Ma are looked at, the planet has been in icehouse conditions more than hothouse, but to take the average of this cycle would be meaningless in itself and to the climate argument today.
Perhaps the real problem is the reliance on the theory of GHGs. Just because two talented scientists formulated it does not mean that it is correct. Fourier and Arrhenius were brilliant men, but men with man’s biases and presumptions. And Arrhenius got a bit funny in old age when he switched from chemistry to physics, and not all chemical reactions conform to his rules.

DEEBEE
March 8, 2012 2:51 am

However, as has been the case for years, when you get to the actual section of the report where they discuss the clouds (the main negative feedback), the report merely reiterates that the clouds are poorly understood and poorly represented. How does that work? They are sure the net feedback is positive, yet they admit that they don’t understand, and can only poorly represent, the negative feedbacks.
==============================================
That’s the money paragraph. It clearly shows you are not a co-religionist; otherwise it would be obvious to you that when you take an essentially auto-correlated random variable (like UM temperature) and globally average it, certainty increases.
Willis, get with the program, man(n).

March 8, 2012 2:57 am

Model vs reality: North Atlantic SST.
http://oi56.tinypic.com/wa6mia.jpg
The model is obviously driven by something unphysical. How does anyone dare to mention anything based on “models”?


Mark Hladik
March 8, 2012 3:14 am

The current state of climate modeling:
“If the data do not fit the model, then OBVIOUSLY the data are wrong!”
“We do precision guesswork (using HIGH technology).”

Robert of Ottawa
March 8, 2012 3:16 am

Robert Berkley, any scuba diver will correct you. The incoming SW light heats the water; surface interaction is generally evaporation, i.e. cooling. Yes, LW outgoing radiation too … Not sure what the ratio of the two cooling effects is … Anyone got an idea?

Richard S Courtney
March 8, 2012 3:17 am

Willis:
Thanks for this article. It reminds us that ‘The Team’ have achieved nothing except the expenditure of billions of dollars per year.
You ask;
“For me, there is only one answer. The lack of progress means that there is some fundamental misunderstanding at the very base of the modern climate edifice. It means that the underlying paradigm that the whole field is built on must contain some basic and far-reaching theoretical error.
Now we can debate what that fundamental misunderstanding might be.
But I see no other explanation that makes sense. Every other field of science has seen huge advances since 1979. New fields have opened up, old fields have moved ahead. Genomics and nanotechnology and proteomics and optics and carbon chemistry and all the rest, everyone has ridden the computer revolution to heights undreamed of … except climate science.”
With respect I suggest there is another “explanation that makes sense”: i.e.
There is no “far-reaching theoretical error” and no “fundamental misunderstanding” because there is no fundamental understanding of the climate system (instead the models use simplifying curve-fitting assumptions) and, therefore, there cannot be a theoretical error because the models do not apply a theory.
I remind you that you demonstrated the models’ outputs can be emulated by a simple one-line equation.
And I remind you that the models each assume a different climate sensitivity, and the output of each model is adjusted to match past mean global temperatures by adopting an assumed value of aerosol cooling which is different for each model. The aerosol adjustment used in the Hadley Centre model was first reported by me in a paper published in 1999, and Kiehl later reported that a similar adjustment exists in all of the models, but each model uses different values of climate sensitivity and aerosol cooling from those used in every other model.
This is a direct proof that there is no fundamental climate theory being applied in any of the models. They would each use the same values of climate sensitivity if they were all using the same theory.
Furthermore, as you say, the lack of understanding of cloud behaviour is important but the lack of understanding of aerosol behaviour is much, much more important. The ignorance of real aerosol behaviour is being used to provide an appearance that the models emulate past mean global temperature. They don’t emulate past mean global temperature: they are adjusted to match past mean global temperature by use of the variety of assumed aerosol cooling values.
As you say, the 1979 NAS Report said the ability to emulate regional climates was poor. It still is. However, if a climate model were emulating the world’s climate system then it would emulate regional climates and the mean of all the emulated surface temperatures would match observed mean global temperature.
The fact that the models are adjusted to match mean global temperature but fail to match regional temperatures is a direct proof that none of them is emulating the climate system of the real Earth.
And the fact that each model uses a different value of climate sensitivity is a direct proof that they are not applying a unique theory of climate behaviour.
But model falsification seems to play no part in what is disingenuously called ‘climate science’.
Richard

anticlimactic
March 8, 2012 3:23 am

AGW adherents remind me of those who thought that the Earth was at the centre of the solar system. Without the right viewpoint no progress can be made, just ever more elaborate explanations of how the planets move which made no sense. It just shows that once you put the Sun at the centre you can progress to better science!
PS. It seems to me that clouds effectively act as insulators, reflecting sunlight during the day to make it cooler, and keeping the heat in at night to make it warmer.

Ken Hall
March 8, 2012 3:34 am

From the article above:

That’s the oddity to me—when you read the Charney Report, it is obvious that almost nothing of significance has changed in the field since 1979. There have been no scientific breakthroughs, no new deep understandings. People are still making the same claims about climate sensitivity, with almost no change in the huge error limits. The range still varies by a factor of three, from about 1.5 to about 4.5°C per doubling of CO2.
Meanwhile, the computer horsepower has increased beyond anyone’s wildest expectations. The size of the climate models has done the same. The climate models of 1979 were thousands of lines of code. The modern models are more like millions of lines of code. Back then it was atmosphere only models with a few layers and large gridcells. Now we have fully coupled ocean-atmosphere-cryosphere-biosphere-lithosphere models, with much smaller gridcells and dozens of both oceanic and atmospheric layers.
And since 1979, an entire climate industry has grown up that has spent millions of human-hours applying that constantly increasing computer horsepower to studying the climate.


And the other thing that has stayed exactly the same with regard to the climate models is the human assumptions that they code into them. Regardless of the complexity and power of the computers, the same conclusion will always come out, because that is what the models are programmed to produce.
I started computing in 1983, and now I hold in my hand a smartphone which has way more power and application than the supercomputers of that day in 1983, which only PhD computer-science professors had access to … Sometimes I am stopped in my tracks with a WOW moment of how much technology has moved on.

Pete in Cumbria UK
March 8, 2012 3:40 am

Does anyone still work with ‘analogue computers’ any more?
Not exactly digital, maybe, but they’ve got almost infinite resolution and bandwidth.
The reason I wonder is that, many many moons ago, I recall (I think) the UK treasury wanted to model money flow around The Economy (how interest rates affected things, for example) and even the likes of the Cray-1 were not up to the task.
I dimly recall someone putting together a system of pipes, pumps, (adjustable) valves and water tanks. To run it, the various tanks were primed with certain amounts of water, each representing something different (amount of known money in circulation, personal savings/loans, Government gilts/loans etc.) and pumps/valves were adjusted to represent inflation, interest rates, growth of GDP and other important monetary stuff.
The thing would be primed with water, switched on, and left to run for an hour or so, and the level of water in various pipes/tanks would represent the predicted state of the economy. Changing the speed of the ‘inflation’ pump would cause one tank to empty and the level in various others to change and, by accounts, the beast was spookily accurate in its ‘predictions’, much more so than its digital counterpart.
In its simplest sense, think about the maths of pouring some water from a jug into a glass tumbler. Each of those trillions of trillions of water molecules ‘knows’ where it’s going, how fast it’s going and where it will eventually finish up.
Could Gaea even say how many water molecules remain stuck to the inside of the jug, let alone work out where the rest of them went?
Would an analogue computer fare any better in the climate prediction projection game?

Joe
March 8, 2012 3:42 am

Ken says:
March 8, 2012 at 12:41 am
The answer is, of course, that the science was correct then as to the degree of warming, and is correct now. This just goes to show that it is indeed ‘settled’ within the knowledge available both then and currently. Sceptics are really just denying facts, like they always do …
No, I’m not a ‘warmist’, but just trying to anticipate the most likely response that community would be likely or very likely to make, if I can use IPCC language!

Fair one, in which case I’d like to make the following observation:
I will accept that the science is settled, to the point that a 2-million-fold increase in modeling power, and 30 years of intensive model development, can’t give any finer resolution than 3.0 +/- 1.5 degrees. So it’s reasonable to conclude that no finer resolution is possible.
Given the above, 3.0 +/- 1.5 deg is our final answer, and we no longer need to fund any further modeling, research, or analysis of the problem, because it’s now as good as it gets. To all the climatologists out there, in recognition of your great service, we’ll be happy to offer you free re-training in a new (climate-UNrelated) field of your choice.

cui bono
March 8, 2012 3:47 am

w –
As you demonstrate, the core issue for computer models from the beginning was CO2 sensitivity. This was, after all, the key question the models were designed to answer – not how climate actually works. So I always imagine computer models started with 2*CO2 ~ 3C. This little equation was then ‘cut-n-pasted’ into any subsequent models, and surrounded by millions of lines of code on other peripheral matters. The code gets longer and more complex each year. Meanwhile the computers get faster, which makes absolutely no difference except that the (wrong) answers are generated in hours, not millennia.
It seems there is no ‘Darwinian evolution’ in the models, as none are ever falsified against real-world data, presumably because the AGW types are not concerned about real-world data. They’re too busy playing with their expensive toy models.