Guest Post by Willis Eschenbach
David Rose has posted this, from the unreleased IPCC Fifth Assessment Report (AR5):
‘ECS is likely in the range 1.5C to 4.5C… The lower limit of the assessed likely range is thus less than the 2C in the [2007 report], reflecting the evidence from new studies.’ SOURCE
I cracked up when I read that … despite the IPCC’s claim of even greater certainty, it’s a step backwards.

You see, back around 1980, about 33 years ago, we got the first estimate from the computer models of the “equilibrium climate sensitivity” (ECS). This is the estimate of how much the world will warm if CO2 doubles. At that time, the range was said to be from 1.5° to 4.5°.
However, that was reduced in the Fourth Assessment Report to a narrower, presumably more accurate range of 2°C to 4.5°C. Now, however, they’ve backed away from that and retreated to their previous estimate.
Now consider: the first estimate was done in 1980, using a simple computer and a simple model. Since then, there has been a huge, almost unimaginable increase in computer power. There has been a correspondingly huge increase in computer speed. The number of gridcells in the models has gone up by a couple orders of magnitude. Separate ocean and atmosphere models have been combined into one to reduce errors. And the size of the models has gone from a few thousand lines of code to millions of lines of code.
And the estimates of climate sensitivity have not gotten even the slightest bit more accurate.
Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.
So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?
I can give you my answer. The models are on the wrong path. And when you’re on the wrong path, it doesn’t matter how big you are or how complex you are or how fast you are—you won’t get the right answer.
And what is the wrong path?
The wrong path is the ludicrous idea that the change in global temperature is a simple function of the change in the “forcings”, which is climatespeak for the amount of downward radiation at the top of the atmosphere. The canonical (incorrect) equation is:
∆T = lambda ∆F
where T is temperature, F is forcing, lambda is the climate sensitivity, and ∆ means “the change in”.
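To see what that equation implies in numbers, here is a minimal sketch. The only inputs are the commonly quoted figure of roughly 3.7 W/m² of forcing for a doubling of CO2 and a few illustrative values of lambda, chosen only to span the 1.5°C to 4.5°C range rather than taken from any particular model:

```python
# Minimal numeric sketch of the canonical relation dT = lambda * dF.
# Assumptions: a CO2 doubling is commonly taken as ~3.7 W/m^2 of forcing;
# the lambda values below are chosen only to span the 1.5-4.5 C range.

DELTA_F_2XCO2 = 3.7  # W/m^2, approximate forcing for doubled CO2

for lam in (0.4, 0.8, 1.2):          # K per W/m^2, illustrative values
    delta_t = lam * DELTA_F_2XCO2
    print(f"lambda = {lam:.1f} K/(W/m^2)  ->  dT = {delta_t:.1f} C per doubling")
```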
I have shown, in a variety of posts, that the temperature of the earth is not a function of the change in forcings. Instead, the climate is a governed system. As an example of another governed system, consider a car. In general, other things being equal, we can say that the change in speed of a car is a linear function of the change in the amount of gas. Mathematically, this would be:
∆S = lambda ∆G
where S is speed, G is gas, and lambda is the coefficient relating the two.
But suppose we turn on the governor, which in a car is called the cruise control. At that point, the relationship between speed and gas consumption disappears entirely—gas consumption goes up and down, but the speed basically doesn’t change.
Note that this is NOT a feedback, which would just change the coefficient “lambda” giving the linear relationship between the change in speed ∆S and the change in gas ∆G. The addition of a governor completely wipes out that linear relationship, de-coupling the changes in gas consumption from the speed changes entirely.
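To make the distinction concrete, here is a toy sketch contrasting the two cases. Every coefficient in it is invented purely for illustration; it is not a model of any real car, let alone of the climate:

```python
# Toy sketch of the governor analogy. Every coefficient is invented purely
# for illustration; this is not a model of any real car, let alone of the climate.

def open_loop(gas_series, lam=2.0):
    """Un-governed car: the change in speed tracks the change in gas, dS = lam * dG."""
    speed, speeds = 50.0, []
    for i in range(1, len(gas_series)):
        speed += lam * (gas_series[i] - gas_series[i - 1])
        speeds.append(round(speed, 1))
    return speeds

def governed(load_series, target=50.0, gain=1.5):
    """Governed car: a simple proportional controller adjusts the gas to hold
    the target speed, so gas consumption varies while speed stays near the target."""
    speed, gas = target, 10.0
    speeds, gases = [], []
    for load in load_series:              # external load (hills, headwinds) varies
        gas += gain * (target - speed)    # the governor opens or closes the throttle
        speed += 0.2 * gas - load         # crude response of speed to gas and load
        speeds.append(round(speed, 1))
        gases.append(round(gas, 1))
    return speeds, gases

print("open loop:", open_loop([10, 12, 14, 16, 18, 20, 22, 24]))  # speed rises in lockstep with gas
speeds, gases = governed([2, 4, 1, 5, 3, 2, 6, 1])
print("governed speed:", speeds)   # stays within a few units of the 50 target
print("governed gas:  ", gases)    # climbs and falls to absorb the load changes
```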
The exact same thing is going on with the climate. It is governed by a variety of emergent climate phenomena such as thunderstorms, the El Nino/La Nina warm water pump, and the PDO. And as a result, the change in global temperature is totally decoupled from the changes in forcings. This is why it is so hard to find traces of e.g. solar and volcano forcings in the temperature record. We know that both of those change the forcings … but the temperatures do not change correspondingly.
To me, that’s the Occam’s Razor explanation of why, after thirty years, millions of dollars, millions of man-hours, and millions of lines of code, the computer models have not improved the estimation of “climate sensitivity” in the slightest. They do not contain or model any of the emergent phenomena that govern the climate, the phenomena that decouple the temperature from the forcing and render the entire idea of “climate sensitivity” meaningless.
w.
PS—I have also shown that despite their huge complexity, the global temperature output of the models can be emulated to a 98% accuracy by a simple one-line equation. This means that their estimate of the “climate sensitivity” is entirely a function of their choice of forcings … meaning, of course, that even on a good day with a following wind they can tell us nothing about the climate sensitivity.
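For anyone who wants to try that kind of emulation, here is a minimal sketch of one possible form of it: a one-parameter least-squares fit of global temperature against total forcing. The two series below are hypothetical placeholders; in practice they would be a model’s archived total forcing and temperature output, and the exact one-line equation may differ in detail.

```python
import numpy as np

# Minimal sketch of a one-line emulation: fit dT = lambda * dF by least squares.
# The two series below are hypothetical placeholders; in practice they would be
# a climate model's archived total forcing (W/m^2) and its global mean
# temperature anomaly (C).
forcing = np.array([0.0, 0.3, 0.6, 1.0, 1.5, 2.1, 2.6, 3.0])
temp    = np.array([0.0, 0.2, 0.5, 0.8, 1.2, 1.7, 2.0, 2.4])

lam = float(forcing @ temp / (forcing @ forcing))   # one-parameter fit, no intercept
emulated = lam * forcing
r = float(np.corrcoef(temp, emulated)[0, 1])        # how well the one-liner tracks the series

print(f"fitted lambda = {lam:.2f} K/(W/m^2), correlation = {r:.3f}")
```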
“Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone?”
Astrology and economics are the best comparisons. From the bumbling of astrology using math and science we eventually got astronomy. My feeling is economics is getting closer to that transition as well. We are getting better through observation and computation at modeling economic behavior. The what. From that they may eventually get the why, the forces that rule the results. It is no coincidence that climatology aligns with economics, or that both bristle at the frequent comparisons to astrology.
Climate Alchemy is not a science because the models, based on wildly incorrect physics, are the same as the old alchemists repeating an experiment time after time in the faint hope that they will find the philosopher’s stone.
It’s lunacy on a gargantuan scale, and corrupt politicians are paying them our money to do it.
This is a great analogy. Another analogy I’ve been thinking of that might explain the disagreements between the two camps on AGW is the static versus dynamic view of the revenues to government when a new tax is levied on society. A tax is also a forcing, a forcing on society. The static view of taxation (generally held by liberals) is that when a new tax is enacted people generally behave the same and just pay it, so, all else being equal, the additional revenue to government is easily predicted. The dynamic view is that a tax changes people’s behavior, sometimes dramatically, and sometimes a new tax results in even LESS revenue to government. Those subscribing to the static view cover their ears and never, ever want to consider this possibility; in their minds it is basically impossible: taxes go up, revenues go up, and that’s that.
Perhaps the minds of many are structured to force them to think in terms of simple cause and effect rather than more dynamically, and their views on taxation infect other endeavors such as computer modeling of the climate.
Tom in Florida says: September 15, 2013 at 5:29 am
“What is the estimated gigatons of the entire atmosphere?”
About 5,000,000. Basically, surface atm pressure * surface area / g. Varies a bit with humidity.
papertiger says: September 15, 2013 at 5:27 am
“About 3000 gigatons? So you don’t really know how much co2 is naturally in the atmosphere to less than four decimal places, even when talking gigatons.
Bet you anything you care to name that the “over 30 gigatons a year” is only a guess as well.”
The CO2 arithmetic is simple – about 5,000,000 Gt * 400 ppmv, and then a molecular weight calc. Emissions were here; 32.578645 GTons in 2011, if you like a lot of decimals. Burning C costs money, so these numbers come from accountants, not scientists.
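For anyone who wants to check that arithmetic, here is a worked version using standard rounded values for sea-level pressure, Earth’s radius and g; the exact totals shift slightly with the values you pick:

```python
import math

# Worked version of the back-of-envelope arithmetic above.
# Assumed standard rounded values; real mean surface pressure over terrain
# is a little lower, so treat the totals as approximate.
P = 101_325.0          # Pa, mean sea-level pressure
g = 9.81               # m/s^2
R_earth = 6.371e6      # m
area = 4 * math.pi * R_earth**2        # m^2, Earth's surface area

atm_mass_kg = P * area / g             # total mass of the atmosphere
atm_mass_gt = atm_mass_kg / 1e12       # 1 Gt = 1e12 kg  ->  roughly 5 million Gt

ppmv = 400e-6                          # CO2 volume mixing ratio
M_co2, M_air = 44.01, 28.97            # g/mol
co2_gt = atm_mass_gt * ppmv * (M_co2 / M_air)   # roughly 3,000 Gt of CO2

print(f"atmosphere ~ {atm_mass_gt:,.0f} Gt, CO2 ~ {co2_gt:,.0f} Gt")
```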
Hi Willis. I like your post as usual. Well thought through, and well said – as usual.
Your theory of emergent weather phenomena as governors of the Earth’s temperature is stimulating and merits rigorous testing.
There need to be peer reviewed articles to present it to the scientific world.
I imagine it will be a hard fight, but that seems to me the only way it can find its way into the great debate that is raging in the world.
Your question: “Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
I’d say, String Theory.
It has been going on for decades now and seems to be no nearer to testable predictions. So in Karl Popper’s science philosophy, it is still pre-scientific.
“At that time, the range was said to be from 1.5° to 4.5°.”
That was the Charney report, 1979. They said of their estimate:
“These are at best informed guesses, but they do enable us to give rough estimates to the probable bounds for the global warming.”
They guessed well.
“And the range that you ridicule is partly a recognition that the relation is indeed complex.”
So you agree with Judith Curry, presumably, that the IPCC’s claim that certainty has increased from 90% to 95% is nonsense?
As a taxpayer, I have to say that, given the amount of public money ‘climate scientists’ have burned up since 1979, your collective failure to improve this range doesn’t look good. Maybe you have been barking up the wrong tree all this time.
Nick notes “You’ve said this before, but again not quoting anyone. Whose ludicrous idea is it? Whose canonical equation? What did they say?”
Willis has shown that the model output is of the simple form ΔT = lambda ΔF with an unbelievably high correlation (0.97 or so, if my memory serves). You’ve seen those posts, Nick. Do you have anything to add to his results? Or are you merely fishing for an appeal to authority (or lack thereof)?
KevinM:
At September 15, 2013 at 5:23 am you quote Willis having written
And you reply
I have come to expect warmunists will display scientific illiteracy and inadequate reading comprehension skills. Congratulations! Your comment displays both.
Willis said, and you quoted,
Got that, KevinM? Willis was talking about ATTEMPT TO REDUCE ERRORS.
In other words, Willis’ article is about precision. And the errors (note that, KevinM, the errors) provide a range of “1.5° to 4.5°C” as Willis also says in his article. And that range has not altered in 30 years.
So, the errors have not been reduced in 30 years, despite all the expensive effort you quoted Willis reporting. The range “1.5° to 4.5°C” is as imprecise as it was before all that time, money and effort.
KevinM, your comment says you failed to understand what you quoted.
Your comment talks about “the same result”. In other words, it is about the determined accuracy.
KevinM, precision and accuracy are different things.
I think the IPCC determination of climate sensitivity is grossly inaccurate: I think it is less than 1.0°C. Willis’ article suggests he also thinks the value determined by the IPCC is inaccurate and high. But it does not say that. It discusses the unchanged precision of the IPCC determined value. You have failed to understand what he wrote and you have displayed ignorance of the difference between precision and accuracy.
Richard
The equation is what you get if you translate “climate sensitivity” into a mathematical expression (where the forcing is proportional to the log₂ of the ratio of CO2 concentrations). “Climate sensitivity” is such a frequently used expression that climate scientists don’t bother to explain it much any more; e.g. James Hansen says in http://arxiv.org/abs/0804.1126 :
(That paper was just the first applicable reference Google suggested.)
I’m surprised you’re unfamiliar with the concept. 🙂
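To make the translation explicit, here is a short sketch using the commonly cited Myhre et al. approximation ∆F ≈ 5.35 ln(C/C₀), which gives about 3.7 W/m² per doubling of CO2; the lambda value in it is illustrative only:

```python
import math

# Forcing from a CO2 concentration ratio via the commonly cited approximation
# dF = 5.35 * ln(C/C0) (Myhre et al. 1998), then dT = lambda * dF.
# The lambda value below is illustrative, not taken from any paper.
def co2_forcing(c_new_ppm, c_old_ppm):
    return 5.35 * math.log(c_new_ppm / c_old_ppm)   # W/m^2

lam = 0.8                      # K per W/m^2, illustrative
dF = co2_forcing(560, 280)     # a doubling of CO2
print(f"dF = {dF:.2f} W/m^2, dT = {lam * dF:.1f} C")
```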
“So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?”
I just watched a documentary on the Costa Concordia cruise ship disaster. Despite the latest state-of-the-art technology and the most sophisticated navigation systems, and despite a crew of around 1,000, the cruise liner crashed into a well-known and well-charted outcrop of rock marked on any standard chart of the area.
The reason? Entirely human error. The captain took the ship deliberately off course to do a favour for a colleague, who wanted to show off the ship to his friends on a nearby island.
Doesn’t matter what state of the art technology one has, if someone wants to completely ignore it, it doesn’t make any difference.
And his excuse? This outcrop of rocks didn’t show up on the charts of the area he had on the ship. But on any cruise, charts of areas that are not part of the scheduled route are not required to show much detail, because the ship is not supposed to be in such an area to begin with.
I suppose climate scientists would say, ‘well, we didn’t know enough details of natural climate changes in our models, so how are we supposed to have predicted what would happen?’
TimTheToolMan says: September 15, 2013 at 6:02 am
“Willis has shown that the model output is of the simple form ΔT = lambda ΔF with an unbelievably high correlation (0.97 or so, if my memory serves).”
So it is Willis’ canonical (wrong) equation?
Willis writes “The exact same thing is going on with the climate. It is governed by a variety of emergent climate phenomena such as thunderstorms, the El Nino/La Nina warm water pump, and the PDO.”
My instinct also tells me this is so. And importantly for the modellers, it’s not something you can model directly. Being an emergent property of the climate, it must also be an emergent property of the model if the model is to have any chance at all of reflecting the associated changes in climatic processes.
Willis is wrong, as is anybody who believes the incorrect IR physics used in Climate Alchemy.
The reason is that there is zero net CO2 15-micron-band IR emission from the Earth’s surface – simple radiative physics.
Bloke down the pub says:
September 15, 2013 at 4:19 am
To the contrary, it is an excellent example, because the clouds control the incoming energy based on the temperature, just as cruise control regulates the incoming gas based on the speed.
w.
Nick writes “So it is Willis’ canonical (wrong) equation?”
I wouldn’t know. But you say it’s wrong, and yet it correlates closely with the model output. There is a conclusion one may be able to draw from that statement.
Ric Werme says: September 15, 2013 at 6:07 am
“The equation is what you get if you translate “climate sensitivity” into a mathematical expression (where the forcing is proportional to the log₂ of the ratio of CO2 concentrations).”
Then a lot was lost in translation. Let’s hear the original. That’s not what it says.
It’s the inverse of Moore’s Law raised to the third power by the money term.
Mr Stokes
I didn’t expect a straight answer from you, presumably because you are incapable of answering my question.
I think someone else wrote this before me, but please wake me up when the estimate of climate sensitivity is indistinguishable from zero.
TimTheToolMan says: September 15, 2013 at 6:15 am
“I wouldn’t know. But you say it’s wrong…”
No, I was quoting Willis. Well, he said “incorrect”.
Nick Stokes:
Your several posts in this thread display your usual ignorance and – also as usual – present irrelevance.
This thread is NOT about how much CO2 people are emitting to the atmosphere.
This thread is about failure to improve determination of climate sensitivity over 30 years.
Also, contrary to your assertion, there is no reason to think Charney et al. “guessed well” when they originally guessed the climate sensitivity range adopted by the IPCC.
Empirical – n.b. not model-derived – determinations indicate climate sensitivity is much less than the lower bound of the IPCC estimate: they indicate climate sensitivity is less than 1.0°C for a doubling of atmospheric CO2 equivalent.
This is indicated by the studies of
Idso from surface measurements
http://www.warwickhughes.com/papers/Idso_CR_1998.pdf
and Lindzen & Choi from ERBE satellite data
http://www.drroyspencer.com/Lindzen-and-Choi-GRL-2009.pdf
and Gregory from balloon radiosonde data
http://www.friendsofscience.org/assets/documents/OLR&NGF_June2011.pdf
Please provide another post if and when you can think of something to say which is on topic and sensible.
Richard
A general principle regarding taxes, held since Roman times I do believe, is that people will pay taxes when either there is a sword at their throats (metaphorical these days; the IRS has other weapons) or paying the tax costs less than avoiding it.
Nick Stokes asks “You’ve said this before, but again not quoting anyone. Whose ludicrous idea is it? Whose canonical equation? What did they say?”
Here’s one that is close: http://www.people.fas.harvard.edu/~phuybers/Doc/Ockham.pdf See equation 1 where the change in the temperature of the atmosphere is calculated from the land and ocean surface temperatures plus a forcing (Fc). They would rather not have to separate out forcings and feedbacks, so they lump them all together and use the actual temperatures to determine that parameter. Their land and ocean surface temperatures come from short wave forcing, specifically seasonal short wave.
They admit their model doesn’t work at all in the tropics, which lack seasonal short wave forcing changes. In other words, the simplistic models that determine temperature from forcing changes, such as CO2 changes and seasonal solar changes, do not work when the temperature is governed as Eschenbach explained here: http://wattsupwiththat.com/2009/06/14/the-thermostat-hypothesis/
Looking at the earth as a whole (as Lindzen has explained many times), the global average temperature is constrained mostly by polar heat transport. The earth absorbs short wave and radiates long wave, which is modulated by CO2 and water vapor in various forms. However, the global average temperature is not determined by this balance, but by the movement of heat to the poles, where it is lost. Here’s a paper Lindzen wrote about that, written around the time that Trenberth was heading into the rathole of global (or local) energy balance models: http://www-eaps.mit.edu/faculty/lindzen/prggclhttr.pdf
*The models are on the wrong path.*
It seems clear that they are modelling noise, not signal, the most common sin of incompetent statisticians.