Guest Post by Willis Eschenbach
David Rose has posted this, from the unreleased IPCC Fifth Assessment Report (AR5):
‘ECS is likely in the range 1.5C to 4.5C… The lower limit of the assessed likely range is thus less than the 2C in the [2007 report], reflecting the evidence from new studies.’ SOURCE
I cracked up when I read that … despite the IPCC’s claim of even greater certainty, it’s a step backwards.

You see, back around 1980, about 33 years ago, we got the first estimate from the computer models of the “equilibrium climate sensitivity” (ECS). This is the estimate of how much the world will warm if CO2 doubles. At that time, the range was said to be from 1.5°C to 4.5°C.
However, that was reduced in the Fourth Assessment Report to a narrower, presumably more accurate range of 2°C to 4.5°C. Now, however, they’ve backed away from that, and retreated to their previous estimate.
Now consider: the first estimate was done in 1980, using a simple computer and a simple model. Since then, there has been a huge, almost unimaginable increase in computer power. There has been a correspondingly huge increase in computer speed. The number of gridcells in the models has gone up by a couple orders of magnitude. Separate ocean and atmosphere models have been combined into one to reduce errors. And the size of the models has gone from a few thousand lines of code to millions of lines of code.
And the estimates of climate sensitivity have not gotten even the slightest bit more accurate.
Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.
So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?
I can give you my answer. The models are on the wrong path. And when you’re on the wrong path, it doesn’t matter how big you are or how complex you are or how fast you are—you won’t get the right answer.
And what is the wrong path?
The wrong path is the ludicrous idea that the change in global temperature is a simple function of the change in the “forcings”, which is climatespeak for the amount of downward radiation at the top of the atmosphere. The canonical (incorrect) equation is:
∆T = lambda ∆F
where T is temperature, F is forcing, lambda is the climate sensitivity, and ∆ means “the change in”.
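As a concrete illustration of what that equation claims (not an endorsement of it), here is a minimal Python sketch. The 5.35·ln(C/C0) forcing approximation and the 1.5–4.5°C ECS figures are conventional values from the literature; the function names are mine.

```python
import math

def forcing_from_co2(c_ppm, c0_ppm=280.0):
    """Conventional logarithmic CO2 forcing approximation: dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def delta_t(delta_f, lam):
    """The 'canonical' equation: dT = lambda * dF."""
    return lam * delta_f

# Forcing for a doubling of CO2 (280 -> 560 ppm), roughly 3.7 W/m^2
f2x = forcing_from_co2(560.0)
for ecs in (1.5, 3.0, 4.5):      # ECS values spanning the 1.5-4.5 C range
    lam = ecs / f2x              # implied sensitivity in C per W/m^2
    print(f"ECS {ecs} C -> lambda = {lam:.2f} C/(W/m^2)")
```

In this framing, the entire ECS debate reduces to the value of a single coefficient, which is exactly the framing the rest of this post disputes.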
I have shown, in a variety of posts, that the temperature of the earth is not a function of the change in forcings. Instead, the climate is a governed system. As an example of another governed system, consider a car. In general, other things being equal, we can say that the change in speed of a car is a linear function of the change in the amount of gas. Mathematically, this would be:
∆S = lambda ∆G
where S is speed, G is gas, and lambda is the coefficient relating the two.
But suppose we turn on the governor, which in a car is called the cruise control. At that point, the relationship between speed and gas consumption disappears entirely—gas consumption goes up and down, but the speed basically doesn’t change.
Note that this is NOT a feedback, which would just change the coefficient “lambda” giving the linear relationship between the change in speed ∆S and the change in gas ∆G. The addition of a governor completely wipes out that linear relationship, de-coupling the changes in gas consumption from the speed changes entirely.
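The governor argument can be sketched numerically. In this toy simulation, every constant — the gains, the drag, the hill profile — is invented for illustration; it is not a model of any real car, let alone the climate. A proportional cruise control holds the speed nearly constant while the gas input swings substantially in response to a hill:

```python
def simulate(cruise_on, steps=500, dt=0.1):
    """Toy car: speed responds to gas minus drag, plus a hill disturbance."""
    setpoint, kp = 30.0, 2.0      # target speed and controller gain (invented)
    speed, gas = 30.0, 1.0
    speeds, gases = [], []
    for t in range(steps):
        hill = 0.5 if 100 <= t < 300 else 0.0            # an uphill stretch
        if cruise_on:
            gas = max(0.0, 1.0 + kp * (setpoint - speed))  # governor sets throttle
        accel = 2.0 * gas - 0.05 * speed - hill          # invented car dynamics
        speed += accel * dt
        speeds.append(speed)
        gases.append(gas)
    return speeds, gases

speeds, gases = simulate(cruise_on=True)
print(f"gas range {max(gases) - min(gases):.2f}, "
      f"speed range {max(speeds) - min(speeds):.2f}")
```

With the governor engaged, the large swings in gas barely show up in the speed — which is the decoupling being described.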
The exact same thing is going on with the climate. It is governed by a variety of emergent climate phenomena such as thunderstorms, the El Nino/La Nina warm water pump, and the PDO. And as a result, the change in global temperature is totally decoupled from the changes in forcings. This is why it is so hard to find traces of e.g. solar and volcano forcings in the temperature record. We know that both of those change the forcings … but the temperatures do not change correspondingly.
To me, that’s the Occam’s Razor explanation of why, after thirty years, millions of dollars, millions of man-hours, and millions of lines of code, the computer models have not improved the estimation of “climate sensitivity” in the slightest. They do not contain or model any of the emergent phenomena that govern the climate, the phenomena that decouple the temperature from the forcing and render the entire idea of “climate sensitivity” meaningless.
w.
PS—I have also shown that despite their huge complexity, the global temperature output of the models can be emulated to a 98% accuracy by a simple one-line equation. This means that their estimate of the “climate sensitivity” is entirely a function of their choice of forcings … meaning, of course, that even on a good day with a following wind they can tell us nothing about the climate sensitivity.
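For readers curious what such a one-line emulation might look like, here is a sketch of the general form: a lagged linear response to forcing. The λ, τ, and forcing values below are placeholders chosen for illustration, not the fitted values from the emulation described.

```python
def emulate(forcings, lam=0.5, tau=5.0):
    """One-line-style emulator: T relaxes toward lam*F with time constant tau."""
    t, out = 0.0, []
    for f in forcings:
        t += (lam * f - t) / tau    # exponential lag toward equilibrium lam*F
        out.append(t)
    return out

# A step in forcing of 3.7 "W/m^2": T climbs toward lam*F = 1.85
step_forcing = [0.0] * 5 + [3.7] * 45
temps = emulate(step_forcing)
print(f"T after 45 steps of constant forcing: {temps[-1]:.2f}")
```

Note that everything about the output except its shape in time is fixed by the forcing series fed in, which is the point being made about the models’ sensitivity estimates.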
RC Saumarez writes “The solution is numerical, which is based on a large body of maths on how you approximate differential equations and solve them iteratively.”
For a single model run, perhaps. As soon as you parameterise a process, you’re enforcing a curve fit, because a parameterised process can’t change without changing the parameter, which, as I said, is a curve fit. What do you think the complex modelled processes converge to when averaged over many runs?
A. The complex equations?
B. The underlying fundamental assumption which in the climate case is that temperature is determined by forcing with a lag?
Luther Wu;
I’ve been all over hell, with no sign of ‘im so far.
>>>>>>
Ah well, perhaps s/he is just avoiding you? More importantly, can you confirm if hell is exothermic or endothermic?
http://www.pinetree.net/humor/thermodynamics.html
(this thread needs a bit of humour injected into it in my opinion)
I think it needs to be said, this is wonderful news. The IPCC are regressing.
The question should have been “Can anyone name any other scientific field that has moved backward in the last third of a century?”
TimTheToolMan says: September 15, 2013 at 4:22 pm
“B. The underlying fundamental assumption which in the climate case is that temperature is determined by forcing with a lag?”
Who made that assumption? Where? Again, no quotes.
It’s true that if you average over time and over space, Newton’s law of cooling tends to emerge. That’s Willis’ “canonical equation”. But the averaging removed a lot of complexity. And if you average over models (as Willis did), a lot more is lost.
Jim Clarke says: September 15, 2013 at 3:54 pm
“The point is that the models are entirely based on the initial assumption that feed backs are positive.”
Again, no quotes or cites. Models do not make assumptions about the sign of feedback.
Nick writes “Who made that assumption? Where? Again, no quotes.”
Are you asking where in the climate community somebody decided we’ll just model forcing with a lag? Then the answer is nobody explicitely made that assumption. However that doesn’t alter the fact that the models when averaged over many runs, appear to model forcing with a lag.
Are you suggesting the correlation with the forcing with lag equation is purely coincidental? Or perhaps simply a fit? Because its not entirely a fit, is it. Its very much in the ballpark of what the models could be simplified to.
I have to say, until this conversation thread I had seen nothing but a group of very smart people apply critical thinking and open minds to others’ comments. As a lay person, I appreciate the clear and concise way commentators like Richard S Courtney, Willis, Allan McRae (really enjoyed your informative comments today) and many others explain technical subjects to enable basic understanding in people like myself.
On this comment thread, however, I believe I have seen an example of the worst sort of comment, the type I had thought was reserved for places like SkS, by RC Saumarez. CD alluded that he may have had a point somewhere, and I was looking forward to a discussion whereby his objections to Willis’s reasoning/method would be raised, backed up by his own informed position on the subject, which Willis could either refute or use to add something to his knowledge base.
Unfortunately, all I got was RC saying Willis was wrong because he said so, with no explanation of how or why.
Richard S Courtney did add some clarity to the subject, which hopefully confirmed my knowledge-light assumption that it was apples and pears being argued as opposed to apples and apples; accept my apologies if I am incorrect, Richard S Courtney.
As one of a growing band of “new” sceptics, I would advise RC Saumarez that many ordinary citizens are sick to the back teeth of being “told” what to believe, without any actual facts to back the assertion that those doing the telling are correct.
I believe this is how the whole sorry mess of cAGW arrived where it is today: politicians accepting what they have been told by scientists who refuse to accept the possibility they are wrong.
RC Saumarez, you may well have a very valid point; I personally do not know. But if your responses on this discussion thread are anything to go by, I will never know, and more importantly neither will the people to whom you have addressed your critique. At the very least, when asked to demonstrate how and why, you should have obliged.
Bingo!
TimTheToolMan says: September 15, 2013 at 4:55 pm
“However that doesn’t alter the fact that the models when averaged over many runs, appear to model forcing with a lag.
Are you suggesting the correlation with the forcing with lag equation is purely coincidental?”
No, as I said, it’s basically Newton’s Law of cooling (whose ludicrous idea?). Hot things emit more heat. Things that are heated get hotter.
But the Earth, too, globally averaged, responds to forcing with a lag. eg Lucia’s lumpy, or Tamino’s two-box model.
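For reference, here is a minimal sketch of the two-box idea Nick mentions: a fast box (mixed layer) forced directly, and a slow box (deep ocean) coupled to it. The parameters below are my own illustrative guesses, not Lucia’s or Tamino’s fitted values.

```python
def two_box(forcings, lam=0.8, c_fast=8.0, c_slow=100.0, gamma=0.7):
    """Fast box forced directly; slow box warmed only via coupling to the fast box."""
    tf, ts = 0.0, 0.0              # temperature anomalies of the two boxes
    out = []
    for f in forcings:
        # fast box: forcing in, radiative restoring out, heat exchange with slow box
        dtf = (f - tf / lam - gamma * (tf - ts)) / c_fast
        # slow box: warmed only through exchange with the fast box
        dts = gamma * (tf - ts) / c_slow
        tf += dtf
        ts += dts
        out.append(tf)
    return out

# Constant forcing of 3.7: fast box rises quickly, then creeps toward lam*F = 2.96
temps = two_box([3.7] * 300)
print(f"fast-box anomaly after 300 steps: {temps[-1]:.2f}")
```

The two time constants produce the “response with a lag” behaviour: a fast initial rise followed by a slow multi-century creep toward equilibrium.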
thingadonta says:
September 15, 2013 at 6:08 am
“I just watched a documentary on the Concordia cruise line disaster. Despite the latest state of the art technology, the most sophisticated navigation systems, despite a crew of around 1000, the cruise liner crashed into a well known and well charted outcrop of rock marked on any standard chart of the area.
The reason? Entirely human error. The captain took the ship deliberately off course . . .”
IOW, he defeated all of the mechanisms in place to prevent the undesired outcome . . . like an airplane pilot who ignores (for whatever reason) “Terrain” warnings. I suppose it would be quite a project to enumerate EACH of the Scientific Method processes and failure-prevention methods that have been deliberately defeated or ignored by those obsessed with selective perception of climate reality.
Bit Chilly.
A sublimely constructive takedown.
bit chilly says:
September 15, 2013 at 4:56 pm
“…As one of a growing band of “new” sceptics, I would advise RC Saumarez that many ordinary citizens are sick to the back teeth of being “told” what to believe, without any actual facts to back the assertion that those doing the telling are correct.”
________________________
Bravo
————————————-
davidmhoffer says:
September 15, 2013 at 4:24 pm
Luther Wu;
I’ve been all over hell, with no sign of ‘im so far.
>>>>>>
Ah well, perhaps s/he is just avoiding you? More importantly, can you confirm if hell is exothermic or endothermic?
________________________
Well, hell’s probably exothermic, since every time you turn around, either hell’s a poppin’ or all hell’s breakin’ loose.
On the other hand, there is also evidence that the opposite is true: http://www.funnysigns.net/hell-freezes-over/
Nick Stokes writes “But the Earth, too, globally averaged, responds to forcing with a lag. eg Lucia’s lumpy, or Tamino’s two-box model.”
And if the sun were putting out more energy then this would be valid reasoning. But it’s not like that, is it? Instead it is theorised that the CO2 reorganises the energy such that the surface is warmer, and that is not valid reasoning without further proof. When the models can be shown to be ignoring all that complexity, then there are a couple of options.
A. The earth also ignores all the complexity.
B. The models are wrong.
Given the recent divergence of observed vs. modelled temperatures, I think it’s becoming clear which is correct.
Willis Eschenbach says: September 15, 2013 at 11:34 am
Or are you just nitpicking?
w.
>>>>>>
Willis, your work is great!! Sorry to nitpick. My bad. And I get it about a self-regulated / self-modulating system. That makes sense. Either the established physics is bull and climate sensitivity is zero or nearly zero, or something else is at play that we don’t understand. But, yes, CO2 is not doing what the warmists say it should be doing. It’s doing… nothing. That’s the evidence.
Willis, love your contributions and always look forward to them. But I am with Bloke Down the Pub regarding the relationship between applied fuel and speed. I may just be missing your point, but when reading the article initially I had the same response as Bloke and went to the comments looking to see if anyone else did, and the Bloke (thanks Bloke!) stated it well.
Cruise control merely does more or less automatically what humans try to do manually when attempting to maintain a given speed regardless of road conditions, in both cases by throttle manipulation. But I’m sure you know that. So I think you were making a different point. Could you please explain more clearly how the relationship between speed and fuel consumption gets “uncoupled” with a governor?
Many thanks,
Paul Monaghan
Connecticut, USA
I’m sorry if I have come over as arrogant and didactic.
This was not my intention.
If you look at this post, there is, IMHO, an arrogant thread which amounts to bullying. Pamela Gray does not deserve the responses that she has received. In normal academic exchanges, this would be unacceptable.
As regards my position: I spent about 24 hours reading the NCAR CAM 3.0 documentation. It has been assembled by very high-grade mathematicians and scientists. I got a general idea of their approach. However, I would think that it would take anyone from a standing start at least a year to come to grips with the numerics of the model. Also, there is a huge amount of physical modelling that makes up the underlying mathematics of the model. I freely admit that I cannot come to grips with this modelling. Although I have worked in some aspects of modelling, this is way beyond my competence.
I think that off-the-cuff remarks here by the commentariat on mathematical modelling are absurd.
I have been met with a volley of abuse and been called a troll. Mr Eschenbach, who by his own admission is a self-taught mathematician, has defended his position against reasonable criticism with abuse of most of his critics, rather than by making an argument. I am not saying that he is wrong, but if he makes a proposition, he should defend it with logic. I agree I have been somewhat dismissive of his forays into signal processing, but there are a number of people who read this blog who are educated in the subject and do not accept his line of reasoning.
I have made some points about modelling. I freely admit that I am not primarily a mathematical modeller; I am a measurer. However, during my PhD I used non-linear stress analysis in bone, which is closely related to variational principles (i.e. FEM), and I have used finite difference methods to solve highly non-linear PDEs to try to explain experimental observations in cardiac electrophysiology. (I do not claim great originality in this approach.)
Nevertheless I have been told that I am an ignorant troll and been told to go away. I suggest that this is not the case. I have challenged my most vehement critic to tell us what papers he has published in order to establish his expertise in the subject of mathematical modelling.
I questioned his statement that FE methods were equivalent to finite difference methods, because they are based on different mathematical principles. His response was that I was correct but must have looked it up on Google, although I have published results using these methods.
I am happy to write a post on why I think that Eschenbach’s thesis can be criticised on several grounds. I have suggested this several times but have not received a response. If I am invited to do so, I will.
I repeat my comment that there are some serious scientists who write posts on this blog, which is possibly why it has its reputation.
This makes perfect tactical sense for any agency with a keen eye on the funding and prospects for much more: acknowledge the truth of the actual data to some extent, but keep up the rhetoric for the main funding prize at the same time. It may well be the last report of its kind if the cooling continues. Beyond that point they will have to light fires under city-based land temperature stations and shoot down the satellites.
P.S. Funny little twist: I imagine slick conditions could be a “positive feedback” leading to “instability” in the system, a la another post I read (here?) recently, in the sense that increased acceleration leads to tire slippage and lower speeds, calling for additional acceleration, until you’re just careening along the ice at max RPM until the motor throws a rod through the block (which I guess would be the ultimate negative feedback…). This could of course be accounted for by referencing wheel RPMs to speed in the programming, and I imagine it is, but the thought made me smile.
Best,
Paul
I’m still reading the comments after having read the comments on a very similar thread at Bishop Hill. I am surprised at the number of people who (still) think the projected warming (climate sensitivity) is a property of the climate models.
Everyone should read Richard Courtney’s excellent and lucid explanations of what the climate models do, or rather what they don’t do.
What follows from this is that it doesn’t matter how much the models are improved (in the sense that the physics is better understood and modelled, and the grid resolution improved): they will never produce better climate predictions, because the (deterministic) models don’t materially contribute to the predictions. It is deeply dishonest of the climate modellers et al. to pretend that better models (more money spent) will produce better predictions.
Similar advances have been made in the field of random noise prediction.
Thousands of years ago, it was important to know whether the next harvest was going to succeed, especially after years of drought. The pharaoh relied on the high priests for advice. They would slaughter chickens and examine their entrails to determine whether the next harvest would be a success. However, the pharaoh was displeased at the lack of success of their predictions. He challenged the priests, saying the chickens couldn’t predict anything and had no forecasting skill. The priests, fearing their jobs and heads were on the line, replied, “Not so, sire. What we really need is more chickens.”
It would seem to me that the models would be more convincing if they were not focused on global average temps. I can barely accept that this can be determined with any degree of accuracy today, much less 100 years from now. If their models are so wonderful, shouldn’t they be able to publish a forecast for the daily temps at O’Hare, or Sydney, for the next hundred years? We would know how trustworthy the model is in a few months. No need to burden our great-grandchildren. Think how beneficial it would be to society if I knew whether I should buy a new snowblower in November rather than waiting till January. The best NOAA, with billions of dollars of computing power, can do is above average, normal or below average, and only a few months out at that. And I haven’t found that to be especially helpful.
http://www.cpc.ncep.noaa.gov/products/predictions/long_range/lead01/off01_temp.gif
The AGW models must be accurate to hundredths of a degree if they can forecast average global temps to a tenth. Their own diagrams show they model thousands of points around the globe. Surely they must feel some of these forecasts are actually accurate.
So what’s the temp for 1/1/14 at O’Hare? Is that too much to ask?
DirkH says:
September 15, 2013 at 11:31 am
“Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
On a human scale, climate science has made enormous jumps; on an Earth scale, it’s a pittance.
The geosciences have also made pitiful progress: still only barely penetrating the crust, and in small areas. Trivial progress, even smaller than climate science’s. Our understanding of the Earth is trivial; anyone who understands remote sensing knows it is totally dependent on “knowing” what you are measuring in order to measure it.
Astronomy and cosmology: progress like an atom compared to the entire Earth.
Willis, I seldom disagree with anything you say. That said, your cruise control analogy isn’t terribly good.
All other things being equal, if I set the cruise control at 50 mph I will get 30 mpg; if I set it at 70 mph I will get 20 mpg. If I change the speed, the gas mileage changes.
The system here is a car operating in a certain environment. The input is the desired speed. The output is the actual speed. The cruise control takes the desired speed, subtracts the actual speed and generates an error signal which actuates the throttle. In other words, the cruise control provides the signal that controls the throttle position.
The feedback signal is the vehicle’s speed. The feedback loop consists of the speed sensor and the cruise control. http://en.wikipedia.org/wiki/Control_theory#An_example The whole process is called feedback. http://en.wikipedia.org/wiki/Feedback
Philip Bradley says:
September 15, 2013 at 6:00 pm
====
100%