Dr. Kiehl's Paradox

Guest Post by Willis Eschenbach

Back in 2007, in a paper published in GRL entitled "Twentieth century climate model response and climate sensitivity", Jeffrey Kiehl noted a curious paradox. All of the various climate models operated by different groups were able to do a reasonable job of emulating the historical surface temperature record. In fact, much is made of this agreement by people like the IPCC. They claim it shows that the models are valid, physically based representations of reality.


Figure 1. Kiehl results, comparing climate sensitivity (ECS) and total forcing. 

The paradox is that the models all report greatly varying climate sensitivities but they all give approximately the same answer … what’s up with that? Here’s how Kiehl described it in his paper:

[4] One curious aspect of this result is that it is also well known [Houghton et al., 2001] that the same models that agree in simulating the anomaly in surface air temperature differ significantly in their predicted climate sensitivity. The cited range in climate sensitivity from a wide collection of models is usually 1.5 to 4.5C for a doubling of CO2, where most global climate models used for climate change studies vary by at least a factor of two in equilibrium sensitivity.

[5] The question is: if climate models differ by a factor of 2 to 3 in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy?

How can that be? The models have widely varying sensitivities … but they all are able to replicate the historical temperatures? How is that possible?

Not to give away the answer, but here’s the answer that Kiehl gives (emphasis mine):

It is found that the total anthropogenic forcing for a wide range of climate models differs by a factor of two and that the total forcing is inversely correlated to climate sensitivity.

This kinda makes sense, because if the total forcing is larger, you’ll have to shrink it more (smaller sensitivity) to end up with a temperature result that fits the historical record. However, Kiehl was not quite correct.
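Kiehl's inverse relation falls straight out of the arithmetic: if every model must reproduce roughly the same observed warming, then a model with a larger total forcing needs a smaller sensitivity. A minimal sketch, with purely illustrative numbers (not taken from any actual model):

```python
# If every model must hit the same observed warming dT_obs, then
# lambda_i * F_i ~ dT_obs, so lambda_i ~ dT_obs / F_i -- sensitivity
# is inversely related to total forcing. All numbers are illustrative.
dT_obs = 0.8  # observed 20th-century warming, degrees C (illustrative)

# Hypothetical total anthropogenic forcings for five models (W/m^2),
# spanning the factor-of-two range Kiehl reported.
total_forcing = [1.0, 1.3, 1.6, 1.9, 2.0]

# The sensitivity each model needs in order to reproduce dT_obs:
sensitivity = [dT_obs / f for f in total_forcing]

for f, lam in zip(total_forcing, sensitivity):
    print(f"forcing {f:.1f} W/m^2 -> implied sensitivity {lam:.2f} C per W/m^2")
```

The larger the forcing, the smaller the implied sensitivity, which is exactly the inverse correlation Kiehl found.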

My own research in June of this year, reported in the post Climate Sensitivity Deconstructed,  has shown that the critical factor is not the total forcing as Kiehl hypothesized. What I found was that the climate sensitivity of the models is emulated very accurately by a simple trend ratio—the trend of the forcing divided by the trend of the model output.

Figure 2. Lambda compared to the trend ratio. Red shows transient climate sensitivity (TCR) of four individual models plus one 19-model average. Dark blue shows the equilibrium climate sensitivity (ECS) of the same models. Light blue shows the results of the forcing datasets applied to actual historical temperature datasets.

Note that Kiehl’s misidentification of the cause of the variations is understandable. First, the outputs of the models are all fairly similar to the historical temperature. This allowed Kiehl to ignore the model output, which simplifies the question but increases the inaccuracy. Second, the total forcing is an anomaly which starts at zero at the beginning of the historical reconstruction. As a result, the total forcing is somewhat proportional to the trend of the forcing. Again, however, this increases the inaccuracy. But as a first cut at solving the paradox, as well as being the first person to write about it, I give high marks to Dr. Kiehl.

Now, I probably shouldn’t have been surprised by the fact that the sensitivity as calculated by the models is nothing more than the trend ratio. After all, the canonical equation of the prevailing climate paradigm is that forcing is directly related to temperature by the climate sensitivity (lambda). In particular, they say:

Change In Temperature (∆T) = Climate Sensitivity (lambda) times Change In Forcing (∆F), or in short,

∆T = lambda ∆F

But of course, that implies that

lambda = ∆T / ∆F

And the right hand term, on average, is nothing but the ratio of the trends.

So we see that once we’ve decided what forcing dataset the model will use, and decided what historical dataset the output is supposed to match, at that point the climate sensitivity is baked in. We don’t even need the model to calculate it. It will be the trend ratio—the trend of the historical temperature dataset divided by the trend of the forcing dataset. It has to be, by definition.
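The trend-ratio claim is easy to demonstrate with a toy calculation. Here is a minimal sketch, assuming the canonical relation ∆T = lambda ∆F plus some noise; the numbers are illustrative, not from any actual model or forcing dataset:

```python
import numpy as np

# Sketch of the trend-ratio claim: build a toy "forcing" series, produce a
# "model output" that is just lambda times the forcing (the canonical
# dT = lambda * dF) plus noise, then recover lambda as trend(T) / trend(F).
rng = np.random.default_rng(0)
years = np.arange(1900, 2001)

true_lambda = 0.5                      # C per W/m^2 (illustrative)
forcing = 0.02 * (years - 1900) + 0.1 * rng.standard_normal(years.size)
temperature = true_lambda * forcing + 0.05 * rng.standard_normal(years.size)

# Least-squares linear trends of each series (per year).
trend_T = np.polyfit(years, temperature, 1)[0]
trend_F = np.polyfit(years, forcing, 1)[0]

print(f"trend ratio = {trend_T / trend_F:.3f}  (true lambda = {true_lambda})")
```

The recovered ratio lands on the built-in lambda: once the forcing series and the target temperature series are chosen, the sensitivity is fixed before any model is run.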

This completely explains why, after years of better and better computer models, the models are able to hindcast the past in more detail and complexity … but they still don’t agree any better about the climate sensitivity.

The reason is that the climate sensitivity has nothing to do with the models, and everything to do with the trends of the inputs to the models (forcings) and outputs of the models (emulations of historical temperatures).

So to summarize, as Dr. Kiehl suspected, the variations in the climate sensitivity as reported by the models are due entirely to the differences in the trends of the forcings used by the various models as compared to the trends of their outputs.

Given all of that, I actually laughed out loud when I was perusing the latest United Nations Inter-Governmental Panel on Climate Change’s farrago of science, non-science, anti-science, and pseudo-science called the Fifth Assessment Report (AR5). Bear in mind that as the name implies, this is from a panel of governments, not a panel of scientists:

The model spread in equilibrium climate sensitivity ranges from 2.1°C to 4.7°C and is very similar to the assessment in the AR4. There is very high confidence that the primary factor contributing to the spread in equilibrium climate sensitivity continues to be the cloud feedback. This applies to both the modern climate and the last glacial maximum.

I laughed because crying is too depressing … they truly, truly don’t understand what they are doing. How can they have “very high confidence” (95%) that the cause is “cloud feedback”, when they admit they don’t even understand the effects of the clouds? Here’s what they say about the observations of clouds and their effects, much less the models of those observations:

• Substantial ambiguity and therefore low confidence remains in the observations of global-scale cloud variability and trends. {2.5.7}

• There is low confidence in an observed global-scale trend in drought or dryness (lack of rainfall), due to lack of direct observations, methodological uncertainties and choice and geographical inconsistencies in the trends. {2.6.2}

• There is low confidence that any reported long-term (centennial) changes in tropical cyclone characteristics are robust, after accounting for past changes in observing capabilities. {2.6.3}

I’ll tell you, I have “very low” confidence in their analysis of the confidence levels throughout the documents …

But in any case, no, dear Inter-Governmental folks, the spread in model sensitivity is not due to the admittedly poorly modeled effects of the clouds. In fact it has nothing to do with any of the inner workings of the models. Climate sensitivity is a function of the choice of forcings and desired output (historical temperature dataset), and not a lot else.

Given that level of lack of understanding on the part of the Inter-Governments, it’s gonna be a long uphill fight … but I got nothing better to do.

w.

PS—me, I think the whole concept of “climate sensitivity” is meaningless in the context of a naturally thermoregulated system such as the climate. In such a system, an increase in one area is counteracted by a decrease in another area or time frame.  See my posts It’s Not About Feedback and Emergent Climate Phenomena for a discussion of these issues.

Admin
October 2, 2013 3:47 am

Willis,
Throw in 50 or so random oscillators that create an envelope of noise and you cover regional variation, precipitation, sea surface, troposphere, etc. as well.
AND….
It will be more sciency!

October 2, 2013 4:51 am

I’d like to start a climate model a thousand years ago, or better yet before the last ice age, and watch as it just runs wildly around, missing paleoclimatology history at every turn. You’d think they’d be excited to do this and prove their long-term forecasts. The idea that these toys can simulate the earth’s climate is so absurd.

thingadonta
October 2, 2013 4:59 am

Regarding the idea that the climate models simulate past temperatures ‘with a reasonable degree of accuracy’, didn’t someone mention the related idea somewhere that correlation is not causation? I read somewhere that wheat prices correlate with global warming, but nobody suggests wheat supply/demand is a primary cause of global warming.
Or am I just missing something?

October 2, 2013 5:02 am

Tucci78 says:
October 1, 2013 at 10:23 pm
“It simply works like so:
First, draw your curve. Then plot your points.
Learned that one in high school, I did.”
For a grade 7 project, one of my sons built a maze, drew a learning curve and then borrowed a pet rabbit from a friend because he had to take his whole project to school. Unfortunately, the rabbit was way too big for the maze. My son decided to take a sick day off, and he paid his friend 50 cents to take the “project” to school on his behalf and to explain that the rabbit had grown since completing the experiment. His friend got a detention for bringing a live animal to school, which was listed as one of the no-nos for the project. My son agreed to pay his friend another 50 cents. I, of course, learned all this much later. My son got an ‘A’ for his bogus learning curve because it pretty much matched the psych 101 stuff that school teachers take. He would have been a star on the IPCC team.

October 2, 2013 5:09 am

Re volcanic aerosols, are they uniform across the globe? Wouldn’t the trade winds tend to block out these aerosols somewhat, at least along the ITCZ? How do, say, southern hemisphere aerosols cross the ITCZ?

October 2, 2013 5:13 am

Let me try that last comment again. I tried the LaTeX before submitting it, and it worked then. Maybe it’ll work this time:
I never understood the previous post, and I don’t understand this one. The emphasis on trend seems a trivial corollary of what you’d done before.
You’d already established that the following is a pretty accurate black-box representation of model behavior:
\frac{dT}{dt} = \frac{\lambda}{\tau}F - \frac{1}{\tau}T
So you know that the temperature response to a forcing having a trend from time t=0, i.e., to F = rt, is
T = \lambda r \left[ t - \tau(1 - e^{-t/\tau}) \right]
That is, the rate of change of temperature is
r\lambda(1 - e^{-t/\tau}).
If you ignore the transient response, then the ratio of the temperature trend, \lambda r, to the forcing trend, r, is the transient climate sensitivity \lambda.
This result does not seem startling in light of what you’d established previously.
What am I missing?
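The one-box dynamics in this comment can be checked numerically. A sketch with illustrative parameter values: integrate the differential equation directly, compare against the closed-form solution quoted above, and confirm that the late-time trend ratio recovers lambda:

```python
import numpy as np

# Numerical check of the comment's one-box result: integrate
#   dT/dt = (lambda/tau) * F - (1/tau) * T   with ramp forcing F = r*t,
# and compare the late-time temperature trend to lambda * r.
lam, tau, r = 0.5, 5.0, 0.02   # sensitivity, time constant (yr), forcing trend
dt = 0.01
t = np.arange(0.0, 200.0, dt)

T = np.zeros_like(t)
for i in range(1, t.size):                 # simple explicit Euler integration
    F = r * t[i - 1]
    T[i] = T[i - 1] + dt * ((lam / tau) * F - T[i - 1] / tau)

# Analytic solution from the comment: T = lambda*r*(t - tau*(1 - exp(-t/tau)))
T_exact = lam * r * (t - tau * (1.0 - np.exp(-t / tau)))
print("max error vs analytic:", np.max(np.abs(T - T_exact)))

# Once the transient has died out, the temperature trend is lambda*r,
# so trend(T)/trend(F) recovers lambda:
late = t > 50
trend_T = np.polyfit(t[late], T[late], 1)[0]
print("trend ratio:", trend_T / r)
```

The integration agrees with the analytic solution, and the trend ratio converges on lambda once the exponential transient has decayed, consistent with Joe Born's point.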

October 2, 2013 5:30 am

“So we see that once we’ve decided what forcing dataset the model will use, and decided what historical dataset the output is supposed to match, at that point the climate sensitivity is baked in. “
So if I understand it well, then basically: suppose in reality there is no climate sensitivity (say, to CO2) whatsoever, but there is still a trend in the data to be emulated by the model, due to some other unconsidered factor(s). The (CO2) climate sensitivity is then invented, and its function adjusted to fit the trends of the emulated data, standing in for the unconsidered factor(s) in the model. The resulting emulation works as a hindcast, but proves to be without predictive value for the further development of the real data, exactly because the unconsidered factor(s) behave differently than the invented CO2 forcing function?

Gail Combs
October 2, 2013 6:19 am

Richard111 says: October 1, 2013 at 11:02 pm
… Any tutorials on this subject for baffled laymen?
>>>>>>>>>>>>>>>>>>
You might try John Kehr’s posts. John “is a Chemical Engineer by schooling and Research and Development Process Engineer by profession.”
The Earth’s Energy Balance: Simple Overview
The Energy Balance and the Greenhouse Effect: 1 of 3
The Energy Balance and the Greenhouse Effect: 2 of 3
The Energy Balance and the Greenhouse Effect: 3 of 3
Determining the Correct Climate Sensitivity
Heat Transfer and the Earth’s Surface
What would the temperature of the Earth be without CO2 in the Atmosphere?
Temperature Dependence of the Earth’s Outgoing Energy
And finally KevinK’s Comment here on WUWT

October 2, 2013 6:28 am

The climate models are rather like the following.
Assume a spherical cow, feed the cow peer-reviewed “scientific” articles. Calibrate the simulation using selected data from the real world that has been adjusted to fit the model of the spherical cow. Then compute how much butter can be produced from the milk the cow produces. Force everyone on earth to use the resultant butter on their morning toast. Finally, label anyone who objects to being forced to use the non-existent butter on his non-existent morning toast a “denier”.
What could go wrong?
All the so-called climate modelers have done is hide a tangle of circular reasoning inside a complex web of undefined and undisclosed computer code. Assuming what you are trying to prove is always wrong! From that point on, everything will go wrong, no matter how well you have adjusted your training data set to the assumed behavior of your spherical cow.
Never fear, you are wrong too because the modelers have called you a “denier”. As everyone “knows” to name a thing something changes the thing to mean the name. Oh wait. That went wrong too.
The real question is, did they do anything right? If so, how will we know it? Clearly not by asking the modelers.

Editor
October 2, 2013 6:32 am

Willis wrote:

How can they have “very high confidence” (95%) that the cause is “cloud feedback”, when they admit they don’t even understand the effects of the clouds?

Candidate for Quote of the Week.

Greg Goodman
October 2, 2013 6:39 am

Richard, “….and did not say you were annoyed by little people.” LOL 😉

Rob Potter
October 2, 2013 6:45 am

I am sure that i am just repeating what Willis has said here, but since I am not as good at the maths I will throw in my 2 cents anyway.
The lack of warming for the past 15/16/17 years is finally being recognized, and at the same time we have estimates of sensitivity to a doubling of CO2 coming down from 3 (+++) to 1.5 (—). It seems to me that all people are doing is continuing to assume that all warming is because of CO2, and since there is less warming, the sensitivity to CO2 (and equivalent greenhouse gases) is lower.
This sounds way too simplistic, but I can’t get it out of my head. Nowhere in here is there room for any other “forcing” than CO2, and the lack of warming means simply that we got the sensitivity wrong. If the models had any other variable forcing, then you would not need to explain everything by the CO2 sensitivity, and so all of this is based on the one single assumption. Yes, I know the physics of IR absorption/emission, but to make CO2 the only variable forcing is how we got into this mess in the first place.

David L. Hagen
October 2, 2013 6:47 am

thingadonta
Wheat prices increase with scarcity and thus with bad weather, war, or government mismanagement. Bad weather may be caused by solar impacts on clouds etc. e.g., see papers by Pustil’nik and Din
On Possible Influence of Space Weather on Agricultural Markets: Necessary Conditions and Probable Scenarios, L. Pustil’nik & G. Yom Din
Influence of Solar Activity on the State of Wheat Market in Medieval England L. Pustil’nik & G. Yom Din
Conversely, see:
Jeffrey Love On the insignificance of Herschel’s sunspot correlation

kadaka (KD Knoebel)
October 2, 2013 6:47 am

Emergency note, the internet is broken.
Wordpress is barely loading with my dial-up connection, I’m not getting full pages. Starting at around 3AM EDT, aka midnight for WordPress and Left Coast times, it went bad. WP and several other sites are only loading in up to ten second spurts, followed by a minute of nothing. Google and Drudge Report load fine.
While loading, the browser flashes from where it’s fetching data. A hangup appears to be akamai-dot-net. For those wondering what Akamai is, it’s a mirror cache service. Rather than directly loading everything for a page of a company’s or group’s servers, content is sent from Akamai’s caches instead, reducing the need for directly hosted bandwidth.
MANY commercial sites use Akamai. When it goes down, much of the internet is broken.
I’ve reconnected several times. Upgraded browser, at usual dial-up blazing speed of about 4.4 KB/s. Switched to different computer, and to its WinXP partition while using a different modem. Same thing, internet still broken.
(Government “shuts down”, the “non-essential” work ceases, and when a NSA internet content monitoring operation would need a human inputting the next-day start-up signal to keep running, a large internet chunk shuts down instead. Curious.)
Some sites like Breitbart-dot-com, no friend of the current occupier of the White House, are inaccessible, taking too long to respond.
I’m dropping this comment here as this page is loaded enough to post a comment, and I effectively won’t be able to reply to comments on my comments here until WordPress is working again.

Gail Combs
October 2, 2013 6:51 am

Gary Pearse says: October 2, 2013 at 5:02 am
…. My son got an ‘A’ for his bogus learning curve because it pretty much matched the psych 101 stuff that school teachers take. He would have been a star on the IPCC team.
>>>>>>>>>>>>>>>>>>
He also would be smart enough to desert the sinking CAGW ship at this time, as some of the “Team” seem to be tiptoeing out. More HERE from last year.
(A recent commenter at WUWT noted the controversial IPCC graph had the name Stott attached.)

MikeN
October 2, 2013 6:51 am

Willis, they are not saying that the clouds are the cause of the warming, but rather the uncertainty over the clouds is the reason for the spread in the different models. Presumably that would also cause a wide variation in forcing estimates.

Greg Goodman
October 2, 2013 6:54 am

Joe Born : “If you ignore the transient response, that is, the ratio of the temperature trend, lambda r, to the forcing trend, r, is the transient climate sensitivity lambda”
Not only is that simplification ignoring the transient, it is assuming zero d2T/dt2 as per your first equation. Is that a reasonable assumption? Having looked at the temperature data from many sources in some detail, for both dT/dt and d2T/dt2, there is a definite long-term acceleration over the last century and a half of useful data.
When I say acceleration, that is from negative slope to positive slope, which may include variations other than AGW.

Gail Combs
October 2, 2013 6:55 am

I noted some of my comments in the last few days have completely disappeared without a ‘Snip’ to indicate they have been received.

beng
October 2, 2013 6:55 am

***
kadaka (KD Knoebel) says:
October 1, 2013 at 10:08 pm
Oh wait, I don’t think he’s on the clock yet. Will we get a senior obfuscator like KR, or that trainee, Jai Mitchell?
***
Yoda:
“Always there is, master and apprentice.”

Greg Goodman
October 2, 2013 7:04 am

Rob Potter says: It seems to me that all people are doing is continuing to assume that all warming is because of CO2 and since there is less warming, the sensitivity to CO2 (and equivalent greenhouse gases) is lower.
They are desperately trying to patch up a sinking ship.
There are clearly many cardinals who are not happy with the current doctrine, but for the moment the conclave that controls the canon texts that get included in the bible is still attached to the political power it derives from CAGW.

Bill Illis
October 2, 2013 7:05 am

I note the feedbacks are a critical component of the theory. If the water vapor, lapse rate and cloud feedbacks are higher than is assumed, we have runaway global warming. If they are less, say in the 50% range of that assumed, we have just 1.5C per doubling.
So far, water vapor is coming in much lower than expected, clouds are completely unknown in reality.
http://s22.postimg.org/y64l6twpd/Global_Warming_Feedback_Strength.png
The latest IPCC AR5 report did not really change any of the values; we are still at around 2.3 W/m2/K in total feedbacks. Very little evidence was presented that did not use the ENSO impacts to hype the values.

October 2, 2013 7:14 am

Greg Goodman: “Not only is that simplification ignoring the transient , it is assuming zero d2T/dt2 as per your first equation. Is that a reasonable assumption?”
If you’re asking me whether it’s reasonable to assume a non-zero second time derivative of temperature, the answer in the real world is of course no. But in this parallel universe of a first-order linear system that the models in effect unknowingly simulate, there’s no such assumption; first-order systems don’t care about second derivatives.
But I’m not sure that this is relevant to my question, which is what it is Mr. Eschenbach is saying about trend ratio. I think he means something less trivial than I’m understanding him to say, but I haven’t been able to figure out what it is he does mean.

kadaka (KD Knoebel)
October 2, 2013 7:31 am

Gail Combs said on October 2, 2013 at 6:55 am:

I noted some of my comments in the last few days have completely disappeared without a ‘Snip’ to indicate they have been received.

You’re an anti-government cons & piracy nutjob with access to agricultural-grade ammonium nitrate and other weaponry. Maybe one of the “non-essential government personnel” did the quick scan and pass-along for your monitored group so they weren’t there to authorize the flagged remarks.
(Internet still broken, I waited 20 minutes for the page to mostly reload.)

Theo Goodwin
October 2, 2013 8:23 am

More brilliant work from Willis Eschenbach. This essay and your other essays that you link above constitute a brilliant analysis of the follies found in Alarmist modelers’ assumptions. Your version of the paradox is reason enough to junk the climate models.

Theo Goodwin
October 2, 2013 8:25 am

Gary Pearse says:
October 2, 2013 at 5:09 am
An empirical question. IPCC no do empirical.