Guest post by Chip Knappenberger,
republished with permission from Master Resource (now on WUWT’s blogroll)

“A collection of research results has been published in the peer-reviewed scientific literature in recent months that buoys my hopes for a low-end climate sensitivity.”
One of the key pieces to the anthropogenic climate/environment change puzzle is the magnitude of the earth’s climate sensitivity—generally defined as the global average temperature change resulting from a doubling of the atmospheric concentration of carbon dioxide (CO2).
One of the reasons that the “climate change” issue is so contentious is that our understanding of climate sensitivity is still rather incomplete. But new research efforts are beginning to provide evidence suggesting that the current estimates of the climate sensitivity should be better constrained and adjusted downwards. Such results help bolster the case being made by “lukewarmers”—that climate change from anthropogenic fossil-fuel use will be moderate rather than extreme, and that an adaptive response may be more effective than attempts at mitigation.
In its Fourth Assessment Report (AR4), released in 2007, the Intergovernmental Panel on Climate Change (IPCC) provided this general guidance on the climate sensitivity:
[The equilibrium climate sensitivity] is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values.
In IPCC parlance, “likely” means an expertly assessed likelihood of an outcome or result with greater than a 66% chance of occurrence. “Very unlikely” means less than a 10% chance of occurrence.
Visually, the IPCC’s assessment of the climate sensitivity, based on its interpretation of the extant literature at the time, is shown in Figure 1. The IPCC routinely includes studies which conclude that there is greater than a 10% possibility that the true climate sensitivity exceeds 6°C, and some which find that there is greater than a 5% possibility that it exceeds 10°C.
Fig 1. Climate sensitivity distributions retained (and in some cases recast) by the IPCC from their assessment of the literature. Note that the distributions fall off much more slowly towards the right, which indicates that the IPCC considers the possibilities of the climate sensitivity having a very large positive value (that is, a large degree of global temperature rise for a doubling of the atmospheric carbon dioxide concentration) to be not inconsequential (source: IPCC AR4).
If the true value of the climate sensitivity does turn out to exceed 6°C, then we will be in for what will probably turn out to be fairly disruptive climate change. Heck, even if the climate sensitivity lies much above 4.5°C, coming climate change will be substantial. I, for one, would hope that it lies below 3°C, and actually turns out to be closer to 2°C.
A collection of research results has been published in the peer-reviewed scientific literature in recent months that buoys my hopes for a low-end climate sensitivity. Here are some salient quotes.
From “Climate Sensitivity Estimated from Temperature Reconstructions of the Last Glacial Maximum,” by Andreas Schmittner et al, 2011:
Assessing impacts of future anthropogenic carbon emissions is currently impeded by uncertainties in our knowledge of equilibrium climate sensitivity to atmospheric carbon dioxide doubling. Previous studies suggest 3 K as best estimate, 2–4.5 K as the 66% probability range, and non-zero probabilities for much higher values, the latter implying a small but significant chance of high-impact climate changes that would be difficult to avoid. Here, combining extensive sea and land surface temperature reconstructions from the Last Glacial Maximum with climate model simulations, we estimate a lower median (2.3 K) and reduced uncertainty (1.7–2.6 K 66% probability). Assuming paleoclimatic constraints apply to the future as predicted by our model, these results imply lower probability of imminent extreme climatic change than previously thought.
From “Bayesian estimation of climate sensitivity based on a simple climate model fitted to observations of hemispheric temperatures and global ocean heat content,” by Magne Aldrin et al., 2012:
The [climate sensitivity] mean is 2.0°C… which is lower than the IPCC estimate from the IPCC Fourth Assessment Report (IPCC, 2007), but this estimate increases if an extra forcing component is added, see the following text. The 95% credible interval (CI) ranges from 1.1°C to 4.3°C, whereas the 90% CI ranges from 1.2°C to 3.5°C.
From “A climate sensitivity estimate using Bayesian fusion of instrumental observations and an Earth System model,” by Roman Olson et al., 2012:
Current climate model projections are uncertain. This uncertainty is partly driven by the uncertainty in key model parameters such as climate sensitivity (CS)…The mode of [our] climate sensitivity estimate is 2.8°C, with the corresponding 95% credible interval ranging from 1.8 to 4.9°C.
The above papers examined the “equilibrium climate sensitivity”—that is, the global temperature change that results when all climate systems reach equilibrium with the changes in climate forcing from a doubling of the atmospheric carbon dioxide content. The time it takes to reach equilibrium depends largely on the response of the oceans (and how quickly heat is distributed within them) and is not known with much certainty. Estimates of the time to reach equilibrium run from decades to centuries. Thus, the equilibrium climate sensitivity may not be the best measure of how much temperature (and related) change may occur over the nearer term, say, over the remainder of the 21st century.
A better estimate of that change is the “transient climate response”, or the amount of global temperature change that is manifest at the actual time that the atmospheric carbon dioxide is doubled (rather than waiting for the system to reach complete equilibrium). The transient climate response (TCR) is somewhat less than the equilibrium climate sensitivity.
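As a rough illustration of the gap between the two measures, here is a minimal one-box energy-balance sketch. All parameter values below are illustrative assumptions on my part, not numbers from the papers discussed:

```python
# A minimal one-box energy-balance model: CO2 rises 1% per year, so
# radiative forcing grows roughly linearly in time (forcing is
# logarithmic in CO2) and CO2 doubles after ~70 years. All parameter
# values are illustrative assumptions, not results from the literature.

F2X = 3.7            # forcing from doubled CO2, W/m^2 (standard approximation)
ECS = 3.0            # assumed equilibrium climate sensitivity, K
LAMBDA = F2X / ECS   # implied feedback parameter, W/m^2/K
C = 30.0             # assumed effective ocean heat capacity, W*yr/m^2/K

def transient_response(years_to_double=70, dt=0.01):
    """Euler-integrate C*dT/dt = F(t) - LAMBDA*T; return T at doubling."""
    T = 0.0
    for i in range(int(years_to_double / dt)):
        F = F2X * (i * dt) / years_to_double  # linear forcing ramp
        T += dt * (F - LAMBDA * T) / C
    return T

tcr = transient_response()
print(f"assumed ECS = {ECS:.1f} K, simulated TCR = {tcr:.2f} K")
```

With these assumed numbers the transient response comes out at roughly two-thirds of the equilibrium value, because the ocean is still taking up heat at the moment of doubling.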
Two recent papers examined the transient climate response. Again, here are salient quotes.
From “Improved constraints on 21st-century warming derived using 160 years of temperature observations,” by Nathan Gillett et al., 2012:
Our analysis also leads to a relatively low and tightly-constrained estimate of Transient Climate Response of 1.3–1.8°C, and relatively low projections of 21st-century warming… which is towards the lower end of the observationally constrained range assessed by [the IPCC AR4].
From “Probabilistic estimates of transient climate sensitivity subject to uncertainty in forcing and natural variability,” by Lauren Padilla et al., 2011:
For uncertainty assumptions best supported by global surface temperature data up to the present time, this paper finds a most likely present-day estimate of the transient climate sensitivity to be 1.6 K, with 90% confidence the response will fall between 1.3 and 2.6 K…
Now, by no means am I suggesting either that 1) the quotes above reflect all the intricacies of the respective papers, or 2) that these results are the end-all and be-all on the topic. Neither, in fact, is true.
But, the excerpts above do reflect the general conclusion of each paper, as well as what makes them noteworthy. In fact, the IPCC in its Fifth Assessment Report (which is now under construction) will be terribly remiss (and misleading) if they present a Figure that looks anything like Figure 1 (above) from their Fourth Assessment Report.
In the intervening years, there has been substantial research into the probability distribution containing the earth’s equilibrium climate sensitivity. The emerging bulk of evidence suggests that the IPCC’s “likely” range for the equilibrium climate sensitivity is much too large, and that the possibility that the equilibrium climate sensitivity lies above 6°C is vanishingly small, if not entirely ruled out. Even the chance that it exceeds 4.5°C has been markedly reduced, to no more than about 5% (if not less).
And when it comes to the “best estimate” of the “most likely” value of both the equilibrium climate sensitivity as well as the transient climate response, it is refreshing and encouraging to see new results from different research groups pointing to a lower number than that forwarded by the IPCC in its AR4.
It seems that as we obtain more knowledge and understanding of reality, the specter of alarming climate change is driven further into the world of make-believe.
References:
Aldrin, M., et al., 2012. Bayesian estimation of climate sensitivity based on a simple climate model fitted to observations of hemispheric temperatures and global ocean heat content. Environmetrics, doi:10.1002/env.2140.
Gillett, N.P., et al., 2012. Improved constraints on 21st-century warming derived using 160 years of temperature observations. Geophysical Research Letters, 39, L01704, doi:10.1029/2011GL050226.
Olson, R., et al., 2012. A climate sensitivity estimate using Bayesian fusion of instrumental observations and an Earth System model. Journal of Geophysical Research, 117, D04101, doi:10.1029/2011JD016620.
Padilla, L. E., G. K. Vallis, and C. W. Rowley, 2011. Probabilistic estimates of transient climate sensitivity subject to uncertainty in forcing and natural variability. Journal of Climate, 24, 5521-5537, doi:10.1175/2011JCLI3989.1.
Schmittner, A., et al., 2011. Climate sensitivity estimated from temperature reconstructions of the Last Glacial Maximum. Science, 334, 1385-1388, doi:10.1126/science.1203513.

I still want someone to address the fact that all of the gases in the atmosphere qualify as greenhouse gases, per the definition of being IR active and “heat-trapping” (of course, this is a stupid definition, as it defies thermodynamics, but hey, that’s what the “climate scientists” make it).
If all of the gases interact with IR, and CO2 has one of the lesser IR spectra, the atmospheric interaction with IR becomes a constant and, thus, it is no surprise that they have trouble measuring sensitivity.
The satellite records of the IR flux show that the IR efflux has not changed in 40 years and may have even increased a little bit, which would make sense as CO2 replaces a bit of water vapor and makes the atmosphere LESS IR interactive.
When there is a small change in a bit player (CO2) and all the other gases are interactive, how can the change be detected? It is not.
Considering ‘adjustments’ to historical data records that exacerbate perceived warming, estimates of ‘forcing’ from the same can only be exaggerated.
The “no feedback” climate sensitivity is estimated as being about 1.2 C for a doubling of CO2, using the assumption that the “structure of the atmosphere does not change”. In other words, the estimate is made by looking ONLY at radiation effects. This assumption has never been justified, and is almost certainly wrong. The lapse rate almost certainly changes as GHGs are added to the atmosphere.
Since this no-feedback climate sensitivity is the cornerstone of all IPCC estimations, it follows that all these estimations have no basis in physics.
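For context, the ~1.2 C figure referred to above comes from a standard back-of-the-envelope calculation; here is a hedged sketch of it (my own arithmetic: the 255 K effective emission temperature and the 3.2 W/m²/K model-mean Planck feedback are conventional assumed values, not anything from this thread):

```python
import math

# Standard no-feedback arithmetic (illustrative): divide the doubled-CO2
# forcing by either the Stefan-Boltzmann derivative at the effective
# emission temperature, or the conventional model-mean Planck feedback.

SIGMA = 5.670e-8                  # Stefan-Boltzmann constant, W/m^2/K^4
T_EFF = 255.0                     # assumed effective emission temperature, K
DF_2XCO2 = 5.35 * math.log(2.0)   # ~3.7 W/m^2, standard logarithmic fit

lambda_sb = 4 * SIGMA * T_EFF ** 3   # ~3.8 W/m^2/K from Stefan-Boltzmann
lambda_planck = 3.2                  # assumed model-mean Planck feedback

dT_sb = DF_2XCO2 / lambda_sb         # ~1.0 K
dT_planck = DF_2XCO2 / lambda_planck # ~1.2 K
print(f"no-feedback warming: {dT_sb:.2f} K (S-B) to {dT_planck:.2f} K (Planck)")
```

Whether the fixed-structure assumption baked into this arithmetic holds is exactly the point in dispute above; the sketch only shows where the number comes from.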
Gah. When I hear “66%” described as being “likely” I want to puke. That is so far below any scientific standard that it is indistinguishable from coin-flipping. And the incestuous “expert opinion” process that generated it is EXACTLY the kind of nonsense that high-sigma falsification standards were designed to avoid.
And 90% is no better. Really. Even in the mushy social pseudo-sciences, where I took my degree, that’s looked on as very weak tea. In physics, it’s way down in speculation land.
As “what about Bob” would say – “baby steps”. At least we are moving in the right direction. GK
Good post thanks…
One more nail in the coffin of alarmist thinking.
There are 2 problematic assumptions in this article:
1. CO2 is bad. The 9 to 10 billion population in 2100 is going to need much more food grown on much less land, and higher CO2 is one of the easier solutions.
2. CO2 rise will continue unabated. Doubled CO2 means 50% more plant growth – if we stop chopping down rainforest (and other land use changes) it will peak in the 400’s (ppm); if not, it will peak in the 500’s. It is really difficult to drastically increase the rate of fossil fuel consumption. Over two-thirds of the CO2 rise is due to land-use changes and other non-fossil fuel sources.
We have 40% of a “doubling” of CO2 already in the bank. Given the logarithmic nature of CO2 forcing, that implies that about half of the effect of “doubling” is already in the bank. That being the case:
1. If sensitivity was high, we would have already seen major temperature increases.
2. We have not seen major temperature increases.
Conclusion: Sensitivity is low.
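The logarithm behind the “in the bank” argument is easy to check; a quick sketch (the 280 ppm pre-industrial baseline and 392 ppm current concentration are my assumed round numbers):

```python
import math

# With forcing approximately logarithmic in CO2 concentration, compare
# the fraction of the concentration rise with the fraction of the
# doubled-CO2 forcing already realized. Both ppm values are assumptions.

C0, C_NOW = 280.0, 392.0   # assumed pre-industrial and current ppm

fraction_of_rise = C_NOW / C0 - 1.0                         # 0.40
fraction_of_forcing = math.log(C_NOW / C0) / math.log(2.0)  # ~0.49
print(f"{fraction_of_rise:.0%} of the concentration doubling,")
print(f"{fraction_of_forcing:.0%} of the doubled-CO2 forcing, in the bank")
```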
Good things are happening, we need more of such research. Still a bit more than what I’d have hoped for but definitely not scary anymore.
What’s the climate sensitivity to CO2 if there is glaciation down past Wisconsin… again?
If there is about an 800 year lag between changes in temperature and changes in CO2, then CO2 concentration would continue to climb while people in the upper continental US and Canada are hightailing it to Mexico on their snowmobiles, eh?
I blame the Soviets for inflating their temperatures in the wake of the Czechoslovakia invasion:
See graphs 2 and 3 here:
http://www.vukcevic.talktalk.net/GT-AMO.htm
Let me add to my previous comment that there is no CO2 “signal” in the global temperature/time graph which is discernible above the background noise, which indicates that the climate sensitivity of CO2, when added from current levels, is indistinguishable from zero.
“Sensitivity” may be present in some imaginary equation, but it’s not present in the ACTUAL DATA. Therefore the setting of this parameter is utterly irrelevant. Might as well cheer that the Phlogiston Hamnicity seems to be lower than the Orgone Bassimation.
Poriwoggu says:
1. CO2 is bad. The 9 to 10 billion population in 2100 is going to need much more food grown on much less land, and higher CO2 is one of the easier solutions.
In addition to that, if and when this interglacial ends, we are going to need elevated CO2 even more. During the most recent glaciations, CO2 levels fell to plant-starving levels. Now AGW adherents tell us that there will not be any new glaciation any time soon because of the elevated CO2. To me, that sounds like elevated CO2 levels is the perfect insurance against future food catastrophes: Either elevated CO2 stops the next glaciation, or if it doesn’t, it helps plants through it…
It’s less worse than we thought.
There are 2 problematic assumptions in this article:
1. CO2 is bad. The 9 to 10 billion population in 2100 is going to need much more food grown on much less land, and higher CO2 is one of the easier solutions.
2. CO2 rise will continue unabated. Doubled CO2 means 50% more plant growth – if we stop chopping down rainforest (and other land use changes) it will peak in the 400’s (ppm); if not, it will peak in the 500’s. It is really difficult to drastically increase the rate of fossil fuel consumption. Over two-thirds of the CO2 rise is due to land-use changes and other non-fossil fuel sources.
You know, people on this list say things like this, people on the CAGW side say the opposite, and nobody ever seems to give references. When I sought out references on higher CO_2 and plant growth, I discovered that (not unreasonably) the effect is rather marginal; plant growth is rate limited by many things, and doubling CO_2 isn’t going to e.g. double agricultural productivity. Furthermore, with modern farming there really isn’t any difficulty growing enough food for 9 to 10 billion people. The problem has always been that we do not use modern farming methods worldwide, and most world farming is constrained by a lack of energy and modern tools. Providing tractors and diesel and electricity and sowers and harvesters will have a far greater impact on productivity than the CO_2.
But I’m really interested in the latter claim. Over 2/3 of the CO_2 rise is due to land use changes and not fossil fuels? And you know this how, or from what study? I’m not asking in a critical way — I’m asking because I genuinely do not know the answer. Some fraction of our CO_2 anomaly comes from the warming of the ocean. Some of it comes from burning fossil fuels, where we can certainly tally up fuel used every year and CO_2 produced from the given consumption of fuel. Land use changes actually seem IMplausible as a significant factor to me offhand — a distant third. Things are complicated by the fact that the ocean is a dynamic source and sink, by the fact that volcanoes release CO_2, by the fact that e.g. unburned methane from e.g. the Gulf Oil Spill oxidizes to CO_2 (and water) and much more.
So please, if anybody has a really plausible, well-documented source for the currently enormously variable claims concerning anthropogenic and other CO_2, I’d love to see them.
rgb
Ever wonder why there are no zero or negative sensitivities shown on the plot? This is an example of confirmation bias…projecting our assumptions into the data. For every year there was a downtick of global temperature when CO2 rose, the plot should show correlation to an inverse relationship. It’s in the data, but it’s not in the plot. Interesting, eh?
The Fictive World of Rajendra Pachauri
Tony Thomas
http://www.quadrant.org.au/magazine/issue/2012/3/the-fictive-world-of-rajendra-pachauri
In 2005 the chair of the Intergovernmental Panel on Climate Change (IPCC), Rajendra Pachauri, flew to a Washington conference. He spent ninety minutes getting through the airport formalities. A chauffeur-driven car had been waiting outside, with the engine and air-conditioner running so that Pachauri would have a cool car to step into. Pachauri was indignant. “My God! Why did you do that?” he rebuked the driver. “You probably had the engine on for two hours. Was that really required?” He told North Carolina legislators three years later: “So that’s the kind of change in lifestyle that I’m talking about … which when put together will really make an enormous difference.”
—————————————–
I began this profile with Pachauri’s address to the North Carolina legislators and will end with it. First, he wrongly thought he was “off the record”. Second, he clearly set out to scare the legislators witless, with such forecasts as a 90 per cent drop in African crop harvests by 2100, along with security and peace issues with Africans heading out of Africa literally for greener pastures. To avert such perils, including “several metres” of sea-level rise, he said that the world need only give up one year’s GDP growth (3 per cent) by 2030, or 5.5 per cent of world GDP if we rashly delayed the emission targets to 2050.
Pachauri climaxed his address with a quote (probably taken from Al Gore’s book Earth in Balance) by native American Chief Seattle in 1854: “Man did not weave the web of life, he is merely a strand in it. Whatever he does to the web, he does to himself.” My b.s. detector beeped and sure enough, Chief Seattle’s eco-poetry is, in polite academic-speak, “inauthentic … an evolving work of fiction”.
I accept unreservedly that an increase in CO2 increases the amount of downward radiation and potentially increases global temperatures. I also accept unreservedly that there is the potential for such changes to induce positive feedback effects that increase temperatures further. However I have seen nothing that proves that these changes do not induce negative effects such as increased radiation into space and increased cloud albedo. Indeed, if I look at the paleo data with my control engineering hat on I would say that the data looks more like negative feedback than positive. So why are all the likely values for the sensitivity positive?
Thanks Chip Knappenberger for a good posting, but IMHO you do not go far enough in reducing the most probable climate sensitivity, which I think is closer to 1°C than the IPCC mean of 3°C.
You mention the excellent Schmittner [2011] paper, which I discussed in my WUWT posting CO2 Sensitivity is Multi-Modal – All bets are off.
The IPCC graph displayed at the head of your posting extends all the way out to sensitivities of 10°C, or higher. Schmittner proves to my satisfaction that any values greater than about 6°C would retrodict a “total snowball Earth” at the Last Glacial Maximum, which contradicts clear evidence that the ice sheets did not extend equatorward beyond the middle of the USA or corresponding latitudes in Europe, Asia, South America, or Africa. Indeed, the tails of the IPCC graphs that extend beyond 5°C (or perhaps even 4°C) should approach zero probability.
The papers cited above average out somewhere between 1.5°C and 2°C, which is closer to the truth than the IPCC 3°C. I predict that further study will show the actual truth is 1°C or less, possibly as low as 0.5°C or even 0.25°C, which would be good news indeed.
Ira
good one, Polistra:
” Phlogiston Hamnicity seems to be lower than the Orgone Bassimation.”
that is a perspicacious summation.
Brian H says (March 19, 2012 at 7:42 am)
“Gah. When I hear “66%” described as being “likely” I want to puke. That is so far below any scientific standard that it is indistinguishable from coin-flipping.”
Quite right! In particle physics, five sigma (repeatable) is required before you can say you have a valid signal in the noise. That’s about 20 coin tosses in a row.
To make things worse, it’s GISS-adjusted noise. Puke away!
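The coin-flipping comparison can be made precise; a quick sketch (my own calculation, equating the chance of being wrong at each confidence level with a run of consecutive heads):

```python
import math

# How many fair-coin heads in a row have the same probability as being
# wrong at each confidence level? For five sigma, use the two-sided
# Gaussian tail probability via the complementary error function.

def coin_tosses(p_wrong):
    """Number of consecutive heads whose probability equals p_wrong."""
    return math.log(1.0 / p_wrong, 2)

p_five_sigma = math.erfc(5.0 / math.sqrt(2.0))   # ~5.7e-7, two-sided tail

print(f"66% ('likely'):    {coin_tosses(1 - 0.66):4.1f} tosses")
print(f"90% confidence:    {coin_tosses(1 - 0.90):4.1f} tosses")
print(f"five sigma:        {coin_tosses(p_five_sigma):4.1f} tosses")
```

On this reckoning, 66% confidence is worth under two coin tosses, 90% just over three, and five sigma around twenty, which is roughly the comparison made above.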
The current lack of warming indicates low sensitivity, but even more so does the lack of cooling in the main CO2 band at 15 microns. These are actual data, not theory. Also, the theory should be 1 degree or below anyway: when you look at the theoretical warming for a 300-to-600 ppm rise, it’s only 0.66 K, about 1.2 F, not enough to be able to measure. Truly, the catastrophe is entirely imaginary.
“One of the reasons that the “climate change” issue is so contentious is that our understanding of climate sensitivity is still rather incomplete.” To this layman, this looks like a satirical understatement out of a Monty Python skit. Also, thanks to others above for identifying problematic assumptions.
I’ll go with the author trying not to be unnecessarily contentious. Should science be trying to tweak the IPCC narrative when that narrative is so limited to the last 30 years of 2 billion and is proven to do a rather poor job of predicting the past and the present with its “rather incomplete” “understanding?” All I can see in the AGW case is retrofitting data to justify concentrating on only one variable. That’s not just rather incomplete, it’s structurally flawed at formation of the problem.
Meek, but it serves to underline AGW weakness: by trying to be meek, it still finds low sensitivity.
“Jim Cripwell says:
March 19, 2012 at 7:35 am (Edit)
The “no feedback” climate sensitivity is estimated as being about 1.2 C for a doubling of CO2, using the assumption that the “structure of the atmosphere does not change”. In other words, the estimate is made by looking ONLY at radiation effects. This assumption has never been justified, and is almost certainly wrong. The lapse rate almost certainly changes as GHGs are added to the atmosphere.
Since this no-feedback climate sensitivity is the cornerstone of all IPCC estimations, it follows that all these estimations has no basis in physics.
####################################
you don’t know what you are talking about