By Patrick J. Michaels and Paul C. “Chip” Knappenberger
We have two new entries for the long (and growing) list of papers in the recent scientific literature arguing that the earth’s climate sensitivity—the ultimate rise in the earth’s average surface temperature from a doubling of the atmospheric carbon dioxide content—is close to 2°C, near the low end of the range of possible values presented by the U.N.’s Intergovernmental Panel on Climate Change (IPCC). With low-end warming come low-end impacts and an overall lack of urgency for federal rules and regulations (such as those outlined in the President’s Climate Action Plan) that limit carbon dioxide emissions and restrict our energy choices.
The first is the result of a research effort conducted by Craig Loehle and published in the journal Ecological Modelling. The paper is a fairly straightforward determination of the climate sensitivity. Loehle first uses a model of natural modulations (such as solar activity and ocean circulation cycles) to remove the influence of natural variability from the observed temperature history since 1850. The linear trend in the post-1950 residuals from Loehle’s natural-variability model is then assumed to be largely the net result of human carbon dioxide emissions. By dividing the total temperature change (as indicated by the best-fit linear trend) by the observed rise in atmospheric carbon dioxide content, and then applying that relationship to a doubling of the carbon dioxide content, Loehle arrives at an estimate of the earth’s transient climate sensitivity—transient in the sense that, at the time of CO2 doubling, the earth has yet to reach equilibrium and some warming is still to come.
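For readers who want to see the arithmetic, here is a minimal sketch of that trend-and-scale step. This is our illustration, not Loehle’s actual code: the residual series is fabricated round-number data, and we assume the standard logarithmic dependence of CO2 forcing when scaling the fitted trend to a doubling.

```python
import numpy as np

# Minimal sketch of the trend-and-scale step (our toy data, not Loehle's).
rng = np.random.default_rng(0)
years = np.arange(1950, 2011)
# residuals = observed temperature minus the natural-variability model (toy):
residuals = 0.006 * (years - 1950) + 0.05 * rng.standard_normal(years.size)

slope, intercept = np.polyfit(years, residuals, 1)
delta_T = slope * (years[-1] - years[0])   # trend-implied warming, degC

co2_1950, co2_2010 = 311.0, 390.0          # approximate ppm values

# Scale the trend-implied warming to a CO2 doubling, assuming forcing
# proportional to the logarithm of concentration:
transient = delta_T * np.log(2.0) / np.log(co2_2010 / co2_1950)
print(f"Transient sensitivity: {transient:.2f} degC per doubling")
```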
Loehle estimated the equilibrium climate sensitivity from his transient calculation based on the average transient:equilibrium ratio projected by the collection of climate models used in the IPCC’s most recent Assessment Report. In doing so, he arrived at an equilibrium climate sensitivity estimate of 1.99°C, with a 95% confidence range of 1.75°C to 2.23°C.
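The conversion itself is one line of arithmetic. The ratio below is the model-mean value Loehle quotes (see the excerpt from the paper later in the comments); the transient value is our approximate back-calculation from his published result.

```python
# Loehle's conversion step, reconstructed with round numbers (not his code).
ratio = 1.81761     # mean equilibrium:transient ratio across 18 IPCC models
transient = 1.09    # degC, approximate transient estimate from the trend step
equilibrium = transient * ratio
print(f"Equilibrium sensitivity: {equilibrium:.2f} degC")  # ~1.98, vs. published 1.99
```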
Compare Loehle’s estimate with the IPCC’s latest assessment of the earth’s equilibrium climate sensitivity, which assigns a 66 percent or greater likelihood that it lies somewhere between 1.5°C and 4.5°C. Loehle’s determination is more precise and decidedly toward the low end of that range.
The second entry on our list of low climate sensitivity estimates comes from Roy Spencer and William Braswell and was published in the Asia-Pacific Journal of Atmospheric Sciences. Spencer and Braswell used a very simple climate model to simulate the global temperature variations averaged over the top 2000 meters of the global ocean during the period 1955-2011. They first ran the simulation using only volcanic and anthropogenic influences on the climate. They ran the simulation again adding a simple take on the natural variability contributed by the El Niño/La Niña process. And they ran the simulation a final time adding a more complex configuration involving a feedback from El Niño/La Niña onto natural cloud characteristics. They then compared their model results with the set of real-world observations.
What they found was that the complex configuration involving El Niño/La Niña feedbacks onto cloud properties produced the best match to the observations. This configuration also produced the lowest estimate of the earth’s climate sensitivity to carbon dioxide emissions—a value of 1.3°C.
Spencer and Braswell freely admit that their simple model is just the first step in a complicated diagnosis, but they also point out that results from simple models provide insight that should help guide the development of more complex models, and ultimately could help unravel some of the mystery as to why full climate models produce high estimates of the earth’s equilibrium climate sensitivity while estimates based on real-world observations are much lower.
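To give a feel for what a “very simple climate model” looks like, here is a generic one-box energy-balance sketch of our own. It is not Spencer and Braswell’s actual model (theirs uses a multi-layer ocean and an explicit ENSO term), and the parameter values are purely illustrative; the point is that the equilibrium climate sensitivity falls out as the doubled-CO2 forcing divided by the net feedback parameter.

```python
import numpy as np

# Generic one-box energy-balance model: C * dT/dt = F(t) - lam * T.
# Illustrative sketch only, not Spencer & Braswell's model.
F_2X = 3.7               # W/m^2, canonical forcing from a CO2 doubling
lam = 2.8                # W/m^2/K net feedback; ECS = F_2X/lam ~ 1.3 K
C = 8.4e8                # J/(m^2 K), heat capacity of ~200 m of ocean
dt = 365.25 * 86400.0    # one-year time step, in seconds

years = np.arange(1955, 2012)
forcing = np.linspace(0.0, 1.5, years.size)  # W/m^2, toy anthropogenic ramp

T, temps = 0.0, []
for F in forcing:
    T += dt * (F - lam * T) / C              # explicit Euler step
    temps.append(T)

print(f"ECS = {F_2X / lam:.2f} K; final anomaly = {temps[-1]:.2f} K")
```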
Our figure below helps to illustrate the discrepancy between climate-model estimates and real-world estimates of the earth’s equilibrium climate sensitivity. It shows Loehle’s determination, as well as that of Spencer and Braswell, along with 16 other estimates reported in the scientific literature beginning in 2011. Also included are the IPCC’s latest assessment of the literature and the characteristics of the equilibrium climate sensitivity from the collection of climate models on which the IPCC bases its impact assessments.
Figure 1. Climate sensitivity estimates from new research beginning in 2011 (colored), compared with the assessed range given in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the collection of climate models used in the IPCC AR5. The “likely” range (greater than a 66% likelihood of occurrence) in the IPCC assessment is indicated by the gray bar. The arrows indicate the 5 to 95 percent confidence bounds for each estimate along with the best estimate (median of each probability density function, or the mean of multiple estimates; colored vertical line). Ring et al. (2012) present four estimates of the climate sensitivity, and the red box encompasses those estimates. The right-hand side of the IPCC AR5 range is actually the 90% upper bound (the IPCC does not state a value for the upper 95 percent confidence bound of its estimate). Spencer and Braswell (2013) produce a single ECS value best matched to ocean heat content observations and internal radiative forcing.
Quite obviously, the IPCC is rapidly losing its credibility.
As a result, the Obama Administration would do well to come to grips with this fact and stop deferring to IPCC findings when trying to justify increasingly burdensome federal regulation of carbon dioxide emissions, with its combined effects of manipulating markets and restricting energy choices.
References:
Loehle, C., 2014. A minimal model for estimating climate sensitivity. Ecological Modelling, 276, 80-84.
Spencer, R.W., and W. D. Braswell, 2013. The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model. Asia-Pacific Journal of Atmospheric Sciences, doi:10.1007/s13143-014-0011-z.
=========================================================
Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”
Climate sensitivity?
Given the team performance to date, we will freeze as atmospheric CO2 concentrations rise.
As for the President, he sure was right about that Californian drought.
As there is to date zero empirical evidence of causation for temperature rise due to CO2, sensitivity seems more likely to be a percentage of S.F.A.
It takes a seriously deluded person to pursue a career damning the stuff of life.
Zero CO2 equals zero carbon based life.
Statement of Patrick Moore, Ph.D. Before the Senate Environment and Public Works Committee, Subcommittee on Oversight:
“…There is no scientific proof that human emissions of carbon dioxide (CO2) are the dominant cause of the minor warming of the Earth’s atmosphere over the past 100 years. If there were such a proof it would be written down for all to see. No actual proof, as it is understood in science, exists.”
===============
pokerguy says:
February 28, 2014 at 10:51 am
“All this talk about climate sensitivity is playing their game. We need to stop and make them explain why for the past seventeen years the sensitivity has been zero.”
****
You’re wrong here. Sensitivity is the whole ball game. Moreover, sensitivity to CO2 is theoretically a constant. A period of no warming doesn’t mean that atmospheric sensitivity is zero, although it certainly argues for lower sensitivity given the ongoing rise in anthro CO2 during that time.
+++++++++
Pokerguy, see the statement from Dr. Moore above. Since there is no proof then talking about sensitivity to something that may not exist is in fact “playing their game” as was stated.
The whole forcing-accounting debate stems from the assumption that the climate needs to be forced to change and that the forcings can be identified. But the thing is, it is possible that over any time scale the climate can be chaotic, and all accounting of forcings, which is usually done using statistical methods to identify sensitivities, is an argument essentially based on ignorance. Not until the forced contributions can be traced to fundamental physics can we separate forcings from spontaneous variations.
It’s the relevant concern if you buy into their notion that carbon dioxide is the most significant driver of climate. I don’t think there’s any evidence of that yet. Other drivers’ sensitivities and forcings could be much more relevant.
I believe the climate sensitivity is much lower than 1 to 2 degrees; I believe it is closer to zero degrees. CO2 works similarly to smog, creating a localized heat-island effect. People look at the CO2 assumptions completely backward: the heating of the earth (from solar/ocean influences) causes CO2 to increase, because solar energy stored within the oceans warms them. El Niños/La Niñas release that heat (and additional CO2) into the atmosphere. But that CO2 and heat eventually exit the atmosphere into space. If it isn’t replaced with the same level of solar energy, the heat of the ocean decreases; thus, the heat released during future El Niño/La Niña events decreases.
the change in ocean heat content…
hogwash… temp, pressure, and density
If you assume heat really is hiding in the deep ocean, and at ~35°F, then, for all practical purposes, the ability of the deep ocean to hide heat is infinite.
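A quick round-number check of that point (our arithmetic, using standard approximate values for the mass of the 0-2000 m layer and the specific heat of seawater): even a heat uptake of the order reported for 1955-2010 moves the layer’s average temperature by only a few hundredths of a degree.

```python
# Rough check of the deep-ocean heat-capacity point (round numbers).
mass_0_2000m = 7.0e20   # kg, approximate mass of the 0-2000 m ocean layer
c_seawater = 3990.0     # J/(kg K), specific heat of seawater
heat_uptake = 2.0e23    # J, order of the reported 1955-2010 ocean heat gain

delta_T = heat_uptake / (mass_0_2000m * c_seawater)
print(f"Average warming of the 0-2000 m layer: {delta_T:.3f} K")  # ~0.07 K
```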
Thanks, P&P. Good article.
Craig Loehle has it right. If not, we would not be here.
The Earth would have run away in a death spiral of temperature, either warming or cooling.
My money is on Lindzen and Choi, 2011.
“All this talk about climate sensitivity is playing their game.”
Yep – a game we will win and they will lose.
OK, the war goes on – as before, they will try to shift the goalposts. But why not enjoy the trappings of winning this battle, and having sensitivity as another line of evidence of falsification (to add to the hotspot).
If the recent stasis continues, more data = lower sensitivity. That gives us more salt for their already painful wounds. My hands are already itching to get stuck in.
Is he bringing new insight or is he simply chasing the observations?
I see where you are coming from, Jordan. And we are winning that game so far. But your goalposts comment is pertinent. They’ll never admit we’ve proven CO2 doesn’t drive climate until we can put up a solid case that something else does.
and I guarantee that none of them is correct.
41.75°C at 7000 ppm CO2
Craig Loehle. I hope you read this thread, and I’d like to start by saying that I have really appreciated many of your past works. Thank you.
I haven’t read your paper (sorry). But I don’t have an issue with a linear approximation to an exponential rise. The thing that troubles me in the method (as described above) is the following:
“Loehle estimated the equilibrium climate sensitivity from his transient calculation based on the average transient:equilibrium ratio projected by the collection of climate models used in the IPCC’s most recent Assessment Report.”
To my mind, a significant failure of the warmist script is to use model forecasts as though they produce real data. Statistical properties are claimed for the climate when the statistics relate to some “ensemble” of model responses. The logical error is the assumption that, on average, model responses converge to useable climate forecasts, while accepting that individual “realisations” may be at some variance with the future climate trajectory (i.e., not useable).
I call BS on that. Model statistics describe the models and nothing more. There is no justification to make the leap to average model behaviour somehow being a reliable indicator of climate.
I see the same error in the method described above. Estimating the transient response = OK.
Using the average model response to extrapolate to ECS is like adding a sprinkling of fairy dust = not OK.
evanmjones says:
February 28, 2014 at 11:22 am
All this talk about climate sensitivity is playing their game.
It’s the relevant concern.
——————————-
Not to Obama or the UN.
cn
Jordan: since few seem to have actually read my paper, let me point out that I argue that the true equilibrium (or at least what we will see by 2100) is probably lower than my calculated value.
I think Spencer and Braswell are getting close. It absolutely makes sense for the sensitivity to reflect net NEGATIVE feedbacks. It is not very likely the true feedbacks taken in total would be positive because earth’s overall climate is so relatively stable. The feedbacks practically have to be net negative or else the earth would have likely passed a tipping point LONG LONG ago.
We report here on the first results of a calculation in which separate estimates were made of the effects on global temperature of large increases in the amount of CO2 and dust in the atmosphere. It is found that even an increase by a factor of 8 in the amount of CO2, which is highly unlikely in the next several thousand years, will produce an increase in the surface temperature of less than 2 deg. K.
Rasool, S. I., and S. H. Schneider, 1971. Atmospheric Carbon Dioxide and Aerosols: Effects of Large Increases on Global Climate. Science, 173, 138-141.
Fair comment Craig – I have read other papers you have produced, and found them to be well worthwhile. I picked up a copy of your paper through the link you provided above, and will have a look at it.
The main gripe in my earlier comment is the practice of assuming average model response has meaning, whereas individual “realisations” don’t. Not saying you do this, but it seems to do the rounds in the wider debate.
Anyway – I appreciate your contribution.
If it is assumed (as the IPCC does) that AGW escalated from 1975, and prior to 1975 was negligible, it is possible to calculate CO2 sensitivity knowing the natural variability.
Northern Hemisphere temperature is the one with the least uncertainty of all the records.
In this link (with Anthony’s permission)
http://www.vukcevic.talktalk.net/NHT.htm
I show comparison of the Northern Hemisphere temperature anomaly and natural variability (including solar and terrestrial natural oscillations 1880-2012).
The method I used is similar to that employed in Dr. Loehle’s paper, but my result is more like 0.25°C for a doubling of CO2.
It is possible that my calculation is erroneous; the above link contains all the numbers required to enable others to have a go.
Ilma630 has a point: by removing natural variability, are they not accepting that it is possible to quantify anthro vs. natural contributions from a temperature record?
Fernando Leanme says:
February 28, 2014 at 9:47 am
——–
What’s wrong with the atmosphere having twice as much CO2?
A warming of 2°C is not a bad thing; it is a good thing.
More CO2 in the atmosphere means plants grow bigger and need less water.
Yes, the oceans might rise a couple of inches, so what?
If you want to cut imports, a better way is to allow drilling in the US. That has the benefit of not making cars smaller and more expensive, and it will mean far fewer people dying.
Fernando Leanme says: February 28, 2014 at 9:47 am
…So the best solution is to cut emissions, doing so in a prudent fashion and reducing the balance-of-trade deficit.
>>>>>>>>>>>>>>>>>
Yes but is it?
Everyone forgets WHEN we live: at the possible end of the Holocene.
The US Government and Mass Media completely ignored the real climate debate that has been raging.
The thing is, informed geologists hope and pray Greenhouse Gases can delay the next glacial inception.
William McClenney, a geologist, has commented here at WUWT:
If Ruddiman’s “Early Anthropogenic Hypothesis” is correct it would be GHG emissions that have prevented glacial inception so far.
A more recent paper from the fall of 2012 says the same thing.
Can we predict the duration of an interglacial?
In addition, that paper says that the warming of the Arctic and cooling of the Antarctic (sound familiar?), the bipolar seesaw, is an indication that the descent into the next glaciation has already started.
Even Woods Hole Observatory warned that politicians may be barking up the wrong tree.
Now tell me again why we want to lower CO2. Why do the IPCC and the US government want to strip the evil devil gas from the late Holocene atmosphere? Is it so we can take our chances with glacial inception? Really? That is the IPCC’s and the EPA’s recommendation? According to the early anthropogenic hypothesis, we should already be in the next glacial were it not for AGW! So Obama is recommending removing the only (so far) hypothesized deterrent to glacial inception!
BRILLIANT!
Hello again Craig.
This part: “This can be converted to equilibrium sensitivity as follows. In IPCC (2007) Table 8.2 shows both transient and equilibrium sensitivity as computed by climate models. For the 18 cases where both are shown, the mean ratio of equilibrium to transient sensitivity is 1.81761°C. Multiplying this by the transient forcing yields SE = 1.986°C (1.745–2.227°C).”
Rhetorically great: use your opponents’ arguments against them. But scientifically, if your opponents have sprinkled fairy dust into their analysis, then by using their results you have added fairy dust to yours.
I don’t buy the idea that averaging model results adds value compared to individual “realisations”. If individual realisations cannot be relied upon, end of story.
Rhetoric is tempting, but best not to muddy your fingers with their mistakes.
Cheers