New paper finds transient climate sensitivity to doubling of CO2 is about 1°C

A new paper published in Ecological Modelling finds climate sensitivity to doubled CO2 concentrations is significantly lower than estimates from the IPCC and climate models, which “utilize uncertain historical data and make various assumptions about forcings.” The author instead uses a ‘minimal model’ with the fewest possible assumptions and the least data uncertainty to derive a transient climate sensitivity of only 1.093 °C:

“A minimal model was used that has the fewest possible assumptions and the least data uncertainty. Using only the historical surface temperature record, the periodic temperature oscillations often associated with the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation were estimated and subtracted from the surface temperature data, leaving a linear warming trend identified as an anthropogenic signal. This estimated rate of warming was related to the fraction of a log CO2 doubling from 1959 to 2013 to give an estimated transient sensitivity of 1.093 °C (0.96–1.23 °C 95% confidence limits) and equilibrium climate sensitivity of 1.99 °C (1.75–2.23 °C). It is argued that higher estimates derived from climate models are incorrect because they disagree with empirical estimates.”

Otto et al find equilibrium climate sensitivity [over the next several centuries] is only ~1.3 times greater than transient climate sensitivity, thus the estimate of 1.093 °C transient sensitivity could be associated with as little as 1.4 °C equilibrium sensitivity, less than half of the implied IPCC central estimate in AR5 of ~3.3 °C.
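As a quick arithmetic check of the relationship above (a sketch only; the ~1.3 ratio and the ~3.3 °C AR5 figure are taken from the text, not recomputed here):

```python
# Illustrative arithmetic only: 1.3 is the approximate ECS/TCR ratio
# attributed to Otto et al, and 1.093 °C is the paper's transient sensitivity.
tcs = 1.093                 # transient climate sensitivity (°C per doubling)
ecs = 1.3 * tcs             # implied equilibrium sensitivity, ~1.42 °C
ar5_central = 3.3           # implied IPCC AR5 central ECS estimate (°C)
print(round(ecs, 2), round(ecs / ar5_central, 2))   # 1.42 0.43
```

The implied ~1.42 °C is indeed less than half of ~3.3 °C, consistent with the claim in the text.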

Moreover, this paper does not assume any solar forcing or solar amplification mechanisms. The integral of solar activity plus ocean oscillations explains ~95% of global temperature change over the past 400 years. Including potential solar forcing in the ‘minimal model’ could therefore reduce the estimated climate sensitivity to CO2 substantially further.

  • Empirical estimates of climate sensitivity are highly uncertain.
  • Anthropogenic warming was estimated by signal decomposition.
  • Warming and forcing were equated in the time domain to obtain sensitivity.
  • Estimated sensitivity is 1.093 °C (transient) and 1.99 °C (equilibrium).
  • Empirical study sensitivity estimates fall below those based on GCMs [General Circulation Models].

Abstract

Climate sensitivity summarizes the net effect of a change in forcing on Earth’s surface temperature. Estimates based on energy balance calculations give generally lower values for sensitivity (< 2 °C per doubling of forcing) than those based on general circulation models, but utilize uncertain historical data and make various assumptions about forcings. A minimal model was used that has the fewest possible assumptions and the least data uncertainty. Using only the historical surface temperature record, the periodic temperature oscillations often associated with the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation were estimated and subtracted from the surface temperature data, leaving a linear warming trend identified as an anthropogenic signal. This estimated rate of warming was related to the fraction of a log CO2 doubling from 1959 to 2013 to give an estimated transient sensitivity of 1.093 °C (0.96–1.23 °C 95% confidence limits) and equilibrium climate sensitivity of 1.99 °C (1.75–2.23 °C). It is argued that higher estimates derived from climate models are incorrect because they disagree with empirical estimates.
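The paper's procedure can be sketched numerically. This is a minimal sketch, not the paper's exact computation: the CO2 concentrations are round Mauna Loa-like values, and the residual warming rate is the ~0.66 °C/century linear signal reported by Loehle and Scafetta (2011):

```python
import math

# Illustrative annual-mean CO2 (ppm); round Mauna Loa-like values,
# not the paper's exact inputs
c_1959, c_2013 = 316.0, 396.5

# Fraction of one CO2 doubling realized between 1959 and 2013
fraction = math.log(c_2013 / c_1959) / math.log(2.0)

# Linear anthropogenic warming rate left after subtracting the
# PDO/AMO-type oscillations (~0.66 °C/century per Loehle & Scafetta 2011)
trend_per_year = 0.0066
delta_t = trend_per_year * (2013 - 1959)

# Transient sensitivity: warming per full doubling of CO2
tcs = delta_t / fraction
print(round(fraction, 3), round(delta_t, 3), round(tcs, 2))  # 0.327 0.356 1.09
```

With these round inputs the estimate lands close to the paper's 1.093 °C, which shows how directly the result follows from the residual trend and the log-CO2 fraction.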

216 Comments
July 23, 2014 8:22 am

This paper makes good sense to me.

July 23, 2014 8:28 am

It seems to me that the link that says
“Otto et al find equilibrium climate sensitivity [over the next several centuries] is only ~1.3 times greater than transient climate sensitivity,”
is not accurate, but should instead say
“Otto et al find equilibrium climate sensitivity [over the next several centuries] is only ~1.3 times that of transient climate sensitivity, ”
1.3 times greater means it is larger by 1.3 times the original amount, which doesn’t appear to be what is meant.

Frank
July 23, 2014 8:30 am

Eric Worrall says: July 23, 2014 at 3:51 am
Isn’t 1 °C around what would be expected from CO2 warming + no feedbacks? Hilarious 🙂
Don’t we live on a planet with significant feedbacks? In the regions that obviously are in rapid equilibrium with liquid water (cloudy skies and the boundary layer over the ocean), don’t we expect the amount of water vapor in the atmosphere to rise with temperature? We know lapse rate feedback will counter some of the water vapor feedback. Don’t you know that observations of outgoing LWR from space (especially in clear skies) show that seasonal global warming results in less emission of LWR from the TOA than expected for a climate with no feedbacks?

July 23, 2014 8:33 am

This does not hold water, since CO2 always follows the temperature. So the amounts of CO2 in the atmosphere are governed by the temperature.
In addition, past history clearly shows ice ages have occurred when CO2 concentrations were much higher than today. Other big factors must be in play.

Bruce Cobb
July 23, 2014 8:34 am

durango12 says:
July 23, 2014 at 7:53 am
Yes, ~ 1 deg for doubling is what it will turn out to be. It is actually fairly obvious if you look at the data objectively.
No – It is doubtful we will ever know what the climate’s sensitivity to CO2 is, and we most certainly don’t now. There is nothing “obvious” about it at all, and “the data” is both incomplete as well as skewed.

July 23, 2014 8:36 am

Craig Loehle,
Valuable contribution to the dialog on climate sensitivity.
Your calm / measured communication of rigorous applied reasoning is wonderful.
John

Bruce Cobb
July 23, 2014 8:46 am

Jim Clarke says:
July 23, 2014 at 7:41 am
We are a contentious lot, and are prone to nit-picking. That is good for science, but not for policy discussion. I would rather show support for a paper that is at least more accurate AND would argue that CO2 is beneficial; with no requirement for regulation, than dis the paper as being ‘still wrong’.
I’m not aware of any policy discussion, and anyway, the ends don’t justify the means. “Support” in this case means that it’s a step in the right direction. But it’s still wrong. That is telling the truth, and that is what skeptics/climate realists do.

July 23, 2014 8:47 am

It amazes me how historical climatic data, from solar variability to past temperature changes, as Don Easterbrook correctly points out, are ALWAYS ignored by AGW enthusiasts.
They make up the data, or cast data aside, from solar variability to historical temperature trends/CO2 concentrations, to meet their needs.
If the data does not agree, it is either ignored or wrong. They must keep the hoax alive and mislead the public, who in general are clueless in this area.

Greg Goodman
July 23, 2014 9:03 am

Loehle 2014: “A three-component model was fit to the Hadley global land and ocean data for the period 1850–1950 (101 years) because IPCC has stated that human effects on climate are only evident (detectable) after 1950. The fit over the period was good (Fig. 1a).”
“CO2 forcing alone. It is known from theory that CO2 should saturate such that the CO2 forcing is proportional to the log of concentration. The log of an exponentially rising function (as CO2 is) is of course a straight line, so the linear anthropogenic warming signal detected by Loehle and Scafetta (2011) is in line with theory and can be viewed as essentially capturing the CO2 signal.”
i.e. of the form CO2(x) = log(280 + k*exp(x − 1950)), which is indeed linear.
My earlier criticism, which applies to all IPCC-type thinking as well as to the present paper, is that this still leaves a notable decline at the end of the 19th century, i.e. there is still significant non-AGW variability in the record. Thus assuming the late-20th-century rise is entirely AGW is unfounded.
That is despite Hadley bias “corrections” removing 2/3 of the late-19th-century cooling, inserting a post-WWII cooling, and generally making the SST part of the record more conformable to the IPCC story.
http://judithcurry.com/2012/03/15/on-the-adjustments-to-the-hadsst3-data-set-2
It seems the present paper is a middle-of-the-road, intentionally simplistic attempt to pull things in the right direction. It adopts most of the IPCC positions and data ‘corrections’, so it is not too controversial.
It got published, which is a useful step.
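Goodman's point about the log of an exponential can be checked numerically. In the pure-exponential case (no pre-industrial baseline) the linearity is exact; the 1959-like starting concentration and the growth rate below are illustrative, not fitted values:

```python
import math

# If CO2 rises as a pure exponential, C(t) = C0 * exp(r * t), then a
# forcing proportional to log C(t) is exactly linear in t:
#     log C(t) = log C0 + r * t
c0, r = 316.0, 0.004          # illustrative: ~1959-level CO2, ~0.4 %/yr growth
years = range(0, 60, 10)
log_c = [math.log(c0 * math.exp(r * t)) for t in years]

# Successive decadal differences all equal r * 10 (up to float rounding)
diffs = [b - a for a, b in zip(log_c, log_c[1:])]
```

With the 280 ppm baseline included, as in Goodman's form log(280 + k*exp(x − 1950)), the curve is only asymptotically linear, so treating the signal as a straight line is an approximation over the fitted period rather than an identity.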

July 23, 2014 9:03 am

phlogiston says:
July 23, 2014 at 7:39 am
There’s a typo in the title itself, it should read:
New paper finds transient doubling sensitivity of CO2 to temperature change is about 1°C

Well, that’s a smarta$$ answer but of course it is totally untrue, as one would expect. Actually, for sea-water and CO2 the “doubling sensitivity of CO2 to temperature change” is about 16 °C.

sleepingbear dunes
July 23, 2014 9:08 am

rgbatduke and Don Easterbrook always give the discussion a needed inoculation of common sense. I feel better for it.

July 23, 2014 9:08 am

“A minimal model was used that has the fewest possible assumptions and the least data uncertainty. Using only the historical surface temperature record, the periodic temperature oscillations often associated with the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation were estimated and subtracted from the surface temperature data, leaving a linear warming trend identified as an anthropogenic signal. This estimated rate of warming was related to the fraction of a log CO2 doubling from 1959 to 2013 to give an estimated transient sensitivity of 1.093 °C (0.96–1.23 °C 95% confidence limits) and equilibrium climate sensitivity of 1.99 °C (1.75–2.23 °C). It is argued that higher estimates derived from climate models are incorrect because they disagree with empirical estimates.”
Typical – using incomplete, inaccurate data and expecting a good result. Another model output, another waste of time.
This article says nothing and tells us nothing.

July 23, 2014 9:15 am

Salvatore Del Prete says:
July 23, 2014 at 8:33 am
This does not hold water, since CO2 always follows the temperature. So the amounts of CO2 in the atmosphere are governed by the temperature.
In addition, past history clearly shows ice ages have occurred when CO2 concentrations were much higher than today. Other big factors must be in play.

Not while the earth has been in its current configuration, e.g. no isthmus of Panama. In fact over the last 800,000 years there’s no evidence of 300ppm being exceeded.

July 23, 2014 9:20 am

Goodman
“firstly the world has been warming for several hundred years, clearly there is a long term process independent of AGW.”
Clearly?
hardly.
I love it when good skeptics stop examining all their beliefs. The assertions they make are clearly funny.

Tetragrammaton
July 23, 2014 9:23 am

But…but…but while the notion of transient climate sensitivity may be perfectly reasonable, the idea that it’s a universal constant number is almost certainly flawed. If one looks sideways at some of the recent papers written by the “old guard” climate science team, there seems to be a sub rosa acknowledgment that variable water-vapor content – particularly as regards cloud formation – has not been well integrated into their thinking, and hardly at all into their models. (See A.Lacis et al 2013, for example).
Maybe, they sometimes seem to be saying (although they never actually say it), the actual value of transient climate sensitivity varies from as high as 3 or 4 to as low as -1. So it’s +1 in winter in dry air over the poles, 3 at night in temperate latitudes, -1 in the afternoon in the tropics, and so forth. Averaged out over a year, it may come close to the Cripwell assumption of zero. But then the next year may have a sufficiently different set of weather patterns that it comes to something else. And the year after that, there may be cosmic ray fluxes which cause more water-vapor nucleation and more clouds and hence more reflection of em radiation. As Professor Brown so aptly points out, it’s really, really complex.
So maybe the effects of CO2 as a greenhouse gas need to be described by a whole array of subsets of radiative physics, geared to particular micro circumstances of temperature, humidity, pressure, cosmic rays, etc. This will take a long time to figure out. In fact, it would be interesting for someone – preferably well-versed in the history and philosophy of science – to trace the evolution of “climate science” from Arrhenius to the current “consensus” mess, and then deduce how long it will take for all these “subsets” to be figured out. Another eighty years, perhaps?

Bart
July 23, 2014 9:31 am

Greg Goodman says:
July 23, 2014 at 4:26 am
“I suppose it’s progress of a sort, inch by inch they are giving ground but it’s going to take decades before they finally admit CO2 sensitivity is sod_all +/- 50% .”
+1

July 23, 2014 9:31 am

Any calculation of Climate Sensitivity, or climate forecast, which doesn’t accommodate the natural ~1000-year quasi-periodicity in the calculation is an exercise in futility.
See Fig 4 at
http://climatesense-norpag.blogspot.com/2013/10/commonsense-climate-science-and.html

July 23, 2014 9:43 am

Dr. Loehle,
Thanks for doing this; I always appreciate your work.
The other day I was reviewing another paper that estimated Sensitivity and it looked to me like it would be a cool idea to make the calculation into an online tool where people could play with numbers to see how that changed the estimate.
The sooner more people realize that working on the problem of estimating TCR or ECS is the most powerful step that skeptics can make, the sooner the charge of “denialist” will be seen as ridiculous.
But I fear that some skeptics will refuse to enter the debate. The science debate is
1. What are the best methods to bound our ignorance about sensitivity
2. what do those methods show.
3. what additional observations do we need to constrain the problem further.
Skeptics who address those issues are in the debate.
Nic Lewis and you are part of the debate because you address the question rather than
dismissing it.
You are part of the debate because you publish papers.
Some days I wonder where this debate would be if more energy and effort were put in by skeptics on the key question – on actually addressing the issue on the debate table rather than dismissing it.
Instead they waste energy and time on peripheral issues.
Thanks for being part of the solution.

Don Easterbrook
July 23, 2014 9:44 am

“Using only the historical surface temperature record, the periodic temperature oscillations often associated with the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation were estimated and subtracted from the surface temperature data, leaving a linear warming trend identified as an anthropogenic signal.”
“Moreover, this paper does not assume any solar forcing or solar amplification mechanisms. The integral of solar activity plus ocean oscillations explain ~95% of global temperature change over the past 400 years. Including potential solar forcing into the ‘minimal model’ could substantially reduce estimated climate sensitivity to CO2 to a much greater extent.”
I worry about any calculation that assumes ALL factors have been accounted for EXCEPT CO2 so what is left over MUST be CO2. Since no solar effects were assumed, isn’t what must be left after subtracting PDO/AMO cycles = solar + CO2? Subtracting PDO/AMO cycles doesn’t necessarily automatically include all solar factors. Some would argue that PDO/AMO cycles are not driven by solar variation and thus do not include a solar factor.
Natural climate changes in the geologic past have been huge compared to what we see now and what CO2 might be capable of, so how can we be sure that a small blip (like warming from 1978-2000) isn’t just natural? And what about climatic cooling from 1945-1977 and 2000-2014 during continuing rise of CO2? Does this not show that natural factors totally overwhelm any CO2 effects?

July 23, 2014 9:46 am

Greg Goodman says:
July 23, 2014 at 4:04 am
firstly the world has been warming for several hundred years, clearly there is a long term process independent of AGW.
If it has, it is not due to solar activity, which does not show any trend over the past 300 years:
http://www.leif.org/research/New-Group-Numbers.png

July 23, 2014 9:51 am

Leif Svalgaard says: July 23, 2014 at 7:37 am
…………
It was a great loss that von Neumann died in his early 50s, but on the other hand it was fortunate that a young student (Michael Minovitch), working during his summer vacation at JPL, probably never heard of von Neumann’s advice to Freeman Dyson.

July 23, 2014 10:05 am

vukcevic says:
July 23, 2014 at 9:51 am
on the other hand fortunate that a young student (Michael Minovitch) working during his summer vacation at the JPL probably never heard of the Neumann’s advice to Freeman Dyson.
Wouldn’t have mattered if he had or not.

Jim Cripwell
July 23, 2014 10:06 am

Steven Mosher writes “1. What are the best methods to bound our ignorance about sensitivity”
There is only ONE way to bound our ignorance. And that is to actually MEASURE CS. Since this is impractical at this time, the situation will remain the same. Intelligent people guess numeric values, all of which are meaningless.
Until we start applying the fundamental principles of The Scientific Method to this problem, we are merely spinning our wheels, and making absolutely no progress at all.
All these sorts of estimations do is give some sort of vague idea of what the MAXIMUM value of CS might be.

Latitude
July 23, 2014 10:22 am

1. What are the best methods to bound our ignorance about sensitivity
2. what do those methods show.
3. what additional observations do we need to constrain the problem further.
================
1. (a) based on a temp history that is so wonky it changes every time there’s a new run…(b) and constantly producing new science that says all of the old science was wrong
2. (a) and (b) both showed that we are constantly being lied to and the science is not settled
3. see (a) and (b)

July 23, 2014 10:23 am

If they used published temp series (which is what they look like), it’s still too high.
If you make the time series based on max temps (instead of average), there is no trend.