Yesterday we talked about the new paper from Nic Lewis; now Troy Masters has a new paper in press at Climate Dynamics here.
Observational estimate of climate sensitivity from changes in the rate of ocean heat uptake and comparison to CMIP5 models
Unfortunately, Springerlink wants $39.95 for the privilege of reading it, so all I can do is provide the abstract. On his blog, however, Troy does show Figure 5 of the paper:
Abstract. Climate sensitivity is estimated based on 0–2,000 m ocean heat content and surface temperature observations from the second half of the 20th century and first decade of the 21st century, using a simple energy balance model and the change in the rate of ocean heat uptake to determine the radiative restoration strength over this time period. The relationship between this 30–50 year radiative restoration strength and longer term effective sensitivity is investigated using an ensemble of 32 model configurations from the Coupled Model Intercomparison Project phase 5 (CMIP5), suggesting a strong correlation between the two. The mean radiative restoration strength over this period for the CMIP5 members examined is 1.16 W m−2 K−1, compared to 2.05 W m−2 K−1 from the observations. This suggests that temperature in these CMIP5 models may be too sensitive to perturbations in radiative forcing, although this depends on the actual magnitude of the anthropogenic aerosol forcing in the modern period. The potential change in the radiative restoration strength over longer timescales is also considered, resulting in a likely (67 %) range of 1.5–2.9 K for equilibrium climate sensitivity, and a 90 % confidence interval of 1.2–5.1 K.
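For readers without access to the paper, the "simple energy balance model" language in the abstract maps onto a standard textbook relation; here is a minimal sketch using round illustrative numbers of my own, not the paper's data or method:

```python
# Sketch of the standard energy-balance relation behind "radiative
# restoration strength": N = F - lambda*T, where N is the top-of-atmosphere
# imbalance (~ rate of ocean heat uptake), F the radiative forcing, T the
# surface warming, and lambda the restoration strength. All numbers are
# illustrative round values, NOT taken from the Masters paper.

F_2X = 3.7  # W/m^2, canonical forcing for a doubling of CO2


def restoration_strength(forcing, imbalance, warming):
    """lambda = (F - N) / T, in W m^-2 K^-1."""
    return (forcing - imbalance) / warming


def equilibrium_sensitivity(lam):
    """ECS = F_2x / lambda, in K per CO2 doubling."""
    return F_2X / lam


# Example: 1.9 W/m^2 of forcing, 0.5 W/m^2 of ocean uptake, 0.7 K warming
lam = restoration_strength(1.9, 0.5, 0.7)
print(round(lam, 2))                           # lambda ~ 2.0 W m^-2 K^-1
print(round(equilibrium_sensitivity(lam), 2))  # ECS ~ 1.85 K
```

Note how a larger restoration strength (the observational 2.05 versus the CMIP5 mean 1.16 quoted above) directly implies a lower equilibrium sensitivity.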
=============================================================
Compared to Dr. Roy Spencer’s post about models -vs- reality…
…it looks more and more as if climate sensitivity is at the lower end of the scale rather than the high end, such as the 4°C per doubling of CO2 recently claimed at RealClimate by Fasullo and Trenberth.
And there’s yet ANOTHER paper arguing for lower climate sensitivity. See it here
Causes of the global warming observed from the 19th century
M.J. Ring, D. Lindner, E.F. Cross, M.E. Schlesinger
Abstract. Measurements show that the Earth’s global-average near-surface temperature has increased by about 0.8℃ since the 19th century. It is critically important to determine whether this global warming is due to natural causes, as contended by climate contrarians, or by human activities, as argued by the Intergovernmental Panel on Climate Change. This study updates our earlier calculations which showed that the observed global warming was predominantly human-caused. Two independent methods are used to analyze the temperature measurements: Singular Spectrum Analysis and Climate Model Simulation. The concurrence of the results of the two methods, each using 13 additional years of temperature measurements from 1998 through 2010, shows that it is humanity, not nature, that has increased the Earth’s global temperature since the 19th century. Humanity is also responsible for the most recent period of warming from 1976 to 2010. Internal climate variability is primarily responsible for the early 20th century warming from 1904 to 1944 and the subsequent cooling from 1944 to 1976. It is also found that the equilibrium climate sensitivity is on the low side of the range given in the IPCC Fourth Assessment Report.
From the paper:
Additionally, our estimates of climate sensitivity using our SCM and the four instrumental temperature records range from about 1.5 °C to 2.0 °C. These are on the low end of the estimates in the IPCC’s Fourth Assessment Report. So, while we find that most of the observed warming is due to human emissions of LLGHGs, future warming based on these estimations will grow more slowly compared to that under the IPCC’s “likely” range of climate sensitivity, from 2.0 °C to 4.5 °C. This makes it more likely that mitigation of human emissions will be able to hold the global temperature increase since pre-industrial time below 2 °C, as agreed by the Conference of the Parties of the United Nations Framework Convention on Climate Change in Cancun.
Dr. Judith Curry sums it up pretty well:
In weighing the new evidence, especially improvements in the methodology of sensitivity analysis, it is becoming increasingly difficult not to downgrade the estimates of climate sensitivity.
All this blows the laughable Skeptical Science claim “Climate Sensitivity Single Study Syndrome, Nic Lewis Edition” out of the water. Dana should quit while he’s ahead, because his arguments aren’t convincing.
h/t to Mosher
Related articles
- A Comparison Of The Earth’s Climate Sensitivity To Changes In The Nature Of The Initial Forcing (wattsupwiththat.com)

John Piccirilli says:
Man-made global warming is a hoax. Until the mainstream media gets it and reports it as Christopher Booker did with his book “The Real Global Warming Disaster”, we will keep being fed misinformation and propaganda: Al Gore, the completely biased Sixty Minutes show on AGW, even Wikipedia defining CO2 as causing AGW, when at best it can only be a theory. I am totally frustrated with this. It is a political problem (thanks are due to this site for proving the “climate models” false, along with everything else the IPCC has said). The so-called solutions being shoved down our throats are a bigger threat.
Richard M says:
“Sensitivity probably takes the form … f(x,y,z,…) … where we know far too little about x,y,z,…”
I regret that I have but one thumbs-up to give!
Climate sensitivity is estimated based on 0–2,000 m ocean heat content and surface temperature observations from the second half of the 20th century and first decade of the 21st century, using a simple energy balance model and the change in the rate of ocean heat uptake to determine the radiative restoration strength over this time period.
============
How do we know that the warming from 1904 to 1944, the cooling from 1944 to 1976, and the warming from 1976 to 1997, and the cooling from 1997 to 2013 are not simply an oscillation in the deep ocean heat uptake?
The earth has an enormous river of cold water flowing from the poles to the eastern Pacific. We see this as the ENSO oscillation. This river does not flow at a constant rate. It meanders as all rivers do; that is the nature of water. It oscillates in harmony with tidal forces and ocean basins; that is the nature of dynamic systems. When it finally discharges some 800 years after starting its journey we see this natural meander and oscillation as global periods of warming and cooling.
The oceans have so much heat capacity in them as compared to the surface and atmosphere that just the smallest change in the deep ocean conveyor dramatically changes the surface climate. To assume that this river of cold water, by far the largest river on earth, flows in a steady stream is the second biggest nonsense in climate science.
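The heat-capacity claim is easy to check on the back of an envelope. A rough calculation with standard round figures (my numbers, not the commenter's) supports the order of magnitude:

```python
# Back-of-envelope check of the ocean-vs-atmosphere heat capacity claim,
# using standard approximate values (not from the post or comment):
atm_mass = 5.1e18     # kg, total mass of the atmosphere (approx.)
atm_cp = 1.0e3        # J/(kg K), specific heat of air (approx.)
ocean_mass = 1.4e21   # kg, total mass of the oceans (approx.)
ocean_cp = 4.0e3      # J/(kg K), specific heat of seawater (approx.)

ratio = (ocean_mass * ocean_cp) / (atm_mass * atm_cp)
print(round(ratio))   # roughly 1100: the oceans hold ~1000x the
                      # atmosphere's total heat capacity
```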
Richard M says:
April 18, 2013 at 7:09 am
“Sensitivity probably takes the form … f(x,y,z,…) … where we know far too little about x,y,z,…”
And what do we know about f?
Mike McMillan says:
April 18, 2013 at 3:57 am
How difficult would it be to just dial in a lower sensitivity and run the models? Has anyone ever done that, or are they afraid the result would be closer to reality?
###############
sensitivity is not a parameter that you SET in running a model. sensitivity is an OUTPUT of the model. By adjusting parameters (such as aerosol forcing, which has a lot of uncertainty) you can “tune” the model to get closure at TOA, for example. When you then run that model in “forecast” mode and double CO2, you get an answer, and that’s basically your sensitivity.
So, sensitivity is not an input to models. You don’t set a variable named sensitivity to 3.
Models range in sensitivity from about 2.1 to 4.4.
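Mosher's point can be illustrated with a toy zero-dimensional sketch (mine, and nothing like how an actual GCM is coded): you set physical parameters such as feedback strengths, and sensitivity falls out as a diagnostic rather than going in as a knob.

```python
# Toy illustration: there is no "sensitivity" input. You choose physical
# parameterizations (here, a lumped feedback sum), then DIAGNOSE the
# sensitivity by doubling CO2 and reading off the equilibrium warming.
# All numbers are illustrative round values.

F_2X = 3.7             # W/m^2, forcing per CO2 doubling
PLANCK_RESPONSE = 3.2  # W m^-2 K^-1, no-feedback restoring strength


def diagnosed_sensitivity(feedback_sum):
    """Equilibrium warming under 2xCO2: T = F_2x / (Planck - feedbacks)."""
    return F_2X / (PLANCK_RESPONSE - feedback_sum)


# Different feedback parameterizations yield different EMERGENT sensitivity:
print(round(diagnosed_sensitivity(0.0), 2))  # ~1.16 K, no feedbacks
print(round(diagnosed_sensitivity(1.9), 2))  # ~2.85 K, net positive feedbacks
```

Nothing in the "run" above sets sensitivity directly; changing the feedback parameterization changes what the doubling experiment diagnoses.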
Tilo Reber says:
April 18, 2013 at 6:29 am
So the alternative is to produce numbers that are just a little bit lower, but not low enough to allow the team to declare you a heretic.
=========
interesting example that has been shown to be true from other historical examples. bad scientific estimates by the powerful and influential are rarely corrected overnight. they are whittled away slowly over the years to save face.
rather than having to admit they made a huge boo-boo, climate science can instead admit they simply made a number of very small mistakes. since none of their mistakes were large, they mostly had it right.
so, except for the people that freeze to death in the meantime, everything is peachy. and as dead people rarely complain, who is the wiser?
“…as contended by climate contrarians, …”
Relative to what? Well at least we have graduated from the ugly D-word to the less derogatory ‘contrarians’. But the joke is on the consensus. I do believe we are heading for a tipping point where the consensus becomes the D-folk. It would not be correct to call their position that of a contrarian then.
So this is to be the mechanism. It always was going to be, I suppose.
The wild ambit claim. Followed, as required, by a tempering of claims. Retaining the position that these are fundamentally legitimate. But that this alteration of claim demonstrates in itself the reasonableness of the proponents. And that while certain more dramatic policy responses can be modified, the core thrust should be maintained. Most particularly, the maintenance of status and money flowing to the developers.
It demonstrates that to provide evidence that discredits “the science” is not enough. The “scientists” themselves must be discredited. By their own actions.
Steven Mosher says:
April 18, 2013 at 7:56 am
“sensitivity is not a parameter that you SET in running a model. sensitivity is an OUTPUT of the model.”
Hmm… I think it might be a cool idea to SET it as a parameter at 1.5, adjust the other parameters, starting with those with the largest error bars, and see if we can get TOA closure. What would be wrong with that? It seems we are constraining sensitivity better and better with observations.
Steven Mosher says:
April 18, 2013 at 7:56 am
So, sensitivity is not an input to models. You don’t set a variable named sensitivity to 3.
==========
the variable is called positive water feedback. this is set to 3 and directly controls CO2 sensitivity, similar to the way a volume control on your radio controls sound. so while the model builders can try and pretend they are not building assumed CO2 sensitivity into their models, they are fooling themselves.
Richard Feynman
“The first principle is that you must not fool yourself—and you are the easiest person to fool.”
Anyone who has ever brought a microphone next to a speaker knows what happens when you turn up the volume: you get runaway feedback. This is what the climate models are doing. They are feeding the sensitivity outputs back into the inputs, and as they turn up the volume they get runaway feedback. It is a nonsense built into the way the models are constructed, and this is made clear by the divergence between models and reality.
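For what it's worth, the "volume knob" analogy corresponds to the standard feedback-gain relation, which converges for a feedback fraction below 1 and diverges only as it approaches 1. A minimal sketch, with illustrative numbers of my own:

```python
# Standard feedback-gain relation: total warming = no-feedback warming
# divided by (1 - f). For f < 1 the geometric series 1 + f + f^2 + ...
# converges (amplification); as f -> 1 it diverges, which is the
# microphone-to-speaker runaway case. Illustrative numbers only.


def amplified(dT0, f):
    """Sum of the feedback series dT0 * (1 + f + f^2 + ...) = dT0 / (1 - f)."""
    assert f < 1, "f >= 1 means runaway feedback -- no finite answer"
    return dT0 / (1.0 - f)


dT0 = 1.2  # K, canonical no-feedback warming for 2xCO2
for f in (0.0, 0.3, 0.5, 0.6):
    print(f, round(amplified(dT0, f), 2))
# f = 0.5 doubles the response to 2.4 K; f = 0.6 gives 3.0 K
```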
One can not determine the quality of the output from a climate simulation code without first knowing what differential equations are being solved numerically, and the numerical methods that are used to solve them. Could someone write those down for us (including the initial and boundary conditions), so we can examine the consistency and stability of the numerical approximations to the differential equations? And I would assume all models are using the same equations (including parameterizations of sub-scale physics) – otherwise, how can you compare one model with another? Right??
By the way, could someone replot the global “temperature” history in terms of absolute temperatures rather than “anomalies”? That would be interesting to see…
We test the null hypothesis: the UAH and the RSS data sets are not statistically different from the ensemble of 44 models.
There is a 1/46 chance that *each* is going to lie at an extremum (in this case the low end) of the ensemble. The chance that both lie at the same extremum is 1/(46^2), or one chance in 2116. The null hypothesis fails at the 95%, the 99%, and even the 99.9% levels. This is about as unlikely as dealing two Jokers off the top of a ‘well-shuffled’ deck – even if you take the 2’s and 3’s out of the deck first.
You should be concerned that the deck has been stacked!
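The arithmetic in that comment is easy to reproduce (taking the comment's own independence assumption at face value; the 46 comes from the 44 models plus the two observational series ranked together):

```python
# Reproducing the comment's null-hypothesis arithmetic. Under the null,
# each observational series is exchangeable with the models, so each has
# a 1/46 chance of ranking at the low extremum; the comment treats the
# two series as independent, giving (1/46)^2 = 1/2116 for both.

n_members = 44 + 2       # 44 model runs + UAH + RSS
p_one = 1 / n_members    # one series at the low extremum by chance
p_both = p_one ** 2      # both, under the independence assumption

print(n_members ** 2)    # 2116
print(p_both < 0.001)    # True: fails even at the 99.9% level
```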
The assumption of positive feedback in climate models is at odds with what we know about dynamic systems. When we look at things that do have positive feedback, such as hurricanes and wildfires, we see that they have one factor in common: they are very short-lived. They quickly exhaust all available energy and die. The very long history of life on earth tells us that climate cannot have long-lived positive feedback. Thus the IPCC models, which all assume long-term positive feedback, must be nonsense. They are inconsistent with the long-term paleo records, as is being demonstrated day by day from observations.
To Joeldshore:
You state: “The correct conversion would say that it predicts a surface warming of ~0.7 C per CO2 doubling.”
Well, good, thank you for the correction. Note, however, that this figure is still 1/2 of the LOWEST possible figure used by the IPCC for a CO2 doubling. Since these GCMs are essentially Monte Carlo, I think it’s appropriate to repeat what they say in Vegas: “A LOSER! Thanks for playing!”
If the Earth were at all sensitive to changes in CO2, then we would not be here. It’s because the Earth survives CO2 changes and temperature changes that life evolves.
Interesting, isn’t it, that this “study of a series of models studying the influence of CO2 “forcing” on temperature” finds out ….. THAT (according to the models) CO2 AFFECTS TEMPERATURE!
Wow.
What will they pay for next?
/sarchasm – that gaping hole between a socialist and the real world.
Further, a study showed 97% of government-paid ‘scientists’ agree that “government-paid studies to find out that temperature is affected by CO2 find out that temperature is affected by CO2 in direct proportion to the level of new government funding paid out to scientists who are paid to find models that prove temperature is affected by CO2! “
From Steven Mosher on April 18, 2013 at 7:56 am:
Wait a sec…
The models are built with assumptions of how the climate initially responds to CO₂ concentration changes, the feedbacks from those responses, etc.
From this is generated the temperature changes resulting from the CO₂ changes, from which sensitivity is calculated.
Aren’t you just saying the mileage is not a direct input to the vehicle’s electronics, it’s something the manufacturer determines from testing the finished vehicle? Even though the desired mileage was built into the vehicle designs from the start?
Michael Schlesinger, the last author on the Ring et al paper and probably the most senior of the authors, would likely be called a “warmist” by many readers of WUWT. He’s an insider, unlike, say, Nic Lewis. They both deserve praise for bucking the trend, although pretty soon their latest findings may be seen as the new consensus.
I think the TRENDS of multi-decadal warming and cooling prior to 1979 can be compared to the recent period of warming and (soon) to the extension of a new cooling period. If the slopes of these recent trends are the same as those of the late 19th century and the first three-quarters of the 20th, then it is essentially natural variability, apart from the natural rise out of the LIA. A significant warming signal would show steeper warming slopes and shallower cooling slopes. The only fly in the ointment of this simple, definitive quantitative test is the corruption of the temperature records. What is wrong with this, really?
Judith Curry’s latest blog entry discusses a slew of new studies, including Schlesinger’s (Ring et al). She refers to Schlesinger as a “heavyweight.” First, the link to this entry:
http://judithcurry.com/2013/04/17/meta-uncertainty-in-the-determination-of-climate-sensitivity/#more-11524
And the most relevant three paragraphs:
“Two heavyweight climate scientists have published very different ideas about how much the Earth is going to warm in the coming decades. And neither has much regard for the other’s estimate – casting light on a long-standing, thorny issue in climate science.
Future warming is likely to be on the high end of predictions says Kevin Trenberth of the National Center for Atmospheric Research who has been a lead author for the United Nations Intergovernmental Panel on Climate Change (IPCC).
But Michael Schlesinger, who heads the Climate Research Group within the Department of the Atmospheric Sciences at the University of Illinois, has just published a study with his group finding warming will be at the low end of projections.”
On the Ring et al. paper: their introduction makes clear that the only solar forcing they take into account is the very slight variation in solar irradiance. But strong correlations between solar activity and climate going back many thousands of years indicate that some substantial mechanism of solar amplification must be at work (some mechanism of solar forcing substantially stronger than the variance in solar radiation). Since solar activity was high over the 20th century, that would substantially increase the forcing over the 20th century, which would further reduce the implied sensitivity, so even their reduced estimates are still way too high. On theoretical grounds (the thermostat hypotheses of Lindzen, Spencer and Eschenbach) there is solid reason to think that climate feedbacks are negative (sensitivity < 1).
I don’t remember where, but I read this yesterday about the use of “uniform priors”: http://bishophill.squarespace.com/blog/2013/1/25/uniform-priors-and-the-ipcc.htm
It’s about some blog comments at RC. http://www.realclimate.org/index.php/archives/2013/01/on-sensitivity-part-i/comment-page-2/#comments
Does Masters use “uniform priors”?
Chris R says:
I recommend you read the next sentence of my comment http://wattsupwiththat.com/2013/04/18/another-paper-finds-lower-climate-sensitivity/#comment-1278857 where I explained why no one who is a skeptic in the true sense of the word would believe their result, based as it is on an untested method.
No, really, it’s worse than we thought. /sarc
If the global atmospheric CO2 concentration continues to rise and the global mean temperature continues to (approximately) flatline, then the estimates (and credible intervals) of “climate sensitivity to CO2” will continue to decline. Every decision humans might make depends on future data.
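That logic follows directly from the simple energy-balance estimator used in studies like those above: rearranging N = F − λT gives ECS ≈ F_2x·T/(F − N). A sketch with made-up illustrative numbers (mine, not anyone's data):

```python
# Sketch of why flat temperatures plus rising forcing pull the inferred
# sensitivity down: ECS ~ F_2x * T / (F - N). Illustrative numbers only.

F_2X = 3.7  # W/m^2, forcing per CO2 doubling


def inferred_ecs(warming, forcing, imbalance):
    """Energy-balance sensitivity estimate, in K per doubling."""
    return F_2X * warming / (forcing - imbalance)


# Same 0.8 K of warming, but forcing grows from 1.9 to 2.3 W/m^2:
print(round(inferred_ecs(0.8, 1.9, 0.5), 2))  # higher estimate
print(round(inferred_ecs(0.8, 2.3, 0.5), 2))  # lower estimate, more forcing
```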