Another paper finds lower climate sensitivity

Yesterday we talked about the new paper from Nic Lewis; now Troy Masters has a new paper in press at Climate Dynamics here.

Observational estimate of climate sensitivity from changes in the rate of ocean heat uptake and comparison to CMIP5 models

Unfortunately, Springerlink wants $39.95 for the privilege of reading it, so all I can do is provide the abstract. On his blog, however, Troy does show Figure 5 of the paper:

[Figure 5 of the paper]

Abstract. Climate sensitivity is estimated based on 0–2,000 m ocean heat content and surface temperature observations from the second half of the 20th century and first decade of the 21st century, using a simple energy balance model and the change in the rate of ocean heat uptake to determine the radiative restoration strength over this time period. The relationship between this 30–50 year radiative restoration strength and longer term effective sensitivity is investigated using an ensemble of 32 model configurations from the Coupled Model Intercomparison Project phase 5 (CMIP5), suggesting a strong correlation between the two. The mean radiative restoration strength over this period for the CMIP5 members examined is 1.16 W m−2 K−1, compared to 2.05 W m−2 K−1 from the observations. This suggests that temperature in these CMIP5 models may be too sensitive to perturbations in radiative forcing, although this depends on the actual magnitude of the anthropogenic aerosol forcing in the modern period. The potential change in the radiative restoration strength over longer timescales is also considered, resulting in a likely (67 %) range of 1.5–2.9 K for equilibrium climate sensitivity, and a 90 % confidence interval of 1.2–5.1 K.
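To make the abstract's bookkeeping concrete: in a simple energy balance the top-of-atmosphere budget satisfies F − N = λT (N being the ocean heat uptake rate), so differencing two periods gives λ = (ΔF − ΔN)/ΔT, and an equilibrium sensitivity follows if one also assumes the canonical F_2x ≈ 3.7 W m−2 forcing for doubled CO2. A minimal sketch with made-up illustrative numbers, chosen only so the arithmetic lands near the quoted 2.05 W m−2 K−1 (these are not the paper's data):

```python
# Toy version of the abstract's energy-balance bookkeeping.
# All three deltas below are illustrative stand-ins, NOT values from the paper.
dF = 1.0   # W m^-2: change in radiative forcing between the two periods (assumed)
dN = 0.18  # W m^-2: change in the rate of 0-2000 m ocean heat uptake (assumed)
dT = 0.40  # K: change in surface temperature (assumed)

lam = (dF - dN) / dT  # radiative restoration strength, W m^-2 K^-1
ecs = 3.7 / lam       # equilibrium sensitivity, assuming F_2x ~ 3.7 W m^-2

print(round(lam, 2), "W m^-2 K^-1 ->", round(ecs, 1), "K")  # 2.05 W m^-2 K^-1 -> 1.8 K
```

A larger λ (a stronger restoring flux per degree of warming) means a smaller sensitivity, which is why the observational 2.05 versus the CMIP5 mean of 1.16 maps onto lower versus higher ECS.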

=============================================================

Compared to Dr. Roy Spencer’s post about models -vs- reality

[Chart from Dr. Spencer's post: CMIP5 global lower-troposphere temperatures vs. UAH and RSS observations]

…it looks more and more as if climate sensitivity is on the lower end of the scale, rather than at the high end claimed recently at RealClimate by Fasullo and Trenberth: 4°C for a doubling of CO2.

And there’s yet ANOTHER paper arguing for lower climate sensitivity. See it here

Causes of the global warming observed from the 19th century

M.J. Ring, D. Lindner, E.F. Cross, R.E. Schlesinger

Abstract.  Measurements show that the Earth’s global-average near-surface temperature has increased by about 0.8℃ since the 19th century. It is critically important to determine whether this global warming is due to natural causes, as contended by climate contrarians, or by human activities, as argued by the Intergovernmental Panel on Climate Change. This study updates our earlier calculations which showed that the observed global warming was predominantly human-caused. Two independent methods are used to analyze the temperature measurements: Singular Spectrum Analysis and Climate Model Simulation. The concurrence of the results of the two methods, each using 13 additional years of temperature measurements from 1998 through 2010, shows that it is humanity, not nature, that has increased the Earth’s global temperature since the 19th century. Humanity is also responsible for the most recent period of warming from 1976 to 2010. Internal climate variability is primarily responsible for the early 20th century warming from 1904 to 1944 and the subsequent cooling from 1944 to 1976. It is also found that the equilibrium climate sensitivity is on the low side of the range given in the IPCC Fourth Assessment Report.

From the paper:

Additionally, our estimates of climate sensitivity using our SCM and the four instrumental temperature records range from about 1.5°C to 2.0°C. These are on the low end of the estimates in the IPCC’s Fourth Assessment Report. So, while we find that most of the observed warming is due to human emissions of LLGHGs, future warming based on these estimations will grow more slowly compared to that under the IPCC’s “likely” range of climate sensitivity, from 2.0°C to 4.5°C. This makes it more likely that mitigation of human emissions will be able to hold the global temperature increase since pre-industrial time below 2°C, as agreed by the Conference of the Parties of the United Nations Framework Convention on Climate Change in Cancun.

Dr. Judith Curry sums it up pretty well:

In weighing the new evidence, especially improvements in the methodology of sensitivity analysis, it is becoming increasingly difficult not to downgrade the estimates of climate sensitivity.

All this blows the laughable Skeptical Science claim Climate Sensitivity Single Study Syndrome, Nic Lewis Edition out of the water. Dana should quit while he’s ahead, because his arguments aren’t convincing.

h/t to Mosher

April 18, 2013 3:04 am

I think we can safely say that a consensus of recent papers – probably 97% of them, in fact – points to a low sensitivity…

April 18, 2013 3:08 am

The IPCC’s suggestion that climate sensitivity is most likely in the range 2.0 to 4.5°C is shown to be barely supportable, and then only by favoring computer simulations of the climate over empirical measurements. Yes, confusion reigns, but is that not part of the plan? If it had not been for the internet, the Alarmists’ plan would in large part already have been implemented. If that had happened, we would all now be living in FUBAR land.
So, in regard to climate sensitivity and the Alarmist screaming about massive positive feedback: it seems to me to throw lesson one of the scientific method out of the classroom window. And I really do mean lesson one: “In general, we look for a new law by the following process: First we guess it; then we compute the consequences of the guess to see what would be implied if this law that we guessed is right; then we compare the result of the computation to nature, with experiment or experience, compare it directly with observation, to see if it works. If it disagrees with experiment, it is wrong. In that simple statement is the key to science. It does not make any difference how beautiful your guess is, it does not make any difference how smart you are, who made the guess, or what his name is — if it disagrees with experiment, it is wrong.” – Richard Feynman.
So an examination suggests that the values of climate sensitivity used in the PAGE model are highly debatable. But of course it’s actually even worse than that (it usually is). Close followers of the climate debate will recall Nic Lewis’s guest post at Prof Curry’s blog last year, in which he noted that the “Forster and Gregory” values in the IPCC graph were not the values that were implicit in Forster and Gregory’s published results – the IPCC had notoriously chosen to restate the findings in a way that gave a radically higher estimate of climate sensitivity. See: http://judithcurry.com/2011/07/05/the-ipccs-alteration-of-forster-gregorys-model-independent-climate-sensitivity-results/
Love this: “The IPCC took eight studies on climate sensitivity, of which one (Forster/Gregory 06) was the only study based purely on observational evidence, with no dependence on any climate model simulations, threw said study into their voodoo math machines and basically came up with 2x the result. It then put the study up in the graph with the other studies and basically pulled the ‘Mike’s Nature trick/hide the decline’ game.”
On another study: http://www.agu.org/pubs/crossref/2006/2005GL023977.shtml “Estimated PDFs of climate system properties including natural and anthropogenic forcings.” Nicholas Lewis has been trying for over a year and a half, without success, to obtain from Dr Forest the data used in Forest 2006. However, he has been able to obtain without any difficulty the data used in two related studies that were stated to be based on the Forest 2006 data. It appears that Dr Forest only provided pre-processed data for use in those studies, which is understandable, as the raw model dataset is very large. Unfortunately, Dr Forest reports that the raw model data is now lost. Yes, LOST – the dog ate my data! Worse, the sets of pre-processed model data that he provided for use in the two related studies, while both apparently deriving from the same set of model simulation runs, were very different. Talking of dogs, it seems my vet keeps a better record of my dog’s health than Dr Forest can.
The global warming scare has fizzled like a wet firecracker. The sun has entered a new quiet phase, and average global temperatures have been stable for 16 years. Wasteful and ridiculous climate conferences in Doha and elsewhere have achieved nothing except hot air and Green bollocks. Kyoto has become a sick joke and is now a footnote at the bottom of the page in history. Countries that agreed to climate stabilization policies have failed miserably and are now retreating from that untenable position. The general public has really had enough of the scaremongering and has more important things in their lives. In short, most people just laugh, as CAGW is now just a very expensive bad joke.

MattN
April 18, 2013 3:12 am

Looks like it’s about 1-1.5C for a doubling….

Stefan
April 18, 2013 3:20 am

To the layman that spaghetti graph is quite clear: the 1998 spike wasn’t “consistent with models”, it was just a blip, and the models are all over the place anyway.

Evan Jones
Editor
April 18, 2013 3:24 am

I prefer the caveman approach. I am a top-down sort of a person, having designed a number of non-climate simulations. The problem with the models is that they try to do the Russian Front using a man-to-man approach rather than an Army Group-to-Front approach, and that results in huge bottom-up divergence error.
Stipulating that temperatures rose 0.7C over the last century (and I think it is significantly lower), that figure includes natural forcings (e.g., most of the warming pre-1950) and non-CO2 anthropogenic effects (such as soot), and it comes after a CO2 increase of 40%, including feedbacks.
Given the gross data and effects, I find it hard to conclude that sensitivity is as much as double the 0.7C figure, for a doubling of CO2. Probably more in the range of 1.0 or so: Arrhenius effect alone, with a net-neutral feedback.
I see very little analysis based on this basic approach in the literature. And it seems to me that this is the approach to start with.
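The arithmetic behind this top-down approach can be written down with the standard logarithmic forcing relation ΔT = S·ln(C/C₀)/ln 2. The sketch below is a reader's illustration, not anything from the comment or the papers above; it attributes the whole stipulated 0.7 C to CO2, so the result is an upper bound under that assumption:

```python
import math

def implied_sensitivity(warming_c, co2_increase_fraction):
    """Sensitivity per CO2 doubling, if ALL the observed warming were CO2-driven."""
    return warming_c * math.log(2) / math.log(1 + co2_increase_fraction)

# Stipulated 0.7 C of warming after a 40% CO2 rise, feedbacks included:
print(round(implied_sensitivity(0.7, 0.40), 2))  # ~1.44 C per doubling
```

Subtracting the natural and non-CO2 contributions the comment lists would pull this below 1.44 C, which is consistent with the "1.0 or so" estimate above.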

Ken Hall
April 18, 2013 3:27 am

So when the EU and UN leaders were calling for drastic, economy-destroying action to keep the temperature rise to a maximum of 2C by 2100 as their goal, we are now seeing that taking NO ACTION WHATSOEVER will still see us meet, and even beat, their goal.
Can we stop all these expensive, pointless, damaging, industry-destroying, elderly-killing “green” taxes now, please???

Ken Hall
April 18, 2013 3:49 am

in reply to jb frodsham above…
You said, “Countries that agreed to climate stabilization policies have failed miserably and now are retreating from that untenable position. “
I wish that were true. The UK passed a Climate Change Act which will destroy our economy and leave tens of thousands of elderly, infirm and the poorest people dead from hypothermia, and the Government in the UK is determined to implement it in full. The UK government is, tragically, meeting its obligation to reduce CO2.
We in the UK desperately need to bring a lot of pressure to bear on MPs to force them to repeal the futile, damaging and expensive Climate Change Act.
The Climate Change Act is an exercise in cruel futility: even if we in the UK stopped ALL our CO2 emissions, China, Brazil and India would massively offset our reductions, and our outsourcing of emissions may even see our indirect (outsourced) emissions increase in other parts of the world. The Climate Change Act will not reduce global temperatures at all; it will not reduce sea levels, protect a single glacier, save a single polar bear, or prevent a single flood or drought. This Act is futile, completely and utterly futile.
Now, on top of the fact that our emissions will be offset, thus failing to reduce global CO2 emissions at all, we are becoming increasingly aware that the climate’s sensitivity to CO2 is nowhere near as great as the alarmists perceived it to be. So, again, more and more reason to abandon the industry-destroying and elderly-killing CO2 reduction targets in the UK.
Sadly, our demented leaders seem obsessed with keeping the CO2 targets as one of the very few things they are not prepared to U-turn on, in spite of more and more evidence clearly showing that they should.

Mike McMillan
April 18, 2013 3:57 am

How difficult would it be to just dial in a lower sensitivity and run the models? Has anyone ever done that, or are they afraid the result would be closer to reality?

Other_Andy
April 18, 2013 4:54 am

So, according to this paper, humanity is responsible for the period of warming from 1976 to 2010
And how do they know….?
Empirical evidence?
You wish…….!
Climate Model Simulation.
Sigh……….

Nial
April 18, 2013 4:55 am

That Skeptical ‘science’ article calls it “single study syndrome” then links to another study with more or less the same outcome. Is this deliberate self parody?

Douglas Hoyt
April 18, 2013 4:58 am

There is also a paper by Bjornbom that deduces that a doubling of CO2 will lead to a rise of 0.18 C at the surface.
Here is the abstract:
Estimation of the climate feedback parameter by using radiative fluxes from CERES EBAF
P. Björnbom
KTH Royal Institute of Technology, c/o Pehr Björnbom, Kometvägen 1, 18333 Täby, Sweden
Abstract. Top-of-the-Atmosphere (TOA) net radiative flux anomalies from Clouds and Earth’s Radiant Energy Systems (CERES) Energy Balanced and Filled (EBAF) and surface air temperature anomalies from HadCRUT3 were compared for the time interval September 2000–May 2011. In a phase plane plot with the radiative flux anomalies lagging the temperature anomalies by 7 months, the phase plane curve approached straight lines during an approximately eight-month period at the beginning and a five-year period at the end of the interval. Both of those periods, but more clearly the latter one, could be connected to the occurrence of distinct El Niño Southern Oscillation (ENSO) episodes. This result is explained by using a hypothesis stating that non-radiative forcing connected to the ENSO is dominating the temperature changes during those two periods and that there is a lag between the temperature change and the radiative flux feedback. According to the hypothesis the slopes of the straight lines equal the value of the climate feedback parameter. By linear regression based on the mentioned five-year period, the value of the climate feedback parameter was estimated at 5.5 ± 0.6 W m−2 K−1 (± two standard errors).
See http://hockeyschtick.blogspot.com/2013/01/new-paper-confirms-findings-of-lindzen.html

tobias smit
April 18, 2013 5:00 am

“Humanity is also responsible for the warming from 1976–2010.”
Is that according to his (GIGO) model?
Or reality – and if so, whose reality?

Leonard Weinstein
April 18, 2013 5:46 am

Even the low-sensitivity versions look at the atmospheric temperature rise in 1970–1998 and assume the flat level for 1998–2013 is just a pause in the rise. Ocean data is obviously not reliable before about 2003, so I would not place any emphasis on that. If atmospheric and ocean temperature levels continue nearly flat or downward going forward, any claimed sensitivity greater than zero would have to explain why there is no significant rise.

kadaka (KD Knoebel)
April 18, 2013 5:54 am

…Troy Masters has a new paper in press at Climate Dynamics
How many journals sprang up to take advantage of, er, spread the sanctified gospel about, er, provide additional outlets for the needed dissemination of information concerning, the climate change crisis?
How many journals have become irredeemable through their endless shameless extremism, er, partisan promotion of politically expedient positions, er, questionable editorial practices yielding apparent bias, concerning their believed certainty of CAGW?
With the Climate Change Creature vanishing in the daylight,
The assorted “Green technologies” revealed as a handful of bubbles left to burst and drain away as consumers rouse themselves and get out of the cold fiscal bath,
The “impending unwavering doom” revealed as possible discomfort after several hundreds of years of not taking advantage of the coming advancing technology with inherent “carbon emissions” reductions,
Whatever will happen to all these unneeded and largely-unread climate-focused journals?

joeldshore
April 18, 2013 6:01 am

Douglas Hoyt says:

There is also a paper by Bjornbom that deduces that a doubling of CO2 will lead to a rise of 0.18 C at the surface.
Here is the abstract: …
See http://hockeyschtick.blogspot.com/2013/01/new-paper-confirms-findings-of-lindzen.html

You are just repeating a false statement made by HockeySchtick on his website. He used an incorrect method of converting from the finding that the feedback parameter is 5.5 ± 0.6 W m−2 K−1 to what the sensitivity to doubling CO2 is, apparently because he doesn’t understand what people mean by “top of the atmosphere”. The correct conversion would say that it predicts a surface warming of ~0.7 C per CO2 doubling.
Furthermore, as noted by one of the online reviewers ( http://www.earth-syst-dynam-discuss.net/4/C161/2013/esdd-4-C161-2013.pdf ), the author of the paper has made no attempt to validate his method of computing the feedback parameter. Hence, this is a result using a method that is completely untested. You just have to take it on faith that this is a valid method for assessing the feedback parameter and climate sensitivity…Hardly the way one would expect a real skeptic to proceed.
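The conversion being described is, in its simplest form, ECS = F_2x/λ, where F_2x ≈ 3.7 W m−2 is the commonly assumed forcing for a doubling of CO2 (that value is an assumption in this sketch, not something stated in the comment):

```python
F_2X = 3.7  # W m^-2: assumed radiative forcing from a doubling of CO2

def ecs_from_feedback(lam):
    """Equilibrium warming (K) per CO2 doubling, from a feedback parameter in W m^-2 K^-1."""
    return F_2X / lam

print(round(ecs_from_feedback(5.5), 2))  # ~0.67, matching the ~0.7 C figure quoted above
```

The "0.18 C" claim upthread comes from applying a different (surface-flux) conversion to the same 5.5 W m−2 K−1 number, which is exactly the disputed step.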

April 18, 2013 6:02 am

Stefan (April 18, 2013 at 3:20 am) “To the layman that spaghetti graph is quite clear: the 1998 spike wasn’t “consistent with models”, it was just a blip”
Stefan, I agree. The modelers drew a number of conclusions in the early 2000s based on that blip and the 1980s/90s warming that are now proving wrong: high sensitivity, more El Niños (although that was still under debate), less blocking (a stronger, more northerly polar jet), etc. Their fundamental mistake was to believe that the slight warming from CO2 would create new weather regimes resulting in positive feedback. To the contrary, weather drives fluctuations an order of magnitude (more in the short run) greater than warming from increased CO2.
Now various alarmists (e.g. Romm), without realizing the ramifications, are claiming that warming from CO2 is creating new weather regimes which are in fact negative feedback (stronger storms, droughts, blocking patterns, etc.). If we are to believe those claims of weather doom, then we must also accept lower sensitivity. On the SkepSci thread a couple of years ago, they called this my theory of mutually exclusive catastrophe. Their reply was basically that it wasn’t true, or that we are doomed whether we have storms or strong warming which melts Greenland’s ice.

richardscourtney
April 18, 2013 6:18 am

Friends:
I write to ask a genuine question. This post is NOT ‘knocking copy’.
I argue that climate sensitivity is low and, therefore, if I were being prejudiced then I would be supporting this study by Troy Masters.
But my question is
Does anybody think the paper by Troy Masters has any credibility and, if so, why?
The reason for my question is simple and is as follows.
As recently as yesterday there was an article about ocean heat uptake by Bob Tisdale posted on WUWT. It is at
http://wattsupwiththat.com/2013/04/17/a-different-perspective-on-trenberths-missing-heat-the-warming-of-the-global-oceans-0-to-2000-meters-in-deg-c/
and it can be summarised by this quotation from it.

That’s right. According to Levitus et al 2012, the average temperature of the global oceans to depths of 2000 meters warmed a miniscule 0.09 deg C (or 0.16 deg F) from 1955 to 2010. Granted, the heat capacity of the ocean is much greater than the atmosphere, but that warming of 0.09 deg C strains believability. Are we able to sense such a small change?

Of course, we cannot “sense such a small change” and it is a construct of assumptions.
Today we have a report of this study by Troy Masters which says

Climate sensitivity is estimated based on 0–2,000 m ocean heat content and surface temperature observations from the second half of the 20th century and first decade of the 21st century, using a simple energy balance model and the change in the rate of ocean heat uptake to determine the radiative restoration strength over this time period.

So, the paper by Masters derives a value for climate sensitivity based on assumptions. And I fail to understand how anybody can confuse such a derivation with science.
Richard

April 18, 2013 6:18 am

These “lower estimates” are just absurd. They are like a gambler who, having called a coin toss correctly three times, says on the fourth, when he gets it wrong: “my ability to predict isn’t quite as high as I thought”.
In other words, we will see natural climate variation continue to be the main factor changing the global temperature. And if it goes down … they will continue to adjust their models downward, and if it goes up … they will adjust their models upward.
Without ever contemplating the reality that they cannot predict the climate at all.
Climate sensitivity is just a jazzed up climate researcher’s name for ESP

April 18, 2013 6:29 am

These papers are important small steps in the right direction. If you are a climate scientist today and you want to question the alarmist orthodoxy, you are taking big chances with your career. So the alternative is to produce numbers that are just a little bit lower, but not low enough to allow the team to declare you a heretic. Then, as the number of published papers pile up on the lower end of the sensitivity range, it allows others to take it even a little further. In the long run the guys that are being careful will arrive at the same place as people like Spencer and Lindzen.

April 18, 2013 6:50 am

Now, just for a moment try to imagine that the UAH and RSS lines in Roy’s post are ABOVE all the models instead of below them.
Raise your hand if you think Hansen, Trenberth, etc., would be defending the models, as opposed to shrieking at the top of their lungs “THE MODELS WERE WRONG, IT’S EVEN WORSE THAN WE THOUGHT!!
Anyone? Anyone? Bueller?

April 18, 2013 6:51 am

Abstract. Measurements show that the Earth’s global-average near-surface temperature has increased by about 0.8℃ since the 19th century.
********
So it is a kind of mystery, because the global average has gone up by 0.8 deg C already between 1910-1945:
http://www.woodfortrees.org/plot/hadcrut3vgl/from:1910/to:1945
At the same time, CO2 went up by only a tenth of what it did in the modern, post-1975 warming, when temperature went up by a mere half as much, so CO2 definitively has a cooling effect or something 😮
http://www.woodfortrees.org/plot/rss/from:1975/to:2004
None of the models replicates the 1910–1945 warming – maybe some 0.1 deg C. Even with overblown positive feedbacks and thick radiation arrows, one cannot get 0.8 deg C of warming from humans in the early part of the 20th century!! The null hypothesis is that the natural warming of 1910–45 repeated itself in the 1975–2005 period, with natural cooling in between and again since 200X.

ferd berple
April 18, 2013 7:08 am

If the climate models were truly concerned with calculating CO2 sensitivity, then CO2 sensitivity would be one of the outputs of the models. The models would use observed temperature as an input and spit out climate sensitivity as an output.
However, that is not what has been done. Climate models use an assumed climate sensitivity as an input and spit out temperature as an output. The divergence between the models and observations tells us that the models have wrong assumptions. However, it tells us very little about what the true sensitivity should be because of the non-linear aspects of climate.
what was built:
climate model (assumed sensitivity co2) = predicted temperature
what the customer ordered:
inverse climate model (observed temperature) = true sensitivity co2
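The forward/inverse contrast above can be sketched with a zero-dimensional toy. This is only an illustration of the two directions of computation; as the comment notes, the real climate is non-linear, so a real inversion would be nothing like this clean:

```python
import math

def forward_model(assumed_sensitivity, co2_ratio):
    """'What was built': assumed sensitivity in, predicted temperature out."""
    return assumed_sensitivity * math.log(co2_ratio) / math.log(2)

def inverse_model(observed_dt, co2_ratio):
    """'What the customer ordered': observed temperature in, implied sensitivity out."""
    return observed_dt * math.log(2) / math.log(co2_ratio)

dt = forward_model(3.0, 1.4)             # forward run with an assumed sensitivity of 3.0
print(round(inverse_model(dt, 1.4), 1))  # 3.0: the toy inversion recovers the assumption
```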

April 18, 2013 7:08 am

Unfortunately, Springerlink wants $39.95 for the privilege of reading it
It’s hilarious how things like this are real stumbling blocks for skeptics, while the AGWers claim we’re getting billions from oil companies, Koch brothers, etc.
I bet Mann, Gavin, Hansen, Jones, etc don’t even have to submit a receipt for something like this to have it paid for by us (the taxpayers).

Richard M
April 18, 2013 7:09 am

Climate sensitivity is not a constant. Trying to find a fixed value for a variable is a fool’s errand. There are many factors in our climate system that will lead to different sensitivities at different times. I think this is evident as people are now finding different numbers than they did when the Earth was warming instead of cooling.
Sensitivity probably takes the form … f(x,y,z,…) … where we know far too little about x,y,z,…

John West
April 18, 2013 7:19 am

jb frodsham says:
”If it had not been for the internet, the Alarmists plan would in a large part already been implemented.”
And the current pause could be being attributed to their valiant efforts to convince the world of their brilliant soothsaying. Annual Nobel Prizes for “The Team”! Hooray, they saved us from ourselves just in the nick of time!

Jnpics@aol.com
April 18, 2013 7:19 am

Man-made global warming is a hoax. Until the mainstream media gets it and reports it as Christopher Booker did with his book “The Real Global Warming Disaster”, we will keep being fed too much misinfo and propaganda (Al Gore; the completely biased Sixty Minutes show on AGW; Wikipedia even defines CO2 as causing AGW, when at best it can only be a theory). I am totally frustrated with this. It is a political problem (thanks is due to this site for proving the “climate models” false, along with everything else the IPCC has said); the so-called solutions being shoved down our throats are a bigger threat. – John Piccirilli

John West
April 18, 2013 7:23 am

Richard M says:
“Sensitivity probably takes the form … f(x,y,z,…) … where we know far too little about x,y,z,…”
I regret that I have but 1 thumbs up to give!

ferd berple
April 18, 2013 7:39 am

Climate sensitivity is estimated based on 0–2,000 m ocean heat content and surface temperature observations from the second half of the 20th century and first decade of the 21st century, using a simple energy balance model and the change in the rate of ocean heat uptake to determine the radiative restoration strength over this time period.
============
How do we know that the warming from 1904 to 1944, the cooling from 1944 to 1976, and the warming from 1976 to 1997, and the cooling from 1997 to 2013 are not simply an oscillation in the deep ocean heat uptake?
The earth has an enormous river of cold water flowing from the poles to the eastern Pacific. We see this as the ENSO oscillation. This river does not flow at a constant rate. It meanders as all rivers do; that is the nature of water. It oscillates in harmony with tidal forces and ocean basins; that is the nature of dynamic systems. When it finally discharges some 800 years after starting its journey we see this natural meander and oscillation as global periods of warming and cooling.
The oceans have so much heat capacity in them as compared to the surface and atmosphere that just the smallest change in the deep ocean conveyor dramatically changes the surface climate. To assume that this river of cold water, by far the largest river on earth, flows in a steady stream is the second biggest nonsense in climate science.

son of mulder
April 18, 2013 7:55 am

” Richard M says:
April 18, 2013 at 7:09 am
Sensitivity probably takes the form … f(x,y,z,…) … where we know far too little about x,y,z,…”
And what do we know about f?

April 18, 2013 7:56 am

“Mike McMillan says:
April 18, 2013 at 3:57 am
How difficult would it be to just dial in a lower sensitivity and run the models? Has anyone ever done that, or are they afraid the result would be closer to reality?
###############
Sensitivity is not a parameter that you SET when running a model; sensitivity is an OUTPUT of the model. By adjusting parameters (such as aerosol forcing, which has a lot of uncertainty) you can “tune” the model to get closure at TOA, for example. When you then run that model in “forecast” mode and double CO2, you get an answer, and that’s basically your sensitivity.
So, sensitivity is not an input to models. You don’t set a variable named sensitivity to 3.
Models range in sensitivity from about 2.1 to 4.4.
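Mosher's distinction can be caricatured in a few lines: nothing named "sensitivity" is set anywhere; instead two runs are differenced and the number falls out. The `toy_gcm` below is a stand-in invented for illustration (a linear zero-dimensional balance), not any real model's interface:

```python
F_2X = 3.7  # W m^-2: assumed forcing for doubled CO2

def toy_gcm(co2_forcing, aerosol_forcing, feedback):
    """Equilibrium warming of a linear zero-dimensional 'model' (illustrative only)."""
    return (co2_forcing + aerosol_forcing) / feedback

def diagnosed_sensitivity(aerosol_forcing, feedback):
    control = toy_gcm(0.0, aerosol_forcing, feedback)   # control run
    doubled = toy_gcm(F_2X, aerosol_forcing, feedback)  # 2xCO2 run
    return doubled - control                            # sensitivity is an OUTPUT

# In this linear toy the aerosol term cancels in the difference; in a real GCM,
# tuning such parameters also shifts the feedbacks and hence the diagnosed number.
print(round(diagnosed_sensitivity(-1.0, 1.16), 1))  # ~3.2
print(round(diagnosed_sensitivity(-0.3, 2.05), 1))  # ~1.8
```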

ferd berple
April 18, 2013 7:57 am

Tilo Reber says:
April 18, 2013 at 6:29 am
So the alternative is to produce numbers that are just a little bit lower, but not low enough to allow the team to declare you a heretic.
=========
Interesting example, and it has been shown to be true in other historical cases: bad scientific estimates by the powerful and influential are rarely corrected overnight. They are whittled away slowly over the years to save face.
Rather than having to admit they made a huge boo-boo, climate science can instead admit they simply made a number of very small mistakes. Since none of their mistakes were large, they mostly had it right.
So, except for the people that freeze to death in the meantime, everything is peachy. And as dead people rarely complain, who is the wiser?

Gary Pearse
April 18, 2013 7:59 am

“…as contended by climate contrarians, …”
Relative to what? Well at least we have graduated from the ugly D-word to the less derogatory ‘contrarians’. But the joke is on the consensus. I do believe we are heading for a tipping point where the consensus becomes the D-folk. It would not be correct to call their position that of a contrarian then.

jc
April 18, 2013 8:07 am

So this is to be the mechanism. It always was going to be I suppose.
The wild ambit claim. Followed by, as required, a tempering of claims. Retaining the position that these are fundamentally legitimate. But that this alteration of claim demonstrates in itself the reasonableness of proponents. And that while certain more dramatic policy responses can be modified, the core thrust should be maintained. Most particularly, the maintenance of status and money flowing to the developers.
It demonstrates that to provide evidence that discredits “the science” is not enough. The “scientists” themselves must be discredited. By their own actions.

Gary Pearse
April 18, 2013 8:10 am

Steven Mosher says:
April 18, 2013 at 7:56 am
“sensitivity is not a parameter that you SET in running a model. sensitivity is an OUTPUT of the model.”
Hmm… I think it might be a cool idea to SET it as a parameter at 1.5, adjust the other parameters, starting with those with the largest error bars, and see if we can still get closure at TOA. What would be wrong with that? It seems we are constraining sensitivity better and better with observations.

ferd berple
April 18, 2013 8:13 am

Steven Mosher says:
April 18, 2013 at 7:56 am
So, sensitivity is not an input to models. You don’t set a variable named sensitivity to 3.
==========
The variable is called positive water feedback. This is set to 3 and directly controls CO2 sensitivity, similar to the way the volume control on your radio controls sound. So while the model builders can try and pretend they are not building an assumed CO2 sensitivity into their models, they are fooling themselves.
Richard Feynman:
“The first principle is that you must not fool yourself—and you are the easiest person to fool.”
Anyone who has ever brought a microphone next to a speaker knows what happens when you turn up the volume: you get runaway feedback. This is what the climate models are doing. They are feeding the sensitivity outputs back into the inputs, and as they turn up the volume they get runaway feedback. It is a nonsense arising from the way they have built the models, as made clear by the divergence between models and reality.
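For readers who want the arithmetic behind the amplifier analogy: in the linearized feedback algebra used in these debates, each pass through the loop multiplies the previous increment by a feedback fraction f, and the series only “runs away” when f ≥ 1. A toy sketch (all numbers illustrative, not anyone’s model):

```python
# Linear feedback series: total = dT0 * (1 + f + f^2 + ...) = dT0 / (1 - f)
# when f < 1. Amplified-but-bounded (0 < f < 1) is distinct from runaway
# (f >= 1, the microphone-next-to-speaker case, where the series diverges).
def total_response(dT0, f, passes=10_000):
    total, increment = 0.0, dT0
    for _ in range(passes):
        total += increment
        increment *= f      # each pass feeds back f times the previous increment
    return total

print(total_response(1.2, 0.5))  # converges to 1.2 / (1 - 0.5) = 2.4: no runaway
```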

Frank K.
April 18, 2013 8:34 am

One cannot determine the quality of the output from a climate simulation code without first knowing what differential equations are being solved numerically, and the numerical methods that are used to solve them. Could someone write those down for us (including the initial and boundary conditions), so we can examine the consistency and stability of the numerical approximations to the differential equations? And I would assume all models are using the same equations (including parameterizations of sub-scale physics) – otherwise, how can you compare one model with another? Right??
By the way, could someone replot the global “temperature” history in terms of absolute temperatures rather than “anomalies”? That would be interesting to see…

tadchem
April 18, 2013 8:35 am

We test the null hypothesis: the UAH and the RSS data sets are not statistically different from the ensemble of 44 models.
Among the 46 series (44 models plus the 2 observational sets), each observational series has a 1/46 chance of lying at the extremum (in this case the low end). Under that null, the chance that both occupy the two lowest positions is 2/(46×45), about one chance in 1,035. The null hypothesis fails at the 95%, the 99%, and even the 99.9% levels. This is about as unlikely as dealing the two Jokers off the top of a ‘well-shuffled’ deck, even if you take the 2’s and 3’s out of the deck first.
You should be concerned that the deck has been stacked!
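tadchem’s null hypothesis is easy to simulate. Under strict exchangeability, the probability that both observational series fall below all 44 models works out to 2/(46×45) ≈ 1/1,035; a quick Monte Carlo sketch (illustrative, assuming exchangeable ranks) confirms the order of magnitude:

```python
import random

# Simulate the exchangeability null for the 44-model / 2-observation
# comparison: if UAH and RSS were statistically indistinguishable from the
# models, how often would BOTH rank below every model?
random.seed(0)
N_TRIALS = 200_000
hits = 0
for _ in range(N_TRIALS):
    obs_ranks = random.sample(range(46), 2)  # random distinct ranks for the two obs series
    if max(obs_ranks) <= 1:                  # both occupy the two lowest slots of 46
        hits += 1

print(hits / N_TRIALS)  # approx 2/(46*45) ~ 0.00097: rejected far beyond the 99.9% level
```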

ferd berple
April 18, 2013 8:35 am

The assumption of positive feedback in climate models is at odds with what we know about dynamic systems. When we look at things that do have positive feedback, such as hurricanes and wildfires, we see that they have one factor in common: they are very short-lived. They quickly exhaust all available energy and die. The very long history of life on earth tells us that climate cannot have long-lived positive feedback. Thus the IPCC models, which all assume long-term positive feedback, must be nonsense. They are inconsistent with long-term paleo records, as is being demonstrated day by day from observations.

Chris R.
April 18, 2013 8:43 am

To Joeldshore:
You state: “The correct conversion would say that it predicts a surface warming of ~0.7 C per CO2 doubling.”
Well, good, thank you for the correction. Note, however, that this figure is still 1/2 of the LOWEST possible figure used by the IPCC for a CO2 doubling. Since these GCMs are essentially Monte Carlo, I think it’s appropriate to repeat what they say in Vegas: “A LOSER! Thanks for playing!”

Trev
April 18, 2013 8:46 am

If the Earth were at all sensitive to changes in CO2 then we would not be here. It’s because the Earth survives CO2 changes and temperature changes that life evolves.

RACookPE1978
Editor
April 18, 2013 8:47 am

Interesting, isn’t it, that this “study of a series of models studying the influence of CO2 “forcing” on temperature” finds out ….. THAT (according to the models) CO2 AFFECTS TEMPERATURE!
Wow.
What will they pay for next?
/sarchasm – that gaping hole between a socialist and the real world.
Further, a study showed 97% of government-paid ‘scientists’ agree that “government-paid studies to find out that temperature is affected by CO2 find out that temperature is affected by CO2 in direct proportion to the level of new government funding paid out to scientists who are paid to find models that prove temperature is affected by CO2! “

kadaka (KD Knoebel)
April 18, 2013 8:48 am

From Steven Mosher on April 18, 2013 at 7:56 am:

sensitivity is not a parameter that you SET in running a model. sensitivity is an OUTPUT of the model.

Wait a sec…
The models are built with assumptions of how the climate initially responds to CO₂ concentration changes, the feedbacks from those responses, etc.
From this is generated the temperature changes resulting from the CO₂ changes, from which sensitivity is calculated.
Aren’t you just saying the mileage is not a direct input to the vehicle’s electronics, it’s something the manufacturer determines from testing the finished vehicle? Even though the desired mileage was built into the vehicle designs from the start?

John
April 18, 2013 9:07 am

Michael Schlesinger, the last author on the Ring et al paper and probably the most senior of the authors, would likely be called a “warmist” by many readers of WUWT. He’s an insider, unlike, say, Nic Lewis. They both deserve praise for bucking the trend, although pretty soon their latest findings may be seen as the new consensus.

Gary Pearse
April 18, 2013 9:20 am

I think the TRENDS of multi-decadal warming and cooling prior to 1979 can be compared to the recent period of warming and (soon) the extension of a new cooling period. If the slopes of these recent trends are the same as that of the late 19th and 3/4 of the 20th century, then it is essentially natural variability, except for the natural rise out of the LIA. A significant warming signal would be that warming slopes would be steeper and cooling slopes less steep. The only fly in the ointment of this simple definitive quantitative test is the corruption of the temperature records. What is wrong with this, really?

John
April 18, 2013 9:34 am

Judith Curry’s latest blog entry discusses a slew of new studies, including Schlesinger’s (Ring et al). She refers to Schlesinger as a “heavyweight.” First, the link to this entry:
http://judithcurry.com/2013/04/17/meta-uncertainty-in-the-determination-of-climate-sensitivity/#more-11524
And the most relevant three paragraphs:
“Two heavyweight climate scientists have published very different ideas about how much the Earth is going to warm in the coming decades. And neither has much regard for the other’s estimate – casting light on a long-standing, thorny issue in climate science.
Future warming is likely to be on the high end of predictions says Kevin Trenberth of the National Center for Atmospheric Research who has been a lead author for the United Nations Intergovernmental Panel on Climate Change (IPCC).
But Michael Schlesinger, who heads the Climate Research Group within the Department of the Atmospheric Sciences at the University of Illinois, has just published a study with his group finding warming will be at the low end of projections.”

Editor
April 18, 2013 10:12 am

On the Ring et al. paper, their introduction makes clear that the only solar forcing they take into account is the very slight variation in solar irradiance. But strong correlations between solar activity and climate going back many thousands of years indicate that some substantial mechanism of solar amplification must be at work (some mechanism of solar forcing substantially stronger than the variance in solar radiation). Since solar activity was high over the 20th century, that would substantially increase the forcing over the 20th century, which would further reduce the implied sensitivity, so even their reduced estimates are still way too high. On theoretical grounds (the thermostat hypotheses of Lindzen, Spencer and Eschenbach) there is solid reason to think that net climate feedbacks are negative (sensitivity < 1).

DCA
April 18, 2013 10:14 am

I don’t remember but I read this yesterday about the use of “uniform priors”. http://bishophill.squarespace.com/blog/2013/1/25/uniform-priors-and-the-ipcc.htm
It’s about some blog comments at RC. http://www.realclimate.org/index.php/archives/2013/01/on-sensitivity-part-i/comment-page-2/#comments
Does Masters use “uniform priors” ?
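The question matters because the prior choice visibly changes the answer. A minimal sketch (all numbers are illustrative assumptions, not from Masters’ paper): suppose the data constrain the feedback parameter lambda with a Gaussian, and sensitivity is S = F_2x/lambda. A prior uniform in S fattens the high-S tail relative to a prior uniform in lambda:

```python
import math

# Why the "uniform prior" choice matters (illustrative numbers only).
# Assume the data constrain lambda ~ N(2.0, 0.5) in W/m^2/K, and that
# sensitivity is S = F_2X / lambda.
F_2X = 3.7
LAM_MEAN, LAM_SD = 2.0, 0.5

def likelihood(lam):
    return math.exp(-0.5 * ((lam - LAM_MEAN) / LAM_SD) ** 2)

s_grid = [0.5 + 0.01 * i for i in range(1, 1001)]   # S from 0.51 to 10.5 K

def posterior(prior):
    w = [likelihood(F_2X / s) * prior(s) for s in s_grid]
    total = sum(w)
    return [wi / total for wi in w]

# Uniform in S: prior = constant. Uniform in lambda: after changing
# variables to S, the Jacobian |d lambda / dS| = F_2X / S^2 appears.
post_uniform_S = posterior(lambda s: 1.0)
post_uniform_lam = posterior(lambda s: F_2X / s ** 2)

def prob_above(post, threshold):
    return sum(p for s, p in zip(s_grid, post) if s > threshold)

print("P(S > 4 K), uniform-in-S prior:     ", round(prob_above(post_uniform_S, 4), 3))
print("P(S > 4 K), uniform-in-lambda prior:", round(prob_above(post_uniform_lam, 4), 3))
```

The Jacobian factor is the crux of the uniform-priors debate in the links above: the same likelihood yields a noticeably fatter high-sensitivity tail under the uniform-in-S prior.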

joeldshore
April 18, 2013 10:46 am

Chris R says:

Well, good, thank you for the correction. Note, however, that this figure is still 1/2 of the LOWEST possible figure used by the IPCC for a CO2 doubling. Since these GCMs are essentially Monte Carlo, I think it’s appropriate to repeat what they say in Vegas: “A LOSER! Thanks for playing!”

I recommend you read the next sentence of my comment http://wattsupwiththat.com/2013/04/18/another-paper-finds-lower-climate-sensitivity/#comment-1278857 where I explained why no one who is a skeptic in the true sense of the word would believe their result, which is based on an untested method.

Richard Howes
April 18, 2013 11:50 am

No, really, it’s worse than we thought. /sarc

Matthew R Marler
April 18, 2013 1:36 pm

If the global atmospheric CO2 concentration continues to rise and the global mean temperature continues to (approximately) flatline, then the estimates (and credible intervals) of “climate sensitivity to CO2” will continue to decline. Every decision humans might make depends on future data.

Chris R.
April 18, 2013 2:53 pm

To joeldshore:
And I recommend that you grow a sense of humor.

richardscourtney
April 18, 2013 3:33 pm

Chris R:
A joke is laughed at by those who have a sense of humor. It cannot have a sense of humor.
Richard

Chris R.
April 18, 2013 5:35 pm

To richardscourtney, also joeldshore:
Richard: Thank you. I should remember that there are those who can be ribbed, and those who cannot. It seems that joeldshore is among those who cannot.
Joel: I should have made clear to you that I did, in fact, read your comment in its entirety. I also read the reviewer comment you linked. “Anonymous reviewer number 2” never said that Bjornbom “…made no attempt to validate” his method. The reviewer said that the discussion of the interpretation of results was unclear. Not the same thing. Beyond that, when the reviewer says, in his first paragraph, that this result is “…politically controversial…” I automatically begin applying a 50% discount to what follows.
I suppose I shouldn’t have gone for a cheap laugh, but the fact that even with your correction applied, the result was still 50% smaller than the lowest IPCC estimate seemed to me to be quite amusing.

April 18, 2013 6:05 pm

Here, I will help you out for free! … the real climate “sensitivity” to CO2 is exactly, and precisely, ZERO degrees C or F. Nada, nothing, zilch! There is no “sensitivity”. That is complete garbage!

John Robertson
April 18, 2013 6:11 pm

Paywalled? He does state:
” It is pay-walled, but please contact me if you need a copy and do not have University access. Anyhow, a zip that includes all my code and data is available here.”
So it sounds like he is willing to share it with people that take the trouble to contact him directly. His data is freely available too.
I don’t have the time to review it, however I don’t feel the comment “Unfortunately, Springerlink wants $39.95 for the privilege of reading it, so all I can do is to provide the abstract.” is entirely appropriate – perhaps he has perfectly legitimate reasons for paywalling it that he didn’t mention.
Appreciate reading about this article, wish I had more time to follow along…

joeldshore
April 18, 2013 6:17 pm

Chris R. says:

“Anonymous reviewer number 2″ never said that Bjornbom “…made no attempt to validate” his method.

From the reviewer’s comments:

The method devised by Gregory et al 2004 was created to enable modellers to estimate this parameter from short transient experiments, and the method was validated by comparing results with equilibrium studies using the same model. However, it has not been shown that estimating the climate sensitivity from shorter term predominately regional oscillations such as ENSO gives any insight to the value of long global sensitivity to increases in greenhouse gases in models, let alone the real world (see for example, Dessler, 2013).

I said: “..the author of the paper has made no attempt to validate his method of computing the feedback parameter. Hence, this is a result using a method that is completely untested. You just have to take it on faith that this is a valid method for assessing the feedback parameter and climate sensitivity.” How is this an unreasonable characterization of the information that the reviewer provided?

I suppose I shouldn’t have gone for a cheap laugh, but the fact that even with your correction applied, the result was still 50% smaller than the lowest IPCC estimate seemed to me to be quite amusing.

What was more amusing was the fact that somebody (Douglas Hoyt) who presumably characterizes himself as a skeptic uncritically accepted a HockeySchtick’s gross (factor of 4) mischaracterization of the paper’s result…and also uncritically accepted the paper’s result itself despite the fact that the method has not been validated as giving “any insight to the value of long global sensitivity to increases in greenhouse gases in models, let alone the real world”.

joeldshore
April 18, 2013 7:26 pm

ferd berple says:

The assumption of positive feedback in climate models is at odds with what we know about dynamic systems. When we look at things that do have positive feedback, such as hurricanes and wild-fires, we see that they have one factor in common. They are very short lived. They quickly exhaust all available energy and die. The very long history of life on earth tells us that climate cannot have long lived positive feedback. Thus, the IPCC models, which all assume long term positive feedback must be a nonsense.

Your entire comment is based on a misunderstanding of what “positive feedback” means in the context of these climate discussions. You might want to read up on what Troy Masters (a skeptic, or at least a lukewarmer) has to say about it here: http://troyca.wordpress.com/radiation-budget-and-climate-sensitivity/ in particular, note this part:

The overall value of λ must be positive, but given the estimated Planck response of 3.3 W/m^2/K, the common usage of “positive” or “negative” feedbacks refers to the deviation of λ from this value (somewhat counter-intuitively, a “positive” feedback actually refers to the situation where the overall λ is *less* than the Planck response, according to the set-up).

In other words, when climate scientists say that the net feedback is positive, they mean the net feedback not including the Planck response.
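To connect this sign convention to the numbers in the head post’s abstract: a radiative restoration strength lambda converts to an equilibrium sensitivity via ECS ≈ F_2x/lambda. This is the naive conversion, ignoring the paper’s adjustment for longer-timescale changes in lambda, and F_2x = 3.7 W/m² is a conventional value assumed here, not taken from the paper:

```python
# Converting radiative restoration strength (lambda, W/m^2/K) into an
# equilibrium sensitivity via ECS = F_2X / lambda. The lambda values are
# the ones quoted in the head post's abstract; F_2X is the conventional
# doubled-CO2 forcing (an assumption here).
F_2X = 3.7
PLANCK = 3.3   # no-feedback (Planck-only) restoration strength, W/m^2/K

for label, lam in [("CMIP5 mean", 1.16),
                   ("Observations (Masters)", 2.05),
                   ("Planck response only", PLANCK)]:
    sign = "net-positive feedback" if lam < PLANCK else "no net-positive feedback"
    print(f"{label}: lambda = {lam} W/m^2/K -> ECS ~ {F_2X / lam:.2f} K ({sign})")
```

Note that in this convention the observational lambda of 2.05 still counts as net-positive feedback (it is below the 3.3 Planck value) even though the implied sensitivity is low, which is exactly the distinction joeldshore is drawing.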

Janice Moore
April 18, 2013 8:56 pm

@ jnpics, re: 4/18/13, 7:19 AM
When I saw that your fervently sincere post (which started out fine but ended up sounding like a hysterical piglet) got all those “thumbs down!”, my heart went out to you. Every time I’ve tried to show an Envirocult member the truth, I end up nearly running down the road hollering my head off in much the same way. I think the following edited version captures the essence of your comment:
Man made Global warming is a hoax. I am totally frustrated with this. It is a political problem. Thanks is due to this site for proving the “climate models” false along with everything else the IPCC has said. [jnpics abridged]
Try re-posting that. No thumbs up, perhaps (and who gives a rat’s micrometer anyway!), but, at least you can redeem your reputation for rationality. Keep on posting!

richardscourtney
April 19, 2013 5:37 am

Janice Moore:
In your fine post at April 18, 2013 at 8:56 pm kindly addressed to jnpics you say
Try re-posting that. No thumbs up, perhaps (and who gives a rat’s micrometer anyway!),
I am at a loss to understand why these ‘thumbs’ have been added to WUWT.
They slow down and corrupt links to posts, they discourage debate, they prejudice opinions of posts by readers who see a ‘log’ of recorded opinions, and as you point out they can discourage new and inexperienced contributors from making future posts. Indeed, a person thinking of making a post for the first time may be dissuaded by seeing the ‘thumbs’ obtained by jnpics.
Simply, the ‘thumbs’ seem to act against the nature of WUWT which has made it the Best Science Blog on the web.
Trolls attempting to disrupt threads need to be dissuaded. New posters need to be encouraged and not dissuaded because with experience they may become very useful contributors.
As your kind post to jnpics implies, the ‘thumbs’ may have inhibited debate on this thread, and I add that I suspect this is probably not the only thread to have been similarly affected.
Richard

April 19, 2013 8:01 am

If the climate models were truly concerned with calculating CO2 sensitivity, then CO2 sensitivity would be one of the outputs of the models. The models would use observed temperature as an input and spit out climate sensitivity as an output.
Well said! Or at least, they would optimize hindcast/training/trial set performance in a predictive model on its value or — oops — do a rather difficult multivariate performance optimization on an entire vector of presumed inputs, along with a meta-sensitivity analysis of the statistical sort that integrates covariant regimes with roughly equal predictivity that determine (sadly broad, multidimensional ellipsoidal at best) ranges for those parameters all of which yields roughly equivalent hindcast/trial set performance once the model is built using a training set.
One seriously wonders if any of the climate modelers really understand predictive modeling theory in the abstract at all. I do this sort of thing for money — you don’t get the opportunity to fail to handle input degeneracy, input covariance, overtraining, and so on twice when people pay you for it and you screw up. This is actually a stronger criterion than “peer review”, where all too often the only money on the table comes from not pissing off the powers that regulate the flow of grant money or (biased) reviewers who have their own dog in the race.
The point being that one ultimately has to concentrate on models that are robustly predictive on trial data after being built on training data (at the expense of model detail as needed) and even then, forward prediction is fraught with peril in a chaotic system. Spencer’s lovely diagram above is a perfectly wonderful example of why.
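The training-set/trial-set discipline described above can be sketched minimally (synthetic data, nothing climate-specific): a model that “hindcasts” the training period perfectly by memorizing it does far worse on a held-out trial period than an honest simple fit.

```python
import math
import random

# Toy demonstration of training/trial validation. True process: linear
# trend plus noise. The "memorizer" has zero hindcast error but flatlines
# at the last training value out of sample.
random.seed(1)
xs = [i / 10 for i in range(50)]
ys = [2.0 * x + random.gauss(0, 0.5) for x in xs]
train_x, train_y = xs[:40], ys[:40]   # "hindcast" (training) period
trial_x, trial_y = xs[40:], ys[40:]   # held-out "forecast" (trial) period

# Ordinary least-squares line fitted to the training period only.
n = len(train_x)
mx, my = sum(train_x) / n, sum(train_y) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(train_x, train_y))
         / sum((x - mx) ** 2 for x in train_x))
intercept = my - slope * mx

def rmse(pred, actual):
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual))

linear_trial = [slope * x + intercept for x in trial_x]
memorizer_trial = [train_y[-1]] * len(trial_y)  # perfect in-sample, flat out-of-sample

print("linear fit, trial RMSE:", round(rmse(linear_trial, trial_y), 2))
print("memorizer,  trial RMSE:", round(rmse(memorizer_trial, trial_y), 2))
```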
Since everybody on this list loves numerology, here’s an interesting bit of it. Plot the estimates of climate sensitivity over time. Hmm, a monotonic decreasing function (with an artificially tightening range, since literally nobody dares to publish a result where the error estimate includes zero — they’d have hockey team career-assassins on their trail overnight, so the ranges are carefully asymmetric and skewed on the warm side instead of being nice and Gaussian above, to which I can only reply “hah, bah!”).
Is it converging low? No sign of it yet. How can there be? Nobody has the guts to drop it to where the actual data record in Spencer’s graph above is at the centroid of the model — they persist in building models where the bottom of the range can reach it instead, assuming utterly without justification that the current climate isn’t “what it should be” but is lowball noise. Maximum entropy? Maximum likelihood? Bayes with uniform priors optimized to the actual data? Ability to hindcast any trial set outside of some carefully limited range that (for example) carefully excludes the LIA and/or MWP or the rest of the Holocene? Hell no. Just build a model with (in the end) a simple one-parameter dominant behavior that works only for trial sets selected from the last century where the data was largely monotonic (regardless of how many parameters contributed to it) and tune the parameters so that they BARELY contain the current observations while preserving the illusion of understanding and one’s grant stream.
Hah. Bah.
I can’t really complain, though. More and more climate scientists are having the courage to break ranks, and eventually we will have enough, good enough data and honest enough analysis that we will start to actually work out some ROBUST ranges for things, and perhaps in a few decades have a vastly improved idea of how the climate actually works. Perhaps after the climate has worked through a few more solar cycles and the NAO finally changes phase, perhaps after ENSO finishes surprising us.
One day perhaps I’ll have the time and energy to do a top post on chaotic oscillation and the danger of single-parameter numerology. You’d think climate scientists had never heard of a Poincaré cycle or a Van der Pol nonlinear oscillator:
http://en.wikipedia.org/wiki/Van_der_Pol_oscillator
or any of the rest of this. The Van der Pol oscillator is simple compared to the climate. For one thing, the climate’s driver is hardly a clean sinusoid; it is itself a chaotic, noisy function. For another, its nonlinear feedback terms have multiple signs. For a third, the fact that global temperature does follow Hurst-Kolmogorov patterns of stability followed by rapid jumps suggests that the global climate cycle is best described by a phase space of multiple attractors with many locally stable orbits, where the system jumps between attractors on a decadal timescale, with century-scale factors continuously changing the local phase-space potential surfaces upon which the state orbits.
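For the curious, even the “simple” Van der Pol system is easy to integrate numerically and already shows self-sustained, decidedly non-sinusoidal oscillation. A rough explicit-Euler sketch (mu and the step size are arbitrary illustrative choices):

```python
# Crude explicit-Euler integration of the Van der Pol oscillator,
# x'' - mu*(1 - x^2)*x' + x = 0. Starting well off the limit cycle, the
# state is drawn onto a self-sustained oscillation of amplitude ~2.
MU, DT = 2.0, 0.001
x, v = 0.5, 0.0
trajectory = []
for _ in range(60_000):                 # 60 time units at DT = 0.001
    a = MU * (1 - x * x) * v - x        # acceleration from the VdP equation
    x, v = x + DT * v, v + DT * a
    trajectory.append(x)

# After the transient, the peak displacement sits close to 2 regardless of
# the (nonzero) starting point -- the classic VdP limit-cycle signature.
print(max(trajectory[30_000:]))
```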
rgb

Janice Moore
April 19, 2013 11:11 am

Dear Mr. Courtney,
I gave your comment complimenting mine a thumbs up. Heh, heh. Thank you for your affirmation and, yes, I agree. While the thumbs up is a nice way to say “atta boy” or “atta girl,” the collateral damage from the thumbs down, with its implication that not only should this post disappear but you should, too, is not worth it. I’m not too sure about jnpics… I think she or he gave me a thumbs down. (:|}
Other posters have unfavorably compared the new rating system to Facebook. I think similarly. The main reason I have not joined Facebook is for that nauseating faux popularity element. Oh, certainly, if one is “big” enough, one will overlook and not “give a rat’s squeek” about it, but, we are, all of us, “pervious, through a chink or two”. Indeed, WUWT, where respect and focus on facts prevails, is “the Best Science Blog on the web.”

April 19, 2013 12:07 pm

As I’ve pointed out many times in the past, there is no such thing as an equilibrium climate sensitivity, for it is defined in terms of an equilibrium temperature, which is not an observable.