Yet another study shows lower climate sensitivity

Global warming less extreme than feared?

Policymakers are attempting to contain global warming at less than 2°C. New estimates from a Norwegian project on climate calculations indicate this target may be more attainable than many experts have feared.

The researchers have arrived at an estimate of 1.9°C as the most likely level of warming. (Photo: Shutterstock)

Internationally renowned climate researcher Caroline Leck of Stockholm University has evaluated the Norwegian project and is enthusiastic.

“These results are truly sensational,” says Dr Leck. “If confirmed by other studies, this could have far-reaching impacts on efforts to achieve the political targets for climate.”

Temperature rise is levelling off

After Earth’s mean surface temperature climbed sharply through the 1990s, the increase has levelled off nearly completely at its 2000 level. Ocean warming also appears to have stabilised somewhat, despite the fact that CO2 emissions and other anthropogenic factors thought to contribute to global warming are still on the rise.

It is the focus on this post-2000 trend that sets the Norwegian researchers’ calculations on global warming apart. 

Sensitive to greenhouse gases

Climate sensitivity is a measure of how much the global mean temperature is expected to rise if we continue increasing our emissions of greenhouse gases into the atmosphere.

CO2 is the primary greenhouse gas emitted by human activity. A simple way to express climate sensitivity is to calculate how much the mean air temperature would rise if the atmospheric CO2 level were doubled compared with the world’s pre-industrial level around the year 1750.

If we continue to emit greenhouse gases at our current rate, we risk doubling that atmospheric CO2 level by roughly 2050.
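As a back-of-the-envelope illustration of how a sensitivity figure maps onto a CO2 level, the widely used logarithmic forcing approximation ΔF ≈ 5.35·ln(C/C0) W/m² can be sketched in a few lines (the 1.9°C fed in below is simply the article’s headline number used as an input, not a reproduction of the project’s calculation):

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing in W/m^2 from a CO2 rise, using the
    common logarithmic approximation dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(c_ppm, sensitivity_per_doubling, c0_ppm=280.0):
    """Equilibrium warming, assuming temperature scales linearly with
    forcing; one doubling corresponds to 5.35 * ln(2) W/m^2."""
    return sensitivity_per_doubling * co2_forcing(c_ppm, c0_ppm) / (5.35 * math.log(2))

# A full doubling (280 -> 560 ppm) returns the sensitivity itself:
print(equilibrium_warming(560, 1.9))  # 1.9
```

On this linear-in-forcing reading, a sensitivity of 1.9°C would leave the 2°C target attainable even at a full doubling, which is the article’s point.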

Mutual influences

A number of factors shape how the climate develops. The complexity of the climate system is further compounded by feedback mechanisms, i.e. the ways in which factors such as clouds, evaporation, snow and ice mutually affect one another.

Uncertainties about the overall results of feedback mechanisms make it very difficult to predict just how much of the rise in Earth’s mean surface temperature is due to manmade emissions. According to the Intergovernmental Panel on Climate Change (IPCC) the climate sensitivity to doubled atmospheric CO2 levels is probably between 2°C and 4.5°C, with the most probable being 3°C of warming.

In the Norwegian project, however, researchers have arrived at an estimate of 1.9°C as the most likely level of warming.

Manmade climate forcing

“In our project we have worked on finding out the overall effect of all known feedback mechanisms,” says project manager Terje Berntsen, who is a professor at the University of Oslo’s Department of Geosciences and a senior research fellow at the Center for International Climate and Environmental Research – Oslo (CICERO). The project has received funding from the Research Council of Norway’s Large-scale Programme on Climate Change and its Impacts in Norway (NORKLIMA).

“We used a method that enables us to view the entire earth as one giant ‘laboratory’ where humankind has been conducting a collective experiment through our emissions of greenhouse gases and particulates, deforestation, and other activities that affect climate.”

For their analysis, Professor Berntsen and his colleagues entered all the factors contributing to human-induced climate forcings since 1750 into their model. In addition, they entered fluctuations in climate caused by natural factors such as volcanic eruptions and solar activity. They also entered measurements of temperatures taken in the air, on ground, and in the oceans.

The researchers used a single climate model that repeated calculations millions of times in order to form a basis for statistical analysis. Highly advanced calculations based on Bayesian statistics were carried out by statisticians at the Norwegian Computing Center.
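The flavour of such a Bayesian constraint can be conveyed with a toy one-parameter example: put a prior on the sensitivity S, push each candidate S through a forward model, and weight it by how well it matches an observation. Everything below (the flat prior, the invented forward model, all numbers) is illustrative only and is not the Norwegian project’s actual model:

```python
import math

def grid_posterior(grid, observed, forward, sigma):
    """Flat prior times Gaussian likelihood, normalised over a parameter grid."""
    like = [math.exp(-0.5 * ((observed - forward(s)) / sigma) ** 2) for s in grid]
    z = sum(like)
    return [l / z for l in like]

# Candidate sensitivities from 0.5 to 6.0 degC per doubling:
grid = [i / 100 for i in range(50, 601)]
# Invented forward model: realised warming is half the equilibrium sensitivity.
forward = lambda s: 0.5 * s
post = grid_posterior(grid, observed=0.9, forward=forward, sigma=0.3)
s_map = grid[post.index(max(post))]  # posterior mode
print(s_map)  # 1.8
```

Feeding in a different “observed” decade shifts the whole posterior, which is essentially what happened when the 2000–2010 data were added.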

2000 figures make the difference

When the researchers at CICERO and the Norwegian Computing Center applied their model and statistics to analyse temperature readings from the air and ocean for the period ending in 2000, they found that climate sensitivity to a doubling of atmospheric CO2 concentration will most likely be 3.7°C, which is somewhat higher than the IPCC prognosis.

But the researchers were surprised when they entered temperatures and other data from the decade 2000-2010 into the model; climate sensitivity was greatly reduced to a “mere” 1.9°C.

Professor Berntsen says this temperature increase will only be fully realised after we reach the doubled CO2 concentration (compared to 1750) and maintain it for an extended period, because the oceans delay the effect by several decades.

“We used a method that enables us to view the entire earth as one giant ‘laboratory’ where humankind has been conducting a collective experiment through our emissions of greenhouse gases and particulates, deforestation, and other activities that affect climate,” explains Professor Terje Berntsen of UiO. (Photo: UiB)

Natural changes also a major factor

The figure of 1.9°C as a prediction of global warming from a doubling of atmospheric CO2 concentration is an average. When researchers instead calculate a probability interval of what will occur, including observations and data up to 2010, they determine with 90% probability that global warming from a doubling of CO2 concentration would lie between 1.2°C and 2.9°C.

This maximum of 2.9°C global warming is substantially lower than many previous calculations have estimated. Thus, when the researchers factor in the observations of temperature trends from 2000 to 2010, they significantly reduce the probability of our experiencing the most dramatic climate change forecast up to now.

Professor Berntsen explains the changed predictions:

“The Earth’s mean temperature rose sharply during the 1990s. This may have caused us to overestimate climate sensitivity.

“We are most likely witnessing natural fluctuations in the climate system – changes that can occur over several decades – and which are coming on top of a long-term warming. The natural changes resulted in a rapid global temperature rise in the 1990s, whereas the natural variations between 2000 and 2010 may have resulted in the levelling off we are observing now.”

Climate issues must be dealt with

Terje Berntsen emphasises that his project’s findings must not be construed as an excuse for complacency in addressing human-induced global warming. The results do indicate, however, that it may be more within our reach to achieve global climate targets than previously thought.

Regardless, the fight cannot be won without implementing substantial climate measures within the next few years.

Sulphate particulates

The project’s researchers may have shed new light on another factor: the effects of sulphur-containing atmospheric particulates.

Burning coal is the main way that humans continue to add to the vast amounts of tiny sulphate particulates in the atmosphere. These particulates can act as condensation nuclei for cloud formation, cooling the climate indirectly by causing more cloud cover, scientists believe. According to this reasoning, if Europe, the US and potentially China reduce their particulate emissions in the coming years as planned, it should actually contribute to more global warming.

But the findings of the Norwegian project indicate that particulate emissions probably have less of an impact on climate through indirect cooling effects than previously thought.

So the good news is that even if we do manage to cut emissions of sulphate particulates in the coming years, global warming will probably be less extreme than feared.

About the project
Geophysicists at the research institute CICERO collaborated with statisticians at the Norwegian Computing Center on a novel approach to global climate calculations in the project “Constraining total feedback in the climate system by observations and models”. The project received funding from the Research Council of Norway’s NORKLIMA programme.

The researchers succeeded in reducing uncertainty around the climatic effects of feedback mechanisms, and their findings indicate a lowered estimate of probable global temperature increase as a result of human-induced emissions of greenhouse gases.

The project researchers were able to carry out their calculations thanks to the free use of the high-performance computing facility in Oslo under the Norwegian Metacenter for Computational Science (Notur). The research project is a prime example of how collaboration across subject fields can generate surprising new findings.
Written by:
Bård Amundsen/Else Lie. Translation: Darren McKellep/Carol B. Eckmann
h/t to Andrew Montford via Leo Hickman
187 Comments
Matthew R Marler
January 25, 2013 12:49 pm

Steven Mosher: And sensitivity to 2x c02 cannot be zero. If it were then changes in solar input would be zero.
Sensitivity to either a future doubling of CO2 or future increases in solar input could be 0. There is no physical reason why sensitivity has to be independent of the current state. Besides that, only a very simple model posits that the only effect of adding CO2 is to increase downwelling long-wave IR and that downwelling long-wave IR is perfectly equivalent to full-spectrum light.

January 25, 2013 1:04 pm

Henry
Yes, more WP also causes a warming effect as the spectra from WP and clouds are very similar; usually the presence of clouds is also accompanied by higher humidity.

January 25, 2013 1:08 pm

Sorry I am getting a bit blind. WP =WV= water vapor gas.

michael hammer
January 25, 2013 1:10 pm

Hmm, so what they are saying is: we used known results to adjust our model, and then it hindcast those results correctly. Using the model with those parameters, it forecast 3.7C per doubling. Checking that forecast over the next 10 years showed that it was massively in error. Adding the actual data over those 10 years to readjust the model led to a new prediction of 1.9C, which again correctly hindcasts the known results.
As a result they claim the model must be right this time, so please believe our new forecasts. Models are verified by making correct predictions. Here the first prediction was massively off. If in 10 years the new prediction still holds up, it may be grounds to have some confidence in the model, but until then their model has produced 100% wrong predictions and is thus utterly useless as a predictor. Until they can demonstrate correct forward predictions of results not known at the time of the prediction, they have nothing of value. So far all they have done is elegantly prove their model was wrong.

Manfred
January 25, 2013 1:33 pm

The variation in their results may be due to poor/non-modeling of ENSO/PDO/AMO.
It is not clear from the text whether they already included the new AR5 reduced aerosol cooling, which would further decrease sensitivity by 25%.
They certainly did not consider the increased black carbon forcing, which would further decrease sensitivity by almost 20%.

R Barker
January 25, 2013 1:51 pm

Well whatever it turns out to be, how many of you folks up north think maybe a little bit more global warming might not be such a bad thing right now? ;<)

Jim Clarke
January 25, 2013 1:58 pm

“These results are truly sensational,”
I see what is going on here. The climate science community is going to very gradually reveal precisely what the skeptics have been saying for 20 years. Each micro step will be heralded as a brilliant and sensational bit of scientific discovery, qualifying these scientist for prestige and additional grant money. In the end, they will discover that CO2 is not a pollutant and has little impact on global temperatures. They will take credit for this ‘discovery’ and compliment themselves for saving the world from unnecessary regulations. If any warmist dies, they will let the dead one take the fall for the misinformation and proclaim that they were always skeptical of the global warming scare. They will give each other awards, honorary chairs at universities and leading positions at National Academies.
Watch!

Jim Clarke
January 25, 2013 2:04 pm

It shows how far the science has fallen when it is considered a revolutionary idea to give real world data weight in a climate model. It violates the climate change maxim of never letting facts get in the way of the theory. Truly a brave move for these researchers. /sarc off

January 25, 2013 2:12 pm

The researchers used a single climate model that repeated calculations millions of times in order to form a basis for statistical analysis. Highly advanced calculations based on Bayesian statistics were carried out by statisticians at the Norwegian Computing Center.
All digital computers are (barring faults) completely deterministic. If you repeated the same calculations millions of times, you would get the identical result millions of times. The only way to get different results is to change the input values or include pseudo-random functions.
And as Gavin Schmidt himself has said, the outputs of climate models aren’t samples of populations, and statistical methods do not apply.
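A minimal sketch of this point: a deterministic program gives different results across runs only when its inputs, including any pseudo-random seed, change:

```python
import random

def monte_carlo_mean(n, seed):
    """Mean of n pseudo-random draws; the same seed always reproduces the same value."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n)) / n

a = monte_carlo_mean(10_000, seed=42)
b = monte_carlo_mean(10_000, seed=42)
c = monte_carlo_mean(10_000, seed=43)
print(a == b)  # True: identical inputs, identical output
print(a == c)  # a different seed changes the pseudo-random stream
```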

Carsten Arnholm
January 25, 2013 2:13 pm

the1pag says:
January 25, 2013 at 9:03 am
It would seem that Norway, being at fairly high latitudes, should be more concerned about the prospect of oncoming global cooling as indicated by the odd current behavior of sunspot cycle 24.

That is indeed true, and I have been trying to inform my fellow Norwegians about this fact, but politicians are totally immune. We have several political parties over here, but all support the idea of CAGW. There is truly no-one to turn to.

Of course CO2 assists the sun in warming us in a minor manner, so in 30 or 40 years will Norwegians be welcoming all the CO2 that can be economically produced?

CO2 has no measurable effect on climate. The only measurable effect of CO2 in this country is Prime Minister Jens Stoltenberg’s new year speech in 2007, declaring the Mongstad CCS (“Carbon Capture and Storage”) project would be our “moon landing”, apparently comparable only to the Apollo project. However, billions later, we will find that the landing wasn’t soft, and yes Houston, we have a problem over here.

lgl
January 25, 2013 2:15 pm

Willis,
Your 30°C could be relevant if most of the global ocean was at 30°C but it isn’t, and in the ex-tropics cloud cover decreases as temperature rises during summer.

bones
January 25, 2013 2:21 pm

RMB says:
January 25, 2013 at 7:42 am
“You can not heat water from above. surface tension blocks the heat very emphatically and very convincingly. Thats why there is no climate sensitivity.”
Baloney. This is the kind of hopelessly wrong stuff that the CAGW sympathisers pray that they will find here.

DirkH
January 25, 2013 2:26 pm

Steven Mosher says:
January 25, 2013 at 9:39 am
“they determine with 90% probability that global warming from a doubling of CO2 concentration would lie between 1.2°C and 2.9°C.
Luke warmers.”
That’s why I said the rent seekers are queuing at the exit. No Norwegian scientist or bureaucrat, excuse me, I wanted to say policymaker, is a skeptic for all I know; they are a terribly, terribly guilt-ridden people because they earn all their money with gas and oil exports; and voluntarily try to drive electric noddycars in their glaciated country; to not deplete the batteries they install oil heaters in them. And they live under the superstition that their exports harm the planet. When they in fact help to return valuable carbon into the biosphere.

Carsten Arnholm
January 25, 2013 2:30 pm

Bruce Cobb says:
January 25, 2013 at 10:25 am
What’s that sound? Oh, that’s just the sound of the CAGW goalposts being moved. Again. Same game though. Do they really think that anyone is fooled by this?

It is aimed at the Norwegian politicians. After all, CICERO was founded in 1990 by former prime minister Gro Harlem Brundtland (now mostly busy with Agenda 21). Originally, CICERO’s formal mission statement in 1990 was to «fremskaffe kunnskap som kan bidra til å løse det menneskeskapte klimaproblemet» (“acquire knowledge that can help solve the man-made climate problem”). The assertion of an existing problem, and of who was supposedly to blame, was built into this organisation from the very beginning. After criticism, the mission statement was changed to “…med sikte på å framskaffe kunnskap som kan bidra til å redusere klimaproblemet…” (“…with a view to acquiring knowledge that can help reduce the climate problem…”).
CICERO is a tool. And yes, all our politicians will fall for this.

Carsten Arnholm
January 25, 2013 2:35 pm

Btw, a reference with information about the CICERO mission statement, and further links to sources, can be found in the following Norwegian forum: http://klimaforskning.com/forum/index.php/topic,1131.msg21886.html#msg21886

Doug Huffman
January 25, 2013 3:07 pm

On Bayesian statistics/inference/epistemology, I believe them completely validated by physicist Edwin Thompson Jaynes in his masterwork, Probability Theory: The Logic of Science (Cambridge University Press, 2003, ISBN 0-521-59271-2). We can watch his ‘Converging and Diverging Views’ (Eqn. 5.32) at work right here on WUWT.

anticlimactic
January 25, 2013 3:30 pm

I see no reason why the exact effect of CO2 doubling could not be measured directly in a lab experiment. It is only sunlight and air so is not a complicated experiment. We can measure temperature changes to a millionth of a degree so any effect could be measured.
I doubt if this experiment will be performed as I suspect the answer would be ‘zero’.

Ian W
January 25, 2013 3:33 pm

Willis Eschenbach says:
January 25, 2013 at 11:12 am
RMB says:
January 25, 2013 at 7:42 am
You can not heat water from above. surface tension blocks the heat very emphatically and very convincingly. Thats why there is no climate sensitivity.
So the ocean is heated from below? Funny how no one noticed that all these years …
w.

You cannot heat water with long-wave infrared radiation. It will only penetrate a few microns and at best cause the liberation (evaporation) of surface molecules – cooling the surface.
You can heat water with shorter wave radiation and UV which will penetrate a long way into the ocean. It is these shorter wavelengths that heat the ocean and of course it is those that get reflected by cloud albedo.

January 25, 2013 3:49 pm

Doug Huffman on January 25, 2013 at 3:07 pm
“On Bayesian statistics/inference/epistemology, I believe them completely validated by physicist Edwin Thompson Jaynes in his masterwork, Probability Theory: The Logic of Science (Cambridge University Press, 2003, ISBN 0-521-59271-2). We can watch his ‘Converging and Diverging Views’ (Eqn. 5.32) at work right here on WUWT.”

– – – – – –
Doug Huffman,
You are the fourth or fifth person here at WUWT in the last two months to highly recommend physicist Edwin Thompson Jaynes’ book ‘Probability Theory: The Logic of Science’.
I am going to locate a copy and have a go at it.
John

Gail Combs
January 25, 2013 3:50 pm

RMB says: January 25, 2013 at 7:42 am
“You can not heat water from above. surface tension blocks the heat very emphatically and very convincingly. Thats why there is no climate sensitivity.”
>>>>>>>>>>>>>>>>
bones says: January 25, 2013 at 2:21 pm
Baloney. This is the kind of hopelessly wrong stuff that the CAGW sympathisers pray that they will find here.
>>>>>>>>>>>>>>>
When RMB says ‘heat’ he is talking about infrared radiation and therefore he is correct.
Wavelengths of sunlight penetrating the ocean: graph from Colorado University
Graph of solar radiation and terrestrial radiation
Graph of solar wavelengths at ocean depths

The solar radiation ‘envelope’ penetrates the ocean to 100 metres at visible wavelengths but to much shallower depths as the wavelength increases. Back radiation in the far infra-red from the Greenhouse Effect occurs at wavelengths centred around 10 micrometres, well off the scale of this chart, and cannot penetrate the ocean beyond the surface ‘skin’.
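As an aside, penetration depths like these follow from Beer–Lambert attenuation, I(z) = I0·exp(-a·z), whose 1/e (“penetration”) depth is simply 1/a. The absorption coefficients below are order-of-magnitude illustrations only, not measured values:

```python
def penetration_depth_m(absorption_per_m):
    """Depth at which intensity falls to 1/e of its surface value,
    under Beer-Lambert attenuation I(z) = I0 * exp(-a * z)."""
    return 1.0 / absorption_per_m

# Illustrative absorption coefficients for water (order of magnitude only):
a_blue_visible = 0.02    # 1/m: visible blue light reaches tens of metres
a_thermal_ir = 1.0e5     # 1/m: thermal IR is absorbed within ~10 micrometres
print(penetration_depth_m(a_blue_visible), penetration_depth_m(a_thermal_ir))
```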

A close-up of the IR field is seen in the following graph.

(Google translation of “Oceaanopwarming of zeespiegelstijging door CO2 is niet mogelijk” [“Ocean warming or sea-level rise from CO2 is not possible”])
The wavelengths absorbed by CO2 are only between 13.5 and 16.5 microns, and here we see that this radiation can only penetrate 5 to 10 microns deep.
The ocean surface is approximately 3 degrees warmer than the atmosphere. Where the surface of the water is cooler than the air above it (because of wind), the atmosphere can at most heat the top few micrometres of the water by conduction, but this too will cause evaporation of water rather than warming. The evaporation boundary layer exhibits a particular temperature gradient, as we will see.
Further, the upper 700 metres of the ocean contain 50 times as much mass as the entire atmosphere. The specific heat of water is 4 times that of air, so this layer of water holds 200 times as much heat energy as the total atmosphere (for the entire ocean, this is as much as 1200 times). It is the ocean that warms the atmosphere, primarily by evaporation, and not vice versa.
The boundary layer of water evaporation
A very important phenomenon occurs in the boundary layer at the water surface where evaporation takes place, between the evaporating water and the layer of atmosphere above it.
Below is a very nice diagram (“Schematic plot of open ocean surface thermal structures”) showing how the temperature in the water varies with depth below the surface. The top value on the y-axis marks the 10 micrometres to which IR radiation penetrates; the next value below it is already 1 mm deep (1000 microns), far outside the range of IR radiation.
The temperature increases to the right; curve (a) is overnight and curve (b) during the day.

evaporation graph

Update 12-05-2012
Oceanographer Dr. Robert E. Stevenson writes in his report on page 8:
The atmosphere cannot warm until the underlying surface warms first. The lower atmosphere is transparent to direct solar radiation, preventing it from being significantly warmed by sunlight alone. The surface atmosphere thus gets its warmth in three ways: from direct contact with the oceans; from infrared radiation off the ocean surface; and, from the removal of latent heat from the ocean by evaporation. Consequently, the temperature of the lower atmosphere is largely determined by the temperature of the ocean.
Warming the ocean is not a simple matter, not like heating a small glass of water. The first thing to remember is that the ocean is not warmed by the overlying air.
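The mass and heat-capacity ratios quoted in the translated passage can be sanity-checked with round textbook numbers (every input below is an approximate round value, not measured data):

```python
# Rough check of the quoted "50x mass" and "200x heat capacity" ratios.
ocean_area_m2 = 3.6e14            # approximate ocean surface area
depth_m = 700.0                   # upper-ocean layer considered
rho_water = 1000.0                # kg/m^3
m_atmosphere = 5.1e18             # kg, total mass of the atmosphere
cp_water, cp_air = 4186.0, 1005.0 # specific heats, J/(kg K)

m_ocean_700 = ocean_area_m2 * depth_m * rho_water
mass_ratio = m_ocean_700 / m_atmosphere       # comes out near 50
heat_ratio = mass_ratio * cp_water / cp_air   # comes out near 200
print(round(mass_ratio), round(heat_ratio))
```

With these round inputs the ratios land close to the quoted 50x and 200x figures.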

gbaikie
January 25, 2013 4:04 pm

“If we continue to emit greenhouse gases at our current rate, we risk doubling that atmospheric CO2 level in roughly 2050.”
Ah, 2 times 280. So 560 ppm by 2050?
Not likely. 40 years at 4 ppm increase is 160 ppm.
Does anyone think we are going to get to 3 ppm per year within a decade?
It seems 2.5 ppm is more likely as a high end; reaching a doubling would require more than 4 ppm per year over the remaining 3 decades.
Probably at best 500 ppm by 2050.
But there are problems with that.
We will probably see less coal use and more natural gas use in the coming decades, for economic reasons: natural gas will be cheaper to use.
Every country other than the US is paying too high a price for coal, and the US is reducing coal because natural gas is cheaper. And the rest of the world is not that stupid. So within a decade or so, the world should be emitting less CO2, rather than continuing the mad increases.
And for the most part, the wind-mill and solar energy fad is increasing CO2 emissions. One might argue [wrongly] that over the long term wind mills and solar power could reduce CO2, but it is more wrong to imagine that in the short term they are reducing CO2 [the total CO2 emission cost of these “alternative energies” is not being accounted for; and just as hybrid cars aren’t accounting for all the CO2 they cause to be emitted [not to mention their toxic waste], the alternative energies are not accounting for all the CO2 they emit, only it’s worse than hybrid cars. In the short term, wind and solar are far worse than making ethanol in terms of causing more CO2 emissions].
So, since governments are fleeing from further alternative-energy subsidies [probably mostly because they can’t afford them, rather than for any sensible reason], we will see a global reduction of CO2 emissions because of this.

Bill Illis
January 25, 2013 4:16 pm

This is a good Bayesian Statistics example for you.
What is the CO2 sensitivity over the last 40 million years?
Now this chart uses the Actual global temperature estimates and all of the reliable CO2 estimates over the last 40 million years (shown as 3.0C per doubling of CO2). Versus the typical climate science example where they don’t use any actual data – they just make it up and ignore the actual data, or they use some climate model simulation of a fantasy climate.
The actual climate history says the climate does whatever it wants to do regardless of the CO2 level.
http://s12.postimage.org/gqh2y043x/Temp_vs_CO2_Last_40_Mys.png
Technically, I think there are two big drivers – Albedo and GHGs – but you need some way to separate the two. Albedo is clearly the biggest driver in this 40 million year example.
The Bayesians, for some reason, do not think ice over Antarctica or no ice over Antarctica makes any difference. Well it certainly does to the tune of at least -2.0C to global temperatures.
The Bayesians think that ice down to Chicago in the ice ages makes no difference at all, while it clearly shows an impact of up to -4.0C.
Logical, simple statistics (and one can actually calculate the albedo-climate impact of Antarctica icing over; surprise, it is exactly -2.0C) versus fantasy made-up statistics.

Steve B
January 25, 2013 4:59 pm

“RMB says:
January 25, 2013 at 7:42 am
You can not heat water from above. surface tension blocks the heat very emphatically and very convincingly. Thats why there is no climate sensitivity.”
Yes water can heat from above. Ever swum in a small river after a few hot summer days? The top 2 feet is warm but it drops a few degrees at about 2 meters down. Or how about a shallow pond or the kiddies swimming pool at a swim center.
(facepalm)

u.k.(us)
January 25, 2013 5:17 pm

I live in Chicago, which experienced a drought this year but has been getting normal moisture of late (though no snow cover).
Now the MSM has been feeding me things like:
“It’s pretty rare to get below zero on the ground at O’Hare without snow on the ground,” said Richard Castro, a Romeoville-based meteorologist for the National Weather Service.
The weather service reports that between 1960 and 2010, there have been 469 days with low temperatures below zero; only 16 of those occurred with no snow on the ground.
http://articles.chicagotribune.com/2013-01-21/news/chi-wind-chill-advisory-issued-dangerous-lows-predicted-for-overnight-20130121_1_wind-chill-national-weather-service-low-temperature

And
“Under ideal conditions of clear skies and light winds, a fresh snow cover can result in minimum temperature readings 20 degrees or more lower than they would be over bare ground. That being said, Chicago’s temperatures will drop below zero without a snow cover, but probably not lower than about minus 10.
Weather historian Frank Wachowski found several subzero occurrences without snow cover, the lowest 8 below on Dec. 14, 1985.”
http://blog.chicagoweathercenter.com/2013/01/22/ask-tom-lowest-temp-with-no-snow-on-the-ground-in-chicago/
==================
Yep, it’s cold, but it’s a dry cold.
Not sure where the extra added Global Warming moisture went, but at least I didn’t have to shovel it.

January 25, 2013 5:30 pm

Stuart L
I am a stupid layman, but I wonder about the effects of water vapour (clouds). When I lived in the UK, cloudy conditions would cause the temps to be milder (warmer); here in the Philippines, cloud causes cooler conditions. How can one calculate the overall effect on the earth’s surface?