Global warming less extreme than feared?
Policymakers are attempting to contain global warming at less than 2°C. New estimates from a Norwegian project on climate calculations indicate this target may be more attainable than many experts have feared.
Internationally renowned climate researcher Caroline Leck of Stockholm University has evaluated the Norwegian project and is enthusiastic.
“These results are truly sensational,” says Dr Leck. “If confirmed by other studies, this could have far-reaching impacts on efforts to achieve the political targets for climate.”
Temperature rise is levelling off
After Earth’s mean surface temperature climbed sharply through the 1990s, the increase has levelled off nearly completely at its 2000 level. Ocean warming also appears to have stabilised somewhat, despite the fact that CO2 emissions and other anthropogenic factors thought to contribute to global warming are still on the rise.
It is the focus on this post-2000 trend that sets the Norwegian researchers’ calculations on global warming apart.
Sensitive to greenhouse gases
Climate sensitivity is a measure of how much the global mean temperature is expected to rise if we continue increasing our emissions of greenhouse gases into the atmosphere.
CO2 is the primary greenhouse gas emitted by human activity. A simple way to quantify climate sensitivity is to calculate how much the mean air temperature would rise if the atmospheric CO2 concentration were doubled compared to the world’s pre-industrial level around the year 1750.
If we continue to emit greenhouse gases at our current rate, we risk doubling that atmospheric CO2 level by roughly 2050.
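As a rough illustration (not taken from the article), climate sensitivity is often related to concentration through the standard logarithmic approximation, in which every doubling of CO2 contributes the same equilibrium warming. A minimal sketch, assuming the commonly used pre-industrial baseline of about 280 ppm:

```python
import math

def warming_at_concentration(sensitivity_c, c_ppm, c0_ppm=280.0):
    """Equilibrium warming (°C) under the standard logarithmic
    approximation: each doubling of CO2 adds `sensitivity_c` degrees.
    280 ppm is a commonly used pre-industrial (c. 1750) baseline."""
    return sensitivity_c * math.log2(c_ppm / c0_ppm)

# At exactly double the baseline, the warming equals the sensitivity itself.
print(warming_at_concentration(1.9, 560.0))  # 1.9
```

Under this approximation, the article’s 1.9°C figure is simply the warming read off at one full doubling of the baseline concentration.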
Mutual influences
A number of factors affect how the climate develops. The complexity of the climate system is further compounded by feedback mechanisms, i.e. the ways in which factors such as clouds, evaporation, snow and ice mutually affect one another.
Uncertainties about the overall results of feedback mechanisms make it very difficult to predict just how much of the rise in Earth’s mean surface temperature is due to manmade emissions. According to the Intergovernmental Panel on Climate Change (IPCC) the climate sensitivity to doubled atmospheric CO2 levels is probably between 2°C and 4.5°C, with the most probable being 3°C of warming.
In the Norwegian project, however, researchers have arrived at an estimate of 1.9°C as the most likely level of warming.
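The textbook way to connect feedbacks to sensitivity (not spelled out in the article) is the relation S = S0 / (1 − f), where S0 ≈ 1.2°C is the no-feedback (Planck) response per CO2 doubling and f is the net feedback fraction. A hedged sketch, assuming that standard relation and the 1.2°C Planck value, of the feedback fractions implied by the two estimates above:

```python
def sensitivity(f, planck_response=1.2):
    """Equilibrium sensitivity per CO2 doubling under the textbook
    feedback relation S = S0 / (1 - f), where S0 is the no-feedback
    (Planck) response, commonly taken as ~1.2 °C, and f is the net
    feedback fraction (0 = no net feedback, -> 1 = runaway)."""
    return planck_response / (1.0 - f)

def implied_feedback(S, planck_response=1.2):
    """Invert the relation: the net feedback fraction implied by S."""
    return 1.0 - planck_response / S

# Feedback fractions implied by the two estimates in the article:
print(round(implied_feedback(3.0), 2))  # IPCC best estimate
print(round(implied_feedback(1.9), 2))  # Norwegian project estimate
```

The point of the sketch is how sensitive S is to f: dropping the net feedback fraction from 0.6 to about 0.37 cuts the sensitivity from 3.0°C to 1.9°C.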
Manmade climate forcing
“In our project we have worked on finding out the overall effect of all known feedback mechanisms,” says project manager Terje Berntsen, who is a professor at the University of Oslo’s Department of Geosciences and a senior research fellow at the Center for International Climate and Environmental Research – Oslo (CICERO). The project has received funding from the Research Council of Norway’s Large-scale Programme on Climate Change and its Impacts in Norway (NORKLIMA).
“We used a method that enables us to view the entire earth as one giant ‘laboratory’ where humankind has been conducting a collective experiment through our emissions of greenhouse gases and particulates, deforestation, and other activities that affect climate.”
For their analysis, Professor Berntsen and his colleagues entered into their model all the factors contributing to human-induced climate forcing since 1750. In addition, they entered fluctuations in climate caused by natural factors such as volcanic eruptions and solar activity. They also entered measurements of temperatures taken in the air, on the ground, and in the oceans.
The researchers used a single climate model that repeated calculations millions of times in order to form a basis for statistical analysis. Highly advanced calculations based on Bayesian statistics were carried out by statisticians at the Norwegian Computing Center.
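The article does not describe the actual model, but the flavour of such a Bayesian analysis can be sketched in miniature: place a prior over candidate sensitivities, score each candidate against observations with a likelihood, and read estimates and credible intervals off the normalised posterior. All numbers below are illustrative assumptions, not the project’s data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, illustrative "observations" of sensitivity (NOT real data).
obs = 3.0 + rng.normal(0.0, 1.0, size=50)
sigma = 1.0  # assumed observation noise

# Uniform prior over a plausible range of sensitivities (°C per doubling).
grid = np.linspace(0.5, 6.0, 1101)

# Gaussian log-likelihood of all observations for each candidate value;
# with a flat prior this is also the (unnormalised) log-posterior.
log_post = -0.5 * np.sum((obs[:, None] - grid[None, :]) ** 2, axis=0) / sigma**2

post = np.exp(log_post - log_post.max())
post /= post.sum()

mean_S = float(np.sum(grid * post))  # posterior mean
cdf = np.cumsum(post)
lo, hi = grid[np.searchsorted(cdf, 0.05)], grid[np.searchsorted(cdf, 0.95)]
# (lo, hi) is a 90% credible interval, analogous in spirit to the
# 1.2–2.9°C interval quoted later in the article.
```

Where this toy version evaluates a one-dimensional grid, the real project ran a climate model millions of times to build the equivalent statistical basis.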
2000 figures make the difference
When the researchers at CICERO and the Norwegian Computing Center applied their model and statistics to temperature readings from the air and ocean for the period ending in 2000, they found that climate sensitivity to a doubling of atmospheric CO2 concentration would most likely be 3.7°C, somewhat higher than the IPCC estimate.
But the researchers were surprised when they entered temperatures and other data from the decade 2000-2010 into the model; climate sensitivity was greatly reduced to a “mere” 1.9°C.
Professor Berntsen says this temperature increase would only be fully realised after we reach the doubled CO2 concentration (compared to 1750) and maintain it for an extended period, because the oceans delay the effect by several decades.
The figure of 1.9°C as a prediction of global warming from a doubling of atmospheric CO2 concentration is an average. When researchers instead calculate a probability interval of what will occur, including observations and data up to 2010, they determine with 90% probability that global warming from a doubling of CO2 concentration would lie between 1.2°C and 2.9°C.
This maximum of 2.9°C global warming is substantially lower than many previous calculations have estimated. Thus, when the researchers factor in the observations of temperature trends from 2000 to 2010, they significantly reduce the probability of our experiencing the most dramatic climate change forecast up to now.
Professor Berntsen explains the changed predictions:
“The Earth’s mean temperature rose sharply during the 1990s. This may have caused us to overestimate climate sensitivity.
“We are most likely witnessing natural fluctuations in the climate system – changes that can occur over several decades – and which are coming on top of a long-term warming. The natural changes resulted in a rapid global temperature rise in the 1990s, whereas the natural variations between 2000 and 2010 may have resulted in the levelling off we are observing now.”
Climate issues must be dealt with
Terje Berntsen emphasises that his project’s findings must not be construed as an excuse for complacency in addressing human-induced global warming. The results do indicate, however, that it may be more within our reach to achieve global climate targets than previously thought.
Regardless, the fight cannot be won without implementing substantial climate measures within the next few years.
Sulphate particulates
The project’s researchers may have shed new light on another factor: the effects of sulphur-containing atmospheric particulates.
Burning coal is the main way that humans continue to add to the vast amounts of tiny sulphate particulates in the atmosphere. These particulates can act as condensation nuclei for cloud formation, cooling the climate indirectly by causing more cloud cover, scientists believe. According to this reasoning, if Europe, the US and potentially China reduce their particulate emissions in the coming years as planned, it should actually contribute to more global warming.
But the findings of the Norwegian project indicate that particulate emissions probably have less of an impact on climate through indirect cooling effects than previously thought.
So the good news is that even if we do manage to cut emissions of sulphate particulates in the coming years, global warming will probably be less extreme than feared.
| About the project |
| Geophysicists at the research institute CICERO collaborated with statisticians at the Norwegian Computing Center on a novel approach to global climate calculations in the project “Constraining total feedback in the climate system by observations and models”. The project received funding from the Research Council of Norway’s NORKLIMA programme. The researchers succeeded in reducing uncertainty around the climatic effects of feedback mechanisms, and their findings indicate a lowered estimate of probable global temperature increase as a result of human-induced emissions of greenhouse gases. The project researchers were able to carry out their calculations thanks to the free use of the high-performance computing facility in Oslo under the Norwegian Metacenter for Computational Science (Notur). The research project is a prime example of how collaboration across subject fields can generate surprising new findings. |
- Written by:
- Bård Amundsen/Else Lie. Translation: Darren McKellep/Carol B. Eckmann
- h/t to Andrew Montford via Leo Hickman
We must create a World Climate Control Commission with the power to Tax The World!
The really, really amusing thing about this is the enormous impact that a tiny timespan has had on the estimate (which is absolutely on the right track). And that timespan is not “finished”. Every year with basically neutral temperature at this point shaves another 0.1 C off of the overall expected sensitivity, at least down to the 1-1.4C expected from CO_2 only, at least 0.3 to 0.4 of which we’ve already experienced as CO_2 went from 300 to 400 ppm.
Indeed, that’s the pace — 1.2 C total warming by the time CO_2 gets to 600 ppm. Maybe. But we really have only a tiny segment of good, tamper-proof data (RSS or UAH, take your pick). If they remain flat for another decade, or go down, that will completely alter the predictions. If they go up, or sharply up, that will completely alter the predictions.
Doesn’t sound particularly settled, but then, our ignorance is profound. It isn’t settled, as this study clearly shows.
Note well, if the climate remains flat, AR6 is going to reduce sensitivity to less than 2 C just as AR5 is dropping it well below 3C — nobody believes the extreme predictions any more. The IPCC simply is unwilling to face how meaningless all of these predictions are, given their extreme statistical sensitivity to new data as it comes in.
rgb
Is there a link? The one you give is to an email address. Thx.
“The Earth’s mean temperature rose sharply during the 1990s.”
‘Sharply’? Fractional increases in temperature that humans would not be able to discern without the aid of instrumentation.
The rent seekers are queuing at the exit.
A new tinker toy contraption that is going to show us how things do not work. How many global climate models does this make so far?
Off topic, but this morning on C-SPAN they had a Reuters reporter taking questions on the drought. When asked whether global warming had any influence, the reporter mentioned droughts in the 50s and the 80s and pretty much blamed cycles. I nearly fell out of my chair. I believe the ship jumping is becoming an epidemic.
I dunno, is it just me, or does this seem yet more of the ‘gradual climbdown’ over recent months?
Their hypothesis of CO2 being the primary driver has been fairly clearly falsified due to lack of warming despite significant CO2 increase and that hurts them!
or – if they want to claim this current lack of warming is due to natural variability, then the previous warming could also be due to natural variability – and that hurts them too!
it just strikes me they are desperate to find something they can turn to – but the simple fact remains that the data (even fudged data) is not showing what they need it to show.
Now would be a good time to record the names of all the alarmists, along with any screencaps of their actions – as in due course, they will all be reversing course and claiming they were misled, etc, etc.
What if the economy still doesn’t recover after all that wasted money on AGW? I can well imagine Jones, Hansen et al. being physically forced to eat their words in a few years’ time by the impoverished folk they have ‘created’! The sound of several hundred million cold, starving folk baying for blood must be beginning to haunt their dreams!
arthur4563 says:
January 25, 2013 at 7:12 am
“I still insist that future carbon emissions are going to be drastically reduced simply as a matter of advanced technology in personal vehicles (they’re going electric, just as fast as batteries can be reduced in price)”
No Moore’s Law for batteries.
More Post-Normal thinking, OY! If the methodology used to derive the so-called “climate sensitivity” were valid, it would be independent of the set of data used, as long as the size of the domain is greater than zero. So what happens to the derived figure when data from 2010 to 2013 are added?
These jokers are just trying to keep CO2 reduction a viable tool in the Marxist war against prosperity by explaining away the failure of temperatures to rise as expected. They showed their hand in these paragraphs:
‘“These results are truly sensational,” says Dr Leck. “If confirmed by other studies, this could have far-reaching impacts on efforts to achieve the political targets for climate.”’
and
“Terje Berntsen emphasises that his project’s findings must not be construed as an excuse for complacency in addressing human-induced global warming. The results do indicate, however, that it may be more within our reach to achieve global climate targets than previously thought.
Regardless, the fight cannot be won without implementing substantial climate measures within the next few years.”
“Less extreme than feared”? or “It’s milder than predicted… how wonderful”
Well, at least we know what they were hoping for, and what they actually fear.
Sometimes I truly wish I could time travel:
Circa 2020 — Climate Scientists announced today that they have updated their model with data from the period 2010 – 2020 and are very surprised that the model outputs a climate sensitivity of 1.4125.
Scientists were reportedly very excited that the lower sensitivity implied that emissions control policies were now even closer within reach, but cautioned that we should not become complacent, and that further study was necessary.
/sarc
davidmhoffer wrote in http://wattsupwiththat.com/2013/01/25/yet-another-study-shows-lower-climate-sensitivity/#comment-1208754
It’s not often that we agree. I am not going to jump to the conclusion that it was “trash”, but such a result is suspicious at the least.
_Jim says:
January 25, 2013 at 7:56 am
Policymakers are attempting to …
###
“Policymaker” is a standard term from the Marxist lexicon that encapsulates a concept that exists within the Marxist world-view. Like most Marxist jargon, what it appears to mean and what it means to the Marxist are different. In this case, the term is used to identify those pressure points that the Marxist activist can target in order to achieve their goal of the enslavement of mankind.
Whenever you read such terminology, you can be pretty sure that you are reading something embedded within the Marxist world-view, even if the writer is not aware of it. Marxists have had such an influence on our society that even conservative thought is polluted with this nonsense.
I saw this today at bishop-hill re: CS
Uniform priors and the IPCC
http://www.bishop-hill.net/blog/2013/1/25/uniform-priors-and-the-ipcc.html
But the researchers were surprised when they entered temperatures and other data from the decade 2000-2010 into the model; climate sensitivity was greatly reduced to a “mere” 1.9°C.
I know others have already commented on this but…seriously? They were surprised by this?!
(And for Jim who wanted clarification at January 25, 2013 at 7:56 am. Let’s be absolutely clear about this; it depends.)
It would seem that Norway, being at fairly high latitudes, should be more concerned about the prospect of oncoming global cooling as indicated by the odd current behavior of sunspot cycle 24. Of course CO2 assists the sun in warming us in a minor manner, so in 30 or 40 years will Norwegians be welcoming all the CO2 that can be economically produced?
Ho hum. Another day, another climate model…
“If confirmed by other studies, this could have far-reaching impacts on efforts to achieve the political targets for climate.”
Political targets… nuff said.
Who would have thought using a model sensitive to recent temperatures would produce such a result? Amazing, simply amazing. /sarc
“arthur4563 says:
January 25, 2013 at 7:07 am
I cringe every time I hear about Bayesian statistics being employed. Such statistics amount to educated (or not so educated) guesses about what someone thinks the values are, presumably incorporating knowledge that he has, and weighted according to more educated guessing.”
I wonder if the discussion between learned statisticians at Bishop Hill has any bearing on the use of Bayesian statistics in this study.
http://bishophill.squarespace.com/blog/2013/1/25/uniform-priors-and-the-ipcc.html
Nic Lewis and Steve Jewson chime in with advice on why one should not use a flat prior, but Jeffreys’ prior instead.
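To illustrate the point being debated (a toy example, not the study’s actual calculation): for a Gaussian scale parameter, the Jeffreys prior is proportional to 1/σ, and relative to a flat prior it shifts the posterior toward smaller values. A minimal sketch on a grid, with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(0.0, 2.0, size=20)  # zero-mean Gaussian, unknown scale

sigmas = np.linspace(0.5, 6.0, 2000)
# Gaussian log-likelihood of the data for each candidate sigma.
log_like = -len(data) * np.log(sigmas) - np.sum(data**2) / (2.0 * sigmas**2)

def normalise(logp):
    """Exponentiate a log-density on the grid and normalise it to sum to 1."""
    p = np.exp(logp - logp.max())
    return p / p.sum()

post_flat = normalise(log_like)                   # uniform (flat) prior
post_jeff = normalise(log_like - np.log(sigmas))  # Jeffreys prior, 1/sigma

mean_flat = float(np.sum(sigmas * post_flat))
mean_jeff = float(np.sum(sigmas * post_jeff))
# The Jeffreys posterior concentrates at smaller sigma than the flat one.
```

The same mechanism is what the linked discussion is about: with limited data, the choice of prior visibly moves the posterior for a sensitivity-like parameter.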
Each new ‘study’ ratchets down the climate sensitivity estimate. In fact, sensitivity to 2xCO2 is statistically zero. This shows why. Any effect from adding more CO2 is lost in the noise. It is simply too small to measure. Thus, the “carbon” scare is falsified.
I am listening but I can’t make out the sound of any apologies…Nope, in fact Lindzen, Spencer, and others that have been estimating sensitivity at less than 2C are still being denigrated by the consensus. Of course the politics haven’t changed…
Yawn. Mostly what this confirms is that for all their insisting climate models are fancy and robust, they’re really just “we expect what’s been happening will keep happening”, and aren’t truly independently predictive at all.
they determine with 90% probability that global warming from a doubling of CO2 concentration would lie between 1.2°C and 2.9°C.
Luke warmers.
And sensitivity to 2xCO2 cannot be zero. If it were, then the response to changes in solar input would also be zero.