Scientists Develop New Method to Quantify Climate Modeling Uncertainty

(From PhysOrg.com, h/t to Leif Svalgaard) – Climate scientists recognize that climate modeling projections include a significant level of uncertainty. A team of researchers using computing facilities at Oak Ridge National Laboratory has identified a new method for quantifying this uncertainty.

Photo: Martin Koser of Denmark

The new approach suggests that the range of uncertainty in climate projections may be greater than previously assumed. One consequence is the possibility of greater warming and more heat waves later in the century under the Intergovernmental Panel on Climate Change’s (IPCC) high fossil fuel use scenario.

The team performed an ensemble of computer “runs” using one of the most comprehensive climate models–the Community Climate System Model version 3, developed by the National Center for Atmospheric Research (NCAR)–on each of three IPCC scenarios. The first IPCC scenario, known as A1FI, assumes high global economic growth and continued heavy reliance on fossil fuels for the remainder of the century. The second scenario, known as B1, assumes a major move away from fossil fuels toward alternative and renewable energy as the century progresses. The third scenario, known as A2, is a middling scenario, with less even economic growth and some adoption of alternative and renewable energy sources as the century unfolds.

The team computed uncertainty by comparing model outcomes with historical climate data from the period 2000-2007. Models run on historical periods typically depart from the actual weather data recorded for those time spans. The team used statistical methods to develop a range of temperature variance for each of the three scenarios, based on their departure from actual historical data.
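The departure-based calculation can be illustrated with a small sketch. This is not the authors' actual code or data: the numbers, the simple 1.96-sigma bar, and the variable names are all hypothetical, chosen only to show the idea of turning model-versus-observation departures into an error bar.

```python
# Illustrative sketch: derive an uncertainty range for a projection from
# the departures between a model hindcast and observed data.
# All numbers are made up for demonstration.
import statistics

def error_bar(model_hindcast, observations, z=1.96):
    """Half-width of a ~95% error bar from model-observation departures."""
    residuals = [m - o for m, o in zip(model_hindcast, observations)]
    # Treat the spread of the departures as the model's irreducible error
    return z * statistics.stdev(residuals)

# Hypothetical annual mean temperatures (deg C) for 2000-2007
model = [14.2, 14.3, 14.5, 14.4, 14.6, 14.5, 14.7, 14.6]
obs   = [14.3, 14.1, 14.6, 14.2, 14.5, 14.7, 14.4, 14.5]

half_width = error_bar(model, obs)
projection = 16.0  # hypothetical end-of-century projection for one scenario
print(f"projection: {projection:.1f} +/- {half_width:.2f} C")
```

The same half-width would then be attached to each scenario's projected temperature curve, in the manner of the hurricane-track cone described below.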

The approach’s outcome is roughly similar to the National Weather Service’s computer predictions of a hurricane’s path, familiar to TV viewers. There is typically a dark line on the weather map showing the hurricane’s predicted path over the next few days, and there is a gray or colored area to either side of the line showing how the hurricane may diverge from the predicted path, within a certain level of probability. The ORNL team developed a similar range of variance–technically known as “error bars”–for each of the scenarios.

Using resources at ORNL’s Leadership Computing Facility, the team then performed ensemble runs on three decade-long periods at the beginning, middle, and end of the twenty-first century (2000-2009, 2045-2055, and 2090-2099) to get a sense of how the scenarios would unfold over the course of the century.

Interestingly, when the variance or “error bars” are taken into account, there is no statistically significant difference between the projected temperatures resulting from the high fossil fuel A1FI scenario and the middling A2 scenario up through 2050. That is, the A1FI and A2 error bars overlap. After 2050, however, the A1FI range of temperature projections rises above that of A2, until they begin to overlap again toward the century’s end.
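The overlap criterion used here can be sketched in a few lines. All values below are hypothetical, not taken from the study; the sketch only shows the rule that two scenarios are treated as statistically indistinguishable when their error bars intersect.

```python
# Minimal sketch of the error-bar overlap test (hypothetical values).

def intervals_overlap(center_a, half_a, center_b, half_b):
    """True if [a-ha, a+ha] and [b-hb, b+hb] intersect."""
    return abs(center_a - center_b) <= half_a + half_b

# Hypothetical projections (deg C) with error-bar half-widths
a1fi_2050 = (16.1, 0.6)   # high fossil fuel scenario, mid-century
a2_2050   = (15.8, 0.5)   # middling scenario, mid-century
a1fi_2070 = (17.4, 0.6)   # later in the century
a2_2070   = (16.1, 0.5)

print(intervals_overlap(*a1fi_2050, *a2_2050))  # bars overlap: no significant difference
print(intervals_overlap(*a1fi_2070, *a2_2070))  # bars separate after mid-century
```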

Typically, climate scientists have understood the range of uncertainty in projections to be the variance between high and low scenarios. But when the error bars are added in, the range between high and low possibilities actually widens, indicating greater uncertainty.

“We found that the uncertainties obtained when we compare model simulations with observations are significantly larger than what the ensemble bounds would appear to suggest,” said ORNL’s Auroop R. Ganguly, the study’s lead author.

In addition, the error bars in the A1FI scenario suggest at least the possibility of even higher temperatures and more heat waves after 2050, if fossil fuel use is not curtailed.

The team also looked at regional effects and found large geographical variability under the various scenarios. The findings reinforce the IPCC’s call for greater focus on regional climate studies in an effort to understand specific impacts and develop strategies for mitigation of and adaptation to climate change.

The study was published in the Proceedings of the National Academy of Sciences. Co-authors include Marcia Branstetter, John Drake, David Erickson, Esther Parish, Nagendra Singh, and Karsten Steinhaeuser of ORNL, and Lawrence Buja of NCAR. Funding for the work was provided by ORNL’s new cross-cutting initiative called Understanding Climate Change Impacts through the Laboratory Directed Research and Development program.

More information: The paper can be accessed electronically here: http://www.pnas.org/content/106/37/15555


104 Comments
October 22, 2009 2:50 pm

Come on! A model of a model? Or a self-indulging model?

October 22, 2009 2:53 pm

Wait, this is like dividing by zero or like pouring the empty into the void!

KimW
October 22, 2009 2:56 pm

So even more computer models predict disaster, and the more they ran them, the more disaster. How very scientific. That was sarcasm, by the way.

dearieme
October 22, 2009 2:57 pm

“The team computed uncertainty by comparing model outcomes with historical climate data from the period 2000-2007.” Is 2000-2007 a cherry?

Ron de Haan
October 22, 2009 3:00 pm

“One consequence is the possibility of greater warming and more heat waves later in the century under the Intergovernmental Panel on Climate Change’s (IPCC) high fossil fuel use scenario”.
The article starts with crap so it probably ends with crap so I stopped reading.
It’s a pathetic attempt to clean talk the faulty computer models.
I have adopted Lord Monckton’s assessment that it is not possible to make any reliable climate prediction using computer models.
Anyone who claims to have a reliable climate model is a fraud serving the First Global Revolution.

Sean
October 22, 2009 3:02 pm

The models can’t get clouds right. Shouldn’t they try to get the physics of the weather right before they play these statistical games?

Michael D Smith
October 22, 2009 3:08 pm

I wonder if they also put a distribution around climate sensitivity, since most recent studies show CO2 has a MUCH smaller effect than has been used in models of the past.
Start doing some model runs with a reasonable H2O feedback parameter to CO2 and guess what, there will be no statistically significant difference between ANY CO2 emissions scenarios… (big DUH). Let me guess, they didn’t do that…

DaveE
October 22, 2009 3:12 pm

More uncertainty but it’s WTWT
DaveE.

peter_dtm
October 22, 2009 3:13 pm

Feed the data in up to 1995 – do any of the models predict the current state of the weather?
If they don’t, why does anyone bother with them?

jack mosevich
October 22, 2009 3:15 pm

If the range of uncertainty is greater than previously assumed, then cooler temperatures could be in the offing too. They only claim that higher temperatures might happen. A bit of bias, no?

Richard Wright
October 22, 2009 3:22 pm

Let’s see, the uncertainty increases but the consequence is that there may be higher temperatures and more heat waves. Of course higher uncertainty couldn’t possibly mean that there may be lower temperatures and fewer heat waves. Using their illustration from projected hurricane paths, it would be like the band of uncertainty was higher on one side of the projected path than the other. No bias in this reporting, is there? I also find it interesting that they judge the uncertainty by the models’ ability to predict the past. They have yet to be able to predict the future.

crosspatch
October 22, 2009 3:24 pm

In addition, the error bars in the A1FI scenario suggest at least the possibility of even higher temperatures and more heat waves after 2050, if fossil fuel use is not curtailed.

This is the sort of idiotic nonsense that pervades the people pushing the anti-fossil fuel agenda. Again it is proof that it isn’t about “global warming” but about using “global warming” to achieve a different agenda.
As stated in jack mosevich (15:15:04), the wider error bars ALSO produce the same possibility that temperatures will be even colder DESPITE the continued use of fossil fuel. What really chaps my cheeks is this implied notion that fossil fuel use is responsible for any warming or cooling of the climate and if we somehow eliminated the use of it, we could be assured that the climate wouldn’t warm. There has been absolutely no observed relationship between climate behavior and fossil fuel use. That they continue spouting this as if it is some kind of truth makes rubbish of the entire article.
What a bunch of idiots!

michel
October 22, 2009 3:24 pm

“Anyone who claims to have a reliable climate model is a fraud serving the First Global Revolution.”
No, he/she is mistaken. Possibly because of lack of experience with the use of models in forecasting. Possibly because of a political agenda. Possibly because of confirmation bias – one fools oneself. We all do it sometime. Wrong, almost certainly. Fraud, most likely not, and anyway you can’t prove it, so it adds nothing to the observation that they are wrong.

DR
October 22, 2009 3:25 pm

Until they can correctly simulate clouds, solar, ENSO and other aspects of climate processes, reports like that are useless.

Murray Duffin
October 22, 2009 3:29 pm

There isn’t enough fossil fuel to supply the CO2 for A1FI, and maybe not enough for B1, so what does that do to their uncertainty?

Craigo
October 22, 2009 3:32 pm

The uncertainty is “worse than we predicted”.
Could the error bars also suggest “lower than we predicted” temperatures and fewer heat waves? (Aren’t heat waves weather, not climate?)
This is all too uncertain for me. Sounds like the science is a little unsettled today.

Bob Koss
October 22, 2009 3:34 pm

Actual global temperature is 14°C-15°C. How can they be modeling the earth when the temperatures shown in their fig. 1 graphs stay below 10°C throughout the period? It seems to me the emulation of snow and ice coverage along with the albedo figures have to be totally erroneous.

Glenn
October 22, 2009 3:35 pm

Not much surprise that these gavin clone computer geek warmingologists would “suggest” “greater urgency and complexity of adaptation or mitigation decisions.”
The paper “edited” by Stephen H. Schneider after April 2009…
who in March 2009 authored and signed this letter to Congress
http://stephenschneider.stanford.edu/Publications/PDF_Papers/Congressional_Ldrs_Ltr.pdf

October 22, 2009 3:40 pm

“The team computed uncertainty by comparing model outcomes with historical climate data from the period 2000-2007.”
That’s very strange. Hasn’t it been revealed over and over again that 30 years are required to establish “climate”?

SemiChemE
October 22, 2009 3:40 pm

So, if I’m reading this correctly, the study found that the range of uncertainty in the model was larger than they thought, which is all well and good. What is disturbing is how this article cherry picks the interpretation of this result, stating:
“In addition, the error bars in the A1FI scenario suggest at least the possibility of even higher temperatures and more heat waves after 2050, if fossil fuel use is not curtailed.”
The problem is that they forget the equally likely possibility of even lower temperatures and cold waves after 2050, if fossil fuel use is not curtailed.

Tenuc
October 22, 2009 3:42 pm

Just illustrates the futility of trying to model a dynamic chaotic system with insufficient knowledge of how each part of our climate operates and interacts, then using data of insufficient granularity and accuracy, on a computer whose arithmetic unit has far too few decimal places to capture the subtlety of the initial conditions.
Another good example of GIGO.

P Walker
October 22, 2009 3:44 pm

A “modest proposal” – Why not scrap current models, which obviously don’t work, and start over again using all the known variables instead of relying on CO2? BTW, this is a mostly rhetorical question, although I don’t understand why no one has tried this.

DennisA
October 22, 2009 3:55 pm

For a very revealing background to IPCC modeling, check out the 1999-2001 ECLAT series of seminars on Representing Uncertainty in Climate Models:
http://www.cru.uea.ac.uk/eclat/ All downloadable pdf’s.
Just a few choice selections…..
“Projecting the future state(s) of the world with respect to demographic, economic, social, and technological developments at a time scale consistent with climate change projections is a daunting task, some even consider as straightforward impossible.
Over a century time scale, current states and trends simply cannot be extrapolated. The only certainty is that the future will not be just more of the same of today, but will entail numerous surprises, novelties and discontinuities.”
“The probability of occurrence of long-term trends is inversely proportional to the ‘expert’ consensus.”
“….excessive self-cite and “benchmarking” of modeling studies to existing scenarios creates the danger of artificially constructing “expert consensus””
That was two years after Kyoto, when current UK chief climate scientist, Bob Watson, then IPCC Chairman, had said the science was settled.
Since the ECLAT seminars, they became so worried about how uncertain everything was that they decided the only thing to do was ignore it and boast of ever more robust climate models, because they didn’t want policy makers to think there was any doubt about global warming.
Funny though, eight years later, in 2007, Professor Lenny Smith, a statistician at the London School of Economics, warned about the “naïve realism” of current climate modelling. That’s eight years of ever more expansive and expensive computer systems and ever more complex models, giving rise to almost weekly calls of “new research suggests it’s worse than we thought.”
“Our models are being over-interpreted and misinterpreted,” he said. Over-interpretation of models is already leading to poor financial decision-making, Smith says. “We need to drop the pretence that they are nearly perfect.”
“He singled out for criticism the British government’s UK Climate Impacts Programme and Met Office. He accused both of making detailed climate projections for regions of the UK when global climate models disagree strongly about how climate change will affect the British Isles.” (From New Scientist magazine, 16 August 2007.)
I think the theme song is “I can do MAGICC”…..

tallbloke
October 22, 2009 3:55 pm

HAHAHAHAHAHAHAHAHAHAHAHA! THEY’VE LOST THE PLOT.
Give it up guys, you’re just making yourselves look sillier all the time.

realitycheck
October 22, 2009 3:55 pm

Haven’t posted in a while, but this one definitely got the alarm bells ringing.
These climate modelers really do see the world through some form of catastrophe-tinted spectacles don’t they?
The uncertainty is worse than we first thought – therefore it could get EVEN WARMER!!
???
It could also mean that all the warm projections are a pile of horse excrement too, couldn’t it? Ever thought of that one, Batman? Huh? Huh?
I’m sorry, they must be using that new kind of mathematics in which 5-5 = 10
Now I see.
