(From PhysOrg.com, h/t to Leif Svalgaard) – Climate scientists recognize that climate modeling projections include a significant level of uncertainty. A team of researchers using computing facilities at Oak Ridge National Laboratory has identified a new method for quantifying this uncertainty.

The new approach suggests that the range of uncertainty in climate projections may be greater than previously assumed. One consequence is the possibility of greater warming and more heat waves later in the century under the Intergovernmental Panel on Climate Change’s (IPCC) high fossil fuel use scenario.
The team performed an ensemble of computer “runs” using one of the most comprehensive climate models, the Community Climate System Model version 3 developed by the National Center for Atmospheric Research (NCAR), on each of three IPCC scenarios. The first IPCC scenario, known as A1FI, assumes high global economic growth and continued heavy reliance on fossil fuels for the remainder of the century. The second scenario, known as B1, assumes a major move away from fossil fuels toward alternative and renewable energy as the century progresses. The third, known as A2, is a middling scenario, with less even economic growth and some adoption of alternative and renewable energy sources as the century unfolds.
The team computed uncertainty by comparing model outcomes with historical climate data from the period 2000-2007. Models run on historical periods typically depart from the actual weather data recorded for those time spans. The team used statistical methods to develop a range of temperature variance for each of the three scenarios, based on their departure from actual historical data.
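As a rough illustration only (hypothetical anomaly numbers and a simple normal assumption; not the paper's actual statistical method), an error bar of this kind could be derived from model-minus-observation departures like so:

```python
# Minimal sketch: derive a simple temperature "error bar" from
# model-minus-observation residuals over a hindcast period, then attach it
# to a projection. Numbers are made up for illustration.
import numpy as np

# Hypothetical annual global-mean temperature anomalies, 2000-2007 (deg C)
observed = np.array([0.40, 0.52, 0.60, 0.60, 0.53, 0.67, 0.61, 0.62])
modelled = np.array([0.45, 0.48, 0.55, 0.63, 0.58, 0.60, 0.66, 0.58])

residuals = modelled - observed      # model departure from observations
sigma = residuals.std(ddof=1)        # spread of those departures

# Rough 95% error bar (normal assumption) to place around a projection
half_width = 1.96 * sigma
projection_2050 = 2.1                # hypothetical scenario value, deg C
print(f"2050 projection: {projection_2050:.1f} +/- {half_width:.2f} C")
```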
The approach’s outcome is roughly similar to the National Weather Service’s computer predictions of a hurricane’s path, familiar to TV viewers. There is typically a dark line on the weather map showing the hurricane’s predicted path over the next few days, and there is a gray or colored area to either side of the line showing how the hurricane may diverge from the predicted path, within a certain level of probability. The ORNL team developed a similar range of variance–technically known as “error bars”–for each of the scenarios.
Using resources at ORNL’s Leadership Computing Facility, the team then performed ensemble runs on three decade-long periods at the beginning, middle, and end of the twenty-first century (2000-2009, 2045-2055, and 2090-2099) to get a sense of how the scenarios would unfold over the course of the century.
Interestingly, when the variance or “error bars” are taken into account, there is no statistically significant difference between the projected temperatures resulting from the high fossil fuel A1FI scenario and the middling A2 scenario up through 2050. That is, the A1FI and A2 error bars overlap. After 2050, however, the A1FI temperature projections rise above those of A2, until they begin to overlap again toward the century’s end.
Typically, climate scientists have understood the range of uncertainty in projections to be the variance between high and low scenarios. But when the error bars are added in, the range between high and low possibilities actually widens, indicating greater uncertainty.
“We found that the uncertainties obtained when we compare model simulations with observations are significantly larger than what the ensemble bounds would appear to suggest,” said ORNL’s Auroop R. Ganguly, the study’s lead author.
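A back-of-the-envelope illustration of that point, assuming (purely for illustration) that ensemble spread and model-versus-observation error are independent and combine in quadrature; this is not the paper's formula:

```python
# Illustration only: observation-based error added on top of ensemble spread
# makes the total uncertainty larger than the ensemble bounds alone suggest.
import math

ensemble_spread = 0.3     # hypothetical half-width from the ensemble runs, deg C
model_obs_error = 0.5     # hypothetical half-width from hindcast residuals, deg C

combined = math.sqrt(ensemble_spread**2 + model_obs_error**2)
print(f"combined half-width: {combined:.2f} C")   # larger than the ensemble spread alone
```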
In addition, the error bars in the A1FI scenario suggest at least the possibility of even higher temperatures and more heat waves after 2050, if fossil fuel use is not curtailed.
The team also looked at regional effects and found large geographical variability under the various scenarios. The findings reinforce the IPCC’s call for greater focus on regional climate studies in an effort to understand specific impacts and develop strategies for mitigation of and adaptation to climate change.
The study was published in the Proceedings of the National Academy of Sciences. Co-authors include Marcia Branstetter, John Drake, David Erickson, Esther Parish, Nagendra Singh, and Karsten Steinhaeuser of ORNL, and Lawrence Buja of NCAR. Funding for the work was provided by ORNL’s new cross-cutting initiative called Understanding Climate Change Impacts through the Laboratory Directed Research and Development program.
More information: The paper can be accessed electronically here: http://www.pnas.org/content/106/37/15555
They are the very model,
Of modern climate modelers.
When they don’t even know what they don’t know, there isn’t any way to set reliable error bars. This is simply bad science applied to bad science.
“Two words: pet banks.” Evan
Touché!
Banking is such a corrupting business which is why there should be no government privilege. If only this were a banking blog. I have some ideas …
But better 23 pet banks than 1.
Thanks Evan.
Balderdash. The uncertainty can’t be bounded because the only thing they really know is that there is a lot they don’t know, if they’re not completely self-deluded.
Philip_B (17:53:51) :
For a period I worked as a consultant assessing the prospects of ultimate success of computer software projects of similar scale to the climate models.
The individuals who had worked on the projects had enormous psychological investment in their systems and would go to great lengths to ensure the systems continued development.
All evidence presented to them that their system was unrecoverably flawed and had little or no chance of ultimate success was rationalized away or avoided, often combined with the kind of ad hominem attacks we are so familiar with in climate circles.
I have little doubt the climate models are irretrievably flawed and should be scrapped, and the development should be restarted from scratch in a public and transparent process.
I am also certain that this will never happen if the modellers are left to make the decision themselves.
This all bears repeating…again and again. May the truth win out. The truth is out there.
Chris
Norfolk, VA, USA
But better 23 pet banks than 1.
Come to think of it, I’m also part Cherokee . . .
Smokey, I must admit I have saved your words for the future (but I will give credit). Worth saving… and worth repeating… again and again… again.
May the truth win out…
Smokey (18:32:41) :
No GCM [computer climate model] predicted the flat to declining global temperatures for most of the past decade, just as no computer financial model predicted the severity and extent of the global economic decline over the past year. And the universe of stocks is much simpler than the individual entities affecting the climate.
The only truly successful financial computers are those that take advantage of arbitrage in the microsecond delays between buy/sell orders and execution. But there are no truly successful computer financial models that accurately predict future stock prices.
What is ignored by the faction of the climate community engaged in selling AGW is the fact that COMPUTER MODELS ARE NOT DATA [emphasis mine] . Models are simply a tool. And they are not a very good tool for predicting the climate. They are not evidence, although this paper typically implies that they are.
Empirical evidence is RAW [emphasis mine LOL], verifiable data. Arguments can be made for adjusting the raw data. And arguments can be made for using different methodologies and computer algorithms in the models. But none of these are evidence.
The problem emerges when the people arguing the case that computer models can be tweaked sufficiently to predict the future refuse to disclose their raw data and their specific methodology, thus making replication and falsification of their hypothetical claims almost impossible. They are uncooperative because they are hiding something.
The scientific method requires that all information resulting in a new hypothesis must be promptly made available to anyone who requests it. Yet the AGW contingent disrespects and ignores the scientific method.
The undeniable fact that certain climate scientists of the CO2=AGW persuasion are still deliberately withholding the data and methodologies that they claim show that a tiny trace gas will cause runaway global warming means that they are engaging in self-serving propaganda, not science.
Anyone withholding any such information should be barred from receiving further public funding of any kind, because they are deliberately running a scam to defraud the public, nothing less. For example, Michael Mann, of the debunked UN/IPCC hockey stick, still refuses to cooperate with other scientists.
Mann stonewalls requests for his data. And when his methodology is finally teased out of his results, it turns out that not only is Mann devious, but he is INCOMPETENT AS WELL [emphasis mine], as is shown in his paper published with upside-down Tiljander sediments — which was presumably refereed, and then simply hand-waved through the climate peer review process by his cronies, who perceive scientific skepticism as their enemy.
The scientific method obligates all scientists to do their best to falsify every hypothesis, even their own. This is done mainly through replication. But those climate scientists who have both front feet in the public trough fight the scientific method and scientific skepticism every step of the way.
As we are well aware here, the climate alarmist clique has learned how to game the system for their own personal benefit. They feather their own nests by lining their pockets with undeserved taxpayer money when they refuse to abide by the scientific method; instead they adopt a siege mentality against skepticism in place of the required scientific cooperation.
Bravo. The BEST apologia yet.
DAMN worth repeating.
Chris
Norfolk, VA, USA
Maybe we should develop our own simplified, easily revisable, tentative, “top-down” climate model that could be continually adjusted whenever the data jumped the error bars.
As a matter of fact, I think I may do just that . . .
I designed a Civil War game (Blue Vs Gray, GMT Games), which was storyboarded down to the card pick, step loss, and die roll. (If you play wargames you know how rare that is.) So I have a little hands-on experience in relatively simple modeling of immeasurably complex issues.
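A minimal, purely hypothetical sketch of the kind of continually adjusted “top-down” model floated above: a simple linear trend that gets re-fit whenever a new observation jumps outside its current error bars (made-up data; nobody’s published method):

```python
# Purely hypothetical sketch: re-fit a trend whenever an observation falls
# outside the model's current error bars.
import numpy as np

def fit_trend(years, temps):
    """Least-squares linear trend plus a 2-sigma error bar on the residuals."""
    slope, intercept = np.polyfit(years, temps, 1)
    resid = temps - (slope * years + intercept)
    return slope, intercept, 2.0 * resid.std(ddof=1)

# Made-up history: a 0.015 C/yr trend plus noise
years = np.arange(1980, 2000, dtype=float)
temps = 0.015 * (years - 1980) + np.random.default_rng(0).normal(0, 0.1, years.size)
slope, intercept, bar = fit_trend(years, temps)

for yr, obs in [(2000.0, 0.35), (2001.0, 0.90)]:      # made-up new observations
    predicted = slope * yr + intercept
    if abs(obs - predicted) > bar:                     # data jumped the error bars
        years = np.append(years, yr)
        temps = np.append(temps, obs)
        slope, intercept, bar = fit_trend(years, temps)  # revise the model
        print(f"{yr:.0f}: outside error bars, model re-fit")
    else:
        print(f"{yr:.0f}: within error bars, model kept")
```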
“So I have a little hands-on experience in relatively simple modeling of immeasurably complex issues.” Evan
Cool. I love modeling myself. My current model is an Excel model of a new non-fractional-reserve banking system. It is honest and profitable. One day, maybe it will be made into “Sim-Banker.”
Evan… that is a great idea… Mike
Why write a book when GIGO says it all?
Ron de Haan (15:00:08) :
I’m with you all the way on this one, Ron.
Silly question: if they are saying that the uncertainty is greater than first calculated, does that mean temperatures may be lower than models predict? It seems the article is trying to say the temperatures will be so much higher and forgets to say they may be so much lower. Why can’t they admit the uncertainty is too great for them to predict anything at all? My crystal ball works real well though. Send me a buck and I’ll tell you the high temperature of July 4, 2109. If I’m wrong, I’ll refund your money. 🙂
I got it. You take today’s stock market close. Tomorrow should be within 10% at the most, under most circumstances. You can run that a year down the road, and Oct 22, 2010 will be between 9,000 and 11,000, provided nothing really bad happens or someone hasn’t re-invented the Bubble.
Go out 10 years, and you could be looking at 5,000 or 15,000.
In 20 years, you could be near zero or 20,000.
All of this assumes no outliers. The things that turn the game upside down.
In 40 years, you could have crashed and rebuilt, and the error bars start converging, adjusting for inflation.
Climate should be, and most often is, self-correcting. In 20 years, we may be back to pre-1900 conditions, and in 40 years we could be back to the 1800s. Or it could end up being another Medieval Warm Period, with balmy temps.
Or it could cool & cool & cool, like what happened after the MWP.
The danger ahead is that some folks have it in their heads to monkey with that natural process, and thereby rob and imperil our ability to adapt to natural change, citing man’s ability to ruin the climate. Well, if you set out to radically alter it at a global and abrupt scale, that is playing with fire in my book.
Correctly predicting the sequence of climate cycles ahead is nigh impossible.
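The stock-index analogy above is essentially a random walk; a minimal sketch with made-up volatility (no drift, not a market forecast) shows how the plausible range widens roughly with the square root of the horizon:

```python
# Minimal random-walk sketch (made-up numbers): the 95% range widens with
# the square root of the forecast horizon.
import math

start = 10_000.0        # hypothetical index level today
daily_sigma = 0.01      # hypothetical 1% daily volatility, no drift

for years in (1, 10, 20, 40):
    sigma_total = daily_sigma * math.sqrt(252 * years)   # spread grows with sqrt(time)
    lo = start * math.exp(-1.96 * sigma_total)            # rough 95% band
    hi = start * math.exp(+1.96 * sigma_total)
    print(f"{years:>2} yr: roughly {lo:,.0f} to {hi:,.0f}")
```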
dearieme (14:57:29) :
“The team computed uncertainty by comparing model outcomes with historical climate data from the period 2000-2007.” Is 2000-2007 a cherry?
That it is cherry-picking is a certainty, 100%.
[no profanity or implied profanity ~ ctm]
[no profanity or implied profanity ~ ctm]
Correction: WELL SAID (lol)
Reply: Wow, you anticipated my reaction in advance. Kudos ~ ctm
evanmjones (20:10:02) :
You’ll be a smash hit with that. The sophistication of those board wargames has grown exponentially since the first AH boardgame release. Tactics II was the name, if I remember right.
I guess I’m first with:
It is worse than we thought.
Heh.
If you look carefully at the references to this paper, most are at least two years old, and the reference to the data used in the ‘reanalysis’ is in fact eight years old. Is the data more recent than that?
This sort of paper is like a group of Biblical scholars reanalysing the Gospels: even though the source information has either been lost or mangled in translation, they find that, although the texts are somewhat inconsistent in their telling of the life of Christ, they are close enough. And from this the scholars find that the apocalyptic visions of the Book of Revelation are a valid view of the future.
Replace the appropriate words in the above and it is almost an Abstract for a peer reviewed paper on Climate Science.
Re: Paddy (18:24:29)
We all have a responsibility to oppose what you describe like our lives depend on it (…and some will legitimately debate whether I should be using the word ‘like’). The sustainable defense of civilization cannot be sacrificed to bad assumptions (something completely within our control).
“If the range of uncertainty is greater than previously assumed then cooler temperatures could be in the offing too.”
No, the models are programmed to create warming; the only variability is in the amount of warming. Any model parameters that create cooling are rejected. CO2 = warming is a central assumption coded in, so any increase in CO2 will always produce warming over the long term in the model’s output; that is what they are programmed to produce.
Bob Koss (15:34:02)
You mustn’t be too hard on those models. It is very difficult to get them to give both a temperature increase (AGW) and a global temperature that looks plausible. Since the AGW is the important thing, that is what they tweak the models for. Many of them model global temperatures that are way off target. That’s the reason you always see temperature anomalies displayed, not absolute temperatures.
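A tiny illustration of that last point with made-up numbers: a model with a large absolute-temperature bias can still show the same anomaly once each series is referenced to its own baseline, which is one reason anomalies rather than absolute temperatures get displayed:

```python
# Tiny illustration (made-up numbers): an absolute bias cancels out of anomalies.
import numpy as np

observed = np.array([14.0, 14.1, 14.3, 14.4])   # hypothetical absolute deg C
model    = observed + 1.5                        # same trend, +1.5 C bias

obs_anom   = observed - observed[:2].mean()      # anomaly vs own early baseline
model_anom = model - model[:2].mean()

print(np.allclose(obs_anom, model_anom))         # True: the bias cancels out
```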
What Ron de Haan said. After that first statement, my bs detector prevented me from reading further.