(From PhysOrg.com, h/t to Leif Svalgaard) – Climate scientists recognize that climate modeling projections include a significant level of uncertainty. A team of researchers using computing facilities at Oak Ridge National Laboratory has identified a new method for quantifying this uncertainty.

The new approach suggests that the range of uncertainty in climate projections may be greater than previously assumed. One consequence is the possibility of greater warming and more heat waves later in the century under the Intergovernmental Panel on Climate Change’s (IPCC) high fossil fuel use scenario.
The team performed an ensemble of computer “runs” using one of the most comprehensive climate models – the Community Climate System Model version 3, developed by the National Center for Atmospheric Research (NCAR) – on each of three IPCC scenarios. The first IPCC scenario, known as A1FI, assumes high global economic growth and continued heavy reliance on fossil fuels for the remainder of the century. The second scenario, known as B1, assumes a major move away from fossil fuels toward alternative and renewable energy as the century progresses. The third scenario, known as A2, is a middling scenario, with less even economic growth and some adoption of alternative and renewable energy sources as the century unfolds.
The team computed uncertainty by comparing model outcomes with historical climate data from the period 2000-2007. Models run on historical periods typically depart from the actual weather data recorded for those time spans. The team used statistical methods to develop a range of temperature variance for each of the three scenarios, based on their departure from actual historical data.
The approach’s outcome is roughly similar to the National Weather Service’s computer predictions of a hurricane’s path, familiar to TV viewers. There is typically a dark line on the weather map showing the hurricane’s predicted path over the next few days, and there is a gray or colored area to either side of the line showing how the hurricane may diverge from the predicted path, within a certain level of probability. The ORNL team developed a similar range of variance–technically known as “error bars”–for each of the scenarios.
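To make the idea concrete (this is not the authors' code, and every number below is invented): one can estimate the spread of model-minus-observation departures over a hindcast window, then attach that spread to a projection as a plus-or-minus band, hurricane-cone style. A minimal Python sketch:

import numpy as np

# Hypothetical annual-mean temperatures (deg C) for a 2000-2007 hindcast
obs = np.array([14.40, 14.52, 14.61, 14.59, 14.54, 14.65, 14.60, 14.62])
mod = np.array([14.35, 14.60, 14.55, 14.70, 14.48, 14.72, 14.66, 14.58])

residuals = mod - obs                 # hindcast departures from observations
sigma = residuals.std(ddof=1)         # spread of the departures

# Attach the spread to a hypothetical projected trajectory as a ~95% band
projection = np.array([14.7, 15.1, 15.6])   # e.g. 2000s, 2050s, 2090s means
lower, upper = projection - 1.96 * sigma, projection + 1.96 * sigma
print(round(float(sigma), 3), list(zip(lower.round(2), upper.round(2))))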
Using resources at ORNL’s Leadership Computing Facility, the team then performed ensemble runs on three decade-long periods at the beginning, middle, and end of the twenty-first century (2000-2009, 2045-2055, and 2090-2099) to get a sense of how the scenarios would unfold over the course of the century.
Interestingly, when the variance or “error bars” are taken into account, there is no statistically significant difference between the projected temperatures resulting from the high fossil fuel A1FI scenario and the middling A2 scenario up through 2050. That is, the A1FI and A2 error bars overlap. After 2050, however, the A1FI range of temperature projections rises above that of A2, until the two begin to overlap again toward the century’s end.
Typically climate scientists have understood the range of uncertainty in projections to be the variance between high and low scenarios. But when the error bars are added in, the range between high and low possibilities actually widens, indicating greater uncertainty.
“We found that the uncertainties obtained when we compare model simulations with observations are significantly larger than what the ensemble bounds would appear to suggest,” said ORNL’s Auroop R. Ganguly, the study’s lead author.
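In rough terms, the overlap test amounts to asking whether two scenarios’ mean-plus-or-minus-error-bar intervals intersect in a given decade. A minimal sketch with invented decadal means and an invented observation-based sigma (again, not the authors' code):

def interval(mean, sigma, z=1.96):
    # error bar as a (low, high) pair around a decadal mean
    return (mean - z * sigma, mean + z * sigma)

def overlaps(a, b):
    # two intervals overlap if neither lies entirely above the other
    return a[0] <= b[1] and b[0] <= a[1]

sigma_obs = 0.35                       # hypothetical observation-based uncertainty (deg C)
a1fi_2050 = interval(15.6, sigma_obs)  # hypothetical A1FI decadal mean
a2_2050 = interval(15.3, sigma_obs)    # hypothetical A2 decadal mean
print("A1FI vs A2 distinguishable at mid-century:", not overlaps(a1fi_2050, a2_2050))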
In addition, the error bars in the A1FI scenario suggest at least the possibility of even higher temperatures and more heat waves after 2050, if fossil fuel use is not curtailed.
The team also looked at regional effects and found large geographical variability under the various scenarios. The findings reinforce the IPCC’s call for greater focus on regional climate studies in an effort to understand specific impacts and develop strategies for mitigation of and adaptation to climate change.
The study was published in the Proceedings of the National Academy of Sciences. Co-authors include Marcia Branstetter, John Drake, David Erickson, Esther Parish, Nagendra Singh, and Karsten Steinhaeuser of ORNL, and Lawrence Buja of NCAR. Funding for the work was provided by ORNL’s new cross-cutting initiative called Understanding Climate Change Impacts through the Laboratory Directed Research and Development program.
More information: The paper can be accessed electronically here: http://www.pnas.org/content/106/37/15555
Well, it took about 5 minutes of reading the SI to find an interesting thing.
A quick scan through the charts and the comparison of models to “observations” led to this. When they want to look at the bias of the models (to test for things like normality with Q-Q plots), they look at the difference between the models and NCEP “reanalysis data.” Looking at the charts of “reanalysis” versus the models, it struck me that the reanalysis wasn’t observation data at all. So, if you read through the SI, you will find this toward the end: reanalysis data is….. you guessed it.. output from a model. Now, that doesn’t make it wrong on its face, but I’ll see if Lucia wants to take it up.
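For what it’s worth, here is a minimal sketch of the kind of Q-Q check being described, with synthetic numbers standing in for the model and NCEP reanalysis fields (nothing here is from the paper):

import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# synthetic stand-in for a model-minus-reanalysis temperature difference field
bias = rng.normal(loc=0.2, scale=0.5, size=500)

stats.probplot(bias, dist="norm", plot=plt)   # Q-Q plot against a normal distribution
plt.title("Q-Q plot: model minus reanalysis (synthetic)")
plt.show()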
If the models are judged more uncertain, then they are less likely to be right, so this says even less about what will happen. Additionally, they say “more uncertain than assumed.” Is that not like saying the answer was assumed to be hot, horrible, and dangerous, and now we aren’t so sure of our assumption? I’ve assumed the models are useless, and I seem to be getting more confidence that they are useless.
I know I’m late to the party……but you can’t calculate real uncertainty by comparing model runs. All they’ve really done is calculate the instability of their computational method.
And in related news, scientists can now confidently say that the Kansas City Royals will not play in this year’s World Series.
But seriously, now with more uncertainty, it’s even more likely that temperature measurements will not be inconsistent with the models.
Oh WOW this is amazing. I mean just when you think some branches of mathematical modelling couldn’t be pushed further away from the scientific method they come and surprise you with something like this. So when a mathematical model doesn’t predict observed data, then that indicates uncertainty in the model. It is then (apparently) reasonable to conclude from this uncertainty that you should be concerned with your proposed mechanism being more dominant than you previously imagined. So the more your model doesn’t fit the data, the more you should be concerned about your proposed mechanism being dominant. Incredible! And all this time here’s me, Johnny Chumpo over here thinking my proposed mechanism was just WRONG!
And the error bars on the ‘data’??
Headline – IPCC Climate Model Uncertainty Greater Than Expected – Ice Age Possible By End Of Century
Climate scientists recognize that climate modeling projections include a significant level of uncertainty. A team of researchers using computing facilities at Oak Ridge National Laboratory has identified a new method for quantifying this uncertainty. One consequence is the possibility of greater cooling and more cold waves later in the century under the Intergovernmental Panel on Climate Change’s (IPCC) high fossil fuel use scenario.
Sound familiar???
I don’t think it’s fraud. If it was they’d at least try to make their story believable.
Steven Mosher (15:58:58) :
Well, it took about 5 minutes of reading the SI to find an interesting thing.
Great catch Mosher.
Is that the same re-analysis data that previously touched off a thread?
So, what’s the uncertainty of the degree of uncertainty? Can they predict the uncertainty of the uncertainty with a high degree of uncertainty? Uncertainly they can!
Though this be madness, yet there is method in’t.
–William Shakespeare, Hamlet
I love it! So predictable – the uncertainty is only to the warm side. What about uncertainty to the low side? No chance of that? I’ll take that bet. Looks like yet another Copenhagen run-up paper. What will these people do after Copenhagen (especially when they come home with a hollow agreement at best)?
OT – 5.5 in. of snow here at the house in the foothills of SW Denver. GFS showing more snow next Tuesday and on Halloween (it always snows on Halloween here).
A “modest proposal” – why not scrap the current models, which obviously don’t work, and start over again using all the known variables instead of relying on CO2? BTW, this is a mostly rhetorical question, although I don’t understand why no one has tried this.
For a period I worked as a consultant assessing the prospects of ultimate success of computer software projects of similar scale to the climate models.
The individuals who had worked on the projects had enormous psychological investment in their systems and would go to great lengths to ensure the systems’ continued development.
All evidence presented to them that their system was unrecoverably flawed and had little or no chance of ultimate success was rationalized away or avoided. Often combined with the kind of ad hominem attacks we are so familiar with in climate circles.
I have little doubt the climate models are irretrievably flawed and should be scrapped, and the development process started again from scratch under a public and transparent process.
I am also certain that this will never happen if the modellers are left to make the decision themselves.
What will likely happen is that one or more open source climate models already under development will produce better predictions and make the HadCRU, etc., models irrelevant. However, this will take a few years.
“Anyone who claims to have a reliable climate model is a fraud serving the First Global Revolution.”
I have one. I call it Earth. It is 100% accurate at modeling the current state of the climate. Sorry, no forward-looking capability.
A model of a model and a paper that investigates the data from this model of a model as if it were a thermometer. Reminds me of a book. Anybody read Fahrenheit 451? Remember the virtual family scenes projected on the walls? These models of family life became reality. Now I don’t know which is stranger, reality or fantasy.
Just more GIGO
Uncertainty quantification depends on assumptions.
As I recall, the risk assessment models used by AIG for credit default swaps failed because the modelers obviously could not identify or provide for unknown unknowns. This is the disaster that brought the company down when one or more of them occurred.
Isn’t this AIG all over again? How much does this folly with supercomputers cost us?
No GCM [computer climate model] predicted the flat to declining global temperatures for most of the past decade, just as no computer financial model predicted the severity and extent of the global economic decline over the past year. And the universe of stocks is much simpler than the individual entities affecting the climate.
The only truly successful financial computers are those that take advantage of arbitrage in the microsecond delays between buy/sell orders and execution. But there are no truly successful computer financial models that accurately predict future stock prices.
What is ignored by the faction of the climate community engaged in selling AGW is the fact that computer models are not data. Models are simply a tool. And they are not a very good tool for predicting the climate. They are not evidence, although this paper typically implies that they are.
Empirical evidence is raw, verifiable data. Arguments can be made for adjusting the raw data. And arguments can be made for using different methodologies and computer algorithms in the models. But none of these are evidence.
The problem emerges when the people arguing the case that computer models can be tweaked sufficiently to predict the future refuse to disclose their raw data and their specific methodology, thus making replication and falsification of their hypothetical claims almost impossible. They are uncooperative because they are hiding something.
The scientific method requires that all information resulting in a new hypothesis must be promptly made available to anyone who requests it. Yet the AGW contingent disrespects and ignores the scientific method.
The undeniable fact that certain climate scientists of the CO2=AGW persuasion are still deliberately withholding the data and methodologies that they claim shows that a tiny trace gas will cause runaway global warming means that they are engaging in self-serving propaganda, not science.
Anyone withholding any such information should be barred from receiving further public funding of any kind, because they are deliberately running a scam to defraud the public, nothing less. For example, Michael Mann, of the debunked UN/IPCC hockey stick, still refuses to cooperate with other scientists.
Mann stonewalls requests for his data. And when his methodology is finally teased out of his results, it turns out that not only is Mann devious, but he is incompetent as well, as is shown in his paper published with upside-down Tiljander sediments — which was presumably refereed, and then simply hand-waved through the climate peer review process by his cronies, who perceive scientific skepticism as their enemy.
The scientific method obligates all scientists to do their best to falsify every hypothesis, even the scientists who proposed the hypothesis in the first place. This is done mainly through replication. But those climate scientists who have both front feet in the public trough fight the scientific method and scientific skepticism every step of the way.
As we are well aware here, the climate alarmist clique has learned how to game the system for their own personal benefit. They feather their own nests by lining their pockets with undeserved taxpayer money when they refuse to abide by the scientific method; instead they adopt a siege mentality against skepticism in place of the required scientific cooperation.
Smokey,
Great comment! Here is one reason the scientific method has been thrown out the door, IMO: The AGW side reasons: “What is the harm if we are wrong? The rich countries will “waste” less and the poor countries will be compensated.”
They are clueless with regard to economics.
“Convenient Lies” is the order of the day. Now they are terrified of fessing up, is my guess.
Infinite improbability drive? I’m just saying…
B2B,
Regarding the Copenhagen proposal to compensate ‘poor’ countries like China: click
Smokey,
You are a wonderful throwback. Too bad you are not President. We need another Andrew Jackson badly. (Sorry about the Cherokees, though)
“I have one. I call it Earth. It is 100% accurate at modeling the current state of the climate. Sorry, no forward-looking capability.” Mac
I’m reading the Owner’s manual; it says nothing about not burning carbon.
We need another Andrew Jackson badly. (Sorry about the Cherokees, though)
Two words: pet banks.