National Academy of Sciences: climate models still 'decades away' from being useful

Climate Model: resolution still too coarse to provide useful predictions
From the National Academy of Sciences report A National Strategy for Advancing Climate Modeling:

Computer models that simulate the climate are an integral part of providing climate information, in particular for future changes in the climate. Overall, climate modeling has made enormous progress in the past several decades, but meeting the information needs of users will require further advances in the coming decades.

The fundamental science of greenhouse gas-induced climate change is simple and compelling. However, genuine and important uncertainties remain (e.g., the response of clouds, ecosystems, and the polar regions) and need to be considered in developing scientifically based strategies for societal response to climate change.

Description:

As climate change has pushed climate patterns outside of historic norms, the need for detailed projections is growing across all sectors, including agriculture, insurance, and emergency preparedness planning. A National Strategy for Advancing Climate Modeling finds that climate models must evolve substantially in order to deliver climate projections at the scale and level of detail desired by decision makers. Despite much recent progress in developing reliable climate models, there are still efficiencies to be gained across the large and diverse U.S. climate modeling community. Evolving to a more unified climate modeling enterprise – in particular, by developing a common software infrastructure shared by all climate researchers and holding an annual climate modeling forum – could help speed progress.

Throughout this report, several recommendations and guidelines are outlined to accelerate progress in climate modeling. The U.S. supports several climate models, each conceptually similar but with components assembled under slightly different software and data output standards. If all U.S. climate models employed a single software system, it could simplify testing and migration to new computing hardware, and allow scientists to compare and interchange climate model components, such as land surface or ocean models. A National Strategy for Advancing Climate Modeling also recommends that an annual U.S. climate modeling forum be held to help bring the nation’s diverse modeling communities together with the users of climate data. This would give climate model data users an opportunity to learn about the strengths and limitations of models and to give modelers input on their needs; it would also provide a venue for discussing priorities for the national modeling enterprise and bring disparate climate science communities together to design common modeling experiments.
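
To make the interchangeability idea concrete, here is a minimal sketch of how a shared component interface would let a coupler treat modules from different centers identically. This is written in Python with entirely hypothetical names; it does not depict any actual modeling framework's API.

    from abc import ABC, abstractmethod

    class ClimateComponent(ABC):
        """One piece of a coupled model (ocean, land, atmosphere, ...)."""
        @abstractmethod
        def step(self, state: dict, dt_seconds: float) -> dict:
            """Advance this component's fields by one time step."""

    class ToyOcean(ClimateComponent):
        def step(self, state, dt_seconds):
            # Toy placeholder physics: relax sea-surface temperature toward
            # the overlying air temperature over roughly 100 days.
            sst, air = state["sst"], state["air_temp"]
            state["sst"] = sst + (air - sst) * dt_seconds / (100 * 86400.0)
            return state

    def run_coupled(components, state, dt_seconds, n_steps):
        # The coupler never needs to know which center wrote which module;
        # anything honoring the step() contract can be swapped in.
        for _ in range(n_steps):
            for comp in components:
                state = comp.step(state, dt_seconds)
        return state

    final = run_coupled([ToyOcean()], {"sst": 288.0, "air_temp": 290.0},
                        dt_seconds=86400.0, n_steps=30)

The point is only that a common step() contract, plus shared standards for the state being passed around, is what would allow centers to exchange land, ocean, or atmosphere components without rewriting their couplers.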

In addition, A National Strategy for Advancing Climate Modeling explains that U.S. climate modelers will need to address an expanding breadth of scientific problems while striving to make predictions and projections more accurate. Progress toward this goal can be made through a combination of increasing model resolution, advances in observations, improved model physics, and more complete representations of the Earth system. To address the computing needs of the climate modeling community, the report suggests a two-pronged approach that involves the continued use and upgrading of existing climate-dedicated computing resources at modeling centers, together with research on how to effectively exploit the more complex computer hardware systems expected over the next 10 to 20 years.

Read the report.

h/t to Steve Milloy of junkscience.com

 

See also this video from Bob Tisdale:  A Video Preview of “Climate Models Fail”

 

rogerknights
September 13, 2013 6:57 pm

“The fundamental science of greenhouse gas-induced climate change is simple and compelling.”

To a simple-minded compulsive.

Nick Stokes
September 13, 2013 7:16 pm

It’s an interesting report. They aren’t, of course, saying the models are “decades away from being useful”; the quote was “meeting the information needs of users will require further advances in the coming decades”.
What they are saying is that a whole lot more people should be using them and need to be able to use them. Their big proposal is for a common software interface so that the models could become like consumer products. They also contemplate a network of trained model interpreters. These are not the recommendations of people who think models are useless.
Their list of four main recommendations is:
“1. Evolve to a common national software infrastructure that supports a diverse hierarchy of different models…
2. Convene an annual climate modeling forum that promotes tighter coordination…
3. Nurture a unified weather-climate modeling effort that better exploits the synergies between weather forecasting, data assimilation, and climate modeling;
4. Develop training, accreditation, and continuing education for “climate interpreters”…”

It’s not saying the models are useless – it’s about how to get them used more.
Here’s a quote from the report that you might like:
“Over the next several decades climate change and its myriad consequences will be further unfolding and likely accelerating (NRC, 2011a). Probable impacts from climate change, including sea-level rise, a seasonally ice-free Arctic, large-scale ecosystem changes, regional droughts, and intense flooding events, will increase demand for climate information. The value of this climate information is large. One of the more prominent places to see this is through the impacts of extreme climate and weather events; extreme climate and weather events are one of the leading causes of economic and human losses, with total losses between 1980 and 2009 exceeding $700 billion (NCDC, 2010) and damages from more than 14 weather- and climate-related disasters totaling more than $50 billion in 2011 alone. Climate change is affecting the occurrence of and impacts from extreme events, such that the past is not necessarily a reliable guide for the future, which further underscores the value of climate information in the future.”

david eisenstadt
September 13, 2013 7:22 pm

Clearly the models aren’t useless… many people make their livings from maintaining them… not so many make a living betting on their accuracy, however.

AndyG55
September 13, 2013 7:24 pm

“It’s not saying the models are useless”
Well it b******y well should be saying that.
And you KNOW it !!

R2Dtoo
September 13, 2013 7:32 pm

This reads as nothing more than part of your President’s doubling-down proposition. The logical thing to do is to accept the “pause” as buying time, and redirect funds to research that concentrates on defining the known critical “unknowns”. If this were done for ten years, the computer nerds might have some useful data to put into the models. As it is now, the failure of the models implies that the whole scheme is nothing more than a big group of ex-Pac-Man geeks playing games. This policy, if followed, will guarantee that the whole cadre of government-controlled scientists and purchased academics can go on playing their games. Perhaps Mosher can let us in on why he thinks they are listening – the premises all seem the same, and the same actors will receive continued funding. Nothing new here.

Graham
September 13, 2013 7:51 pm

Meanwhile, it’s a case of “Just gimme the money anyway.”

gerrydorrian66
September 13, 2013 8:16 pm

So basically the NAS has nothing of substance to say to this generation on climate change, and may not have for the next generation?

gallopingcamel
September 13, 2013 8:18 pm

One of the major problems for climate skeptics is that many accept the idea that CO2 drives temperature even though “Hard Science” shows the exact opposite. The admirable Richard Lindzen is guilty of this fault when he debates the exact value of the “Climate Sensitivity Constant”.
There is a statistically significant correlation between CO2 and the average global temperature if you look at the period from 1850 to 1998. Even in that timeframe it is hard to explain why temperatures fell during decades when CO2 was rising steadily. Since 1998, temperatures have fallen while CO2 concentrations rose faster. “Global Warming” fanatics are now denying this reality. They have become “Deniers of Reality”.
Mother Nature has a wicked sense of humor. In a few weeks the IPCC will publish its fifth Assessment Report (AR5), which will be eerily similar to its fourth (AR4). AR5 will be so absurd that there won’t be an AR6 in 2020. Ouch!

Nick Condell
September 13, 2013 8:24 pm

Gavin Schmidt has left an extraordinary piece, “On mismatches between models and observations”, at RealClimate, presumably in his capacity as an expert commentator on how science should treat testability of predictions. His comments (in response to the first comment, which was critical of model use for advocacy) are delusional: “If you think that policies are being made based on exact numbers coming from a climate model, I’d have to ask for some evidence. Polices are being made (or at least considered) on the strongly evidence based premise that climate sensitivity is non-negligible, but that conclusion doesn’t depend on models as much as paleo-climate and so is unlikely to change.”
My mouth is still open in awe at the audacity.

TalentKeyHole Mole
September 13, 2013 8:24 pm

Oh Dear.
Perhaps, and this is a prediction, in as little as 20 years hence ‘Climate Science’ will be recognized as a branch of Numerology.
The impact of this is that ‘Climate’ other than the word Climate used in such late night or early morning (depending on the time zone) ruminations as “The Climate of Portugal”, or the “Climate of Fear”, or the “Climate of ‘I Love Lucy’ In The Post World War II America In The Early Years Of The Cold War And Other Explorations In The Context Of Galois Group Theory” will be recognized as not having anything to do with Physics nor the Physics Of Meteorology nor Weather.
Happy Friday the 13th, 2013, to Climate Scientists; you deserve it, whatever “it” may manifest itself before you in your wakeful or dreamful state or altered state of being at the time being, be it as it may.
😉

Mark Luhman
September 13, 2013 8:24 pm

Not possible; no model can model a chaotic system. The weather models are somewhat accurate three days out, and they are rerun every four hours; they will never be much better than a guess six days out, so climate models have no chance of ever being right. In the eighties much money was spent by investment companies trying to model the stock market, again a chaotic system and not possible. They gave up since there was no return on the money. Of course the climate modelers are not limited by market forces, so the climate model farce continues.

Robert of Ottawa
September 13, 2013 8:30 pm

Seems like a plea for more megabucks for even more megacomputers.

Robert of Ottawa
September 13, 2013 8:35 pm

Jay Davis rhetorically asks on September 13, 2013 at 4:23 pm:
“This question needs to be answered – If they are still decades away from being useful, why then are they being used to tax us and price electricity out of reach of the common man?”
Money, my boy, money. If they had a working and accurate model, they would no longer need climate modelers. Whoever “they” is.

cwon14
September 13, 2013 8:39 pm

The models are based on political agenda, it’s insulting to pretend otherwise.

John Mason
September 13, 2013 8:46 pm

Starts with an article of faith:
“As climate change has pushed climate patterns outside of historic norms”
…which is contrary to any facts.
sigh…

Robert of Ottawa
September 13, 2013 8:47 pm

gallopingcamel, understand that the only reason that the warmistas can even make a case is because of some imaginary, and ridiculous, positive feedback.

Robert of Ottawa
September 13, 2013 8:55 pm

Nick Stokes September 13, 2013 at 7:16 pm
the quote was “meeting the information needs of users will require further advances in the coming decades”.
i.e. providing the “scientific results” that their political masters require. Lysenko anyone?

Reed Coray
September 13, 2013 9:04 pm

As one dyslexic said to the other: “NSA / NAS, one of six, half-dozen of the other–big brother is watching out for us. Just send money; and don’t ask what’s behind the curtain.”

jbird
September 13, 2013 9:07 pm

Huh? It’s hard to know exactly what this means, but I’ll take a stab:
Our climate models have proven useless to project future climate trends, but we still believe in human caused climate change. Please give us more money so that we can improve our models’ abilities to predict the future climate.

Geoff
September 13, 2013 9:11 pm

Nick Stokes,
You mentioned sea level rise.
You are better than I am with numbers.
Would you care to study this recent paper, whose abstract includes –
“Although mean sea levels are rising by 1mm/year, sea level rise is local rather than global, and is concentrated in the Baltic and Adriatic seas, South East Asia and the Atlantic coast of the United States. In these locations, covering 35 percent of tide gauges, sea levels rose on average by 3.8mm/year. Sea levels were stable in locations covered by 61 percent of tide gauges, and sea levels fell in locations covered by 4 percent of tide gauges. In these locations sea levels fell on average by almost 6mm/year.”
http://pluto.mscc.huji.ac.il/~msdfels/wpapers/Tide%20gauge%20location.pdf
I could not find objections to it. If you cannot, you might have to accept the possibility that ocean rise wrt land is now pegged at 1 mm per year from historic data in PSMSL. The authors did not report evidence of acceleration in their chosen study period of 1900 to 2000.
Note, from their conclusions, “We refer to this as the “conservative methodology” because it avoids the use of data reconstructions.”
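As a rough consistency check on those abstract figures (my arithmetic, not the paper’s), weighting the three groups of gauges by their shares gives

    0.35 × 3.8 + 0.61 × 0 − 0.04 × 6 ≈ 1.33 − 0.24 ≈ 1.1 mm/year

which is close to the ~1 mm/year global mean the authors quote, so the numbers at least hang together internally.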

September 13, 2013 9:15 pm

It is worth remembering that the need to invoke a GHG mechanism arose because the models differed from reality and the difference needed an explanation.
If the models were wrong in certain ways… no GHG needed.
Still looking for delta T = fn(GHG concentration). All I can find is “the physics since year dot has shown that light passing through CO2 will cause heating”. The early experiments confined the gas and massively interfered with heat transport, so their findings are not immediately applicable to atmospheric conditions. Is there a quantitative paper about this heat?

September 13, 2013 9:27 pm

Reblogged this on wwlee4411 and commented:
Notice: DECADES!

September 13, 2013 9:30 pm

Now is an important time for a critical review of models. The pause of the past 20 years or so provides a study period in which many confounding variables are held by Nature more constant than at other times of rapid change in one direction or another.
For example, tree rings for the last decade or two should be essentially unresponsive if a near-constant local temperature is the dominant driver of their characteristics. We need more ground truth studies of concepts like this before we launch into complex forward models.
The proposal to group the modellers more closely is perversely wrong. At a time like this, the best strategy is to separate the modellers further and allow them to search for their own parameters – starting conditions, coefficients to weight variables, and so on. Hopefully, one or more independent models will then match measured conditions better. There is no point in massively parallel approaches if there is a fundamental flaw carried by all models. Having 2 modellers working in parallel is nearly as good as having 20 in the sense of avoiding/finding small mistakes in the quality control regime.

Nick Stokes
September 13, 2013 9:35 pm

Geoff,
On sea level, you’ve linked to an unpublished paper by Beenstock and Reingewertz et al. These are the economists who brought us the proof that AGW can’t exist because of polynomial cointegration.
It’s a long ramble about alleged non-random placement of tide gauges. I thought their cointegration stuff was very bad, and have no enthusiasm for delving into this. They ignore the more important contributions to modern MSL measurements from satellites. And it’s purely statistical – they make no effort to sort out the effect of land movement.
“It is worth remembering that the need to invoke a GHG mechanism arose because the models differed from reality and the difference needed an explanation.”
The GHG mechanism goes back to at least Arrhenius in 1896. He wasn’t bothered by model discrepancies.
“Still looking for delta T = fn(GHG concentration). “
Climate sensitivity is the linear version.
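
For readers wondering what that “linear version” looks like: the standard textbook convention (a general relation, not anything specific to this report or thread) combines a logarithmic forcing fit with a linear temperature response,

    ΔF = 5.35 ln(C/C₀) W/m²      (the Myhre et al. 1998 fit)
    ΔT ≈ λ ΔF

so doubling CO₂ gives ΔF ≈ 5.35 × ln 2 ≈ 3.7 W/m², and the quantity actually in dispute is the feedback parameter λ – equivalently, the equilibrium warming per doubling.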

george e. smith
September 13, 2013 9:39 pm

So onto the trash heap of history with them, and get some data instead. How many of these years away from working models is Peter Humbug responsible for?