National Academy of Sciences: climate models still 'decades away' from being useful

Climate Model: resolution still too coarse to provide useful predictions
From the National Academy of Sciences report A National Strategy for Advancing Climate Modeling:

Computer models that simulate the climate are an integral part of providing climate information, in particular for future changes in the climate. Overall, climate modeling has made enormous progress in the past several decades, but meeting the information needs of users will require further advances in the coming decades.

The fundamental science of greenhouse gas-induced climate change is simple and compelling. However, genuine and important uncertainties remain (e.g., the response of clouds,

ecosystems, and the polar regions) and need to be considered in developing scientifically based strategies for societal response to climate change.

Description:

As climate change has pushed climate patterns outside of historic norms, the need for detailed projections is growing across all sectors, including agriculture, insurance, and emergency preparedness planning. A National Strategy for Advancing Climate Modeling emphasizes the need for climate models to evolve substantially in order to deliver climate projections at the scale and level of detail desired by decision makers. Despite much recent progress in developing reliable climate models, there are still efficiencies to be gained across the large and diverse U.S. climate modeling community. Evolving to a more unified climate modeling enterprise, in particular by developing a common software infrastructure shared by all climate researchers and holding an annual climate modeling forum, could help speed progress.

Throughout this report, several recommendations and guidelines are outlined to accelerate progress in climate modeling. The U.S. supports several climate models, each conceptually similar but with components assembled with slightly different software and data output standards. If all U.S. climate models employed a single software system, it could simplify testing and migration to new computing hardware, and allow scientists to compare and interchange climate model components, such as land surface or ocean models. A National Strategy for Advancing Climate Modeling recommends that an annual U.S. climate modeling forum be held to help bring the nation’s diverse modeling communities together with the users of climate data. This would give climate model data users an opportunity to learn more about the strengths and limitations of models, give modelers input on user needs, provide a venue for discussing priorities for the national modeling enterprise, and bring disparate climate science communities together to design common modeling experiments.

In addition, A National Strategy for Advancing Climate Modeling explains that U.S. climate modelers will need to address an expanding breadth of scientific problems while striving to make predictions and projections more accurate. Progress toward this goal can be made through a combination of increasing model resolution, advances in observations, improved model physics, and more complete representations of the Earth system. To address the computing needs of the climate modeling community, the report suggests a two-pronged approach that involves the continued use and upgrading of existing climate-dedicated computing resources at modeling centers, together with research on how to effectively exploit the more complex computer hardware systems expected over the next 10 to 20 years.

Read the report.

h/t to Steve Milloy of junkscience.com

 

See also this video from Bob Tisdale:  A Video Preview of “Climate Models Fail”

 

Louis
September 13, 2013 10:04 pm

“…genuine and important uncertainties remain (e.g., the response of clouds, ecosystems, and the polar regions) and need to be considered in developing scientifically based strategies for societal response to climate change.”

I see several important concessions in the NAS quote above. First, there is the admission that the science is not settled; some important uncertainties remain. Second, those uncertainties include the response of clouds, ecosystems, and the polar regions. And third, they concede that those uncertainties need to be considered before we develop strategies to respond to climate change. To me, that means we need to fully understand those things before asking the world to make sacrifices that may not be necessary or even helpful. I doubt the authors would openly agree with me, but that’s what I glean from their comments.

u.k.(us)
September 13, 2013 10:05 pm

“The fundamental science of greenhouse gas-induced climate change is simple and compelling. However, genuine and important uncertainties remain (e.g., the response of clouds,
ecosystems, and the polar regions) and need to be considered in developing scientifically based strategies for societal response to climate change.”
=============
Simple, compelling, important uncertainties, then the money shot “societal response”.
This is science?

September 13, 2013 10:30 pm

“There are two different types of people in the world; those who want to know, and those who want to believe.”
~ Friedrich Nietzsche

The people who want to know the truth are the majority of readers of this “Best Science & Technology” site. Conversely, the people who want to ‘believe’ are those relative few who argue with the verifiable facts, and with the Scientific Method, and with Occam’s Razor, and with the climate Null Hypothesis, etc. You know the ones. They are easy to spot.
Winnowing the truth out of the Universe is always an uphill battle. It is never easy, because there are always people who opine, based on their emotions, rather than on verifiable scientific observations.
Thus, changing public opinion is always hard! But it is doable, as we can see here. Public opinion is gradually shifting in favor of scientific truth [as we currently understand it]. Eventually, the public will come to see that the catastrophic AGW scare is simply a grant-fed scam. More than anything, the impact of that false alarm on our tax liability will bring about change. But it is always hard, and communicating the truth is always an uphill battle.

NeilC
September 13, 2013 10:54 pm

“Overall, climate modeling has made enormous progress in the past several decades,” Yup, they didn’t work 30 years ago and they don’t work ‘even better’ today!

dp
September 13, 2013 11:19 pm

Nick – if more people should be using them but don’t, what does that say about the usefulness of the information? I’ll help – the information is not useful and so goes unused. I suspect too that nobody today wants to have their career stained by referencing anything coming from the consensus-believing gatekeepers. That stuff is so over.

Manniac
September 13, 2013 11:28 pm

The models need to gather more wood and faster.

Latimer Alder
September 13, 2013 11:35 pm

So exactly what did we – the public – gain by spending $50+ billion on models that still aren’t fit for purpose after 30 years?

richardscourtney
September 14, 2013 12:28 am

Friends:
The IPCC AR5 is due.
It is now clear that the climate models are useless. As the NAS Report says

the needs for climate models to evolve substantially in order to deliver climate projections at the scale and level of detail desired by decision makers

That needs to be shouted whenever the AR5 Summary For Policymakers is mentioned.
I peer reviewed the IPCC AR4. My review comments were ignored.
The IPCC Chairman asked me to review the Synthesis Report but I refused because I saw no reason to do that when the IPCC had ignored my work on the AR4. And I did not review the AR5.
My AR4 review comments included strong objections to claims that the models had “improved” and “have been improved”. The supposed “improvements” were additions of complexity and detail. I pointed out that an improved model provided more accurate and/or more precise and/or more confident predictions. Addition of complexity and detail does NOT of itself certainly “improve” a model and may damage the model, and there was no evidence that the additions had “improved” the model.
The IPCC ignored those comments and the erroneous claim of model “improvements” remained in the AR4. Similar comments in the AR5 need to be ridiculed.
Richard

richardscourtney
September 14, 2013 12:33 am

OOps! I intended to write
Similar claims in the AR5 need to be ridiculed.
Sorry
Richard

Nick Stokes
September 14, 2013 1:12 am

dp says: September 13, 2013 at 11:19 pm
“Nick – if more people should be using them but don’t what does that say about the usefulness of the information?”

Nothing. It just means that they can’t. As set up the models are (mostly) hard to run outside their home institution. The NAS report has recommendations for changing that.

Jon
September 14, 2013 1:16 am

“National Academy of Sciences: climate models still ‘decades away’ from being useful”
Yep!
Climate models that are based on the politically established UNFCCC are useful today to the UNFCCC?
In other words, climate models will lose their political usefulness the day they become scientifically useful?

Scottish Sceptic
September 14, 2013 1:26 am

Decades … more like millennia!!
It is a well quoted maxim that climate is the same basic physics as weather, just at a different scale. This is meant to suggest that weather models can predict the climate merely by scaling them up. However, anyone familiar with learning curves knows what it really tells us: climate forecasting is likely to follow a very similar learning curve, showing the same rate of improvement after the same number of iterations as weather forecasting.
So how long has it taken numerical modelling to come up with a model with much skill? We have been learning how to do numerical forecasts using computer models for about 20 years, which is roughly 1,000 weekly forecast cycles (20 years × 52 weeks). So as a first approximation, reaching the skill level of today’s weekly forecast takes about 1,000 iterations.
It therefore follows that to get to the same level of forecasts for the climate model using the same numerical modelling approach based on the best physics, it will take roughly:
Yearly forecast … 1000 years
Decadal forecast …. 10,000 years
Century forecast … 100,000 years.
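The comment’s scaling arithmetic can be sketched in a few lines. This is a toy calculation only: the 1,000-iteration threshold is the commenter’s assumption, not an established result.

```python
# Sketch of the commenter's learning-curve arithmetic: if a forecasting
# system needs roughly 1,000 forecast-then-verify iterations to gain
# skill (the comment's assumption), the wall-clock time to accumulate
# those iterations scales with the length of each forecast cycle.

ITERATIONS_NEEDED = 1_000  # assumed iterations to reach useful skill

def years_to_skill(cycle_length_years: float) -> float:
    """Wall-clock years needed to complete the assumed number of cycles."""
    return ITERATIONS_NEEDED * cycle_length_years

for label, cycle in [("yearly", 1), ("decadal", 10), ("century", 100)]:
    print(f"{label:>8} forecast: {years_to_skill(cycle):,.0f} years")
```

The point of the sketch is simply that iteration count, not calendar effort, is the bottleneck in the commenter’s argument: each tenfold increase in forecast length multiplies the waiting time tenfold.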
So how well does this fit with the actual data of forecasting skill? When I last looked the UK Met Office had given up yearly global temperature forecasts because their forecasts had no skill whatsoever and there was not the slightest indication that it was getting better.
And since the most rapid improvement comes in the first few iterations down the learning curve, WE SHOULD NOT EXPECT AN INCREASING RATE OF IMPROVEMENT; so the painfully, unobservably geological rate of improvement in climate forecasts is not only likely to continue BUT GET WORSE.
For a few graphs on the subject see my paper Climate changes: the importance of supra-national institutions in nurturing the paradigm shifts of scientific development. (translation into sceptic: the global warming scam is dying and if the Royal Society want any credibility it has no choice but to change)

Berényi Péter
September 14, 2013 1:42 am

There is an open access copy of the report on a French server. It is not an easy read though; heavy use of a bullshit generator is undeniable.

rogerknights
September 14, 2013 1:43 am

Geoff Sherrington says:
September 13, 2013 at 9:30 pm
Now is an important time for a critical review of models. The paused 20 years or so provides a study period when many confounding variables are held by Nature to be more constant than at other times of rapid change in one direction or another.
For example, tree rings for the last decade or two should be essentially unresponsive if a near-constant local temperature is the dominant driver of their characteristics. We need more ground truth studies of concepts like this before we launch into complex forward models.

In the US, the temperature has declined in many areas, but CO2 has increased. If tree rings have increased, that can be attributed to CO2.

Bill Hunter
September 14, 2013 1:55 am

“the need for detailed projections is growing across all sectors, including agriculture, insurance, and emergency preparedness planning.”
A careful read of that would suggest the highest priority should be to extend the 10 day extended forecast to 11 days.

John R Walker
September 14, 2013 2:33 am

“Overall, climate modeling has made enormous progress in the past several decades,…”
Not on CO2 it hasn’t. Guy Callendar’s 1938 work was closer mainly because he took the CO2 on its own.
Why am I not surprised that an old fashioned steam engineer with a slide-rule and a clear understanding of logarithmic effects can out-think and out-calculate Big Green’s finest?

September 14, 2013 3:01 am

The NAS report on models is blatant begging for alms for continued modeling in the harsh reality of US Congressional sequestration of funding.
Given the models have failed against reality tests, sayonara to them.
Note: The AGU is begging for general alms. I suspect all scientific bodies are in the face of US Congressional sequestration of funding.
John

September 14, 2013 3:17 am

A new website for “SAVE OUR EARTH”.
I need advice from all readers. Please, please.

Bruce Cobb
September 14, 2013 5:20 am

The Mosh says “They’re listening.”
But, what are they listening to? Wait, it’s off in the distance, coming closer. It’s an old, familiar sound for them, now getting louder. Yes, that’s it. It’s the still-comfy, with more miles to go yet CAGW/CC/WC GRAVY TRAIN. Ka-choo-choo-CHING!

September 14, 2013 7:35 am

This is one of those “reasonable deniability” reports for the record if everything falls apart, while it still clings to every tenet of the GHG control knob/sea level/temp/weird weather…. I was a public servant in the 1970s and I’m very, very familiar with this type of prose. It is a message for every eventuality and is put out when there is a lot of angst about the inadequacy of existing policy underpinnings. When the ultimate abandonment of the CO2 control knob can’t be put off any longer, they will say, yeah, well, we had such concerns back as far as 2013 about the problems of excluding natural variability, clouds, etc., and felt that models needed a major overhaul.
Nick Stokes says:
September 13, 2013 at 9:35 pm
“‘It is worth remembering that the need to invoke the GHG mechanism arose because the models were different to reality and the difference needed an explanation.’
The GHG mechanism goes back to at least Arrhenius in 1896.”
Nick, now with the GHGs CO2 and methane in there, it still needs an explanation – what does that tell you? Also, it goes back to Tyndall in 1859. From the excerpt from Wiki, you can see why you CAGW folks prefer the stuff of Arrhenius half a century later.
http://en.wikipedia.org/wiki/John_Tyndall
Tyndall: He was the first to correctly measure the relative infrared absorptive powers of the gases nitrogen, oxygen, water vapour, carbon dioxide, ozone, methane, etc. (year 1859). He concluded that water vapour is the strongest absorber of radiant heat in the atmosphere and is the principal gas controlling air temperature. Absorption by the other gases [including CO2] is not negligible but relatively small.
Now let’s see what CAGW has chosen to ignore from Arrhenius:
http://en.wikipedia.org/wiki/Svante_Arrhenius
Arrhenius on coal burning: “Although the sea, by absorbing carbonic acid, acts as a regulator of huge capacity, which takes up about five-sixths of the produced carbonic acid, we yet recognize that the slight percentage of carbonic acid in the atmosphere may by the advances of industry be changed to a noticeable degree in the course of a few centuries.”
“Since, now, warm ages have alternated with glacial periods, ……By the influence of the increasing percentage of carbonic acid in the atmosphere, we may hope to enjoy ages with more equable and better climates, especially as regards the colder regions of the earth, ages when the earth will bring forth much more abundant crops than at present, for the benefit of rapidly propagating mankind.”
Also:
ΔF = α ln(C/C₀)
“In his calculation Arrhenius included the feedback from changes in water vapor as well as latitudinal effects, but he omitted clouds, convection of heat upward in the atmosphere, and other essential factors.”
So considering all this, Arrhenius was great for his prediction that agriculture “…will bring forth much more abundant crops…”. He left out clouds and convection, features of the most abundant IR absorber, H2O. To sum this up: Arrhenius, for probably under $100, covered everything that you CAGW guys have rehashed for $700 billion in your quiver, and you are still leaving out clouds and convection, which, even if CO2 had a sensitivity of 10 or more, would act, a la the ‘Eschenbach Effect’, to counter the warming. Ask yourself why, with all the swings in temp, ocean SSTs CANNOT surpass the 31C barrier!!! And please, let’s reinstate the only correct prediction of Arrhenius concerning the benefits.
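The logarithmic forcing law quoted in this comment can be evaluated in a couple of lines. Note that the coefficient α is not given in the comment; the value 5.35 W/m² is a commonly cited fit, used here purely for illustration.

```python
import math

# Sketch of the logarithmic forcing law quoted above: dF = alpha * ln(C/C0).
# alpha is not specified in the comment; 5.35 W/m^2 is a commonly cited
# fitted value, used here purely for illustration.
ALPHA = 5.35  # W/m^2

def radiative_forcing(c: float, c0: float) -> float:
    """Forcing change (W/m^2) when CO2 goes from concentration c0 to c."""
    return ALPHA * math.log(c / c0)

# A doubling of CO2 gives ALPHA * ln(2), about 3.7 W/m^2:
print(f"{radiative_forcing(560.0, 280.0):.2f} W/m^2")
```

Because the dependence is logarithmic, each successive doubling of CO2 adds the same forcing increment, which is the point the saturation-style arguments in this thread turn on.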

hunter
September 14, 2013 7:36 am

The claim that climate models are not useful is completely untrue. Climate models have been very useful for climate science budgets, ethanol subsidies, wind power subsidies, solar power scammers, Greenpeace, WWF and other “NGO” fund raising, Al Gore and other climate change hucksters. Climate models have successfully guided all of these groups to vastly more funding and personal wealth.

Bruce Cobb
September 14, 2013 8:25 am

Even the Warmists’ hero Arrhenius, within ten years, dialed back his estimate of the climate’s CO2 sensitivity. In his 1896 paper, he estimated that a doubling of CO2 would cause a temperature rise of 5–6 °C. But in his 1906 publication, Arrhenius adjusted the value downwards to 2.1 °C (including water vapor feedback). Had he been interested in pursuing it further, perhaps he would have discovered that his contemporary, Knut Ångström, was closer to the truth: that adding more CO2 would make little difference at that point, due to the saturation effect.
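For illustration only, here is how a “per doubling” sensitivity translates into warming for an arbitrary CO2 change, assuming the logarithmic dependence Arrhenius worked with. The two sensitivity values are the ones quoted in the comment above.

```python
import math

def warming(sensitivity_per_doubling: float, c: float, c0: float) -> float:
    """Equilibrium warming (deg C), assuming a logarithmic CO2 dependence."""
    return sensitivity_per_doubling * math.log2(c / c0)

# Arrhenius's two estimates for a CO2 doubling, as quoted in the comment:
for year, s in [(1896, 5.5), (1906, 2.1)]:  # 5.5 = midpoint of the 5-6 range
    print(f"{year} estimate: {warming(s, 560, 280):.1f} C per doubling")
```

For a doubling (C/C₀ = 2) the log₂ term is exactly 1, so the formula simply returns the sensitivity itself; for smaller concentration changes it scales the sensitivity down proportionally.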

Matthew R Marler
September 14, 2013 9:02 am

Overall, climate modeling has made enormous progress in the past several decades, but meeting the information needs of users will require further advances in the coming decades.
Naturally I am pleased that the NAS would agree with me.

Matthew R Marler
September 14, 2013 9:06 am

Description:
As climate change has pushed climate patterns outside of historic norms,

oops. Not so pleased now.

Ed Barbar
September 14, 2013 9:12 am

Does anyone else think having one climate science model is a disaster waiting to happen?
There is value in having multiple models. The first is obvious: competition. If there is more than one, market forces will drive them to improve. Second, who gets to control the single climate science model, and what if it has something fundamentally wrong? To anyone who doubts the value of competition, consider what happened when that quasi-government/business organization AT&T was split up by Judge Greene. Costs went down. There was finally competition in long-distance rates, and they dropped. More importantly, AT&T stopped controlling the marketplace, and the marketplace started to control AT&T. Or, alternatively, look at the IPCC. If the IPCC had another organization to compete against, perhaps with each pointing out the ludicrous claims of the other, maybe we would get better information.
And it would be cheaper, too. That’s the amazing thing. AT&T was split up, infrastructure has been increasingly replicated, and costs went down, a lot.