NCAR's new 2010 climate model for the next IPCC report

New computer model advances climate change research

From an NCAR/UCAR press release

BOULDER—Scientists can now study climate change in far more detail with powerful new computer software released by the National Center for Atmospheric Research (NCAR).


Modeling climate’s complexity. This image, taken from a larger simulation of 20th century climate, depicts several aspects of Earth’s climate system. Sea surface temperatures and sea ice concentrations are shown by the two color scales. The figure also captures sea level pressure and low-level winds, including warmer air moving north on the eastern side of low-pressure regions and colder air moving south on the western side of the lows. Such simulations, produced by the NCAR-based Community Climate System Model, can also depict additional features of the climate system, such as precipitation. Companion software, recently released as the Community Earth System Model, will enable scientists to study the climate system in even greater complexity.

The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC). The CESM is the latest in a series of NCAR-based global models developed over the last 30 years. The models are jointly supported by the Department of Energy (DOE) and the National Science Foundation, which is NCAR’s sponsor.

Scientists and engineers at NCAR, DOE laboratories, and several universities developed the CESM.

The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:

  • What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?
  • How will patterns in the ocean and atmosphere affect regional climate in coming decades?
  • How will climate change influence the severity and frequency of tropical cyclones, including hurricanes?
  • What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?

The CESM is one of about a dozen climate models worldwide that can be used to simulate the many components of Earth’s climate system, including the oceans, atmosphere, sea ice, and land cover. The CESM and its predecessors are unique among these models in that they were developed by a broad community of scientists. The model is freely available to researchers worldwide.

“With the Community Earth System Model, we can pursue scientific questions that we could not address previously,” says NCAR scientist James Hurrell, chair of the scientific steering committee that developed the model. “Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”

Scientists rely on computer models to better understand Earth’s climate system because they cannot conduct large-scale experiments on the atmosphere itself. Climate models, like weather models, rely on a three-dimensional mesh that reaches high into the atmosphere and into the oceans. At regularly spaced intervals, or grid points, the models use laws of physics to compute atmospheric and environmental variables, simulating the exchanges among gases, particles, and energy across the atmosphere.
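The grid-point approach described above can be illustrated with a deliberately tiny sketch. This is not CESM code, and every parameter here is invented for illustration: a one-dimensional temperature field is updated by diffusion between neighboring grid points (standing in for heat transport) plus relaxation toward an equilibrium value (standing in for radiative balance).

```python
import numpy as np

def step_temperature(T, dt=3600.0, dx=1.0e6, kappa=1.0e3,
                     T_relax=288.0, tau=1.0e6):
    """Advance a 1-D temperature field (K) one time step.

    Diffusion mimics heat exchange between neighboring grid points;
    relaxation toward T_relax stands in for radiative balance.
    All values are illustrative, not CESM parameters.
    """
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    return T + dt * (kappa * lap - (T - T_relax) / tau)

# Toy "planet": a warm spot in the middle of an 11-point grid.
T = np.full(11, 280.0)
T[5] = 300.0
for _ in range(1000):
    T = step_temperature(T)
```

Real models compute dozens of coupled variables on a three-dimensional grid, but the pattern is the same: loop over time steps, and at each step update every grid point from its neighbors using physical laws.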

Because climate models cover far longer periods than weather models, they cannot include as much detail. Thus, climate projections appear on regional to global scales rather than local scales. This approach enables researchers to simulate global climate over years, decades, or millennia. To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.
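The verification step described here boils down to comparing a simulated past against observations with a skill metric. A minimal sketch, using made-up anomaly numbers and root-mean-square error as the metric (real evaluations use many variables and metrics):

```python
import numpy as np

def rmse(simulated, observed):
    """Root-mean-square error between a hindcast and observations."""
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean((simulated - observed) ** 2)))

# Illustrative numbers only: decadal global-mean temperature anomalies (degrees C).
observed_anomaly = [0.0, 0.1, 0.15, 0.3, 0.45]
hindcast_anomaly = [0.05, 0.1, 0.2, 0.25, 0.5]

score = rmse(hindcast_anomaly, observed_anomaly)
```

A lower score means the simulated past tracks the observed record more closely; a model that hindcasts well earns more trust in its projections.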

A broader view of our climate system

The CESM builds on the Community Climate System Model, which NCAR scientists and collaborators have regularly updated since first developing it more than a decade ago. The new model enables scientists to gain a broader picture of Earth’s climate system by incorporating more influences. Using the CESM, researchers can now simulate the interaction of marine ecosystems with greenhouse gases; the climatic influence of ozone, dust, and other atmospheric chemicals; the cycling of carbon through the atmosphere, oceans, and land surfaces; and the influence of greenhouse gases on the upper atmosphere.

In addition, an entirely new representation of atmospheric processes in the CESM will allow researchers to pursue a much wider variety of applications, including studies of air quality and biogeochemical feedback mechanisms.

Scientists have begun using both the CESM and the Community Climate System Model for an ambitious set of climate experiments to be featured in the next IPCC assessment reports, scheduled for release during 2013–14. Most of the simulations in support of that assessment are scheduled to be completed and publicly released beginning in late 2010, so that the broader research community can complete its analyses in time for inclusion in the assessment. The new IPCC report will include information on regional climate change in coming decades.

Using the CESM, Hurrell and other scientists hope to learn more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation, which affect sea surface temperatures as well as atmospheric conditions. Such knowledge, Hurrell says, can eventually lead to forecasts spanning several years of potential weather impacts, such as a particular region facing a high probability of drought, or another region likely facing several years of cold and wet conditions.

“Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible, human-influenced climate change,” Hurrell says. “CESM will be a major tool to address such questions.”

tarpon
August 19, 2010 11:41 am

The connection between spiffy new software, fancy new computers, and man’s total knowledge of how climate works remains elusive.
Are they studying computers now? As a former computer engineer, I understand GIGO.

M White
August 19, 2010 12:07 pm

“To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.”
Similar models were used to prove that nighttime and daytime were caused by the sun orbiting the earth.

Brian H
August 19, 2010 12:42 pm

The model(s) will exhibit chaotic instabilities due to unpredictable interactions of parameters; these sensitivities will match observational fluctuations only by sheer chance, and in narrow bands.
We need models to emulate and predict the behavior of the models.
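The sensitivity Brian H describes can be demonstrated with a standard toy chaotic system, the logistic map (nothing to do with CESM itself): two trajectories that start one part in a million apart stay close at first, then diverge completely.

```python
def logistic_trajectory(x0, r=3.9, n=50):
    """Iterate the logistic map x -> r*x*(1-x), a classic chaotic system."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000)
b = logistic_trajectory(0.400001)  # perturbed by one part in a million
```

This is why climate modelers distinguish between predicting a specific trajectory (weather, which chaos makes impossible beyond days) and simulating long-run statistics (climate), a distinction the press release glosses over.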

899
August 19, 2010 1:40 pm

After a measured amount of cogitation on this matter, I’ve come to the conclusion that the people who propose or put forth these ‘models’ with pictorial results for display, are actually engaging in the worst sort of propaganda.
The very terrible thing about that is that many —dare I say most?— laymen haven’t a clue about what’s being placed under their noses.
Think about it: Those laymen see the pretty pictures with colors calculated to induce the mind to draw a conclusion: The one intended. Red = blazing Hades, while blue = freaking cold. And white = The Great White North, i.e., Canada.
And of course, it doesn’t help any when the MSM resorts to giving the impression that the models represent the REAL WORLD as opposed to what they really represent: make-believe.
Instead of playing with models, why don’t the so-called ‘researchers’ actually resort to using actual data which is currently available?
But that wouldn’t attract grant money, now would it?

ANH
August 19, 2010 1:44 pm

The article said it is freely available. Is that true? Is the source code freely available?

August 19, 2010 2:08 pm

Phil.: August 19, 2010 at 3:48 am
I doubt whether Boeing would agree with you since they rely on Tony Jameson’s CFD code for their wing design.
[…]
“The list of those airplanes begins with the Boeing 767, designed in the late 1970s. That was followed by the 757, the 747-400, the new Boeing 777, and the recently-announced Boeing 737-700 which embodies a new wing and other advanced features. Each of those airplanes is displayed on the model.
[…]
There is one model position reserved for a future Boeing 787 airplane, and another for a 797. Those airplanes are presently only a gleam in our eye, but when they are designed and built, they undoubtedly also will contain the imprint of Jameson’s computational methodology.”

The B-777 went $2 billion over budget just in the R&D because of problems with the coding, and went a year behind schedule because the metal-benders couldn’t reconcile the fly-by-wire system (not just in the wing) with the computer-generated schematics.
BTW, the “recently-announced” B-737-700 first rolled off the production line in 1996, and the B-787 is three years overdue for delivery to the first customer — problems with the fuselage needing structural reinforcement during *actual flight testing*, y’know…

August 19, 2010 2:27 pm

Again, I can’t find any mention of the sensitivity of CO2 to temperature. Perhaps this and similar models might be reasonable for a sterile planet, but on earth the biosphere absorbs and releases CO2 at rates that depend on temperature.

kwik
August 19, 2010 2:33 pm

I like the new understanding of GIGO.
Garbage In, Grants Out.
Priceless!

August 19, 2010 2:55 pm

stephen richards says:
August 19, 2010 at 6:35 am
Phil. says:
August 19, 2010 at 3:48 am
Another of your pathetic strawmen. Engage brain before opening mouth.

We’d be a lot better off if you kept your mouth shut; is there even a brain there to connect to? Geof Hoyle raised the subject of aero-design and suggested that “It would be nice to see some of these vast sums of money spent on simulation design channelled to better use?” I pointed out that Boeing thought the money spent was very worthwhile indeed.

August 19, 2010 3:03 pm

ANH says:
August 19, 2010 at 1:44 pm
The article said it is freely available. Is that true? Is the source code freely available?

If you follow the link in the original post you’ll see where you can download the code: http://www.cesm.ucar.edu/models/cesm1.0/

Jim Barker
August 19, 2010 3:33 pm

I’m not sure it would help, but my Ouija board will be available for loan, when I get it back from calibration;-)
Seriously, they seem to be implying that this new system is capable of solving at least an n-body problem, and that only involves a few bodies and gravity. Are they really capable of modeling all of the molecules on Earth, in all of their interactions?

August 19, 2010 3:44 pm

So these models take into consideration air temperatures, ocean temperatures, sea ice, land coverage, and greenhouse gases.
Do these models take into consideration the tenfold increase in global cooling due to clouds over the trivial change in global warming from GHGs?
Do these models take into consideration planetary mechanics, the titanic forces in climate change from varying Sun spot cycles, coronal mass ejections, etc.?
Do these models take into consideration the impact of changing atmospheric chemical reactions created by the cyclical bombardment of xrays, cosmic rays, and uv rays?
If not, the latest models are woefully inadequate to predict anything. They are as predictive as looking at an oil pressure gauge to see what direction your car is moving.

AndrewG
August 19, 2010 3:50 pm

I keep looking for the sentence “Accurately predicted the temperature patterns in 2010 when fed the data from 1990-2000” but it just isn’t there.

SSam
August 19, 2010 3:55 pm

I know they don’t mean it the way it sounds… I think.
…CESM source code is distributed through a public Subversion code repository…

EthicallyCivil
August 19, 2010 4:42 pm

I wonder if they included the empirical feedback models from Lindzen and Choi in their “improved physics”. Of course, moving to the stable half of phase space wouldn’t be as exciting as the old result.

August 19, 2010 4:49 pm

Just compare the image above, full of red, with the following:
http://weather.unisys.com/surface/sst_anom.html
Do they count on fools? But, surprise: the majority of healthy and normal people are not fools, so beware.

August 19, 2010 6:00 pm

Unfortunately there is nothing static about the Earth’s climate system. It is not a closed system. The basic parameters are changing in a way that we do not understand and cannot explain: See http://climatechange1.wordpress.com/

Tom Harley
August 19, 2010 6:12 pm

I would much sooner believe in that much cheaper supercomputer between the ears of farmers, fishermen, geologists, etc., whose outdoor lives depend on knowledge of the climate for a living.

H.R.
August 19, 2010 6:53 pm

erlhapp says:
August 19, 2010 at 6:00 pm
“Unfortunately there is nothing static about the Earth’s climate system. It is not a closed system. The basic parameters are changing in a way that we do not understand and can not explain: See http://climatechange1.wordpress.com/”
Good reminder. And another reminder (geological perspective): earth’s climate never repeats exactly, and has changed from states that will probably never be seen again. And undoubtedly, earth’s climate will change to states that have never been seen before. Model that!
(h/t to anna v, who posted an observation of one of the Greek philosophers, “You can never cross the same river twice.”)

August 19, 2010 6:54 pm

This is really cool. It is fantastic how NCAR makes their codes easily and freely available.

Mark W
August 19, 2010 7:38 pm

GIGO in climate science = Garbage in Gospel out…

Ammonite
August 19, 2010 8:09 pm

brokenhockeystick says: August 19, 2010 at 5:24 am
These record levels would be from the mangled/homogenised/completely discredited temperature records used by CRU, GISS, etc., which have been inexplicably adjusted upwards, rather than downwards, to account for UHI effects, as pointed out in countless posts on this site.
There are significant signs that the planet is warming independent of the temperature data. Species are migrating away from the equator and to higher altitudes. The Greenland and Antarctic ice sheets are losing mass and so are glaciers from all the world’s major mountain chains.
If you haven’t already done so, please examine the GISS website for their analysis of UHI. It is very accessible and interesting and places the effect in its proper context with respect to the temperature record – real, but small.

pwl
August 19, 2010 8:47 pm

“Because climate models cover far longer periods than weather models, they cannot include as much detail. Thus, climate projections appear on regional to global scales rather than local scales. This approach enables researchers to simulate global climate over years, decades, or millennia. To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.”
That’s a very bold claim; in fact it’s an extraordinary claim that needs extraordinarily hard evidence. Produce such evidence forthwith, NCAR; otherwise you’re just another doomsday soothsayer in the same league as Nostradamus.

pwl
August 19, 2010 10:23 pm

Mr Lynn quotes ” . . . The CESM and its predecessors are unique among these models in that they were developed by a broad community of scientists. The model is freely available to researchers worldwide.”
and then asks some good questions:
“Does that mean it is open-source?”
It seems to be. Mostly it looks like Fortran 90 programs. Shivers.
“Is all the code “freely available”?”
It seems to be under various open source licences.
“Are all the built-in parameters clearly stated?”
That I don’t know. It’s cryptic so it will require decoding.
“Are the assumptions used to create the parameters fully variable, so their values can be freely changed by scientists who are not programmers?”
Don’t know. Must study it.
“Are the conditions under which new parameters can be added clearly specified?”
Don’t know.
“Is the model made available with all the datasets it draws upon, and are all sources of and modifications to that data “freely available” and transparent?”
They have over one terabyte of data available and don’t want it all downloaded; they just want people to use what they need. How the heck do they get one terabyte of data? I’ve not yet looked at what it is, but yikes, that’s one heck of a lot of data for sure. It would take quite some time to download it all even with a high-speed connection.
I doubt very much that the “modifications” are documented. I’d be pleasantly surprised if they were.
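The download-time worry above is easy to put numbers on. A back-of-the-envelope sketch, assuming decimal terabytes, an ideal sustained link speed, and no protocol overhead (real transfers would be slower):

```python
def download_hours(size_terabytes, megabits_per_second):
    """Hours to transfer a dataset at a sustained link speed.

    Assumes 1 TB = 8e12 bits (decimal terabyte) and ignores
    protocol overhead, so this is a best-case estimate.
    """
    bits = size_terabytes * 8.0e12
    seconds = bits / (megabits_per_second * 1.0e6)
    return seconds / 3600.0

# The CESM archive mentioned above is roughly one terabyte.
hours_at_100mbps = download_hours(1, 100)  # a fast 2010-era connection
hours_at_10mbps = download_hours(1, 10)    # a typical home connection
```

At 100 Mbps the full terabyte takes the better part of a day; at 10 Mbps it runs to more than a week, which is why NCAR asks users to fetch only the subsets they need.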

August 20, 2010 3:13 am

Phil.: August 19, 2010 at 2:55 pm
I pointed out that Boeing thought the money spent was very worthwhile indeed.
At that time (1994), it was *very* worthwhile, because Boeing used a lot of NASA-derived technology — which the US taxpayers paid for. To be fair, Boeing *did* reimburse NASA for the wind-tunnel testing portion.
That’s not a knock at Anthony Jameson, btw — he’s one of aviation’s shining stars.