NCAR's new 2010 climate model for the next IPCC report

New computer model advances climate change research

From an NCAR/UCAR press release

BOULDER—Scientists can now study climate change in far more detail with powerful new computer software released by the National Center for Atmospheric Research (NCAR).

Modeling climate’s complexity. This image, taken from a larger simulation of 20th century climate, depicts several aspects of Earth’s climate system. Sea surface temperatures and sea ice concentrations are shown by the two color scales. The figure also captures sea level pressure and low-level winds, including warmer air moving north on the eastern side of low-pressure regions and colder air moving south on the western side of the lows. Such simulations, produced by the NCAR-based Community Climate System Model, can also depict additional features of the climate system, such as precipitation. Companion software, recently released as the Community Earth System Model, will enable scientists to study the climate system in even greater complexity.

The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC). The CESM is the latest in a series of NCAR-based global models developed over the last 30 years. The models are jointly supported by the Department of Energy (DOE) and the National Science Foundation, which is NCAR’s sponsor.

Scientists and engineers at NCAR, DOE laboratories, and several universities developed the CESM.

The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:

  • What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?
  • How will patterns in the ocean and atmosphere affect regional climate in coming decades?
  • How will climate change influence the severity and frequency of tropical cyclones, including hurricanes?
  • What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?

The CESM is one of about a dozen climate models worldwide that can be used to simulate the many components of Earth’s climate system, including the oceans, atmosphere, sea ice, and land cover. The CESM and its predecessors are unique among these models in that they were developed by a broad community of scientists. The model is freely available to researchers worldwide.

“With the Community Earth System Model, we can pursue scientific questions that we could not address previously,” says NCAR scientist James Hurrell, chair of the scientific steering committee that developed the model. “Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”

Scientists rely on computer models to better understand Earth's climate system because they cannot conduct large-scale experiments on the atmosphere itself. Climate models, like weather models, rely on a three-dimensional mesh that reaches high into the atmosphere and deep into the oceans. At regularly spaced intervals, or grid points, the models use the laws of physics to compute atmospheric and environmental variables, simulating the exchanges of gases, particles, and energy across the atmosphere.
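
To make the grid-point idea concrete, here is a minimal sketch of the approach the release describes. It is a toy illustration, not CESM code: the grid size, the single temperature field, and the diffusion constant are all invented for the example.

```python
# Toy grid-point time stepping in the spirit of the description above.
# NOT CESM code: everything here is invented for illustration.
import numpy as np

nlat, nlon = 36, 72      # ~5-degree grid, far coarser than a production model
T = 288.0 + 10.0 * np.random.randn(nlat, nlon)   # initial temperatures (K)
kappa = 0.1              # nondimensional diffusion coefficient (made up)

def step(T):
    """One time step: each grid point exchanges heat with its four
    neighbors, a crude stand-in for the physics a real model solves.
    Boundaries are treated as periodic purely for brevity."""
    nbrs = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
            np.roll(T, 1, 1) + np.roll(T, -1, 1))
    return T + kappa * (nbrs - 4.0 * T)

for _ in range(1000):    # a real model repeats this over simulated decades
    T = step(T)

print(f"global mean temperature: {T.mean():.2f} K")
```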

Because climate models cover far longer periods than weather models, they cannot include as much detail. Thus, climate projections appear on regional to global scales rather than local scales. This approach enables researchers to simulate global climate over years, decades, or millennia. To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.
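
The verification step described above, often called hindcasting, amounts to scoring a simulated history against the observed record. A minimal sketch with fabricated data and a single error metric; real evaluations compare many fields with many statistics:

```python
# Hypothetical hindcast check: score a simulated 20th-century global mean
# temperature series against observations. Both series are fabricated here;
# real evaluations use actual observational records.
import numpy as np

years = np.arange(1900, 2001)
observed = 13.7 + 0.007 * (years - 1900) + 0.1 * np.random.randn(years.size)
simulated = 13.7 + 0.0065 * (years - 1900)    # the model's hindcast (invented)

rmse = np.sqrt(np.mean((simulated - observed) ** 2))
bias = np.mean(simulated - observed)
print(f"hindcast RMSE: {rmse:.3f} C, mean bias: {bias:+.3f} C")
```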

A broader view of our climate system

The CESM builds on the Community Climate System Model, which NCAR scientists and collaborators have regularly updated since first developing it more than a decade ago. The new model enables scientists to gain a broader picture of Earth’s climate system by incorporating more influences. Using the CESM, researchers can now simulate the interaction of marine ecosystems with greenhouse gases; the climatic influence of ozone, dust, and other atmospheric chemicals; the cycling of carbon through the atmosphere, oceans, and land surfaces; and the influence of greenhouse gases on the upper atmosphere.

In addition, an entirely new representation of atmospheric processes in the CESM will allow researchers to pursue a much wider variety of applications, including studies of air quality and biogeochemical feedback mechanisms.

Scientists have begun using both the CESM and the Community Climate System Model for an ambitious set of climate experiments to be featured in the next IPCC assessment reports, scheduled for release during 2013–14. Most of the simulations in support of that assessment are scheduled to be completed and publicly released beginning in late 2010, so that the broader research community can complete its analyses in time for inclusion in the assessment. The new IPCC report will include information on regional climate change in coming decades.

Using the CESM, Hurrell and other scientists hope to learn more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation, which affect sea surface temperatures as well as atmospheric conditions. Such knowledge, Hurrell says, can eventually lead to multiyear forecasts of potential weather impacts, such as a high probability of drought in one region, or several years of likely cold and wet conditions in another.

“Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible, human-influenced climate change,” Hurrell says. “CESM will be a major tool to address such questions.”

Comments

Dave F
August 19, 2010 5:50 am

Biogeochemistry? Put the horse before the cart…

August 19, 2010 5:54 am

Another expensive screensaver.

Jim
August 19, 2010 5:56 am

The implication that a computer climate model run is a “climate experiment” is just wrong and misleading. An experiment would involve manipulating the Earth’s climate in some manner and comparing the outcome to a control Earth. A computer model run isn’t that.

Frank K.
August 19, 2010 6:00 am

I applaud the NCAR effort. Please note the difference between this:
http://www.cesm.ucar.edu/models/cesm1.0/
and this…
http://www.giss.nasa.gov/tools/modelE/
I would like to know why we, the taxpayers, are funding (at least) two identical research efforts! We should consolidate ALL climate GCM research at NCAR – that would save a lot of money and resources.

Ed Fix
August 19, 2010 6:05 am

I have a bit of experience in computer modeling, and here’s a basic conundrum that many modelers don’t recognize.
The unspoken assumption behind the kind of detailed modeling this article is describing is that if you can just get all the sub-processes right, and get all their multifaceted interactions right, and get all the initial conditions and inputs right, then the model will mimic the emergent behavior of the real system.
In a computer model, all data, all processes, and all interactions are necessarily approximations. To get a reasonable answer, some approximations must be more accurate than others, but there’s no way to know in advance which those are. If you get something wrong, then the model will fail in an unexpected, sometimes spectacular way.
The more detailed the model, the greater the chance that you have something wrong, and that your model will fail as a result.
The die-hard modeler’s reaction to his model’s failure is often to make his model even more detailed and complex. He feels he’s made progress when he fixes any one particular problem, but there’s an unexpected failure in another area, so he works on fixing that, and the cycle continues.
In the end, you have a minutely detailed, hugely expensive, insanely compute-intensive behemoth that actually performs no better than the simple model you started out with. There’s just this one more little detail you need to fix, if only you can get the funding. You’ve learned a great deal about computer modeling, but have really added very little to your understanding of the process you were trying to model in the first place. And you’ve spent an unconscionable amount of money and man-hours doing it.
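
The compounding argument above can be put in numbers with a toy construction (invented here, not tied to any real model): a system assembled from many coupled sub-processes, each mis-specified by only a fraction of a percent, drifts far from the system it imitates.

```python
# Toy illustration of the compounding-error argument. All numbers invented.
import numpy as np

rng = np.random.default_rng(0)
n_sub = 50                                  # number of coupled sub-processes
true_gain = np.full(n_sub, 1.0)             # the real system is exactly neutral
model_gain = true_gain + 0.002 * rng.standard_normal(n_sub)  # ~0.2% errors

state_true, state_model = 1.0, 1.0
for _ in range(100):                        # iterate the coupled system
    state_true *= true_gain.prod()          # stays at 1.0
    state_model *= model_gain.prod()        # small errors compound every step

print(f"relative drift after 100 steps: {abs(state_model / state_true - 1):.1%}")
```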

Henry chance
August 19, 2010 6:08 am

I used Fortran and wrote programs for quant analysis and forecasting several decades ago. I read the line of instruction that Jones referred to in the e-mails, which obviously “evolved” temp readings from the earlier part of the last century downward.

The CESM1 Timing Table provides performance data that will continue to evolve due to changes in the model, machine hardware and input from the user community.

http://www.cesm.ucar.edu/models/cesm1.0/
So historical data continues to “evolve” so the models can be helped to forecast what the people need to believe.
This method is not new. It was applied in reading tea leaves.

Martin Brumby
August 19, 2010 6:10 am

Since the alarmists have already decided that 2010 is the hottest year ever in the total history of the universe (correct to 17 decimal places), I’m not sure what this new fancy model will add.
Meanwhile, back in the real world, the Climate Just Keeps On Doin’ Whatta Climate’s Gotta Do……..

August 19, 2010 6:12 am

The last paragraph in the news release gives me some hope that they intend to do some objective research rather than trying to find proofs that CAGW is an irreversible problem. It sounds like this will be an improvement on the Reanalysis model (which does not include CO2 as a factor). http://kidswincom.net/CO2OLR.pdf

jack morrow
August 19, 2010 6:12 am

I wonder if this “New Fangled Computer” (NFC) includes anything about the things mentioned in the previous post by Paul Vaughan, or anything about planetary influence or any other outside factor? It seems to me to be the same stuff, just made to look better and get more attention.

Martin Lewitt
August 19, 2010 6:15 am

“Most of the simulations in support of that assessment are scheduled to be completed and publicly released beginning in late 2010, so that the broader research community can complete its analyses in time for inclusion in the assessment.”
If the IPCC adheres to past practice, it will report the model projections without qualification, as if they are relevant anyway, even though the diagnostic literature shows they have errors far larger than the energy imbalance of interest. Watch for publication of model results before the diagnostic work has been done. These models should be assumed to have the correlated diagnostic issues that past models have had until they prove that they don’t, e.g., the correlated surface albedo bias published by Roesch; the reproduction of less than half the increase in precipitation published by Wentz; the inability to reproduce the amplitude of the observed response to the solar cycle published by Camp and Tung and separately by Lean; the tropical radiative imbalances published by Lindzen and separately by Spencer; etc. It will be interesting to use the average daily tropical thunderstorm feedback published by Eschenbach as an additional diagnostic.

Patrick Davis
August 19, 2010 6:19 am

“Mike Edwards says:
August 19, 2010 at 4:24 am
The source code is available from the NCAR site here:
http://www.cesm.ucar.edu/models/cesm1.0/”
Yeah, any programmer knows about source code, blah blah blah, to build the model. The difference here is: what are the data inputs? Are they raw, real, observed or “adjusted”, homogenised? Any “fudge factors” built in, “tricks”, “assumptions” and/or “inversions” used? I bet it’ll prove to be worse than ANYONE thought or any past GCM “simulated”.

Solomon Green
August 19, 2010 6:22 am

How many variables does this model contain? Lorenz found that a model of 12 variables, each with as little as 0.001% error in its initial input, was hopeless for forecasting weather (chaos theory). OK, climate is not weather, but the same applies: the greater the number of variables, the less accurate the model as a forecasting tool, no matter how well it has been calibrated to past events.
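
The sensitivity described above is easy to demonstrate. A minimal sketch using the classic three-variable Lorenz (1963) system, rather than the 12-variable model mentioned in the comment: two runs whose initial conditions differ by one part in a million diverge completely within a few tens of simulated time units.

```python
# Sensitive dependence on initial conditions in the Lorenz (1963) system,
# integrated with a plain Euler step. Classic parameters sigma=10, rho=28,
# beta=8/3; the perturbation size is chosen arbitrarily.
import numpy as np

def lorenz_step(s, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([x + dt * sigma * (y - x),
                     y + dt * (x * (rho - z) - y),
                     z + dt * (x * y - beta * z)])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])   # one-part-in-a-million difference

for i in range(1, 40001):            # 40 time units at dt = 0.001
    a, b = lorenz_step(a), lorenz_step(b)
    if i % 10000 == 0:
        print(f"t = {i / 1000:4.0f}: separation = {np.linalg.norm(a - b):.2e}")
```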

August 19, 2010 6:23 am

There is nothing in the press release about the model including a sensitivity of CO2 to temperature, as well as a sensitivity of temperature to CO2. Unless this fundamental but common fallacy has been corrected, GIGO. Of course, a model with significant values for both sensitivities cannot match observations over any substantial interval, but at least this would expose the absurdity of the CO2-induced-warming hypothesis.
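
For illustration only, the two-way-sensitivity argument above can be reduced to a toy feedback loop (an invented construction, not anything from CESM or the press release): the loop's behavior hinges on whether the product of the two sensitivities exceeds one.

```python
# Toy two-way feedback loop for the argument above: a temperature anomaly
# drives a CO2 anomaly, which drives further warming. The sensitivities are
# arbitrary numbers chosen only to show the arithmetic, not physical values.
def feedback_total(t_per_co2, co2_per_t, forcing=1.0, trips=30):
    """Sum the warming contributed by each trip around the loop."""
    total, pulse = 0.0, forcing
    for _ in range(trips):
        total += pulse
        pulse *= t_per_co2 * co2_per_t   # loop gain applied per round trip
    return total

print(feedback_total(0.5, 0.5))   # loop gain 0.25 < 1: converges (to ~4/3)
print(feedback_total(1.2, 1.0))   # loop gain 1.2  > 1: grows without bound
```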

stephen richards
August 19, 2010 6:35 am

Phil. says:
August 19, 2010 at 3:48 am
Another of your pathetic strawmen. Engage brain before opening mouth.
Aerodynamics and climate for the next 1000 yrs. Oh yes, I can see the similarities. Like hell.

stephen richards
August 19, 2010 6:36 am

Fred H. Haynie says:
August 19, 2010 at 6:12 am
The last paragraph in the news release gives me some hope that they intend to do some objective research rather than trying to find proofs that CAGW is an irreversible problem. It sounds like this will be an improvement on the Reanalysis model (which does not include CO2 as a factor). http://kidswincom.net/CO2OLR.pdf
Fred, I sure hope you are right. But nothing they have done in the past would allow me to arrive at the same conclusion as you.

Chuck L
August 19, 2010 6:38 am

Oh, goody, another computer model:
“The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC).”
“The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:
  • What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?
  • How will patterns in the ocean and atmosphere affect regional climate in coming decades?
  • How will climate change influence the severity and frequency of tropical cyclones, including hurricanes?
  • What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?”
After reading the press release, it seems to me that the result of the climate simulations has already been predetermined, and that result will be (drumroll…)
“IT’S WORSE THAN WE THOUGHT!”

Alan the Brit
August 19, 2010 6:39 am

Joe Lalonde says:
August 19, 2010 at 4:04 am
Quote:“Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”
I forgot that word too!
OED says Representation: description or portrayal or imagination, place likeness of before the mind or senses, serve or be meant as likeness, symbolise, act as embodiment of, stand for, be a specimen for, fill place of, work of art (now we’re getting close) portraying something, be a substitute for………………………!
It doesn’t actually define it as the actual or real thing. So, are these all little clues to what is actually being done in these models? Are they merely a bunch of sophisticated, simulated, representative, mathematical equations? In other words, they’re guessing? I recall as a young engineering student that during examinations, one was not permitted to take a “programmable” calculator into the exam hall, only basic-function ones, because some smart arse could use one or two parameters given in the questions to produce the right number without actually knowing or understanding any engineering principles! They’ll develop a model to represent sex next, and then we’re really done for!

Frank K.
August 19, 2010 6:40 am

Phil. says:
August 19, 2010 at 3:48 am
Phil. – meet the Boeing wind tunnel…
http://www.boeing.com/news/releases/2003/q2/nr_030616i.html
“Fewer tests are needed today because modern computational fluid dynamics (CFD) tools allow designers to consider and run virtual experiments on designs with a higher degree of confidence than ever before. Only those designs that show real promise will make it to the wind tunnel.”
“Our computer modeling tools allow us to predict much more closely what will transpire in the wind tunnel, which then accurately verifies the flight performance of the airplane,” Bair said. “This process gives us the opportunity to refine our designs with computers long before we ever get to the wind tunnel. We are looking for the design that will allow the airplane to fly most efficiently.”

I would also note that the aircraft aerodynamics problem is about an order of magnitude simpler than the physics that are purportedly being modeled in a climate code. Of course, with codes like GISS Model E, you really don’t know what the heck they’re modeling…

red432
August 19, 2010 6:42 am

Take dozens of chunks of software using formulas based on guesswork and hook all their inputs and outputs together. Then bootstrap the process with thousands of guessed input parameters and let the thing chug away for months. The result? Impressive-looking nonsense. Illusionists of old had nothing on computer models. I do like to watch the animations, though; they’re cool.
It’s not clear to me that anyone really knows whether the results of the computation have any meaning once you greatly simplify any component of a climate model. As I mentioned before, protein folding is many orders of magnitude simpler than any component of a climate model, and no one knows how to correctly emulate protein folding using any amount of computational power.

Michael Schaefer
August 19, 2010 6:42 am

DJ Meredith says:
August 19, 2010 at 5:03 am
“The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming…”
In other words, the output is predetermined.
——————————————————–
“SIGH!” I agree…

ian middleton
August 19, 2010 7:02 am

This is the bit that worries me:
USING the CESM, Hurrell and other scientists HOPE TO LEARN more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation.
So now the computer is teaching the scientists. I don’t think so, buddy. A computer is nothing more than a sophisticated slide rule. Call them out on this: if it is so good at predicting or defining future climatic conditions, what’s the weather going to be like on 25th December 2010 in Sydney, Australia, to the nearest 5 degrees? Any scientist who puts their name to this in the next IPCC report is (1) not a scientist, and (2) a complete prat.
Sorry for the rant but had I said what I really meant I would be banned from the blog.

Warren in Minnesota
August 19, 2010 7:03 am

The press release starts with the word SCIENTISTS and continues with SCIENTISTS do this and SCIENTISTS do that. I think a shortened version of the press release could be stated as this:
SCIENTISTS continue to play with their computer simulations to find human-influenced global warming for the IPCC assessment.
But I don’t think that scientists do that.

August 19, 2010 7:05 am

The image caption in the NCAR press release states: “Modeling climate’s complexity. This image, taken from a larger simulation of 20th century climate, depicts several aspects of Earth’s climate system.”
What does it mean that the image shows Greenland without its ice cover? Is Greenland’s ice cover not one of the aspects of Earth’s climate system?
I wonder what their depiction of the Antarctic shows. Have they done away with the Antarctic ice cap, too?

Enneagram
August 19, 2010 7:10 am

This says it all: “The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC).”
Assessment:
Definition: assignment of a fee or amount.
Hands up! Don’t move!