New computer model advances climate change research
From an NCAR/UCAR press release
BOULDER—Scientists can now study climate change in far more detail with powerful new computer software released by the National Center for Atmospheric Research (NCAR).
The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC). The CESM is the latest in a series of NCAR-based global models developed over the last 30 years. The models are jointly supported by the Department of Energy (DOE) and the National Science Foundation, which is NCAR’s sponsor.
Scientists and engineers at NCAR, DOE laboratories, and several universities developed the CESM.
The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:
- What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?
- How will patterns in the ocean and atmosphere affect regional climate in coming decades?
- How will climate change influence the severity and frequency of tropical cyclones, including hurricanes?
- What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?
The CESM is one of about a dozen climate models worldwide that can be used to simulate the many components of Earth’s climate system, including the oceans, atmosphere, sea ice, and land cover. The CESM and its predecessors are unique among these models in that they were developed by a broad community of scientists. The model is freely available to researchers worldwide.
“With the Community Earth System Model, we can pursue scientific questions that we could not address previously,” says NCAR scientist James Hurrell, chair of the scientific steering committee that developed the model. “Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”
Scientists rely on computer models to better understand Earth’s climate system because they cannot conduct large-scale experiments on the atmosphere itself. Climate models, like weather models, rely on a three-dimensional mesh that reaches high into the atmosphere and into the oceans. At regularly spaced intervals, or grid points, the models use laws of physics to compute atmospheric and environmental variables, simulating the exchanges among gases, particles, and energy across the atmosphere.
Because climate models cover far longer periods than weather models, they cannot include as much detail. Thus, climate projections appear on regional to global scales rather than local scales. This approach enables researchers to simulate global climate over years, decades, or millennia. To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.
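The grid-point scheme described above can be caricatured in a few lines of code. Below is a toy one-dimensional energy-balance sketch (purely illustrative, with invented numbers and none of CESM's actual physics) showing the basic structure: state values on a grid, advanced step by step by exchanges with neighbouring points.

```python
# Toy illustration of the grid-point approach: a one-dimensional
# "energy balance" over a handful of latitude bands. All numbers are
# invented for demonstration and bear no relation to CESM's physics.

def step(temps, forcing=0.0, diffusion=0.1):
    """Advance the grid one time step.

    Each grid point relaxes toward its neighbours (a crude stand-in
    for heat transport) and receives a uniform external forcing.
    """
    n = len(temps)
    new = []
    for i in range(n):
        left = temps[max(i - 1, 0)]
        right = temps[min(i + 1, n - 1)]
        transport = diffusion * (left + right - 2 * temps[i])
        new.append(temps[i] + transport + forcing)
    return new

def run(temps, steps, forcing=0.0):
    """Step the grid forward repeatedly and return the final state."""
    for _ in range(steps):
        temps = step(temps, forcing)
    return temps

# Equator warm, poles cold; diffusion flattens the gradient over time.
grid = [-20.0, 0.0, 25.0, 0.0, -20.0]
print(run(grid, steps=200))
```

Real models do this in three dimensions, with full physical parameterizations at each of millions of grid points; the point here is only the overall structure.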
A broader view of our climate system
The CESM builds on the Community Climate System Model, which NCAR scientists and collaborators have regularly updated since first developing it more than a decade ago. The new model enables scientists to gain a broader picture of Earth’s climate system by incorporating more influences. Using the CESM, researchers can now simulate the interaction of marine ecosystems with greenhouse gases; the climatic influence of ozone, dust, and other atmospheric chemicals; the cycling of carbon through the atmosphere, oceans, and land surfaces; and the influence of greenhouse gases on the upper atmosphere.
In addition, an entirely new representation of atmospheric processes in the CESM will allow researchers to pursue a much wider variety of applications, including studies of air quality and biogeochemical feedback mechanisms.
Scientists have begun using both the CESM and the Community Climate System Model for an ambitious set of climate experiments to be featured in the next IPCC assessment reports, scheduled for release during 2013–14. Most of the simulations in support of that assessment are scheduled to be completed and publicly released beginning in late 2010, so that the broader research community can complete its analyses in time for inclusion in the assessment. The new IPCC report will include information on regional climate change in coming decades.
Using the CESM, Hurrell and other scientists hope to learn more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation, which affect sea surface temperatures as well as atmospheric conditions. Such knowledge, Hurrell says, can eventually lead to forecasts spanning several years of potential weather impacts, such as a particular region facing a high probability of drought, or another region likely facing several years of cold and wet conditions.
“Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible, human-influenced climate change,” Hurrell says. “CESM will be a major tool to address such questions.”
Or should we read it as:
The Communist Earth System Model (CESM) will be one of the primary climate models used for the next assessment of Global Governance.
“The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:
* What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?…”
If you start with a bias – What impact will warming temperatures have – then the model is absolute garbage because of the scientists’ built-in prejudices.
There is a darn good chance we are looking at a cooling trend. And a cooling trend is going to be a lot more damaging than a warming trend. The CAGW type scientists KNOW there is a possibility of a COOLING trend.
Item number four over at Joe Romm’s Climate Progress states:
“4. Absent human emissions, we’d probably be in a slow long-term cooling trend due primarily by changes in the Earth’s orbit — see Human-caused Arctic warming overtakes 2,000 years of natural cooling, “seminal” study finds”
He is talking about the Milankovitch cycles and the fact the sun’s apparent energy has declined by 9% since the beginning of the Holocene.
This paper puts it a little more bluntly: Lesson from the past: present insolation minimum holds potential for glacial inception (2007)
“Because the intensities of the 397 ka BP and present insolation minima are very similar, we conclude that under natural boundary conditions the present insolation minimum holds the potential to terminate the Holocene interglacial. Our findings support the Ruddiman hypothesis [Ruddiman, W., 2003. The Anthropogenic Greenhouse Era began thousands of years ago. Climate Change 61, 261–293], which proposes that early anthropogenic greenhouse gas emission prevented the inception of a glacial that would otherwise already have started….”
So why is that possibility not mentioned as a problem to be investigated? The slide into an Ice Age has got to be a lot more “catastrophic” than a minor amount of warming, and it sure would be nice to have an idea of when that is going to happen, as it inevitably will.
“Scientists can now study climate change” using “The Community Earth System Model”
How
Can
Millions
Of
People
Not
See
The
Problem
With
That
Problems (a short list):
Resolution still not adequate.
Is the water cycle heat engine included?
Are the solar influences included? The 80 and 200 year cycles (a minimum approaches)?
ENSO, El Niño, La Niña?
The Earth’s tilt as it varies?
And, it appears that even the Jet Stream should be in there.
ShrNfr says:
August 19, 2010 at 5:43 am
> As all of us who used computers used to say Garbage In, Garbage Out. It was that way when I got my PhD in EE from MIT in the 70s and has not, nor will it ever, change.
Sure it has, now it’s Garbage In, Gospel Out. (Actually, that was a CMU observation from the 70s.)
——-
More seriously, I think this model has gotten too little attention at WUWT. It’s been readily available as source for years but has been pretty much ignored here.
Mike Edwards notes “Instead of simply criticising, it would be good if folk from WUWT with relevant skills could take a look at the model and make an assessment of its approach – both the good points and the bad.”
I agree with the sentiment, but given the tenor of comments here I think any rational analysis would be drowned out by knee jerk criticism. There’s over 100 comments here – about a dozen have something that adds to understanding (this comment is not one of them). The rest make it hard to find the useful ones.
If I had time to dig into this, I’d be much more inclined to do it as a project at http://chiefio.wordpress.com/ and report back here with guest posts.
Apparently the model can run on personal computers; per recent reports (don’t see the link offhand), modeling the American Southwest was done with that code on a smallish computer.
Modeling is fun. We have models of just about everything. In the end, they are what they are: models; a pretend representation, an approximation of realism. We can walk away from flight sims, and battles sims…but climate modellers want us to live the fear they model.
Typo in last paragraph, but found the link I wanted, http://wattsupwiththat.com/2010/07/09/modeling-the-big-toasty/#comment-426718
[SNIP: Citing biblical verse is not an argument… – mike]
starzmom says:
August 19, 2010 at 5:27 am
Now I understand why it is necessary to continually adjust and readjust historical data.
Yes, the main reason behind Mann’s hockey stick has always been to provide a basis for claiming that CO2 is the cause of warming.
I wonder what historic record these models are now using? I’d bet they still use the hockey stick which makes them completely invalid.
As for improvements … how does taking something from .00000000001% accuracy up to .0000000001% accuracy make any difference whatsoever? It’s still no more valid than a wild guess.
How can there ever be any value in using an incomplete model to make predictions? It’s just plain silly.
——————
“Modellers have an inbuilt bias towards forced climate change because the causes and effect are clear.” Gavin Schmidt, Michael Mann et al. Source [pdf] 2004
——————
So, badly placed thermometers, guessing temps for unmonitored regions, inbuilt bias of modellers, unknown unknowns / new discoveries, contradictory evidence, politicization of science, lucrative funding etc. and they expect their model to predict what the climate will be like at the end of the century. We have had enough already of their failed predictions and IPCC forecasts.
Household garbage in, utter rubbish out.
This means Canadian smoothing is going to get decent billing in the credits for The Sims: Climate Edition? After playing such an important part in the last game’s outcome, we won’t settle for being just above the grips.
I see the source code is available here:
http://www.cesm.ucar.edu/models/cesm1.0/cesm/cesmBbrowser/
It’s written in f90 (fortran 90).
The latest model reminds me of upgrading from XP to Vista – same crap different wallpaper
Foly Hund,
“Modeling is fun. We have models of just about everything. In the end, they are what they are: models; a pretend representation, an approximation of realism. We can walk away from flight sims, and battles sims…but climate modellers want us to live the fear they model.”
The ELOC on this thing won’t be any better than every other program out there. The more complex, the worse things are and if there’s seriously wonky biasing, it’ll be buried deep in some obscure check with an imaginative injection vector. Fun fun fun finding that by hand!
For more information on the program structure, see also:
http://www.ccsm.ucar.edu/models/
The importance of this technological advancement cannot be overstated.
The day has apparently arrived when highly trained and specialized individuals can look into a silicon-based device and forecast the past, look into the present, and forecast the future.
These insights gained by utilizing this cutting-edge silicon processing technology may be used to inform and advise policy makers on important decisions in diverse areas regarding taxation, wealth distribution, land use, power consumption, etc.
I suppose the difference between this and the ancient art of scrying into a crystal ball is that these are not Druids, they are climate scientists. And critically, the implement used is not composed of flawless SiO2, but it is based on Si. These differences cannot be emphasized enough.
I see no working group on data collection and quality assessment?
How can they possibly initialise a chaotic model without a clear definition of system state?
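The initialisation worry can be illustrated with the classic Lorenz (1963) toy system (three variables, nothing to do with CESM's actual code): nudge the starting state by one part in a million and the two trajectories soon bear no resemblance to each other.

```python
# Lorenz (1963) system: a three-variable demonstration of why a chaotic
# model's output depends critically on its initial state.
# (Illustrative only; this is not CESM code.)

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def max_divergence(s1, s2, steps):
    """Largest separation between two trajectories over the run."""
    worst = 0.0
    for _ in range(steps):
        s1, s2 = lorenz_step(s1), lorenz_step(s2)
        d = sum((u - v) ** 2 for u, v in zip(s1, s2)) ** 0.5
        worst = max(worst, d)
    return worst

# Identical except for a one-in-a-million nudge to the second state.
sep = max_divergence((1.0, 1.0, 1.0), (1.0, 1.0, 1.000001), steps=3000)
print(f"max separation: {sep:.1f}")
```

Weather forecasts lose skill within days for exactly this reason; climate projections are usually defended as statements about long-run statistics rather than specific trajectories, which is why the initial-state question gets less attention there.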
GIGO == Garbage In; Grants Out.
Seems to me like this will be a model that will just project continents as warm, even if there will be a lot of cold in the continent, because the model can only pick up regions, and it’s going to be used by the IPCC, who will try to cherry-pick the data to say it’s warmer than it is.
Similar situation to what happened this summer in Russia: Moscow and St. Petersburg were really warm, though a lot of the nation was also extremely cold. But no one focuses on the cold, just the warmth, and then proclaims the whole area is really warm, when in fact it isn’t.
Tommy says:
August 19, 2010 at 9:12 am
I see the source code is available here:
http://www.cesm.ucar.edu/models/cesm1.0/cesm/cesmBbrowser/
It’s written in f90 (fortran 90).
—
A quick look at the source code shows me that it’s MUCH better written than GISS Model E…and they have real documentation.
I totally support the effort here. I totally disagree with the merit many “scientists” give its results, which are simply speculative returns based on inaccurate input (what exactly were the aerosol concentrations across the earth in 1947?).
Nice press release, sounds like it is perfect, I’m interested in:
* What do the modelers feel are the greatest challenges of current modeling? How do they plan to overcome these?
* What do the modelers (not their boss) say about the reliability of the current model, from a high level perspective?
* Does it properly model the last decade’s “pause” in warming? What was changed in the model to allow this to happen?
* Does it predict warming at the same rate as previously? What is the estimated CO2 forcing? Why should we have to wait for an IPCC report for this result?
* If it can predict droughts, when, where, and how long are the next several going to happen?
* What do the modelers believe would be the earliest indications that their model is not performing as well as they wish? What unique characteristic do they look for as an early indicator of performance?
* Does it predict a tropospheric hot spot as previous models did (which has been shown not to be accurate)? What was changed to allow this?
* Does this model show it is worse than we thought?! (Just kidding)
Note: I don’t believe that they haven’t done runs on this model, or that they don’t know how it performs in general. Call me a cynic, but I say it would not be released if it had not been shown to give “correct” results.
I agree with a previous poster that taking a model that is not known to work very well and adding additional layers of complexity is not likely to encounter success.
As much as the modelers want to hide it, these models are highly tuned so that they can backcast previous climate with some degree of accuracy, which is fine. It is this “hand” tuning and the inaccuracy of the climate input / starting conditions that make the future predictions speculative at best. Claims of “it’s just physics” stretch credibility.
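The “tuned to backcast” point is easy to caricature: fit one free parameter so a trivial model reproduces a historical series, and the hindcast can look excellent regardless of whether the parameter means anything physically. A sketch with entirely made-up numbers:

```python
# Cartoon of the "tuned to backcast" point: one free parameter fitted
# so a trivial model reproduces a historical series. The fit can be
# excellent while saying nothing about whether the parameter is
# physically right. All data here are invented.

# Made-up "observed" anomalies and a made-up forcing series.
observed = [0.00, 0.05, 0.12, 0.15, 0.24, 0.28, 0.33, 0.41]
forcing  = [0.0,  0.1,  0.25, 0.3,  0.5,  0.55, 0.65, 0.8]

def model(sensitivity):
    """Trivial model: response is sensitivity times forcing."""
    return [sensitivity * f for f in forcing]

# Least-squares estimate of the single tunable parameter.
num = sum(o * f for o, f in zip(observed, forcing))
den = sum(f * f for f in forcing)
sensitivity = num / den

hindcast = model(sensitivity)
rmse = (sum((o - h) ** 2
            for o, h in zip(observed, hindcast)) / len(observed)) ** 0.5
print(f"fitted sensitivity = {sensitivity:.3f}, hindcast RMSE = {rmse:.4f}")
```

A near-perfect hindcast here is guaranteed by construction, which is the commenter's point: matching the past with tuned parameters is a weak test of predicting the future.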
Oh good. A quicker computer to do stupid things faster.
Can they run the CET record backwards?
http://www.climate4you.com/images/CentralEnglandTempSince1659%20880pixel.gif
Wake me up, if the model will match it.
Ric Werme says:
August 19, 2010 at 7:54 am
______________________
Can you provide a link to a documented source? I base my criticism of carbon-control models on my inability to find any mention of the sensitivity of CO2 to temperature in the IPCC AR4 or in any paper that purports to have model output similar to the empirical record in phase, amplitude and spectrum.
Sorry, I should have read the other comments as well, as there are links aplenty. I’ll look for sensitivities.
Hmm, I wonder if they got the data for this model from NOAA-16?