New computer model advances climate change research
From an NCAR/UCAR press release
BOULDER—Scientists can now study climate change in far more detail with powerful new computer software released by the National Center for Atmospheric Research (NCAR).
The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC). The CESM is the latest in a series of NCAR-based global models developed over the last 30 years. The models are jointly supported by the Department of Energy (DOE) and the National Science Foundation, which is NCAR’s sponsor.
Scientists and engineers at NCAR, DOE laboratories, and several universities developed the CESM.
The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:
- What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?
- How will patterns in the ocean and atmosphere affect regional climate in coming decades?
- How will climate change influence the severity and frequency of tropical cyclones, including hurricanes?
- What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?
The CESM is one of about a dozen climate models worldwide that can be used to simulate the many components of Earth’s climate system, including the oceans, atmosphere, sea ice, and land cover. The CESM and its predecessors are unique among these models in that they were developed by a broad community of scientists. The model is freely available to researchers worldwide.
“With the Community Earth System Model, we can pursue scientific questions that we could not address previously,” says NCAR scientist James Hurrell, chair of the scientific steering committee that developed the model. “Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”
Scientists rely on computer models to better understand Earth’s climate system because they cannot conduct large-scale experiments on the atmosphere itself. Climate models, like weather models, rely on a three-dimensional mesh that reaches high into the atmosphere and into the oceans. At regularly spaced intervals, or grid points, the models use laws of physics to compute atmospheric and environmental variables, simulating the exchanges among gases, particles, and energy across the atmosphere.
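As a rough illustration of the grid-point approach (and nothing more than that), the toy sketch below applies a simple energy-balance rule at every cell of a coarse latitude-longitude mesh and steps it forward in time. The resolution, albedo, greenhouse factor, and heat capacity are invented for illustration and bear no relation to CESM's actual numerics.

```python
import numpy as np

N_LAT, N_LON = 36, 72            # ~5-degree cells (assumed resolution)
SOLAR = 1361.0                   # solar irradiance at the top of the atmosphere, W/m^2
ALBEDO = 0.3                     # planetary albedo (assumed constant everywhere)
GREENHOUSE = 0.6                 # crude factor reducing outgoing longwave radiation
SIGMA = 5.67e-8                  # Stefan-Boltzmann constant, W/(m^2 K^4)
HEAT_CAPACITY = 4.0e8            # effective heat capacity, J/(m^2 K)
DT = 86400.0                     # one-day time step, seconds

lat = np.linspace(-87.5, 87.5, N_LAT)
# Annual-mean insolation falls off with latitude (a very rough approximation).
insolation = (SOLAR / 4.0) * np.cos(np.radians(lat))[:, None] * np.ones((N_LAT, N_LON))

temp = np.full((N_LAT, N_LON), 288.0)   # start every cell near the observed global mean, K

def step(temp):
    """Advance every grid cell one time step of the toy energy balance."""
    absorbed = (1.0 - ALBEDO) * insolation        # shortwave energy absorbed
    emitted = GREENHOUSE * SIGMA * temp**4        # longwave energy escaping to space
    return temp + DT * (absorbed - emitted) / HEAT_CAPACITY

for _ in range(365):                              # integrate one model year
    temp = step(temp)

print(f"Toy global-mean temperature after one year: {temp.mean():.1f} K")
```

A real climate model replaces this single equation with full fluid dynamics, radiation, and moisture physics at every grid point, which is why it needs a supercomputer rather than a laptop.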
Because climate models cover far longer periods than weather models, they cannot include as much detail. Thus, climate projections appear on regional to global scales rather than local scales. This approach enables researchers to simulate global climate over years, decades, or millennia. To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.
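The hindcast check described above amounts to a few lines of arithmetic. In the sketch below the "observed" and "simulated" series are made-up placeholders standing in for a real observational record and real model output; only the scoring logic is the point.

```python
import numpy as np

years = np.arange(1980, 2011)

# Placeholder global-mean temperature anomalies in degrees C; a real verification
# would load an observational dataset and the model's hindcast for the same years.
observed  = 0.016 * (years - 1980) + 0.10 * np.sin(0.5 * years)
simulated = 0.018 * (years - 1980) + 0.05

bias = np.mean(simulated - observed)                  # systematic offset of the hindcast
rmse = np.sqrt(np.mean((simulated - observed) ** 2))  # typical size of the error
corr = np.corrcoef(simulated, observed)[0, 1]         # does it track the year-to-year shape?

print(f"bias = {bias:+.3f} C, rmse = {rmse:.3f} C, correlation = {corr:.2f}")
```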
A broader view of our climate system
The CESM builds on the Community Climate System Model, which NCAR scientists and collaborators have regularly updated since first developing it more than a decade ago. The new model enables scientists to gain a broader picture of Earth’s climate system by incorporating more influences. Using the CESM, researchers can now simulate the interaction of marine ecosystems with greenhouse gases; the climatic influence of ozone, dust, and other atmospheric chemicals; the cycling of carbon through the atmosphere, oceans, and land surfaces; and the influence of greenhouse gases on the upper atmosphere.
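One way to picture these coupled influences is a central loop in which separate component models exchange a few fields each step. The sketch below is purely schematic: the component classes, the carbon-cycle numbers, and the exchanged fields are invented for illustration and do not correspond to CESM's actual coupler interfaces.

```python
class Atmosphere:
    def __init__(self):
        self.co2_ppm = 390.0                      # rough present-day starting point
    def step(self, ocean_uptake, land_uptake, emissions=9.0):
        # Crude carbon bookkeeping: emissions minus what the ocean and land absorb,
        # converted from GtC to ppm with an approximate factor of 0.47 ppm/GtC.
        self.co2_ppm += 0.47 * (emissions - ocean_uptake - land_uptake)
        return self.co2_ppm

class Ocean:
    def step(self, co2_ppm):
        # Ocean carbon uptake assumed to grow weakly with atmospheric CO2.
        return 2.0 + 0.002 * (co2_ppm - 280.0)

class Land:
    def step(self, co2_ppm):
        # Land sink assumed to respond to CO2 "fertilization".
        return 2.5 + 0.001 * (co2_ppm - 280.0)

def run_coupler(n_years=10):
    atm, ocn, lnd = Atmosphere(), Ocean(), Land()
    ocean_uptake = land_uptake = 0.0
    for year in range(n_years):
        co2 = atm.step(ocean_uptake, land_uptake)   # atmosphere sees last year's sinks
        ocean_uptake = ocn.step(co2)                # ocean responds to the new CO2 level
        land_uptake = lnd.step(co2)                 # land responds to the new CO2 level
        print(f"year {year}: CO2 = {co2:.1f} ppm, "
              f"ocean sink = {ocean_uptake:.2f} GtC, land sink = {land_uptake:.2f} GtC")

run_coupler()
```

A real coupler exchanges many more fields (heat, momentum, fresh water, chemistry) and runs the components in parallel, but the round-robin structure conveys the basic idea.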
In addition, an entirely new representation of atmospheric processes in the CESM will allow researchers to pursue a much wider variety of applications, including studies of air quality and biogeochemical feedback mechanisms.
Scientists have begun using both the CESM and the Community Climate System Model for an ambitious set of climate experiments to be featured in the next IPCC assessment reports, scheduled for release during 2013–14. Most of the simulations in support of that assessment are scheduled to be completed and publicly released beginning in late 2010, so that the broader research community can complete its analyses in time for inclusion in the assessment. The new IPCC report will include information on regional climate change in coming decades.
Using the CESM, Hurrell and other scientists hope to learn more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation, which affect sea surface temperatures as well as atmospheric conditions. Such knowledge, Hurrell says, can eventually lead to forecasts spanning several years of potential weather impacts, such as a particular region facing a high probability of drought, or another region likely facing several years of cold and wet conditions.
“Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible, human-influenced climate change,” Hurrell says. “CESM will be a major tool to address such questions.”
It would be normal to take the predictions of past models from 10 or 20 years ago and measure them against what actually happened, then explain why the results came out as they did and backcast with the new model using the original input parameters from x years ago.
Basic sanity check.
Cold Englishman says:
August 19, 2010 at 2:01 am
Does this mean that all previous models were wrong?
____________________________________________________________
Now that is the question. And of course it does not need an answer. And how much $ did they cost us? Billions?
And are we still trying to save the planet because these failed models told us that we were all gonna die?
The only model I like is Claudia Schiffer.
To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.
“And then, when the models are shown to overestimate actually observed temperatures, we get our friends at GISS to make the past conditions colder so the modelled warming DOES fit the actual observation. Simple.”
The author of this PR was very likely in the used car business last week. Caveat Emptor!
No doubt it will prove many things the IPCC is interested in. Bon Voyage!
One day a SuperDuperComputer will gobble up all the data there is and we won't need TV weather forecasters to tell us the likelihood of rain this afternoon at Ayers Rock.
The source code is available from the NCAR site here:
http://www.cesm.ucar.edu/models/cesm1.0/
So, they are doing this in open source mode – a big round of applause for that.
If anyone wants to see the details of how the model is built, it's all there. You just need to understand both Fortran and C (to which I say: yuk!! – and I've programmed using both of these beauties in my time).
Instead of simply criticising, it would be good if folk from WUWT with relevant skills could take a look at the model and make an assessment of its approach – both the good points and the bad.
Science has yet to figure out that the equator of the sun gives off more heat than the upper and lower hemispheres of the sun due to centrifugal force.
The position of our planet relative to the sun is very important to the amount of heat the planet absorbs from what the sun gives off.
Are these observations included in the models?
“Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible, human-influenced climate change,” Hurrell says. “CESM will be a major tool to address such questions.”
============================
What is it in human influence on the climate that makes it irreversible? Is it because we are fallen angels?
[REPLY: because ending all CO2 emissions requires we go back to a pre-industrial agrarian feudal existence, which just happens to be sustainable only if 2/3 of the population of the planet disappeared suddenly. So, who decides which 2/3? Same people telling you it’s all your fault, so guess who they pick? – mike]
Computer games are not experiments.
Does that mean it is open-source? Is all the code “freely available”? Are all the built-in parameters clearly stated? Are the assumptions used to create the parameters fully variable, so their values can be freely changed by scientists who are not programmers? Are the conditions under which new parameters can be added clearly specified? Is the model made available with all the datasets it draws upon, and are all sources of and modifications to that data “freely available” and transparent?
If not, why not? This model has been created with taxpayer money, so any researcher who wants to play with it should have complete and unfettered access.
Perhaps some of the experts who frequent this blog could look into these questions. Why should only climate alarmists be privy to the internal construction of such simulations?
/Mr Lynn
“The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming…”
In other words, the output is predetermined.
“So is this the next version of the Sims? And when can I get a copy at Walmart?”
Nope. Try NHL 2011 (Lots of hockey sticks!)
Weather patterns are too complex to extend out to decades and centuries, they say; so what do they do but omit all the details that make the weather apparatus work, call it a day, pat their backs, cross their thumbs, and hope the mention of global warming will pay for the updates.
What do people generally call a product that has been abstracted to the point of lunacy but homeopathy? So I welcome the Climate Homeopathy Soft Apparatus, CESM, into the daylight.
And did they name it community to attract communists or just lefties in general? O_o
When I posted my comment above, I had not seen this:
Ditto. Perhaps a team of knowledgeable folk could work together to provide an independent assessment of the strengths and weaknesses of the model—and then let the world know. It’s time the media had input from more than one side.
/Mr Lynn
I see they use neural networks in CESM’s CAM module. See the excuses for using this technology at http://www.cesm.ucar.edu/events/ws.2010/Presentations/AMWG/tribbia.pdf
To refute any claims that these models are based on physics, you only need to point to this use of neural networks.
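For readers who have not met the technique: the usual motivation is to train a network to emulate an expensive physics parameterization so the model can call a cheap approximation instead. The generic sketch below shows the idea with a one-hidden-layer network fit to a made-up "expensive" function; it is not taken from CAM, and the function, network size, and training settings are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_scheme(x):
    """Stand-in for a costly physics parameterization (e.g. a radiation scheme)."""
    return np.sin(3.0 * x) + 0.5 * x**2

# Training data: sample the expensive scheme over its input range.
X = rng.uniform(-1.0, 1.0, size=(2000, 1))
y = expensive_scheme(X)

# One-hidden-layer network trained by plain full-batch gradient descent.
W1 = rng.normal(0, 0.5, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)
lr = 0.05

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)                     # hidden layer
    pred = h @ W2 + b2                           # network output
    err = pred - y
    # Backpropagate the mean-squared-error gradient through both layers.
    dW2 = h.T @ err / len(X); db2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Compare the trained emulator against the original scheme at a few test points.
x_test = np.linspace(-1, 1, 5).reshape(-1, 1)
emulated = np.tanh(x_test @ W1 + b1) @ W2 + b2
print(np.column_stack([expensive_scheme(x_test), emulated]))
```

The trade-off is speed for fidelity: the emulator is only as good as the training data the original scheme provided.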
Every result needs to be prefaced by a statement such as:
“Based on climate conditions X,Y,Z and the assumption that these conditions will continue, the models show……….”
Anything less will be hogwash.
Computer models? “Bah!..Humbug!” I say! And “Bah!” again. That’s the problem with the world outside our Village – computers! What use have we of these machines? Modelling weather, modelling proteins, businesses, engineering…and now climate!!
These ‘scientists’ with phoney titles pretending to know something about the climate, trying to indoctrinate us with their lies!! We know we can learn of matters climatical here, safe in our Village. Why, the Oracle Goddard is living proof that no so-called ‘qualifications’ are needed to possess all geophysical knowledge and beat these people at their own game.
Keep faith, the Great Cooling is fast approaching…
Ammonite says:
August 19, 2010 at 4:02 am
“UK Sceptic says: August 19, 2010 at 1:04 am
‘What part of global warming has stopped do these people not understand?’
The part where global temperature is at record or near record levels across the last twelve months in all the major temperature measuring systems…”
There you go disagreeing with Phil Jones, who says there’s been no warming in the past decade.
These record levels would be from the mangled/homogenised/completely discredited temperature records used by CRU, GISS etc., which have been inexplicably adjusted upwards, rather than downwards, to account for UHI effects, as pointed out in countless posts on this site.
Now I understand why it is necessary to continually adjust and readjust historical data.
The point made that all the modeling in the world is worthless if the basic data is corrupted, due to the GIGO factor, is spot on.
It is heartening, at least, to see a refreshing appearance of openness with this latest attempt. Despite trepidation about the difficulties of attempting to model a dynamic, chaotic system, I think it’s a bit unseemly to automatically shoot from the hip with aspersions about the designers’ and developers’ intentions. It’s likely that most of those rocks are being tossed by folks who won’t be doing what is done best around here, which is to roll up the sleeves and dive into the eye-watering, brain-pounding lines of code to validate its accuracy or utility, insofar as that is relevant, even though a number of folks have expressed the opinion that such is a quixotic quest.
Full disclosure: I’ll admit that upon seeing this at physorg.com yesterday, my initial reaction was to cast the write-up in the same voice and timing as Dan Aykroyd selling the Bass-o-matic. Still, it does appear, prima facie, to be a good-faith attempt toward further understanding. Instead of chucking rocks at the folks attempting it, it would probably be best to become as cognizant as possible about both the strengths (if there are any) and the pitfalls of this effort, which, according to the press release, is on track to be a significant factor, or “forcing” if I may borrow the phrase, in the continued deliberations and work of the IPCC, which, like it or not, isn’t going away any time soon and will be relied upon, rightly or wrongly, by policymakers and politicians worldwide. Simply dismissing it will not prepare anyone to rationally engage in the discussions that are sure to continue on this topic, given the import it has to society, at least from a political aspect, even if you believe that the planet is just gonna do what the planet is just gonna do, regardless of silly human shenanigans.
They are going to need two computers as this one is only built for the Northern hemisphere!
“The figure also captures ……. including warmer air moving north on the eastern side of low-pressure regions and colder air moving south on the western side of the lows.”
GIGO
Is that an expression of progress in science or, rather, of progressive science?
As all of us who used computers used to say: Garbage In, Garbage Out. It was that way when I got my PhD in EE from MIT in the ’70s, and it has not changed, nor will it ever.
“Because climate models cover far longer periods than weather models, they cannot include as much detail.”
Now if this isn’t the total opposite, I don’t know what is.
You would need even more “detail” in a climate model.
The further out you take it, the more it’s going to amplify any mistakes or anything you leave out.
Does the model include natural phenomena such as volcanoes exploding? What about the much-hyped clouds? Or is this just a simple model where you input temp into a CO2 model and it tells you the earth will heat up and up till we all cook?
I can understand modelling a building before it is built, as we understand the physics behind it, but modelling a planet’s future climate without understanding the climate you are trying to model? Sorry, but computers are nice, useful tools; they cannot do the work of modelling a climate on their own. It’s the human input that does this, and if you don’t fully understand something, how can you model it?
I’d do a simple test: model 2005–2015 and then see how it gets on. My guess is the finger-in-the-air method may be as accurate.