New computer model advances climate change research
From an NCAR/UCAR press release
BOULDER—Scientists can now study climate change in far more detail with powerful new computer software released by the National Center for Atmospheric Research (NCAR).
The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC). The CESM is the latest in a series of NCAR-based global models developed over the last 30 years. The models are jointly supported by the Department of Energy (DOE) and the National Science Foundation, which is NCAR’s sponsor.
Scientists and engineers at NCAR, DOE laboratories, and several universities developed the CESM.
The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:
- What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?
- How will patterns in the ocean and atmosphere affect regional climate in coming decades?
- How will climate change influence the severity and frequency of tropical cyclones, including hurricanes?
- What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?
The CESM is one of about a dozen climate models worldwide that can be used to simulate the many components of Earth’s climate system, including the oceans, atmosphere, sea ice, and land cover. The CESM and its predecessors are unique among these models in that they were developed by a broad community of scientists. The model is freely available to researchers worldwide.
“With the Community Earth System Model, we can pursue scientific questions that we could not address previously,” says NCAR scientist James Hurrell, chair of the scientific steering committee that developed the model. “Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”
Scientists rely on computer models to better understand Earth’s climate system because they cannot conduct large-scale experiments on the atmosphere itself. Climate models, like weather models, rely on a three-dimensional mesh that reaches high into the atmosphere and into the oceans. At regularly spaced intervals, or grid points, the models use laws of physics to compute atmospheric and environmental variables, simulating the exchanges among gases, particles, and energy across the atmosphere.
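The grid-point computation described above can be illustrated with a toy sketch. This is not CESM code (CESM itself is a large, community-developed model); it is a deliberately minimal one-dimensional diffusion step, the simplest analogue of applying a law of physics at regularly spaced grid points, with all names and numbers chosen purely for illustration.

```python
# Toy analogue of the grid-point approach: a 1-D finite-difference
# diffusion step. NOT CESM code; purely illustrative.

def diffusion_step(temps, alpha=0.1):
    """Advance a row of grid-cell temperatures one time step.

    Each interior cell exchanges heat with its two neighbours
    (explicit finite-difference form of the heat equation);
    the end cells are held fixed as boundary conditions.
    """
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + alpha * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
    return new

# A hot spot in the middle of an otherwise cold row of cells:
row = [0.0, 0.0, 10.0, 0.0, 0.0]
for _ in range(3):
    row = diffusion_step(row)
# After a few steps the heat has spread symmetrically into the
# neighbouring cells and the central peak has flattened.
```

A real climate model does the same kind of neighbour-to-neighbour update, but in three dimensions, for many coupled variables, over the whole globe.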
Because climate models cover far longer periods than weather models, they cannot include as much detail. Thus, climate projections appear on regional to global scales rather than local scales. This approach enables researchers to simulate global climate over years, decades, or millennia. To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.
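The hindcast check described in that last sentence (simulate past conditions, then compare the results with observations) can be sketched as a simple comparison score. RMSE is one common metric for this kind of model-versus-observation comparison, not necessarily the one NCAR uses, and the numbers below are made up for illustration.

```python
import math

def rmse(model, observed):
    """Root-mean-square error between a hindcast and the observed record."""
    assert len(model) == len(observed)
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, observed)) / len(model))

# Hypothetical decadal-mean temperature anomalies (degrees C) for a
# past period; these values are invented for illustration, not real data.
observed = [-0.1, 0.0, 0.1, 0.25, 0.4]
hindcast = [-0.05, 0.05, 0.1, 0.2, 0.45]

score = rmse(hindcast, observed)
# A smaller RMSE means the hindcast tracks the observed record more closely.
```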
A broader view of our climate system
The CESM builds on the Community Climate System Model, which NCAR scientists and collaborators have regularly updated since first developing it more than a decade ago. The new model enables scientists to gain a broader picture of Earth’s climate system by incorporating more influences. Using the CESM, researchers can now simulate the interaction of marine ecosystems with greenhouse gases; the climatic influence of ozone, dust, and other atmospheric chemicals; the cycling of carbon through the atmosphere, oceans, and land surfaces; and the influence of greenhouse gases on the upper atmosphere.
In addition, an entirely new representation of atmospheric processes in the CESM will allow researchers to pursue a much wider variety of applications, including studies of air quality and biogeochemical feedback mechanisms.
Scientists have begun using both the CESM and the Community Climate System Model for an ambitious set of climate experiments to be featured in the next IPCC assessment reports, scheduled for release during 2013–14. Most of the simulations in support of that assessment are scheduled to be completed and publicly released beginning in late 2010, so that the broader research community can complete its analyses in time for inclusion in the assessment. The new IPCC report will include information on regional climate change in coming decades.
Using the CESM, Hurrell and other scientists hope to learn more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation, which affect sea surface temperatures as well as atmospheric conditions. Such knowledge, Hurrell says, can eventually lead to forecasts spanning several years of potential weather impacts, such as a particular region facing a high probability of drought, or another region likely facing several years of cold and wet conditions.
“Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible, human-influenced climate change,” Hurrell says. “CESM will be a major tool to address such questions.”
“What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?”
Good question.
I bet their glitzy new software isn’t going to give a definitive answer though.
The only climate they will study better is the virtual climate existing in the virtual world represented in the model:
* What impact will artificially warming temperatures have on the massive ice sheets programmed in the model?
* How do patterns in the virtual ocean and atmosphere make the regional climate evolve in the model?
* How will climate change influence the severity and frequency of tropical cyclones, including hurricanes, in the model?
* What are the effects of simulated tiny airborne particles, known as aerosols, on simulated clouds and temperatures?
So is this the next version of the Sims? And when can I get a copy at Walmart?
The big problem all these models face, the proverbial ‘elephant in the room’, is the possibility that the current global temperature measurement records are wrong.
If the way they measure global temperatures is inaccurate (given the recent admission that the satellite data is corrupted through degraded instruments, the way UHI is homogenized into the record is throwing it off to a false positive, and 60% of surface stations in colder locations have been closed in the last 30 years), then all that going back and checking the models against the real-world historical data will do is constantly reintroduce the problems they have had all along.
They cannot match today’s real-world temperatures in the models without adding an anthropogenic forcing. Well, what if today’s real-world temperatures in the record are out by half a degree centigrade on the warming side, due to the inaccuracies outlined in brackets above?
Then they are adding a false warming trend into the models to get them to match an inaccurate real-world record.
After all, there must be something wrong with the models if they have to add something extra to get them to match the historical record; but then all the forward projections are giving a falsely high rate of warming which has not been matched by the real world.
I love the ‘we can’t handle the details but our generalities are great!’ theme.
Carried to the logical extreme, the less they actually know about anything, the more certainly they can predict everything, then.
No wonder the mission seems so messianic – the fountainhead of truth is revelation.
They even have a J.C. figure doing the slumming thing, trying to heal the peasants.
I’m going to get out the O.E.D. and double check the definition of ‘sanity’. I’m just curious to see how thoroughly data has been adjusted.
So how about modeling cloud cover?
Benchmarking? How accurately can the model reproduce recent climate, for which there is good data?
Presumably such benchmarks exist – does anyone have a reference?
>”I bet their glitzy new software isn’t going to give a definitive answer though.”
No. But I bet it does give hockey sticks.
NYLO: The only climate they will study better is the virtual climate existing in the virtual world represented in the model:
Maybe they could incorporate “Farmville” into this virtual world and see if the virtual crops grow better with more virtual CO2.
Maybe they could work out how many inefficient windfarms we need to build to cope with the cooling areas.
So the conclusion is already drawn, temps are going up in their model no matter what the data says…just look at the unscientific questions they are asking.
Potentially daft question, chaps (and chapettes): when they release this model for use (if they have not already done so), do they specifically state all the modelling parameters? I.e. just what it includes in the model (oceans, currents, landmass, sun, GCRs, clouds, etc.).
I’ve always struggled to find what it is the models ACTUALLY model, but then I could just be being a chimp…
Cheers,
LM
We’ll have to see if this model can get the sensitivity to match the ERBE satellite value, and whether it insists on that hot spot above the tropics.
So where can I download a copy?
What part of “global warming has stopped” do these people not understand? Why do they keep wasting hard-working people’s tax dollars on technological crystal balls designed to line the pockets of sociopathic fools?
Now with the climate, the clouds pretty much determine the temperature. So if you can make a computer that can predict the future and determine every single cloud on earth for the next 100 years (you can’t even get it to do the next minute), you have a model that works. Also throw in a few volcanic eruptions, the undersea eruptions, the position of plankton in the oceans, how many cosmic rays are hitting the earth (which seed the clouds), and what the sun is doing, as it determines the amount of cosmic rays and the whole water cycle, including all the underwater rivers we don’t know about. Pretty much everything on and around the earth determines the temperature.
When they have a computer that does that, can I borrow it? I want it to predict the stock market and be rich.
Someone tell them that you can’t model a dynamic and chaotic system; there are some things you can’t digitise, and climate is one of them.
Quote:
“Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability”
I need to rub my eyes… natural variability. Yes, they did say it… I cannot believe the hubris in such a pompous statement.
I used to build sandcastles as a child and think maybe just maybe I could beat the sea, in the back of my mind knowing that it was a ‘bit silly’.
Last week I sat in the car taking in an overview (of the same small beach and cliffs), musing how difficult it would be to simply simulate the weather and model the local climate in this small locale. Well nigh impossible was the conclusion. And what about the fifty-mile column of atmosphere rising up to space? You cannot take one bit in isolation. Impossible.
We can point to trends. I rate Joe Bastardi; he is a great meteorologist IMHO. But model the climate? No, I don’t think so.
Even if you have trillions of floating-point operations per second, the earth can outdo that without ‘batting an eyelid’, and we are just talking local weather events, not future predictions or continental weather. Future climate generalisation is a moot exercise. When will these twerps get it into their heads?
Yes, a great academic adventure, but man should not take climate modellers’ ‘predictions’ as gospel. At the end of the day, all of it is only abstract numbers, man-made algorithms, best guesses.
Oh yeah, and another thing: we don’t even understand the (atmospheric) processes yet. All of the above article is an exercise in hope over scientific process, a shot in the dark. They would be better employed trying a medium to ‘get in touch’ with Nostradamus, or divining the tea leaves.
Whenever I hear the phrase; “models predict” my eyes glaze over.
Garbage in, Garbage out.
Doesn’t matter how good your computer and software are; G.I.G.O. still applies.
So, does your brand new 2011 model do clouds? If not, I’ll just put new tires on my old model and keep it on the road for a few more years.
Interesting however, isn’t this just a case of even more garbage in and even more garbage out?
Undoubtedly this model has been designed with a built in bias to ‘prove’ that the statements and publications of Mann/Hansen/Jones/Briffa/Gore etc are ‘correct’.
Undoubtedly this model will be so complex that few, if any, will be able to unravel exactly why its predictions for the real world are wrong.
Undoubtedly this model will be designed to ignore the effects of natural climate cycles and exaggerate the impact of human activity.
Undoubtedly the ‘findings’ of this model will be used by politicians to demonstrate why we all need to be taxed more.
Undoubtedly no sceptic will be allowed access to it, in order to ensure GIGO can become “the science is settled”.
“potentially irreversible, human-influenced climate change” – this says it all; the model has been designed with the sole intention of scaring the crap out of as many people as possible.
I wonder if this will be like with Zhang’s work, as in IDAO (Ice-Diminished Arctic Ocean). Since global warming is unequivocal, as well as diminishing global ice, there is no need for models that can show otherwise. Then the models made will not show otherwise, since there was no need for them to do so, no matter what real data is inputted into them.
“Using the CESM, Hurrell and other scientists hope to learn more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation, which affect sea surface temperatures as well as atmospheric conditions. Such knowledge, Hurrell says, can eventually lead to forecasts spanning several years of potential weather impacts…”
It is difficult to trust any of the high expectations as long as none of the climate models have demonstrated that they are able to explain the most significant climatic changes during the last century, namely
___the Arctic warming (1919 to 1939), http://www.arctic-heats-up.com/
___and the reasons for the global cooling (1940 to the 1970s); http://climate-ocean.com/
for which sufficient real data are available.
The new model’s limited capabilities will not help scientists shed light on some of the critical mysteries of global warming, including:
– How to erase years of skeptical evidence so they can use the line “in light of new information from improved models…” to stage an exit from the hoax
– How to save their careers as the AGW hoax collapses
– And how to move the public onto the new Bio-Crisis hoax and get them to pay Bio-Debt to Hugo Chavez
It certainly will not be able to produce any useful information about climate, as most of the information that will be used as input has already been corrupted.
Studying a model is fine but studying the real thing is better.
If the real thing diverges from the model then further study of the model output is somewhat counterproductive.
Still, it’s a useful addition to the armoury provided it is not used to generate what may well turn out to be fantasies.
Odds are this is the same old model with a glitzy new interface.
It looks great, but all the same old errors and omissions are still unaddressed.
How do we know that the model produces skilful results? This:
“To VERIFY A MODEL’S ACCURACY, scientists typically simulate past conditions and then compare the model results to actual observations” [my emphasis]
So that’s OK then.
Oh, hang on. Presumably all models have their accuracy “verified” by hind-casting. But all models give different results. But they’ve all been verified as being accurate. But none of them give the same results….
Oh no, my head just exploded.
“With the Community Earth System Model, we can pursue scientific questions that we could not address previously,” says NCAR scientist James Hurrell, chair of the scientific steering committee that developed the model. “Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”
Pursuing a question doesn’t necessarily mean you find the answer!
Nylo has said it all!
Why oh why oh why do they keep using words like simulate & sophisticated to describe these computer models? These twits with PhDs coming out of every which way & no brains between them! My 1925 Pocket OED has wonderful definitions of these. A better term, endowing more certainty, would be “replicate”, meaning to copy or duplicate; however, “simulate” means to feign, pretend, in the guise of, counterfeit, unreal thing, shadowy likeness of; and “sophisticated” means corrupted or adulterated, spoiling the simplicity or purity or naturalness of. Words that aptly describe these computer models to a tee! One wonders why they stick to using these hackneyed terms.
The problems are clear & simple to me: they know what result they expect to get, they know what result they want to get, they think they know it all, &, surprise, surprise, they get the answer they wanted to get! They then do run upon run, getting presumably the same answer over & over again, thereby apparently verifying their results, so they believe it’s real.