NCAR's new 2010 climate model for the next IPCC report

New computer model advances climate change research

From an NCAR/UCAR press release

BOULDER—Scientists can now study climate change in far more detail with powerful new computer software released by the National Center for Atmospheric Research (NCAR).


Modeling climate’s complexity. This image, taken from a larger simulation of 20th century climate, depicts several aspects of Earth’s climate system. Sea surface temperatures and sea ice concentrations are shown by the two color scales. The figure also captures sea level pressure and low-level winds, including warmer air moving north on the eastern side of low-pressure regions and colder air moving south on the western side of the lows. Such simulations, produced by the NCAR-based Community Climate System Model, can also depict additional features of the climate system, such as precipitation. Companion software, recently released as the Community Earth System Model, will enable scientists to study the climate system in even greater complexity.

The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC). The CESM is the latest in a series of NCAR-based global models developed over the last 30 years. The models are jointly supported by the Department of Energy (DOE) and the National Science Foundation, which is NCAR’s sponsor.

Scientists and engineers at NCAR, DOE laboratories, and several universities developed the CESM.

The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:

  • What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?
  • How will patterns in the ocean and atmosphere affect regional climate in coming decades?
  • How will climate change influence the severity and frequency of tropical cyclones, including hurricanes?
  • What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?

The CESM is one of about a dozen climate models worldwide that can be used to simulate the many components of Earth’s climate system, including the oceans, atmosphere, sea ice, and land cover. The CESM and its predecessors are unique among these models in that they were developed by a broad community of scientists. The model is freely available to researchers worldwide.

“With the Community Earth System Model, we can pursue scientific questions that we could not address previously,” says NCAR scientist James Hurrell, chair of the scientific steering committee that developed the model. “Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”

Scientists rely on computer models to better understand Earth’s climate system because they cannot conduct large-scale experiments on the atmosphere itself. Climate models, like weather models, rely on a three-dimensional mesh that reaches high into the atmosphere and into the oceans. At regularly spaced intervals, or grid points, the models use laws of physics to compute atmospheric and environmental variables, simulating the exchanges among gases, particles, and energy across the atmosphere.
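To make the idea of computing variables at grid points concrete, here is a minimal sketch (illustrative only, not CESM code) of a toy one-dimensional energy-balance model that steps temperature forward on a latitude grid; every parameter value is an assumed, round-number choice.

```python
# Illustrative only, not CESM code: a toy one-dimensional energy-balance model
# on a latitude grid, showing the basic idea of stepping climate variables
# forward at regularly spaced grid points using simple physical balances.
# All parameter values below are assumed, round-number choices.
import numpy as np

N_LAT = 36                                  # number of latitude bands
lat = np.linspace(-87.5, 87.5, N_LAT)       # band centres (degrees)
T = np.full(N_LAT, 10.0)                    # initial temperature (deg C)

Q = 341.3                                   # global-mean insolation (W/m^2)
albedo = 0.30                               # planetary albedo
A, B = 203.3, 2.09                          # outgoing longwave ~ A + B*T (W/m^2)
D = 0.55                                    # crude heat-transport coefficient
C = 4.0e7                                   # effective heat capacity (J/m^2/K)
dt = 86400.0                                # one-day time step (s)

# Annual-mean insolation distribution: more sun at the equator than the poles.
s = np.sin(np.radians(lat))
insolation = Q * (1.0 - 0.48 * 0.5 * (3.0 * s**2 - 1.0))

def step(T):
    """Advance the toy model by one time step at every grid point."""
    absorbed = insolation * (1.0 - albedo)
    emitted = A + B * T
    # diffusion-like exchange of heat between neighbouring latitude bands
    left = np.concatenate(([T[0]], T[:-1]))
    right = np.concatenate((T[1:], [T[-1]]))
    transport = D * (left - 2.0 * T + right)
    return T + dt * (absorbed - emitted + transport) / C

for _ in range(365 * 20):                   # integrate twenty model years
    T = step(T)

print(f"equator: {T[N_LAT // 2]:.1f} C   pole: {T[-1]:.1f} C   mean: {T.mean():.1f} C")
```

A full model such as CESM does the same kind of stepping in three dimensions, for many more variables, with far more elaborate physics and with exchanges between separate atmosphere, ocean, sea-ice, and land components.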

Because climate models cover far longer periods than weather models, they cannot include as much detail. Thus, climate projections appear on regional to global scales rather than local scales. This approach enables researchers to simulate global climate over years, decades, or millennia. To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.
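The hindcast check described here can be illustrated with a minimal sketch. The series below are hypothetical placeholder numbers, not real observations or model output, and the metrics (bias, root-mean-square error, correlation) are simply common choices for such comparisons.

```python
# Illustrative only: one way a hindcast might be checked against observations.
# The anomaly values below are hypothetical placeholders, not real data.
import numpy as np

def hindcast_skill(simulated, observed):
    """Return simple skill metrics for a simulated vs. observed time series."""
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    bias = np.mean(simulated - observed)
    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    corr = np.corrcoef(simulated, observed)[0, 1]
    return {"bias": bias, "rmse": rmse, "correlation": corr}

# Hypothetical annual-mean temperature anomalies (deg C) for a ten-year period.
observed_anomaly = [0.26, 0.32, 0.14, 0.31, 0.16, 0.12, 0.18, 0.32, 0.39, 0.27]
modelled_anomaly = [0.22, 0.28, 0.20, 0.27, 0.19, 0.15, 0.21, 0.30, 0.35, 0.30]

print(hindcast_skill(modelled_anomaly, observed_anomaly))
```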

A broader view of our climate system

The CESM builds on the Community Climate System Model, which NCAR scientists and collaborators have regularly updated since first developing it more than a decade ago. The new model enables scientists to gain a broader picture of Earth’s climate system by incorporating more influences. Using the CESM, researchers can now simulate the interaction of marine ecosystems with greenhouse gases; the climatic influence of ozone, dust, and other atmospheric chemicals; the cycling of carbon through the atmosphere, oceans, and land surfaces; and the influence of greenhouse gases on the upper atmosphere.

In addition, an entirely new representation of atmospheric processes in the CESM will allow researchers to pursue a much wider variety of applications, including studies of air quality and biogeochemical feedback mechanisms.

Scientists have begun using both the CESM and the Community Climate System Model for an ambitious set of climate experiments to be featured in the next IPCC assessment reports, scheduled for release during 2013–14. Most of the simulations in support of that assessment are scheduled to be completed and publicly released beginning in late 2010, so that the broader research community can complete its analyses in time for inclusion in the assessment. The new IPCC report will include information on regional climate change in coming decades.

Using the CESM, Hurrell and other scientists hope to learn more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation, which affect sea surface temperatures as well as atmospheric conditions. Such knowledge, Hurrell says, can eventually lead to forecasts spanning several years of potential weather impacts, such as a particular region facing a high probability of drought, or another region likely facing several years of cold and wet conditions.

“Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible, human-influenced climate change,” Hurrell says. “CESM will be a major tool to address such questions.”

tallbloke
August 19, 2010 12:13 am

“What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?”
Good question.
I bet their glitzy new software isn’t going to give a definitive answer though.

Nylo
August 19, 2010 12:16 am

The only climate they will study better is the virtual climate existing in the virtual world represented in the model:
* What impact will artificially warming temperatures have on the massive ice sheets programmed in the model?
* How do patterns in the virtual ocean and atmosphere make the regional climate evolve in the model?
* How will climate change influence the severity and frequency of tropical cyclones, including hurricanes, in the model?
* What are the effects of simulated tiny airborne particles, known as aerosols, on simulated clouds and temperatures?

August 19, 2010 12:23 am

So is this the next version of the Sims? And when can I get a copy at Walmart?

Ken Hall
August 19, 2010 12:25 am

The big problem all these models face which is the proverbial ‘elephant in the room’ is the possibility that current global temperature measurement records are wrong.
If the way they measure global temperatures is inaccurate (the recent admission that the satellite data is corrupted through degraded instruments, the way UHI is homogenised, which pushes the record towards a false positive, and the fact that 60% of surface stations in colder locations have been closed in the last 30 years), then all that going back and checking the models against real-world historical data will do is keep reintroducing the problems they have had all along.
They cannot match today’s real world temperatures in the models without adding an anthropogenic forcing. Well, what if today’s real world temperatures in the record are out by half a degree centigrade on the warming side due to the inaccuracies outlined in brackets above?
Then they are adding a false warming trend into the models to get them to match an inaccurate real-world record.
After all, there must be something wrong with the models if they have to add something extra to get them to match the historical record, but then all the projections forward are giving a falsely high rate of warming which has not been matched by the real world.

Gnomish
August 19, 2010 12:32 am

I love the ‘we can’t handle the details but our generalities are great!’ theme.
Carried to the logical extreme, the less they actually know about anything, the more certainly they can predict everything, then.
No wonder the mission seems so messianic – the fountainhead of truth is revelation.
They even have a J.C. figure doing the slumming thing, trying to heal the peasants.
I’m going to get out the O.E.D. and double check the definition of ‘sanity’. I’m just curious to see how thoroughly data has been adjusted.

Ulf
August 19, 2010 12:33 am

So how about modeling cloud cover?
Benchmarking? How accurately can the model reproduce recent climate, for which there is good data?
Presumably such benchmarks exist – does anyone have a reference?

jv
August 19, 2010 12:36 am

>”I bet their glitzy new software isn’t going to give a definitive answer though.”
No. But I bet it does give hockey sticks.

Richard
August 19, 2010 12:38 am

NYLO: The only climate they will study better is the virtual climate existing in the virtual world represented in the model:
Maybe they could incorporate “Farmville” into this virtual world and see if the virtual crops grow better with more virtual CO2.
Maybe they could work out how many inefficient windfarms we need to build to cope with the cooling areas.

Brad
August 19, 2010 12:40 am

So the conclusion is already drawn, temps are going up in their model no matter what the data says…just look at the unscientific questions they are asking.

LabMunkey
August 19, 2010 12:56 am

Potentially daft question chaps (and chapettes), when they release this model for use (if they have not already done so) do they specifically state all the modelling parameters? I.e. just what it includes in the model (oceans, currents, landmass, sun, GCRs, clouds, etc.).
I’ve always struggled to find out what it is the models ACTUALLY model, but I could just be being a chimp…
Cheers,
LM

August 19, 2010 1:02 am

We’ll have to see if this model can get the sensitivity to match the ERBE satellite value, and whether it insists on that hot spot above the tropics.
So where can I download a copy?

August 19, 2010 1:04 am

What part of global warming has stopped do these people not understand? Why do they keep wasting hard working people’s tax dollars on technological crystal balls designed to line the pockets of sociopathic fools?

Scarlet Pumpernickel
August 19, 2010 1:06 am

Now with the climate, the clouds pretty much determine the temperature, so if you can make a computer that can predict the future and determine every single cloud on earth for the next 100 years (you can’t even get it to do the next minute), you have a model that works. Also throw in a few volcanic eruptions, the undersea eruptions, the position of plankton in the oceans, how much cosmic radiation is hitting the earth (which seeds the clouds), what the sun is doing (since it determines the amount of cosmic rays), and the whole water cycle, including all the underwater rivers we don’t know about. Pretty much everything on earth, and around it, determines the temperature.
When they have a computer that does that, can I borrow it? I want it to predict the stock market and get rich.

Athelstan
August 19, 2010 1:11 am

Someone tell them, that you can’t model a dynamic and chaotic system – there are some things you can’t digitise, climate is one of them.
Quote:
““Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability”
I need to rub my eyes……………natural variability, yes they did say it ……I cannot believe the hubris in such a pompous statement.
I used to build sandcastles as a child and think maybe just maybe I could beat the sea, in the back of my mind knowing that it was a ‘bit silly’.
Last week I sat in the car taking in an overview (of the same small beach and cliffs), musing on how difficult it would be simply to simulate the weather or model the local climate in this small locale. Well nigh impossible was the conclusion. And what about the fifty-mile column of atmosphere rising up to space? You cannot take one bit in isolation – impossible.
We can point to trends, I rate Joe Bastardi, he is a great meteorologist IMHO but model the climate? – No I don’t think so.
Even if you have trillions of floating point operations per second, earth can outdo that without ‘batting an eyelid’ – and that is just local weather events, not future predictions or continental weather. Future climate generalisation is a moot exercise; when will these twerps get it into their heads?
Yes, a great academic adventure, but man should not take climate modellers’ ‘predictions’ as gospel. At the end of the day all of it is only abstract numbers, man-made algorithms, best guesses.
Oh yeah – and another thing: we don’t even understand the (atmospheric) processes yet, so all the above (the article is an exercise in hope over scientific process) is a shot in the dark. They would be better employed trying a medium to ‘get in touch’ with Nostradamus, or divining the tea leaves.
Whenever I hear the phrase; “models predict” my eyes glaze over.
Garbage in, Garbage out.

Les Francis
August 19, 2010 1:18 am

Doesn’t matter how good your computer and software are, G.I.G.O. still applies

ParmaJohn
August 19, 2010 1:19 am

So, does your brand new 2011 model do clouds? If not, I’ll just put new tires on my old model and keep it on the road for a few more years.

Patrick Davis
August 19, 2010 1:20 am

Interesting however, isn’t this just a case of even more garbage in and even more garbage out?

Peter Miller
August 19, 2010 1:23 am

Undoubtedly this model has been designed with a built in bias to ‘prove’ that the statements and publications of Mann/Hansen/Jones/Briffa/Gore etc are ‘correct’.
Undoubtedly this model will be so complex that few, if any, will be able to unravel exactly why its predictions for the real world are wrong.
Undoubtedly this model will be designed to ignore the effects of natural climate cycles and exaggerate the impact of human activity.
Undoubtedly the ‘findings’ of this model will be used by politicians to demonstrate why we all need to be taxed more.
Undoubtedly no sceptic will be allowed access to it, in order to ensure GIGO can become “the science is settled”.
“potentially irreversible, human-influenced climate change” – this says it all; the model has been designed with the sole intention of scaring the crap out of as many people as possible.

kadaka (KD Knoebel)
August 19, 2010 1:25 am

I wonder if this will be like with Zhang’s work, as in IDAO (Ice-Diminished Arctic Ocean). Since global warming is unequivocal, as well as diminishing global ice, there is no need for models that can show otherwise. Then the models made will not show otherwise, since there was no need for them to do so, no matter what real data is inputted into them.

August 19, 2010 1:31 am

## “Using the CESM, Hurrell and other scientists hope to learn more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation, which affect sea surface temperatures as well as atmospheric conditions. Such knowledge, Hurrell says, can eventually lead to forecasts spanning several years of potential weather impacts,….”
It is difficult to trust any of the high expectations as long as none of the climate models have demonstrated that they are able to explain the most significant climatic changes during the last century, namely
___the Arctic warming (1919 to 1939), http://www.arctic-heats-up.com/
___and the reasons for the global cooling (1940–1970s); http://climate-ocean.com/
for which sufficient real data are available.

Konrad
August 19, 2010 1:32 am

The new model’s limited capabilities will not help scientists shed light on some of the critical mysteries of global warming, including:
– How to erase years of skeptical evidence so they can use
the line “in light of new information from improved
models…” to stage an exit from the hoax
– How to save their careers as the AGW hoax collapses
– And how to move the public onto the new Bio-Crisis hoax
and get them to pay Bio-Debt to Hugo Chavez
It certainly will not be able to produce any useful information about climate, as most of the information that will be used as input has already been corrupted.

Stephen Wilde
August 19, 2010 1:38 am

Studying a model is fine but studying the real thing is better.
If the real thing diverges from the model then further study of the model output is somewhat counterproductive.
Still, it’s a useful addition to the armoury provided it is not used to generate what may well turn out to be fantasies.

My2Cents
August 19, 2010 1:41 am

Odds are this is the same old model with a glitzy new interface.
It looks great, but all the same old errors and omissions are still unaddressed.

AngusPangus
August 19, 2010 1:50 am

How do we know that the model produces skilful results? This:
“To VERIFY A MODEL’S ACCURACY, scientists typically simulate past conditions and then compare the model results to actual observations” [my emphasis]
So that’s OK then.
Oh, hang on. Presumably all models have their accuracy “verified” by hind-casting. But all models give different results. But they’ve all been verified as being accurate. But none of them give the same results….
Oh no, my head just exploded.

Alan the Brit
August 19, 2010 1:58 am

“With the Community Earth System Model, we can pursue scientific questions that we could not address previously,” says NCAR scientist James Hurrell, chair of the scientific steering committee that developed the model. “Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”
Pursuing a question doesn’t necessarily mean you find the answer!
Nylo has said it all!
Why oh why oh why do they keep using words like simulate & sophisticated to describe these computer models? These twits with PhDs coming out of every which way & no brains between them! My 1925 Pocket OED has wonderful definitions of these. A better term, endowing more certainty, would be “Replicate”, meaning to copy or duplicate; however, “Simulate” means to feign, pretend, in the guise of, counterfeit, unreal thing, shadowy likeness of; “Sophisticated” means corrupted or adulterated, to spoil the simplicity or purity or naturalness of. Words that aptly describe these computer models to a tee! One wonders why they stick to using these hackneyed terms. The problems are clear & simple to me: they know what result they expect to get, they know what result they want to get, they think they know it all, & surprise, surprise, they get the answer they wanted to get! They then do run upon run, getting presumably the same answer over & over again, thereby apparently verifying their results, so they believe it’s real.

Cold Englishman
August 19, 2010 2:01 am

Does this mean that all previous models were wrong?

rbateman
August 19, 2010 2:03 am

The CESM builds on the Community Climate System Model, which NCAR scientists and collaborators have regularly updated since first developing it more than a decade ago.
Is this based on the same model that blew the winter forecasts for entire continents?
If so, CESM’s purpose might be viewed as saving the CCSM, which was invented to save the Planet.
Should we look for a newer, steamier novel to go with the burger?

Phillip Bratby
August 19, 2010 2:20 am

It appears that the new model has built-in (assumed) warming. Presumably this new unvalidated model will be able to calculate garbage at greater cost and with even more precision.

geronimo
August 19, 2010 2:23 am

Another set of birds’ entrails to bamboozle the public with.

Vargs
August 19, 2010 2:24 am

To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.
Now, I wonder where they get those data?

morgo
August 19, 2010 2:43 am

They’re dreaming.

Alex the skeptic
August 19, 2010 2:45 am

“The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:………….”
Did I read correctly? So is the science settled or not? Or was it settled and now has been unsettled by Climategate and those criminal skeptics and deniers who are killing babies and grannies while pumping that poisonous gas into the atmosphere, killing all life in the process?
How many billions$ did this model cost?

Graeme W
August 19, 2010 2:46 am

Because climate models cover far longer periods than weather models, they cannot include as much detail. Thus, climate projections appear on regional to global scales rather than local scales. This approach enables researchers to simulate global climate over years, decades, or millennia. To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.

And if it can simulate global climate over years, it should be able to make predictions for the next few years so we can evaluate its accuracy. Validating a model against the past is easy, because you can always ‘tweak’ things to ensure it matches observations. The real test is how well it does in predicting the future.
Since they’re reporting scales less than decades, it should be able to make predictions in the scale of years that can then be tested against observations.

AdderW
August 19, 2010 2:55 am

Have they got new data to feed these powerful machines with, or is it still the same GIGO?

August 19, 2010 2:59 am

GI-GO. Models only work based on what parameters are programmed (i.e. there are assumptions made). If any of those assumptions are wrong (if, instead of CO2 drives climate, it turns out climate drives CO2, for example), the model is worthless. If data is left out (as another commenter stated, what about cloud cover), the model is skewed and probably worthless.
I’m not saying we shouldn’t be modeling climate and attempting to understand it, but I am saying we should recognize the limitations of the model.

Geoffrey Hoyle
August 19, 2010 3:09 am

I applaud the construction of such a complex piece of software but it cannot take the place of the nuts and bolts of scientific investigation whether theoretical or practical. Trying to predict the future is like trying to find the crock of gold at the end of a rainbow.
CFD is a set of mathematical theories waiting to be proven. The use of simulation spans a broad field of endeavour, from video gaming and Formula One car design to VFX in the film industry. However, I do wonder how many of us would set foot inside an aeroplane if CFD modelling were the only tool being used to create reliability.
I trust flying because I know engineers dedicated to solving the aviation industry’s problems post-simulation. They are a small breed who still use blue tack, talcum powder and small-scale models to solve the problems they investigate.
It would be nice to see some of these vast sums of money spent on simulation design channelled to better use?

sleeper
August 19, 2010 3:15 am

I don’t have a problem with people attempting to “predict” the future climate. I have a problem with their lack of humility.

Ken Hall
August 19, 2010 3:16 am

“Interesting however, isn’t this just a case of even more garbage in and even more garbage out?”
No. Not at all. It is new, more sophisticated garbage in; same pre-programmed garbage out.

Ken Hall
August 19, 2010 3:20 am

AngusPangus, Not to mention that the baseline data against which they test the model data is the same inaccurate temperature data that is regularly flagged on this site as being inadequate and error-filled and showing a significant confirmation bias in the figures all on the warming side.

brokenhockeystick
August 19, 2010 3:26 am

But if the science is already settled beyond argument why do they need a new model and how could the existing ones possibly be improved? Surely this is a tacit admission that the current models might not be right and the science is not settled?

Michael Schaefer
August 19, 2010 3:29 am

The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:
* What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?
* How will patterns in the ocean and atmosphere affect regional climate in coming decades?
* How will climate change influence the severity and frequency of tropical cyclones, including hurricanes?
* What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?
——————————————————————————-
What about putting on Long Johns, boots and parkas, taking shovel, bucket, anemometer, barometer and thermometer and go out to study the effects of weather LIVE in the environment?
This whole computer-modelling-stuff more and more appears to me like some kind of – pardon – scientific cybersex: Nice to look at, feels good, but no connection to real life whatsoever.

August 19, 2010 3:31 am

Les Francis says:
August 19, 2010 at 1:18 am
Doesn’t matter how good your computer and software are, G.I.G.O. still applies
Don’t you know that the warmaholics have redefined GIGO? For their models it now means Garbage In, Gospel Out.

Editor
August 19, 2010 3:34 am

At least we will now see “proof” of AGW so much faster………

Alexander K
August 19, 2010 3:44 am

This ‘news’ is all a bit sad, really, and reminds me of small and nerdy boys arguing the merits of their own model racing cars and aeroplanes. Is there any chance that, one day, these model-reliant fantasists may grow up and do real science?

August 19, 2010 3:48 am

Geoffrey Hoyle says:
August 19, 2010 at 3:09 am
I applaud the construction of such a complex piece of software but it cannot take the place of the nuts and bolts of scientific investigation whether theoretical or practical. Trying to predict the future is like trying to find the crock of gold at the end of a rainbow.
CFD is a set of mathematical theories waiting to be proven. The use of simulation spans a broad field of endeavour, from video gaming and Formula One car design to VFX in the film industry. However, I do wonder how many of us would set foot inside an aeroplane if CFD modelling were the only tool being used to create reliability.
I trust flying because I know engineers dedicated to solving the aviation industry’s problems post-simulation. They are a small breed who still use blue tack, talcum powder and small-scale models to solve the problems they investigate.
It would be nice to see some of these vast sums of money spent on simulation design channelled to better use?

I doubt whether Boeing would agree with you since they rely on Tony Jameson’s CFD code for their wing design.
http://aero-comlab.stanford.edu/jameson/boeing.html
“It is in recognition of those many contributions that The Boeing Company, on the occasion of his 60th birthday, presents to Antony Jameson a model displaying the many Boeing airplanes whose aerodynamics design was carried out with the aid of CFD technology and codes developed by him. The list of those airplanes begins with the Boeing 767, designed in the late 1970s. That was followed by the 757, the 747-400, the new Boeing 777, and the recently announced Boeing 737-700 which embodies a new wing and other advanced features. Each of those airplanes is displayed on the model.
Within the spirit of modern airplane design practice, the model also contains room for growth. There is one model position reserved for a future Boeing 787 airplane, and another for a 797. Those airplanes are presently only a gleam in our eye, but when they are designed and built, they undoubtedly also will contain the imprint of Jameson’s computational methodology.”

Roger Carr
August 19, 2010 3:52 am

Geoffrey Hoyle says: (August 19, 2010 at 3:09 am) Trying to predict the future is like trying to find the crock of gold at the end of a rainbow.
But it is a crock they find, Geoffrey; so perhaps it will work to expectations…

August 19, 2010 3:53 am

The CESM is one of about a dozen climate models worldwide that can be used to simulate the many components of Earth’s climate system, including the oceans, atmosphere, sea ice, and land cover.
They need to be reminded that simulation is not replication. I have a picture of a roll cloud which formed where — theoretically — no roll cloud could possibly have formed.

David Holliday
August 19, 2010 3:59 am

They could have a computer 10 or 100 times faster than what they have today and it wouldn’t help their predictive accuracy. To mangle the words of the great playwright, “It is not in the computer to hold our destiny but in our understanding of climate.”

Ammonite
August 19, 2010 4:02 am

UK Sceptic says: August 19, 2010 at 1:04 am
What part of global warming has stopped do these people not understand?
The part where global temperature is at record or near record levels across the last twelve months in all the major temperature measuring systems…

Joe Lalonde
August 19, 2010 4:04 am

Quote:“Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”
There is no care what-so-ever about using correct physics or data. The current physics in motion is incorrect in a major way, as there were two paths to this area.
One being perpetual and the other a slow energy release. The science currently being used follows the perpetual path of moving forever with NEVER a change. The slow energy release is the actual correct path, due to constant energy change as bodies slow down and chemical compositions change.
Have they included that we travel at 1669.8 km/hr at the equator?
That this speed is faster as you move to the poles due to size changes on the rotational axis with the shape of this planet?
Planetary mechanics has NEVER been included as scientists flunked that lesson.

NS
August 19, 2010 4:08 am

It would be normal to take the predictions of past models from 10 or 20 years ago and measure them against actuality, then explain why the results were as they were and backcast with the new model, including the original input parameters from x years ago.
Basic sanity check.

Alex the skeptic
August 19, 2010 4:12 am

Cold Englishman says:
August 19, 2010 at 2:01 am
Does this mean that all previous models were wrong?
____________________________________________________________
Now that is the question. And of course it does not need an answer. And how much $ did they cost us? Billions?
And are we still trying to save the planet because these failed models told us that we were all gonna die?
The only model I like is Claudia Schiffer.

Mark
August 19, 2010 4:17 am

To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.
“And then, when the models are shown to overestimate actually observed temperatures, we get our friends at GISS to make the past conditions colder so the modelled warming DOES fit the actual observation. Simple.”

Pascvaks
August 19, 2010 4:23 am

The author of this PR was very likely in the used car business last week. Caveat Emptor!
No doubt it will prove many things the IPCC is interested in. Bon Voyage!
One day a SuperDuperComputer will gobble up all the data there is and we won’t need TV Weather forecasters to tell us what the likelihood of rain is this afternoon at Ayers Rock.

Mike Edwards
August 19, 2010 4:24 am

Mike McMillan says:
August 19, 2010 at 1:02 am

So where can I download a copy?

The source code is available from the NCAR site here:
http://www.cesm.ucar.edu/models/cesm1.0/
So, they are doing this in open source mode – a big round of applause for that.
If anyone wants to see the details of how the model is built – it’s all there. You just need to understand both Fortran and C (to which – yuk!! at least from me – and I’ve programmed using both of these beauties in my time).
Instead of simply criticising, it would be good if folk from WUWT with relevant skills could take a look at the model and make an assessment of its approach – both the good points and the bad.

Joe Lalonde
August 19, 2010 4:30 am

Science has yet to figure out that the equator of the sun gives off more heat than the upper and lower hemispheres of the sun due to centrifugal force.
The position of our planet to the sun is very important to the amount of heat the planet absorbs from what the sun gives off of.
Are these observations included in the models?

DEEBEE
August 19, 2010 4:35 am

“Decision makers in diverse arenas need to know the extent to which the climate events they see are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible, human-influenced climate change,” Hurrell says. “CESM will be a major tool to address such questions.”
============================
What is it in human influence on the climate that makes it irreversible? Is it because we are fallen angels?
[REPLY: because ending all CO2 emissions requires we go back to a pre-industrial agrarian feudal existence, that just happens to only be sustainable if 2/3 of the population of the planet disappeared suddenly. So, who decides which 2/3? Same people telling you it’s all your fault, so guess who they pick? – mike]

Robert of Ottawa
August 19, 2010 4:46 am

Computer games are not experiments.

August 19, 2010 4:54 am

From the release:
. . . The CESM and its predecessors are unique among these models in that they were developed by a broad community of scientists. The model is freely available to researchers worldwide.

Does that mean it is open-source? Is all the code “freely available”? Are all the built-in parameters clearly stated? Are the assumptions used to create the parameters fully variable, so their values can be freely changed by scientists who are not programmers? Are the conditions under which new parameters can be added clearly specified? Is the model made available with all the datasets it draws upon, and are all sources of and modifications to that data “freely available” and transparent?
If not, why not? This model has been created with taxpayer money, so any researcher who wants to play with it should have complete and unfettered access.
Perhaps some of the experts who frequent this blog could look into these questions. Why should only climate alarmists be privy to the internal construction of such simulations?
/Mr Lynn

DJ Meredith
August 19, 2010 5:03 am

“The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming…”
In other words, the output is predetermined.

Mark
August 19, 2010 5:03 am

“So is this the next version of the Sims? And when can I get a copy at Walmart?”
Nope. Try NHL 2011 (Lots of hockey sticks!)

1DandyTroll
August 19, 2010 5:04 am

Weather patterns are too complex to extend to decades and centuries, they say, so what do they do but omit all the details that make the weather-pattern apparatus work, call it a day, pat their backs, cross their thumbs and hope the mention of global warming will pay for the updates.
What do people generally call a product that has been abstracted to the point of lunacy? Homeopathy. So I welcome the Climate Homeopathy Soft Apparatus CESM to the daylight.
And did they name it community to attract communists or just lefties in general? O_o

August 19, 2010 5:08 am

When I posted my comment above, I had not seen this:

Mike Edwards says:
August 19, 2010 at 4:24 am
The source code is available from the NCAR site here:
http://www.cesm.ucar.edu/models/cesm1.0/
So, they are doing this in open source mode – a big round of applause for that. . .
Instead of simply criticising, it would be good if folk from WUWT with relevant skills could take a look at the model and make an assessment of its approach – both the good points and the bad.

Ditto. Perhaps a team of knowledgeable folk could work together to provide an independent assessment of the strengths and weaknesses of the model—and then let the world know. It’s time the media had input from more than one side.
/Mr Lynn

Cadae
August 19, 2010 5:14 am

I see they use neural networks in CESM’s CAM module. See the excuses for using this technology at http://www.cesm.ucar.edu/events/ws.2010/Presentations/AMWG/tribbia.pdf
To refute any claims that these models are based on physics you only need to point to this use of neural networks.
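For readers unfamiliar with the technique mentioned above, the sketch below shows the general idea of training a small neural network to emulate an expensive physics parameterization (inputs in, tendency out). It is an illustrative toy in plain NumPy, not the CAM code; the inputs, layer sizes, training target, and learning rate are all hypothetical.

```python
# Illustrative only: the general idea of a small neural-network emulator for
# an expensive parameterization (inputs -> tendency). Not the CAM code.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 3 column inputs (e.g. temperature, humidity,
# pressure proxies) mapped to 1 output tendency by a stand-in "expensive" scheme.
X = rng.uniform(-1, 1, size=(500, 3))
y = np.sin(X[:, :1]) + 0.5 * X[:, 1:2] * X[:, 2:3]

# One hidden layer with tanh activation, trained by plain gradient descent.
W1 = rng.normal(0, 0.5, (3, 16))
b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1))
b2 = np.zeros(1)
lr = 0.05

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)              # forward pass: hidden layer
    pred = h @ W2 + b2                    # forward pass: output
    err = pred - y
    # backpropagation of the mean-squared-error gradient
    dW2 = h.T @ err / len(X)
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ dh / len(X)
    db1 = dh.mean(axis=0)
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print("Emulator mean squared error on training data:", round(mse, 4))
```

The appeal of such emulators is speed: once trained, evaluating the network is far cheaper than rerunning the original scheme, at the cost of the physics being represented only implicitly in the fitted weights.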

Tom in Florida
August 19, 2010 5:15 am

Every result needs to be prefaced by a statement such as:
“Based on climate conditions X,Y,Z and the assumption that these conditions will continue, the models show……….”
Anything less will be hogwash.

Village Idiot
August 19, 2010 5:20 am

Computer models? “Bah!..Humbug!” I say! And “Bah!” again. That’s the problem with the world outside our Village – computers! What use have we of these machines? Modelling weather, modelling proteins, businesses, engineering…and now climate!!
These ‘scientists’ with phoney titles pretending to know something about the climate, trying to indoctrinate us with their lies!! We know we can learn of matters climatical here, safe in our Village. Why the Oracle Goddard is living proof that no so called ‘qualifications’ are needed to possess all geophysical knowledge, and beat these people at their own game.
Keep faith, the Great Cooling is fast approaching…

brokenhockeystick
August 19, 2010 5:24 am

Ammonite says:
“August 19, 2010 at 4:02 am
UK Sceptic says: August 19, 2010 at 1:04 am
What part of global warming has stopped do these people not understand?
The part where global temperature is at record or near record levels across the last twelve months in all the major temperature measuring systems…”
There you go disagreeing with Phil Jones who says there’s been no warming in the past decade.
These record levels would be from the mangled/homogenised/completely discredited temperature records used by CRU, GISS etc., which have been inexplicably adjusted upwards, rather than downwards, to account for UHI effects, as pointed out in countless posts on this site.

starzmom
August 19, 2010 5:27 am

Now I understand why it is necessary to continually adjust and readjust historical data.

August 19, 2010 5:27 am

The points made that all the modeling in the world is worthless if the basic data is corrupted, due to the GIGO factor, are spot on.
It is heartening, at least, to see a refreshing appearance of openness with this latest attempt. Despite trepidation about the difficulties of attempting to model a dynamic, chaotic system, I think it’s a bit unseemly to automatically hip-shoot with aspersions as to the designers’ and developers’ intentions – even though it’s likely that most of those rocks are being tossed by folks who won’t be doing what is done best around here, which is to roll up the sleeves and dive into the eye-watering and brain-pounding lines of code to validate its accuracy or utility, insofar as that is relevant, even though a number of folks have expressed the opinion that such is a Quixotic quest.
Full disclosure – I’ll admit that upon seeing this at physorg.com yesterday, my initial reaction was to cast the write-up in the same voice and timing as Dan Aykroyd selling the Bass-o-matic. Still, it does appear, prima facie, to be a good-faith attempt at further understanding. Instead of chucking rocks at the folks attempting it, it would probably be best to become as cognizant as possible about both the strengths (if there are any) and the pitfalls of this effort, which, according to the press release, is on track to be a significant factor – or “forcing”, if I may borrow the phrase – in the continued deliberations and work of the IPCC. Like it or not, the IPCC isn’t going away any time soon and will be relied upon, rightly or wrongly, by policymakers and politicians worldwide. Simply dismissing it will not prepare anyone to engage rationally in the discussions that are sure to continue on this topic, given the import it has to society, at least from a political aspect, even if you believe that the planet is just gonna do what the planet is just gonna do, regardless of silly human shenanigans.

simpleseekeraftertruth
August 19, 2010 5:28 am

They are going to need two computers as this one is only built for the Northern hemisphere!
“The figure also captures ……. including warmer air moving north on the eastern side of low-pressure regions and colder air moving south on the western side of the lows.

Al Gore's Holy Hologram
August 19, 2010 5:30 am

GIGO

Enneagram
August 19, 2010 5:37 am

Is that an expression of progress in science or, rather, of progressive science?

ShrNfr
August 19, 2010 5:43 am

As all of us who used computers used to say Garbage In, Garbage Out. It was that way when I got my PhD in EE from MIT in the 70s and has not, nor will it ever, change.

latitude
August 19, 2010 5:47 am

“Because climate models cover far longer periods than weather models, they cannot include as much detail.”
Now if this isn’t the total opposite, I don’t know what is.
You would need even more “detail” in a climate model.
The further out you take it, the more it’s going to amplify any mistakes or anything you leave out.

Shevva
August 19, 2010 5:48 am

Does the model include natural phenomena such as volcanoes exploding? What about the much-hyped clouds? Or is this just a simple model where you input temperature into a CO2 model and it tells you the earth will heat up and up till we all cook?
I can understand modelling a building before it is built, as we understand the physics behind it, but modelling a planet’s future climate without understanding the climate you are trying to model? Sorry, but computers are nice, useful tools; they cannot do the work of modelling a climate – it’s the human input that does this, and if you don’t fully understand something, how can you model it?
I’d do a simple test and model 2005-2015 and then see how it gets on; my guess is the finger-in-the-air method may be as accurate.

Dave F
August 19, 2010 5:50 am

Biogeochemistry? Put the horse before the cart…

August 19, 2010 5:54 am

Another expensive screensaver.

Jim
August 19, 2010 5:56 am

The implication that a computer climate model run is a “climate experiment” is just wrong and also misleading. An experiment would involve manipulation of the Earth’s climate in some manner and comparing the outcome to a control Earth. A computer model run isn’t that.

Frank K.
August 19, 2010 6:00 am

I applaud the NCAR effort. Please note the difference between this:
http://www.cesm.ucar.edu/models/cesm1.0/
and this…
http://www.giss.nasa.gov/tools/modelE/
I would like to know why we, the taxpayers, are funding (at least) two identical research efforts! We should consolidate ALL climate GCM research at NCAR – that would save a lot of money and resources.

Ed Fix
August 19, 2010 6:05 am

I have a bit of experience in computer modeling, and here’s a basic conundrum that many modelers don’t recognize.
The unspoken assumption behind the kind of detailed modeling this article is describing is that if you can just get all the sub-processes right, and get all their multifaceted interactions right, and get all the initial conditions and inputs right, then the model will mimic the emergent behavior of the real system.
In a computer model, all data, all processes, and all interactions are necessarily approximations. To get a reasonable answer, some approximations must be more accurate than others, but there’s no way to know in advance which those are. If you get something wrong, then the model will fail in an unexpected, sometimes spectacular way.
The more detailed the model, the greater the chance that you have something wrong, and that your model will fail as a result.
The die-hard modeler’s reaction to his model’s failure is often to make his model even more detailed and complex. He feels he’s made progress when he fixes any one particular problem, but there’s an unexpected failure in another area, so he works on fixing that, and the cycle continues.
In the end, you have a minutely detailed, hugely expensive, insanely compute-intensive behemoth that actually performs no better than the simple model you started out with. There’s just this one more little detail you need to fix, if only you can get the funding. You’ve learned a great deal about computer modeling, but have really added very little to your understanding of the process you were trying to model in the first place. And you’ve spent an unconscionable amount of money and man-hours doing it.

Henry chance
August 19, 2010 6:08 am

I used fortran and wrote programs for quant analysis and forecasting several decades ago. I read the line of instruction that Jones referred to in e-mails that obviously “evolved” temp readings from the earlier part of the last century downward.

The CESM1 Timing Table provides performance data that will continue to evolve due to changes in the model, machine hardware and input from the user community.

http://www.cesm.ucar.edu/models/cesm1.0/
So historical data continues to “evolve” so the models can be helped to forecast what the people need to believe.
This method is not new. It was applied in reading tea leaves.

Martin Brumby
August 19, 2010 6:10 am

Since the alarmists have already decided that 2010 is the hottest year ever in the total history of the universe (correct to 17 decimal places), not sure what this new fancy model will add.
Meanwhile, back in the real world, the Climate Just Keeps On Doin’ Whatta Climate’s Gotta Do……..

August 19, 2010 6:12 am

The last paragraph in the news release gives me some hope that they intend to do some objective research rather than trying to find proofs that CAGW is an irreversible problem. It sounds like this will be an improvement on the Reanalysis model (which does not include CO2 as a factor). http://kidswincom.net/CO2OLR.pdf.

jack morrow
August 19, 2010 6:12 am

I wonder if this “New Fangled Computer” (NFO) includes anything about the things mentioned in the previous post by Paul Vaughan, or anything about planetary influence or any other outside factor? It seems to me to be just the same stuff, made to look better and get more attention.

jack morrow
August 19, 2010 6:13 am

(NFC)

Martin Lewitt
August 19, 2010 6:15 am

“Most of the simulations in support of that assessment are scheduled to be completed and publicly released beginning in late 2010, so that the broader research community can complete its analyses in time for inclusion in the assessment. ”
If the IPCC adheres to past practice, it will report the model projections without qualification as if they are relevant anyway, even though the diagnostic literature shows their errors are far larger than the energy imbalance of interest. Watch for publication of model results before the diagnostic work has been done. These models should be assumed to have the correlated diagnostic issues that the past models have had until they prove that they don’t, e.g., the correlated surface albedo bias published by Roesch, the reproduction of less than half the increase in precipitation published by Wentz, the inability to reproduce the amplitude of the observed response to the solar cycle published by Camp and Tung and separately by Lean, the tropical radiative imbalances published by Lindzen and separately by Spencer, etc. It will be interesting to use the average daily tropical thunderstorm feedback published by Eschenbach as an additional diagnostic.

Patrick Davis
August 19, 2010 6:19 am

“Mike Edwards says:
August 19, 2010 at 4:24 am
The source code is available from the NCAR site here:
http://www.cesm.ucar.edu/models/cesm1.0/
Yeah, any programmer knows about source code, blah blah blah, to build the model; the difference here is what are the data inputs? Is it raw, real, observed or “adjusted”, homogenised? Any “fudge factors” built in, “tricks”, “assumptions” and/or “inversions” used? I bet it’ll prove to be worse than ANYONE thought or any past GCM “simulated”.

Solomon Green
August 19, 2010 6:22 am

How many variables does this model contain? Lorenz found that 12 variables, each with as little as 0.001% error in the initial input, were hopeless for forecasting weather (chaos theory). OK, climate is not weather. But the same applies: the greater the number of variables, the less accurate the model as a forecasting tool, no matter how well it has been calibrated to past events.
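The sensitivity to initial conditions alluded to here is easy to demonstrate with the classic Lorenz-63 system. The sketch below (illustrative only, using crude explicit Euler integration, not a climate model) runs two copies of the system whose starting points differ by one part in a million and prints how quickly they separate.

```python
# Illustrative only: the Lorenz-63 system, showing how two runs that differ by
# a tiny amount in the initial state diverge over time. Explicit Euler steps
# are used here purely for a qualitative demonstration.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit Euler step of the Lorenz-63 equations."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])   # perturb one variable by one part in a million

for i in range(1, 3001):             # integrate 30 time units
    a = lorenz_step(a)
    b = lorenz_step(b)
    if i % 1000 == 0:
        print(f"t = {i * 0.01:5.1f}  separation = {np.linalg.norm(a - b):.4f}")
```

The separation grows roughly exponentially until the two trajectories are effectively unrelated, which is the usual argument for why initial-condition errors limit weather forecasts; whether and how this limits multi-decade climate statistics is exactly the point being argued in this thread.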

August 19, 2010 6:23 am

There is nothing in the press release about the model including a sensitivity of CO2 to temperature, as well as a sensitivity of temperature to CO2. Unless this fundamental but common fallacy has been corrected, GIGO. Of course, a model with significant values for both sensitivities cannot match observations over any substantial interval, but at least this would expose the absurdity of the CO2-induced-warming hypothesis.

stephen richards
August 19, 2010 6:35 am

Phil. says:
August 19, 2010 at 3:48 am
Another of your pathetic strawmen. Engage brain before opening mouth.
Aerodynamics and climate for the next 1000 yrs. Oh yes, I can see the similarities. Like hell.

stephen richards
August 19, 2010 6:36 am

Fred H. Haynie says:
August 19, 2010 at 6:12 am
The last paragraph in the news release gives me some hope that they intend to do some objective research rather than trying to find proofs that CAGW is an irreversible problem. It sounds like this will be an improvement on the Reanalysis model (which does not include CO2 as a factor). http://kidswincom.net/CO2OLR.pdf.
Fred I sure hope you are right. But nothing they have done in the past would allow me to arrive at the same conclusion as you.

Chuck L
August 19, 2010 6:38 am

Oh, goody, another computer model:
“The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC). ”
“The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:
■What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?
■How will patterns in the ocean and atmosphere affect regional climate in coming decades?
■How will climate change influence the severity and frequency of tropical cyclones, including hurricanes?
■What are the effects of tiny airborne particles, known as aerosols, on clouds and temperatures?
It seems to me, after reading the press release, that the result of the climate simulations has already been predetermined, and that result will be (drumroll….)
“IT’S WORSE THAN WE THOUGHT!”

Alan the Brit
August 19, 2010 6:39 am

Joe Lalonde says:
August 19, 2010 at 4:04 am
Quote:“Thanks to its improved physics and expanded biogeochemistry, it gives us a better representation of the real world.”
I forgot that word too!
OED says Representation: description or portrayal or imagination, place likeness of before the mind or senses, serve or be meant as likeness, symbolise, act as embodiment of, stand for, be a specimen for, fill place of, work of art (now we’re getting close) portraying something, be a substitute for………………………!
It doesn’t actually define it as the actual or real thing. So, are these all little clues to what is actually being done in these models? Are they merely a bunch of sophisticated, simulated, representative mathematical equations? In other words, they’re guessing? I recall that as a young engineering student, during examinations one was not permitted to take a “programmable” calculator into the exam hall, only basic-function ones, because some smart arse could use one or two parameters given in the questions to produce the right number without actually knowing or understanding any engineering principles! They’ll develop a model to represent sex next, and then we’re really done for!

Frank K.
August 19, 2010 6:40 am

Phil. says:
August 19, 2010 at 3:48 am
Phil. – meet the Boeing wind tunnel…
http://www.boeing.com/news/releases/2003/q2/nr_030616i.html
“Fewer tests are needed today because modern computational fluid dynamics (CFD) tools allow designers to consider and run virtual experiments on designs with a higher degree of confidence than ever before. Only those designs that show real promise will make it to the wind tunnel.”
“Our computer modeling tools allow us to predict much more closely what will transpire in the wind tunnel, which then accurately verifies the flight performance of the airplane,” Bair said. “This process gives us the opportunity to refine our designs with computers long before we ever get to the wind tunnel. We are looking for the design that will allow the airplane to fly most efficiently.”

I would also note that the aircraft aerodynamics problem is about an order of magnitude simpler than the physics that are purportedly being modeled in a climate code. Of course, with codes like GISS Model E, you really don’t know what the heck they’re modeling…

red432
August 19, 2010 6:42 am

Take dozens of chunks of software using formulas based on guess work and hook all their inputs and outputs together. Then bootstrap the process with thousands of guessed input parameters and let the thing chug away for months. The result? Impressive looking nonsense. Illusionists of old had nothing on computer models. I do like to watch the animations, though — they’re cool.
It’s not clear to me that if you greatly simplify any component of a climate model, anyone really knows whether the results of the computation have any meaning or not. As I mentioned before, protein folding is many orders of magnitude simpler than any component of a climate model, and no one knows how to correctly emulate protein folding using any amount of computational power.

Michael Schaefer
August 19, 2010 6:42 am

DJ Meredith says:
August 19, 2010 at 5:03 am
“The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming…”
In other words, the output is predetermined.
——————————————————–
“SIGH!” I agree…

ian middleton
August 19, 2010 7:02 am

This is the bit that worries me:
USING the CESM, Hurrell and other scientists HOPE TO LEARN more about ocean-atmosphere patterns such as the North Atlantic Oscillation and the Pacific Decadal Oscillation.
So now the computer is teaching the scientists. I don’t think so, buddy. A computer is nothing more than a sophisticated slide rule. Call them out on this: if it is so good at predicting or defining future climatic conditions, what’s the weather going to be like on 25th December 2010 in Sydney, Australia, to the nearest 5 degrees? Any scientist who puts their name to this in the next IPCC report is (1) not a scientist, (2) a complete prat.
Sorry for the rant but had I said what I really meant I would be banned from the blog.

Warren in Minnesota
August 19, 2010 7:03 am

The press release starts with the word SCIENTISTS and continues with SCIENTISTS do this and SCIENTISTS do that. I think a shortened version of the press release could be stated as this:
SCIENTISTS continue to play with their computer simulations to find human-influenced global warming for the IPCC assessment.
But I don’t think that scientists do that.

August 19, 2010 7:05 am

The opening of the NCAR press release states: “Modeling climate’s complexity. This image, taken from a larger simulation of 20th century climate, depicts several aspects of Earth’s climate system.”
What does it mean that the image shows Greenland without its ice cover? Is Greenland’s ice cover not one of the aspects of Earth’s climate system?
I wonder what their depiction of the Antarctic shows. Have they done away with the antarctic icecap, too?

Enneagram
August 19, 2010 7:10 am

This says it all: The Community Earth System Model (CESM) will be one of the primary climate models used for the next assessment by the Intergovernmental Panel on Climate Change (IPCC)
Assessment:
Definition: assignment of fee, amount .
Hands up!, don’t move!

Enneagram
August 19, 2010 7:13 am

Or should we read?:
The Communist Earth System Model (CESM) will be one of the primary climate models used for the next assessment of Global Governance.

Gail Combs
August 19, 2010 7:22 am

“The new model’s advanced capabilities will help scientists shed light on some of the critical mysteries of global warming, including:
* What impact will warming temperatures have on the massive ice sheets in Greenland and Antarctica?…”

If you start with a bias – “What impact will warming temperatures have” – then the model is absolute garbage because of the scientists’ built-in prejudices.
There is a darn good chance we are looking at a cooling trend. And a cooling trend is going to be a lot more damaging than a warming trend. The CAGW-type scientists KNOW there is a possibility of a COOLING trend.
Item number four over at Joe Romm’s Climate Progress states:
“4. Absent human emissions, we’d probably be in a slow long-term cooling trend due primarily by changes in the Earth’s orbit — see Human-caused Arctic warming overtakes 2,000 years of natural cooling, “seminal” study finds”
He is talking about the Milankovitch cycles and the fact the sun’s apparent energy has declined by 9% since the beginning of the Holocene.
This paper puts it a little more bluntly: Lesson from the past: present insolation minimum holds potential for glacial inception (2007)
“Because the intensities of the 397 ka BP and present insolation minima are very similar, we conclude that under natural boundary conditions the present insolation minimum holds the potential to terminate the Holocene interglacial. Our findings support the Ruddiman hypothesis [Ruddiman, W., 2003. The Anthropogenic Greenhouse Era began thousands of years ago. Climate Change 61, 261–293], which proposes that early anthropogenic greenhouse gas emission prevented the inception of a glacial that would otherwise already have started….”
So why is that possibility not mentioned as a problem to be investigated? The slide into an Ice Age has got to be a lot more “catastrophic” than a minor amount of warming, and it sure would be nice to have an idea of when that is going to happen, as it inevitably will.

Buffoon
August 19, 2010 7:30 am

“Scientists can now study climate change” using “The Community Earth System Model”
How
Can
Millions
Of
People
Not
See
The
Problem
With
That

Charles Higley
August 19, 2010 7:43 am

Problems (a short list):
Resolution still not adequate.
Is the water cycle heat engine included?
Are the solar influences included? The 80- and 200-year cycles (a minimum approaches)?
ENSO, El Niño, La Niña?
The Earth’s tilt as it varies?
And, it appears that even the Jet Stream should be in there.

Editor
August 19, 2010 7:54 am

ShrNfr says:
August 19, 2010 at 5:43 am
> As all of us who used computers used to say Garbage In, Garbage Out. It was that way when I got my PhD in EE from MIT in the 70s and has not, nor will it ever, change.
Sure it has, now it’s Garbage In, Gospel Out. (Actually, that was a CMU observation from the 70s.)
——-
More seriously, I think this model has gotten too little attention at WUWT. It’s been readily available as source for years but has been pretty much ignored here.
Mike Edwards notes “Instead of simply criticising, it would be good if folk from WUWT with relevant skills could take a look at the model and make an assessment of its approach – both the good points and the bad.”
I agree with the sentiment, but given the tenor of comments here I think any rational analysis would be drowned out by knee-jerk criticism. There are over 100 comments here – about a dozen have something that adds to understanding (this comment is not one of them). The rest make it hard to find the useful ones.
If I had time to dig into this, I’d be much more inclined to do it as a project at http://chiefio.wordpress.com/ and report back here with guest posts.
Apparently the model can run on personal computers; in the recent reports (I don’t see the link off hand), modeling the American Southwest was done with that code on a smallish computer.

Foley Hund
August 19, 2010 8:01 am

Modeling is fun. We have models of just about everything. In the end, they are what they are: models; a pretend representation, an approximation of realism. We can walk away from flight sims and battle sims…but climate modellers want us to live the fear they model.

Editor
August 19, 2010 8:01 am

Typo in last paragraph, but found the link I wanted, http://wattsupwiththat.com/2010/07/09/modeling-the-big-toasty/#comment-426718

jorgekafkazar
August 19, 2010 8:12 am

[SNIP: Citing biblical verse is not an argument… – mike]

Richard M
August 19, 2010 8:22 am

starzmom says:
August 19, 2010 at 5:27 am
Now I understand why it is necessary to continually adjust and readjust historical data.

Yes, the main reason behind Mann’s hockey stick has always been to provide a basis for claiming that CO2 is the cause of warming.
I wonder what historic record these models are now using? I’d bet they still use the hockey stick which makes them completely invalid.
As for improvements … how does taking something from .00000000001% accuracy up to .0000000001% accuracy make any difference whatsoever? It’s still no more valid than a wild guess.
How can there ever be any value in using an incomplete model to make predictions? It’s just plain silly.

Jimbo
August 19, 2010 9:04 am

Brad says:
August 19, 2010 at 12:40 am
So the conclusion is already drawn, temps are going up in their model no matter what the data says…just look at the unscientific questions they are asking.

——————
“Modellers have an inbuilt bias towards forced climate change because the causes and effect are clear.” Gavin Schmidt, Michael Mann et. al. Source [pdf] 2004
——————
So, badly placed thermometers, guessing temps for unmonitored regions, inbuilt bias of modellers, unknown unknowns / new discoveries, contradictory evidence, politicization of science, lucrative funding, etc., and they expect their model to predict what the climate will be like at the end of the century. We have had enough already of their failed predictions and IPCC forecasts.
Household garbage in, utter rubbish out.

Alan F
August 19, 2010 9:09 am

Does this mean Canadian smoothing is going to get decent billing in the credits for The Sims: Climate Edition? After playing such an important part in the last game’s outcome, we won’t settle for being just above the grips.

Tommy
August 19, 2010 9:12 am

I see the source code is available here:
http://www.cesm.ucar.edu/models/cesm1.0/cesm/cesmBbrowser/
It’s written in f90 (fortran 90).

Pete of Perth
August 19, 2010 9:28 am

The latest model reminds me of upgrading from XP to Vista – same crap different wallpaper

Alan F
August 19, 2010 9:35 am

Foley Hund,
“Modeling is fun. We have models of just about everything. In the end, they are what they are: models; a pretend representation, an approximation of realism. We can walk away from flight sims and battle sims…but climate modellers want us to live the fear they model.”
The ELOC on this thing won’t be any better than every other program out there. The more complex it is, the worse things are, and if there’s seriously wonky biasing, it’ll be buried deep in some obscure check with an imaginative injection vector. Fun, fun, fun finding that by hand!

jorgekafkazar
August 19, 2010 9:43 am

For more information on the program structure, see also:
http://www.ccsm.ucar.edu/models/

Zeke the Sneak
August 19, 2010 9:52 am

The importance of this technological advancement cannot be overstated.
The day has apparently arrived when highly trained and specialized individuals can look into a silicon-based device and forecast the past, look into the present, and forecast the future.
These insights gained by utilizing this cutting-edge silicon processing technology may be used to inform and advise policy makers on important decisions in diverse areas regarding taxation, wealth distribution, land use, power consumption, etc.
I suppose the difference between this and the ancient art of scrying into a crystal ball is that these are not Druids, they are climate scientists. And critically, the implement used is not composed of flawless SiO2, but it is based on Si. These differences cannot be emphasized enough.

Bernd Felsche
August 19, 2010 9:55 am

I see no working group on data collection and quality assessment?
How can they possibly initialise a chaotic model without a clear definition of system state?
GIGO == Garbage In; Grants Out.

Robert
August 19, 2010 10:10 am

It seems to me like this will be a model that just projects continents as warm, even if there is a lot of cold within a continent, because the model can only resolve regions, and it’s going to be used by the IPCC, who will try to cherry-pick the data to say it’s warmer than it is.
It’s a similar situation to what happened this summer in Russia: Moscow and St. Petersburg were really warm, though a lot of the country was also extremely cold. But no one focuses on the cold, just the warmth, and then proclaims the whole area is really warm, when in fact it isn’t.

Frank K.
August 19, 2010 10:12 am

Tommy says:
August 19, 2010 at 9:12 am
I see the source code is available here:
http://www.cesm.ucar.edu/models/cesm1.0/cesm/cesmBbrowser/
It’s written in f90 (fortran 90).

A quick look at the source code shows me that it’s MUCH better written than GISS Model E…and they have real documentation.

Tom Scharf
August 19, 2010 10:52 am

I totally support the effort here. I totally disagree with the merit many “scientists” give its results, which are simply speculative returns based on inaccurate input (what exactly were the aerosol concentrations across the earth in 1947?).
Nice press release; it sounds like it is perfect. I’m interested in:
* What do the modelers feel are the greatest challenges of current modeling? How do they plan to overcome these?
* What do the modelers (not their boss) say about the reliability of the current model, from a high level perspective?
* Does it properly model the last decade’s “pause” in warming? What was changed in the model to allow this to happen?
* Does it predict warming at the same rate as previously? What is the estimated CO2 forcing? Why should we have to wait for an IPCC report for this result?
* If it can predict droughts, when, where, and how long are the next several going to happen?
* What do the modelers believe would be the earliest indications that their model is not performing as well as they wish? What unique characteristic do they look for as an early indicator of performance?
* Does it predict a tropospheric hot spot as previous models did (which has been shown not to be accurate)? What was changed to allow this?
* Does this model show it is worse than we thought?! (Just kidding)
Note: I don’t believe that they haven’t done runs on this model and don’t know how it performs in general. Call me a cynic, but I say it would not have been released if it hadn’t been shown to give “correct” results.
I agree with a previous poster that taking a model that is not known to work very well and adding additional layers of complexity is not likely to succeed.
As much as the modelers want to hide it, these models are highly tuned so that they can backcast previous climate with some degree of accuracy, which is fine. It is this “hand” tuning and the inaccuracy of the climate input / starting conditions that make the future predictions speculative at best. Claims of “it’s just physics” stretch credibility.
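To make the “hand tuning” point concrete, here is a minimal sketch (illustrative only, not CESM code) of tuning a single free parameter in a toy zero-dimensional energy-balance model until its hindcast matches a synthetic “observed” series. Every name and number here (toy_hindcast, the forcing series, the noise level) is invented for the example.

```python
# Toy illustration (not CESM code): "tune" one free parameter of a
# zero-dimensional energy-balance model until its hindcast matches a
# synthetic "observed" temperature series. All names and values are invented.
import numpy as np

def toy_hindcast(feedback, forcing, heat_capacity=8.0, dt=1.0):
    """Integrate dT/dt = (F(t) - feedback * T) / C with a simple Euler step."""
    temps = np.zeros(len(forcing))
    for i in range(1, len(forcing)):
        temps[i] = temps[i - 1] + dt * (forcing[i - 1] - feedback * temps[i - 1]) / heat_capacity
    return temps

years = np.arange(1900, 2001)
forcing = 0.02 * (years - 1900)              # made-up, slowly rising forcing (W/m^2)
rng = np.random.default_rng(0)
observed = toy_hindcast(1.3, forcing) + rng.normal(0.0, 0.05, years.size)  # pseudo-observations

# The "tuning" step: brute-force search for the feedback value with the best hindcast fit.
candidates = np.linspace(0.5, 3.0, 251)
rmse = [np.sqrt(np.mean((toy_hindcast(lam, forcing) - observed) ** 2)) for lam in candidates]
best = candidates[int(np.argmin(rmse))]
print(f"best-fit feedback parameter: {best:.2f} W m^-2 K^-1")
```

A good fit found this way only shows that the parameter reproduces the past series it was tuned against; it does not, by itself, establish skill for projections, which is exactly the distinction drawn in the comment above.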

John Luft
August 19, 2010 10:56 am

Oh good. A quicker computer to do stupid things faster.

August 19, 2010 10:58 am

Can they run the CET record backwards?
http://www.climate4you.com/images/CentralEnglandTempSince1659%20880pixel.gif
Wake me up if the model matches it.

August 19, 2010 11:14 am

Ric Werme says:
August 19, 2010 at 7:54 am
______________________
Can you provide a link to a documented source? I base my criticism of carbon-control models on my inability to find any mention of the sensitivity of CO2 to temperature in the IPCC AR4 or in any paper that purports to have model output similar to the empirical record in phase, amplitude and spectrum.

August 19, 2010 11:33 am

Sorry, I should have read the other comments as well, as there are links aplenty. I’ll look for sensitivities.

BobA
August 19, 2010 11:36 am

Hmm, I wonder if they got the data for this model from NOAA-16?

August 19, 2010 11:41 am

The connection between spiffy new software, fancy new computers, and man’s total knowledge of how climate works remains elusive.
Are they studying computers now? As a former computer engineer, I understand GIGO.

M White
August 19, 2010 12:07 pm

“To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.”
Similar models were used to prove that nighttime and daytime were caused by the sun orbiting the earth.

Brian H
August 19, 2010 12:42 pm

The model(s) will exhibit chaotic instabilities due to unpredictable interactions of parameters; these sensitivities will match observational fluctuations only by sheer chance, and in narrow bands.
We need models to emulate and predict the behavior of the models.
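For readers who have not seen how quickly a chaotic system amplifies tiny differences, here is a standard illustration using the classic Lorenz (1963) equations, which are often used as a toy analogue for atmospheric nonlinearity. This is not climate-model code; the step size and perturbation are arbitrary choices for the demonstration.

```python
# Illustration only: the classic Lorenz (1963) system, a standard toy example
# of how a chaotic model amplifies tiny initial-condition differences.
# Nothing here comes from CESM; step size and perturbation are arbitrary.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations one step with a simple Euler update."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])      # perturb one component by a billionth

for step in range(3001):
    if step % 500 == 0:
        print(f"step {step:4d}: separation = {np.linalg.norm(a - b):.3e}")
    a, b = lorenz_step(a), lorenz_step(b)
```

Two trajectories that start a billionth apart end up on entirely different parts of the attractor, which is the sense in which initial-condition uncertainty limits deterministic prediction.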

899
August 19, 2010 1:40 pm

After a measured amount of cogitation on this matter, I’ve come to the conclusion that the people who propose or put forth these ‘models’ with pictorial results for display, are actually engaging in the worst sort of propaganda.
The very terrible thing about that is that many —dare I say most?— laymen haven’t a clue about what’s being placed under their noses.
Think about it: Those laymen see the pretty pictures with colors calculated to induce the mind to draw a conclusion: The one intended. Red = blazing Hades, while blue = freaking cold. And white = The Great White North, i.e., Canada.
And of course, it doesn’t help any when the MSM resorts to giving the impression that the models represent the REAL WORLD as opposed to what they really represent: make-believe.
Instead of playing with models, why don’t the so-called ‘researchers’ actually resort to using actual data which is currently available?
But that wouldn’t attract grant money, now would it?

ANH
August 19, 2010 1:44 pm

The article said it is freely available. Is that true? Is the source code freely available?

August 19, 2010 2:08 pm

Phil.: August 19, 2010 at 3:48 am
I doubt whether Boeing would agree with you since they rely on Tony Jameson’s CFD code for their wing design.
[…]
“The list of those airplanes begins with the Boeing 767, designed in the late 1970s. That was followed by the 757, the 747-400, the new Boeing 777, and the recently announced Boeing 737-700 which embodies a new wing and other advanced features. Each of those airplanes is displayed on the model.
[…]
There is one model position reserved for a future Boeing 787 airplane, and another for a 797. Those airplanes are presently only a gleam in our eye, but when they are designed and built, they undoubtedly also will contain the imprint of Jameson’s computational methodology.”

The B-777 went $2 billion over budget just in the R&D because of problems with the coding, and went a year behind schedule because the metal-benders couldn’t reconcile the fly-by-wire system (not just in the wing) with the computer-generated schematics.
BTW, the “recently-announced” B-737-700 first rolled off the production line in 1996, and the B-787 is three years overdue for delivery to the first customer — problems with the fuselage needing structural reinforcement during *actual flight testing*, y’know…

August 19, 2010 2:27 pm

Again, I can’t find any mention of the sensitivity of CO2 to temperature. Perhaps this and similar models might be reasonable for a sterile planet, but on earth the biosphere absorbs and releases CO2 at rates that depend on temperature.
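For what it is worth, the temperature dependence being described is usually written with a Q10-style relation for biological respiration. The sketch below is only a generic illustration of that form, with made-up reference values; it is not taken from CESM or any specific carbon-cycle scheme.

```python
# Generic illustration of the Q10 form often used for temperature-dependent
# respiration: rate(T) = rate_ref * Q10 ** ((T - T_ref) / 10).
# Reference values below are made up for the example, not taken from CESM.
def respiration_rate(temp_c, rate_ref=1.0, q10=2.0, t_ref=15.0):
    """Relative CO2 release rate at temperature temp_c (degrees Celsius)."""
    return rate_ref * q10 ** ((temp_c - t_ref) / 10.0)

for t in (5, 10, 15, 20, 25):
    print(f"{t:2d} C -> relative CO2 release rate {respiration_rate(t):.2f}")
```

With Q10 = 2, a 10 °C warming doubles the relative release rate, which is the kind of temperature sensitivity of the biosphere the comment is pointing at.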

kwik
August 19, 2010 2:33 pm

I like the new understanding of GIGO.
Garbage In, Grants Out.
Priceless!

August 19, 2010 2:55 pm

stephen richards says:
August 19, 2010 at 6:35 am
Phil. says:
August 19, 2010 at 3:48 am
Another of your pathetic strawmen. Engage brain before opening mouth.

We’d be a lot better off if you kept your mouth shut; is there even a brain there to connect to? Geof Hoyle raised the subject of aero-design and suggested that “It would be nice to see some of these vast sums of money spent on simulation design channelled to better use?” I pointed out that Boeing thought the money spent was very worthwhile indeed.

August 19, 2010 3:03 pm

ANH says:
August 19, 2010 at 1:44 pm
The article said it is freely available. Is that true? Is the source code freely available?

If you follow the link in the original post you’ll see where you can download the code: http://www.cesm.ucar.edu/models/cesm1.0/

Jim Barker
August 19, 2010 3:33 pm

I’m not sure it would help, but my Ouija board will be available for loan when I get it back from calibration ;-)
Seriously, they seem to be implying that this new system is capable of solving at least an n-body problem, and that only involves a few bodies and gravity. Are they really capable of modeling all of the molecules on Earth, in all of their interactions?

August 19, 2010 3:44 pm

So these models take into consideration air temperatures, ocean temperatures, sea ice, land coverage, and greenhouse gases.
Do these models take into consideration the tenfold increase in global cooling due to clouds over the trivial change in global warming from GHGs?
Do these models take into consideration planetary mechanics, and the titanic forces in climate change from varying sunspot cycles, coronal mass ejections, etc.?
Do these models take into consideration the impact of changing atmospheric chemical reactions created by the cyclical bombardment of X-rays, cosmic rays, and UV rays?
If not, the latest models are woefully inadequate to predict anything. They are as predictive as looking at an oil pressure gauge to see what direction your car is moving.

AndrewG
August 19, 2010 3:50 pm

I keep looking for the sentence “Accurately predicted the temperature patterns in 2010 when fed the data from 1990-2000” but it just isn’t there.

SSam
August 19, 2010 3:55 pm

I know they don’t mean it the way it sounds… I think.
…CESM source code is distributed through a public Subversion code repository…

EthicallyCivil
August 19, 2010 4:42 pm

I wonder if they included the empirical feedback models from Lindzen and Choi in their “improved physics”. Of course, moving to the stable half of phase space wouldn’t be as exciting as the old result.

Enneagram
August 19, 2010 4:49 pm

Just compare the image above, full of red, with the following:
http://weather.unisys.com/surface/sst_anom.html
Do they count on fools? But, surprise, the majority of healthy and normal people are not fools, so beware.

August 19, 2010 6:00 pm

Unfortunately there is nothing static about the Earth’s climate system. It is not a closed system. The basic parameters are changing in a way that we do not understand and can not explain: See http://climatechange1.wordpress.com/

Tom Harley
August 19, 2010 6:12 pm

I would much sooner believe that much cheaper supercomputer between the ears of farmers, fishermen, geologists, etc., whose outdoor lives depend on knowledge of the climate for a living.

H.R.
August 19, 2010 6:53 pm

erlhapp says:
August 19, 2010 at 6:00 pm
“Unfortunately there is nothing static about the Earth’s climate system. It is not a closed system. The basic parameters are changing in a way that we do not understand and can not explain: See http://climatechange1.wordpress.com/
Good reminder. And another reminder (geological perspective): Earth’s climate never repeats exactly, and has changed from states that will probably never be seen again. And undoubtedly, Earth’s climate will change to states that have never been seen before. Model that!
(h/t to anna v, who posted an observation of one of the Greek philosophers, “You can never cross the same river twice.”)

August 19, 2010 6:54 pm

This is really cool. It is fantastic how NCAR makes their codes easily and freely available.

Mark W
August 19, 2010 7:38 pm

GIGO in climate science = Garbage in Gospel out…

Ammonite
August 19, 2010 8:09 pm

brokenhockeystick says: August 19, 2010 at 5:24 am
These record levels would be from the mangled/homogenised/completely discredited temperature records used by CRU, GISS etc which have been inexplicably adjusted upwards, rather than downwards, to account for UHI effects, as pointed out in countless posts on this site.
There are significant signs that the planet is warming independent of the temperature data. Species are migrating away from the equator and to higher altitudes. The Greenland and Antarctic ice sheets are losing mass and so are glaciers from all the world’s major mountain chains.
If you haven’t already done so, please examine the GISS website for their analysis of UHI. It is very accessible and interesting and places the effect in its proper context with respect to the temperature record – real, but small.

pwl
August 19, 2010 8:47 pm

“Because climate models cover far longer periods than weather models, they cannot include as much detail. Thus, climate projections appear on regional to global scales rather than local scales. This approach enables researchers to simulate global climate over years, decades, or millennia. To verify a model’s accuracy, scientists typically simulate past conditions and then compare the model results to actual observations.”
That’s a very bold claim; in fact, it’s an extraordinary claim that needs extraordinarily hard evidence. Produce such evidence forthwith, NCAR; otherwise you’re just another doomsday soothsayer in the same league as Nostradamus.

pwl
August 19, 2010 10:23 pm

Mr Lynn quotes ” . . . The CESM and its predecessors are unique among these models in that they were developed by a broad community of scientists. The model is freely available to researchers worldwide.”
and then asks some good questions:
“Does that mean it is open-source?”
It seems to be. Mostly it looks like Fortran 90 programs. Shivers.
“Is all the code “freely available”?”
It seems to be under various open source licences.
“Are all the built-in parameters clearly stated?”
That I don’t know. It’s cryptic so it will require decoding.
“Are the assumptions used to create the parameters fully variable, so their values can be freely changed by scientists who are not programmers?”
Don’t know. Must study it.
“Are the conditions under which new parameters can be added clearly specified?”
Don’t know.
“Is the model made available with all the datasets it draws upon, and are all sources of and modifications to that data “freely available” and transparent?”
They have over one terabyte of data available and don’t want it all downloaded; they just want people to use what they need. How the heck do they get one terabyte of data? I’ve not yet looked at what it is, but yikes, that’s one heck of a lot of data for sure. It would take quite some time to download it all even with a high-speed connection.
I doubt very much that the “modifications” are documented. I’d be pleasantly surprised if they were.
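As a rough sense of scale for the download-time remark above, here is a back-of-envelope estimate; the sustained 10 MB/s rate is purely an assumed figure, not anything stated by NCAR.

```python
# Back-of-envelope estimate of the download time for the ~1 TB of data
# mentioned above. The 10 MB/s sustained rate is purely an assumed figure.
dataset_bytes = 1.0e12            # roughly one terabyte
rate_bytes_per_second = 10.0e6    # assumed sustained 10 MB/s connection
hours = dataset_bytes / rate_bytes_per_second / 3600.0
print(f"approximate download time: {hours:.0f} hours")   # about 28 hours
```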

August 20, 2010 3:13 am

Phil.: August 19, 2010 at 2:55 pm
I pointed out that Boeing thought the money spent was very worthwhile indeed.
At that time (1994), it was *very* worthwhile, because Boeing used a lot of NASA-derived technology — which the US taxpayers paid for. To be fair, Boeing *did* reimburse NASA for the wind-tunnel testing portion.
That’s not a knock at Anthony Jameson, btw — he’s one of aviation’s shining stars.

Editor
August 20, 2010 6:14 am

Frank K. says:
August 19, 2010 at 6:00 am
I applaud the NCAR effort. Please note the difference between this:
http://www.cesm.ucar.edu/models/cesm1.0/
and this…
http://www.giss.nasa.gov/tools/modelE/
I would like to know why we, the taxpayers, are funding (at least) two identical research efforts! We should consolidate ALL climate GCM research at NCAR – that would save a lot of money and resources.
—…—…—
I would politely disagree though:
I DEMAND that (at least) 2 different and independently-produced models compete with each other.
Preferably, each model would be privately (competitively and for-profit) produced so government cannot interfere with funding rewards (for their zealots) and funding restrictions (against their critics) – and today’s international governments desperately want today’s Mann-made CAGW theories to be validated so the UN, IPCC, GISS, NASA, NOAA, NSIDC, Penn State, democrat party, ecologists, and “community organizers” (etc. etc. etc.) can continue their agendas.

Richard M
August 20, 2010 6:56 am

I think the standard phrase of GIGO, no matter what you call it, is not the primary problem. Even if you had perfect data, an incomplete program is worthless.
I look at the GCMs as the equivalent of a bridge that is only 1% complete (or much less). Just how useful would that be to someone wanting to use the bridge? If you don’t sufficiently understand the system being modelled, it can’t be modelled … it’s that simple.

Martin Lewitt
August 20, 2010 7:50 am

The models are not worthless. They have already contributed qualitative insights into the climate that have been confirmed with follow-up observations. However, the recent warming is an energy imbalance of less than 1 W/m^2 globally and annually averaged, and not within the range of quantitative skill of the models. All the AR4 models have significant positive feedback to CO2 forcing, and there is no model-independent evidence that the feedback to CO2 forcing is positive rather than negative for the current climate regime. Models with correlated errors in several areas larger than 1 W/m^2, and that aren’t able to reproduce the amplitude of the observed water cycle and solar cycle responses or the cloud or surface albedo feedbacks, don’t give us confidence in their attribution or their projections. The direct effects of CO2 forcing would result in a climate sensitivity somewhere in the neighborhood of 1.1 degrees C and have it responsible for about a third of the recent warming … hardly a cause for concern or uneconomic measures. Research should focus on model-independent means of assessing whether the net feedbacks are negative or positive and then getting the models on the right track. Of course, we can still try to improve the physical realism in the various components of the models. Perhaps a threshold in matching the observations will be reached that will inspire quantitative confidence in their representation of the phenomena of interest.
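The ~1.1 °C figure cited here is consistent with the standard no-feedback, back-of-envelope estimate. The sketch below reproduces it using the widely quoted simplified CO2 forcing expression of Myhre et al. (1998) and a Planck-only response of roughly 0.3 K per W/m^2; both are textbook approximations supplied for illustration, not numbers taken from the comment or from CESM.

```python
# Back-of-envelope check of the ~1.1 C "direct effect" figure quoted above,
# using the standard simplified CO2 forcing expression (Myhre et al., 1998)
# and a Planck-only response of roughly 0.3 K per W/m^2. Both values are
# textbook approximations used for illustration; nothing here is from CESM.
import math

def co2_forcing(c_new_ppm, c_old_ppm):
    """Radiative forcing (W/m^2) from a CO2 change, simplified expression."""
    return 5.35 * math.log(c_new_ppm / c_old_ppm)

delta_f = co2_forcing(560.0, 280.0)   # a doubling of CO2
planck_only_response = 0.3            # K per (W/m^2), no feedbacks
print(f"forcing for doubled CO2: {delta_f:.2f} W/m^2")
print(f"no-feedback warming:     {delta_f * planck_only_response:.1f} C")
```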

Ric Locke
August 20, 2010 8:44 am

More handwaving.
(1) The northern coast of Alaska is a little over 70N, so counting the aliasing/pixelation rings in the Arctic gives us two-degree cells. 180 (longitude) x 90 (latitude) gives 16,200 cells.
A fundamental requirement of FEM (Finite Element Modeling, which this is a version of) is that the cells must be small enough that the contents of any given cell are, or can be assumed to be, homogeneous — interactions take place between cells. When I can walk across my place (linear distance roughly 150 meters) and find a two-degree difference in air temperature, I have [ahem] serious doubts that that condition obtains in their model.
An engineer who claimed to be adequately modeling the hinge on a flip-phone — material totally homogeneous, response and cell interactions well understood — with only 10K or so cells would be fired and replaced by someone who knew what he/she was doing and wasn’t too lazy to get up in the morning. Yes, yes, I know, that’s why we have to have all those expensive supercomputers. Bulls*t. Even the crudest FEM program nowadays can vary the size of the cells to meet the requirement of the analysis while reducing the computational load. Look again at the Arctic. There’s pixelation/aliasing between rings, but not within them. That says they’re sticking with two degrees all the way to the poles; that is, 88 – 90 degrees is represented by 180 triangles when it ought to be one cell (actually, that circle is smaller than one cell should be). Worse: if the cell-interaction algorithm is so crude as to require the same numbers all the way down, those aren’t triangles, they’re “rectangles” with one zero side, requiring a discontinuity in the algorithm to figure out how they fit together. Are they even weighting by cell size? It doesn’t show in the results!
Total area of the surface of the earth is about 5.1 x 10^8 sq. km. A two-degree cell at the equator is about 5 x 10^5 sq. km. The effective resolution of their model is 1K cells — but they’re spending the computational resources needed for over 16X as many PLUS having to account for pathological conditions at the poles! This says loud and clear that These Are People Who Do Not Know What The F* They Are Doing. A Chinese plastic-molding factory making PET water bottles would put them on the street in a heartbeat.
Do they have anybody on their overpaid and undertasked staff who has ever heard of NASTRAN, let alone any familiarity with it?
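To make the cell-size point concrete, here is a small sketch of how fixed two-degree latitude-longitude cells shrink toward the poles (area roughly proportional to the cosine of latitude), which is why any averaging over such a grid has to weight by cell area or move to variable cell sizes. The 180 x 90 layout matches the cell count given above; everything else is generic spherical geometry, not code from CESM.

```python
# Sketch of why cell size matters on a fixed 2-degree latitude-longitude grid
# (180 x 90 = 16,200 cells, as counted above). Cell area scales roughly with
# the cosine of latitude, so an unweighted average over-counts the polar rows.
# Generic spherical geometry only; this is not code from CESM.
import math

n_lon, n_lat = 180, 90
row_weights = [math.cos(math.radians(-89.0 + 2.0 * j)) for j in range(n_lat)]
weight_sum = sum(row_weights)

for label, j in (("row centred at 1 N ", 45), ("row centred at 89 N", 89)):
    print(f"{label}: relative cell area {row_weights[j]:.3f}, "
          f"fraction of global area {row_weights[j] / weight_sum:.4f}")
```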
(2) One of the subjects of discussion re: Dr. Curry the other day was use of EOFs. That discussion never touched on the fact that they’re using “functions” in the first place. Functions are handy for simpleminded interpolations, because if the function doesn’t fit the data you just pick one with more degrees of freedom. Voila! it fits. Unfortunately it says nothing about the behavior of the function between data points — you can get multiple cycles of variation, especially when (as here) the data are sparse and not well controlled or characterized. Applied to a classic FEM, (e.g., for stress in a plastic part) it means you get zones and ripples of stress patterns that do not appear in the real piece being modeled. That effect is vastly increased if the underlying data isn’t reliable, because the function picked to accommodate an erroneous data point by definition has to have more degrees of freedom, and thus is virtually certain to have unpredictable behavior between data points — even valid ones.
Thirty and forty years ago, before this effect was understood and there was enough computer power available to improve the process, it was one of the banes of my existence. It can be disconcerting, to say the least, to go to a grid cell containing a check point, and find that the model-predicted value is off by an order of magnitude or worse, and be totally unable to find out which data point — perhaps tens or hundreds of cells away — was erroneous, requiring the modeler to add more degrees of freedom to accommodate it. After all, it fits perfectly! And there’s a perfectly plausible way to make the situation worse while improving the fit — simply allow the algorithm to assume that the data points might be in error, and “adjust” them to conform. I would be willing to bet that at least some, if not all, of the adjustments to the datasets that so puzzle us come from that. In one case I was involved with long ago (early Seventies), the procedure ended up declaring that the riverine marshes north and east of Mobile, AL were at an elevation of roughly 100 meters AMSL, which is [ahem] not actually the case.
The solution, in my field and other applications of modeling, was linearization and block adjustment. Build the set of equations describing the interactions between cells, and reduce them to linear partial differentials; choose a step size for which the assumption of linearity is approximately valid. Introduce strawman correlation coefficients, solve the resulting system of linear equations (the “block”) using matrix methods, calculate the errors (“residuals”), then inverse the matrix and use the residuals (fractions thereof, the “weights”) to adjust the correlations. Iterate until the residuals are small enough (an aesthetic consideration to some degree), and you have a set of equations that may not be absolutely correct, but at least conforms to the data without introducing wildly-varying interpolations and/or extrapolations that put the unknown areas at the borders on Alpha Centauri B-II.
This procedure is built in to any credible modern FEM program. It isn’t always used, because the computational requirements are high and it isn’t necessary in well-controlled and well-understood cases, but it’s available. I see no hint, at NCAR or any of the other climate-modeling groups, that they even know it exists, let alone are making any attempt to use it. If anybody at WUWT, or in the “skeptic” community in general, has access to a modern FEM program and knows how to use it — I don’t fit either criterion — it might be worthwhile putting together a simpleminded model using only a few variables. It would be a lot of work, but I’d be willing to bet that a 1000-cell model using only half a dozen correlations would fit the data better than any of the multi-million-dollar boondoggles produced by the current set of eggheads.
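The linearize-solve-check-residuals loop described above can be sketched in a few lines. The toy below applies the same shape of procedure (a plain Gauss-Newton iteration, starting from strawman values) to fitting a tiny two-parameter model to synthetic data; it is meant only to show the structure of the iteration, not to stand in for any FEM package or climate code.

```python
# Minimal sketch of the "linearize, solve the block, check residuals, iterate"
# procedure described above, written as a plain Gauss-Newton fit of a tiny
# two-parameter model y = a * exp(b * x) to synthetic data. Purely illustrative;
# it is not FEM code and not anything taken from a climate model.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 4.0, 9)
y_obs = 2.0 * np.exp(0.5 * x) + rng.normal(0.0, 0.05, x.size)   # synthetic "observations"

params = np.array([1.5, 0.4])                 # strawman starting values
for iteration in range(25):
    a, b = params
    residuals = y_obs - a * np.exp(b * x)     # misfit at the current linearization point
    # Jacobian of the model with respect to (a, b): the linearized system.
    jacobian = np.column_stack((np.exp(b * x), a * x * np.exp(b * x)))
    # Solve the linear least-squares "block" for the parameter update.
    update, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
    params = params + update
    if np.linalg.norm(update) < 1e-9:         # stop once the adjustment is negligible
        break

print(f"fitted parameters: a = {params[0]:.3f}, b = {params[1]:.3f}, "
      f"RMS residual = {np.sqrt(np.mean(residuals ** 2)):.3f}")
```

The essential features are the ones the comment names: a linearized system solved at each step, and residuals used both to update the parameters and to decide when to stop.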
Regards,
Ric

Ric Locke
August 20, 2010 8:53 am

Addendum: “…a 1000-cell model…” in which the cell size on the ground is constant, meaning that it varies a lot in lat/long coordinates.
Regards,
Ric

August 20, 2010 9:58 am

Bill Tuttle says:
August 20, 2010 at 3:13 am
Phil.: August 19, 2010 at 2:55 pm
I pointed out that Boeing thought the money spent was very worthwhile indeed.
At that time (1994), it was *very* worthwhile, because Boeing used a lot of NASA-derived technology — which the US taxpayers paid for. To be fair, Boeing *did* reimburse NASA for the wind-tunnel testing portion.

But it wasn’t the CFD code/runs that cost that, though; CFD is cost-effective because wind-tunnel testing is so expensive. As I recall, the number of wind-tunnel facilities has gone down in recent years for that reason. Wind-tunnel testing with models isn’t perfect either, with scaling issues and wall effects, etc.
That’s not a knock at Anthony Jameson, btw — he’s one of aviation’s shining stars.
Yes Tony’s a good guy and the cost issues weren’t a result of his CFD.

Not My Name
August 20, 2010 1:35 pm

I can’t say a lot, but I’ve worked with NCAR for years. I wouldn’t trust the engineers and scientists there to run a hot dog stand.

August 20, 2010 2:26 pm

Phil.: August 20, 2010 at 9:58 am
CFD is cost effective because wind tunnel testing is so expensive. As I recall the number of wind tunnel facilities has gone down in recent years for that reason.
We’re getting into chicken/egg territory, here. Every aircraft manufacturer used to have an on-site wind tunnel (Grumman had one at Bethpage and one at Peconic), but when the mergers began in the ’80s, a lot of them were torn down. CAD took off, and more wind tunnels went unused, so it wasn’t cost effective to maintain them, except for the gummint sites, which were (and are) still useful.
Wind tunnel testing with models isn’t perfect either with scaling issues and wall effects etc.
The one at Langley is huge — almost zero wall effect, and it’s big enough for full-scale models of most military fighters. But since it *is* unique –yeah, booking time in it is expensive.

Arno Arrak
August 20, 2010 7:31 pm

Wonderful new toy for climate “scientists” paid for by Uncle Sam. But don’t expect it to be a fortune teller beyond what your daily weatherman’s five day forecast reveals of your future.

Jaye Bass
August 23, 2010 6:38 am

Ric Locke says:
August 20, 2010 at 8:44 am
More handwaving.

And there we have it. The great divide between academics playing at modeling with no negative feedback to improve their results and professional engineers who make things actually work, who can be fired or replaced with somebody better. Tenured academics live within a protected monopoly, while professionals live in the market. Big difference in the amount of adult supervision involved.