Be a closet climate modeler with Climate@home

First there was SETI@home, where you could donate spare CPU cycles to the search for extraterrestrial signals. Now we have Climate@Home, which will run NASA GISS modeling software as a distributed computing project, though there is no release date yet. Oxford has already created climateprediction.net running on BOINC, so I expect this is a case of "keeping up with the Joneses."

I wonder, though: what's the "carbon footprint" of leaving all those computers running to calculate the effects of the fossil fuels burned to power them? The Met Office supercomputer is already drawing fire for its massive electricity use, so does this simply spread the problem around?

From NASA’s CIO: Creating a Virtual Supercomputer to Model Climate

Climate@Home

NASA will be calling on people worldwide to help determine the accuracy of a computer model that scientists use to predict climate change. The initiative, called “Climate@Home,” is unprecedented in scope. Never before has NASA attempted to recruit so many people to help perform research vital to forecasting the Earth’s climate in the 21st century under a wide range of different situations.

NASA's Earth Science Division (ESD) and Office of the Chief Information Officer (OCIO) have strategically partnered to manage the Climate@Home initiative. This effort will include collaborations among the 10 NASA Centers and the 13 Federal agencies of the USGCRP (United States Global Change Research Program), along with several universities and private organizations.

Goddard Space Flight Center (GSFC)’s Robert Cahalan is serving as the project scientist and has assembled an international team of scientists to help set science goals and determine which parameters to run. GSFC’s senior advisor to the CIO, Myra Bambacus, serves as the project manager and will run this initiative.

Participants need no special training to get involved in Climate@Home. All they need is a desktop computer or laptop. Volunteers will be able to download the computer model to run on their computers as a background process whenever the computers are on, but not used to their full capacity.

The climate model that volunteers download is made up of mathematical equations that quantitatively describe how atmospheric temperature, air pressure, winds, water vapor, clouds, precipitation and other factors all respond to the Sun’s heating of the Earth’s surface and atmosphere. Models help predict how the Earth’s climate might respond to small changes in Earth’s ability to absorb sunlight or radiate energy into space.
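At its simplest, "equations describing the response to the Sun's heating" means an energy balance: absorbed sunlight equals outgoing thermal radiation. Here is a minimal zero-dimensional sketch in Python, purely for illustration; it is not the GISS model, and the albedo and solar-constant values are assumed round numbers.

```python
# Minimal zero-dimensional energy-balance sketch (illustrative only, not the GISS GCM).
# Equilibrium where absorbed sunlight balances outgoing thermal radiation:
#   S * (1 - albedo) / 4 = sigma * T**4
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2 (assumed value)
ALBEDO = 0.30      # planetary albedo (assumed value)

def equilibrium_temperature(solar_constant=S0, albedo=ALBEDO):
    """Effective radiating temperature of the planet, in kelvin."""
    absorbed = solar_constant * (1.0 - albedo) / 4.0
    return (absorbed / SIGMA) ** 0.25

print(round(equilibrium_temperature(), 1))  # about 255 K, with no greenhouse effect included
```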

Scientists traditionally have used supercomputers to test the sensitivity and accuracy of climate models. With these powerful machines, they are able to run millions of calculations, each computing a different scenario or combination of circumstances such as varying levels of chlorine or water vapor in the atmosphere.

Instead of using supercomputers in this effort, NASA is creating a virtual supercomputing network that spreads the data-processing chores across thousands of computers. Such a task-sharing initiative eliminates the need to buy additional supercomputers, which consume enormous amounts of energy, and reduces the “carbon footprint” of running these calculations.

Prior to the initiative’s official roll out in early 2011, the project will be announced in the news. The project website will provide instructions on how to download the models and supporting computer software. The goal is to have recruited tens of thousands of participants by the time the initiative begins. Each participant will run the same model but with certain parameters slightly adjusted. Scientists will examine the resulting sensitivity of the climate predictions to those adjustments, resulting in a better understanding of the parameters that should be studied in the future.
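The "same model, slightly adjusted parameters" approach is a perturbed-parameter ensemble. The toy sketch below shows the idea only; the model and parameter names are placeholders I have assumed, not NASA's Climate@Home software.

```python
# Perturbed-parameter ensemble sketch (illustrative; the toy model and
# parameter names are assumptions, not NASA's Climate@Home code).
import random

def toy_model(climate_sensitivity, aerosol_forcing):
    """Stand-in for a full climate model: returns a made-up warming number."""
    return climate_sensitivity * (3.7 - aerosol_forcing) / 3.7

baseline = {"climate_sensitivity": 3.0, "aerosol_forcing": 1.0}

# Each "volunteer" runs the same model with every parameter nudged a little.
ensemble = []
for _ in range(20):
    perturbed = {name: value * random.uniform(0.9, 1.1)
                 for name, value in baseline.items()}
    ensemble.append(toy_model(**perturbed))

mean = sum(ensemble) / len(ensemble)
spread = max(ensemble) - min(ensemble)
print(f"ensemble mean {mean:.2f}, spread {spread:.2f}")
```

The spread of the results across the ensemble is what tells the scientists which parameter adjustments the predictions are most sensitive to.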

Climate@Home will have the option of using the same high-level architecture developed for “SETI@Home,” a scientific experiment that uses Internet-connected computers in the Search for Extraterrestrial Intelligence program. The initiative also is modeled after a similar project coordinated by the Oxford e-Research Centre in the United Kingdom called Climateprediction.net.

Climate@Home will test the accuracy of a model developed by the Goddard Institute for Space Studies (GISS) in New York, and will serve as a trailblazer to explore the accuracy of other models as well.

==============

h/t to WUWT reader Bruce Foutch

81 Comments
August 27, 2010 3:29 am

Agile Aspect says:
August 26, 2010 at 7:41 pm
SETI and CAGW have a lot in common – they’re both junk science.
That is, they formulate their theories in terms of parameters which can't be physically measured (the theories can't be falsified in Popper's sense).
In fact, it can be argued that SETI sanitised junk science for public consumption.

The Drake equation is junk science in the sense that several of its parameters are unknown and, at the moment, can't be measured at all. Others can be measured, or we are in the process of measuring them, such as the exoplanets: in the last 15 years we have discovered about 500 of them, and the next 15 years will give us a lot more. Still, many treat the outcome of the Drake equation as a fact.
SETI may look like junk science, but the answer is still out there. With the Drake equation in hand, people claimed that the Milky Way must be full of intelligent life, yet we have yet to find any, so at the moment it does not look good for intelligent life out there.
Does that mean there is no intelligent life out there? We simply don't know. Even if we knew all the variables of the Drake equation, even the statistics on the probability of intelligent life arising on a planet, we would still have to find proof that aliens are out there: ET has to phone our home.
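For reference, the Drake equation the comment above refers to is a simple product of seven factors, several of which are still guesses. A quick sketch follows; the values plugged in are illustrative placeholders, not measurements.

```python
# Drake equation sketch: N = R* * fp * ne * fl * fi * fc * L
# All inputs below are illustrative placeholders, not measured values.
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Estimated number of detectable civilizations in the galaxy."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Optimistic vs. pessimistic guesses for the unknown factors:
print(drake(7, 0.5, 2, 0.5, 0.5, 0.1, 10_000))    # ~1750
print(drake(7, 0.5, 0.1, 0.05, 0.01, 0.01, 100))  # ~0.000175
```

The two answers are about seven orders of magnitude apart, and the difference comes entirely from the guessed factors, which is the commenter's point.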

Frank K.
August 27, 2010 6:59 am

Steven Mosher says:
August 26, 2010 at 10:30 pm
Interesting link, Steve. Thanks.
Here’s the part of the abstract that caught my interest:
“Firstly, climate model biases are still substantial, and may well be systemically related to the use of deterministic bulk-formula closure – this is an area where a much better basic understanding is needed. Secondly, deterministically formulated climate models are incapable of predicting the uncertainty in their predictions; and yet this is a crucially important prognostic variable for societal applications.”
Translation: we really don't have a handle on how to model the physics in our GCMs using simple parameterizations, and, in any case, we won't be able to determine whether the answer is correct or not (which is important before we start doing insane things like getting rid of all fossil fuels for power generation and transportation).
All the more reason to abandon this wasteful, unnecessary NASA project…

August 27, 2010 8:48 am

Frank, you actually have to watch the video:
“Secondly, deterministically formulated climate models are incapable of predicting the uncertainty in their predictions; and yet this is a crucially important prognostic variable for societal applications.”
If you watch the video you will note how the type of experiment suggested here fits into stochastic modelling of physical processes. You'd even learn about "probabilistic chips." The problem of parameterization (see the discussion of Navier-Stokes) doesn't stop folks from building useful things that work. The issue is that up to now there have really been two camps of modellers. Palmer is suggesting a third path. If you run a deterministic ab initio model, you will get the same answer every time, and that answer will be wrong. As it is practiced, there are over 20 models run this way and the ensemble mean is taken. That mean is more skillful than any one model. Palmer is suggesting that if you introduce stochastic methods into the problem of bulk parameterizations, you get a model that is non-deterministic (it provides uncertainty) AND outperforms the ensemble mean. Which means that instead of 20 different models modelling the physics in slightly different ways, you get ONE MODEL, run many different times, that is more skillful. This would be a huge change, because it would mean the research community wouldn't do some wasteful things, like build yet another GCM (YAGCM).
The GENERALIZED rants against models are a sign of intellectual laziness. All of physics is models. When those models (equations) do very, very well, we call them laws.
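As a rough illustration of the deterministic-versus-stochastic contrast in that comment (my own toy sketch; Palmer's stochastic physics operates inside a full GCM, not on a one-line bulk formula):

```python
# Toy contrast between a deterministic bulk closure and a stochastically
# perturbed one (illustrative sketch only, not Palmer's actual scheme).
import random

def bulk_closure(state, coeff=0.8):
    """Deterministic bulk formula: the same input always gives the same output."""
    return coeff * state

def stochastic_closure(state, coeff=0.8, noise=0.1):
    """The same formula with multiplicative noise, in the spirit of stochastic physics."""
    return coeff * state * (1.0 + random.gauss(0.0, noise))

state = 10.0
deterministic = bulk_closure(state)                        # one answer, no uncertainty estimate
members = [stochastic_closure(state) for _ in range(50)]   # an ensemble with spread
mean = sum(members) / len(members)
print(deterministic, round(mean, 2), round(max(members) - min(members), 2))
```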

August 27, 2010 9:15 am

"That is, they formulate their theories in terms of parameters which can't be physically measured (the theories can't be falsified in Popper's sense)."
1. That's not true. The parameterizations are typically not about things that can't be measured. They are bulk representations of processes that can be measured and are measured. But the finer-grained implementations can't be run, say due to compute power.
2. The theories CAN be 'falsified' (see below for what that really means) and ARE. When you do a bulk parameterization and get the 'wrong' answer, that tells you something's 'wrong'. But more to the point, there is no such thing as Popperian falsification, as Popper himself came to realize. Whenever a theory (a statement with the purity of math) comes into conflict with data (which is ALWAYS), you have these choices:
1. Reject the data (an experiment that questioned F=ma would likely not get attention).
2. Modify the theory, especially if it got many other things right.
3. Ignore the theory, especially if it wasn't well connected to other science.
4. Claim that the disagreement (if small) between the theory and the data was "error" or "uncertainty" and say that the theory was "verified".
Data doesn't prove theories 'false' any more than it proves them 'true'. Data renders theories more or less useful for some purpose. At one extreme some are so useful and so skillful that we call them 'true'. At the other extreme they are so useless that we call them 'false'. If we observe how people actually respond to mismatches between data and theory and catalog the responses, we see that they do one of the four things above. But from a logical standpoint, data cannot prove (in a logical, mathematical sense) that a theory is TRUE, and can't prove (in a logical sense) that it is 'wrong'.
You also need to understand what is meant by falsifiable in principle.
start here:
http://en.wikipedia.org/wiki/Falsifiability
read Quine.

Frank K.
August 27, 2010 10:12 am

“Frank, you actually have to watch the video:”
OK. I watched the video (it’s an hour long!).
The probabilistic computer chip idea sounded very interesting. I'll have to read up on that.
As for the rest of it, it was somewhat interesting (boring in spots), and no, I see NO resemblance between what was discussed in the video and the wasteful, uninteresting, and useless NASA project being discussed here. The toy model NASA wants to use is nowhere near the complexity and behavior of modern climate models. Again, I return to my example of the coarse-grid Navier-Stokes solution: I can perturb the dynamic viscosity, the freestream velocity, etc. all day long and it will tell me nothing about the true behavior of the system, because the model is not adequately representing the true physics.
By the way, take a look at the slide that is presented at 31:15. There, Prof. Palmer shows how much spread there is in the climate models’ predictions of global temperature when you look at the absolute temperatures rather than anomalies. Where have I heard that one before…
If our economy were in better shape and our government weren’t so burdened with massive debt, I would perhaps be more open to these kinds of projects. But we DON’T HAVE THE MONEY FOR THIS JUNK. Of course, if they really think this is important, NASA should consider canceling other projects to free up money for this…
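Frank K.'s coarse-grid point is the classic structural-error argument: if the model form is wrong, perturbing its parameters never recovers the true behavior. A deliberately trivial sketch of that idea (my own illustration, not his solver):

```python
# Structural error sketch: the "true physics" is quadratic, the model is linear,
# so no amount of parameter perturbation brings the model near the truth.
def true_system(x):
    return 0.5 * x ** 2          # stand-in for the real physics

def wrong_model(x, slope):
    return slope * x             # structurally inadequate model

x_test = 10.0
truth = true_system(x_test)                                        # 50.0
predictions = [wrong_model(x_test, s) for s in (0.8, 1.0, 1.2)]    # perturb the parameter
print(truth, predictions)                                          # 50.0 vs [8.0, 10.0, 12.0]
```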

John C
August 27, 2010 12:41 pm

I always find it funny that the Anti-GW crowd chimes in with quips about "nature being random" and "models being useless," yet physical models are used every day. We fly planes built in simulation. We have more accurate forecasting than decades ago because of modeling.
Some of you may say: "The weatherman is always wrong." There's a big difference between micro- and mesocyclonic events. Modeling of Rossby waves tracks temperature variations at the planetary surface fairly closely, and trends can be forecast several weeks in advance.
Is modeling perfect? No. However, I disagree vehemently with suggestions that models are no better than a random guess. That is absolutely not true, and it's a slap in the face to the thousands of meteorologists and climatologists who work day in and day out to save people from floods, tornadoes, and other natural disasters.
To me, one thing is absolutely certain about Global Warming: nobody really knows the absolute truth. We can only project the future from our current understanding of physics and some large assumptions. But at the end of the day, am I going to trust people who proclaim they know the truth without a single shred of doubt? If you're truly a scientist, there's always room for doubt. I notice that the most ardent Anti-Global Warming folks like to focus on little inconsistencies, yet they gloss over the majority of scientific research whose data run contrary to their beliefs.