Be a closet climate modeler with Climate@home

First there was SETI@home, which let you use spare CPU cycles to look for extraterrestrial signals. Now we have Climate@Home, which will run NASA GISS modeling software via distributed computing, though no release date has been announced. Oxford has already created climateprediction.net, which runs on BOINC. I expect it’s a case of “keeping up with the Joneses.”

I wonder, though: what’s the “carbon footprint” of leaving all those computers running to calculate the effects of the fossil fuels burned to run them? The Met Office supercomputer is already drawing fire for its massive electricity use, so does this simply spread the problem around?

From NASA’s CIO: Creating a Virtual Supercomputer to Model Climate

Climate@Home

NASA will be calling on people worldwide to help determine the accuracy of a computer model that scientists use to predict climate change. The initiative, called “Climate@Home,” is unprecedented in scope. Never before has NASA attempted to recruit so many people to help perform research vital to forecasting the Earth’s climate in the 21st century under a wide range of scenarios.

NASA’s Earth Science Division (ESD) and Office of the Chief Information Officer (OCIO) have strategically partnered to manage the Climate@Home initiative. The effort will include collaborations among the 10 NASA Centers, the 13 Federal agencies of the USGCRP (United States Global Change Research Program), and several universities and private organizations.

Goddard Space Flight Center (GSFC)’s Robert Cahalan is serving as the project scientist and has assembled an international team of scientists to help set science goals and determine which parameters to run. GSFC’s senior advisor to the CIO, Myra Bambacus, serves as the project manager and will run this initiative.

Participants need no special training to get involved in Climate@Home. All they need is a desktop or laptop computer. Volunteers will be able to download the computer model and run it as a background process whenever their computers are on but not being used to full capacity.

The climate model that volunteers download is made up of mathematical equations that quantitatively describe how atmospheric temperature, air pressure, winds, water vapor, clouds, precipitation and other factors all respond to the Sun’s heating of the Earth’s surface and atmosphere. Models help predict how the Earth’s climate might respond to small changes in Earth’s ability to absorb sunlight or radiate energy into space.
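To give a flavor of the kind of equations involved, here is a minimal sketch of a zero-dimensional energy-balance model in Python. This is a textbook toy for illustration only, not the GISS model; the constants are standard round numbers, and a real climate model resolves the full three-dimensional atmosphere rather than a single temperature.

```python
# Minimal zero-dimensional energy-balance model: the planet warms until
# outgoing longwave radiation balances absorbed sunlight. Illustration only;
# the real GISS model resolves a 3-D atmosphere, not one global temperature.

SOLAR = 1361.0     # solar constant, W/m^2
ALBEDO = 0.30      # fraction of sunlight reflected back to space
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W/m^2/K^4
EMISSIVITY = 0.61  # effective emissivity (crudely stands in for the greenhouse effect)
HEAT_CAP = 4.0e8   # effective heat capacity of surface + ocean mixed layer, J/m^2/K

def step(temp_k: float, dt_s: float) -> float:
    """Advance the global-mean surface temperature by one time step of dt_s seconds."""
    absorbed = SOLAR * (1.0 - ALBEDO) / 4.0     # incoming solar, averaged over the sphere
    emitted = EMISSIVITY * SIGMA * temp_k ** 4  # outgoing longwave radiation
    return temp_k + dt_s * (absorbed - emitted) / HEAT_CAP

temp = 255.0                # start from a cold state, in kelvin
DAY = 86400.0
for _ in range(365 * 50):   # integrate 50 years of daily steps
    temp = step(temp, DAY)
print(f"Equilibrium temperature: {temp:.1f} K")  # settles near 288 K
```

Even this toy settles near the observed global-mean surface temperature of about 288 K, which is why energy balance is the usual starting point for explaining what the full models do.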

Scientists traditionally have used supercomputers to test the sensitivity and accuracy of climate models. With these powerful machines, they are able to run millions of calculations, each computing a different scenario or combination of circumstances such as varying levels of chlorine or water vapor in the atmosphere.

Instead of using supercomputers in this effort, NASA is creating a virtual supercomputing network that spreads the data-processing chores across thousands of computers. Such a task-sharing initiative eliminates the need to buy additional supercomputers, which consume enormous amounts of energy, and reduces the “carbon footprint” of running these calculations.

Prior to the initiative’s official rollout in early 2011, the project will be announced in the news. The project website will provide instructions on how to download the models and supporting computer software. The goal is to have recruited tens of thousands of participants by the time the initiative begins. Each participant will run the same model, but with certain parameters slightly adjusted. Scientists will examine how sensitive the climate predictions are to those adjustments, yielding a better understanding of which parameters should be studied in the future.
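In other words, this is a perturbed-parameter ensemble. Here is a rough sketch of how such work units could be generated, with hypothetical parameter names and values standing in for whatever NASA actually varies (the real Climate@Home client and parameter list were not public at the time of writing):

```python
# Sketch of a perturbed-parameter ensemble: every participant runs the same
# model, but each work unit nudges the parameters slightly. The parameter
# names below are hypothetical placeholders, not NASA's actual list.
import random

BASELINE = {
    "cloud_droplet_radius_um": 10.0,
    "ocean_mixing_coeff": 1.0,
    "aerosol_scaling": 1.0,
}

def make_work_units(n_units, spread=0.1):
    """Generate n_units parameter sets, each a small random perturbation
    of the baseline (within +/- spread, fractionally)."""
    rng = random.Random(42)  # fixed seed so the units are reproducible
    units = []
    for i in range(n_units):
        params = {k: v * (1.0 + rng.uniform(-spread, spread))
                  for k, v in BASELINE.items()}
        units.append({"unit_id": i, "params": params})
    return units

# Server side: queue the units for volunteers to download.
for unit in make_work_units(5):
    print(unit["unit_id"], unit["params"])
```

Each volunteer machine would pull one unit, run the model with those values, and upload the result; comparing outputs across units shows which parameters the predictions are most sensitive to.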

Climate@Home will have the option of using the same high-level architecture developed for “SETI@Home,” a scientific experiment that uses Internet-connected computers in the Search for Extraterrestrial Intelligence program. The initiative also is modeled after a similar project coordinated by the Oxford e-Research Centre in the United Kingdom called Climateprediction.net.

Climate@Home will test the accuracy of a model developed by the Goddard Institute for Space Studies (GISS) in New York, and will serve as a trailblazer for exploring the accuracy of other models as well.

==============

h/t to WUWT reader Bruce Foutch

81 Comments
Chris F
August 25, 2010 4:50 pm

No thanks, think I’ll pass on this…

Leon Brozyna
August 25, 2010 4:55 pm

You have got to be kidding!
I’m not letting GISS anywhere near my computer!

bill-tb
August 25, 2010 4:56 pm

It’s a crackpot idea to make people feel good.

pwl
August 25, 2010 4:57 pm

“NASA will be calling on people worldwide to help determine the accuracy of a computer model that scientists use to predict climate change.”
Comparisons with the Objective Reality of Nature are what would determine its accuracy, not people. For some strange reason climate scientists are reluctant to compare their models with what is really happening in Nature. Strange, that.
However, we already know that it’s a wild chicken shoot, for no models can accurately predict Nature with her built-in randomness.

Keith
August 25, 2010 4:57 pm

The BBC did something similar a while back. I don’t remember much coming of it. Anyway, I’m not quite sure how we can “help determine the accuracy of a computer model that scientists use to predict climate change” by, er, running a program on our computers.
You don’t verify a model with another model or abstract number-crunching, but through real-world observation. Not the first time that this basic aspect of the scientific process seems to have been overlooked by those in the field of climat-“ology”.

Chris B
August 25, 2010 5:00 pm

Lots of garbage in………lots of garbage out.

DCC
August 25, 2010 5:05 pm

So what’s the point? Do they think that if I run their models on my computer, I will be more likely to believe the results? Hummph!

Jimash
August 25, 2010 5:16 pm

SETI@home was an extraordinarily unrewarding use of screensaver time.
BOINC was worse, since it works in the background and steals cycles you might actually need.
Neither ever gave the slightest hint of what it was doing or offered any way to examine any data, like maybe the sound (like in Contact), or the direction in which they were looking, or a star map of where the thing was pointed. Nothing. Might as well be the IRS.

Keith Minto
August 25, 2010 5:20 pm

Climate@home? More like Weather@home.
If past experience is a guide, NASA will be very selective in whom they choose.

August 25, 2010 5:27 pm

SETI probably failed because the aliens got a look at what passes for Sol III intelligence, and took a pass.

latitude
August 25, 2010 5:31 pm

We have too many pot smokers in government……….

Bill Illis
August 25, 2010 5:36 pm

Skynet combined with Connolley’s Wikipedia?

pwl
August 25, 2010 5:36 pm

Stephen Wolfram proves with his A New Kind of Science that certain systems generate randomness from within the system: no outside randomness needed, no randomness in initial conditions needed. The system itself generates randomness, thus making it unpredictable as a first principle of science. Oh, and these systems can be incredibly simple and yet generate inherent randomness from within. Let that sink in.
The implication for climate and weather prediction is that one can’t predict, due to the inherent randomness of the natural systems. No matter how elaborate you make your “climate model,” you’ll never be able to predict it with any accuracy.
Wolfram makes his case in his book A New Kind of Science, and he presents the key ideas in this video. Enjoy learning. Oh, Rule 30 is one simple system that generates its own randomness; a minimal sketch is below. You’ll want to read chapter two of the book, http://www.wolframscience.com/nksonline/page-23, for the proof. The entire book is amazing; half of it is references.
Wolfram’s work has very serious implications for climate science and weather forecasting.
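For readers who want to see this for themselves, here is a minimal Rule 30 sketch in Python (my own illustration, not code from the book): the update rule is completely deterministic, yet the center column of the output behaves like a random bit stream.

```python
# Rule 30 cellular automaton: each cell's next state depends only on itself
# and its two neighbors, yet the resulting pattern is effectively random.
def rule30_step(cells):
    n = len(cells)
    nxt = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        # Rule 30 reduces to: new cell = left XOR (center OR right)
        nxt.append(left ^ (center | right))
    return nxt

width = 63
cells = [0] * width
cells[width // 2] = 1            # a single black cell in the middle
for _ in range(30):              # print 30 generations as a triangle
    print("".join("#" if c else "." for c in cells))
    cells = rule30_step(cells)
```

Run it and watch the triangle grow: the left edge stays regular while the interior never settles into a repeating pattern.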

Urederra
August 25, 2010 5:40 pm

There is Folding@home, or something like that, where they use your computer to study protein misfolding.
http://folding.stanford.edu/
Out of the three (SETI, Folding, and Climate), Folding is the closest to real science.

Kilted Mushroom
August 25, 2010 5:52 pm

Perhaps this idea could be used by skeptical scientists with no access to supercomputers to model some of the missing parameters in the standard models?

Spartacus
August 25, 2010 5:56 pm

How can this be new? I downloaded and ran this screensaver in 2002!!! Downloaded from climateprediction.net. This is very old news. Just an effort from the modellers to gather back the computing power lost as people like myself drifted away over time…. Don’t bother.

noaaprogrammer
August 25, 2010 5:56 pm

This is similar to GIMPS – the Great Internet Mersenne Prime Search.

August 25, 2010 5:56 pm

As long as CO2 is given priority for (alleged) AGW in computer models, they will always be wrong. GI/GO

Rick Bradford
August 25, 2010 6:03 pm

I expect it’s a case of “keeping up with the Joneses”

If you’re referring to the East Anglian Joneses, I think we should be keeping away from them, not up with them….

pkatt
August 25, 2010 6:05 pm

Doh! That’s it.. just DOH!

wayne
August 25, 2010 6:10 pm

“The climate model that volunteers download is made up of mathematical equations that quantitatively describe how atmospheric temperature, air pressure, winds, water vapor, clouds, precipitation and other factors all respond to the Sun’s heating of the Earth’s surface and atmosphere. Models help predict how the Earth’s climate might respond to small changes in Earth’s ability to absorb sunlight or radiate energy into space.”
What, are they going to have all of these people’s computers MEASURING all of these small changes? Models cannot predict when they are faulty to begin with.
How about the low-angle reflection off of the oceans, the ice, and glossy leaves at the edges of this globe as the world spins? Are they going to measure and allow for that too? How, NASA, exactly how?
It’s always the missing and mis-estimated parameters and faulty interpolation equations that get them in the end. Methinks they need 1000 worlds with everybody’s computers interconnected, all 6.5 billion of us, and that probably still won’t be enough, let alone the missing or faulty measurements.
I have no doubt that they will come out with some world from their simulation (model); it just won’t be our Earth.

Aldi
August 25, 2010 6:12 pm

Compare models with more models? I’m guessing a new headline from the communist networks would be something along the lines of: “our models have reached a consensus, and it is worse than what we previously believed was worse than we thought.”

Graeme
August 25, 2010 6:22 pm

ooohhh – can I get a computer model which sets water vapour as a negative feedback?

Paul Linsay
August 25, 2010 6:25 pm

I did the ClimatePrediction.net thing. It got withdrawn because about half the time the model would plunge into an Ice Age. Can’t have that, can we?

George E. Smith
August 25, 2010 6:34 pm

So how big are the grants they are offering to become real climate scientists in this endeavor?
