Be a closet climate modeler with Climate@home

First there was SETI@home, where you could donate spare CPU cycles to the search for extraterrestrial signals. Now we have Climate@Home, which will run NASA GISS modeling software as a distributed-computing project, though no release date has been set. Oxford has already created climateprediction.net, which runs on BOINC. I expect it’s a case of “keeping up with the Joneses.”

I wonder, though: what’s the “carbon footprint” of leaving all those computers running to calculate the effects of the fossil fuels burned to run them? The Met Office supercomputer is already drawing fire for its massive electricity use, so does this simply spread the problem around?

From NASA’s CIO: Creating a Virtual Supercomputer to Model Climate

Climate@Home

NASA will be calling on people worldwide to help determine the accuracy of a computer model that scientists use to predict climate change. The initiative, called “Climate@Home,” is unprecedented in scope. Never before has NASA attempted to recruit so many people to help perform research vital to forecasting the Earth’s climate in the 21st century under a wide range of scenarios.

NASA’s Earth Science Division (ESD) and Office of the Chief Information Officer (OCIO) have strategically partnered to manage the Climate@Home initiative. This effort will include collaborations between the 10 NASA Centers, the 13 Federal agencies of the USGCRP (United States Global Change Research Program) along with several universities and private organizations.

Goddard Space Flight Center (GSFC)’s Robert Cahalan is serving as the project scientist and has assembled an international team of scientists to help set science goals and determine which parameters to run. GSFC’s senior advisor to the CIO, Myra Bambacus, serves as the project manager and will run this initiative.

Participants need no special training to get involved in Climate@Home. All they need is a desktop computer or laptop. Volunteers will be able to download the computer model to run on their computers as a background process whenever the computers are on, but not used to their full capacity.

The climate model that volunteers download is made up of mathematical equations that quantitatively describe how atmospheric temperature, air pressure, winds, water vapor, clouds, precipitation and other factors all respond to the Sun’s heating of the Earth’s surface and atmosphere. Models help predict how the Earth’s climate might respond to small changes in Earth’s ability to absorb sunlight or radiate energy into space.
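As a rough illustration of the kind of balance such models encode (this is not the GISS code, and the parameter values below are textbook-style assumptions chosen only for the sketch), a zero-dimensional energy-balance calculation shows how a small change in the Earth’s ability to absorb sunlight shifts the equilibrium temperature:

```python
# Illustrative zero-dimensional energy-balance model (NOT the GISS model):
# find the surface temperature at which absorbed solar flux balances
# outgoing longwave radiation. Parameter values are illustrative only.

SOLAR_CONSTANT = 1361.0   # W/m^2, incoming solar irradiance
SIGMA = 5.670e-8          # W/m^2/K^4, Stefan-Boltzmann constant

def equilibrium_temp(albedo=0.30, emissivity=0.61):
    """Solve emissivity * sigma * T^4 = (S/4) * (1 - albedo) for T (kelvin)."""
    absorbed = SOLAR_CONSTANT / 4.0 * (1.0 - albedo)
    return (absorbed / (SIGMA * emissivity)) ** 0.25

# Small changes in albedo (ability to absorb sunlight) or effective
# emissivity (ability to radiate energy to space) shift the equilibrium:
base = equilibrium_temp()                 # roughly 288 K with these values
perturbed = equilibrium_temp(albedo=0.31) # about a degree cooler
```

Real climate models couple thousands of such balance equations across a three-dimensional grid, which is why they need so much computing power.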

Scientists traditionally have used supercomputers to test the sensitivity and accuracy of climate models. With these powerful machines, they are able to run millions of calculations, each computing a different scenario or combination of circumstances such as varying levels of chlorine or water vapor in the atmosphere.

Instead of using supercomputers in this effort, NASA is creating a virtual supercomputing network that spreads the data-processing chores across thousands of computers. Such a task-sharing initiative eliminates the need to buy additional supercomputers, which consume enormous amounts of energy, and reduces the “carbon footprint” of running these calculations.

Prior to the initiative’s official roll out in early 2011, the project will be announced in the news. The project website will provide instructions on how to download the models and supporting computer software. The goal is to have recruited tens of thousands of participants by the time the initiative begins. Each participant will run the same model but with certain parameters slightly adjusted. Scientists will examine the resulting sensitivity of the climate predictions to those adjustments, resulting in a better understanding of the parameters that should be studied in the future.
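The parameter-sweep scheme described above can be sketched in a few lines; the parameter names and perturbation factors here are invented for illustration and are not drawn from the GISS model:

```python
# Hypothetical sketch of a perturbed-parameter ensemble: each "work unit"
# is the same model configuration with one or more parameters nudged, and
# each unit would be handed to a different volunteer's machine.
import itertools

BASELINE = {"cloud_feedback": 1.0, "ocean_mixing": 1.0, "aerosol_forcing": 1.0}
PERTURBATIONS = [0.9, 1.0, 1.1]  # multiplicative nudges applied per parameter

def make_work_units(baseline, factors):
    """Yield one parameter set per combination of perturbation factors."""
    names = sorted(baseline)
    for combo in itertools.product(factors, repeat=len(names)):
        yield {name: baseline[name] * f for name, f in zip(names, combo)}

units = list(make_work_units(BASELINE, PERTURBATIONS))
# 3 parameters x 3 factors each -> 27 distinct runs to distribute
```

Comparing how the simulated climate diverges across the returned runs is what reveals which parameters the predictions are most sensitive to.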

Climate@Home will have the option of using the same high-level architecture developed for “SETI@Home,” a scientific experiment that uses Internet-connected computers in the Search for Extraterrestrial Intelligence program. The initiative also is modeled after a similar project coordinated by the Oxford e-Research Centre in the United Kingdom called Climateprediction.net.

Climate@Home will test the accuracy of a model developed by the Goddard Institute for Space Studies (GISS) in New York, and will serve as a trailblazer to explore the accuracy of other models as well.

==============

h/t to WUWT reader Bruce Foutch


81 Comments
George E. Smith
August 25, 2010 6:36 pm

Why not just look at what Mother Gaia does, and be done with it; she always gets the right answer; she has the world’s biggest parallel computing system; with more computing nodes than you can even count.

sky
August 25, 2010 6:52 pm

NASA says: “Scientists traditionally have used supercomputers to test the sensitivity and accuracy of climate models. ”
And here I thought that real-world measurements were necessary for testing model accuracy. Silly me!

Scott
August 25, 2010 7:00 pm

I’m willing to participate in this if they also give me the raw temperature record data to analyze on my own.
-Scott

R. Shearer
August 25, 2010 7:00 pm

Gotta save the NASA mainframes for fantasy football, surfing and porn.

Frank K.
August 25, 2010 7:05 pm

“The climate model that volunteers download is made up of mathematical equations that quantitatively describe how atmospheric temperature, air pressure, winds, water vapor, clouds, precipitation and other factors all respond to the Sun’s heating of the Earth’s surface and atmosphere. Models help predict how the Earth’s climate might respond to small changes in Earth’s ability to absorb sunlight or radiate energy into space.”
Maybe GISS will share with us exactly what those “equations” are. It appears that they don’t really know themselves…(and don’t really care – too busy blogging, you know).
And what does it say about a code when you have to rely on some stupid scheme such as this to test its “accuracy”… Accuracy compared to what? Hindcasting again? I’d better stop before I say too much…

Frank K.
August 25, 2010 7:26 pm

noaaprogrammer says:
August 25, 2010 at 5:56 pm
“This is similar to GIMPS – the Great Internet Mersenne Prime Search.”
The difference between GIMPS and the NASA climate “modeling” effort is that GIMPS is a well-posed mathematical problem…

Eddie
August 25, 2010 7:34 pm

‘reduces the “carbon footprint” ‘
lol…how does shifting the cycles to a less efficient machine decrease the carbon footprint? FAIL

juanslayton
August 25, 2010 7:34 pm

Does this mean we get a really transparent look at their code? Or will we find an end user agreement saying that we can’t peek…?

August 25, 2010 7:42 pm

I have two Dell 1850’s, each with 2 x quad-core 3.1 GHz CPUs, 8 GB RAM, 2 x 15k rpm drives on PERC controllers, and dual fibre channel HBAs, in my garage. There they will sit, awaiting a worthy cause; this isn’t one.
(Wish the ATI 5700 series graphics cards would slot into a 1U chassis)

Lew Skannen
August 25, 2010 8:02 pm

That Grauniad article is funny. I love this line.
“save thousands of lives by accurately modelling a climate-related disaster.”
I am totally in favour of running big computers to do modelling …( if the modelling is worthwhile)… but I really love the idea that these guys see themselves as saviours of the universe 24/7.

noaaprogrammer
August 25, 2010 8:06 pm

noaaprogrammer says:
“This is similar to GIMPS – the Great Internet Mersenne Prime Search.”
Frank K. says:
“The difference between GIMPS and the NASA climate “modeling” effort is that GIMPS is a well-posed mathematical problem…”
Yes, I should have said that the distributed processing approach is similar.

kadaka (KD Knoebel)
August 25, 2010 8:14 pm

Is it really that hard for them to get a Beowulf cluster to do this work? They could set one up for less than a climate modeler’s annual salary.
Heck, the US Air Force has their Sony PlayStation 3 supercomputing cluster. NASA could do climate modeling on old X-Boxes picked up at yard sales.
How much does Gavin earn in a year? 600 new PS3 consoles?

Ed Caryl
August 25, 2010 8:43 pm

When their models can predict the proverbial butterfly flapping its wings in China and causing rain in Oregon, then I’ll believe.
Where did they get the idea this would produce a smaller carbon footprint? All those individual computers left on all the time will burn 10 times the wattage that the same number of cores in a supercomputer will burn. It will just spread the carbon like peanut butter, all over the world.

Ben
August 25, 2010 8:43 pm

If they released something like this and tested models to KNOWN and measured conditions and the winner received X, it could be worthwhile.
Like, the people testing it could input their own assumptions into the model and find out whose assumptions work out best against the known temperature records.
But no, we have to waste more CPU cycles on re-learning the CS 101 lesson known as garbage in- garbage out.
These people remind me of my class-mates in college who I would poke fun at when I would tell them “there are 10 kinds of people in this world..those who understand binary and those who should find a new major..”
Of course, I stole that one and rearranged it for my own purposes, but let’s be honest, since when has a model proven reality?
If that was even possible, I could prove the evidence of God by inputting a God variable and saying, see, my model works, God exists….

Richard Keen
August 25, 2010 8:43 pm

I tried to let the climate software onto my computer, but my anti-virus stopped it.

jorgekafkazar
August 25, 2010 9:02 pm

My apathy knows no bounds!

August 25, 2010 9:12 pm

Distributed computing. Hmmm.
With a little adjustment work and the right software, we could install the “correct” CO2 feedback value. I’m still busy doing assembler on my ’286, so any volunteers?

Graeme
August 25, 2010 9:25 pm

kadaka (KD Knoebel) says:
August 25, 2010 at 8:14 pm
Is it really that hard for them to get a Beowulf cluster to do this work? They could set one up for less than a climate modeler’s annual salary.

But that would be to initiate “Cost Effective Thinking” and that option is greyed out on the Climate Programs menu…
The AGW types never think about real world cost benefit analyses – it gets in the way of their sad little dogmas.

August 25, 2010 9:43 pm

No computer cycles (although I prefer to call them T-states) on my computer will be used for this.
This and SETI are based on junk science: too many parameters are simply guessed, and those guesses have not been backed up by observational data. And the Drake equation has only a limited set of parameters compared to our climate, yet it cannot produce an accurate prediction of how many planets will have intelligent life.
A while ago a sceptic added another parameter to the equation: the average life span of an average kingdom/republic/state. We have had enough of them in our history, like the Mayans, Egyptians, Roman Empire, etc.
The outcome then was rather disturbing, because it was 0.3, I believe; that means either no intelligent life at all, or just one: us! And somehow I doubt the outcome of 1.

UK Sceptic
August 26, 2010 12:05 am

If I want a model based prediction all I have to do is switch on the TV and watch the garbage put out by the Met Office. Meanwhile my computer has better things to do with its processor.

pwl
August 26, 2010 12:12 am

“On two occasions I have been asked [by members of the British Parliament], ‘Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?’ I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.”
– Charles Babbage, The Inventor of the Analytical Engine – the first computer

Strawanzer
August 26, 2010 1:23 am

Maybe you’d better spend your (computational) time on more interesting things, e.g. http://einstein.phys.uwm.edu/

Paul Vaughan
August 26, 2010 1:38 am

spyware?

John Marshall
August 26, 2010 1:49 am

Not another climate model to get it wrong. I have never thought models of chaotic systems were any good or produced any result worth looking at. Both climate and weather are chaotic systems. Need I say more, or even waste the battery power of my good old laptop?

Jane Coles
August 26, 2010 2:51 am

“Each participant will run the same model but with certain parameters slightly adjusted.”
And this participant will disassemble the code and tweak these parameters further — thus helping ‘climate science’ to maintain the fine reputation for data and model integrity that it has earned over the years.