Use your PC to help map the Milky Way

From a Rensselaer Polytechnic Institute press release

PCs around the world unite to map the Milky Way

http://www.astro.wisc.edu/sirtf/skymt_payne_big.jpg

Combined computing power of the MilkyWay@Home project recently surpassed the world’s second fastest supercomputer

Troy, N.Y. – At this very moment, tens of thousands of home computers around the world are quietly working together to solve the largest and most basic mysteries of our galaxy.

Enthusiastic and inquisitive volunteers from Africa to Australia are donating the computing power of everything from decade-old desktops to sleek new netbooks to help computer scientists and astronomers at Rensselaer Polytechnic Institute map the shape of our Milky Way galaxy. Just this month, the collected computing power of these humble home computers exceeded one petaflop, a speed faster than the world’s second fastest supercomputer.
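For scale, one petaflop is 10^15 floating-point operations per second. A rough back-of-the-envelope check makes the figure plausible; the per-machine average used below is an assumed number (roughly what a GPU-equipped volunteer PC of the era might sustain), not one taken from the press release:

```python
# Rough sanity check: aggregate throughput of the volunteer pool.
# The per-machine average is an assumption (plausible for a mix of
# CPU-only PCs and GPU machines), not a figure from the release.
PETAFLOP = 1e15                 # floating-point operations per second

active_machines = 17_000        # active users cited in the release
avg_flops_per_machine = 60e9    # assumed average: ~60 gigaflops each

aggregate = active_machines * avg_flops_per_machine
print(f"Aggregate: {aggregate / PETAFLOP:.2f} petaflops")
```

Under those assumptions the pool lands just above one petaflop, consistent with the milestone the release describes.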

The project, MilkyWay@Home, uses the Berkeley Open Infrastructure for Network Computing (BOINC) platform, which is widely known for the SETI@home project used to search for signs of extraterrestrial life. Today, MilkyWay@Home has outgrown even this famous project, in terms of speed, making it the fastest computing project on the BOINC platform and perhaps the second fastest public distributed computing program ever in operation (just behind Folding@home).

The interdisciplinary team behind MilkyWay@Home, which ranges from professors to undergraduates, began formal development under the BOINC platform in July 2006 and worked tirelessly to grow a volunteer base, and with it the project’s computational power, from the ground up.

Each user participating in the project signs up their computer and offers up a percentage of the machine’s operating power that will be dedicated to calculations related to the project. For the MilkyWay@Home project, this means that each personal computer is using data gathered about a very small section of the galaxy to map its shape, density, and movement.
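The division of labor can be pictured with a toy sketch: the server cuts a survey region into many small patches and hands one patch to each volunteer machine. The region boundaries, grid sizes, and the `WorkUnit` structure below are hypothetical illustrations, not the project’s actual server code:

```python
from dataclasses import dataclass

@dataclass
class WorkUnit:
    """A small slice of sky for one volunteer computer to analyze."""
    ra_min: float   # right ascension range, degrees
    ra_max: float
    dec_min: float  # declination range, degrees
    dec_max: float

def partition_region(ra_span=(130.0, 230.0), dec_span=(-10.0, 10.0),
                     n_ra=50, n_dec=10):
    """Split a survey wedge into n_ra * n_dec independent work units."""
    ra_step = (ra_span[1] - ra_span[0]) / n_ra
    dec_step = (dec_span[1] - dec_span[0]) / n_dec
    return [
        WorkUnit(ra_span[0] + i * ra_step, ra_span[0] + (i + 1) * ra_step,
                 dec_span[0] + j * dec_step, dec_span[0] + (j + 1) * dec_step)
        for i in range(n_ra) for j in range(n_dec)
    ]

units = partition_region()
print(len(units))  # 500 small patches, each assignable to a client
```

Because each patch is independent, the server can hand them out in any order and to any machine, which is what makes this style of problem such a good fit for volunteer computing.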

In particular, computers donating processing power to MilkyWay@Home are looking at how the different dwarf galaxies that make up the larger Milky Way galaxy have been moved and stretched following their merger with the larger galaxy millions of years ago. This is done by studying each dwarf’s stellar stream. Their calculations are providing new details on the overall shape and density of dark matter in the Milky Way galaxy, which remains largely unknown.
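As a toy illustration of the kind of model fitting involved, the sketch below separates a “stream” of stars from a uniform background using a simple one-dimensional mixture likelihood. The star distances, densities, and parameters are invented for illustration; the project’s real likelihood runs over three-dimensional survey wedges with far richer models:

```python
import math

def stream_loglike(stars, frac, mu, sigma):
    """Toy 1-D mixture log-likelihood: a Gaussian 'stream' of weight
    `frac` against a uniform 'background' over [0, 100) kpc.
    stars: distances along the line of sight, in kpc."""
    bg_density = 1.0 / 100.0
    total = 0.0
    for d in stars:
        stream_density = math.exp(-0.5 * ((d - mu) / sigma) ** 2) / (
            sigma * math.sqrt(2 * math.pi))
        total += math.log(frac * stream_density + (1 - frac) * bg_density)
    return total

# A grid search over the stream centre -- each grid cell is the kind of
# independent evaluation a volunteer machine could run on its own.
stars = [24.8, 25.1, 25.3, 24.9, 60.2, 10.7, 25.0, 88.4]
best_mu = max(range(5, 100, 5),
              key=lambda mu: stream_loglike(stars, 0.5, mu, 1.0))
print(best_mu)  # the clustered stars make the likelihood peak at 25 kpc
```

Each candidate set of parameters can be scored independently, so the search parallelizes naturally across thousands of home computers.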

The galactic computing project had very humble beginnings, according to Heidi Newberg, associate professor of physics, applied physics, and astronomy at Rensselaer. Her personal research to map the three-dimensional distribution of stars and matter in the Milky Way using data from the extensive Sloan Digital Sky Survey could not find the best model to map even a small section of a single galactic star stream in any reasonable amount of time.

“I was a researcher sitting in my office with a very big computational problem to solve and very little personal computational power or time at my fingertips,” Newberg said. “Working with the MilkyWay@Home platform, I now have the opportunity to use a massive computational resource that I simply could not have as a single faculty researcher, working on a single research problem.”

Before taking the research to BOINC, Newberg worked with Malik Magdon-Ismail, associate professor of computer science, to create a stronger and faster algorithm for her project. Together they greatly increased the computational efficiency and set the groundwork for what would become the much larger MilkyWay@Home project.

“Scientists always need additional computing power,” Newberg said. “The massive amounts of data out there make it so that no amount of computational power is ever enough.” Thus, her work quickly exceeded the limits of laboratory computers and the collaboration to create MilkyWay@Home formally began in 2006 with the assistance of the Claire and Roland Schmitt Distinguished Professor of Computer Science Boleslaw Szymanski; Associate Professor of Computer Science Carlos Varela; postdoctoral research assistant Travis Desell; as well as other graduate and undergraduate students at Rensselaer.

With this extensive collaboration, the project has made great strides toward its astrophysical goals, but important discoveries have also been made along the way in computational science: new algorithms that make the extremely distributed and diverse MilkyWay@Home system work well even with volunteered computers that can be highly unreliable.

“When you use a supercomputer, all the processors are the same and in the same location, so they are producing the same results at the same time,” Varela said. “With an extremely distributed system, like we have with MilkyWay@Home, we are working with many different operating systems that are located all over the globe. To work with such asynchronous results we developed entirely new algorithms to process work as it arrives in the system.” This makes data from even the slowest of computers still useful to the project, according to Varela. “Even the slowest computer can help if it is working on the correct problem in the search.”
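The idea Varela describes can be sketched in miniature: rather than waiting for a synchronized batch, the server folds each result into a best-so-far answer the moment it arrives, so a slow machine’s late result is just as usable as a fast machine’s early one. The client names and fitness values below are invented for illustration:

```python
import random

def process_results_asynchronously(results):
    """Toy version of asynchronous result handling: results arrive out
    of order, from fast and slow machines alike, and the server simply
    folds each one into the best-so-far answer on arrival.
    results: iterable of (client_id, candidate_params, fitness)."""
    best = None
    for client_id, params, fitness in results:
        if best is None or fitness > best[2]:
            best = (client_id, params, fitness)
    return best

# Simulate a slow machine whose result arrives after everyone else's
# but is still the best -- "even the slowest computer can help".
random.seed(0)
fast = [(f"fast-{i}", {"mu": random.uniform(0, 100)}, random.random())
        for i in range(50)]
slow = [("slow-1", {"mu": 25.0}, 2.0)]  # arrives last, scores highest
best = process_results_asynchronously(fast + slow)
print(best[0])  # prints slow-1
```

Because no result ever blocks on another, heterogeneous hardware and flaky network connections degrade throughput gracefully instead of stalling the whole search.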

In total, nine articles have been published and multiple public talks have been given regarding the computer science discoveries made during the creation of the project, and many more are expected as the refined algorithms are utilized for other scientific problems. Collaboration has already begun to develop a DNA@Home platform to find gene regulation sites on human DNA. Collaborations have also started with biophysicists and chemists on two other BOINC projects at Rensselaer to understand protein folding and to design new drugs and materials.

In addition to important discoveries in computer science and astronomy, the researchers said the project is also making important strides in efforts to include the public in scientific discovery. Since the project began, more than 45,000 individual users from 169 countries have donated computational power to the effort. Currently, approximately 17,000 users are active in the system.

“This is truly public science,” said Desell, who began working on the project as a graduate student and has seen the project through its entire evolution. “This is a really unique opportunity to get people interested in science while also allowing us to create a strong computing resource for Rensselaer research.” All of the research, results, data, and even source code are made public and regularly updated for volunteers on the main MilkyWay@Home Web site found at: http://MilkyWay.cs.rpi.edu/

Desell cites the public nature and regular communication as important components of the project’s success. “They are not just sitting back and allowing the computer to do the work,” he says, noting that volunteers have donated equipment and have even made their own improvements to the underlying algorithms, greatly increasing computational speed. Varela jokes, “We may end up with a paper with 17,000 authors.”

###

In addition to the volunteers, others within Rensselaer and outside of the Institute have been involved in the project. Some of these collaborators include Rensselaer graduate students Matthew Newby, Anthony Waters, and Nathan Cole; and SETI@home creator David Anderson at Berkeley. The research was funded primarily by the National Science Foundation (NSF) with donations of equipment by IBM, ATI, and NVIDIA.

44 Comments
Stephan
February 10, 2010 10:40 pm

It would be good if someone could list the statements made by warmers when it was warm ie 1998 to support their theory versus the cold today. If I recall they said no more snow no more winters etc was that true? They will now hammer away at extreme events whether its cold or hot so it would be nice to have those past statements listed….

February 10, 2010 10:52 pm

Great article. I once had a semi-significant virtual machine pounding away at these contests. I have currently crunched more data than 98.84380% of all other BOINC users. However, I am down to just one machine banging away these days.

February 10, 2010 10:53 pm

Mass participatory science. Shades of the surface station project! Astronomy has a leg up, though, because it has been a participatory science for a long time, with many discoveries made by amateurs.
And the smart guys behind this project have obviated the need for a gaggle-flop superduper computer. Although I heard a rumor that there is one for sale on E-Bay from East Anglia.
Is this what post-post-normal science will look like?

Evilgidget
February 10, 2010 10:57 pm

Excellent idea. I’m now donating some raw computing power to the project.
Stephan: Wrong thread. Possibly even wrong planet.

TGSG
February 10, 2010 11:06 pm

This is very cool. Reminds me a touch of “Jane” in Ender’s worlds.

Michael
February 10, 2010 11:14 pm

Nice topic for an OT comment.
The north east cities will have to budget for 100 inches of global warming for next winter season. That’s how much they will probably get this year because of global cooling and the solar minimum. Next year may be worse.
Many cities are on the verge of bankruptcy now. Just imagine what next winter season’s snow removal increases will do to the budget?

Michael
February 10, 2010 11:19 pm

The productivity of north east cities is being greatly diminished with this almost a week of blizzard snows. I bet the lost productivity in dollar terms so far must be in the billions. Thank you global warming.

Zeke the Sneak
February 10, 2010 11:30 pm

I could donate my bathroom fan. It turns off and on all by itself now–it has gone quantum on me. There must be several qubits in there. 🙂

elrica
February 10, 2010 11:39 pm

@Evilgidget
I think Stephan got caught in a virtual reality gap. One minute I’m looking at some students at Penn State being very upset, went off to do a bit of look-up, came back, and found myself in a study of the galaxy. He was probably posting betwixt, in the shift of entries.

Michael
February 10, 2010 11:41 pm

Senator: Mr. Scientist, if we spend all this money on cap and trade, are we going to be able to make the planet colder so we don’t keep getting these crazy snowmageddon winters?
Scientist: No, Senator. We may be able to stop the planet from warming, but it most likely will not get any colder because man-made global warming is so intense.
Senator: So if the planet stays the same temperature as it is now, we can expect to keep getting all these blizzards every year? How does that help the American people? We’re sick of these winters already. How much global cooling will it take to make these kinds of winters go away? How much less snow can we expect if we double down on the cap and trade tax and spend twice as much? Can we get the planet to have global cooling by spending a ton of money so we get less severe snow storms? These winters are killing me. How much is it going to cost us to get some global cooling so these kinds of winters go away?
Scientist: I’ll have to get back to you on that one, Senator.

PaulW
February 11, 2010 12:11 am

I like this idea, and it can be used for some real good – after installing BOINC, you can choose from many projects such as the World Community Grid, for research into many human diseases. See http://www.worldcommunitygrid.org

February 11, 2010 1:07 am

I have been donating since the beginning of SETI. Perhaps they can start a project to rework the temperature record.

Hotlink
February 11, 2010 1:24 am

Don’t bother. I mapped it last night.

Jack Simmons
February 11, 2010 1:59 am

I’ve got an idea.
Why not take all the GCM programs off the super-duper, giant electronic brains being used now, and do the climate projections on everyone’s unused laptops?
Here are some advantages:
1) Reductions in spending. Instead of a $5 billion per year budget, make it the same budget these astronomers have for mapping the Galaxy. Certainly the sizes of the tasks are comparable. The savings could be used to rebuild Haiti.
2) Did you notice this little tidbit in the article?

All of the research, results, data, and even source code are made public and regularly updated for volunteers on the main MilkyWay@Home Web site found at: http://MilkyWay.cs.rpi.edu/

We could avoid all this nasty business with FOI requests and the like. Everyone could have more confidence in the process being used and the arguing and debates could focus on the assumptions in the programming.
3) As the article pointed out, everyone contributing unused portions of their PC could claim authorship of any resulting papers. This tendency has already been established in the IPCC studies, where anyone could be counted as a ‘climate scientist’. We would literally have millions of climate scientists who have contributed to a massively peer reviewed paper on climate change. This would, in one fell swoop, enhance the resumes of millions of people in their vain attempts to secure a minimum wage job at TJ Maxx or Waste Management. Wouldn’t it be nice to put a little entry on your resume alluding to your research efforts to save mankind from CO2? Or, to stop the great leftist conspiracy to destroy the West? The two choices depending upon your personal outlook of course.
4) Energy used in computing would be distributed. That way we won’t have to burn all that nasty coal up in Wyoming to run the big data center.
I’m sure there are several in the WUWT community who could come up with even better advantages of this type of computing approach.

Dodgy Geezer
February 11, 2010 1:59 am

@Stephan
“It would be good if someone could list the statements made by warmers when it was warm ie 1998 to support their theory versus the cold today. If I recall they said no more snow no more winters etc was that true? They will now hammer away at extreme events whether its cold or hot so it would be nice to have those past statements listed….”
They equivocate. You will find that the extreme statements are ‘taken out of context’ or ‘made by the press’. And you will find them saying that ‘Claiming that something is “very likely” doesn’t mean that I said it would happen’….
They have already done that for the model predictions, which are now well out of kilter with reality. They just say “We never said it would happen – these are just ‘projections’…..”

Tenuc
February 11, 2010 2:54 am

Looking forward to seeing whether the observations confirm the current theory.
What a pity the CAGW hypothesis couldn’t be checked in such an open way.
“All of the research, results, data, and even source code are made public and regularly updated for volunteers on the main MilkyWay@Home Web site found at: http://MilkyWay.cs.rpi.edu/”
CRU/GISS/NASA/IPCC please note!!!

Richard C
February 11, 2010 2:56 am

Very interesting.
I just downloaded and installed the BOINC client and noticed that one of the projects that one can (could) join is called “climatepredictions.org”. Out of interest I looked up this site and got “Services for this domain have been discontinued”. Does anyone know what this was all about? Was it a warmist or realist project that died a natural death? It could have been on-topic in an off-topic sort of way.

Michael D Smith
February 11, 2010 3:05 am

All of the research, results, data, and even source code are made public and regularly updated
Hmm. climatescience@home?

Roger
February 11, 2010 3:56 am

Has anyone ever heard of the term , or name “Looking Glass”? How safe is it to allow access to your computer to a group such as this?

Indiana Bones
February 11, 2010 4:35 am

Not very Roger. Consider the opportunities to introduce viral vectors.

RichieP
February 11, 2010 4:42 am

Richard C: “climatepredictions.org. Out of interest I looked up this site and got “Services for this domain have been discontinued”. Does anyone know what this was all about? Was it a warmist or realist project that died a natural death? ”
I think this is probably it, since it uses boinc and has the same name, though .net not .org.
http://climateprediction.net/content/about-climatepredictionnet-project
“By using your computers, we will be able to improve our understanding of, and confidence in, climate change predictions more than would ever be possible using the supercomputers currently available to scientists. The climateprediction.net experiment should help to “improve methods to quantify uncertainties of climate projections and scenarios, including long-term ensemble simulations using complex models”, identified by the Intergovernmental Panel on Climate Change (IPCC) in 2001 as a high priority. Hopefully, the experiment will give decision makers a better scientific basis for addressing one of the biggest potential global problems of the 21st century. ”
It’s linked to a BBC project too (currently closed to new members):
http://bbc.cpdn.org/
From which, via an FAQ:
“What are the participating institutions in climateprediction.net?
Climateprediction.net is a consortium of research organisations, led by the University of Oxford, and including The Met Office, The University of California – Berkeley, The London School of Economics, The Open University, The University of Reading and the Rutherford Appleton Laboratory. The project has been funded by the UK Natural Environment Research Council’s and UK Department of Trade and Industry’s e-Science programmes, the NERC COAPEC thematic programme and the NERC Atmospheric Science and Technology Board. Additional financial and in-kind support is acknowledged from The University of Oxford Department of Physics, Oxford University Computing Services, and The Tyndall Centre for Climate Change Research. For more information about the participating organisations and sponsors see the project website. ”
and:
“Who designed and made the experiment?
The BBC Climate Change Experiment was created for the BBC by climateprediction.net using the Met Office climate model.”
Oh dear ….

Steve Goddard
February 11, 2010 5:14 am

The most powerful supercomputer (at Oak Ridge National Lab) is currently listed at 2.3 petaflops – the press release statement is not accurate.
http://www.top500.org/files/newsletter112009_tabloid_v3.pdf

February 11, 2010 5:15 am

Hotlink: Don’t bother. I mapped it last night.
I mapped it last year: http://www.tinyurl.com/tidingsgss

February 11, 2010 6:16 am

Great. I have it running now on 2 machines (for the ‘malariacontrol’ project), and I still have 2 more that have been shelved and can be used. Thanks.

TanGeng
February 11, 2010 8:19 am

Isn’t this kind of processing notoriously inefficient and also quite the energy hog? I’m not going to yell global warming, but it is certainly the cause of much localized warming.