Use your PC to help map the Milky Way

From a Rensselaer Polytechnic Institute press release

PCs around the world unite to map the Milky Way

https://i0.wp.com/www.astro.wisc.edu/sirtf/skymt_payne_big.jpg?resize=400%2C394

Combined computing power of the MilkyWay@Home project recently surpassed that of the world’s second fastest supercomputer

Troy, N.Y. – At this very moment, tens of thousands of home computers around the world are quietly working together to solve the largest and most basic mysteries of our galaxy.

Enthusiastic and inquisitive volunteers from Africa to Australia are donating the computing power of everything from decade-old desktops to sleek new netbooks to help computer scientists and astronomers at Rensselaer Polytechnic Institute map the shape of our Milky Way galaxy. Just this month, the collected computing power of these humble home computers surpassed one petaflop, a speed exceeding that of the world’s second fastest supercomputer.

The project, MilkyWay@Home, uses the Berkeley Open Infrastructure for Network Computing (BOINC) platform, which is widely known for SETI@home, the project that searches for signs of extraterrestrial life. Today, MilkyWay@Home has outgrown even that famous project in speed, making it the fastest computing project on the BOINC platform and perhaps the second fastest public distributed computing program ever in operation (just behind Folding@home).

The interdisciplinary team behind MilkyWay@Home, which ranges from professors to undergraduates, began formal development on the BOINC platform in July 2006 and worked tirelessly to grow a volunteer base from the ground up, steadily expanding its computational power.

Each user participating in the project signs up their computer and dedicates a percentage of the machine’s processing power to calculations for the project. For MilkyWay@Home, this means each personal computer uses data gathered about a very small section of the galaxy to map its shape, density, and movement.
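To make the work-unit idea concrete, here is a minimal Python sketch of how a star catalog might be carved into small sky patches, one per volunteer task. This is purely illustrative and uses made-up data; it is not the project’s actual server code, and all names here are hypothetical.

```python
# Illustrative sketch only: split a (ra, dec) star catalog into
# roughly 1-degree sky patches, each small enough to hand to one
# volunteered PC as a "work unit". Not MilkyWay@Home's real code.
import random

def make_work_units(stars, patch_deg=1.0):
    """Group (ra, dec) positions into patch_deg-sized sky patches."""
    units = {}
    for ra, dec in stars:
        key = (int(ra / patch_deg), int(dec / patch_deg))
        units.setdefault(key, []).append((ra, dec))
    return list(units.values())

# Fake catalog: 10,000 stars scattered over a 20x20 degree field.
catalog = [(random.uniform(0, 20), random.uniform(0, 20))
           for _ in range(10_000)]
work_units = make_work_units(catalog)
print(f"{len(work_units)} work units, "
      f"~{len(catalog) // len(work_units)} stars each")
```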

In particular, computers donating processing power to MilkyWay@Home are looking at how the different dwarf galaxies that merged into the larger Milky Way galaxy have been pulled apart and stretched since those mergers millions of years ago. This is done by studying each dwarf’s stellar stream. These calculations are providing new details on the overall shape and density of dark matter in the Milky Way galaxy, which remains largely unknown.
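As a rough illustration of the kind of calculation each PC performs, the sketch below scores how well one candidate stream model explains the stars in a work unit. It is a deliberately simplified stand-in, a flat background plus a Gaussian “tube” fit in two dimensions with invented data, not the project’s actual three-dimensional density model.

```python
# Toy version of stream fitting: score candidate stream parameters
# against star positions by maximum likelihood. The real project fits
# richer 3-D models; everything here is a simplified assumption.
import math
import random

def stream_log_likelihood(stars, center, width, frac, field=(20.0, 20.0)):
    """Log-likelihood of (x, y) positions under a uniform background
    plus a Gaussian stream running parallel to the x axis at y=center."""
    area = field[0] * field[1]
    ll = 0.0
    for x, y in stars:
        bg = (1.0 - frac) / area
        gauss = (math.exp(-0.5 * ((y - center) / width) ** 2)
                 / (width * math.sqrt(2.0 * math.pi)))
        ll += math.log(bg + frac * gauss / field[0])  # uniform in x
    return ll

random.seed(1)
# Fake work unit: background stars plus a thin stream near y = 7.
stars = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(900)]
stars += [(random.uniform(0, 20), random.gauss(7.0, 0.5)) for _ in range(100)]

# A search would try many candidates; here we just compare two guesses.
for center_guess in (3.0, 7.0):
    print(center_guess,
          round(stream_log_likelihood(stars, center_guess, 0.5, 0.1), 1))
```

The guess placed on the real stream (7.0) scores a visibly higher likelihood, which is exactly the signal a search algorithm climbs.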

The galactic computing project had very humble beginnings, according to Heidi Newberg, associate professor of physics, applied physics, and astronomy at Rensselaer. Her personal research to chart the three-dimensional distribution of stars and matter in the Milky Way using data from the extensive Sloan Digital Sky Survey could not find the best model for even a small section of a single galactic star stream in any reasonable amount of time.

“I was a researcher sitting in my office with a very big computational problem to solve and very little personal computational power or time at my fingertips,” Newberg said. “Working with the MilkyWay@Home platform, I now have the opportunity to use a massive computational resource that I simply could not have as a single faculty researcher, working on a single research problem.”

Before taking the research to BOINC, Newberg worked with Malik Magdon-Ismail, associate professor of computer science, to create a stronger and faster algorithm for her project. Together they greatly increased the computational efficiency and set the groundwork for what would become the much larger MilkyWay@Home project.

“Scientists always need additional computing power,” Newberg said. “The massive amounts of data out there make it so that no amount of computational power is ever enough.” Thus, her work quickly exceeded the limits of laboratory computers and the collaboration to create MilkyWay@Home formally began in 2006 with the assistance of the Claire and Roland Schmitt Distinguished Professor of Computer Science Boleslaw Szymanski; Associate Professor of Computer Science Carlos Varela; postdoctoral research assistant Travis Desell; as well as other graduate and undergraduate students at Rensselaer.

This extensive collaboration has advanced the astrophysical goals of the project by leaps and bounds, but important discoveries have also been made along the way in computational science: new algorithms that make the extremely distributed and diverse MilkyWay@Home system work so well, even with volunteered computers that can be highly unreliable.

“When you use a supercomputer, all the processors are the same and in the same location, so they are producing the same results at the same time,” Varela said. “With an extremely distributed system, like we have with MilkyWay@Home, we are working with many different operating systems that are located all over the globe. To work with such asynchronous results we developed entirely new algorithms to process work as it arrives in the system.” This makes data from even the slowest of computers still useful to the project, according to Varela. “Even the slowest computer can help if it is working on the correct problem in the search.”
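Here is a minimal sketch of the idea Varela describes, assuming a simple hill-climbing search in place of the project’s actual asynchronous genetic and particle-swarm algorithms. Candidates go out to many simulated hosts, and whichever result happens to return next is folded in immediately, so even the slowest host’s work still counts.

```python
# Illustrative asynchronous search: results are processed in whatever
# order they arrive, never waiting for stragglers. A simplified stand-in
# for MilkyWay@Home's real algorithms; all names are hypothetical.
import random

def server_loop(evaluate, n_params=3, iterations=2000, n_hosts=50):
    best_params, best_score = None, float("-inf")
    pending = []  # candidates currently "out" on volunteer hosts
    for _ in range(iterations):
        # Keep every simulated host busy with a candidate to evaluate.
        while len(pending) < n_hosts:
            if best_params is None:
                pending.append([random.uniform(-5, 5) for _ in range(n_params)])
            else:
                pending.append([p + random.gauss(0, 0.5) for p in best_params])
        # A result returns from some host, in no particular order.
        params = pending.pop(random.randrange(len(pending)))
        score = evaluate(params)
        if score > best_score:  # fold each result in as it arrives
            best_params, best_score = params, score
    return best_params, best_score

# Toy objective whose optimum sits at (1, 2, 3).
objective = lambda p: -sum((x - t) ** 2 for x, t in zip(p, (1, 2, 3)))
print(server_loop(objective))
```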

In total, nine articles have been published and multiple public talks have been given on the computer science discoveries made during the creation of the project, and many more are expected as the refined algorithms are applied to other scientific problems. Collaboration has already begun on a DNA@Home platform to find gene regulation sites on human DNA. Collaborations have also started with biophysicists and chemists on two other BOINC projects at Rensselaer to understand protein folding and to design new drugs and materials.

In addition to important discoveries in computer science and astronomy, the researchers said the project is also making important strides in efforts to include the public in scientific discovery. Since the project began, more than 45,000 individual users from 169 countries have donated computational power to the effort. Currently, approximately 17,000 users are active in the system.

“This is truly public science,” said Desell, who began working on the project as a graduate student and has seen the project through its entire evolution. “This is a really unique opportunity to get people interested in science while also allowing us to create a strong computing resource for Rensselaer research.” All of the research, results, data, and even source code are made public and regularly updated for volunteers on the main MilkyWay@Home Web site found at: http://MilkyWay.cs.rpi.edu/

Desell cites the public nature and regular communication as important components of the project’s success. “They are not just sitting back and allowing the computer to do the work,” he says, noting that volunteers have donated money for equipment and even made their own improvements to the underlying algorithms that greatly increased computational speed. Varela jokes, “We may end up with a paper with 17,000 authors.”

###

In addition to the volunteers, others within Rensselaer and outside of the Institute have been involved in the project. Some of these collaborators include Rensselaer graduate students Matthew Newby, Anthony Waters, and Nathan Cole; and SETI@home creator David Anderson at Berkeley. The research was funded primarily by the National Science Foundation (NSF) with donations of equipment by IBM, ATI, and NVIDIA.

Comments
Stephan
February 10, 2010 10:40 pm

It would be good if someone could list the statements made by warmers when it was warm, i.e. 1998, to support their theory versus the cold today. If I recall, they said no more snow, no more winters, etc. Was that true? They will now hammer away at extreme events whether it’s cold or hot, so it would be nice to have those past statements listed….

February 10, 2010 10:52 pm

Great article. I once had a semi-significant virtual machine pounding away at these contests. I have currently crunched more data than 98.84380% of all other BOINC users. However, I am down to just one machine banging away these days.

February 10, 2010 10:53 pm

Mass participatory science. Shades of the surface station project! Astronomy has a leg up, though, because it has been a participatory science for a long time, with many discoveries made by amateurs.
And the smart guys behind this project have obviated the need for a gaggle-flop superduper computer. Although I heard a rumor that there is one for sale on eBay from East Anglia.
Is this what post-post-normal science will look like?

Evilgidget
February 10, 2010 10:57 pm

Excellent idea. I’m now donating some raw computing power to the project.
Stephan: Wrong thread. Possibly even wrong planet.

TGSG
February 10, 2010 11:06 pm

This is very cool. Reminds me a touch of “Jane” in Ender’s worlds.

Michael
February 10, 2010 11:14 pm

Nice topic for an OT comment.
The northeast cities will have to budget for 100 inches of global warming for next winter season. That’s how much they will probably get this year because of global cooling and the solar minimum. Next year may be worse.
Many cities are on the verge of bankruptcy now. Just imagine what next winter season’s snow removal increases will do to the budget.

Michael
February 10, 2010 11:19 pm

The productivity of northeast cities is being greatly diminished by this almost week-long run of blizzard snows. I bet the lost productivity in dollar terms so far must be in the billions. Thank you, global warming.

Zeke the Sneak
February 10, 2010 11:30 pm

I could donate my bathroom fan. It turns off and on all by itself now–it has gone quantum on me. There must be several qubits in there. 🙂

elrica
February 10, 2010 11:39 pm

@Evilgidget
I think Stephan got caught in a virtual reality gap. One minute I was looking at some students at Penn State being very upset; then I went off to do a bit of look-up, came back, and found myself in a study of the galaxy. He was probably posting betwixt, in the shift of entries.

Michael
February 10, 2010 11:41 pm

Senator: Mr. Scientist, if we spend all this money on cap and trade, are we going to be able to make the planet colder so we don’t keep getting these crazy snowmageddon winters?
Scientist: No, Senator. We may be able to stop the planet from warming, but it most likely will not get any colder because man-made global warming is so intense.
Senator: So if the planet stays the same temperature as it is now, we can expect to keep getting all these blizzards every year? How does that help out the American people? We’re sick of these winters already. How much global cooling will it take to make these kinds of winters go away? How much less snow can we expect if we double down on the cap and trade tax and spend twice as much? Can we get the planet to have global cooling by spending a ton of money so we get less severe snow storms? These winters are killing me. How much is it going to cost us to get some global cooling so these kinds of winters go away?
Scientist: I’ll have to get back to you on that one, Senator.

PaulW
February 11, 2010 12:11 am

I like this idea, and it can be used for some real good – after installing BOINC, you can choose from many projects such as the World Community Grid, for research into many human diseases. See http://www.worldcommunitygrid.org

February 11, 2010 1:07 am

I have been donating since the beginning of SETI. Perhaps they can start a project to rework the temperature record.

Hotlink
February 11, 2010 1:24 am

Don’t bother. I mapped it last night.

Jack Simmons
February 11, 2010 1:59 am

I’ve got an idea.
Why not take all the GCM programs off the super-duper, giant electronic brains being used now, and do the climate projections on everyone’s unused laptops?
Here are some advantages:
1) Reductions in spending. Instead of a $5 billion per year budget, make it the same budget these astronomers have for mapping the Galaxy. Certainly the size of the tasks is comparable. The savings could be used to rebuild Haiti.
2) Did you notice this little tidbit in the article?

All of the research, results, data, and even source code are made public and regularly updated for volunteers on the main MilkyWay@Home Web site found at: http://MilkyWay.cs.rpi.edu/

We could avoid all this nasty business with FOI requests and the like. Everyone could have more confidence in the process being used, and the arguments and debates could focus on the assumptions in the programming.
3) As the article pointed out, everyone contributing unused portions of their PC could claim authorship of any resulting papers. This tendency has already been established in the IPCC studies, where anyone could be counted as a ‘climate scientist’. We would literally have millions of climate scientists who have contributed to a massively peer reviewed paper on climate change. This would, in one fell swoop, enhance the resumes of millions of people in their vain attempts to secure a minimum wage job at TJ Maxx or Waste Management. Wouldn’t it be nice to put a little entry on your resume alluding to your research efforts to save mankind from CO2? Or, to stop the great leftist conspiracy to destroy the West? The two choices depending upon your personal outlook of course.
4) Energy used in computing would be distributed. That way we won’t have to burn all that nasty coal up in Wyoming to run the big data center.
I’m sure there are several in the WUWT community who could come up with even better advantages of this type of computing approach.

Dodgy Geezer
February 11, 2010 1:59 am

@Stephan
“It would be good if someone could list the statements made by warmers when it was warm, i.e. 1998, to support their theory versus the cold today. If I recall, they said no more snow, no more winters, etc. Was that true? They will now hammer away at extreme events whether it’s cold or hot, so it would be nice to have those past statements listed….”
They equivocate. You will find that the extreme statements are ‘taken out of context’ or ‘made by the press’. And you will find them saying that ‘Claiming that something is “very likely” doesn’t mean that I said it would happen’….
They have already done that for the model predictions, which are now well out of kilter with reality. They just say “We never said it would happen – these are just ‘projections’…..”

Tenuc
February 11, 2010 2:54 am

Looking forward to seeing if the observations of what they find confirm the current theory.
What a pity the CAGW hypothesis couldn’t be checked in such an open way!
“All of the research, results, data, and even source code are made public and regularly updated for volunteers on the main MilkyWay@Home Web site found at: http://MilkyWay.cs.rpi.edu/”
CRU/GISS/NASA/IPCC please note!!!

Richard C
February 11, 2010 2:56 am

Very interesting.
I just downloaded and installed the BOINC client and noticed that one of the projects that one can (could) join is called “climatepredictions.org”. Out of interest I looked up this site and got “Services for this domain have been discontinued”. Does anyone know what this was all about? Was it a warmist or realist project that died a natural death? It could have been on-topic in an off-topic sort of way.

Michael D Smith
February 11, 2010 3:05 am

All of the research, results, data, and even source code are made public and regularly updated
Hmm. climatescience@home?

Roger
February 11, 2010 3:56 am

Has anyone ever heard of the term, or name, “Looking Glass”? How safe is it to allow a group such as this access to your computer?

Indiana Bones
February 11, 2010 4:35 am

Not very, Roger. Consider the opportunities to introduce viral vectors.

RichieP
February 11, 2010 4:42 am

Richard C: “climatepredictions.org. Out of interest I looked up this site and got “Services for this domain have been discontinued”. Does anyone know what this was all about? Was it a warmist or realist project that died a natural death?”
I think this is probably it, since it uses boinc and has the same name, though .net not .org.
http://climateprediction.net/content/about-climatepredictionnet-project
“By using your computers, we will be able to improve our understanding of, and confidence in, climate change predictions more than would ever be possible using the supercomputers currently available to scientists. The climateprediction.net experiment should help to “improve methods to quantify uncertainties of climate projections and scenarios, including long-term ensemble simulations using complex models”, identified by the Intergovernmental Panel on Climate Change (IPCC) in 2001 as a high priority. Hopefully, the experiment will give decision makers a better scientific basis for addressing one of the biggest potential global problems of the 21st century. ”
It’s linked to a BBC project too (currently closed to new members):
http://bbc.cpdn.org/
From which, via an FAQ:
“What are the participating institutions in climateprediction.net?
Climateprediction.net is a consortium of research organisations, led by the University of Oxford, and including The Met Office, The University of California – Berkeley, The London School of Economics, The Open University, The University of Reading and the Rutherford Appleton Laboratory. The project has been funded by the UK Natural Environment Research Council’s and UK Department of Trade and Industry’s e-Science programmes, the NERC COAPEC thematic programme and the NERC Atmospheric Science and Technology Board. Additional financial and in-kind support is acknowledged from The University of Oxford Department of Physics, Oxford University Computing Services, and The Tyndall Centre for Climate Change Research. For more information about the participating organisations and sponsors see the project website. ”
and:
“Who designed and made the experiment?
The BBC Climate Change Experiment was created for the BBC by climateprediction.net using the Met Office climate model.”
Oh dear ….

Steve Goddard
February 11, 2010 5:14 am

The most powerful supercomputer (at Oak Ridge National Lab) is currently listed at 2.3 petaflops – the press release statement is not accurate.
http://www.top500.org/files/newsletter112009_tabloid_v3.pdf

February 11, 2010 5:15 am

Hotlink: Don’t bother. I mapped it last night.
I mapped it last year: http://www.tinyurl.com/tidingsgss

February 11, 2010 6:16 am

Great. I have it running now on 2 machines (for the ‘malariacontrol’ project), and I still have 2 more that have been shelved and can be used. Thanks.

TanGeng
February 11, 2010 8:19 am

Isn’t this kind of processing notoriously inefficient and also quite the energy hog? I’m not going to yell global warming, but it is certainly the cause of much localized warming.

Milwaukee Bob
February 11, 2010 8:25 am

Steve Goddard (05:14:09) :
“….has surpassed one (1) petaflop, a computing speed that surpasses the world’s second (2nd) fastest supercomputer.” Not the fastest….
But it’s an arbitrary ranking anyhow, as the total computational power of any given system does not guarantee that particular system can produce results faster. That is most often determined by programming/software and, yes, by using and taking advantage of hardware capabilities unique to each system. Over the years, many more “powerful” machines have been beaten by better programming.
More on topic: Anthony, why don’t you create a NEW worldwide network of temperature stations with all of us here? We each attach a simple “weather station” (properly positioned to your specifications, of course) to our computers that feeds real-time temperature, humidity, wind speed and direction, and barometric pressure to a software program that is accessible in real time, 24/7, to “the network,” and to a central unit that tracks, records, and averages the data, which would then be posted in real time to the widget here, viewable by all…
real numbers, real-time, whole world – 20 to 30 thousand locations all over the globe…. at all different altitudes….. both hemispheres…. heck, maybe even some ships at sea…..

Richard M
February 11, 2010 8:41 am

I currently have one quad-core, one dual-core, and one single-core processor running BOINC, a total of over 16 GHz of computing power. I have been running it for many years. It’s a great program, and you don’t even notice it running.

L Bowser
February 11, 2010 8:55 am

@TanGeng
Isn’t this kind of processing notoriously inefficient and also quite the energy hog?
Yes and Yes. At least on an absolute basis.
I’m not going to yell global warming, but it is certainly the cause of much localized warming.
Probably not even localized warming. The theory behind this is using resources that would otherwise be wasted, i.e. the computers are going to be on anyway and most people rarely use all of their computing power, so rather than put new processors on the grid, we’ll use the existing under-utilized ones.
Now there may be a few people **cough, cough, Josualdo** who put computers solely on the grid for the use of these projects, but even then, I would argue that the benefits of malaria control gained by his two extra computers being on the grid vastly outweigh the impact of any contribution he may be making to global warming.

Gary Pearse
February 11, 2010 11:17 am

“Climategate: Plausibility and the blogosphere in the post-normal age.”
Here we have Rensselaer Polytechnic Institute’s Milky Way cooperative science with the masses!!
Wow, this is post-normal science in overdrive! Willis, are you there?

George E. Smith
February 11, 2010 11:31 am

So just how many petawatts of computer power dissipation are being focused on this boondoggle?
Just think how much global warming we could prevent by just turning off all these idle computers when they aren’t being used to do something that is actually useful. Why not add everybody’s cell phone and raspberry to the mix; maybe all the digital cameras to take Milky Way pictures too.

Zeke the Sneak
February 11, 2010 11:58 am

“Their calculations are providing new details on the overall shape and density of dark matter in the Milky Way galaxy, which is widely unknown.”
The astrophysicists do need to locate 96% (+dark energy) of the Universe.
That’s a problem!

p.g.sharrow "PG"
February 11, 2010 12:41 pm

The “missing” 96% is right in front of them; they just can’t or won’t see it. Sort of like looking for love in all the wrong places. If they would just open their eyes and look around, they might see its Cheshire cat grin. ];-)

JonesII
February 11, 2010 1:15 pm

Zeke the Sneak (11:58:04) :
Got to challenge them to get a few milligrams in a lab!

JonesII
February 11, 2010 1:18 pm

This is why the next coming gate is ASTROGATE!

JonesII
February 11, 2010 1:24 pm

I’d better resort to analog computational techniques *
(*)Translation: Lab experimental science.

Optimizer
February 11, 2010 1:27 pm

Always nice to see my alma mater in the news for being involved with something cool.

Andy
February 11, 2010 1:45 pm

Distributed computing is one of the most exciting ideas in all of computing. I helped with SETI back in the early days and it was fun knowing I could help find alien life. Then I read the small print. Seems they were only looking in one small frequency band in one small part of the sky. Maybe that was because of technical limitations at the time, I don’t know, but it didn’t seem to be a broad enough search to actually accomplish anything.
The DNA and protein folding programs are exciting but I would want to know who owns the intellectual property from any discoveries. If they’re going to patent their findings and make a few billion dollars off of it, then I’m not interested in helping.
Of all the programs listed, the Milky Way project is possibly the most interesting. Even so, I would want to know it’s not really the NSA.
What can I say. I’m a cynical skeptic.

Zeke the Sneak
February 11, 2010 7:09 pm

JonesII (13:15:24) :
Zeke the Sneak (11:58:04) :
Got to challenge them to get a few milligrams in a lab!

We’ll make it easy for them. Of all the types – cold collisionless dark matter, fuzzy dark matter, self-annihilating dark matter, repulsive dark matter, warm dark matter, hot dark matter, and strongly self-interacting dark matter – we just want to have a look at one type.
Just the one that keeps the Milky Way rotating would be fine.
But it’s been 77 years and people are starting to talk. lol
http://blogs.princeton.edu/frs110/s2005/darkmatter/2_history_of_dark_matter/

February 11, 2010 7:55 pm

JonesII (13:15:24) :
Zeke the Sneak (11:58:04) :
Got to challenge them to get a few milligrams in a lab!
We’ll make it easy for them. Of all the types – cold collisionless dark matter, fuzzy dark matter, self-annihilating dark matter, repulsive dark matter, warm dark matter, hot dark matter, and strongly self-interacting dark matter – we just want to have a look at one type.
Just the one that keeps the Milky Way rotating would be fine.
But it’s been 77 years and people are starting to talk. lol
http://blogs.princeton.edu/frs110/s2005/darkmatter/2_history_of_dark_matter/
Oops, should mention great post! Can’t wait to see the next one!

February 12, 2010 10:20 am

George E. Smith (11:31:55) :

So just how many petawatts of computer power dissipation are being focused on this boondoggle?
Just think how much global warming we could prevent by just turning off all these idle computers when they aren’t being used to do something that is actually useful. Why not add everybody’s cell phone and raspberry to the mix; maybe all the digital cameras to take Milky Way pictures too.

I think it *is* useful to learn more about our galaxy. Now think about all the people who leave their computers sitting idle because they don’t want to wait for them to boot up and log in. A project like this is an ideal way to put that idle time to use.
Like the poster above, I am concerned about “intellectual property” rights in the discoveries, so it was comforting to read the project’s site. I turned on an older machine I was planning to dispose of, installed BOINC, and joined this project.

Zeke the Sneak
February 12, 2010 10:45 am

I did not post the second post (19:55:28). I do not have a website. Thank you.

George E. Smith
February 12, 2010 1:56 pm

“”” Andy (13:45:59) :
Distributed computing is one of the most exciting ideas in all of computing. I helped with SETI back in the early days and it was fun knowing I could help find alien life. Then I read the small print. Seems they were only looking in one small frequency band in one small part of the sky. Maybe that was because of technical limitations at the time, I don’t know, but it didn’t seem to be a broad enough search to actually accomplish anything.
The DNA and protein folding programs are exciting but I would want to know who owns the intellectual property from any discoveries. If they’re going to patent their findings and make a few billion dollars off of it, then I’m not interested in helping.
Of all the programs listed, the Milky Way project is possibly the most interesting. Even so, I would want to know it’s not really the NSA.
What can I say. I’m a cynical skeptic. “””
So while you worked on that SETI project, exactly how many binary digits of experimental evidence of extra-terrestrial intelligent life, or even unintelligent life, did you actually discover?
So Carl Sagan went to his grave with no more data than you found. Doesn’t sound like a very productive use of a great mind to me.

Zeke the Sneak
February 13, 2010 6:37 am

An image of the heart of the Milky Way by the VLA:
http://www.holoscience.com/news/img/GC%20radioarc.jpg
Stop your day and take a look. Happy Valentine’s Day to all.

February 14, 2010 2:08 pm

I’ve had seti@home installed on all of my home and office machines since 2000, and looking at the benchmarks for the machines has been very interesting. My latest 64-bit dual-core Intel machine clocks in at 2.8 Gflops/core, whereas the GPU comes in at 46 Gflops! The GPU is just single-precision floating point and must be a royal pain to write programs for, but I can now see why BOINC assigns most of the work units to the GPU. To put things in perspective, I was really thrilled in 1984 when the lab I worked in got $55,000 in funding for a “super fast” array processor that had a top speed of 5 Mflops; fast in comparison to the 0.05 Mflops of the PDP-11/34 but laughable now. Time to take a look at a new BOINC project to run on my office machines that are currently doing nothing.
As far as distributed climate data collection goes, does anyone know if the Weather Underground (http://www.wunderground.com) has a dataset of all its private weather stations? Just looking at the “Wundermap” that comes up when I enter Kamloops as the city I want data for, 4 weather stations come up, with the warmest being the Kamloops airport at 12 C, the one closest to where I live at 10 C (which fits with my outside thermometer), and two stations higher up at 4 and 3 C. Obviously the Kamloops airport is at the lowest elevation of the 4 stations. Once I ran into a site that would provide a complete weather station that would allow one to hook up to this network, but I have long since lost the link. I’ve got a back yard that probably meets Anthony’s siting requirements and wouldn’t mind having a weather station back there. It seems that there is far more data from the Weather Underground network than from the “official” weather stations, but I haven’t seen anyone do an analysis of this data set.