NASA powers up for the next UN IPCC

Goddard Space Flight Center recently added 4,128 processors to its Discover high-end computing system, with another 4,128 processors to follow this fall. The expanded Discover will host NASA’s climate simulations for the Intergovernmental Panel on Climate Change. Credit: NASA/Pat Izzo

From NASA, new fossil-fuel-powered toys for the boys.

Related: “Giant sucking sound” heard in Greenbelt, MD, when the switch was thrown.

Remember the day you got a brand-new computer? Applications snapped open, processes that once took minutes finished in seconds, and graphics and animation flowed as smoothly as TV video. But several months and many new applications later, the bloom fell off the rose.

Your lightning-fast computer was no longer fast. You needed more memory and faster processors to handle the gigabytes of new files now embedded in your machine.

Climate scientists can relate.

They, too, need more powerful computers to process the sophisticated computer models used in climate forecasts. Such an expanded capability is now being developed at NASA’s Goddard Space Flight Center in Greenbelt, Md.

High-End Computing System Installed

In August, Goddard added 4,128 new-generation Intel “Nehalem” processors to its Discover high-end computing system. The upgraded Discover will serve as the centerpiece of a new climate simulation capability at Goddard. Discover will host NASA’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC), the leading scientific organization for assessing climate change, and other national and international climate initiatives.

To further enhance Discover’s capabilities, Goddard will install another 4,128 Nehalem processors in the fall, bringing Discover to 15,160 processors.

“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of Goddard’s Computational and Information Sciences and Technology Office (CISTO). “This new computing system represents a dramatic step forward in performance for climate simulations.”

Well-Suited for Climate Studies

According to CISTO lead architect Dan Duffy, the Nehalem architecture is especially well-suited to climate studies. “Speed is an inherent advantage for solving complex problems, but climate models also require large memory and fast access to memory,” he said. Each processor has 3 gigabytes of memory, among the highest available today. In addition, memory access is three to four times faster than Discover’s previous-generation processors.
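
For a sense of scale, here is a minimal back-of-the-envelope sketch using only the figures quoted in the article (4,128 new processors at 3 gigabytes each); the roughly 12 TB aggregate is our arithmetic, not a number NASA states.

```python
# Back-of-the-envelope memory total for the new Nehalem unit.
# Figures from the article: 4,128 processors, 3 GB of memory each.
# The aggregate (~12 TB) is derived here, not quoted by NASA.
processors = 4_128
gb_per_processor = 3

total_gb = processors * gb_per_processor
print(f"Aggregate memory of the new unit: {total_gb:,} GB (~{total_gb / 1024:.1f} TB)")
```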

In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared with other nationally recognized high-end computing systems. The new computational capabilities also allowed NASA climate scientists to run high-resolution simulations that reproduced atmospheric features not previously seen in their models.

For instance, “features such as well-defined hurricane eyewalls and convective cloud clusters appeared for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.”

IPCC Simulations

For the IPCC studies, scientists will run both longer-term and shorter-term climate projections using different computer models. A climate model from the Goddard Institute for Space Studies will perform simulations going back a full millennium and forward to 2100. Goddard’s Global Modeling and Assimilation Office will use a climate model for projections of the next 30 years and an atmospheric chemistry-climate model for short-term simulations of chemistry-climate feedbacks. The IPCC will use information from climate simulations such as these in its Fifth Assessment Report, which the IPCC expects to publish in 2014.

NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.

News Release from NASA here

NASA Expands High-End Computing System for Climate Simulation


GREENBELT, Md., Aug. 24 /PRNewswire-USNewswire/ — NASA’s Goddard Space Flight Center in Greenbelt, Md., made available to scientists in August the first unit of an expanded high-end computing system that will serve as the centerpiece of a new climate simulation capability. The larger computer, part of NASA’s High-End Computing Program, will be hosting the agency’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC) and other national and international climate initiatives.

The expansion added 4,128 computer processors to Goddard’s Discover high-end computing system. The IBM iDataPlex “scalable unit” uses Intel’s newest Xeon 5500 series processors, which are based on the Nehalem architecture introduced in spring 2009.

Discover will be hosting climate simulations for the IPCC’s Fifth Assessment Report by the Goddard Institute for Space Studies (GISS) in New York City and Goddard’s Global Modeling and Assimilation Office (GMAO). Stimulus funds from the American Recovery and Reinvestment Act of 2009 will enable installation of another 4,128 Nehalem processors this fall, bringing Discover to 15,160 processors.

“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of the Computational and Information Sciences and Technology Office (CISTO) at Goddard. “This new computing system represents a dramatic step forward in performance for climate simulations.”

In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared to other nationally recognized high-end computing systems. Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.

“Nehalem architecture is especially well-suited to climate studies,” said Dan Duffy, CISTO lead architect. “Speed is an inherent advantage for solving complex problems, but climate models need large memory and fast access. We configured our Nehalem system to have 3 gigabytes of memory per processor, among the highest available today, and memory access is three to four times faster than Discover’s previous-generation processors.”

In daily forecasts for NASA satellite missions and field campaigns, the GMAO typically runs its flagship Goddard Earth Observing System Model, Version 5 (GEOS-5) at 27-kilometer resolution. Using Discover’s new processors, the GMAO has been testing a special “cubed-sphere” version of GEOS-5 at resolutions as high as 3.5 kilometers.
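
To see why the jump from 27 km to 3.5 km is such a demanding test, here is a rough cost-scaling sketch. The assumptions are generic rules of thumb for explicit atmospheric dynamics (horizontal grid columns scale as the inverse square of the grid spacing, and the stable time step shrinks roughly in proportion to it), not figures from the release.

```python
# Rough cost scaling for refining a global atmospheric model grid.
# Resolutions from the release: GEOS-5 at 27 km versus a 3.5 km test run.
# The scaling rules (columns ~ 1/dx^2, time step ~ dx) are generic
# rules of thumb, not numbers NASA quotes.
coarse_km, fine_km = 27.0, 3.5

ratio = coarse_km / fine_km      # ~7.7x finer grid spacing
more_columns = ratio ** 2        # ~60x more horizontal grid columns
more_steps = ratio               # ~7.7x more time steps (CFL-limited)
cost_factor = more_columns * more_steps

print(f"{more_columns:.0f}x more grid columns, "
      f"{more_steps:.1f}x more time steps, "
      f"~{cost_factor:.0f}x the computation per simulated day")
```

On that estimate the 3.5-km run needs roughly 60 times more grid columns and several hundred times more arithmetic per simulated day, before counting any extra vertical levels or output.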

“Once the model goes below 10-kilometer resolution, features such as well-defined hurricane eyewalls and convective cloud clusters appear for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.” Putman has been collaborating with GMAO modeling lead Max Suarez and others on the cubed-sphere configuration of GEOS-5.

NASA’s IPCC simulations will include both longer- and shorter-term climate projections using the latest versions of the GISS and GMAO models. GISS ModelE will perform simulations going back a full millennium and forward to 2100. Making its first IPCC contributions, the GMAO will focus on the next 30 years and perform decadal prediction simulations using GEOS-5 and atmospheric chemistry-climate simulations using the GEOS Chemistry Climate Model. With the performance of GEOS-5 on the Nehalem processors, investigations of societal relevance, such as climate impacts on weather extremes, now reach a new level of realism. The IPCC’s Fifth Assessment Report is due to be published in 2014.

NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.

About the system (link)

No mention of power consumption in the specs, though this internal report suggests “dramatically increased power consumption” at the Ames facility when it was similarly upgraded.

Photo of iDataPlex cluster

IBM iDataPlex Cluster: Discover Scalable Units 3 & 4

The NASA Center for Computational Sciences (NCCS) expanded its “Discover” high-end computer by integrating a 4,096-core IBM iDataPlex cluster, for a combined performance of 67 teraflops. The NCCS plans to at least double the iDataPlex processor count during 2009. Credit: Photo by NASA Goddard Space Flight Center/Pat Izzo.

Discover

The Discover system is an assembly of multiple Linux scalable units built upon commodity components. The first scalable unit was installed in the fall of 2006, and the NCCS continues to expand this highly successful computing platform.

Discover derives its name from the NASA adage of “Explore. Discover. Understand.”

Discover System Details

System Architecture
As mentioned above, the system is built from multiple scalable units. The list below gives the aggregate totals for the system and the specifications of each individual scalable unit; a short sketch after the list shows how each unit’s peak Tflop/s follows from its core count, clock rate, and floating-point operations per clock.

Aggregate
  • 49 Racks (compute, storage, switches, and more)
  • 111 Tflop/s
  • 11,032 Total Cores
File System and Storage
  • IBM GPFS
  • 2.46 PB Storage
Operating Environment
  • Operating System: SLES
  • Job Scheduler: PBS
  • Compilers: C, C++, Fortran (Intel and PGI)

Individual Scalable Units

Base Unit
  • Manufacturer: Linux Networx/SuperMicro
  • 3.33 Tflop/s
  • 520 Total Cores
  • 2 dual-core processors per node
  • 3.2 GHz Intel Xeon Dempsey (2 flop/s per clock)
  • Production: 4Q 2006
Scalable Unit 1
  • Manufacturer: Linux Networx/Dell
  • 10.98 Tflop/s
  • 1,032 Total Cores
  • Dell 1950 Compute Nodes
  • 2 dual-core processors per node
  • 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 2Q 2007
Scalable Unit 2
  • Manufacturer: Linux Networx/Dell
  • 10.98 Tflop/s
  • 1,032 Total Cores
  • Dell 1950 Compute Nodes
  • 2 dual-core processors per node
  • 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2007
Scalable Unit 3
  • Manufacturer: IBM
  • 20.64 Tflop/s
  • 2,064 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2008
Scalable Unit 4
  • Manufacturer: IBM
  • 20.64 Tflop/s
  • 2,064 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 4Q 2008
Scalable Unit 5
  • Manufacturer: IBM
  • 46.23 Tflop/s
  • 4,128 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.8 GHz Intel Xeon Nehalem (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2009
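
The per-unit peak numbers above follow from the standard theoretical-peak formula: cores × clock rate × floating-point operations per clock. The short Python sketch below (written for this post, not anything NASA provides) reproduces each figure and sums them; the totals land near, but not exactly on, the aggregate figures quoted earlier, which may simply reflect rounding or nodes not itemized in the list.

```python
# Sanity check on the peak-performance figures listed above.
# Peak Tflop/s = cores x clock (GHz) x floating-point ops per clock / 1000.
# Unit specs are copied from the list; the formula is the standard
# theoretical-peak calculation, not anything specific to the NCCS.
units = {
    "Base Unit (Dempsey)": (520,   3.20, 2),
    "SU 1 (Woodcrest)":    (1_032, 2.66, 4),
    "SU 2 (Woodcrest)":    (1_032, 2.66, 4),
    "SU 3 (Harpertown)":   (2_064, 2.50, 4),
    "SU 4 (Harpertown)":   (2_064, 2.50, 4),
    "SU 5 (Nehalem)":      (4_128, 2.80, 4),
}

total_cores = 0
total_tflops = 0.0
for name, (cores, ghz, flops_per_clock) in units.items():
    tflops = cores * ghz * flops_per_clock / 1_000
    total_cores += cores
    total_tflops += tflops
    print(f"{name:22} {tflops:6.2f} Tflop/s peak")

print(f"{'Sum':22} {total_tflops:6.2f} Tflop/s across {total_cores:,} cores")
```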

Photo of IBM POWER5+ system

IBM Test System: Schirra

The High-End Computing Capability Project purchased this 6-teraflop IBM POWER5+ system as part of its effort to evaluate the next-generation supercomputing technology that will eventually replace the Columbia supercomputer and meet NASA’s future computing needs.

Photo of the 128-screen hyperwall-2 visualization system

Hyperwall-2 Visualization System

The 128-screen hyperwall-2 system is used to view, analyze, and communicate results from NASA’s high-fidelity modeling and simulation projects, leading to new discoveries.

Climate Simulation Computer Becomes More Powerful
08.24.09

New high-resolution climate simulations reproduce atmospheric features not previously seen in NASA computer models. Moving from 27-kilometer resolution (top) to 3.5-kilometer resolution (center) yields cloud clusters like those seen in observations from NOAA’s latest GOES satellite (bottom). Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin

A 24-hour simulation run at progressively higher resolutions captures finer and finer details of cloud “vortex streets” moving over Alaska’s Aleutian Islands. A vortex street is a train of alternating clockwise and anticlockwise eddies (swirls) shed in a flow interacting with some structure. Credit: NASA/William Putman, Max Suarez; NOAA/ Shian-Jiann Lin

126 Comments
Highlander
August 25, 2009 1:36 am

The comment was:
—————
FatBigot (16:54:07) :
Why are they bothering? For years they’ve been telling us they’ve got the answer already.
—————

Well, ya see? It’s like this: They tell =US= that =THEY= know what they’re talking about, and that’s so there aren’t any questions in people’s minds.

However, the real truth is just this: THEY realize that =THEY= don’t really know what =THEY= are talking about.

Ergo the need for more money to prove that they DON’T know what they didn’t know and CAN’T ever know: What they don’t know.

Got that?

Otherwise the program they use will only lead to the same conclusions, only faster and with greater resolution.

Kinda sorta like four-wheel drive versus two-wheel drive: With four driving wheels, if you don’t know what you’re doing, you’ll get into trouble twice as fast!

Alan the Brit
August 25, 2009 1:42 am

Douglas DC (17:11:53) :
42 – now what was the question?
ht to “Hitchhiker’s Guide….”
Look, it’s in black & white, 5 x 9 = 42! Always has, always will, me bucko!
I can see the transatlantic spats now, those jolly chaps & chapesses in the Met Office chanting, “Na na na na na! Our Deep Thought is better than your Deep Thought!”
As to costs, well, the Met Office have very kindly spent £30M of money that isn’t theirs on their new super-computer, plus all the salaries to go with it; I understand the place is like a glorified holiday camp. Have a guess at what your lot are spending that isn’t theirs! $50M+? Governments have no money, it’s taxpayers’ money, folks, which they remove by legal force from your wallets!
AND just how will they know whether what comes out the end of it is right or meaningful? Well, they don’t actually, & after all that isn’t the point either. They just need lots of evidence to point the finger. The UK Government spends £Ms on Focus Groups & fake charities that prepare reports that just happen to endorse its policy proposals, so that it can raise taxes after promising it wouldn’t! It’s simple really, & this is how benign totalitarianism works: they create a solution to a non-existent problem, define the problem, then legislate against it. You’re next!

August 25, 2009 1:44 am

..and all they get is some kind of hockey stick, or curves obediently following the CO2 curve.

Joe Soap
August 25, 2009 1:51 am

That’s a seriously impressive amount of processing power, and probably enough to make Microsoft’s FSX playable.

Rhyl Dearden
August 25, 2009 4:00 am

Are they going to power these giants with alternative power sources?

Bruckner8
August 25, 2009 4:49 am

Isn’t it ironic that the warmists need this cutting edge technology, and at the same time they’re using this cutting edge technology to enact legislation to curtail the use of such technology, and at the same time the deniers are slamming the warmists for using cutting edge technology?

August 25, 2009 5:31 am

Why not use the now well-established, peer-reviewed economic models used in academic research and simulate the Dow of the future?
Estimating the Dow’s value in 2100 should not be any problem if you use the latest supercomputer instead of a single PC.
The more powerful the computer, the more accurate the result should be, right?

Craigo
August 25, 2009 5:38 am

Is this all really necessary when all it will do is draw more curves and then plot more points that correlate with something somewhere that will be peerreviewedpublishedinajournal as the third letter from Michael to the Gavinites followed shortly by rigorous analysis by the Apostates who will once again show the abuse of the “statistical method”.
This AGW really is a cash cow getting a good milking which will be followed by some cap ‘n tax or ets that will squeeze the last drops from the shriveled udder.

Nogw
August 25, 2009 5:57 am

That thing will need some intelligent input, do they have it?

Harold Vance
August 25, 2009 7:10 am

Does this make it the world’s fastest cherry-picking machine?

John Luft
August 25, 2009 7:28 am

So NASA will now get the wrong answers faster? Reminds me of the blurb about coffee…..drink lots and do stupid things faster.

Nogw
August 25, 2009 7:34 am

A horoscope of future climate would be better; meanwhile the old sun is parked somewhere out there, resting.

John G
August 25, 2009 7:37 am

Wow, nobody will be able to question the validity of the model results obtained on that baby (/sarcasm). Seriously, when will I be able to buy a desktop version?

Don S.
August 25, 2009 9:15 am

Hey GISS, what’s the density altitude and temp going to be at

Don S.
August 25, 2009 9:20 am

Hey GISS, what are the density altitude and temp going to be at MSO at 1200Z tomorrow? The air tanker guys could save a lot of time and labor if you could let them know.

Don S.
August 25, 2009 9:24 am

The question is: How do I turn it on?

Gene
August 25, 2009 9:48 am

sky: “Can’t wait to compare convective cloud clusters generated by GCMs with those observed in the real world.”
The observations will be wrong, of course, and the model right. 🙂
I find it “interesting” (other words that came to mind were far too coarse to post) that they plan to make model runs back a millennium, and forward to 2100. I need only ask: ON WHAT BLOODY DATA?!?!?!!?
I certainly hope someone without an axe to grind can access such high-res simulations so that we can get an independent assessment of how well the simulations replicate reality. I’m betting not very well. Current models do not necessarily do very well for hurricanes, so why should the GCMs?
It’s also nice to see that they have admitted never seeing a well-formed eye wall in prior simulations.
The key would seem to be for the modelers to work with the people who do field research and add computations for “natural factors” and see what happens under this new computational capacity, but I rather have doubts that will occur before I’m pushing up daisies … or the next ice age arrives.

George E. Smith
August 25, 2009 9:53 am

So we can go from garbage in to garbage out much faster than ever before; rah!
Reminds me of an “Idea for Design” in a famous (old) electronics industry magazine. Someone donated an hour of computer time on an IBM 1100 or some such to this “engineer,” so he decided to do a Monte Carlo analysis of the two-transistor amplifier circuit he had “worst case” designed to amplify with a gain of 10 +/-1.
The results of the analysis said that the load resistor of the main gain stage was the most critical item in the circuit. It also said that the spread of amplifier gains was skewed outside the worst-case design limits (how could that be if it is a worst-case design?). So it suggested a 5% change in the value of that load resistor to restore the gain to its desired value.
Whoop de do; the engineer didn’t do the WCD properly, but more importantly it was a totally lousy circuit to use in the first place. He could have rearranged the exact same number of circuit components into a much better negative-feedback amplifier whose gain would be the ratio of just two resistors (easy to make 10:1 with standard values) and almost independent of the properties of either of the two transistors; a much more robust design.
So a fast computer can perhaps optimise a lousy design, or climate theory, but don’t expect it to come up with the better design or climate theory that a better engineer (climate scientist) could have figured out without the faster computer.
NASA isn’t going to statistical mathematicate its way to a better climate science explanation; so why the bigger computer?
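
To put rough numbers on Smith’s feedback-amplifier point: when the closed-loop gain reduces to a resistor ratio (roughly Rf/Rin for an ideal inverting stage), the gain spread tracks only the resistor tolerances, not the transistor parameters. The toy Monte Carlo below uses illustrative values (10 kΩ and 1 kΩ, 5% parts); none of them come from his comment.

```python
# Toy Monte Carlo: gain spread of an ideal inverting feedback amplifier
# whose gain magnitude is Rf / Rin. Component values and the 5% tolerance
# are illustrative assumptions, not taken from the comment above.
import random

RF_NOMINAL, RIN_NOMINAL = 10_000.0, 1_000.0   # 10k / 1k -> nominal gain of 10
TOLERANCE = 0.05                               # 5% resistors

def sample(nominal: float, tol: float) -> float:
    """Draw a resistor value uniformly within its tolerance band."""
    return nominal * random.uniform(1.0 - tol, 1.0 + tol)

gains = [sample(RF_NOMINAL, TOLERANCE) / sample(RIN_NOMINAL, TOLERANCE)
         for _ in range(100_000)]

print(f"gain range: {min(gains):.2f} .. {max(gains):.2f} "
      f"(mean {sum(gains) / len(gains):.2f})")
```

The spread stays within the roughly ±10% implied by the two resistor tolerances, which is the robustness Smith is describing.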

Nogw
August 25, 2009 10:46 am

Some brains at WUWT are faster and more intelligent than that 21st-century childish hardware.

Roy Tucker
August 25, 2009 12:00 pm

pwl:
“Nice one Roy, may I post your comparison steps elsewhere?”
Be my guest. Permission granted. ;^)

Dave Andrews
August 25, 2009 12:41 pm

““This new computing system represents a dramatic step forward in performance for climate simulations.””
Is this tacit acknowledgement that current climate models are not too good in their simulations?

Thomas J. Arnold.
August 25, 2009 1:04 pm

Should it be tested by retro-forecasting? Take a location’s surface weather station and feed in inputs from the last 7 days/month/year, then test the computer to see if it can mimic the climate inputs and outputs for that particular week/month/year – are NASA up for it??
I do feel that all this ‘power’ should be used for something more tangible. If it came up with projections, say, that the climate is not warming, would they (NASA) release the data?? – or am I being ingenuous?
There must be a tendency for the computer experts to be so completely immersed in the subject and the intricacies of the engineering that they miss the objective entirely, like being lost in cyberspace.
I am not sure that we shall ever be able to predict with any accuracy this beautiful and terrible climate of our world – or we may just be on the verge……. and something goes off in a corner of Wyoming!!****! – so might I now suggest that we concentrate our energies far more on volcanic/seismic event prediction.
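
Arnold’s retro-forecasting suggestion is essentially a standard hindcast verification: replay a past period with the model and score it against the station record. The fragment below sketches only the scoring step, with made-up placeholder numbers standing in for real observations and model output.

```python
# Minimal hindcast-verification sketch: score a model's replay of a past
# period against observations. The two series are made-up placeholders
# standing in for a station record and a model hindcast.
import math

observed = [12.1, 13.4, 11.8, 10.2, 9.7, 11.0, 12.5]   # e.g. daily temps, degC
hindcast = [11.8, 13.9, 12.4, 10.0, 9.1, 11.6, 12.0]

n = len(observed)
bias = sum(h - o for h, o in zip(hindcast, observed)) / n
rmse = math.sqrt(sum((h - o) ** 2 for h, o in zip(hindcast, observed)) / n)

print(f"mean bias: {bias:+.2f} degC, RMSE: {rmse:.2f} degC")
```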

gofer
August 25, 2009 1:52 pm

Turn it OFF! Isn’t that what we’ve been hearing from the “greenies” concerning our computers. I just read where computers are responsible for millions of tons of GHGs, so the best thing they can do for the enviroment is to shut down this massive GHG smoking hunk of earth destroying junk. We’ve already been told that they are “pretty certain” that we are all going to bake like KFC, so why do they need to do anymore of this nonsense? How many light bulbs I got to change out in order to save enough juice to run this montrosity?

George E. Smith
August 26, 2009 11:05 am

“”” Peter West (15:02:54) :
See http://seattletimes.nwsource.com/html/boeingaerospace/2009565319_boeing30.html and http://blogs.crikey.com.au/planetalking/2009/07/07/how-the-joint-strike-fighter-and-dreamliner-787-programs-are-compromised-by-similar-project-management-failures/ for some of the triumphs of computer modelling. “””
As I recall, the 787 has yet to get off the ground (commercially) so don’t count that as a success. I think the 737 was a success; the 787 may never be.
And the JSF if they ever build any, will just be displacing a far superior aircraft.
But if you want to buy a product that resulted from computer modelling, you can now get the Logitech “Everywhere MX” mouse that will track (almost) everywhere; even on glass. I can pretty much guarantee that it will NOT work on the Hubble Telescope Primary mirror, even though they won’t let anyone try that; but it works most everywhere else.
George