NASA powers up for the next UN IPCC

Goddard Space Flight Center recently added 4,128 processors to its Discover high-end computing system, with another 4,128 processors to follow this fall. The expanded Discover will host NASA’s climate simulations for the Intergovernmental Panel on Climate Change. Credit: NASA/Pat Izzo

From NASA, new fossil-fuel-powered toys for the boys.

Related: “Giant sucking sound” heard in Greenbelt, MD, when the switch was thrown.

Remember the day you got a brand-new computer? Applications snapped open, processes that once took minutes finished in seconds, and graphics and animation flowed as smoothly as TV video. But several months and many new applications later, the bloom fell off the rose.

Your lightning-fast computer was no longer fast. You needed more memory and faster processors to handle the gigabytes of new files now embedded in your machine.

Climate scientists can relate.

They, too, need more powerful computers to process the sophisticated computer models used in climate forecasts. Such an expanded capability is now being developed at NASA’s Goddard Space Flight Center in Greenbelt, Md.

High-End Computing System Installed

In August, Goddard added 4,128 new-generation Intel “Nehalem” processors to its Discover high-end computing system. The upgraded Discover will serve as the centerpiece of a new climate simulation capability at Goddard. Discover will host NASA’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC), the leading scientific organization for assessing climate change, and other national and international climate initiatives.

To further enhance Discover’s capabilities, Goddard will install another 4,128 Nehalem processors in the fall, bringing Discover to 15,160 processors.

“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of Goddard’s Computational and Information Sciences and Technology Office (CISTO). “This new computing system represents a dramatic step forward in performance for climate simulations.”

Well-Suited for Climate Studies

According to CISTO lead architect Dan Duffy, the Nehalem architecture is especially well-suited to climate studies. “Speed is an inherent advantage for solving complex problems, but climate models also require large memory and fast access to memory,” he said. Each processor has 3 gigabytes of memory, among the highest available today. In addition, memory access is three to four times faster than Discover’s previous-generation processors.
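
As a back-of-envelope illustration of what those figures imply (a sketch only, not an official spec sheet; it reads “processor” as a core, consistent with the per-core counts in the system details below, and uses only the 3 GB and 4,128 numbers quoted in the article):

```python
# Rough memory sizing for the new scalable unit, using only figures quoted
# in the article: 3 GB of memory per processor core and 4,128 new cores.
cores_new_unit = 4128        # Nehalem cores added in August
mem_per_core_gb = 3          # "3 gigabytes of memory" per processor

total_mem_gb = cores_new_unit * mem_per_core_gb
print(f"New unit memory: {total_mem_gb:,} GB (~{total_mem_gb / 1024:.1f} TB)")
# New unit memory: 12,384 GB (~12.1 TB)
```

Memory on that scale is what lets a global model keep its large three-dimensional state arrays resident in RAM rather than paging them through slower storage.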

In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared with other nationally recognized high-end computing systems. The new computational capabilities also allowed NASA climate scientists to run high-resolution simulations that reproduced atmospheric features not previously seen in their models.

For instance, “features such as well-defined hurricane eyewalls and convective cloud clusters appeared for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.”

IPCC Simulations

For the IPCC studies, scientists will run both longer-term and shorter-term climate projections using different computer models. A climate model from the Goddard Institute for Space Studies will perform simulations going back a full millennium and forward to 2100. Goddard’s Global Modeling and Assimilation Office will use a climate model for projections of the next 30 years and an atmospheric chemistry-climate model for short-term simulations of chemistry-climate feedbacks. The IPCC will use information from climate simulations such as these in its Fifth Assessment Report, which the IPCC expects to publish in 2014.

NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.

News Release from NASA here

NASA Expands High-End Computing System for Climate Simulation


GREENBELT, Md., Aug. 24 /PRNewswire-USNewswire/ — NASA’s Goddard Space Flight Center in Greenbelt, Md., made available to scientists in August the first unit of an expanded high-end computing system that will serve as the centerpiece of a new climate simulation capability. The larger computer, part of NASA’s High-End Computing Program, will be hosting the agency’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC) and other national and international climate initiatives.

The expansion added 4,128 computer processors to Goddard’s Discover high-end computing system. The IBM iDataPlex “scalable unit” uses Intel’s newest Xeon 5500 series processors, which are based on the Nehalem architecture introduced in spring 2009.

Discover will be hosting climate simulations for the IPCC’s Fifth Assessment Report by the Goddard Institute for Space Studies (GISS) in New York City and Goddard’s Global Modeling and Assimilation Office (GMAO). Stimulus funds from the American Recovery and Reinvestment Act of 2009 will enable installation of another 4,128 Nehalem processors this fall, bringing Discover to 15,160 processors.

“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of the Computational and Information Sciences and Technology Office (CISTO) at Goddard. “This new computing system represents a dramatic step forward in performance for climate simulations.”

In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared to other nationally recognized high-end computing systems. Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.

“Nehalem architecture is especially well-suited to climate studies,” said Dan Duffy, CISTO lead architect. “Speed is an inherent advantage for solving complex problems, but climate models need large memory and fast access. We configured our Nehalem system to have 3 gigabytes of memory per processor, among the highest available today, and memory access is three to four times faster than Discover’s previous-generation processors.”

In daily forecasts for NASA satellite missions and field campaigns, the GMAO typically runs its flagship Goddard Earth Observing System Model, Version 5 (GEOS-5) at 27-kilometer resolution. Using Discover’s new processors, the GMAO has been testing a special “cubed-sphere” version of GEOS-5 at resolutions as high as 3.5 kilometers.
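
To see why such a jump in resolution demands this much extra computing, a rough scaling sketch helps (an illustration only, ignoring vertical levels, I/O, and model-specific details; the rule of thumb is that horizontal grid columns grow with the square of the refinement factor, and the shorter stable timestep adds roughly another factor of the refinement):

```python
# Rough cost scaling between the two GEOS-5 resolutions quoted above.
coarse_km, fine_km = 27.0, 3.5
refine = coarse_km / fine_km   # ~7.7x finer in each horizontal direction

columns = refine ** 2          # ~60x more grid columns
cost = refine ** 3             # ~460x more work per simulated day,
                               # since the timestep shrinks with grid spacing
print(f"{refine:.1f}x refinement -> ~{columns:.0f}x columns, ~{cost:.0f}x cost")
```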

“Once the model goes below 10-kilometer resolution, features such as well-defined hurricane eyewalls and convective cloud clusters appear for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.” Putman has been collaborating with GMAO modeling lead Max Suarez and others on the cubed-sphere configuration of GEOS-5.

NASA’s IPCC simulations will include both longer- and shorter-term climate projections using the latest versions of the GISS and GMAO models. GISS ModelE will perform simulations going back a full millennium and forward to 2100. Making its first IPCC contributions, the GMAO will focus on the next 30 years and perform decadal prediction simulations using GEOS-5 and atmospheric chemistry-climate simulations using the GEOS Chemistry Climate Model. With the performance of GEOS-5 on the Nehalem processors, investigations of societal relevance, such as climate impacts on weather extremes, now reach a new level of realism. The IPCC’s Fifth Assessment Report is due to be published in 2014.

NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.

About the system: link

No mention of power consumption in the specs, though this internal report suggests “dramatically increased power consumption” at the NASA Ames facility when it was similarly upgraded.

Photo of iDataPlex cluster

IBM iDataPlex Cluster: Discover Scalable Units 3 & 4

The NASA Center for Computational Sciences (NCCS) expanded its “Discover” high-end computer by integrating a 4,128-core IBM iDataPlex cluster (Scalable Units 3 and 4, at 2,064 cores each), for a combined performance of 67 teraflops. The NCCS plans to at least double the iDataPlex processor count during 2009. Credit: Photo by NASA Goddard Space Flight Center/Pat Izzo.

Discover

The Discover system is an assembly of multiple Linux scalable units built upon commodity components. The first scalable unit was installed in the fall of 2006, and the NCCS continues to expand this highly successful computing platform.

Discover derives its name from the NASA adage of “Explore. Discover. Understand.”

Discover System Details

System Architecture
As mentioned above, the system architecture is made up of multiple scalable units. The list below describes the total aggregate components of the system and its individual scalable units; a short sketch after the list checks the listed peak-performance figures.

Aggregate
  • 49 Racks (compute, storage, switches, and more)
  • 111 Tflop/s
  • 11,032 Total Cores
File System and Storage
  • IBM GPFS
  • 2.46 PB Storage
Operating Environment
  • Operating System: SLES
  • Job Scheduler: PBS
  • Compilers: C, C++, Fortran (Intel and PGI)

Individual Scalable Units

Base Unit
  • Manufacturer: Linux Networx/SuperMicro
  • 3.33 Tflop/s
  • 520 Total Cores
  • 2 dual-core processors per node
  • 3.2 GHz Intel Xeon Dempsey (2 flop/s per clock)
  • Production: 4Q 2006
Scalable Unit 1
  • Manufacturer: Linux Networx/Dell
  • 10.98 Tflop/s
  • 1,032 Total Cores
  • Dell 1950 Compute Nodes
  • 2 dual-core processors per node
  • 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 2Q 2007
Scalable Unit 2
  • Manufacturer: Linux Networx/Dell
  • 10.98 Tflop/s
  • 1,032 Total Cores
  • Dell 1950 Compute Nodes
  • 2 dual-core processors per node
  • 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2007
Scalable Unit 3
  • Manufacturer: IBM
  • 20.64 Tflop/s
  • 2,064 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2008
Scalable Unit 4
  • Manufacturer: IBM
  • 20.64 Tflop/s
  • 2,064 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 4Q 2008
Scalable Unit 5
  • Manufacturer: IBM
  • 46.23 Tflop/s
  • 4,128 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.8 GHz Intel Xeon Nehalem (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2009
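
The listed peak figures follow directly from cores × clock × floating-point operations per clock. A quick sketch (Python, purely illustrative, using only the numbers from the list above) checking that arithmetic:

```python
# Sanity check: peak Tflop/s = cores * clock (GHz) * flops per clock / 1000.
units = [
    # (name,              cores, GHz, flops/clock, listed Tflop/s)
    ("Base Unit, Dempsey",  520, 3.20, 2,  3.33),
    ("SU1, Woodcrest",     1032, 2.66, 4, 10.98),
    ("SU2, Woodcrest",     1032, 2.66, 4, 10.98),
    ("SU3, Harpertown",    2064, 2.50, 4, 20.64),
    ("SU4, Harpertown",    2064, 2.50, 4, 20.64),
    ("SU5, Nehalem",       4128, 2.80, 4, 46.23),
]
for name, cores, ghz, fpc, listed in units:
    peak = cores * ghz * fpc / 1000.0    # Gflop/s -> Tflop/s
    print(f"{name:20s} computed {peak:6.2f} Tflop/s (listed {listed:6.2f})")
```

Each computed value matches the listed figure, and adding the fall delivery of 4,128 Nehalem cores to the current 11,032 gives the 15,160-processor total quoted in the press release.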

Photo of IBM POWER5+ system

IBM Test System: Schirra

The High-End Computing Capability Project purchased this 6-teraflop IBM POWER5+ system as part of its effort to evaluate next-generation supercomputing technology, which will eventually replace the Columbia supercomputer, to meet NASA’s future computing needs.

Photo of the 128-screen hyperwall-2 visualization system

Hyperwall-2 Visualization System

The 128-screen hyperwall-2 system is used to view, analyze, and communicate results from NASA’s high-fidelity modeling and simulation projects, leading to new discoveries.

Climate Simulation Computer Becomes More Powerful
08.24.09

New high-resolution climate simulations reproduce atmospheric features not previously seen in NASA computer models. Moving from 27-kilometer resolution (top) to 3.5-kilometer resolution (center) yields cloud clusters like those seen in observations from NOAA’s latest GOES satellite (bottom). Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin

A 24-hour simulation run at progressively higher resolutions captures finer and finer details of cloud “vortex streets” moving over Alaska’s Aleutian Islands. A vortex street is a train of alternating clockwise and anticlockwise eddies (swirls) shed in a flow interacting with some structure. Credit: NASA/William Putman, Max Suarez; NOAA/ Shian-Jiann Lin


126 Comments
Richard deSousa
August 24, 2009 5:42 pm

More expensive and faster junk science.

Don Penim
August 24, 2009 5:44 pm

I sure hope they got the free upgrade to Windows 7 when it comes out 😉

Dr A Burns
August 24, 2009 5:48 pm

When a computer model can tell with any accuracy whether the weather will be wet next week, I’ll start to pay attention to the much more complex climate models.
I wonder if the new machines are still running the old FORTRAN code?

August 24, 2009 5:51 pm

Hey GISS. Can your new computer tell us when it’s going to rain in San Antonio?

TJA
August 24, 2009 5:54 pm

Maybe they could have better spent the money attracting first rate mathematicians with their paper notebooks and chalkboards. Maybe a laptop running Mathematica.

Frank K.
August 24, 2009 5:55 pm

“GISS ModelE will perform simulations going back a full millennium and forward to 2100.”
This will be hilarious to observe – an undocumented, garbage FORTRAN code like Model E attempting to model climate for a full millennium. Your tax dollars at work!
I hope everyone who knows someone who has lost a job, or who is themselves hurting in our current economy, looks at this expenditure as the colossal waste of money that it is – turning over millions to the incompetent numerical analysts of the GISS…

John F. Hultquist
August 24, 2009 6:10 pm

Lighten up, folks. In the long run this is a good thing. I am much happier with a fast computer and high-speed internet connection. For a few days after I got these I did the same things I’d been doing – but then I learned I could do other things I had not been able to do before.
So, they will transfer their simulations and data to the new hardware and, at first, produce the same answers. But they will soon find they have time to insert other data and other functions and better resolution. Eventually, they will find that things more complex than CO2 driven warming are involved and as these are incorporated the role of GHGs will decrease in their models.
None of this is likely to happen before Copenhagen, of course. But even as the western economies stagnate under “cap and tax” laws we will begin to understand (better) the natural variability of Earth’s climate.
Science is good. Politics – not so much.

ian middleton
August 24, 2009 6:15 pm

I bet the first program they run on it will be DOOM 2

August 24, 2009 6:20 pm

Garbage In, Garbage Out, Only faster!

Joe Miner
August 24, 2009 6:23 pm

Maybe they’ll run folding@home on it and we’ll get some productive use out of our tax dollars.

Gene Nemetz
August 24, 2009 6:29 pm

If Henrik Svensmark, Piers Corbyn, Roy Spencer, and Anthony Watts (and who they would choose to help them) were in charge of the inputs I’d feel a lot better about the outputs of this toy.

JimB
August 24, 2009 6:34 pm

“Smokey (17:19:59) :
46 teraflops …”
What could go wrong?
JimB

Richard M
August 24, 2009 6:34 pm

I hope the application has sufficient granularity to take advantage of the extra processing power. Otherwise it might not run much faster.

paulakw
August 24, 2009 6:35 pm

Chuckle!!!!!!!!!
Now they can be wrong 100 times faster!!!

Gene Nemetz
August 24, 2009 6:35 pm

Robert Wykoff (17:22:17) :
It will only take minutes instead of hours now to determine that the climate is getting worse much faster than we imagined.
And it only took you seconds to know this is what will happen. 😉

JimB
August 24, 2009 6:36 pm

“Mike McMillan (18:20:39) :
Garbage In, Garbage Out, Only faster!”
…or
As someone else on WUWT posted some time ago:
Uga in, Chucka out.
JimB

Gene Nemetz
August 24, 2009 6:39 pm

Andrew S (17:23:15) :
You’ve got to spend money to make money.
God I hate that this is true. Because they are spending my money, and your money, to make that money. And the money they are making is my money, and your money.

August 24, 2009 6:42 pm

Gerry (16:46:33) :
More GIGO (garbage in, garbage out) for the IPCC’s sewerpipe.
Mike McMillan (18:20:39) :
Garbage In, Garbage Out, Only faster!
Frank K. (17:55:58) :
“GISS ModelE will perform simulations going back a full millennium and forward to 2100.”
This will be hilarious to observe – an undocumented, garbage FORTRAN code like Model E attempting to model climate for a full millennium. Your tax dollars at work!

My question is WHICH “past millennium” temperatures are they going to use as “input,” and which are they going to use as output to “verify” their programs? “GIGO faster” also means that if the input equations are going to be compared to the Mann-Jones hokey stick temperatures (whether bristlecone pines or his latest “hurricane tempestology” repetition of the hockey stick tree rings), the output (100 years) CAN’T be correct because the INPUT (1000 years) is a blatant hoax.

August 24, 2009 6:43 pm

Gene Nemetz (18:39:48) :
Andrew S (17:23:15) :
You’ve got to spend money to make money.
God I hate that this is true. Because they are spending my money, and your money, to make that money. And the money they are making is my money, and your money.
—–
Rather:
Because they are spending my money, and your money, to TAKE that money. And the money they are TAKING is my money, and your money.

Curiousgeorge
August 24, 2009 6:44 pm

All hail the new Oracle! Anybody notice there seems to be a significant similarity between these electromechanical oracles and the original that used to sniff natural gas at Delphi? They both have their priests and acolytes, they both must be sacrificed to and catered to, and they both had major influence over political leaders. And they both have equivalent track records.

Gene Nemetz
August 24, 2009 6:44 pm

Carbon-based Life Form (17:24:42) : ..according to Gavin’s estimation
Doesn’t it just SUCK that someone the caliber of Gavin Schmidt is one of the people who is going to get to play with this toy?! OMFG!!

August 24, 2009 6:50 pm

So is this baby powered by windmills? Or fossil fuels? Does it come with an energy efficiency rating like water heaters? If so, what’s the rating?
And don’t tell me 42. That is NOT the answer to everything.

August 24, 2009 6:52 pm

Sweet toy. Now we can compute garbage at an unprecedented rate.

Jeff Szuhay
August 24, 2009 6:54 pm

Computers are useless. They can only give you answers.
— Pablo Picasso, painter (1881 – 1973)

WatasC
August 24, 2009 6:56 pm

This kind of smokescreen makes the climatologists think they are getting somewhere, when all that will happen is that they will become increasingly sure of their answers – while they will be no more accurate than before.
Climate change cannot be predicted by a single model no matter how complex that model.
Population growth models based on data from 1950–1970 are inapplicable to population growth patterns in 2009. Investment banking models were based on data from 1945–2005, during which time single-family housing prices in aggregate increased. The models based on these data predicted that single-home mortgages in aggregate had less than a 1% risk of default. Guess what happened?
The results of this reliance in climatology on models, rather than on observation, experiment, and theory, are likely to be as disappointing as the similar reliance on econometric models in the mid-20th century and on the investment banking models that led to our current financial collapse.