
From NASA, new fossil-fuel-powered toys for the boys.
Related: “Giant sucking sound” heard in Greenbelt, MD when the switch was thrown.
Remember the day you got a brand-new computer? Applications snapped open, processes that once took minutes finished in seconds, and graphics and animation flowed as smoothly as TV video. But several months and many new applications later, the bloom fell off the rose.
Your lightning-fast computer was no longer fast. You needed more memory and faster processors to handle the gigabytes of new files now stored on your machine.
Climate scientists can relate.
They, too, need more powerful computers to process the sophisticated computer models used in climate forecasts. Such an expanded capability is now being developed at NASA’s Goddard Space Flight Center in Greenbelt, Md.
High-End Computing System Installed
In August, Goddard added 4,128 new-generation Intel “Nehalem” processors to its Discover high-end computing system. The upgraded Discover will serve as the centerpiece of a new climate simulation capability at Goddard. Discover will host NASA’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC), the leading scientific organization for assessing climate change, and other national and international climate initiatives.
To further enhance Discover’s capabilities, Goddard will install another 4,128 Nehalem processors in the fall, bringing Discover to 15,160 processors.
“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of Goddard’s Computational and Information Sciences and Technology Office (CISTO). “This new computing system represents a dramatic step forward in performance for climate simulations.”
Well-Suited for Climate Studies
According to CISTO lead architect Dan Duffy, the Nehalem architecture is especially well-suited to climate studies. “Speed is an inherent advantage for solving complex problems, but climate models also require large memory and fast access to memory,” he said. Each processor has 3 gigabytes of memory, among the highest available today. In addition, memory access is three to four times faster than Discover’s previous-generation processors.
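If Duffy’s “3 gigabytes of memory per processor” means 3 GB per core (an assumption; the wording could also refer to a whole socket), the August expansion alone brings something on the order of 12 terabytes of memory. A quick illustrative sketch in Python:

```python
# Rough aggregate memory of the new Nehalem unit, assuming the quoted
# "3 gigabytes of memory per processor" means 3 GB per core (assumption;
# the figure could instead be per socket).
cores = 4128            # processors added in August
gb_per_core = 3
total_tb = cores * gb_per_core / 1024  # GiB -> TiB
print(f"~{total_tb:.1f} TB of memory across the new unit")  # ~12.1 TB
```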
In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared with other nationally recognized high-end computing systems. The new computational capabilities also allowed NASA climate scientists to run high-resolution simulations that reproduced atmospheric features not previously seen in their models.
For instance, “features such as well-defined hurricane eyewalls and convective cloud clusters appeared for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.”
IPCC Simulations
For the IPCC studies, scientists will run both longer-term and shorter-term climate projections using different computer models. A climate model from the Goddard Institute for Space Studies will perform simulations going back a full millennium and forward to 2100. Goddard’s Global Modeling and Assimilation Office will use a climate model for projections of the next 30 years and an atmospheric chemistry-climate model for short-term simulations of chemistry-climate feedbacks. The IPCC will use information from climate simulations such as these in its Fifth Assessment Report, which IPCC expects to publish in 2014.
NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.
News Release from NASA here
NASA Expands High-End Computing System for Climate Simulation
WASHINGTON, DC UNITED STATES
GREENBELT, Md., Aug. 24 /PRNewswire-USNewswire/ — NASA’s Goddard Space Flight Center in Greenbelt, Md., made available to scientists in August the first unit of an expanded high-end computing system that will serve as the centerpiece of a new climate simulation capability. The larger computer, part of NASA’s High-End Computing Program, will be hosting the agency’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC) and other national and international climate initiatives.
The expansion added 4,128 computer processors to Goddard’s Discover high-end computing system. The IBM iDataPlex “scalable unit” uses Intel’s newest Xeon 5500 series processors, which are based on the Nehalem architecture introduced in spring 2009.
Discover will be hosting climate simulations for the IPCC’s Fifth Assessment Report by the Goddard Institute for Space Studies (GISS) in New York City and Goddard’s Global Modeling and Assimilation Office (GMAO). Stimulus funds from the American Recovery and Reinvestment Act of 2009 will enable installation of another 4,128 Nehalem processors this fall, bringing Discover to 15,160 processors.
“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of the Computational and Information Sciences and Technology Office (CISTO) at Goddard. “This new computing system represents a dramatic step forward in performance for climate simulations.”
In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared to other nationally recognized high-end computing systems. Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.
“Nehalem architecture is especially well-suited to climate studies,” said Dan Duffy, CISTO lead architect. “Speed is an inherent advantage for solving complex problems, but climate models need large memory and fast access. We configured our Nehalem system to have 3 gigabytes of memory per processor, among the highest available today, and memory access is three to four times faster than Discover’s previous-generation processors.”
In daily forecasts for NASA satellite missions and field campaigns, the GMAO typically runs its flagship Goddard Earth Observing System Model, Version 5 (GEOS-5) at 27-kilometer resolution. Using Discover’s new processors, the GMAO has been testing a special “cubed-sphere” version of GEOS-5 at resolutions as high as 3.5 kilometers.
“Once the model goes below 10-kilometer resolution, features such as well-defined hurricane eyewalls and convective cloud clusters appear for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.” Putman has been collaborating with GMAO modeling lead Max Suarez and others on the cubed-sphere configuration of GEOS-5.
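For readers wondering why the jump from 27-kilometer to 3.5-kilometer resolution is such a big deal computationally, here is a back-of-the-envelope sketch (Python, illustrative only; it assumes the number of grid columns scales with the square of the resolution ratio and the stable timestep shrinks linearly with grid spacing, and it ignores vertical levels, physics, and I/O):

```python
# Rough cost scaling when refining a global model's horizontal resolution,
# e.g. GEOS-5 run at 27 km versus 3.5 km. Illustrative estimate only.

def relative_cost(coarse_km: float, fine_km: float) -> float:
    """Grid columns grow with the square of the resolution ratio, and the
    timestep typically shrinks linearly with grid spacing (CFL limit),
    so total work grows roughly with the cube of the ratio."""
    return (coarse_km / fine_km) ** 3

ratio = 27.0 / 3.5
print(f"grid spacing ratio: {ratio:.1f}x finer")                        # ~7.7x
print(f"grid columns:       {ratio**2:.0f}x more")                      # ~60x
print(f"estimated compute:  {relative_cost(27.0, 3.5):.0f}x more work") # ~460x
```

Even by this crude estimate, the 3.5-kilometer runs cost hundreds of times more than the routine 27-kilometer forecasts, which is why they only become practical with the new hardware.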
NASA’s IPCC simulations will include both longer- and shorter-term climate projections using the latest versions of the GISS and GMAO models. GISS ModelE will perform simulations going back a full millennium and forward to 2100. Making its first IPCC contributions, the GMAO will focus on the next 30 years and perform decadal prediction simulations using GEOS-5 and atmospheric chemistry-climate simulations using the GEOS Chemistry Climate Model. With the performance of GEOS-5 on the Nehalem processors, investigations of societal relevance, such as climate impacts on weather extremes, now reach a new level of realism. The IPCC’s Fifth Assessment Report is due to be published in 2014.
NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.
About the system…link
No mention of power consumption in the specs, though this internal report suggests “dramatically increased power consumption” at the Ames facility when it was similarly upgraded.
IBM iDataPlex Cluster: Discover Scalable Units 3 & 4
The NASA Center for Computational Sciences (NCCS) expanded its “Discover” high-end computer by integrating a 4,096-core IBM iDataPlex cluster, for a combined performance of 67 teraflops. The NCCS plans to at least double the iDataPlex processor count during 2009. Credit: Photo by NASA Goddard Space Flight Center/Pat Izzo.
Discover
- The Discover system is an assembly of multiple Linux scalable units built upon commodity components. The first scalable unit was installed in the fall of 2006, and the NCCS continues to expand this highly successful computing platform.
Discover derives its name from the NASA adage of “Explore. Discover. Understand.”
Discover System Details
- 49 Racks (compute, storage, switches, and more)
- 111 Tflop/s
- 11,032 Total Cores
- IBM GPFS
- 2.46 PB Storage
- Operating System: SLES
- Job Scheduler: PBS
- Compilers: C, C++, Fortran (Intel and PGI)
Individual Scalable Units
- Manufacturer: Linux Networx/SuperMicro
- 3.33 Tflop/s
- 520 Total Cores
- 2 dual-core processors per node
- 3.2 GHz Intel Xeon Dempsey (2 flop/s per clock)
- Production: 4Q 2006
- Manufacturer: Linux Networx/Dell
- 10.98 Tflop/s
- 1,032 Total Cores
- Dell 1950 Compute Nodes
- 2 dual-core processors per node
- 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 2Q 2007
- Manufacturer: Linux Networx/Dell
- 10.98 Tflop/s
- 1,032 Total Cores
- Dell 1950 Compute Nodes
- 2 dual-core processors per node
- 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 3Q 2007
- Manufacturer: IBM
- 20.64 Tflop/s
- 2,064 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 3Q 2008
- Manufacturer: IBM
- 20.64 Tflop/s
- 2,064 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 4Q 2008
- Manufacturer: IBM
- 46.23 Tflop/s
- 4,128 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.8 GHz Intel Xeon Nehalem (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 3Q 2009
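The per-unit Tflop/s figures above follow directly from cores × clock speed × floating-point operations per clock cycle. A small sketch (Python; theoretical peaks only, not sustained application performance) reproduces them:

```python
# Theoretical peak of each Discover scalable unit, from the specs listed
# above: cores x clock (GHz) x floating-point ops per clock cycle.
units = [
    # (processor / production date,  cores, GHz,  flops per clock)
    ("Dempsey     (4Q 2006)",          520, 3.20, 2),
    ("Woodcrest   (2Q 2007)",         1032, 2.66, 4),
    ("Woodcrest   (3Q 2007)",         1032, 2.66, 4),
    ("Harpertown  (3Q 2008)",         2064, 2.50, 4),
    ("Harpertown  (4Q 2008)",         2064, 2.50, 4),
    ("Nehalem     (3Q 2009)",         4128, 2.80, 4),
]

for name, cores, ghz, flops_per_clock in units:
    tflops = cores * ghz * flops_per_clock / 1000.0  # Gflop/s -> Tflop/s
    print(f"{name:<24} {tflops:6.2f} Tflop/s")
```

Summing these theoretical peaks lands in the neighborhood of the 111 Tflop/s total quoted for the full system.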
IBM Test System: Schirra
The High-End Computing Capability Project purchased this 6-teraflop IBM POWER5+ system as part of its effort to evaluate the next-generation supercomputing technology that will eventually replace the Columbia supercomputer and meet NASA’s future computing needs.
Hyperwall-2 Visualization System
The 128-screen hyperwall-2 system is used to view, analyze, and communicate results from NASA’s high-fidelity modeling and simulation projects, leading to new discoveries.
New high-resolution climate simulations reproduce atmospheric features not previously seen in NASA computer models. Moving from 27-kilometer resolution (top) to 3.5-kilometer resolution (center) yields cloud clusters like those seen in observations from NOAA’s latest GOES satellite (bottom). Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin
A 24-hour simulation run at progressively higher resolutions captures finer and finer details of cloud “vortex streets” moving over Alaska’s Aleutian Islands. A vortex street is a train of alternating clockwise and anticlockwise eddies (swirls) shed in a flow interacting with some structure. Credit: NASA/William Putman, Max Suarez; NOAA/ Shian-Jiann Lin
Goddard Space Flight Center recently added 4,128 processors to its Discover high-end computing system, with another 4,128 processors to follow this fall. The expanded Discover will host NASA’s climate simulations for the Intergovernmental Panel on Climate Change. Credit: NASA/Pat Izzo
More expensive and faster junk science.
I sure hope they got the free upgrade to Windows 7 when it comes out 😉
When a computer model can tell with any accuracy whether the weather will be wet next week, I’ll start to pay attention to the much more complex climate models.
I wonder if the new machines are still running the old FORTRAN code ?
Hey Giss. Can your new computer tell us when it’s going to rain in San Antonio?
Maybe they could have better spent the money attracting first rate mathematicians with their paper notebooks and chalkboards. Maybe a laptop running Mathematica.
“GISS ModelE will perform simulations going back a full millennium and forward to 2100.”
This will be hilarious to observe – an undocumented, garbage FORTRAN code like Model E attempting to model climate for a full millennium. Your tax dollars at work!
I hope everyone who knows someone who has lost a job, or who is themselves hurting in our current economy, sees this expenditure for the colossal waste of money that it is – turning over millions to the incompetent numerical analysts of the GISS…
Lighten up, folks. In the long run this is a good thing. I am much happier with a fast computer and high-speed internet connection. For a few days after I got these I did the same things I’d been doing – but then I learned I could do other things I had not been able to do before.
So, they will transfer their simulations and data to the new hardware and, at first, produce the same answers. But they will soon find they have time to insert other data and other functions and better resolution. Eventually, they will find that things more complex than CO2 driven warming are involved and as these are incorporated the role of GHGs will decrease in their models.
None of this is likely to happen before Copenhagen, of course. But even as the western economies stagnate under “cap and tax” laws we will begin to understand (better) the natural variability of Earth’s climate.
Science is good. Politics – not so much.
I bet the first program they run on it will be DOOM 2
Garbage In, Garbage Out, Only faster !
Maybe they’ll run folding@home on it and we’ll get some productive use out of our tax dollars.
If Henrik Svensmark, Piers Corbyn, Roy Spencer, and Anthony Watts (and who they would choose to help them) were in charge of the inputs I’d feel a lot better about the outputs of this toy.
“Smokey (17:19:59) :
46 teraflops …”
What could go wrong?
JimB
I hope the application has sufficient granularity to take advantage of the extra processing power. Otherwise it might not run much faster.
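The granularity point above is essentially Amdahl’s law: the speedup from throwing more processors at a code is capped by whatever fraction of the work stays serial. A minimal illustration (Python; the parallel fractions are made-up examples, not measurements of any NASA code):

```python
# Amdahl's law: speedup on N processors is limited by the serial fraction.
def amdahl_speedup(parallel_fraction: float, processors: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / processors)

for p in (0.90, 0.99, 0.999):
    s = amdahl_speedup(p, 15160)
    print(f"parallel fraction {p:.3f}: {s:,.0f}x speedup on 15,160 processors")
# Even with 99% of the work parallelized, ~15,000 processors deliver
# less than a 100x speedup over one processor.
```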
Chuckle !!!!!!!!!
Now they can be wrong 100 times faster !!!
Robert Wykoff (17:22:17) :
It will only take minutes instead of hours now to determine that the climate is getting worse much faster than we imagined.
And it only took you seconds to know this is what will happen. 😉
“Mike McMillan (18:20:39) :
Garbage In, Garbage Out, Only faster !”
…or
As someone else on WUWT posted some time ago:
Uga in, Chucka out.
JimB
Andrew S (17:23:15) :
You’ve got to spend money to make money.
God I hate that this is true. Because they are spending my money, and your money, to make that money. And the money they are making is my money, and your money.
Gerry (16:46:33) :
More GIGO (garbage in, garbage out) for the IPCC’s sewer pipe.
Mike McMillan (18:20:39) :
Garbage In, Garbage Out, Only faster !
Frank K. (17:55:58) :
“GISS ModelE will perform simulations going back a full millennium and forward to 2100.”
This will be hilarious to observe – an undocumented, garbage FORTRAN code like Model E attempting to model climate for a full millennium. Your tax dollars at work!
…
My question is WHICH “past millennium” temperatures are they going to use as “input” and which are they going to use as output to “verify” their programs. “GIGO faster” also means that if the input equations are going to be compared to the Mann-Jones hokey stick temperatures (whether bristlecone pines or his latest “hurricane tempestology” repetition of the hockey stick tree rings), the output (100 years) CAN’T be correct because the INPUT (1000 years) is a blatant hoax.
Gene Nemetz (18:39:48) :
Andrew S (17:23:15) :
You’ve got to spend money to make money.
God I hate that this is true. Because they are spending my money, and your money, to make that money. And the money they are making is my money, and your money.
—–
Rather:
Because they are spending my money, and your money, to TAKE that money. And the money they are TAKING is my money, and your money.
All hail the new Oracle! Anybody notice there seems to be a significant similarity between these electromechanical oracles and the original that used to sniff natural gas at Delphi? They both have their priests and acolytes, they both must be sacrificed to and catered to, and they both had major influence over political leaders. And they both have equivalent track records.
Carbon-based Life Form (17:24:42) : ..according to Gavin’s estimation
Doesn’t it just SUCK that someone the caliber of Gavin Schmidt is one of the people who is going to get to play with this toy?! OMFG!!
So is this baby powered by windmills? Or fossil fuels? Does it come with an energy efficiency rating like water heaters? If so, what’s the rating?
And don’t tell me 42. That is NOT the answer to everything.
Sweet toy. Now we can compute garbage at an unprecedented rate.
Computers are useless. They can only give you answers.
— Pablo Picasso, painter (1881 – 1973)
This kind of smokescreen makes the climatologists think they are getting somewhere, when all that will happen is that they will become increasingly sure of their answers – while they will be no more accurate than before.
Climate change cannot be predicted by a single model no matter how complex that model.
Population growth models based on data from 1950–1970 are inapplicable to population growth patterns in 2009. Investment banking models were based on data from 1945–2005, during which time single-family housing prices in aggregate increased. The models based on these data predicted that single-home mortgages in aggregate had less than a 1% risk of default. Guess what happened?
Climatology’s reliance on models rather than on observation, experiment, and theory is likely to prove as disappointing as the similar reliance on econometric models in the mid-20th century and on the investment banking models that led to our current financial collapse.