
From NASA, new fossil fuel powered toys for the boys.
Related: “Giant sucking sound” heard in Greenbelt, MD when the switch was thrown.
Remember the day you got a brand-new computer? Applications snapped open, processes that once took minutes finished in seconds, and graphics and animation flowed as smoothly as TV video. But several months and many new applications later, the bloom was off the rose.
Your lightning-fast computer was no longer fast. You needed more memory and faster processors to handle the gigabytes of new files now stored on your machine.
Climate scientists can relate.
They, too, need more powerful computers to process the sophisticated computer models used in climate forecasts. Such an expanded capability is now being developed at NASA’s Goddard Space Flight Center in Greenbelt, Md.
High-End Computing System Installed
In August, Goddard added 4,128 new-generation Intel “Nehalem” processors to its Discover high-end computing system. The upgraded Discover will serve as the centerpiece of a new climate simulation capability at Goddard. Discover will host NASA’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC), the leading scientific organization for assessing climate change, and other national and international climate initiatives.
To further enhance Discover’s capabilities, Goddard will install another 4,128 Nehalem processors in the fall, bringing Discover to 15,160 processors.
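For what it is worth, that 15,160 figure lines up with the 11,032-core Discover total listed in the system details further down this post plus the planned fall addition. A trivial check, assuming those are the right counts to add:

```python
# Quick arithmetic check on the processor counts quoted in the article,
# using the 11,032-core Discover total from the system details below.

cores_after_august = 11_032   # Discover total once the first Nehalem unit was in
fall_addition = 4_128         # second Nehalem unit planned for the fall

print(cores_after_august + fall_addition)  # -> 15160, matching the figure quoted
```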
“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of Goddard’s Computational and Information Sciences and Technology Office (CISTO). “This new computing system represents a dramatic step forward in performance for climate simulations.”
Well-Suited for Climate Studies
According to CISTO lead architect Dan Duffy, the Nehalem architecture is especially well-suited to climate studies. “Speed is an inherent advantage for solving complex problems, but climate models also require large memory and fast access to memory,” he said. Each processor has 3 gigabytes of memory, among the highest available today. In addition, memory access is three to four times faster than Discover’s previous-generation processors.
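As a rough illustration of what that memory figure implies for the August addition, here is a back-of-the-envelope sketch. It assumes “processor” in the quote means an individual core (the spec list later in this post counts 4,128 cores in the new unit), so treat the totals as indicative only.

```python
# Back-of-the-envelope aggregate memory for the August Nehalem addition.
# Assumption: "3 gigabytes of memory per processor" refers to one core;
# the spec list later in this post gives 4,128 cores for the new unit.

nehalem_cores = 4_128   # cores in the August scalable unit
gb_per_core = 3         # gigabytes of memory per core, as quoted

total_gb = nehalem_cores * gb_per_core
print(f"~{total_gb:,} GB (~{total_gb / 1024:.1f} TB) across the new unit")
# -> ~12,384 GB (~12.1 TB)
```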
In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared with other nationally recognized high-end computing systems. The new computational capabilities also allowed NASA climate scientists to run high-resolution simulations that reproduced atmospheric features not previously seen in their models.
For instance, “features such as well-defined hurricane eyewalls and convective cloud clusters appeared for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.”
IPCC Simulations
For the IPCC studies, scientists will run both longer-term and shorter-term climate projections using different computer models. A climate model from the Goddard Institute for Space Studies will perform simulations going back a full millennium and forward to 2100. Goddard’s Global Modeling and Assimilation Office will use a climate model for projections of the next 30 years and an atmospheric chemistry-climate model for short-term simulations of chemistry-climate feedbacks. The IPCC will use information from climate simulations such as these in its Fifth Assessment Report, which the IPCC expects to publish in 2014.
NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.
News Release from NASA here
NASA Expands High-End Computing System for Climate Simulation
GREENBELT, Md., Aug. 24 /PRNewswire-USNewswire/ — NASA’s Goddard Space Flight Center in Greenbelt, Md., made available to scientists in August the first unit of an expanded high-end computing system that will serve as the centerpiece of a new climate simulation capability. The larger computer, part of NASA’s High-End Computing Program, will be hosting the agency’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC) and other national and international climate initiatives.
The expansion added 4,128 computer processors to Goddard’s Discover high-end computing system. The IBM iDataPlex “scalable unit” uses Intel’s newest Xeon 5500 series processors, which are based on the Nehalem architecture introduced in spring 2009.
Discover will be hosting climate simulations for the IPCC’s Fifth Assessment Report by the Goddard Institute for Space Studies (GISS) in New York City and Goddard’s Global Modeling and Assimilation Office (GMAO). Stimulus funds from the American Recovery and Reinvestment Act of 2009 will enable installation of another 4,128 Nehalem processors this fall, bringing Discover to 15,160 processors.
“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of the Computational and Information Sciences and Technology Office (CISTO) at Goddard. “This new computing system represents a dramatic step forward in performance for climate simulations.”
In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared to other nationally recognized high-end computing systems. Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.
“Nehalem architecture is especially well-suited to climate studies,” said Dan Duffy, CISTO lead architect. “Speed is an inherent advantage for solving complex problems, but climate models need large memory and fast access. We configured our Nehalem system to have 3 gigabytes of memory per processor, among the highest available today, and memory access is three to four times faster than Discover’s previous-generation processors.”
In daily forecasts for NASA satellite missions and field campaigns, the GMAO typically runs its flagship Goddard Earth Observing System Model, Version 5 (GEOS-5) at 27-kilometer resolution. Using Discover’s new processors, the GMAO has been testing a special “cubed-sphere” version of GEOS-5 at resolutions as high as 3.5 kilometers.
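To give a sense of why those extra processors matter, here is a generic cost-scaling sketch using the resolutions quoted above. It is a rough CFL-style argument about grid refinement in general, not a statement about GEOS-5’s actual runtime: horizontal grid columns grow with the square of the refinement factor, and an explicit model’s time step typically shrinks roughly in proportion to the grid spacing, adding another factor.

```python
# Generic back-of-the-envelope scaling for refining a model's horizontal
# resolution from 27 km to 3.5 km (the GEOS-5 figures quoted above).
# Rough CFL-style reasoning, not GEOS-5 benchmark data.

coarse_km, fine_km = 27.0, 3.5

refine = coarse_km / fine_km      # linear refinement factor (~7.7x)
columns = refine ** 2             # more horizontal grid columns (~60x)
timesteps = refine                # more time steps for the same simulated period

print(f"~{columns:.0f}x more grid columns, "
      f"~{columns * timesteps:.0f}x more work for the same simulated period")
# -> ~60x more grid columns, ~459x more work
```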
“Once the model goes below 10-kilometer resolution, features such as well-defined hurricane eyewalls and convective cloud clusters appear for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.” Putman has been collaborating with GMAO modeling lead Max Suarez and others on the cubed-sphere configuration of GEOS-5.
NASA’s IPCC simulations will include both longer- and shorter-term climate projections using the latest versions of the GISS and GMAO models. GISS ModelE will perform simulations going back a full millennium and forward to 2100. Making its first IPCC contributions, the GMAO will focus on the next 30 years and perform decadal prediction simulations using GEOS-5 and atmospheric chemistry-climate simulations using the GEOS Chemistry Climate Model. With the performance of GEOS-5 on the Nehalem processors, investigations of societal relevance, such as climate impacts on weather extremes, now reach a new level of realism. The IPCC’s Fifth Assessment Report is due to be published in 2014.
NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.
About the system…link
No mention of power consumption in the specs, though this internal report suggests “dramatically increased power consumption” at the Ames facility when it was similarly upgraded.
IBM iDataPlex Cluster: Discover Scalable Units 3 & 4
The NASA Center for Computational Sciences (NCCS) expanded its “Discover” high-end computer by integrating a 4,096-core IBM iDataPlex cluster, for a combined performance of 67 teraflops. The NCCS plans to at least double the iDataPlex processor count during 2009. Credit: Photo by NASA Goddard Space Flight Center/Pat Izzo.
Discover
- The Discover system is an assembly of multiple Linux scalable units built upon commodity components. The first scalable unit was installed in the fall of 2006, and the NCCS continues to expand this highly successful computing platform.
Discover derives its name from the NASA adage of “Explore. Discover. Understand.”
Discover System Details
- 49 Racks (compute, storage, switches, and more)
- 111 Tflop/s
- 11,032 Total Cores
- IBM GPFS
- 2.46 PB Storage
- Operating System: SLES
- Job Scheduler: PBS
- Compilers: C, C++, Fortran (Intel and PGI)
Individual Scalable Units
- Manufacturer: Linux Networx/SuperMicro
- 3.33 Tflop/s
- 520 Total Cores
- 2 dual-core processors per node
- 3.2 GHz Intel Xeon Dempsey (2 flop/s per clock)
- Production: 4Q 2006

- Manufacturer: Linux Networx/Dell
- 10.98 Tflop/s
- 1,032 Total Cores
- Dell 1950 Compute Nodes
- 2 dual-core processors per node
- 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 2Q 2007

- Manufacturer: Linux Networx/Dell
- 10.98 Tflop/s
- 1,032 Total Cores
- Dell 1950 Compute Nodes
- 2 dual-core processors per node
- 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 3Q 2007

- Manufacturer: IBM
- 20.64 Tflop/s
- 2,064 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 3Q 2008

- Manufacturer: IBM
- 20.64 Tflop/s
- 2,064 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 4Q 2008

- Manufacturer: IBM
- 46.23 Tflop/s
- 4,128 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.8 GHz Intel Xeon Nehalem (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 3Q 2009
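The peak Tflop/s figures in the list above follow directly from cores × clock rate × floating-point operations per clock; the short sketch below simply reproduces that arithmetic from the listed specs.

```python
# Reproduce the peak-performance figures in the scalable-unit list above:
# peak Tflop/s = total cores * clock (GHz) * flop/s per clock / 1000.

units = [
    ("Dempsey, 3.2 GHz",     520, 3.20, 2),
    ("Woodcrest, 2.66 GHz", 1032, 2.66, 4),
    ("Harpertown, 2.5 GHz", 2064, 2.50, 4),
    ("Nehalem, 2.8 GHz",    4128, 2.80, 4),
]

for name, cores, ghz, flop_per_clock in units:
    peak = cores * ghz * flop_per_clock / 1000.0
    print(f"{name:<22} {peak:6.2f} Tflop/s peak")
# Nehalem: 4,128 * 2.8 GHz * 4 = 46.23 Tflop/s, matching the listing.
```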
IBM Test System: Schirra
The High-End Computing Capability Project purchased this 6-teraflop IBM POWER5+ system as part of its effort to evaluate next-generation supercomputing technology that will eventually replace the Columbia supercomputer and meet NASA’s future computing needs.
Hyperwall-2 Visualization System
The 128-screen hyperwall-2 system is used to view, analyze, and communicate results from NASA’s high-fidelity modeling and simulation projects, leading to new discoveries.
New high-resolution climate simulations reproduce atmospheric features not previously seen in NASA computer models. Moving from 27-kilometer resolution (top) to 3.5-kilometer resolution (center) yields cloud clusters like those seen in observations from NOAA’s latest GOES satellite (bottom). Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin
A 24-hour simulation run at progressively higher resolutions captures finer and finer details of cloud “vortex streets” moving over Alaska’s Aleutian Islands. A vortex street is a train of alternating clockwise and anticlockwise eddies (swirls) shed in a flow interacting with some structure. Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin
Goddard Space Flight Center recently added 4,128 processors to its Discover high-end computing system, with another 4,128 processors to follow this fall. The expanded Discover will host NASA’s climate simulations for the Intergovernmental Panel on Climate Change. Credit: NASA/Pat Izzo

And all this computer equipment won’t replace intelligent understanding.
More GIGO (garbage in, garbage out) for the IPCC’s sewer pipe.
Why are they bothering? For years they’ve been telling us they’ve got the answer already.
It may be rubbish, but it’ll be high speed rubbish.
42. Now what was the question?
h/t to “Hitchhiker’s Guide…”
The faster computing will help climate modellers as resolution and process detail increases. However, the CO2 forcing mechanism is derived from very straightforward equations, and is the main driving cause of projected warming over the next century and beyond, according to the models quoted by the IPCC. So, in my humble opinion, the benefit will be more toward improved meteorological modelling rather than climate change magnitude prediction (aside from perhaps improved feedback analyses with higher spatial and temporal resolution in atmospheric and oceanic circulation and land surface-atmosphere interactions). Greater computing power will not likely have any sizeable effect on the magnitude of warming predicted. Improved precision, but little change in accuracy, would be my guess.
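To make the commenter’s point concrete: the “very straightforward equations” usually cited for direct CO2 forcing boil down to a one-line logarithmic expression (the simplified form from Myhre et al. 1998); the heavy computation goes into feedbacks, circulation, and resolution. The sensitivity parameter in this sketch is an assumed illustrative value, not something endorsed by the post.

```python
# Simplified CO2 radiative forcing (Myhre et al. 1998): dF = 5.35 * ln(C/C0) W/m^2.
# The lambda value below is an assumed, illustrative sensitivity parameter.

import math

def co2_forcing_wm2(c_ppm, c0_ppm=280.0):
    """Simplified forcing for a CO2 change from c0_ppm to c_ppm, in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

lam = 0.8                      # K per (W/m^2), assumed for illustration only
dF = co2_forcing_wm2(560.0)    # doubling CO2 from 280 to 560 ppm

print(f"Forcing for doubled CO2: {dF:.2f} W/m^2")
print(f"Equilibrium warming at lambda = {lam}: {lam * dF:.1f} K")
# -> ~3.71 W/m^2 and ~3.0 K; the modeling effort lives in the feedbacks,
#    not in this expression, which is the commenter's point.
```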
Or maybe high-speed hubris?
Cf. Fredric Brown, “Answer”:
http://www.alteich.com/oldsite/answer.htm
/Mr Lynn
This statement can’t be right:
“Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.”
I thought all atmospheric features were already taken into account . . . at least all the important ones . . . at least those needed to adequately guide multi-trillion-dollar policy decisions . . .
Re the Fredric Brown story, this one’s even more chilling:
Arthur C. Clarke, “The Nine Billion Names of God”:
http://lucis.net/stuff/clarke/9billion_clarke.html
“Overhead, without any fuss. . .”
/Mr Lynn
46 teraflops means they can arrive at the incorrect answer much faster.
Can’t wait to compare convective cloud clusters generated by GCMs with those observed in the real world.
As a programmer, I know it is the thinking that counts, not the processor that is running the thinking. Bigger, better, faster does not by any measure make the logic or results correct. But everyone here knows that. Too bad our government agencies have lost sight of science and are all about agenda. 45 days and counting; what NASA guy was on top of that?
How much does this cost?
In my country, too, the AGWers are always eager to renew their supercomputers at the expense of our taxes.
I’m an electronic engineer and therefore say .. cooool!
However, I always warn people that a machine is only as intelligent as its operator.
It will only take minutes instead of hours now to determine that the climate is getting worse much faster than we imagined.
Perhaps it’ll be like the Snapple mantra
“The best stuff on earth just got better”
Or not
It sounds impressive, until you see that hyperwall and realise that they are just wasting money while getting their jollies.
You’ve got to spend money to make money.
It might help in data processing (i.e. computing, software, etc.), but that’s about it.
Hey Folks, I am all for spending money to test the theory and separate fact from belief. Much better than throwing money at something we don’t understand. Let’s hold our government accountable for some quality control. I hope it will be more than the pittance they have applied so far, according to Gavin’s estimation.
OT: I have not read the book by Ian Plimer, and I have not read a comprehensive review of it here. Yet something seems strange: Plimer’s book seems a convenient distraction for both sides: the Realclimatists will easily dump on it, while the opposition thinks they are getting MSM mileage. One wonders if this whole business is a pre-Copenhagen setup designed to feature a somewhat caricatural figure opposing the true scientists… because so far, I have read more about Plimer on Realclimate and other pro-AGW blogs than on skeptical blogs! Anyone care to comment? Thanks.
Prepare for more computer made AGW.
Well, if they input climate data assuming that CO2 is the main climate driver, they won’t get much use out of it other than saying the same thing again.
Now they need to factor in the Sun, SSTs, and any other possible factor for a real climate analysis.
Also, when it comes to local weather forecasting and modeling, I noticed that Intellicast, where the chief meteorologist is an AGW skeptic, tends to have a bit more consistency, and many times more accuracy, in the first 7 days of its forecast for Wichita than the ones output by organizations in bed with the IPCC. They say we’re supposed to have a streak of 70-degree days (not that normal for August, by the way), and it seems like other forecast models are starting to creep in that direction. Also, I know the model used by AccuWeather really needs a rework in comparison; its forecasts for Wichita a few days out bounce around a bit and get a lot worse toward the end of the forecast period.
When you input the correct data with the correct factors, you tend to get better results.
Robert Wood (17:21:09) :
Neat. I’m an electronics engineer; can I check out your programming? 😛
(From an engineering perspective)
DaveE.
Pre-upgrade: increase in CO2 = 3.5 degree temperature increase
Post-upgrade: increase in CO2 = 3.54983539318 degree temperature increase