
From NASA, new fossil fuel powered toys for the boys.
Related: “Giant sucking sound” heard in Greenbelt, MD when the switch was thrown.
Remember the day you got a brand-new computer? Applications snapped open, processes that once took minutes finished in seconds, and graphics and animation flowed as smoothly as TV video. But several months and many new applications later, the bloom was off the rose.
Your lightning-fast computer was no longer fast. You needed more memory and faster processors to handle the gigabytes of new files now stored on your machine.
Climate scientists can relate.
They, too, need more powerful computers to process the sophisticated computer models used in climate forecasts. Such an expanded capability is now being developed at NASA’s Goddard Space Flight Center in Greenbelt, Md.
High-End Computing System Installed
In August, Goddard added 4,128 new-generation Intel “Nehalem” processors to its Discover high-end computing system. The upgraded Discover will serve as the centerpiece of a new climate simulation capability at Goddard. Discover will host NASA’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC), the leading scientific organization for assessing climate change, and other national and international climate initiatives.
To further enhance Discover’s capabilities, Goddard will install another 4,128 Nehalem processors in the fall, bringing Discover to 15,160 processors.
“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of Goddard’s Computational and Information Sciences and Technology Office (CISTO). “This new computing system represents a dramatic step forward in performance for climate simulations.”
Well-Suited for Climate Studies
According to CISTO lead architect Dan Duffy, the Nehalem architecture is especially well-suited to climate studies. “Speed is an inherent advantage for solving complex problems, but climate models also require large memory and fast access to memory,” he said. Each processor has 3 gigabytes of memory, among the highest amounts available today. In addition, memory access is three to four times faster than with Discover’s previous-generation processors.
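For scale, here is a minimal back-of-the-envelope sketch of what that per-processor memory implies in aggregate. The assumption that the 3 GB figure applies uniformly across all 15,160 processors is ours; the article quotes it only for the Nehalem units.

```python
# Back-of-the-envelope aggregate memory for the upgraded Discover.
# Assumption (ours): the 3 GB/processor figure quoted for the Nehalem
# units applies uniformly across all 15,160 processors.
processors = 15_160        # total after the fall 2009 expansion
mem_per_proc_gb = 3        # gigabytes per processor, per Dan Duffy

total_gb = processors * mem_per_proc_gb
print(f"Aggregate memory: {total_gb:,} GB (~{total_gb / 1024:.1f} TiB)")
# -> Aggregate memory: 45,480 GB (~44.4 TiB)
```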
In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared with other nationally recognized high-end computing systems. The new computational capabilities also allowed NASA climate scientists to run high-resolution simulations that reproduced atmospheric features not previously seen in their models.
For instance, “features such as well-defined hurricane eyewalls and convective cloud clusters appeared for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.”
IPCC Simulations
For the IPCC studies, scientists will run both longer-term and shorter-term climate projections using different computer models. A climate model from the Goddard Institute for Space Studies will perform simulations going back a full millennium and forward to 2100. Goddard’s Global Modeling and Assimilation Office will use a climate model for projections of the next 30 years and an atmospheric chemistry-climate model for short-term simulations of chemistry-climate feedbacks. The IPCC will use information from climate simulations such as these in its Fifth Assessment Report, which it expects to publish in 2014.
NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.
News Release from NASA here
NASA Expands High-End Computing System for Climate Simulation
GREENBELT, Md., Aug. 24 /PRNewswire-USNewswire/ — NASA’s Goddard Space Flight Center in Greenbelt, Md., made available to scientists in August the first unit of an expanded high-end computing system that will serve as the centerpiece of a new climate simulation capability. The larger computer, part of NASA’s High-End Computing Program, will be hosting the agency’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC) and other national and international climate initiatives.
The expansion added 4,128 computer processors to Goddard’s Discover high-end computing system. The IBM iDataPlex “scalable unit” uses Intel’s newest Xeon 5500 series processors, which are based on the Nehalem architecture introduced in spring 2009.
Discover will be hosting climate simulations for the IPCC’s Fifth Assessment Report by the Goddard Institute for Space Studies (GISS) in New York City and Goddard’s Global Modeling and Assimilation Office (GMAO). Stimulus funds from the American Recovery and Reinvestment Act of 2009 will enable installation of another 4,128 Nehalem processors this fall, bringing Discover to 15,160 processors.
“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of the Computational and Information Sciences and Technology Office (CISTO) at Goddard. “This new computing system represents a dramatic step forward in performance for climate simulations.”
In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared to other nationally recognized high-end computing systems. Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.
“Nehalem architecture is especially well-suited to climate studies,” said Dan Duffy, CISTO lead architect. “Speed is an inherent advantage for solving complex problems, but climate models need large memory and fast access. We configured our Nehalem system to have 3 gigabytes of memory per processor, among the highest available today, and memory access is three to four times faster than Discover’s previous-generation processors.”
In daily forecasts for NASA satellite missions and field campaigns, the GMAO typically runs its flagship Goddard Earth Observing System Model, Version 5 (GEOS-5) at 27-kilometer resolution. Using Discover’s new processors, the GMAO has been testing a special “cubed-sphere” version of GEOS-5 at resolutions as high as 3.5 kilometers.
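To see why the jump from 27-kilometer to 3.5-kilometer resolution is such a computational leap, consider a rough scaling sketch. The 1/dx² growth in grid columns and the CFL-limited timestep shrinking with dx are generic grid-model arithmetic, not NASA’s measured figures:

```python
# Rough cost scaling for refining a global model's horizontal grid,
# e.g. GEOS-5 going from 27 km to 3.5 km. Generic grid-model
# arithmetic, not NASA's measured numbers: horizontal grid columns
# grow ~1/dx^2, and a CFL-limited timestep shrinks ~dx, so total
# work per simulated day grows roughly ~1/dx^3.
coarse_km, fine_km = 27.0, 3.5

column_factor = (coarse_km / fine_km) ** 2  # more grid columns
timestep_factor = coarse_km / fine_km       # more timesteps per day
work_factor = column_factor * timestep_factor

print(f"~{column_factor:.0f}x the grid columns")          # ~60x
print(f"~{work_factor:.0f}x the work per simulated day")  # ~459x
```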
“Once the model goes below 10-kilometer resolution, features such as well-defined hurricane eyewalls and convective cloud clusters appear for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.” Putman has been collaborating with GMAO modeling lead Max Suarez and others on the cubed-sphere configuration of GEOS-5.
NASA’s IPCC simulations will include both longer- and shorter-term climate projections using the latest versions of the GISS and GMAO models. GISS ModelE will perform simulations going back a full millennium and forward to 2100. Making its first IPCC contributions, the GMAO will focus on the next 30 years and perform decadal prediction simulations using GEOS-5 and atmospheric chemistry-climate simulations using the GEOS Chemistry Climate Model. With the performance of GEOS-5 on the Nehalem processors, investigations of societal relevance, such as climate impacts on weather extremes, now reach a new level of realism. The IPCC’s Fifth Assessment Report is due to be published in 2014.
NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.
About the system…link
No mention of power consumption in the specs, though this internal report suggests “dramatically increased power consumption” at the Ames facility when it was similarly upgraded.
IBM iDataPlex Cluster: Discover Scalable Units 3 & 4
The NASA Center for Computational Sciences (NCCS) expanded its “Discover” high-end computer by integrating a 4,096-core IBM iDataPlex cluster, for a combined performance of 67 teraflops. The NCCS plans to at least double the iDataPlex processor count during 2009. Credit: Photo by NASA Goddard Space Flight Center/Pat Izzo.
Discover
- The Discover system is an assembly of multiple Linux scalable units built upon commodity components. The first scalable unit was installed in the fall of 2006, and the NCCS continues to expand this highly successful computing platform.
Discover derives its name from the NASA adage of “Explore. Discover. Understand.”
Discover System Details
- 49 Racks (compute, storage, switches, and more)
- 111 Tflop/s
- 11,032 Total Cores
- IBM GPFS
- 2.46 PB Storage
- Operating System: SLES
- Job Scheduler: PBS
- Compilers: C, C++, Fortran (Intel and PGI)
Individual Scalable Units
Linux Networx/SuperMicro (Production: 4Q 2006)
- 3.33 Tflop/s
- 520 Total Cores
- 2 dual-core processors per node
- 3.2 GHz Intel Xeon Dempsey (2 flop/s per clock)

Linux Networx/Dell (Production: 2Q 2007)
- 10.98 Tflop/s
- 1,032 Total Cores
- Dell 1950 Compute Nodes
- 2 dual-core processors per node
- 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
- Interconnect: Infiniband DDR

Linux Networx/Dell (Production: 3Q 2007)
- 10.98 Tflop/s
- 1,032 Total Cores
- Dell 1950 Compute Nodes
- 2 dual-core processors per node
- 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
- Interconnect: Infiniband DDR

IBM (Production: 3Q 2008)
- 20.64 Tflop/s
- 2,064 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
- Interconnect: Infiniband DDR

IBM (Production: 4Q 2008)
- 20.64 Tflop/s
- 2,064 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
- Interconnect: Infiniband DDR

IBM (Production: 3Q 2009)
- 46.23 Tflop/s
- 4,128 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.8 GHz Intel Xeon Nehalem (4 flop/s per clock)
- Interconnect: Infiniband DDR
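As a sanity check, the peak figures quoted above follow directly from cores × clock rate × floating-point operations per clock. A minimal sketch that reproduces them:

```python
# Sanity-check the listed peak performance of each Discover scalable
# unit: peak Tflop/s = cores * clock (GHz) * flop/s per clock / 1000.
units = [
    ("Dempsey",     520, 3.20, 2),  # Linux Networx/SuperMicro, 4Q 2006
    ("Woodcrest",  1032, 2.66, 4),  # Linux Networx/Dell, 2Q 2007
    ("Woodcrest",  1032, 2.66, 4),  # Linux Networx/Dell, 3Q 2007
    ("Harpertown", 2064, 2.50, 4),  # IBM iDataPlex, 3Q 2008
    ("Harpertown", 2064, 2.50, 4),  # IBM iDataPlex, 4Q 2008
    ("Nehalem",    4128, 2.80, 4),  # IBM iDataPlex, 3Q 2009
]

for name, cores, ghz, flops_per_clock in units:
    tflops = cores * ghz * flops_per_clock / 1000.0
    print(f"{name:>10}: {tflops:6.2f} Tflop/s")
# Reproduces the listed 3.33, 10.98, 20.64, and 46.23 Tflop/s figures.
```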
IBM Test System: Schirra
The High-End Computing Capability Project purchased this 6-teraflop IBM POWER5+ system as part of its effort to evaluate next-generation supercomputing technology that will eventually replace the Columbia supercomputer and meet NASA’s future computing needs.
Hyperwall-2 Visualization System
The 128-screen hyperwall-2 system is used to view, analyze, and communicate results from NASA’s high-fidelity modeling and simulation projects, leading to new discoveries.
New high-resolution climate simulations reproduce atmospheric features not previously seen in NASA computer models. Moving from 27-kilometer resolution (top) to 3.5-kilometer resolution (center) yields cloud clusters like those seen in observations from NOAA’s latest GOES satellite (bottom). Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin
A 24-hour simulation run at progressively higher resolutions captures finer and finer details of cloud “vortex streets” moving over Alaska’s Aleutian Islands. A vortex street is a train of alternating clockwise and anticlockwise eddies (swirls) shed in a flow interacting with some structure. Credit: NASA/William Putman, Max Suarez; NOAA/ Shian-Jiann Lin
Goddard Space Flight Center recently added 4,128 processors to its Discover high-end computing system, with another 4,128 processors to follow this fall. The expanded Discover will host NASA’s climate simulations for the Intergovernmental Panel on Climate Change. Credit: NASA/Pat Izzo
Perhaps what bothers me most is that they will spend a gazillion dollars on hardware, tout how fast and powerful it all is, then claim that their new output says “we are burning up far faster than we ever thought”… and, most frightening, “we know this is correct now because we have all of this supercomputing power to back it up”… sheeple will believe it…
Wrong answers, faster. It just doesn’t get any better than this.
They are spending all that money and sucking down all that power to do what now? Prove a single theory. Why this theory? What’s so darn special and unique about AGW doom & gloom that it gets exclusive rights to the feeding trough? This stuff is made up and moved around so much the left hand doesn’t know what the right hand is predicting.
Whatever happened to the system of proposals for instrument time?
Like HST. What is the justification for monopolizing the installation at the expense of all others?
So who told them to build more computing capacity and what to input?
The previous computer system’s fuzzy output.
It’s like using Hubble to read the plaque on Voyager.
You already know what it looks like and what it says.
Maybe someone can put it to good use, like computing when the next sunspot will appear. At least that result is testable.
All in my lifetime, I have either seen in person or spent time on, for a variety of engineering purposes: the IBM/360, IBM/370, a stand-alone IBM 4341, TI 960, 980 & 990/12, TI-ASC, an unknown CDC model, PDP-11, VAX 11/780, VAX 8600 et al., uVAX, Apollo workstation, a run of Crays, a series of obscure and forgotten GE mainframes, IBM’s Big Blue, the IBM PC, IBM clones, Apple Macintosh, Lisa, etc., Beowulf clusters, and now “The Discover high-end computing system”.
Next please.
A nice new shiny toy for faster garbage out.
Too bad the government will not spend, say, a quarter of the cost of this project on finding near Earth asteroids and predicting their potential orbits instead of wasting it on the likes of Hansen, Schmidt, et al.
What you see here is the result of telling the right people EXACTLY what they want to hear. You will then have all your career needs fulfilled. In this case it is a ginormous computer that when told what to produce, will produce much more of EXACTLY what those people want to hear. I apparently have this character flaw that causes me to want to tell the truth. With therapy, I hope to be able to correct this affliction, …
I have this cartoon running in my head where, with their newfound speed and high resolution, they run one of their models and for the first time it does something really, really stupid, because it can finally avoid rounding errors or something like that.
You know, something like predicting the ocean is at absolute zero after 5 years.
As the operator sits there and mutters — “That can’t be right?”
We can hope the new speed and precision will uncover some impossible solutions that will make someone someplace ask a few questions.
“is this real or is this Memorex?”
Larry
Andrew S (17:23:15) :
You’ve got to spend money to make money.
============
Too bad governments don’t make money in a business sense; they only print it.
Hope they’ve got the F77 compiler tuned. That’s what Hansen’s code is written in.
Now if they could only tune Hansen…
F77, ahem, state of the art. Choke.
How about Mathematica, which can provide proofs of the math on demand?
The Discovery System will now demand all controls be routed through it. “Dave, I can save the world now. I see it all so clearly”.
So, with an output already known in advance, what are they REALLY doing with this Monstrosity?
I went to see the Gypsy Woman. She told me I would meet a tall dark stranger, and that it would be a really hot day. I wasn’t sure, so I hired another 10,000 Gypsy Women to improve the accuracy of my forecast… (no insult intended to the Roma people)
Best,
Frank
“Hope they’ve got the F77 compiler tuned. That’s what Hansen’s code is written in.”
Last I heard there was even a little bit of Python code tossed in just for kicks.
Oh, and completely off topic, I had a chance to look at Roger Pielke Sr.’s blog today and noticed this statement concerning Gavin Schmidt’s understanding of boundary layer physics:
I wanted to take issue with that statement and tell him that I believe that Schmidt has an entirely adequate lack of knowledge.
Readers of WUWT will enjoy Dr. John Everett’s http://www.climatechangefacts.info/ for more information on the IPCC, climate modeling, and other aspects of climate science with an unbiased emphasis on the facts. Climate realists will appreciate the contentious folly of AGW theory as presented by Dr. Everett.
Now GISS can chase their tails at the speed of light! “Aye aye, Captain Hansen! Warp speed it is!”
CH
Robin W (19:24:14) : “Daisy, Daisy, give me your answer, do…”
Just what do you think you’re doing, Dave?
Now why in the world would they need this if the science was settled?
Interesting that NASA is fighting for its very life but still pissing away countless dollars on computers to perpetuate a fraud.
Ya, but can it beat Kasparov at chess?
If they cheat like they did with Deep Blue… maybe.
Andrew S (17:23:15) :
You’ve got to spend money to make money.
Nah… It’s the government. Their macroeconomic rules are different. For them, it really is:
“You’ve got to make money to spend money”
Or more correctly:
“You’ve got to print money to spend money”
Though today, the “printing” is mostly done by creating bits in certain databases in particular bank computers…
So more precisely:
“You’ve got to computer simulate money to spend money”
BTW, true story: when a certain past president decided to “freeze Iranian assets,” this was accomplished by posting a paper note on the PDP-11 at the Fed that handled international transfers at the time. When the appropriate computer person came on shift, they toggled the keys that disabled Iran’s bits from being moved to any other computer…
Aren’t computers wonderful?…
Gary P (20:01:33) :
What are the odds that the local USHCN Stevenson Screen thermometer box is sitting just outside the air conditioning units for this new toy?
That would be par for the course.
H/T to Douglas Adams…
When Deep Thought produced the answer 42, the solution was to build a bigger and better computer to find out what the actual question was:
“I speak of none other than the computer that is to come after me,” intoned Deep Thought, his voice regaining its accustomed declamatory tones. “A computer whose merest operational parameters I am not worthy to calculate — and yet I will design it for you. A computer which can calculate the Question to the Ultimate Answer, a computer of such infinite and subtle complexity that organic life itself shall form part of its operational matrix. And you yourselves shall take on new forms and go down into the computer to navigate its ten-million-year program! Yes! I shall design this computer for you. And I shall name it also unto you. And it shall be called … The Earth.”
Is that where this is going? Will GISS get out and look at the Earth before issuing a proclamation, or is it all models?
The role of Mr Hansen in all this?
But is it a Mac or a PC?
I take exception to the following statement:
“This new computing system represents a dramatic step forward in performance for climate simulations.”
It is, however, an interesting insight into the mindset of climate modellers and their perceptions of “performance.”
What a waste of government money (oops – I meant TAXPAYER money). For what they plan to do, they could have just trotted on down to ToysRUs and, for $22.95, gotten a genuine glow-in-the-dark Ouija board (http://www.toysrus.com/product/index.jsp?productId=2266493&CAWELAID=107526667), which would generate the answers they want more cheaply, faster, and without all that messy programming.