From NASA, new fossil fuel powered toys for the boys.
Related: “Giant sucking sound” heard in Greenbelt, MD when the switch was thrown.
Remember the day you got a brand-new computer? Applications snapped open, processes that once took minutes finished in seconds, and graphics and animation flowed as smoothly as TV video. But several months and many new applications later, the bloom fell off the rose.
Your lightning-fast computer no longer was fast. You needed more memory and faster processors to handle the gigabytes of new files now embedded in your machine.
Climate scientists can relate.
They, too, need more powerful computers to process the sophisticated computer models used in climate forecasts. Such an expanded capability is now being developed at NASA’s Goddard Space Flight Center in Greenbelt, Md.
High-End Computing System Installed
In August, Goddard added 4,128 new-generation Intel “Nehalem” processors to its Discover high-end computing system. The upgraded Discover will serve as the centerpiece of a new climate simulation capability at Goddard. Discover will host NASA’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC), the leading scientific organization for assessing climate change, and other national and international climate initiatives.
To further enhance Discover’s capabilities, Goddard will install another 4,128 Nehalem processors in the fall, bringing Discover to 15,160 processors.
“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of Goddard’s Computational and Information Sciences and Technology Office (CISTO). “This new computing system represents a dramatic step forward in performance for climate simulations.”
Well-Suited for Climate Studies
According to CISTO lead architect Dan Duffy, the Nehalem architecture is especially well-suited to climate studies. “Speed is an inherent advantage for solving complex problems, but climate models also require large memory and fast access to memory,” he said. Each processor has 3 gigabytes of memory, among the highest available today. In addition, memory access is three to four times faster than Discover’s previous-generation processors.
In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared with other nationally recognized high-end computing systems. The new computational capabilities also allowed NASA climate scientists to run high-resolution simulations that reproduced atmospheric features not previously seen in their models.
For instance, “features such as well-defined hurricane eyewalls and convective cloud clusters appeared for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.”
IPCC Simulations
For the IPCC studies, scientists will run both longer-term and shorter-term climate projections using different computer models. A climate model from the Goddard Institute for Space Studies will perform simulations going back a full millennium and forward to 2100. Goddard’s Global Modeling and Assimilation Office will use a climate model for projections of the next 30 years and an atmospheric chemistry-climate model for short-term simulations of chemistry-climate feedbacks. The IPCC will use information from climate simulations such as these in its Fifth Assessment Report, which IPCC expects to publish in 2014.
NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.
News Release from NASA here
NASA Expands High-End Computing System for Climate Simulation
WASHINGTON, DC UNITED STATES
GREENBELT, Md., Aug. 24 /PRNewswire-USNewswire/ — NASA’s Goddard Space Flight Center in Greenbelt, Md., made available to scientists in August the first unit of an expanded high-end computing system that will serve as the centerpiece of a new climate simulation capability. The larger computer, part of NASA’s High-End Computing Program, will be hosting the agency’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC) and other national and international climate initiatives.
The expansion added 4,128 computer processors to Goddard’s Discover high-end computing system. The IBM iDataPlex “scalable unit” uses Intel’s newest Xeon 5500 series processors, which are based on the Nehalem architecture introduced in spring 2009.
Discover will be hosting climate simulations for the IPCC’s Fifth Assessment Report by the Goddard Institute for Space Studies (GISS) in New York City and Goddard’s Global Modeling and Assimilation Office (GMAO). Stimulus funds from the American Recovery and Reinvestment Act of 2009 will enable installation of another 4,128 Nehalem processors this fall, bringing Discover to 15,160 processors.
“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of the Computational and Information Sciences and Technology Office (CISTO) at Goddard. “This new computing system represents a dramatic step forward in performance for climate simulations.”
In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared to other nationally recognized high-end computing systems. Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.
“Nehalem architecture is especially well-suited to climate studies,” said Dan Duffy, CISTO lead architect. “Speed is an inherent advantage for solving complex problems, but climate models need large memory and fast access. We configured our Nehalem system to have 3 gigabytes of memory per processor, among the highest available today, and memory access is three to four times faster than Discover’s previous-generation processors.”
In daily forecasts for NASA satellite missions and field campaigns, the GMAO typically runs its flagship Goddard Earth Observing System Model, Version 5 (GEOS-5) at 27-kilometer resolution. Using Discover’s new processors, the GMAO has been testing a special “cubed-sphere” version of GEOS-5 at resolutions as high as 3.5 kilometers.
“Once the model goes below 10-kilometer resolution, features such as well-defined hurricane eyewalls and convective cloud clusters appear for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.” Putman has been collaborating with GMAO modeling lead Max Suarez and others on the cubed-sphere configuration of GEOS-5.
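A rough back-of-envelope sketch (my arithmetic, not a NASA figure) of why the jump from 27-kilometer to 3.5-kilometer resolution is so expensive: horizontal grid columns scale with the square of the refinement factor, and the CFL stability condition shrinks the timestep roughly in proportion to grid spacing, so total cost scales roughly with the cube.

```python
# Rough cost scaling for refining a global model from 27 km to 3.5 km
# grid spacing. Back-of-envelope only: real costs also depend on
# vertical levels, I/O, and parallel efficiency.
refine = 27.0 / 3.5          # ~7.7x finer horizontal spacing
columns = refine ** 2        # ~60x more horizontal grid columns
total_cost = refine ** 3     # ~460x more work once the CFL-limited
                             # timestep shrinks with grid spacing
print(f"{refine:.1f}x spacing, {columns:.0f}x columns, ~{total_cost:.0f}x cost")
```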
NASA’s IPCC simulations will include both longer- and shorter-term climate projections using the latest versions of the GISS and GMAO models. GISS ModelE will perform simulations going back a full millennium and forward to 2100. Making its first IPCC contributions, the GMAO will focus on the next 30 years and perform decadal prediction simulations using GEOS-5 and atmospheric chemistry-climate simulations using the GEOS Chemistry Climate Model. With the performance of GEOS-5 on the Nehalem processors, investigations of societal relevance, such as climate impacts on weather extremes, now reach a new level of realism. The IPCC’s Fifth Assessment Report is due to be published in 2014.
NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.
About the system…link
No mention of power consumption in the specs, though this internal report suggests “dramatically increased power consumption” at the Ames facility when similarly upgraded.
IBM iDataPlex Cluster: Discover Scalable Units 3 & 4
The NASA Center for Computational Sciences (NCCS) expanded its “Discover” high-end computer by integrating a 4,096-core IBM iDataPlex cluster, for a combined performance of 67 teraflops. The NCCS plans to at least double the iDataPlex processor count during 2009. Credit: Photo by NASA Goddard Space Flight Center/Pat Izzo.
Discover
- The Discover system is an assembly of multiple Linux scalable units built upon commodity components. The first scalable unit was installed in the fall of 2006, and the NCCS continues to expand this highly successful computing platform.
Discover derives its name from the NASA adage of “Explore. Discover. Understand.”
Discover System Details
- 49 Racks (compute, storage, switches, and more)
- 111 Tflop/s
- 11,032 Total Cores
- IBM GPFS
- 2.46 PB Storage
- Operating System: SLES
- Job Scheduler: PBS
- Compilers: C, C++, Fortran (Intel and PGI)
Individual Scalable Units
- Manufacturer: Linux Networx/SuperMicro
- 3.33 Tflop/s
- 520 Total Cores
- 2 dual-core processors per node
- 3.2 GHz Intel Xeon Dempsey (2 flop/s per clock)
- Production: 4Q 2006
- Manufacturer: Linux Networx/Dell
- 10.98 Tflop/s
- 1,032 Total Cores
- Dell 1950 Compute Nodes
- 2 dual-core processors per node
- 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 2Q 2007
- Manufacturer: Linux Networx/Dell
- 10.98 Tflop/s
- 1,032 Total Cores
- Dell 1950 Compute Nodes
- 2 dual-core processors per node
- 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 3Q 2007
- Manufacturer: IBM
- 20.64 Tflop/s
- 2,064 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 3Q 2008
- Manufacturer: IBM
- 20.64 Tflop/s
- 2,064 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 4Q 2008
- Manufacturer: IBM
- 46.23 Tflop/s
- 4,128 Total Cores
- IBM iDataPlex Compute Nodes
- 2 quad-core processors per node
- 2.8 GHz Intel Xeon Nehalem (4 flop/s per clock)
- Interconnect: Infiniband DDR
- Production: 3Q 2009
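The per-unit Tflop/s figures above can be sanity-checked directly: peak performance is cores × clock rate × floating-point results per clock. A quick sketch using only the numbers from the list (the names are just shorthand for the processor generations listed):

```python
# Theoretical peak of each Discover scalable unit from the specs above:
# cores x clock (Hz) x flop results per clock cycle.
units = [
    # (processor,   cores, clock_hz, flops_per_clock)
    ("Dempsey",       520, 3.20e9, 2),
    ("Woodcrest",    1032, 2.66e9, 4),
    ("Harpertown",   2064, 2.50e9, 4),
    ("Nehalem",      4128, 2.80e9, 4),
]
peaks = {}
for name, cores, hz, fpc in units:
    peaks[name] = cores * hz * fpc / 1e12   # Tflop/s
    print(f"{name}: {peaks[name]:.2f} Tflop/s")
# Matches the listed 3.33, 10.98, 20.64, and 46.23 Tflop/s figures.
```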
IBM Test System: Schirra
The High-End Computing Capability Project purchased this 6-teraflop IBM POWER5+ system as part of its effort to evaluate next-generation supercomputing technology that will eventually replace the Columbia supercomputer, to meet NASA’s future computing needs.
Hyperwall-2 Visualization System
The 128-screen hyperwall-2 system is used to view, analyze, and communicate results from NASA’s high-fidelity modeling and simulation projects, leading to new discoveries.
New high-resolution climate simulations reproduce atmospheric features not previously seen in NASA computer models. Moving from 27-kilometer resolution (top) to 3.5-kilometer resolution (center) yields cloud clusters like those seen in observations from NOAA’s latest GOES satellite (bottom). Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin
A 24-hour simulation run at progressively higher resolutions captures finer and finer details of cloud “vortex streets” moving over Alaska’s Aleutian Islands. A vortex street is a train of alternating clockwise and anticlockwise eddies (swirls) shed in a flow interacting with some structure. Credit: NASA/William Putman, Max Suarez; NOAA/ Shian-Jiann Lin
Goddard Space Flight Center recently added 4,128 processors to its Discover high-end computing system, with another 4,128 processors to follow this fall. The expanded Discover will host NASA’s climate simulations for the Intergovernmental Panel on Climate Change. Credit: NASA/Pat Izzo
And all this computer equipment won’t replace intelligent understanding.
More GIGO (garbage in, garbage out) for the IPCC’s sewerpipe.
Why are they bothering? For years they’ve been telling us they’ve got the answer already.
It may be rubbish, but it’ll be high speed rubbish.
42 – now what was the question?
ht to “Hitchhiker’s Guide….”
The faster computing will help climate modellers as resolution and process detail increase. However, the CO2 forcing mechanism is derived from very straightforward equations, and according to the models quoted by the IPCC it is the main driving cause of projected warming over the next century and beyond. So, in my humble opinion, the benefit will lean more toward improved meteorological modelling than toward climate change magnitude prediction (aside perhaps from improved feedback analyses, given higher spatial and temporal resolution in atmospheric and oceanic circulation and in land surface-atmosphere interactions). Greater computing power will not likely have any sizeable effect on the magnitude of warming predicted. Improved precision, but little change in accuracy, would be my guess.
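For context, the “very straightforward equations” this comment refers to usually come down to the widely quoted simplified forcing expression ΔF = 5.35 ln(C/C0) W/m² (Myhre et al., 1998). A minimal sketch; the sensitivity parameter below is an illustrative assumption, not a value from the article:

```python
import math

# Simplified CO2 radiative forcing (Myhre et al., 1998):
# delta_F = 5.35 * ln(C / C0), in W/m^2.
def co2_forcing(c_ppm, c_ref_ppm=280.0):
    return 5.35 * math.log(c_ppm / c_ref_ppm)

# Climate sensitivity parameter: purely illustrative assumption.
SENSITIVITY_K_PER_WM2 = 0.8

dF = co2_forcing(560.0)   # a doubling from the preindustrial 280 ppm
print(f"forcing ~{dF:.2f} W/m^2, implied warming ~{SENSITIVITY_K_PER_WM2 * dF:.1f} K")
```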
Or maybe high-speed hubris?
Cf. Fredric Brown, “Answer”:
http://www.alteich.com/oldsite/answer.htm
/Mr Lynn
This statement can’t be right:
“Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.”
I thought all atmospheric features were already taken into account . . . at least all the important ones . . . at least those needed to adequately guide multi-trillion-dollar policy decisions . . .
Re the Fredric Brown story, this one’s even more chilling:
Arthur C. Clarke, “The Nine Billion Names of God”:
http://lucis.net/stuff/clarke/9billion_clarke.html
“Overhead, without any fuss. . .”
/Mr Lynn
46 teraflops means they can arrive at the incorrect answer much faster.
Can’t wait to compare convective cloud clusters generated by GCMs with those observed in the real world.
As a programmer it is the thinking that counts not the processor that is running the thinking. Bigger, better, faster does not by any measure make the logic or results correct. But everyone here knows that. Too bad our government agencies have lost sight of science and are all about agenda. 45 days and counting, what NASA guy was on top of that?
How much does this cost?
In my country, too, the AGWers are always eager to renew their supercomputer at the expense of our taxes.
I’m an electronic engineer and therefore say .. cooool!
However, I always warn people that a machine is only as intelligent as its operator.
It will only take minutes instead of hours now to determine that the climate is getting worse much faster than we imagined.
Perhaps it’ll be like the Snapple mantra
“The best stuff on earth just got better”
Or not
It sounds impressive, until you see that hyperwall and realise that they are just wasting money while getting their jollies.
You’ve got to spend money to make money.
It might help in data processing (ie: computing, software etc), that’s about it.
Hey Folks, I am all for spending money to test the theory and separate fact from belief. Much better than throwing money at something we don’t understand. Let’s hold our government accountable for some quality control. I hope it will be more than the pittance they have applied so far, according to Gavin’s estimation.
OT: I have not read the book by Ian Plimer, and I have not read a comprehensive review here. Yet something seems strange: Plimer’s book seems a convenient distraction for both sides: the Realclimatists will easily dump on it while the opposition thinks they are getting MSM mileage. One wonders if this whole business is a pre-Copenhagen set-up to feature a somewhat caricatural figure opposing the true scientists… because so far, I’ve read more about Plimer on Realclimate and other pro-AGW blogs than on skeptical blogs! Anyone care to comment? Thanks
Prepare for more computer made AGW.
Well if they input climate data assuming that CO2 is the main climate driver they won’t get much use out of it other than saying the same thing again.
Now they need to factor in the Sun, SST’s and any other possible factor for a real climate analysis.
Also, when it comes to local weather forecasting and modeling, I’ve noticed that Intellicast, where the chief meteorologist is an AGW skeptic, tends to have a bit more consistency, and often more accuracy, in the first 7 days of its forecast for Wichita than the ones output by organizations in bed with the IPCC. They say we’re supposed to have a streak of 70-degree days (not that normal for August, by the way), and it seems like other forecast models are starting to creep in that direction. I also know the model used by AccuWeather in comparison really needs a re-work; its forecasts for Wichita a few days out bounce around a bit and get a lot worse toward the end of the forecast period.
When you input the correct data with the correct factors, you tend to get better results.
Robert Wood (17:21:09) :
Neat. I’m an electronics engineer, can I check out your programming 😛
(From an engineering perspective)
DaveE.
Pre upgrade: Increase in CO2= 3.5 degree temperature increase
Post upgrade: increase in CO2=3.54983539318 degree temperature increase
More expensive and faster junk science.
I sure hope they got the free upgrade to Windows 7 when it comes out 😉
When a computer model can tell with any accuracy whether the weather will be wet next week, I’ll start to pay attention to the much more complex climate models.
I wonder if the new machines are still running the old FORTRAN code ?
Hey Giss. Can your new computer tell us when it’s going to rain in San Antonio?
Maybe they could have better spent the money attracting first rate mathematicians with their paper notebooks and chalkboards. Maybe a laptop running Mathematica.
“GISS ModelE will perform simulations going back a full millennium and forward to 2100.”
This will be hilarious to observe – an undocumented, garbage FORTRAN code like Model E attempting to model climate for a full millennium. Your tax dollars at work!
I hope everyone who knows someone who has lost a job, or is themselves hurting in our current economy, looks at this expenditure as the colossal waste of money that it is – turning over millions to the incompetent numerical analysts of the GISS…
Lighten up, folks. In the long run this is a good thing. I am much happier with a fast computer and high-speed internet connection. For a few days after I got these I did the same things I’d been doing – but then I learned I could do other things I had not been able to do before.
So, they will transfer their simulations and data to the new hardware and, at first, produce the same answers. But they will soon find they have time to insert other data and other functions and better resolution. Eventually, they will find that things more complex than CO2 driven warming are involved and as these are incorporated the role of GHGs will decrease in their models.
None of this is likely to happen before Copenhagen, of course. But even as the western economies stagnate under “cap and tax” laws we will begin to understand (better) the natural variability of Earth’s climate.
Science is good. Politics – not so much.
I bet the first program they run on it will be DOOM 2
Garbage In, Garbage Out, Only faster !
Maybe they’ll run folding@home on it and we’ll get some productive use out of our tax dollars.
If Henrik Svensmark, Piers Corbyn, Roy Spencer, and Anthony Watts (and who they would choose to help them) were in charge of the inputs I’d feel a lot better about the outputs of this toy.
“Smokey (17:19:59) :
46 teraflops …”
What could go wrong?
JimB
I hope the application has sufficient granularity to take advantage of the extra processing power. Otherwise it might not run much faster.
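The granularity concern above is essentially Amdahl’s law: if a fraction s of a run is inherently serial, no processor count pushes the speedup past 1/s. A minimal sketch with illustrative serial fractions (assumptions, not measured Discover numbers):

```python
# Amdahl's law: speedup on n processors when a fraction s of the work
# is serial is 1 / (s + (1 - s) / n), bounded above by 1 / s.
def amdahl_speedup(serial_fraction, n_procs):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

for s in (0.01, 0.05, 0.10):
    sp = amdahl_speedup(s, 4128)    # 4,128 cores, per the upgrade
    print(f"serial fraction {s:.0%}: speedup {sp:.1f}x (ceiling {1 / s:.0f}x)")
```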
Chuckle !!!!!!!!!
Now they can be wrong 100 times faster !!!
Robert Wykoff (17:22:17) :
It will only take minutes instead of hours now to determine that the climate is getting worse much faster than we imagined.
And it only took you seconds to know this is what will happen. 😉
“Mike McMillan (18:20:39) :
Garbage In, Garbage Out, Only faster !”
…or
As someone else on WUWT posted some time ago:
Uga in, Chucka out.
JimB
Andrew S (17:23:15) :
You’ve got to spend money to make money.
God I hate that this is true. Because they are spending my money, and your money, to make that money. And the money they are making is my money, and your money.
Gerry (16:46:33) :
More GIGO (garbage in, garbage out) for the IPCC’s sewerpipe.
Mike McMillan (18:20:39) :
Garbage In, Garbage Out, Only faster !
Frank K. (17:55:58) :
“GISS ModelE will perform simulations going back a full millennium and forward to 2100.”
This will be hilarious to observe – an undocumented, garbage FORTRAN code like Model E attempting to model climate for a full millennium. Your tax dollars at work!
…
My question is WHICH “past millennium” temperatures are they going to use as “input” and which are they going to use as output to “verify” their programs. “GIGO faster” also means that if the input equations are going to be compared to the Mann-Jones hokey stick temperatures (whether bristlecone pines or his latest “hurricane tempestology” repetition of the hockey stick tree rings), the output (100 years) CAN’T be correct because the INPUT (1000 years) is a blatant hoax.
Gene Nemetz (18:39:48) :
Andrew S (17:23:15) :
You’ve got to spend money to make money.
God I hate that this is true. Because they are spending my money, and your money, to make that money. And the money they are making is my money, and your money.
—–
Rather:
Because they are spending my money, and your money, to TAKE that money. And the money they are TAKING is my money, and your money.
All hail the new Oracle! Anybody notice there seems to be a significant similarity between these electromechanical oracles and the original that used to sniff natural gas at Delphi? They both have their priests and acolytes, they both must be sacrificed to and catered to, and they both had major influence over political leaders. And they both have equivalent track records.
Carbon-based Life Form (17:24:42) : ..according to Gavin’s estimation
Doesn’t it just SUCK that someone the caliber of Gavin Schmidt is one of the people who is going to get to play with this toy?! OMFG!!
So is this baby powered by windmills? Or fossil fuels? Does it come with an energy efficiency rating like waterheaters? If so, what’s the rating?
And don’t tell me 42. That is NOT the answer to everything.
Sweet toy. Now we can compute garbage at an unprecedented rate.
Computers are useless. They can only give you answers.
— Pablo Picasso, painter (1881 – 1973)
This kind of smokescreen makes the climatologists think they are getting somewhere, when all that will happen is that they will become increasingly sure of their answers – while they will be no more accurate than before.
Climate change cannot be predicted by a single model no matter how complex that model.
Population growth models based on data from 1950–1970 are inapplicable to population growth patterns in 2009. Investment banking models were based on data from 1945–2005, during which time single-family housing prices in aggregate increased. The models based on these data predicted that single-home mortgages in aggregate had less than a 1% risk of default. Guess what happened?
The results of this reliance in climatology on models rather than on observation, experiment, and theory are likely to be as disappointing as similar reliance on econometric models was in the mid-20th century, and on the investment banking models that led to our current financial collapse.
By the way, you can add comments to the article that Anthony linked to at NASA
http://www.nasa.gov/topics/earth/features/climate_computing.html
I just did – we’ll see if they post my comment tomorrow. I simply asked them to state how much taxpayer money was spent on the new processors.
No matter how many bones and entrails soothsayers use, no matter how shiny their rocks, no matter how pretty the pictures on their astrology cards, no matter how the dice sound when they roll the I Ching, no matter how spiffy or fast the processors or which software language they use – climate scientists, oh sorry, climate soothsayers and doomsayers will not be able to accurately simulate the ACTUAL and REAL WORLD weather and climate systems of Earth. It’s just not possible to accurately simulate, with predictive results of any value, a real system that needs to play itself out in real time, as the climate and weather systems do. This was proved mathematically and logically by Stephen Wolfram in A New Kind of Science, Chapter 2. How come? Because certain systems – such as weather and climate – generate randomness from within the system itself as part of their inherent nature; this isn’t randomness from initial conditions, nor from chaos theory, it’s randomness inherent in the system itself. You can’t simulate them; they must play out in real time in objective reality. It’s actually even worse, given the vast number of different systems that must be integrated into a whole-earth simulation, and that many of those systems also have inherent randomness in them.
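For readers curious about the intrinsic-randomness claim, the standard illustration in A New Kind of Science is the rule 30 cellular automaton: a fully deterministic rule and a one-cell seed still produce an irregular center column. A minimal sketch:

```python
# Rule 30 cellular automaton: the next cell is bit (4*left + 2*center
# + right) of the number 30. Deterministic rule, single-cell seed, yet
# the center column looks random.
RULE = 30

def step(cells):
    n = len(cells)
    return [(RULE >> (cells[(i - 1) % n] * 4
                      + cells[i] * 2
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]

cells = [0] * 31
cells[15] = 1                 # single seed cell in the middle
center = []
for _ in range(16):           # 16 rows; the pattern just fits width 31
    center.append(cells[15])
    cells = step(cells)
print("".join(map(str, center)))   # irregular-looking bit stream
```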
So soothsay and doomsay away Nasa and IPCC. Gotta love those shiny new computers. Heck what am I saying? I could get a high paying programming systems analysis job there… nope resist the temptation… [:|]
Now what to do with that spiffy parallel computer that Nasa has that could actually contribute to the world?
What’s the carbon footprint of that monstrosity ?
If they’d only spend 1/100000000 of the money on open minds instead of closed ones. Like someone else said why bother, the science is in, now move along — nothing to see here.
pwl (19:04:11) :
Maybe Boeing could have made better use of it 😀
DaveE.
OK boys. Let’s see how this baby does. How much of a temperature increase was it that you wanted…
I imagine that if this monstrous computer gives NASA’s “climate scientists” simulation results that don’t support their biases, they’ll be in the control room with hacksaws and the Discover system will be singing “Daisy, Daisy, give me your answer, do…”
Does anyone know what this cost me? I’m fed up with the government wasting my money.
Jesse
My father-in-law has spent much of his eight decades on this planet raising cows in East Texas. He can spit his tobacco juice in the sand (watch your foot!), stare at it for a moment and predict Mother Nature’s intentions for the coming growing season with startling accuracy! He’s never met a teraflop, but can relate many a story about cowflop. He’s more afraid of his Congressman than he is of cow farts. NASA should ask him…
Another GIGO…GIGO….GIGO Vote!
Remember the MAINE!
Remember the Boeing 787!
Remember the Connecticut 1978 Ice Arena Roof Collapse (first totally “computer” designed truss roof structure)!
Climate models will ultimately depend on quantifying the magnitude and even the sign of the water feedback. Quantitatively evaluating this will require very detailed modeling of clouds and precipitation rather than high-level models with numerous assumptions. That is the greatest opportunity for these new “toys”.
McIntyre reports that these papers are being presented at: Erice 2009
See also
Satellite and Climate Model Evidence Against Substantial Manmade Climate Change by Roy W. Spencer, Ph.D.
December 27, 2008 (last modified December 29, 2008)
Otherwise GIGO will become “FGIFGO” (Faster Garbage In Faster Garbage Out).
Is the new system powered by solar power? I hope it doesn’t create too large of a carbon footprint.
All that power, and they already know the answer…..
What are the odds that the local USHCN Stevenson Screen thermometer box is sitting just outside the air conditioning units for this new toy?
The beauty of that is all they need to do is turn it on to prove global warming!
Wow, a great post. Interesting to know about environmental actions taken by NASA.
NASA just described climate projections as “forecasts.” I was under the impression that this was a big no-no that only denier cave men would do.
Gary P (20:01:33) : Nice Double Entendre.
The new computer power will enable NASA to get garbage in/garbage out much faster as long as the Goddard Institute for Space Studies has Hansen working for them and his faulty programming.
I’m a bit dismayed about how computer models have come to be more important than actual observations and so I offer a formal statement of the Scientific Computer Modeling Method.
The Scientific Method
1. Observe a phenomenon carefully.
2. Develop a hypothesis that possibly explains the phenomenon.
3. Perform a test in an attempt to disprove or invalidate the hypothesis. If the hypothesis is disproven, return to steps 1 and 2.
4. A hypothesis that stubbornly refuses to be invalidated may be correct. Continue testing.
The Scientific Computer Modeling Method
1. Observe a phenomenon carefully.
2. Develop a computer model that mimics the behavior of the phenomenon.
3. Select observations that conform to the model predictions and dismiss observations as of inadequate quality that conflict with the computer model.
4. In instances where all of the observations conflict with the model, “refine” the model with fudge factors to give a better match with pesky facts. Assert that these factors reveal fundamental processes previously unknown in association with the phenomenon. Under no circumstances willingly reveal your complete data sets, methods, or computer codes.
5. Upon achieving a model of incomprehensible complexity that still somewhat resembles the phenomenon, begin to issue to the popular media dire predictions of catastrophe that will occur as far in the future as possible, at least beyond your professional lifetime.
6. Continue to “refine” the model in order to maximize funding and the awarding of Nobel Prizes.
7. Dismiss as unqualified, ignorant, and conspiracy theorists all who offer criticisms of the model.
Repeat steps 3 through 7 indefinitely.
“What’s the carbon footprint of that monstrosity?”
It’s its own Global Warming Heat Island!
The carbon footprint of this monstrosity will tilt the planet off its orbit so we dive into the Sun or smash into Mars….
Like any computer geek that may be a reader/contributor here at WUWT, I would love to have one of those babies myself. However, I cannot help but be left with a sick feeling in my stomach when I consider the enormous resources being applied to this endeavor as I browse through the Model-E code, realizing that this is the garbage that will be running on that beautiful piece of machinery. I am certainly a huge proponent of mass computing power, as I would consider myself one of the premier gadget freaks, but then I look at the unfathomable amount of debt piling up: Stimulus Bill, Omnibus Bill, Bailout Bills, More Bailout Bills, Cash for Clunkers mess, Climate Bill, upcoming Health Care Bill. When does it end? We are already bankrupt! About 6 years ago I was approaching bankruptcy, and the last thing on my mind was popping out to Alienware and buying a sporty new top-of-the-line play toy. I’m sorry, but with all that is going on in our country right now, and the fact that NASA will still be relying on people like Gavin Schmidt and James Hansen to program, operate, and massage the data from this cluster-F, I simply cannot endorse such a notion, and I believe this is just one more example of our fine government’s waste in action. … To gain a little perspective on this, try this link on for size: http://usdebtclock.org/ … tough to get too excited about spending more money on toys when you put it into perspective, don’t you think?
Sorry to be such a downer here .. it’s just that sometimes reality can smack you in the face pretty hard.
But, do they have enough compute power to take into account the sun?
I doubt this will be enough compute power to fill the knowledge gap. What man knows about the earth’s climate would fill a book; what man doesn’t know would fill a library. Until someone finds the missing library full of books, it sounds like it will be nothing more than a waste of badly needed electricity.
You would have thought that in the case of AGW, the theory predicts that there should be a measurable hotspot in the upper troposphere. Observations from satellites and weather balloons have been made, and no hotspot has been found. In a less politicized field, this would have been the end of it.
Need new theory.
Nice one Roy, may I post your comparison steps elsewhere?
That’s great. Now the models can be wrong faster.
Do the new systems address their lack of QC? Because that seems to be what’s lacking and probably would have been money better spent.
FORTRAN ???
So James can run programs like this real fast.
program warming_world
print *, "The Planet is getting warmer!"
end program warming_world
And then proclaim it’s evidence.
Perhaps what bothers me most is that they will spend a gazillion dollars on hardware, tout all about how fast and powerful it is, then claim that their new output says “we are burning up far faster than we ever thought” .. and the most frightening, “we know this is correct now because we have all of this super computing power to back it up” .. sheeple will believe it…
Wrong answers, faster. It just doesn’t get any better than this.
They are spending all that money and sucking down all that power to do what now? Prove a single theory. Why this theory? What’s so darn special and unique about AGW doom & gloom that it gets exclusive rights to the feeding trough? This stuff is made up and moved around so much the left hand doesn’t know what the right hand is predicting.
Whatever happened to the system of proposals for instrument time????
Like HST. What is the justification for monopolizing the installation at the expense of all others?
So who told them to build more computing capacity and what to input?
The previous computer system’s fuzzy output.
It’s like using Hubble to read the plaque on Voyager.
You already know what it looks like and what it says.
Maybe someone can put it to good use, like computing when the next sunspot will appear. At least that result is testable.
All in my lifetime, either I have seen (in person) or spent time on, to date for a variety of engineering purposes: the IBM/360, IBM/370, a stand-alone IBM 4341, TI960, 980 & 990/12, TI-ASC, an unknown CDC model, PDP 11, VAX 11/780, VAX 8600 et al, uVAX, Apollo workstation, a run of Crays, a series of obscure and forgotten GE mainframes, IBM’s Big Blue, the IBM PC, IBM clones, Apple Macintosh, Lisa etc., Beowulf clusters, and now “The Discover high-end computing system”.
Next please.
.
.
A nice new shiny toy for faster garbage out.
Too bad the government will not spend, say, a quarter of the cost of this project on finding near Earth asteroids and predicting their potential orbits instead of wasting it on the likes of Hansen, Schmidt, et al.
What you see here is the result of telling the right people EXACTLY what they want to hear. You will then have all your career needs fulfilled. In this case it is a ginormous computer that when told what to produce, will produce much more of EXACTLY what those people want to hear. I apparently have this character flaw that causes me to want to tell the truth. With therapy, I hope to be able to correct this affliction, …
I have this cartoon running in my head where, with their newfound speed and high resolution, they run one of their models and for the first time it does something really really really stupid, because it can finally avoid rounding errors or something like that.
You know something like predict the ocean is at absolute zero after 5 years.
As the operator sits there and mutters — “That can’t be right?”
We can hope the new speed and precision will uncover some impossible solutions that will make someone someplace ask a few questions.
“is this real or is this Memorex?”
Larry
Andrew S (17:23:15) :
You’ve got to spend money to make money.
============
Too bad governments don’t make money in a business sense, they only print it.
Hope they’ve got the F77 compiler tuned. That’s what Hansen’s code is.
Now if they could only tune Hansen…
F77, ahem, state of the art. Choke.
How about Mathematica, which is something that can provide proofs of the math on demand?
The Discovery System will now demand all controls be routed through it. “Dave, I can save the world now. I see it all so clearly”.
So, with an output already known in advance, what are they REALLY doing with this Monstrosity?
I went to see the Gypsy Woman. She told me I would meet a tall dark stranger, and that it would be a really hot day. I wasn’t sure, so I hired another 10,000 Gypsy Women to improve the accuracy of my forecast… (no insult intended to the Roma people)
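For what it’s worth, a toy Python sketch (every number here is invented) shows why the extra 9,999 Gypsy Women don’t help: if all the forecasters share the same bias, averaging more of them shrinks only the noise, never the bias.

```python
import random

# Toy sketch, all numbers invented: every "forecaster" shares the
# same +3 degree bias plus individual noise. Hiring more of them
# averages away the noise but never touches the shared bias.
TRUE_TEMP = 20.0
SHARED_BIAS = 3.0

def forecast():
    return TRUE_TEMP + SHARED_BIAS + random.gauss(0, 2)

random.seed(0)
for n in (1, 100, 10_000):
    avg = sum(forecast() for _ in range(n)) / n
    print(f"{n:>6} forecasters: average error {avg - TRUE_TEMP:+.2f}")
```

However many forecasters you hire, the average error settles near +3, not zero.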
Best,
Frank
“Hope they’ve got the F77 compiler tuned. That’s what Hansen’s code is.”
Last i heard there was even a little bit of python code tossed in just for kicks.
Oh, and completely off topic, I had a chance to look at Roger Pielke Sr.’s blog today and noticed this statement concerning Gavin Schmidt’s understanding of boundary layer physics:
I wanted to take issue with that statement and tell him that I believe that Schmidt has an entirely adequate lack of knowledge.
Readers of WUWT will enjoy Dr. John Everett’s http://www.climatechangefacts.info/ for more information on the IPCC, climate modeling, and other aspects of climate science with an unbiased emphasis on the facts. Climate realists will appreciate the contentious folly of AGW theory as presented by Dr. Everett.
Now GISS can chase their tails at the speed of light! “Aye aye, Captain Hansen! Warp speed it is!”
CH
Robin W (19:24:14) : “daisy, daisy give me your answer do…….”
Just what do you think you’re doing, Dave?
Now why in the world would they need this if the science was settled?
Interesting that NASA is fighting for its very life but still pissing away countless dollars on computers to perpetuate a fraud.
Ya, but can it beat Kasparov at chess?
It they cheat like with Deep Blue…..maybe.
Andrew S (17:23:15) :
You’ve got to spend money to make money.
Nah… It’s the government. Their macro economic rules are different. For them, it really is:
“You’ve got to make money to spend money”
Or more correctly:
“You’ve got to print money to spend money”
Though today, the “printing” is mostly just by creating bits in certain databases in particular bank computers…
So more precisely:
“You’ve got to computer simulate money to spend money”
BTW, true story: When a certain past president decided to “Freeze Iranian Assets” this was accomplished by posting a paper note on the PDP-11 at the Fed that handled international transfers at the time. When the appropriate computer person came on shift, they toggled the keys that disabled Iran’s bits from being moved to any other computer…
Aren’t computers wonderful?…
Gary P (20:01:33) :
What are the odds that the local USHCN Stevenson Screen thermometer box is sitting just outside the air conditioning units for this new toy?
That would be par for the course.
H/T to Douglas Adams…
When Deep Thought produced the answer 42, the solution was to build a bigger and better computer to find out what the actual question was:
“I speak of none other than the computer that is to come after me,” intoned Deep Thought, his voice regaining its accustomed declamatory tones. “A computer whose merest operational parameters I am not worthy to calculate — and yet I will design it for you. A computer which can calculate the Question to the Ultimate Answer, a computer of such infinite and subtle complexity that organic life itself shall form part of its operational matrix. And you yourselves shall take on new forms and go down into the computer to navigate its ten-million-year program! Yes! I shall design this computer for you. And I shall name it also unto you. And it shall be called … The Earth.”
Is that where this is going to? Will GISS get out and look at the Earth before issuing a proclamation, or is it all models?
The role of Mr Hansen in all this?
But is it a Mac or PC ?
I take exception to the following statement:
“This new computing system represents a dramatic step forward in performance for climate simulations.”
It is however an interesting insight into the mindset of climate modellers and their perceptions of “performance”.
What a waste of government money (oops – I meant TAXPAYER money). For what they plan to do they could have just trotted on down to ToysRUs and for $22.95 gotten a genuine glow in the dark Ouija board (http://www.toysrus.com/product/index.jsp?productId=2266493&CAWELAID=107526667) which would generate the answers they want a lot cheaper, faster, and without all that messy programming.
The comment was:
—————
FatBigot (16:54:07) :
Why are they bothering? For years they’ve been telling us they’ve got the answer already.
—————
.
Well, ya see? It’s as this: They tell =US= that =THEY= know what they’re talking about, and that’s so there aren’t any questions in people’s minds.
.
However, the real truth is just this: THEY realize that =THEY= don’t really know what =THEY= are talking about.
.
Ergo the need for more money to prove that they DON’T know what they didn’t know and CAN’T ever know: What they don’t know.
.
Got that?
.
Otherwise the program they use will only lead to the same conclusions, only faster and with greater resolution.
.
Kinda sorta like four-wheel drive versus two-wheel drive: With four driving wheels, if you don’t know what you’re doing, you’ll get into trouble twice as fast!
.
Douglas DC (17:11:53) :
42-now what was the question?
ht to “Hitchhiker’s Guide….”
Look, it’s in black & white, 6 x 9 = 42! Always has, always will, me bucko!
I can see the transatlantic spats now, those jolly chaps & chapesses in the Met Office chanting, “Na na na na na! Our Deep Thought is better than your Deep Thought!”
As to costs, well the Met Office have very kindly spent £30M of money that isn’t theirs on their new super-computer, plus all the salaries to go with it, I understand that place is like a glorified holiday camp, have a guess at what your lot are spending that isn’t theirs! $50M+? Governments have no money, it’s taxpayers money folks, which they remove by legal force from your wallets!
AND just how will they know what they get out of the end of it is right or meaningful. Well they don’t actually, & after all that isn’t the point either. They just need lots of evidence to point the finger. The UK Government spends £Ms on Focus Groups & fake charities that prepare reports, that just happen to endorse its policy proposals so that they can raise taxes after promising they wouldn’t do it! It’s simple really & this is how benign totalitarianism works: they create a solution to a non-existent problem, define the problem, then legislate against it. You’re next!
..and all they get is some kind of hockey stick, or curves obediently following the CO2 curve.
That’s a seriously impressive amount of processing power, and probably enough to make Microsoft’s FSX playable.
Are they going to power these giants with alternative power sources?
Isn’t it ironic that the warmists need this cutting edge technology, and at the same time they’re using this cutting edge technology to enact legislation to curtail the use of such technology, and at the same time the deniers are slamming the warmists for using cutting edge technology?
Why not use the now well-established, peer-reviewed economic models used in academic research and simulate the DOW of the future?
Estimating the DOW value for 2100 should not be any problem if you use the latest super computer instead of a single PC.
The more powerful the computer, the more accurate the result should be, right?
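A toy Python sketch (all numbers invented) makes the DOW point concrete: fit a straight line to a pure random walk and “project” it to 2100, and the answer depends entirely on the noise, not on the computer running it.

```python
import random

# Toy illustration, every number invented: generate a 30-year
# random-walk "DOW" (pure noise, no underlying trend), least-squares
# fit a straight line to it, and extrapolate to 2100. Different seeds
# give wildly different "projections" -- the fitted trend is an artifact.
def dow_projection(seed):
    random.seed(seed)
    years = list(range(1979, 2009))
    dow, level = [], 1000.0
    for _ in years:
        level *= 1 + random.gauss(0.0, 0.15)   # multiplicative noise
        dow.append(level)
    # least-squares slope and mean, computed by hand
    n = len(years)
    mx = sum(years) / n
    my = sum(dow) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, dow))
             / sum((x - mx) ** 2 for x in years))
    return my + slope * (2100 - mx)

print([round(dow_projection(s)) for s in range(5)])
```

Run it with a handful of seeds and the 2100 “projections” scatter all over the map; no amount of extra teraflops changes that.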
Is this all really necessary when all it will do is draw more curves and then plot more points that correlate with something somewhere that will be peerreviewedpublishedinajournal as the third letter from Michael to the Gavinites followed shortly by rigorous analysis by the Apostates who will once again show the abuse of the “statistical method”.
This AGW really is a cash cow getting a good milking which will be followed by some cap ‘n tax or ets that will squeeze the last drops from the shriveled udder.
That thing will need some intelligent input, do they have it?
Does this make it the world’s fastest cherry-picking machine?
So NASA will now get the wrong answers faster? Reminds me of the blurb about coffee…..drink lots and do stupid things faster.
A horoscope of the future climate would be better; meanwhile the old Sun is parked somewhere out there, resting.
Wow, nobody will be able to question the validity of the model results obtained on that baby (/sarcasm). Seriously, when will I be able to buy a desktop version?
Hey GISS, what are the density altitude and temp going to be at MSO at 1200Z tomorrow? The air tanker guys could save a lot of time and labor if you could let them know.
The question is: How do I turn it on?
sky: “Can’t wait to compare convective cloud clusters generated by GCMs with those observed in the real world.”
The observations will be wrong, of course, and the model right. 🙂
I find it “interesting” (other words that came to mind were far too coarse to post) that they plan to make model runs back a millennium, and forward to 2100. I need only ask: ON WHAT BLOODY DATA?!?!?!!?
I certainly hope someone without an axe to grind can access such high-res simulations so that we can get an independent assessment of how well the simulations replicate reality. I’m betting not very well. Current models do not necessarily do very well for hurricanes, so why should the GCMs?
It’s also nice to see that they have admitted never seeing a well-formed eye wall in prior simulations.
The key would seem to be for the modelers to work with the people who do field research and add computations for “natural factors” and see what happens under this new computational capacity, but I rather have doubts that will occur before I’m pushing up daisies … or the next ice age arrives.
So we can go from garbage in to garbage out much faster than ever before; rah!
Reminds me of an “Idea for Design” in a famous (old) electronics industry magazine. Someone donated an hour of computer time on an IBM 1100 or some such to this “engineer”, so he decided to do a Monte Carlo analysis of the two-transistor amplifier circuit he had “worst case” designed to amplify with a gain of 10 +/-1.
The results of the analysis said that the load resistor of the main gain stage was the most critical item in the circuit. It also said that the spread of amplifier gains was skewed outside the worst-case design limits (how could that be if it is a worst-case design?). So it suggested a 5% change in the value of that load resistor, to restore the gain to its desired value.
Whoop de do; the engineer didn’t do the WCD properly; but more importantly it was a totally lousy circuit to use in the first place, and he could have rearranged the exact same circuit components into a much better negative-feedback amplifier whose gain would be the ratio of just two resistors (easy to make 10:1 with standard values), and almost independent of the properties of either of the two transistors; a much more robust design.
So a fast computer can perhaps optimise a lousy design, or climate theory; but don’t expect it to come up with a better design or climate theory than a better engineer (or climate scientist) could have figured out without the faster computer.
NASA isn’t going to statistically mathematicate its way to a better climate science explanation; so why the bigger computer?
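For anyone curious, that amplifier point is easy to demonstrate with a rough Monte Carlo sketch in Python. All component values below are invented for illustration: the open-loop stage’s gain rides directly on transistor beta (which varies wildly part to part), while the feedback stage’s gain is pinned by the R2/R1 resistor ratio.

```python
import random

# Rough Monte Carlo sketch with invented component values, comparing:
#  (a) an open-loop stage whose gain tracks transistor beta directly,
#  (b) a negative-feedback stage whose gain is set by the R2/R1
#      resistor ratio (1% parts), nearly independent of beta.
R1, R2 = 1_000.0, 10_000.0          # nominal closed-loop gain of 10

def tol(pct):
    return random.uniform(1 - pct, 1 + pct)

def open_loop_gain():
    beta = random.uniform(50, 300)  # typical production spread
    return 10.0 * beta / 150.0      # gain rides directly on beta

def feedback_gain():
    beta = random.uniform(50, 300)
    a = 10.0 * beta                 # crude open-loop gain stand-in
    g = (R2 * tol(0.01)) / (R1 * tol(0.01))   # ideal ratio, 1% resistors
    return a * g / (a + g)          # closed-loop gain: A*G/(A+G)

def spread(fn, n=10_000):
    vals = [fn() for _ in range(n)]
    return min(vals), max(vals)

random.seed(0)
print("open-loop gain range:", spread(open_loop_gain))
print("feedback gain range: ", spread(feedback_gain))
```

The open-loop gain sprawls over roughly a 6:1 range while the feedback gain stays within a few percent of 10, which is the whole argument: a robust design beats a Monte Carlo patch-up of a fragile one.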
Some brains at WUWT are faster and more intelligent than that 21st-century childish hardware.
pwl:
“Nice one Roy, may I post your comparison steps elsewhere?”
Be my guest. Permission granted. ;^)
““This new computing system represents a dramatic step forward in performance for climate simulations.””
Is this tacit acknowledgement that current climate models are not too good in their simulations?
Should it be tested by retro-forecasting? Take one or more surface weather stations and feed in inputs from the last 7 days/month/year, then test the computer to see if it can mimic the climate inputs and outputs for that particular week/month/year — are NASA up for it??
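That retro-forecast (hindcast) check is easy to sketch. Here’s a toy Python version — every number below is invented — that scores a hindcast against the dumbest possible baseline, a “persistence” forecast where tomorrow simply equals today:

```python
# Toy hindcast check, all numbers invented: a model forecast only
# earns its keep if it beats "persistence" (tomorrow equals today).
obs     = [14.2, 15.1, 13.8, 16.0, 15.5, 14.9, 16.3]   # last 7 days, degC
model   = [13.0, 14.0, 15.9, 14.2, 16.8, 13.5, 15.0]   # made-up hindcast
persist = [obs[0]] + obs[:-1]                           # yesterday's value

def rmse(pred, truth):
    return (sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)) ** 0.5

skill = 1 - rmse(model, obs) / rmse(persist, obs)
print(f"model RMSE:       {rmse(model, obs):.2f}")
print(f"persistence RMSE: {rmse(persist, obs):.2f}")
print(f"skill score:      {skill:+.2f}  (positive = beats persistence)")
```

With these particular made-up numbers the “model” actually loses to persistence (negative skill), which is exactly the kind of embarrassment a public retro-forecast test would expose, or not.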
I do feel that all this ‘power’ should be used for something more tangible. If it came up with projections, say, that the climate is not warming, would they (NASA) release the data?? — or am I being ingenuous?
There must be a tendency for the computer experts to be so completely immersed in the subject and intricacies of the engineering, to miss entirely the objective, like being lost in cyberspace.
I am not sure that we shall ever be able to predict with any accuracy this beautiful and terrible climate of our world — or we may just be on the verge……. and something goes off in a corner of Wyoming!!****! Now might I suggest that we concentrate our energies far more on volcanic/seismic event prediction.
Turn it OFF! Isn’t that what we’ve been hearing from the “greenies” concerning our computers. I just read where computers are responsible for millions of tons of GHGs, so the best thing they can do for the environment is to shut down this massive GHG-smoking hunk of earth-destroying junk. We’ve already been told that they are “pretty certain” that we are all going to bake like KFC, so why do they need to do any more of this nonsense? How many light bulbs have I got to change out in order to save enough juice to run this monstrosity?
See http://seattletimes.nwsource.com/html/boeingaerospace/2009565319_boeing30.html and http://blogs.crikey.com.au/planetalking/2009/07/07/how-the-joint-strike-fighter-and-dreamliner-787-programs-are-compromised-by-similar-project-management-failures/ for some of the triumphs of computer modelling.
“”” Peter West (15:02:54) :
See http://seattletimes.nwsource.com/html/boeingaerospace/2009565319_boeing30.html and http://blogs.crikey.com.au/planetalking/2009/07/07/how-the-joint-strike-fighter-and-dreamliner-787-programs-are-compromised-by-similar-project-management-failures/ for some of the triumphs of computer modelling. “””
As I recall, the 787 has yet to get off the ground (commercially) so don’t count that as a success. I think the 737 was a success; the 787 may never be.
And the JSF if they ever build any, will just be displacing a far superior aircraft.
But if you want to buy a product that resulted from computer modelling, you can now get the Logitech “Everywhere MX” mouse that will track (almost) everywhere; even on glass. I can pretty much guarantee that it will NOT work on the Hubble Telescope Primary mirror, even though they won’t let anyone try that; but it works most everywhere else.
George
So as a test of this new teramachine, why don’t they turn it on and have it calculate the GISStemp anomaly back to the day that James Hansen invented global warming; that would include, of course, calculating from basic physics the actual raw data readings from Hansen’s collection of errant owl boxes that are on Anthony’s “little list”.
I wouldn’t let them advance to positive times BP until they can correctly get the CE and BCE data calculated from Peter Humbug’s model.