NASA powers up for the next UN IPCC

Goddard Space Flight Center recently added 4,128 processors to its Discover high-end computing system, with another 4,128 processors to follow this fall. The expanded Discover will host NASA’s climate simulations for the Intergovernmental Panel on Climate Change. Credit: NASA/Pat Izzo

From NASA, new fossil-fuel-powered toys for the boys.

Related: “Giant sucking sound” heard in Greenbelt, MD, when the switch was thrown.

Remember the day you got a brand-new computer? Applications snapped open, processes that once took minutes finished in seconds, and graphics and animation flowed as smoothly as TV video. But several months and many new applications later, the bloom fell off the rose.

Your lightning-fast computer no longer was fast. You needed more memory and faster processors to handle the gigabytes of new files now embedded in your machine.

Climate scientists can relate.

They, too, need more powerful computers to process the sophisticated computer models used in climate forecasts. Such an expanded capability is now being developed at NASA’s Goddard Space Flight Center in Greenbelt, Md.

High-End Computing System Installed

In August, Goddard added 4,128 new-generation Intel “Nehalem” processors to its Discover high-end computing system. The upgraded Discover will serve as the centerpiece of a new climate simulation capability at Goddard. Discover will host NASA’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC), the leading scientific organization for assessing climate change, and other national and international climate initiatives.

To further enhance Discover’s capabilities, Goddard will install another 4,128 Nehalem processors in the fall, bringing Discover to 15,160 processors.

“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of Goddard’s Computational and Information Sciences and Technology Office (CISTO). “This new computing system represents a dramatic step forward in performance for climate simulations.”

Well-Suited for Climate Studies

According to CISTO lead architect Dan Duffy, the Nehalem architecture is especially well-suited to climate studies. “Speed is an inherent advantage for solving complex problems, but climate models also require large memory and fast access to memory,” he said. Each processor has 3 gigabytes of memory, among the highest available today. In addition, memory access is three to four times faster than Discover’s previous-generation processors.
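
To put the memory figures in perspective, here is a quick back-of-the-envelope sketch. It assumes that “processor” in the quote above means a processor core (as the core counts elsewhere in this post suggest) and uses the two-quad-core-sockets-per-node layout listed for the new Nehalem unit in the system details further down; it is an illustration, not a NASA-published figure.

# Rough memory arithmetic for the August Nehalem addition, assuming
# "processor" means a core and each node holds 2 quad-core sockets
# (per the scalable unit listing later in this post).
gb_per_core = 3
cores_per_node = 2 * 4              # two quad-core Nehalem sockets per node
cores_in_unit = 4128                # cores added in August

print("Memory per node:", cores_per_node * gb_per_core, "GB")        # 24 GB
print("Memory across the unit: about",
      round(cores_in_unit * gb_per_core / 1000, 1), "TB")            # ~12.4 TB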

In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared with other nationally recognized high-end computing systems. The new computational capabilities also allowed NASA climate scientists to run high-resolution simulations that reproduced atmospheric features not previously seen in their models.

For instance, “features such as well-defined hurricane eyewalls and convective cloud clusters appeared for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.”

IPCC Simulations

For the IPCC studies, scientists will run both longer-term and shorter-term climate projections using different computer models. A climate model from the Goddard Institute for Space Studies will perform simulations going back a full millennium and forward to 2100. Goddard’s Global Modeling and Assimilation Office will use a climate model for projections of the next 30 years and an atmospheric chemistry-climate model for short-term simulations of chemistry-climate feedbacks. The IPCC will use information from climate simulations such as these in its Fifth Assessment Report, which IPCC expects to publish in 2014.

NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.

News Release from NASA here

NASA Expands High-End Computing System for Climate Simulation

GREENBELT, Md., Aug. 24 /PRNewswire-USNewswire/ — NASA’s Goddard Space Flight Center in Greenbelt, Md., made available to scientists in August the first unit of an expanded high-end computing system that will serve as the centerpiece of a new climate simulation capability. The larger computer, part of NASA’s High-End Computing Program, will be hosting the agency’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC) and other national and international climate initiatives.

The expansion added 4,128 computer processors to Goddard’s Discover high-end computing system. The IBM iDataPlex “scalable unit” uses Intel’s newest Xeon 5500 series processors, which are based on the Nehalem architecture introduced in spring 2009.

Discover will be hosting climate simulations for the IPCC’s Fifth Assessment Report by the Goddard Institute for Space Studies (GISS) in New York City and Goddard’s Global Modeling and Assimilation Office (GMAO). Stimulus funds from the American Recovery and Reinvestment Act of 2009 will enable installation of another 4,128 Nehalem processors this fall, bringing Discover to 15,160 processors.

“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of the Computational and Information Sciences and Technology Office (CISTO) at Goddard. “This new computing system represents a dramatic step forward in performance for climate simulations.”

In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared to other nationally recognized high-end computing systems. Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.

“Nehalem architecture is especially well-suited to climate studies,” said Dan Duffy, CISTO lead architect. “Speed is an inherent advantage for solving complex problems, but climate models need large memory and fast access. We configured our Nehalem system to have 3 gigabytes of memory per processor, among the highest available today, and memory access is three to four times faster than Discover’s previous-generation processors.”

In daily forecasts for NASA satellite missions and field campaigns, the GMAO typically runs its flagship Goddard Earth Observing System Model, Version 5 (GEOS-5) at 27-kilometer resolution. Using Discover’s new processors, the GMAO has been testing a special “cubed-sphere” version of GEOS-5 at resolutions as high as 3.5 kilometers.
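
The jump from 27 kilometers to 3.5 kilometers is far more costly than it might sound. The sketch below is a rough, back-of-the-envelope scaling estimate, not a NASA figure; it ignores vertical resolution, physics packages, and I/O, but it gives a feel for why cloud-permitting runs need this class of machine.

# Rough scaling of global-model cost with horizontal resolution.
# Refining the grid multiplies the number of grid columns by the square
# of the refinement factor (two horizontal dimensions) and, via the CFL
# stability condition, also forces a proportionally shorter time step,
# so total cost grows roughly as the cube of the refinement factor.
coarse_km = 27.0    # resolution of routine GEOS-5 forecasts
fine_km = 3.5       # cubed-sphere test resolution on the new processors

refinement = coarse_km / fine_km
columns_factor = refinement ** 2
cost_factor = refinement ** 3

print(f"Refinement factor:   {refinement:.1f}x")
print(f"Grid columns:       ~{columns_factor:.0f}x more")
print(f"Total cost, roughly: ~{cost_factor:.0f}x more")

On that rough accounting, each simulated day at 3.5 kilometers costs a few hundred times what it does at 27 kilometers, which is consistent with these runs being described as tests of the new unit rather than routine forecasts.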

“Once the model goes below 10-kilometer resolution, features such as well-defined hurricane eyewalls and convective cloud clusters appear for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.” Putman has been collaborating with GMAO modeling lead Max Suarez and others on the cubed-sphere configuration of GEOS-5.

NASA’s IPCC simulations will include both longer- and shorter-term climate projections using the latest versions of the GISS and GMAO models. GISS ModelE will perform simulations going back a full millennium and forward to 2100. Making its first IPCC contributions, the GMAO will focus on the next 30 years and perform decadal prediction simulations using GEOS-5 and atmospheric chemistry-climate simulations using the GEOS Chemistry Climate Model. With the performance of GEOS-5 on the Nehalem processors, investigations of societal relevance, such as climate impacts on weather extremes, now reach a new level of realism. The IPCC’s Fifth Assessment Report is due to be published in 2014.

NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.

About the system: link

No mention of power consumption in the specs. Though this internal report suggests “dramatically increased power consumption” at the Ames facility when similarly upgraded.

Photo of iDataPlex cluster

IBM iDataPlex Cluster: Discover Scalable Units 3 & 4

The NASA Center for Computational Sciences (NCCS) expanded its “Discover” high-end computer by integrating a 4,096-core IBM iDataPlex cluster, for a combined performance of 67 teraflops. The NCCS plans to at least double the iDataPlex processor count during 2009. Credit: Photo by NASA Goddard Space Flight Center/Pat Izzo.

Discover

The Discover system is an assembly of multiple Linux scalable units built upon commodity components. The first scalable unit was installed in the fall of 2006, and the NCCS continues to expand this highly successful computing platform.

Discover derives its name from the NASA adage of “Explore. Discover. Understand.”

Discover System Details

System Architecture
As mentioned above, the system architecture is made up of multiple scalable units. The list below describes the total aggregate components of the system and its individual scalable units; a short script after the unit listings shows how each unit’s peak Tflop/s figure follows from its listed core count, clock speed, and floating-point operations per clock.

Aggregate
  • 49 Racks (compute, storage, switches, and more)
  • 111 Tflop/s
  • 11,032 Total Cores
File System and Storage
  • IBM GPFS
  • 2.46 PB Storage
Operating Environment
  • Operating System: SLES
  • Job Scheduler: PBS
  • Compilers: C, C++, Fortran (Intel and PGI)

Individual Scalable Units

Base Unit
  • Manufacturer: Linux Networx/SuperMicro
  • 3.33 Tflop/s
  • 520 Total Cores
  • 2 dual-core processors per node
  • 3.2 GHz Intel Xeon Dempsey (2 flop/s per clock)
  • Production: 4Q 2006
Scalable Unit 1
  • Manufacturer: Linux Networx/Dell
  • 10.98 Tflop/s
  • 1,032 Total Cores
  • Dell 1950 Compute Nodes
  • 2 dual-core processors per node
  • 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 2Q 2007
Scalable Unit 2
  • Manufacturer: Linux Networx/Dell
  • 10.98 Tflop/s
  • 1,032 Total Cores
  • Dell 1950 Compute Nodes
  • 2 dual-core processors per node
  • 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2007
Scalable Unit 3
  • Manufacturer: IBM
  • 20.64 Tflop/s
  • 2,064 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2008
Scalable Unit 4
  • Manufacturer: IBM
  • 20.64 Tflop/s
  • 2,064 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 4Q 2008
Scalable Unit 5
  • Manufacturer: IBM
  • 46.23 Tflop/s
  • 4,128 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.8 GHz Intel Xeon Nehalem (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2009
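
As a cross-check on the listings above, each unit’s peak Tflop/s figure follows directly from its core count, clock speed, and floating-point operations per clock, and the planned fall Nehalem addition accounts for the 15,160-processor total quoted earlier. A minimal sketch using only the numbers listed above:

# Peak performance per scalable unit = cores * clock (GHz) * flops per clock.
units = [
    # (name, cores, clock_ghz, flops_per_clock)
    ("Base Unit (Xeon Dempsey)", 520, 3.20, 2),
    ("SU1 (Xeon Woodcrest)",    1032, 2.66, 4),
    ("SU2 (Xeon Woodcrest)",    1032, 2.66, 4),
    ("SU3 (Xeon Harpertown)",   2064, 2.50, 4),
    ("SU4 (Xeon Harpertown)",   2064, 2.50, 4),
    ("SU5 (Xeon Nehalem)",      4128, 2.80, 4),
]
for name, cores, ghz, flops in units:
    print(f"{name:26s} {cores * ghz * flops / 1000.0:6.2f} Tflop/s")

# A second 4,128-core Nehalem unit added to the 11,032 aggregate cores
# listed above gives the 15,160 processors quoted in the article.
print("Cores after the fall upgrade:", 11032 + 4128)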

Photo of IBM POWER5+ system

IBM Test System: Schirra

The High-End Computing Capability Project purchased this 6-teraflop IBM POWER5+ system as part of their effort to evaluate next-generation supercomputing technology, which will eventually replace the Columbia supercomputer, to meet NASA’s future computing needs.

Photo of the 128-screen hyperwall-2 visualization system

Hyperwall-2 Visualization System

The 128-screen hyperwall-2 system is used to view, analyze, and communicate results from NASA’s high-fidelity modeling and simulation projects, leading to new discoveries.

Climate Simulation Computer Becomes More Powerful
08.24.09

New high-resolution climate simulations reproduce atmospheric features not previously seen in NASA computer models. Moving from 27-kilometer resolution (top) to 3.5-kilometer resolution (center) yields cloud clusters like those seen in observations from NOAA’s latest GOES satellite (bottom). Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin

A 24-hour simulation run at progressively higher resolutions captures finer and finer details of cloud “vortex streets” moving over Alaska’s Aleutian Islands. A vortex street is a train of alternating clockwise and anticlockwise eddies (swirls) shed in a flow interacting with some structure. Credit: NASA/William Putman, Max Suarez; NOAA/ Shian-Jiann Lin

Comments

Antonio San
August 24, 2009 4:38 pm

And all this computer equipment won’t replace intelligent understanding.

Gerry
August 24, 2009 4:46 pm

More GIGO (garbage in, garbage out) for the IPCC’s sewerpipe.

FatBigot
August 24, 2009 4:54 pm

Why are they bothering? For years they’ve been telling us they’ve got the answer already.

dearieme
August 24, 2009 4:58 pm

It may be rubbish, but it’ll be high speed rubbish.

Douglas DC
August 24, 2009 5:11 pm

42-now what was the question?
ht to “Hitchhiker’s Guide….”

lulo
August 24, 2009 5:13 pm

The faster computing will help climate modellers as resolution and process detail increases. However, the CO2 forcing mechanism is derived from very straightforward equations, and is the main driving cause of projected warming over the next century and beyond, according to the models quoted by the IPCC. So, in my humble opinion, the benefit will be more toward improved meteorological modelling rather than climate change magnitude prediction (aside from perhaps improved feedback analyses with higher spatial and temporal resolution in atmospheric and oceanic circulation and land surface-atmosphere interactions). Greater computing power will not likely have any sizeable effect on the magnitude of warming predicted. Improved precision, but little change in accuracy, would be my guess.

Mr Lynn
August 24, 2009 5:15 pm

dearieme (16:58:50) :
It may be rubbish, but it’ll be high speed rubbish.

Or maybe high-speed hubris?
Cf. Fredric Brown, “Answer”:
http://www.alteich.com/oldsite/answer.htm
/Mr Lynn

Eric Anderson
August 24, 2009 5:16 pm

This statement can’t be right:
“Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.”
I thought all atmospheric features were already taken into account . . . at least all the important ones . . . at least those needed to adequately guide multi-trillion-dollar policy decisions . . .

Mr Lynn
August 24, 2009 5:19 pm

Re the Fredric Brown story, this one’s even more chilling:
Arthur C. Clarke, “The Nine Billion Names of God”:
http://lucis.net/stuff/clarke/9billion_clarke.html
“Overhead, without any fuss. . .”
/Mr Lynn

Smokey
August 24, 2009 5:19 pm

46 teraflops means they can arrive at the incorrect answer much faster.

sky
August 24, 2009 5:20 pm

Can’t wait to compare convective cloud clusters generated by GCMs with those observed in the real world.

TerryBixler
August 24, 2009 5:20 pm

As a programmer it is the thinking that counts not the processor that is running the thinking. Bigger, better, faster does not by any measure make the logic or results correct. But everyone here knows that. Too bad our government agencies have lost sight of science and are all about agenda. 45 days and counting, what NASA guy was on top of that?

tokyoboy
August 24, 2009 5:21 pm

How much does this cost?
In my country too the AGWers are always eager to renew their supercomputer at the expense of our tax.

Robert Wood
August 24, 2009 5:21 pm

I’m an electronic engineer and therefore say .. cooool!
However, I always warn people that a machine is only as intelligent as its operator.

Robert Wykoff
August 24, 2009 5:22 pm

It will only take minutes instead of hours now to determine that the climate is getting worse much faster than we imagined.

bryan
August 24, 2009 5:22 pm

Perhaps it’ll be like the Snapple mantra
“The best stuff on earth just got better”
Or not

Greg Cavanagh
August 24, 2009 5:22 pm

It sounds impressive; until you see that hyperwall and you realise that they are just wasting money while getting their jollies.

Andrew S
August 24, 2009 5:23 pm

You’ve got spend money to make money.

VG
August 24, 2009 5:24 pm

It might help in data processing (ie: computing, software etc), that’s about it.

Carbon-based Life Form
August 24, 2009 5:24 pm

Hey Folks, I am all for spending money to test the theory and separate fact from belief. Much better than throwing money at something we don’t understand. Let’s hold our government accountable for some quality control. I hope it will be more than the pittance they have applied so far, according to Gavin’s estimation.

Antonio San
August 24, 2009 5:25 pm

OT: I have not read the book by Ian Plimer and I have not read here a comprehensive review. Yet something seems strange: Plimer’s book seems a convenient distraction for both sides: the Realclimatists will easily dump on this while the opposition thinks they are getting MSM mileage. One wonders if this whole business is indeed a pre Copenhagen set up in order to feature a somewhat caricatural figure opposing the true scientists… because so far, I read more about Plimer in Realclimate and other pro AGW blogs than in skeptical blogs! Anyone care to comment? thanks

Ron de Haan
August 24, 2009 5:27 pm

Prepare for more computer made AGW.

Adam from Kansas
August 24, 2009 5:28 pm

Well if they input climate data assuming that CO2 is the main climate driver they won’t get much use out of it other than saying the same thing again.
Now they need to factor in the Sun, SST’s and any other possible factor for a real climate analysis.
Also, when it comes to local weather forecasting and modeling I noticed that Intellicast, where the chief meteorologist is an AGW skeptic, tends to have a bit more consistency and many times accuracy in the first 7 days of their forecast for Wichita than the ones output by organizations in bed with the IPCC. They say we’re supposed to have a streak of 70 degree days (not that normal for August by the way), and it seems like other forecast models are starting to creep in that direction. Also I know the model used by Accuweather in comparison really needs a re-work; their forecasts for Wichita a few days out bounce around a bit and get a lot worse until the end of their forecast period.
When you input the correct data with the correct factors, you tend to get better results.

DaveE
August 24, 2009 5:39 pm

Robert Wood (17:21:09) :

I’m an electronic engineer and therefore say .. cooool!

Neat. I’m an electronics engineer, can I check out your programming 😛
(From an engineering perspective)
DaveE.

james allison
August 24, 2009 5:40 pm

Pre upgrade: Increase in CO2= 3.5 degree temperature increase
Post upgrade: increase in CO2=3.54983539318 degree temperature increase

Richard deSousa
August 24, 2009 5:42 pm

More expensive and faster junk science.

Don Penim
August 24, 2009 5:44 pm

I sure hope they got the free upgrade to Windows 7 when it comes out 😉

Dr A Burns
August 24, 2009 5:48 pm

When a computer model can tell with any accuracy whether the weather will be wet next week, I’ll start to pay attention to the much more complex climate models.
I wonder if the new machines are still running the old FORTRAN code ?

August 24, 2009 5:51 pm

Hey Giss. Can your new computer tell us when it’s going to rain in San Antonio?

TJA
August 24, 2009 5:54 pm

Maybe they could have better spent the money attracting first rate mathematicians with their paper notebooks and chalkboards. Maybe a laptop running Mathematica.

Frank K.
August 24, 2009 5:55 pm

“GISS ModelE will perform simulations going back a full millennium and forward to 2100.”
This will be hilarious to observe – an undocumented, garbage FORTRAN code like Model E attempting to model climate for a full millennium. Your tax dollars at work!
I hope everyone who knows someone who has lost a job, or who is themselves hurting in our current economy, looks at this expenditure as the colossal waste of money that it is – turning over millions to the incompetent numerical analysts of the GISS…

John F. Hultquist
August 24, 2009 6:10 pm

Lighten up, folks. In the long run this is a good thing. I am much happier with a fast computer and high-speed internet connection. For a few days after I got these I did the same things I’d been doing – but then I learned I could do other things I had not been able to do before.
So, they will transfer their simulations and data to the new hardware and, at first, produce the same answers. But they will soon find they have time to insert other data and other functions and better resolution. Eventually, they will find that things more complex than CO2 driven warming are involved and as these are incorporated the role of GHGs will decrease in their models.
None of this is likely to happen before Copenhagen, of course. But even as the western economies stagnate under “cap and tax” laws we will begin to understand (better) the natural variability of Earth’s climate.
Science is good. Politics – not so much.

ian middleton
August 24, 2009 6:15 pm

I bet the first program they run on it will be DOOM 2

Mike McMillan
August 24, 2009 6:20 pm

Garbage In, Garbage Out, Only faster !

Joe Miner
August 24, 2009 6:23 pm

Maybe they’ll run folding@home on it and we’ll get some productive use out of our tax dollars.

Gene Nemetz
August 24, 2009 6:29 pm

If Henrik Svensmark, Piers Corbyn, Roy Spencer, and Anthony Watts (and who they would choose to help them) were in charge of the inputs I’d feel a lot better about the outputs of this toy.

JimB
August 24, 2009 6:34 pm

“Smokey (17:19:59) :
46 teraflops …”
What could go wrong?
JimB

Richard M
August 24, 2009 6:34 pm

I hope the application has sufficient granularity to take advantage of the extra processing power. Otherwise it might not run much faster.

paulakw
August 24, 2009 6:35 pm

Chuckle !!!!!!!!!
Now they can be wrong 100 times faster !!!

Gene Nemetz
August 24, 2009 6:35 pm

Robert Wykoff (17:22:17) :
It will only take minutes instead of hours now to determine that the climate is getting worse much faster than we imagined.
And it only took you seconds to know this is what will happen. 😉

JimB
August 24, 2009 6:36 pm

“Mike McMillan (18:20:39) :
Garbage In, Garbage Out, Only faster !”
…or
As someone else on WUWT posted some time ago:
Uga in, Chucka out.
JimB

Gene Nemetz
August 24, 2009 6:39 pm

Andrew S (17:23:15) :
You’ve got spend money to make money.
God I hate that this is true. Because they are spending my money, and your money, to make that money. And the money they are making is my money, and your money.

Editor
August 24, 2009 6:42 pm

Gerry (16:46:33) :
More GIGO (garbage in, garbage out) for the IPCC’s sewerpipe.
Mike McMillan (18:20:39) :
Garbage In, Garbage Out, Only faster !
Frank K. (17:55:58) :
“GISS ModelE will perform simulations going back a full millennium and forward to 2100.”
This will be hilarious to observe – an undocumented, garbage FORTRAN code like Model E attempting to model climate for a full millennium. Your tax dollars at work!

My question is WHICH “past millennium” temperatures are they going to use as “input” and which are they going to use as output to “verify” their programs. “GIGO faster” means also that – if the input equations are going to be compared to the Mann-Jones hokey stick temperatures (whether bristlecone pines or his latest “hurricane tempestology” repetition of the hockey stick tree rings) – the output (100 years) CAN’T be correct because the INPUT (1000 years) is a blatant hoax.

Editor
August 24, 2009 6:43 pm

Gene Nemetz (18:39:48) :
Andrew S (17:23:15) :
You’ve got spend money to make money.
God I hate that this is true. Because they are spending my money, and your money, to make that money. And the money they are making is my money, and your money.
—–
Rather:
Because they are spending my money, and your money, to TAKE that money. And the money they are TAKING is my money, and your money.

Curiousgeorge
August 24, 2009 6:44 pm

All hail the new Oracle! Anybody notice there seems to be a significant similarity between these electromechanical oracles and the original that used to sniff natural gas at Delphi? They both have their priests and acolytes, they both must be sacrificed and catered to, and they both had major influence over political leaders. And they both have equivalent track records.

Gene Nemetz
August 24, 2009 6:44 pm

Carbon-based Life Form (17:24:42) : ..according to Gavin’s estimation
Doesn’t it just SUCK that someone the caliber of Gavin Schmidt is one of the people who is going to get to play with this toy?! OMFG!!

August 24, 2009 6:50 pm

So is this baby powered by windmills? Or fossil fuels? Does it come with an energy efficiency rating like water heaters? If so, what’s the rating?
And don’t tell me 42. That is NOT the answer to everything.

August 24, 2009 6:52 pm

Sweet toy. Now we can compute garbage at an unprecedented rate.

Jeff Szuhay
August 24, 2009 6:54 pm

Computers are useless. They can only give you answers.
— Pablo Picasso, painter (1881 – 1973)

WatasC
August 24, 2009 6:56 pm

This kind of smokescreen makes the climatologists think they are getting somewhere, when all that will happen is that they will become increasingly sure of their answers – while they will be no more accurate than before.
Climate change cannot be predicted by a single model no matter how complex that model.
Population growth models based on data from 1950–1970 are inapplicable to population growth patterns in 2009. Investment banking models were based on data from 1945–2005, during which time single-family housing prices in aggregate increased. The models based on these data predicted that single-home mortgages in aggregate had less than a 1% risk of default. Guess what happened?
The results of this reliance in climatology on models rather than on observation, experiment, and theory are likely to be as disappointing as similar reliance on econometric models was in the mid-20th century, and on the investment banking models which led to our current financial collapse.

Frank K.
August 24, 2009 7:02 pm

By the way, you can add comments to the article that Anthony linked to at NASA
http://www.nasa.gov/topics/earth/features/climate_computing.html
I just did – we’ll see if they post my comment tomorrow. I simply asked them to state how much taxpayer money was spent on the new processors.

pwl
August 24, 2009 7:04 pm

No matter how many bones and entrails soothsayers use, no matter how shiny their rocks, no matter how pretty the pictures on their astrology cards, no matter how the dice sound when they roll the i-ching, no matter how spiffy the processors nor how fast nor which software language they use, no matter what climate scientists, oh sorry, climate soothsayers and doomsayers will not be able to accurately simulate the ACTUAL and REAL WORLD weather and climate systems of Earth. It’s just not possible to accurately simulate, with predictive results of any value, a real system that needs to play itself out in real time as the climate and weather systems do. This was proved mathematically and logically by Stephen Wolfram, A New Kind of Science, Chapter 2. How come this is? It’s because certain systems – such as weather and climate – generate randomness from within the system itself as part of its inherent Nature; this isn’t randomness from initial conditions, nor from chaos theory, it’s randomness that is inherent in the system itself. Can’t simulate them, they must play out in real time in the real objective reality. It’s actually even worse… given the vast number of different systems that must be integrated into a whole earth simulation and that many of those systems also have inherent randomness in them.
So soothsay and doomsay away Nasa and IPCC. Gotta love those shiny new computers. Heck what am I saying? I could get a high paying programming systems analysis job there… nope resist the temptation… [:|]
Now what to do with that spiffy parallel computer that Nasa has that could actually contribute to the world?

Miles
August 24, 2009 7:13 pm

What’s the carbon footprint of that monstrosity ?

Justin Sane
August 24, 2009 7:15 pm

If they’d only spend 1/100000000 of the money on open minds instead of closed ones. Like someone else said why bother, the science is in, now move along — nothing to see here.

DaveE
August 24, 2009 7:16 pm

pwl (19:04:11) :

Now what to do with that spiffy parallel computer that Nasa has that could actually contribute to the world?

Maybe Boeing could have made better use of it 😀
DaveE.

August 24, 2009 7:23 pm

OK boys. Let’s see how this baby does. How much of a temperature increase was it that you wanted…

Robin W
August 24, 2009 7:24 pm

I imagine that if this monstrous computer gives NASA’s “climate scientists” simulation results that don’t support their biases, they’ll be in the control room with hacksaws and the Discover system will be singing “daisy, daisy give me your answer do…….”

August 24, 2009 7:28 pm

Does anyone know what this cost me? I’m fed up with the government wasting my money.
Jesse

Dave Dodd
August 24, 2009 7:29 pm

My father-in-law has spent much of his eight decades on this planet raising cows in East Texas. He can spit his tobacco juice in the sand (watch your foot!), stare at it for a moment and predict Mother Nature’s intentions for the coming growing season with startling accuracy! He’s never met a teraflop, but can relate many a story about cowflop. He’s more afraid of his Congressman than he is of cow farts. NASA should ask him…

August 24, 2009 7:30 pm

Another GIGO…GIGO….GIGO Vote!
Remember the MAINE!
Remember the Boeing 787!
Remember the Connecticut 1978 Ice Arena Roof Collapse (first totally “computer” designed truss roof structure…)

David L. Hagen
August 24, 2009 7:31 pm

Climate models will ultimately depend on quantifying the magnitude and even the sign of water feedback. To quantitatively evaluate this will require very detailed modeling of clouds and precipitation rather than high-level models with numerous assumptions. That is the greatest opportunity for these new “toys”.
McIntyre reports that these papers are being presented at: Erice 2009

CLIMATE & CLOUDS
FOCUS: Sensitivity of Climate to Additional CO2 as indicated by Water Cycle Feedback Issues
Chairman A.Zichichi – Co-chair R. Lindzen
10.30 – 12.00
SESSION N° 9
* Dr. William Kininmonth
Australasian Climate Research, Melbourne, Australia
A Natural Constraint to Anthropogenic Global Warming
* Professor Albert Arking
Earth and Planetary Sciences Dept., Johns Hopkins University, Baltimore, USA
Effect of Sahara Dust on Vertical Transport of Moisture over the Tropical Atlantic and its Impact on Upper Tropospheric Moisture and Water Vapor Feedback.
* Dr. Yon-Sang Choi
Earth, Atmospheric and Planetary Sciences Dept., MIT, Cambridge, USA
Detailed Properties of Clouds and Aerosols obtained from Satellite Data
* Professor Richard S. Lindzen
Department of Earth, Atmospheric & Planetary Sciences, MIT, Cambridge
On the Determination of Climate Feedbacks from Satellite Data
* Professor Garth W. Paltridge
Australian National University and University of Tasmania, Hobart, Australia
Two Basic Problems of Simulating Climate Feedbacks

See also
Satellite and Climate Model Evidence Against Substantial Manmade Climate Change by Roy W. Spencer, Ph.D.
December 27, 2008 (last modified December 29, 2008)
Otherwise GIGO will become “FGIFGO” (Faster Garbage In Faster Garbage Out).

gt
August 24, 2009 7:35 pm

Is the new system powered by solar power? I hope it doesn’t create too large of a carbon footprint.

Layne Blanchard
August 24, 2009 7:40 pm

All that power, and they already know the answer…..

Gary P
August 24, 2009 8:01 pm

What are the odds that the local USHCN Stevenson Screen thermometer box is sitting just outside the air conditioning units for this new toy?
The beauty of that is all they need to do is turn it on to prove global warming!

August 24, 2009 8:03 pm

Wow, a great post. Interesting to know about environmental actions taken by NASA.

timetochooseagain
August 24, 2009 8:09 pm

NASA just described climate projections as “forecasts”-I was under the impression that this was a big no-no that only denier cave men would do.

timetochooseagain
August 24, 2009 8:10 pm

Gary P (20:01:33) : Nice Double Entendre.

LarryT
August 24, 2009 8:12 pm

The new computer power will enable NASA to get garbage in/garbage out much faster as long as the Goddard Institute for Space Studies has Hansen working for them and his faulty programming.

Roy Tucker
August 24, 2009 8:17 pm

I’m a bit dismayed about how computer models have come to be more important than actual observations and so I offer a formal statement of the Scientific Computer Modeling Method.
The Scientific Method
1. Observe a phenomenon carefully.
2. Develop a hypothesis that possibly explains the phenomenon.
3. Perform a test in an attempt to disprove or invalidate the hypothesis. If the hypothesis is disproven, return to steps 1 and 2.
4. A hypothesis that stubbornly refuses to be invalidated may be correct. Continue testing.
The Scientific Computer Modeling Method
1. Observe a phenomenon carefully.
2. Develop a computer model that mimics the behavior of the phenomenon.
3. Select observations that conform to the model predictions and dismiss observations as of inadequate quality that conflict with the computer model.
4. In instances where all of the observations conflict with the model, “refine” the model with fudge factors to give a better match with pesky facts. Assert that these factors reveal fundamental processes previously unknown in association with the phenomenon. Under no circumstances willingly reveal your complete data sets, methods, or computer codes.
5. Upon achieving a model of incomprehensible complexity that still somewhat resembles the phenomenon, begin to issue to the popular media dire predictions of catastrophe that will occur as far in the future as possible, at least beyond your professional lifetime.
6. Continue to “refine” the model in order to maximize funding and the awarding of Nobel Prizes.
7. Dismiss as unqualified, ignorant, and conspiracy theorists all who offer criticisms of the model.
Repeat steps 3 through 7 indefinitely.

pwl
August 24, 2009 8:18 pm

“What’s the carbon footprint of that monstrosity?”
It’s its own Global Warming Heat Island!
The carbon footprint of this monstrosity will tilt the planet off its orbit so we dive into the Sun or smash into Mars….

Squidly
August 24, 2009 8:22 pm

Like any computer geek that may be a reader/contributor here at WUWT, I would love to have one of those babies myself. However, I cannot help but be left with a sick feeling in my stomach when I consider the enormous resources being applied to this endeavor as I browse through the Model-E code, realizing that this is the garbage that will be running on that beautiful piece of machinery. I am certainly a huge proponent of mass computing power, as I would consider myself one of the premier gadget freaks, but when I look at the unfathomable amount of debt piling up, Stimulus Bill, Omnibus Bill, Bailout Bills, More Bailout Bills, Cash for Clunkers mess, Climate Bill, upcoming Health Care Bill – when does it end? We are already bankrupt! About 6 years ago I was approaching bankruptcy, and the last thing that was on my mind was popping out to Alienware and buying a sporty new top-of-the-line play toy. I’m sorry, but with all that is going on in our country right now, and the fact that NASA will still be relying on people like Gavin Schmidt and James Hansen to program, operate, and massage the data from this cluster-F, I simply cannot endorse such a notion, and I believe this is just one more example of our fine government waste in action. … to gain a little perspective on this, try this link on for size: http://usdebtclock.org/ … tough to get too excited about spending more money on toys when you put it into perspective, don’t you think?
Sorry to be such a downer here .. its just that sometimes reality can smack you in the face pretty hard.

August 24, 2009 8:29 pm

But, do they have enough compute power to take into account the sun?
I doubt this will be enough compute power to fill the knowledge gap. What man knows about the earth’s climate would fill a book; what man doesn’t know would fill a library. Until someone finds the missing library full of books, it sounds like it will be nothing more than a waste of badly needed electricity.
You would have thought that in the case of AGW, the theory predicts that there should be a measurable hotspot in the upper troposphere. Observations from satellites and weather balloons have been made, and no hotspot has been found. In a less politicized field, this would have been the end of it.
Need new theory.

pwl
August 24, 2009 8:34 pm

Nice one Roy, may I post your comparison steps elsewhere?

Ken G
August 24, 2009 8:35 pm

That’s great. Now the models can be wrong faster.
Do the new systems address their lack of QC? Because that seems to be what’s lacking and probably would have been money better spent.

Christian Bultmann
August 24, 2009 8:38 pm

FORTRAN ???
So James can run programs like this real fast.
program warming_world
  print *, "The Planet is getting warmer!"
end program warming_world
And then proclaim it’s evidence.

Squidly
August 24, 2009 8:43 pm

Ken G (20:35:14) :
That’s great. Now the models can be wrong faster.

Perhaps what bothers me most is that they will spend a gazillion dollars on hardware, tout all about how fast and powerful it all is, then claim that their new output says “we are burning up far faster than we ever thought” .. and the most frightening, “we know this is correct now because we have all of this super computing power to back it up” .. sheeple will believe it…

Bob
August 24, 2009 8:52 pm

Wrong answers, faster. It just doesn’t get any better than this.

rbateman
August 24, 2009 9:17 pm

They are spending all that money and sucking down all that power to do what now? Prove a single theory. Why this theory? What’s so darn special and unique about AGW doom & gloom that it gets exclusive rights to the feeding trough? This stuff is made up and moved around so much the left hand doesn’t know what the right hand is predicting.
Whatever happened to the system of proposal for instrument time????
Like HST. What is the justification for monopolizing the installation at the expense of all others?
So who told them to build more computing capacity and what to input?
The previous computer system’s fuzzy output.
It’s like using Hubble to read the plaque on Voyager.
You already know what it looks like and what it says.
Maybe someone can put it to good use, like computing when the next sunspot will appear. At least that result is testable.

August 24, 2009 9:31 pm

All in my lifetime, either I have seen (in person) or spent time on, to date for a variety of engineering purposes: the IBM/360, IBM/370, a stand-alone IBM 4341, TI960, 980 & 990/12, TI-ASC, an unknown CDC model, PDP 11, VAX 11/780, VAX 8600 et al, uVAX, Apollo workstation, a run of Crays, a series of obscure and forgotten GE mainframes, IBM’s Big Blue, the IBM PC, IBM clones, Apple Macintosh, Lisa etc., Beowulf clusters, and now “The Discover high-end computing system”.
Next please.
.
.

F. Ross
August 24, 2009 9:44 pm

A nice new shiny toy for faster garbage out.
Too bad the government will not spend, say, a quarter of the cost of this project on finding near Earth asteroids and predicting their potential orbits instead of wasting it on the likes of Hansen, Schmidt, et al.

David Ball
August 24, 2009 9:51 pm

What you see here is the result of telling the right people EXACTLY what they want to hear. You will then have all your career needs fulfilled. In this case it is a ginormous computer that when told what to produce, will produce much more of EXACTLY what those people want to hear. I apparently have this character flaw that causes me to want to tell the truth. With therapy, I hope to be able to correct this affliction, …

hotrod
August 24, 2009 9:51 pm

I have this cartoon running in my head where with their new found speed and high resolution they run one of their models and for the first time it does something really really really stupid, because it can finally avoid rounding errors or something like that.
You know something like predict the ocean is at absolute zero after 5 years.
As the operator sits there and mutters — “That can’t be right?”
We can hope the new speed and precision will uncover some impossible solutions that will make someone someplace ask a few questions.
“is this real or is this Memorex?”
Larry

HarvestMoon
August 24, 2009 9:56 pm

Andrew S (17:23:15) :
You’ve got spend money to make money.
============
Too bad governments don’t make money in a business sense, they only print it.

Mom2girls
August 24, 2009 10:07 pm

Hope they’ve got the F77 compiler tuned. That’s what Hansen’s code is.
Now if they could only tune Hansen…

pwl
August 24, 2009 10:15 pm

F77, ahem, state of the art. Choke.
How about Mathematica, which is something that can provide proofs of the math on demand?

rbateman
August 24, 2009 10:33 pm

The Discovery System will now demand all controls be routed through it. “Dave, I can save the world now. I see it all so clearly”.
So, with an output already known in advance, what are they REALLY doing with this Monstrosity?

Frank Kotler
August 24, 2009 10:35 pm

I went to see the Gypsy Woman. She told me I would meet a tall dark stranger, and that it would be a really hot day. I wasn’t sure, so I hired another 10,000 Gypsy Women to improve the accuracy of my forecast… (no insult intended to the Roma people)
Best,
Frank

crosspatch
August 24, 2009 10:36 pm

“Hope they’ve got the F77 compiler tuned. That’s what Hansen’s code is.”
Last I heard there was even a little bit of Python code tossed in just for kicks.
Oh, and completely off topic, I had a chance to look at Roger Pielke Sr.’s blog today and noticed this statement concerning Gavin Schmidt’s understanding of boundary layer physics:

Clearly, however, despite clear evidence of his inadequate lack of knowledge of boundary layer physics, he elected to be the authority on our research papers.

I wanted to take issue with that statement and tell him that I believe that Schmidt has an entirely adequate lack of knowledge.

Brandon Dobson
August 24, 2009 10:44 pm

Readers of WUWT will enjoy Dr. John Everett’s http://www.climatechangefacts.info/ for more information on the IPCC, climate modeling, and other aspects of climate science with an unbiased emphasis on the facts. Climate realists will appreciate the contentious folly of AGW theory as presented by Dr. Everett.

Claude Harvey
August 24, 2009 10:52 pm

Now GISS can chase their tails at the speed of light! “Aye aye, Captain Hansen! Warp speed it is!”
CH

Gene Nemetz
August 24, 2009 11:22 pm

Robin W (19:24:14) : “daisy, daisy give me your answer do…….”
Just what do you think you’re doing Dave?

Clarity2009
August 24, 2009 11:36 pm

Now why in the world would they need this if the science was settled?
Interesting that NASA is fighting for its very life but still pissing away countless dollars on computers to perpetuate a fraud.

Gene Nemetz
August 24, 2009 11:48 pm

Ya, but can it beat Kasparov at chess?
If they cheat like with Deep Blue…..maybe.

E.M.Smith
Editor
August 24, 2009 11:56 pm

Andrew S (17:23:15) :
You’ve got spend money to make money.

Nah… It’s the government. Their macro economic rules are different. For them, it really is:
“You’ve got to make money to spend money”
Or more correctly:
“You’ve got to print money to spend money”
Though today, the “printing” is mostly just by creating bits in certain databases in particular bank computers…
So more precisely:
“You’ve got to computer simulate money to spend money”
BTW, true story: When a certain past president decided to “Freeze Iranian Assets” this was accomplished by posting a paper note on the PDP-11 at the Fed that handled international transfers at the time. When the appropriate computer person came on shift, they toggled the keys that disabled Iran’s bits from being moved to any other computer…
Aren’t computers wonderful?…

Gene Nemetz
August 24, 2009 11:58 pm

Gary P (20:01:33) :
What are the odds that the local USHCN Stevenson Screen thermometer box is sitting just outside the air conditioning units for this new toy?
That would be par for the course.

Derek Walton
August 25, 2009 12:01 am

H/T to Douglas Adams…
When Deep Thought produced the answer 42, the solution was to build a bigger and better computer to find out what the actual question was:
“I speak of none other than the computer that is to come after me,” intoned Deep Thought, his voice regaining its accustomed declamatory tones. “A computer whose merest operational parameters I am not worthy to calculate — and yet I will design it for you. A computer which can calculate the Question to the Ultimate Answer, a computer of such infinite and subtle complexity that organic life itself shall form part of its operational matrix. And you yourselves shall take on new forms and go down into the computer to navigate its ten-million-year program! Yes! I shall design this computer for you. And I shall name it also unto you. And it shall be called … The Earth.”
Is that where this is going to? Will GISS get out and look at the Earth before issuing a proclamation, or is it all models?

Rhys Jaggar
August 25, 2009 12:17 am

The role of Mr Hansen in all this?

Boudu
August 25, 2009 1:19 am

But is it a Mac or PC ?

August 25, 2009 1:21 am

I take exception to the following statement:
“This new computing system represents a dramatic step forward in performance for climate simulations.”
It is however an interesting insight into the mindset of climate modellers and their perceptions of “performance”.

Piggy Bank
August 25, 2009 1:28 am

What a waste of government money (oops – I meant TAXPAYER money). For what they plan to do they could have just trotted on down to ToysRUs and for $22.95 gotten a genuine glow in the dark Ouija board (http://www.toysrus.com/product/index.jsp?productId=2266493&CAWELAID=107526667) which would generate the answers they want a lot cheaper, faster, and without all that messy programming.

Highlander
August 25, 2009 1:36 am

The comment was:
—————
FatBigot (16:54:07) :
Why are they bothering? For years they’ve been telling us they’ve got the answer already.
—————
.
Well, ya see? It’s as this: They tell =US= that =THEY= know what they’re talking about, and that’s so there aren’t any questions in people’s minds.
.
However, the real truth is just this: THEY realize that =THEY= don’t really know what =THEY= are talking about.
.
Ergo the need for more money to prove that they DON’T know what they didn’t know and CAN’T ever know: What they don’t know.
.
Got that?
.
Otherwise the program they use will only lead to the same conclusions, only faster and with greater resolution.
.
Kinda sorta like four-wheel drive versus two-wheel drive: With four driving wheels, if you don’t know what you’re doing, you’ll get into trouble twice as fast!
.

Alan the Brit
August 25, 2009 1:42 am

Douglas DC (17:11:53) :
42-now what was the question?
ht to “Hitchhiker’s Guide….”
Look, it’s in black & white, 6 x 9 = 42! Always has, always will, me bucko!
I can see the transatlantic spats now, those jolly chaps & chapesses in the Met Office chanting, “Na na na na na! Our Deep Thought is better than your Deep Thought!”
As to costs, well the Met Office have very kindly spent £30M of money that isn’t theirs on their new super-computer, plus all the salaries to go with it, I understand that place is like a glorified holiday camp, have a guess at what your lot are spending that isn’t theirs! $50M+? Governments have no money, it’s taxpayers money folks, which they remove by legal force from your wallets!
AND just how will they know that what they get out of the end of it is right or meaningful? Well they don’t actually, & after all that isn’t the point either. They just need lots of evidence to point the finger. The UK Government spends £Ms on Focus Groups & fake charities that prepare reports that just happen to endorse its policy proposals so that they can raise taxes after promising they wouldn’t do it! It’s simple really & this is how benign totalitarianism works: they create a solution to a non-existent problem, define the problem, then legislate against it. You’re next!

August 25, 2009 1:44 am

..and all they get is some kind of hockey stick, or curves obediently following the CO2 curve.

Joe Soap
August 25, 2009 1:51 am

That’s a seriously impressive amount of processing power, and probably enough to make Microsoft’s FSX playable.

Rhyl Dearden
August 25, 2009 4:00 am

Are they going to power these giants with alternative power sources?

Bruckner8
August 25, 2009 4:49 am

Isn’t it ironic that the warmists need this cutting edge technology, and at the same time they’re using this cutting edge technology to enact legislation to curtail the use of such technology, and at the same time the deniers are slamming the warmists for using cutting edge technology?

August 25, 2009 5:31 am

Why not use the now well-established, peer-reviewed economic models used in academic research and simulate the DOW of the future.
Estimating the DOW value of 2100 should not be any problem if you use the latest supercomputer instead of a single PC.
The more powerful the computer, the more accurate the result should be, right?

Craigo
August 25, 2009 5:38 am

Is this all really necessary when all it will do is draw more curves and then plot more points that correlate with something somewhere that will be peerreviewedpublishedinajournal as the third letter from Michael to the Gavinites followed shortly by rigorous analysis by the Apostates who will once again show the abuse of the “statistical method”.
This AGW really is a cash cow getting a good milking which will be followed by some cap ‘n tax or ets that will squeeze the last drops from the shriveled udder.

Nogw
August 25, 2009 5:57 am

That thing will need some intelligent input, do they have it?

Harold Vance
August 25, 2009 7:10 am

Does this make it the world’s fastest cherry-picking machine?

John Luft
August 25, 2009 7:28 am

So NASA will now get the wrong answers faster? Reminds me of the blurb about coffee…..drink lots and do stupid things faster.

Nogw
August 25, 2009 7:34 am

A horoscope of future climate would be better; meanwhile the old sun is parked somewhere out there, resting.

John G
August 25, 2009 7:37 am

Wow, nobody will be able to question the validity of the model results obtained on that baby (/sarcasm). Seriously, when will I be able to buy a desktop version?

Don S.
August 25, 2009 9:15 am

Hey GISS, what’s the density altitude and temp going to be at

Don S.
August 25, 2009 9:20 am

Hey GISS, what are the density altitude and temp going to be at MSO at 1200Z tomorrow? The air tanker guys could save a lot of time and labor if you could let them know.

Don S.
August 25, 2009 9:24 am

The question is: How do I turn it on?

Gene
August 25, 2009 9:48 am

sky: “Can’t wait to compare convective cloud clusters generated by GCMs with those observed in the real world.”
The observations will be wrong, of course, and the model right. 🙂
I find it “interesting” (other words that came to mind were far too coarse to post) that they plan to make model runs back a millennium, and forward to 2100. I need only ask: ON WHAT BLOODY DATA?!?!?!!?
I certainly hope someone without an axe to grind can access such high-res simulations so that we can get an independent assessment of how well the simulations replicate reality. I’m betting not very well. Current models do not necessarily do very well for hurricanes, so why should the GCMs.
It’s also nice to see that they have admitted never seeing a well-formed eye wall in prior simulations.
The key would seem to be for the modelers to work with the people who do field research and add computations for “natural factors” and see what happens under this new computational capacity, but I rather have doubts that will occur before I’m pushing up daisies … or the next ice age arrives.

George E. Smith
August 25, 2009 9:53 am

So we can go from garbage in to garbage out much faster than ever before; rah !
Reminds me of an “Idea for Design” in a famous (old) electronics industry magazine. Someone donated an hour of computer time on an IBM 1100 or some such to this “engineer” so he decided to do a Monte Carlo analysis, of the two transistor amplifier circuit he had “Worst case” designed to amplify with a gain of 10 +/-1 .
The results of the analysis said that the load resistor of the main gain stage was the most critical item in the circuit. It also said that the spread of amplifier gains was skewed outside the worst case design limits (how could that be if it is a worst-case design?). So it suggested a 5% change in the value of that load resistor, to restore the gain to its desired value.
Whoop de do; the engineer didn’t do the WCD properly; but more importantly it was a totally lousy circuit to use in the first place; and he could have rearranged the exact same number of circuit components into a much better negative feedback amplifier whose gain would be the ratio of just two resistors (easy to make 10:1 with standard values), and almost independent of the properties of either of the two transistors; a much more robust design.
So a fast computer can perhaps optimise a lousy design; or climate theory; but don’t expect it to come up with a better design or climate theory that a better engineer (climate scientist) could have figured out without the faster computer.
NASA isn’t going to statistical mathematicate its way to a better climate science explanation; so why the bigger computer.

Nogw
August 25, 2009 10:46 am

Some brains at WUWT are faster and more intelligent than that 21st-century childish hardware.

Roy Tucker
August 25, 2009 12:00 pm

pwl:
“Nice one Roy, may I post your comparison steps elsewhere?”
Be my guest. Permission granted. ;^)

Dave Andrews
August 25, 2009 12:41 pm

““This new computing system represents a dramatic step forward in performance for climate simulations.””
Is this tacit acknowledgement that current climate models are not too good in their simulations?

Thomas J. Arnold.
August 25, 2009 1:04 pm

Should it be tested by retro-forecasting? Take a location’s surface weather station(s) and feed in inputs from the last 7 days/month/year, then test the computer to see if it can mimic the climate inputs and outputs for that particular week/month/year – are NASA up for it??
I do feel that all this ‘power’ should be used for something more tangible. If it came up with projections, say, that the climate is not warming, would they (NASA) release the data?? – or am I being ingenuous?
There must be a tendency for the computer experts to be so completely immersed in the subject and intricacies of the engineering, to miss entirely the objective, like being lost in cyberspace.
I am not sure that we shall ever be able to predict with any accuracy this beautiful and terrible climate of our world – or we may just be on the verge……. and something goes off in a corner of Wyoming!!****! – now might I suggest, that we should far more concentrate our energies on volcanic/seismic event prediction.

gofer
August 25, 2009 1:52 pm

Turn it OFF! Isn’t that what we’ve been hearing from the “greenies” concerning our computers? I just read where computers are responsible for millions of tons of GHGs, so the best thing they can do for the environment is to shut down this massive GHG-smoking hunk of earth-destroying junk. We’ve already been told that they are “pretty certain” that we are all going to bake like KFC, so why do they need to do any more of this nonsense? How many light bulbs have I got to change out in order to save enough juice to run this monstrosity?

George E. Smith
August 26, 2009 11:05 am

“”” Peter West (15:02:54) :
See http://seattletimes.nwsource.com/html/boeingaerospace/2009565319_boeing30.html and http://blogs.crikey.com.au/planetalking/2009/07/07/how-the-joint-strike-fighter-and-dreamliner-787-programs-are-compromised-by-similar-project-management-failures/ for some of the triumphs of computer modelling. “””
As I recall, the 787 has yet to get off the ground (commercially) so don’t count that as a success. I think the 737 was a success; the 787 may never be.
And the JSF if they ever build any, will just be displacing a far superior aircraft.
But if you want to buy a product that resulted from computer modelling, you can now get the Logitech “Everywhere MX” mouse that will track (almost) everywhere; even on glass. I can pretty much guarantee that it will NOT work on the Hubble Telescope Primary mirror, even though they won’t let anyone try that; but it works most everywhere else.
George

George E. Smith
August 26, 2009 11:12 am

So as a test of this new teramachine, why don’t they turn it on and have it calculate the GISTEMP anomaly back to the day that James Hansen invented global warming; that would include, of course, calculating from basic physics the actual raw data readings from Hansen’s collection of errant owl boxes that are on Anthony’s “little list”.
I wouldn’t let them advance to positive times BP until they can correctly get the CE and BCE data calculated from Peter Humbug’s model.