NASA powers up for the next UN IPCC

Goddard Space Flight Center recently added 4,128 processors to its Discover high-end computing system, with another 4,128 processors to follow this fall. The expanded Discover will host NASA’s climate simulations for the Intergovernmental Panel on Climate Change. Credit: NASA/Pat Izzo

From NASA, new fossil-fuel-powered toys for the boys.

Related: “Giant sucking sound” heard in Greenbelt, Md., when the switch was thrown.

Remember the day you got a brand-new computer? Applications snapped open, processes that once took minutes finished in seconds, and graphics and animation flowed as smoothly as TV video. But several months and many new applications later, the bloom fell off the rose.

Your lightning-fast computer was no longer fast. You needed more memory and faster processors to handle the gigabytes of new files now stored on your machine.

Climate scientists can relate.

They, too, need more powerful computers to process the sophisticated computer models used in climate forecasts. Such an expanded capability is now being developed at NASA’s Goddard Space Flight Center in Greenbelt, Md.

High-End Computing System Installed

In August, Goddard added 4,128 new-generation Intel “Nehalem” processors to its Discover high-end computing system. The upgraded Discover will serve as the centerpiece of a new climate simulation capability at Goddard. Discover will host NASA’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC), the leading scientific organization for assessing climate change, and other national and international climate initiatives.

To further enhance Discover’s capabilities, Goddard will install another 4,128 Nehalem processors in the fall, bringing Discover to 15,160 processors.

“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of Goddard’s Computational and Information Sciences and Technology Office (CISTO). “This new computing system represents a dramatic step forward in performance for climate simulations.”

Well-Suited for Climate Studies

According to CISTO lead architect Dan Duffy, the Nehalem architecture is especially well-suited to climate studies. “Speed is an inherent advantage for solving complex problems, but climate models also require large memory and fast access to memory,” he said. Each processor has 3 gigabytes of memory, among the highest available today. In addition, memory access is three to four times faster than Discover’s previous-generation processors.
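
As a rough illustration of the point that memory bandwidth, not just clock speed, is the figure of merit for climate codes, the short Python sketch below (purely illustrative, assuming numpy is installed; not anything NASA ran) estimates effective streaming bandwidth on whatever machine executes it:

    # Crude streaming-bandwidth probe. Climate codes sweep large arrays,
    # so sustained GB/s matters more than GHz.
    import time
    import numpy as np

    n = 200_000_000                      # ~1.6 GB of float64 values
    a = np.ones(n)
    t0 = time.perf_counter()
    b = 2.0 * a                          # one streaming read plus one streaming write
    dt = time.perf_counter() - t0
    print(f"effective bandwidth: ~{(a.nbytes + b.nbytes) / dt / 1e9:.1f} GB/s")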

In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared with other nationally recognized high-end computing systems. The new computational capabilities also allowed NASA climate scientists to run high-resolution simulations that reproduced atmospheric features not previously seen in their models.

For instance, “features such as well-defined hurricane eyewalls and convective cloud clusters appeared for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.”

IPCC Simulations

For the IPCC studies, scientists will run both longer-term and shorter-term climate projections using different computer models. A climate model from the Goddard Institute for Space Studies will perform simulations going back a full millennium and forward to 2100. Goddard’s Global Modeling and Assimilation Office will use a climate model for projections of the next 30 years and an atmospheric chemistry-climate model for short-term simulations of chemistry-climate feedbacks. The IPCC will use information from climate simulations such as these in its Fifth Assessment Report, which the IPCC expects to publish in 2014.

NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.

News Release from NASA here

NASA Expands High-End Computing System for Climate Simulation

GREENBELT, Md., Aug. 24 /PRNewswire-USNewswire/ — NASA’s Goddard Space Flight Center in Greenbelt, Md., made available to scientists in August the first unit of an expanded high-end computing system that will serve as the centerpiece of a new climate simulation capability. The larger computer, part of NASA’s High-End Computing Program, will be hosting the agency’s modeling contributions to the Intergovernmental Panel on Climate Change (IPCC) and other national and international climate initiatives.

The expansion added 4,128 computer processors to Goddard’s Discover high-end computing system. The IBM iDataPlex “scalable unit” uses Intel’s newest Xeon 5500 series processors, which are based on the Nehalem architecture introduced in spring 2009.

Discover will be hosting climate simulations for the IPCC’s Fifth Assessment Report by the Goddard Institute for Space Studies (GISS) in New York City and Goddard’s Global Modeling and Assimilation Office (GMAO). Stimulus funds from the American Recovery and Reinvestment Act of 2009 will enable installation of another 4,128 Nehalem processors this fall, bringing Discover to 15,160 processors.

“We are the first high-end computing site in the United States to install Nehalem processors dedicated to climate research,” said Phil Webster, chief of the Computational and Information Sciences and Technology Office (CISTO) at Goddard. “This new computing system represents a dramatic step forward in performance for climate simulations.”

In preliminary testing of Discover’s Nehalem processors, NASA climate simulations performed up to twice as fast per processor compared to other nationally recognized high-end computing systems. Moreover, the new computational capabilities allow NASA climate scientists to run high-resolution simulations that reproduce atmospheric features not previously seen in their models.

“Nehalem architecture is especially well-suited to climate studies,” said Dan Duffy, CISTO lead architect. “Speed is an inherent advantage for solving complex problems, but climate models need large memory and fast access. We configured our Nehalem system to have 3 gigabytes of memory per processor, among the highest available today, and memory access is three to four times faster than Discover’s previous-generation processors.”

In daily forecasts for NASA satellite missions and field campaigns, the GMAO typically runs its flagship Goddard Earth Observing System Model, Version 5 (GEOS-5) at 27-kilometer resolution. Using Discover’s new processors, the GMAO has been testing a special “cubed-sphere” version of GEOS-5 at resolutions as high as 3.5 kilometers.

“Once the model goes below 10-kilometer resolution, features such as well-defined hurricane eyewalls and convective cloud clusters appear for the first time,” said William Putman, acting lead of the Advanced Software Technology Group in Goddard’s Software Integration and Visualization Office. “At these cloud-permitting resolutions, the differences are stunning.” Putman has been collaborating with GMAO modeling lead Max Suarez and others on the cubed-sphere configuration of GEOS-5.
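
A back-of-the-envelope count shows why crossing below 10-kilometer resolution is such a leap. The Python sketch below uses a simplified uniform-grid estimate (the actual cubed-sphere grid of GEOS-5 differs in detail, so treat the numbers as order-of-magnitude only):

    # Rough horizontal cell counts at the resolutions quoted above,
    # treating the grid as uniform squares over Earth's surface.
    EARTH_SURFACE_KM2 = 510_072_000      # approximate surface area of Earth

    def approx_cells(res_km):
        return round(EARTH_SURFACE_KM2 / res_km ** 2)

    for res_km in (27.0, 10.0, 3.5):
        print(f"{res_km:5.1f} km: ~{approx_cells(res_km):,} cells per level")

    # 27 km -> 3.5 km is ~60x more cells per level, before counting the
    # shorter stable time step that finer grids also require.
    print(f"ratio: ~{(27.0 / 3.5) ** 2:.0f}x")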

NASA’s IPCC simulations will include both longer- and shorter-term climate projections using the latest versions of the GISS and GMAO models. GISS ModelE will perform simulations going back a full millennium and forward to 2100. Making its first IPCC contributions, the GMAO will focus on the next 30 years and perform decadal prediction simulations using GEOS-5 and atmospheric chemistry-climate simulations using the GEOS Chemistry Climate Model. With the performance of GEOS-5 on the Nehalem processors, investigations of societal relevance, such as climate impacts on weather extremes, now reach a new level of realism. The IPCC’s Fifth Assessment Report is due to be published in 2014.

NASA climate simulation efforts also contribute to the U.S. Global Change Research Program, the U.S. Integrated Earth Observation System, and the U.S. Weather Research Program. Supported international programs include UNESCO’s Intergovernmental Oceanographic Commission, the United Nations Environment Programme, the World Climate Research Programme, the World Meteorological Organization, and the World Weather Research Programme.

About the system (link)

No mention of power consumption in the specs, though this internal report suggests “dramatically increased power consumption” at the Ames facility when similarly upgraded.

IBM iDataPlex Cluster: Discover Scalable Units 3 & 4

The NASA Center for Computational Sciences (NCCS) expanded its “Discover” high-end computer by integrating a 4,096-core IBM iDataPlex cluster, for a combined performance of 67 teraflops. The NCCS plans to at least double the iDataPlex processor count during 2009. Credit: Photo by NASA Goddard Space Flight Center/Pat Izzo.

Discover

The Discover system is an assembly of multiple Linux scalable units built upon commodity components. The first scalable unit was installed in the fall of 2006, and the NCCS continues to expand this highly successful computing platform.

Discover derives its name from the NASA adage of “Explore. Discover. Understand.”

Discover System Details

System Architecture
As mentioned above, the system architecture is made up of multiple scalable units. The list below describes the total aggregate components of the system and its individual scalable units.

Aggregate
  • 49 Racks (compute, storage, switches, and more)
  • 111 Tflop/s
  • 11,032 Total Cores
File System and Storage
  • IBM GPFS
  • 2.46 PB Storage
Operating Environment
  • Operating System: SLES
  • Job Scheduler: PBS
  • Compilers: C, C++, Fortran (Intel and PGI)

Individual Scalable Units

Base Unit
  • Manufacturer: Linux Networx/SuperMicro
  • 3.33 Tflop/s
  • 520 Total Cores
  • 2 dual-core processors per node
  • 3.2 GHz Intel Xeon Dempsey (2 flop/s per clock)
  • Production: 4Q 2006
Scalable Unit 1
  • Manufacturer: Linux Networx/Dell
  • 10.98 Tflop/s
  • 1,032 Total Cores
  • Dell 1950 Compute Nodes
  • 2 dual-core processors per node
  • 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 2Q 2007
Scalable Unit 2
  • Manufacturer: Linux Networx/Dell
  • 10.98 Tflop/s
  • 1,032 Total Cores
  • Dell 1950 Compute Nodes
  • 2 dual-core processors per node
  • 2.66 GHz Intel Xeon Woodcrest (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2007
Scalable Unit 3
  • Manufacturer: IBM
  • 20.64 Tflop/s
  • 2,064 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2008
Scalable Unit 4
  • Manufacturer: IBM
  • 20.64 Tflop/s
  • 2,064 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.5 GHz Intel Xeon Harpertown (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 4Q 2008
Scalable Unit 5
  • Manufacturer: IBM
  • 46.23 Tflop/s
  • 4,128 Total Cores
  • IBM iDataPlex Compute Nodes
  • 2 quad-core processors per node
  • 2.8 GHz Intel Xeon Nehalem (4 flop/s per clock)
  • Interconnect: Infiniband DDR
  • Production: 3Q 2009
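
The per-unit Tflop/s figures in the list follow directly from cores × clock × floating-point operations per clock. A quick Python check (the summed peak comes to about 112.8 Tflop/s, a bit above the 111 Tflop/s quoted in the aggregate, so the published totals appear to be rounded or slightly derated):

    # Recompute each scalable unit's theoretical peak from the specs above:
    # peak (Tflop/s) = cores * clock (GHz) * flops per clock / 1000
    units = [
        ("Base Unit (Dempsey)",  520, 3.20, 2),
        ("SU1 (Woodcrest)",     1032, 2.66, 4),
        ("SU2 (Woodcrest)",     1032, 2.66, 4),
        ("SU3 (Harpertown)",    2064, 2.50, 4),
        ("SU4 (Harpertown)",    2064, 2.50, 4),
        ("SU5 (Nehalem)",       4128, 2.80, 4),
    ]

    total = 0.0
    for name, cores, ghz, flops_per_clock in units:
        tflops = cores * ghz * flops_per_clock / 1000
        total += tflops
        print(f"{name:22s} {tflops:6.2f} Tflop/s")
    print(f"{'Sum':22s} {total:6.2f} Tflop/s")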

IBM Test System: Schirra

The High-End Computing Capability Project purchased this 6-teraflop IBM POWER5+ system as part of its effort to evaluate next-generation supercomputing technology that will eventually replace the Columbia supercomputer and meet NASA’s future computing needs.

Hyperwall-2 Visualization System

The 128-screen hyperwall-2 system is used to view, analyze, and communicate results from NASA’s high-fidelity modeling and simulation projects, leading to new discoveries.

Climate Simulation Computer Becomes More Powerful
08.24.09

New high-resolution climate simulations reproduce atmospheric features not previously seen in NASA computer models. Moving from 27-kilometer resolution (top) to 3.5-kilometer resolution (center) yields cloud clusters like those seen in observations from NOAA’s latest GOES satellite (bottom). Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin

A 24-hour simulation run at progressively higher resolutions captures finer and finer details of cloud “vortex streets” moving over Alaska’s Aleutian Islands. A vortex street is a train of alternating clockwise and anticlockwise eddies (swirls) shed in a flow interacting with some structure. Credit: NASA/William Putman, Max Suarez; NOAA/Shian-Jiann Lin

126 Comments

Frank K.
August 24, 2009 7:02 pm

By the way, you can add comments to the article that Anthony linked to at NASA
http://www.nasa.gov/topics/earth/features/climate_computing.html
I just did – we’ll see if they post my comment tomorrow. I simply asked them to state how much taxpayer money was spent on the new processors.

pwl
August 24, 2009 7:04 pm

No matter how many bones and entrails soothsayers use, no matter how shiny their rocks, no matter how pretty the pictures on their astrology cards, no matter how the dice sound when they roll the i-ching, no matter how spiffy the processors nor how fast nor which software language they use, no matter what, climate scientists (oh sorry, climate soothsayers and doomsayers) will not be able to accurately simulate the ACTUAL and REAL WORLD weather and climate systems of Earth. It’s just not possible to accurately simulate, with predictive results of any value, a real system that needs to play itself out in real time as the climate and weather systems do. This was proved mathematically and logically by Stephen Wolfram, A New Kind of Science, Chapter 2. How come this is? It’s because certain systems – such as weather and climate – generate randomness from within the system itself as part of its inherent Nature; this isn’t randomness from initial conditions, nor from chaos theory, it’s randomness that is inherent in the system itself. Can’t simulate them; they must play out in real time in the real objective reality. It’s actually even worse… given the vast number of different systems that must be integrated into a whole earth simulation, and that many of those systems also have inherent randomness in them.
So soothsay and doomsay away, NASA and IPCC. Gotta love those shiny new computers. Heck, what am I saying? I could get a high-paying programming systems analysis job there… nope, resist the temptation… [:|]
Now what to do with that spiffy parallel computer that NASA has that could actually contribute to the world?
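
[Note: the “randomness from within the system” invoked above, citing A New Kind of Science, Chapter 2, is usually illustrated with Wolfram’s Rule 30 cellular automaton: a fully deterministic update rule whose center column nonetheless looks statistically random. Purely as an illustration of that concept, and saying nothing either way about climate models, here is a minimal Python sketch of Rule 30:]

    # Rule 30: each new cell = left XOR (center OR right), on a ring of cells.
    # Deterministic rule, single seed cell, yet the output looks random.
    WIDTH, STEPS = 79, 40
    cells = [0] * WIDTH
    cells[WIDTH // 2] = 1                # single seed cell in the middle

    for _ in range(STEPS):
        print("".join("#" if c else " " for c in cells))
        cells = [
            cells[(i - 1) % WIDTH] ^ (cells[i] | cells[(i + 1) % WIDTH])
            for i in range(WIDTH)
        ]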

Miles
August 24, 2009 7:13 pm

What’s the carbon footprint of that monstrosity?

Justin Sane
August 24, 2009 7:15 pm

If they’d only spend 1/100000000 of the money on open minds instead of closed ones. Like someone else said, why bother? The science is in, now move along — nothing to see here.

DaveE
August 24, 2009 7:16 pm

pwl (19:04:11) :

Now what to do with that spiffy parallel computer that NASA has that could actually contribute to the world?

Maybe Boeing could have made better use of it 😀
DaveE.

August 24, 2009 7:23 pm

OK boys. Let’s see how this baby does. How much of a temperature increase was it that you wanted…

Robin W
August 24, 2009 7:24 pm

I imagine that if this monstrous computer gives NASA’s “climate scientists” simulation results that don’t support their biases, they’ll be in the control room with hacksaws and the Discover system will be singing “Daisy, Daisy, give me your answer, do…”

August 24, 2009 7:28 pm

Does anyone know what this cost me? I’m fed up with the government wasting my money.
Jesse

Dave Dodd
August 24, 2009 7:29 pm

My father-in-law has spent much of his eight decades on this planet raising cows in East Texas. He can spit his tobacco juice in the sand (watch your foot!), stare at it for a moment and predict Mother Nature’s intentions for the coming growing season with startling accuracy! He’s never met a teraflop, but can relate many a story about cowflop. He’s more afraid of his Congressman than he is of cow farts. NASA should ask him…

August 24, 2009 7:30 pm

Another GIGO…GIGO….GIGO Vote!
Remember the MAINE!
Remember the Boeing 787!
Remember the Connecticut 1978 Ice Arena Roof Collapse (first totally “computer” designed truss roof structure…)

David L. Hagen
August 24, 2009 7:31 pm

Climate models will ultimately depend on quantifying the magnitude and even the sign of water feedback. Quantitatively evaluating this will require very detailed modeling of clouds and precipitation rather than high-level models with numerous assumptions. That is the greatest opportunity for these new “toys”.
McIntyre reports that these papers are being presented at: Erice 2009

CLIMATE & CLOUDS
FOCUS: Sensitivity of Climate to Additional CO2 as indicated by Water Cycle Feedback Issues
Chairman A.Zichichi – Co-chair R. Lindzen
10.30 – 12.00
SESSION N° 9
* Dr. William Kininmonth
Australasian Climate Research, Melbourne, Australia
A Natural Constraint to Anthropogenic Global Warming
* Professor Albert Arking
Earth and Planetary Sciences Dept., Johns Hopkins University, Baltimore, USA
Effect of Sahara Dust on Vertical Transport of Moisture over the Tropical Atlantic and its Impact on Upper Tropospheric Moisture and Water Vapor Feedback.
* Dr. Yong-Sang Choi
Earth, Atmospheric and Planetary Sciences Dept., MIT, Cambridge, USA
Detailed Properties of Clouds and Aerosols obtained from Satellite Data
* Professor Richard S. Lindzen
Department of Earth, Atmospheric & Planetary Sciences, MIT, Cambridge
On the Determination of Climate Feedbacks from Satellite Data
* Professor Garth W. Paltridge
Australian National University and University of Tasmania, Hobart, Australia
Two Basic Problems of Simulating Climate Feedbacks

See also
Satellite and Climate Model Evidence Against Substantial Manmade Climate Change by Roy W. Spencer, Ph.D.
December 27, 2008 (last modified December 29, 2008)
Otherwise GIGO will become “FGIFGO” (Faster Garbage In Faster Garbage Out).

gt
August 24, 2009 7:35 pm

Is the new system powered by solar power? I hope it doesn’t create too large of a carbon footprint.

Layne Blanchard
August 24, 2009 7:40 pm

All that power, and they already know the answer…..

Gary P
August 24, 2009 8:01 pm

What are the odds that the local USHCN Stevenson Screen thermometer box is sitting just outside the air conditioning units for this new toy?
The beauty of that is all they need to do is turn it on to prove global warming!

August 24, 2009 8:03 pm

Wow, a great post. Interesting to know about environmental actions taken by NASA.

timetochooseagain
August 24, 2009 8:09 pm

NASA just described climate projections as “forecasts.” I was under the impression that this was a big no-no that only denier cave men would do.

timetochooseagain
August 24, 2009 8:10 pm

Gary P (20:01:33) : Nice Double Entendre.

LarryT
August 24, 2009 8:12 pm

The new computer power will enable NASA to get garbage in/garbage out much faster, as long as the Goddard Institute for Space Studies has Hansen and his faulty programming working for them.

Roy Tucker
August 24, 2009 8:17 pm

I’m a bit dismayed about how computer models have come to be more important than actual observations and so I offer a formal statement of the Scientific Computer Modeling Method.
The Scientific Method
1. Observe a phenomenon carefully.
2. Develop a hypothesis that possibly explains the phenomenon.
3. Perform a test in an attempt to disprove or invalidate the hypothesis. If the hypothesis is disproven, return to steps 1 and 2.
4. A hypothesis that stubbornly refuses to be invalidated may be correct. Continue testing.
The Scientific Computer Modeling Method
1. Observe a phenomenon carefully.
2. Develop a computer model that mimics the behavior of the phenomenon.
3. Select observations that conform to the model predictions and dismiss as inadequate in quality those observations that conflict with the computer model.
4. In instances where all of the observations conflict with the model, “refine” the model with fudge factors to give a better match with pesky facts. Assert that these factors reveal fundamental processes previously unknown in association with the phenomenon. Under no circumstances willingly reveal your complete data sets, methods, or computer codes.
5. Upon achieving a model of incomprehensible complexity that still somewhat resembles the phenomenon, begin to issue to the popular media dire predictions of catastrophe that will occur as far in the future as possible, at least beyond your professional lifetime.
6. Continue to “refine” the model in order to maximize funding and the awarding of Nobel Prizes.
7. Dismiss as unqualified, ignorant, and conspiracy theorists all who offer criticisms of the model.
Repeat steps 3 through 7 indefinitely.

pwl
August 24, 2009 8:18 pm

“What’s the carbon footprint of that monstrosity?”
It’s its own Global Warming Heat Island!
The carbon footprint of this monstrosity will tilt the planet off its orbit so we dive into the Sun or smash into Mars….

Squidly
August 24, 2009 8:22 pm

Like any computer geek that may be a reader/contributor here at WUWT, I would love to have one of those babies myself. However, I cannot help but be left with a sick feeling in my stomach when I consider the enormous resources being applied to this endeavor as I browse through the ModelE code, realizing that this is the garbage that will be running on that beautiful piece of machinery. I am certainly a huge proponent of mass computing power, as I would consider myself one of the premier gadget freaks, but then I look at the unfathomable amount of debt piling up: Stimulus Bill, OmniBus Bill, Bailout Bills, More Bailout Bills, Cash for Clunkers mess, Climate Bill, upcoming Health Care Bill. When does it end? We are already bankrupt! About 6 years ago I was approaching bankruptcy, and the last thing that was on my mind was popping out to Alienware and buying a sporty new top-of-the-line play toy. I’m sorry, but with all that is going on in our country right now, and the fact that NASA will still be relying on people like Gavin Schmidt and James Hansen to program, operate, and massage the data from this cluster-F, I simply cannot endorse such a notion, and I believe this is just one more example of our fine government waste in action. To gain a little perspective on this, try this link on for size: http://usdebtclock.org/ … tough to get too excited about spending more money on toys when you put it into perspective, don’t you think?
Sorry to be such a downer here… it’s just that sometimes reality can smack you in the face pretty hard.

August 24, 2009 8:29 pm

But do they have enough compute power to take into account the sun?
I doubt this will be enough compute power to fill the knowledge gap. What man knows about the earth’s climate would fill a book; what man doesn’t know would fill a library. Until someone finds the missing library full of books, it sounds like it will be nothing more than a waste of badly needed electricity.
In the case of AGW, the theory predicts that there should be a measurable hotspot in the upper troposphere. Observations from satellites and weather balloons have been made, and no hotspot has been found. In a less politicized field, this would have been the end of it.
Need new theory.

pwl
August 24, 2009 8:34 pm

Nice one Roy, may I post your comparison steps elsewhere?

Ken G
August 24, 2009 8:35 pm

That’s great. Now the models can be wrong faster.
Do the new systems address their lack of QC? Because that seems to be what’s lacking and probably would have been money better spent.

Christian Bultmann
August 24, 2009 8:38 pm

FORTRAN ???
So James can run programs like this real fast.
program a_warming_world
  print *, "The Planet is getting warmer!"
end program a_warming_world
And then proclaim it’s evidence.