Friday Funny – New NOAA supercomputer "Gaea" revealed

From the Atomic City Underground blog comes word that a big kahuna of komputing is about to go online.

The Cray XK6 supercomputer is a trifecta of scalar, network and many-core innovation. It combines Cray’s proven Gemini interconnect, AMD’s leading multi-core scalar processors and NVIDIA’s powerful many-core GPU processors to create a true, productive hybrid supercomputer. Here’s the factory photo before customization:

Reporter Frank Munger writes:

Cray recently delivered the final 26 cabinets of the National Oceanic and Atmospheric Administration’s Gaea climate research supercomputer, which is housed at Oak Ridge National Laboratory. The newly arrived cabinets are loaded with the new AMD 16-core Interlagos processors. According to Jeff Nichols, an associate lab director at ORNL who heads the computational science directorate, the Gaea system is still in two pieces. The first piece is the original 14-cabinet system with a peak capability of 260 teraflops, Nichols said. The second piece is the new 26-cabinet system with a capability of 720 teraflops, he said.

After the first piece is upgraded in the spring and the two pieces are integrated into one system, Gaea will become a 1.1 petaflops supercomputer, ORNL’s computing chief (who returned from a visit to China last week, where he spoke at a conference) said.

Here’s what it looks like at Oak Ridge National Laboratory; note the Earth-themed graphics:

Photo by Jay Nave of Oak Ridge National Laboratory

While that door panel artwork was well received, word has it that the artwork has since been changed to better represent the I/O and stage-by-stage processing that takes place in this new system. Here’s the upgraded artwork and a brief description of what each processing cabinet does:

Processing starts at left and finishes at the far right – click image to enlarge to see the detail

Below is the guide, from left to right: cabinet number, processing description

1. Input stage: takes data and bags, boxes, and bins it for distribution

2. Mannomatic stage: chooses which data to use, discards inappropriate data, adds proxy data where none exists, splices on new data to data that was truncated in stage 1.

3. Kevinator stage: approves processed data from stage 2, declares it “robust” using a special stamping system.

4. Hansenizer stage: Fits approved data from stage 3 to three model scenarios to match a “best fit”, applies additional corrections to elevate data for use by stage 5.

5. Gavinotron stage: Chooses data from the Giga Hansenized Climate Numbers (GHCN) to combine with Hansenized three scenario data, extrapolates data from 70°N to 90°N to fill the Gaea Global Model.

6. Humbertian Harmonizer Stage: Using bellows and a random walk, data is wheezed out to stage 7.

7. Karl Konfabulator Stage: Assigns value to the data to report to Congress, ensuring that the data will be more valuable next year. Monitors power use, sends bills out to taxpayers.

8. Peterson Percolator Stage: Collates the data into inaccessible data furrows buried deep underground in Asheville, North Carolina, where the “secret sauce” is applied before percolating the data back to the surface.

9. Wigley Wombulator Stage: The data is shipped from Asheville to NCAR in Boulder via a secure optical link, where the gatekeeper switch of the wombulator decides how much of it to pass on to CRU via the insecure POTS circuit from Boulder to Norwich. Only data with signed non-disclosure agreements is passed on.

10. JonesiFOIAler Stage: Data received from the Wigley Wombulator is then hidden, and signed non-disclosure agreements for the data are sent to the top of the paper pile in Jones’ office, to be located by Sherpas mounting the paper summit hired by OSU’s Lonnie Thompson at some future date.

11. Briffabrowser stage: Here, the data is examined, and error flags are sent back up the processing line to all other processing stages using email. The other stages reply that the error flags don’t matter, and consensus is reached, allowing the data to be passed on to stage 12 after the emails are made public.

12. MUIRer (Make Up Independent Rationalizations) Russelizer Stage: Data and emails flagging questionable data are noted, given a brief talking-to, and then passed on with no questions asked, along with a “CERTIFIED A-OK” letter of endorsement.

13. Tiljander Inverter Stage: As a quality control check, portions of the A-OK data are inverted by the upside-down Mannomatic, looked at in a mirror, then declared still usable.

14. Serializer Stage: The final A-OK upside-down data is sent to the IPCC, where it is then returned by Indian handmaidens to the potboiling center at Almora and washed repeatedly in hot water.

15. The Osterizer Stage: The IPCC Almora hot water washed data is then blended repeatedly until it reaches a fine homogenized puree.

16. The RealClimatizer Stage: Here the data undergoes public examination under the intense scrutiny of thousands of like-minded individuals, er, identical processors. Any tiny flecks of data that may remain that don’t constitute a pure product are picked off and routed into the borehole disposer.

17. The Cloudifier Stage: Data patterns are compared to an online satellite photo database of clouds to see if there might be any correlation. Any matches are sent back to stage 16 for disposal in the borehole.

18. The SOL Stage: Effects of sunlight on the data are removed.

19. The data is run through the final AlGoreithm, the CLOud and Weather Neutralizer (CLOWN) to ensure the final data has no remaining “weather not climate” residuals, given a happy demeanor and sent on to the final stage.

20. Output Stage: This cabinet, identical to the Input Stage 1, ejects the data in a composted form, suitable for academic consumption.

More information:

Gaea is NOAA’s prime supercomputing resource, and it will become the third petascale machine housed at ORNL. Jaguar, soon to be morphed into Titan, and Kraken, a National Science Foundation machine, are the others.

Cray XK6 Brochure (PDF)


Mark F

Is there not a stage-stage fudge factor such as 0,0,0,0,….…. that you’ve missed?

Glenn Haldane

Cruel but appropriate.


Whoooooo Weeeeeeee.
I can plays world of warcrafts in betweening modelling the planets weather system to within 10 cents of the projected Million dollar grant the UN gaves me fer being smart and stuff.
Then i’ll scatter cast a trillion tweets and crash the world wide interwebby, just fer fun and giggles. Accordings to the manual, every gigaflop eats up a glacier, hell boy, i’ll have that Hindu kush bone dry before sun down.


Garbage in >> garbage out


in the end it will end up as a chook house


Where can I get one? Amazon is sold out already.

John Marshall

Perhaps this Cray will be intelligent and see through all the fudge and tell these NOAA children that they are talking C**P.
Then perhaps not.


1+1=3 the computer says so!

Disko Troop

Why do I find the word “flop” so appropriate when talking about super computers? Maybe it is just a British thing.

Does this mean that climate scientists will be able to make faulty models and unfalsifiable predictions more quickly?

Richard of NZ

I’m certain there is a stage missing somewhere. It has something to do with Gaea, I think. Ah ha, got it: it’s the Uranus stage, where all of the nasties come out. It should really be the first stage, but in this case it has to be the last.
Anthony and all others who make this blog what it is, I wish you all a merry Christmas and a happy new year. Now, take some time off and relax with family and friends. Speaking purely for myself, the blog can wait for a while so please have a nice break.
Richard (who will celebrate the festival well before most of the readers here and intends to enjoy a vino or three).


I saw the garbage in (GI) but the garbage out (GO) is so far down the line….
Good, class: you now know more than the IPCC, WWF, Greensprout, Tides (which coincidentally owns the city council and mayor of Vancouver, BC, Canada) and every other parasite on the system.
class dismissed


The one with the biggest super computer wins!
Reminder: It might be time for us to make a final non-tax deductible donation to our host and hero, Anthony Watts. I just located the cash can and put in some Merry Christmas wishes. Thanks for all the hard work, Anthony 🙂

Jeroen B.

This looks like a KIBO computer (Knowledge In, Bullshit Out) …


We spent a billion bucks on this! It has to be right!

Peter Miller

Totally believable, except for Stage 15, I think you made that up.

Peter Whale

On e-bay next week.

Talk about overkill: cracking nuts with sledgehammers.
They could have got better results if they had visited Madame Travelle and her crystal ball.
Only $5 per visit, or they could have hired the woman for a mere $30,000 per year.
I have a prediction… I predict that this computer will be too slow in a couple of years’ time.

Peter Whale

Give it to Tallbloke and laugh as the six cops try to take it away.

Aussie Luke Warm

If it is going to be used for their stupid CAGW hoax as your satire (much appreciated – I laughed) implies, what a waste. How much did it cost the US taxpayer?

You missed the random-time generators at each stage for letters requesting more funds, but the rest is structurally sound. Note that the spell checker automatically replaces “does” with “might do”, “will” with “might do,” “is certain to” with “might do”, etc. Finally, a large portion of the memory is allocated to variations of synonyms of “For your eyes only, burn this after reading, don’t even tell XXX”

Luther Wu

With a trend toward the “climate- involved” deciding that they have better things to do, perhaps this Cray will be turned to something actually useful, like running Vijay Pande’s folding@home.
AFAICS, the ORNL machine uses ~4,900X as much power as my own smallish machine, which runs at only ~3.2 TFLOPS, making NOAA’s new tool a bit less than 350 times more powerful than mine at lesser efficiency (2.2 MW plus cooling system vs. 450 W). Do you suppose that taxpayers paid only 350 times as much? 3,500X? 10,000X?
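Those ratios check out arithmetically. Here is a minimal back-of-the-envelope sketch, using the commenter's own (assumed) figures rather than official specs:

```python
# Back-of-the-envelope check of the ratios quoted in the comment above.
# All figures are the commenter's assumptions, not official specs.
gaea_flops = 1.1e15      # ~1.1 petaflops, combined system peak
home_flops = 3.2e12      # ~3.2 teraflops for the home machine
gaea_watts = 2.2e6       # ~2.2 MW, excluding the cooling plant
home_watts = 450.0       # ~450 W

speed_ratio = gaea_flops / home_flops    # a bit under 350x faster
power_ratio = gaea_watts / home_watts    # ~4,900x the power draw

print(f"speed ratio: ~{speed_ratio:.0f}x, power ratio: ~{power_ratio:.0f}x")
print(f"GFLOPS per watt: Gaea {gaea_flops / gaea_watts / 1e9:.1f}, "
      f"home box {home_flops / home_watts / 1e9:.1f}")
```

On these peak numbers the home GPU box actually wins on FLOPS per watt, which matches the "lesser efficiency" remark — though, as a later comment notes, comparing measured x86 work against a peak rating is not quite apples to apples.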

Peter Dunford

This computer will be able to tell us the answer to the great question of life, the universe, and climate sensitivity.
The answer will be 42, but we won’t understand the question.
It will then design a super-duper-computer which will explain the question.
No one should hold their breath waiting for the answer.


It doesn’t matter how much computing power they have. A sufficiently complex chaotic system cannot be simulated. The complexity of the problem grows exponentially; no Moore’s law can solve that.

Terry Warren

In computing terms it’ll be out of date by next Easter.


They seem to have omitted one important stage.
The FOI-erasure inversion.

Another Ian

Charles.U.Farley says:
December 23, 2011 at 1:47 am
1+1=3 the computer says so!
I think you didn’t get it quite straight. In climate science arithmetic we can’t handle 1 + 1, because one part of the answer might be negative and that would attract sceptics.
But we can handle 2 + 2
When it is warming
2 + 2 = 5.5
When it is cooling
2 + 2 = 1.5
Clear now?

Barry Sheridan

Good one.
However, I did wonder how much electricity this beast will consume. Quite a bit, I imagine, which surely is not the image these warmist fanatics ought to be presenting. Turn it off and make do with a jumbo abacus, much more in tune with the message.
Happy Christmas Anthony, Willis and all the other contributors. Let’s hope 2012 will bring a further dose of reality to the debate.

Their first PETA-flop computer with Mann-machine inyourface.

Luther Wu

Forgot to add… my little machine’s performance is measured in actual work being performed via x86, and if measured like the new Cray system, then Gaea (when complete) will be over 600 times as fast as mine. That’s quite a machine they’ve built and I would like to have a shot at it myself, but wouldn’t want to feed it. Good thing Oak Ridge is hooked up to TVA.
The article mentioned that the new architecture uses AMD’s INTERLAGOS, but didn’t mention which nVidia GPUs are deployed. Good to see that Cray is building these machines with an eye to upgrades. Good job Cray.


Now they will be wrong about everything even faster….

db.. says:

Garbage in >> garbage out

Superseded: Garbage in >> Grants out.


grumpyoldmanuk says: December 23, 2011 at 1:57 am

Does this mean that climate scientists will be able to make faulty models and unfalsifiable predictions more quickly?

As a former supercomputer user, I don’t think that “more quickly” is necessarily their goal. I anticipate that they will strive to execute more detailed, higher-precision runs (i.e., by using smaller grid spacings and smaller time steps) of the same basic faulty models. The new predictions should remain as unfalsifiable as before.
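There is a reason more FLOPS buys only modestly finer grids. In an explicit 3-D time-stepping model, halving the grid spacing multiplies the cell count by 8, and stability (the CFL condition) typically forces the time step to halve as well, so each refinement level costs roughly 16x the compute. A toy sketch of that scaling, with illustrative figures only:

```python
# Rough cost scaling for refining an explicit 3-D grid model: halving
# the grid spacing gives 2**3 = 8x the cells, and the CFL stability
# condition typically forces the time step to halve as well, so each
# refinement level costs roughly 16x the compute. Illustrative only.
def relative_cost(refinement_levels: int) -> int:
    cells = 8 ** refinement_levels   # 2x finer in each of 3 dimensions
    steps = 2 ** refinement_levels   # proportionally smaller time step
    return cells * steps             # total work relative to the baseline

costs = [relative_cost(k) for k in range(4)]
print(costs)  # each halving of the grid spacing is ~16x more work
```

So a machine 16 times faster buys just one halving of the grid spacing, not predictions 16 times better.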


Funny, when I first saw the photos, I almost choked.. I thought they were real…then I started reading and realized it was a joke…good one…thanks for the laugh…


From the Cray XK6 Brochure:
Power consumption: 45-54.1 kW per cabinet.

Bruce Cobb

Another Ian says:
December 23, 2011 at 3:12 am
But we can handle 2 + 2
When it is warming
2 + 2 = 5.5
When it is cooling
2 + 2 = 1.5

Yep, and when you average them out you get 2 + 2 = 3.5, with the 0.5 difference from the actual result of 4 representing the “missing heat”. It works!


At 1.1 petaflops, that equates to about 200,000 Pentium 4s @ 3 GHz. So just how big a carbon footprint would that dude leave?
Sure hope Cray got rid of that little doubling problem in their OS for good, when subtracting two very close numbers (read: temperature anomaly calculations) ☺ Just kidding; I think that problem was way back around ’88. About when Hansen first recognized AGW’s existence.
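Two quick checks of that comment. The ~200,000 figure works out if you assume a peak of roughly 5.5 GFLOPS per 3 GHz Pentium 4 (a hypothetical round number; real P4 throughput varied by workload). And the quip about subtracting two very close numbers is a genuine floating-point hazard, sketched here with single-precision temperature anomalies:

```python
import struct

def as_float32(x: float) -> float:
    """Round-trip a value through IEEE-754 single precision."""
    return struct.unpack("f", struct.pack("f", x))[0]

# The ~200,000 figure assumes ~5.5 GFLOPS peak per 3 GHz Pentium 4
# (a hypothetical round number, not a benchmarked one).
p4_equivalents = 1.1e15 / 5.5e9
print(f"~{p4_equivalents:,.0f} Pentium 4s")

# Subtracting two very close numbers really does lose digits: an anomaly
# computed from single-precision absolute temperatures keeps far fewer
# significant figures than the same subtraction in double precision.
temp = 288.1234567          # absolute temperature, kelvin (made-up value)
baseline = 288.1200000
anomaly64 = temp - baseline
anomaly32 = as_float32(temp) - as_float32(baseline)
print(f"float64 anomaly: {anomaly64:.7f}")
print(f"float32 anomaly: {anomaly32:.7f}  (cancellation error)")
```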


🙂 Couldn’t see from the angle of the pic… I guess the last unit has a garbage bin on it?
Merry Xmas to all; let’s hope the New Year provides more mirth as the carbon card house proceeds to fall!


Brilliant, did Al Gore invent it?

Claude Harvey

I would add that the “Hansenizer stage” often bypasses all subsequent stages and just pukes something out onto the floor. The cleanup crew ain’t happy about that.

Alex the skeptic

To justify its cost, this computer must find that it is worser than we had thought when it was worse.

Robert of Ottawa

FLOPS = Floating-Point Operations Per Second

Darren Potter

wermet says: “As a former supercomputer user, I don’t think that “more quickly” is necessarily their goal. I anticipate that they will strive to execute more detailed, higher precision runs ”
Or they will strive to find more combinations of data (lemon picking) to back their GW fraud.
Making Gaea nothing more than GWing climatologists’ latest taxpayer-funded “play toy”.

Darren Potter

“Superseded: Garbage in >> Grants out”
Gaea Supercomputer = More Garbage in >> Grants out Faster


The funniest thing is that this is all built using computing technology manufactured in the Far East, with taxpayer stimulus funds (NOAA got $170 million), and all to solve a non-existent, futile challenge: predicting climate!


You got that wrong.
It is Grants In >> Garbage Out.

Tattoo me as a sceptic

Do they really need to spend that much of our money on a supercomputer when all they need is a simple word processor and protection from Freedom of Information inquiries?


As the density and speed of supercomputers increase, the models can compute on smaller and smaller volumetric areas and possibly use smaller time intervals. The idea that higher resolution makes the models more accurate is the driving force for greater computing power.

While it is true that accuracy might be increased for individual time steps, depending on the design of the model, the uncertainties regarding the validity of the model design create an error delta in each time step. Every environmental model is based upon time-series calculations, and every environmental model has an unknown magnitude of error in each time step. The problem with climate models, and in fact any model that uses time-series calculations, is that the error delta accumulates. So if the (unknown) error delta in each time step is, say, 0.01 degrees, after 100 time steps the worst-case accumulated error is a full degree.

The climate scientists would explain this away as unimportant, but it is the single factor that makes climate models invalid. I repeat: it makes them INVALID. If they were valid predictors, they could compute the average temperature for the next year, but they can’t. To explain this away, the scientists tell us that models predict trends, not specific temperatures. If that is the case, then it is based upon statistical averages, and the error margin is the prime factor invalidating their approach to predicting the future.

I will never believe that this supercomputer, nor any other computer, can predict the future of the climate of this planet. If you can show me with a mathematical proof that this is a valid use of our tax money, then I might be inclined to accept their work. Otherwise, it is crap.
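The accumulation argument can be sketched numerically. The following toy (not a climate model; the 0.01-degree per-step delta is the commenter's assumed figure) separates the two regimes: a systematic per-step bias grows linearly with the step count, while independent random per-step errors typically grow only like its square root.

```python
import random

STEPS = 100
DELTA = 0.01  # assumed per-step error in degrees (the commenter's figure)

# Worst case: a systematic bias with the same sign every step
# accumulates linearly: 100 x 0.01 = 1.0 degree.
bias_error = DELTA * STEPS

# If the per-step errors were instead independent coin flips, the
# accumulated error would typically grow only like sqrt(STEPS) x DELTA,
# i.e. ~0.1 degree here. One seeded realization:
random.seed(42)
walk_error = sum(random.choice((-DELTA, DELTA)) for _ in range(STEPS))

print(f"systematic bias after {STEPS} steps: {bias_error:.2f} deg")
print(f"one random-walk realization:        {walk_error:+.2f} deg")
```

Which regime a given model sits in depends on whether its per-step error is biased — which is exactly the unknown the comment is pointing at.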

Crispin in Waterloo

Disko Troop says:
Why do I find the word “flop” so appropriate when talking about super computers? Maybe it is just a British thing.
Odd, but we are stuck with it. It is from the days of bistable multivibrators that we built in the electronics shop at school. The current was sent to one side or the other each time a signal was given. Flip and flop were used but the flop implied a stable condition while the ‘flip’ implied something continuous which took place in a multivibrator. ‘Flop’ was a mechanical description of an electrical event based on the mental concept of a valve (not the British kind). I guess you have to put the radio ‘valve’ into the same category.
[For those on this side of the pond radio ‘valve’ means ‘tube’.]


From the brochure: 45-54.1 kW per cabinet.
Did I read that the NOAA unit has 40 cabinets? That would be 1.8-2.2 MW total for the system.
I presume this thing runs 24/7 to make it pay its way, so that is 43-52 MWh per day, or 15.8-19.0 GWh per annum.
Now in the UK, average household electricity consumption is estimated by the regulator at 3.3 MWh per year.
So the NOAA super flopsy draws the equivalent of 4,700-5,700 British households. Pretty much a decent-sized town.
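The arithmetic in that estimate, worked through. The per-cabinet range is from the Cray XK6 brochure; the 40-cabinet count and the 3.3 MWh/year UK household figure are the commenter's assumptions:

```python
CABINETS = 40                       # commenter's assumed cabinet count
KW_PER_CABINET = (45.0, 54.1)       # range from the Cray XK6 brochure
UK_HOUSEHOLD_MWH_YR = 3.3           # UK regulator's estimate, per the comment

# Low and high ends of the range at each step of the estimate.
total_mw = [kw * CABINETS / 1000 for kw in KW_PER_CABINET]
mwh_per_day = [mw * 24 for mw in total_mw]            # assumes 24/7 operation
gwh_per_year = [mwh * 365 / 1000 for mwh in mwh_per_day]
households = [gwh * 1000 / UK_HOUSEHOLD_MWH_YR for gwh in gwh_per_year]

print(f"draw:       {total_mw[0]:.2f}-{total_mw[1]:.2f} MW")
print(f"per day:    {mwh_per_day[0]:.0f}-{mwh_per_day[1]:.0f} MWh")
print(f"per year:   {gwh_per_year[0]:.1f}-{gwh_per_year[1]:.1f} GWh")
print(f"households: {households[0]:,.0f}-{households[1]:,.0f}")
```

Note this excludes the cooling plant, so the true town-equivalent is somewhat larger.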