From the Atomic City Underground Blog of knoxvillenews.com comes word that a big kahuna of komputing is about to go online.
The Cray XK6 supercomputer is a trifecta of scalar, network and many-core innovation. It combines Cray’s proven Gemini interconnect, AMD’s leading multi-core scalar processors and NVIDIA’s powerful many-core GPU processors to create a true, productive hybrid supercomputer. Here’s the factory photo before customization:
Reporter Frank Munger writes:
Cray recently delivered the final 26 cabinets of the National Oceanic and Atmospheric Administration’s Gaea climate research supercomputer, which is housed at Oak Ridge National Laboratory. The newly arrived cabinets are loaded with the new AMD 16-core Interlagos processors. According to Jeff Nichols, an associate lab director at ORNL who heads the computational science directorate, the Gaea system is still in two pieces. The first piece is the original 14-cabinet system with a peak capability of 260 teraflops, Nichols said. The second piece is the new 26-cabinet system with a capability of 720 teraflops, he said.
After the first piece is upgraded in the spring and the two pieces are integrated into one system, Gaea will become a 1.1-petaflops supercomputer, said ORNL's computing chief, who returned from a visit to China last week, where he spoke at a conference.
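That 1.1-petaflops figure can be sanity-checked against the cabinet counts quoted above. Assuming the spring upgrade brings the original 14 cabinets up to the same per-cabinet peak as the new Interlagos cabinets (an assumption on my part, not something the article states), the numbers roughly add up. A minimal sketch:

```python
# Back-of-the-envelope check of the Gaea figures quoted in the article.
# Cabinet counts and teraflops totals are from the article; the
# per-cabinet rate and the upgraded total are my own extrapolation.

piece1_cabinets = 14   # original system, 260 TF peak
piece2_cabinets = 26   # new Interlagos cabinets, 720 TF peak
piece2_tf = 720.0

# Peak per new Interlagos cabinet:
tf_per_new_cabinet = piece2_tf / piece2_cabinets   # ~27.7 TF/cabinet

# If the upgrade brings the original 14 cabinets to the same
# per-cabinet rate, the combined system lands near the quoted figure:
upgraded_piece1_tf = piece1_cabinets * tf_per_new_cabinet  # ~388 TF
combined_pf = (upgraded_piece1_tf + piece2_tf) / 1000.0    # ~1.11 PF

print(round(tf_per_new_cabinet, 1))  # 27.7
print(round(combined_pf, 2))         # 1.11
```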
Here’s what it looks like at Oak Ridge National Laboratory; note the Earthy graphics:
While that door panel artwork was well received, word has it that the artwork was changed to be more representative of the I/O and stage-by-stage processing that takes place in this new system. Here’s the upgraded artwork and a brief description of what each processing cabinet does:
Below is the guide, from left to right: cabinet number, then processing description.
1. Input stage: takes data and bags, boxes, and bins it for distribution
2. Mannomatic stage: chooses which data to use, discards inappropriate data, adds proxy data where none exists, and splices new data onto data that was truncated in stage 1.
3. Kevinator stage: approves processed data from stage 2, declares it “robust” using a special stamping system.
4. Hansenizer stage: Fits approved data from stage 3 to three model scenarios to find a “best fit”, then applies additional corrections to elevate the data for use by stage 5.
5. Gavinotron stage: Chooses data from the Giga Hansenized Climate Numbers (GHCN) to combine with Hansenized three scenario data, extrapolates data from 70°N to 90°N to fill the Gaea Global Model.
6. Humbertian Harmonizer Stage: Using bellows and a random walk, data is wheezed out to stage 7.
7. Karl Konfabulator Stage: Assigns value to the data to report to Congress, ensuring that the data will be more valuable next year. Monitors power use, sends bills out to taxpayers.
8. Peterson Percolator Stage: Collates the data into inaccessible data furrows buried deep underground in Asheville, North Carolina, where the “secret sauce” is applied before percolating the data back to the surface.
9. Wigley Wombulator Stage: The data is shipped from Asheville to NCAR in Boulder via a secure optical link, where the gatekeeper switch of the wombulator decides how much of it to pass on to CRU via the insecure POTS circuit from Boulder to Norwich. Only data with signed non-disclosure agreements is passed on.
10. JonesiFOIAler Stage: Data received from the Wigley Wombulator is then hidden, and signed non-disclosure agreements for the data are sent to the top of the paper pile in Jones’ office, to be located at some future date by Sherpas hired by OSU’s Lonnie Thompson to mount the paper summit.
11. Briffabrowser stage: Here, the data is examined, and error flags are sent back up the processing line to all other processing stages via email. The other stages reply that the error flags don’t matter, consensus is reached, and the data is passed on to stage 12 after the emails are made public.
12. MUIRer (Make Up Independent Rationalizations) Russelizer Stage: Data and emails flagging questionable data are noted, given a brief talking-to, and then passed on with no questions asked, along with a “CERTIFIED A-OK” letter of endorsement.
13. Inverter Stage: As a quality-control check, portions of the A-OK Data are inverted by the upside-down Mannomatic, looked at in a mirror, then declared still usable.
14. Serializer Stage: The final A-OK upside-down Data is sent to the IPCC, where it is then returned by Indian handmaidens to the potboiling center at Almora and washed repeatedly in hot water.
15. The Osterizer Stage: The IPCC Almora hot water washed data is then blended repeatedly until it reaches a fine homogenized puree.
16. The RealClimatizer Stage: Here the data undergoes public examination under the intense scrutiny of thousands of like-minded individuals, er, identical processors. Any remaining tiny flecks of data that don’t constitute a pure product are picked off and routed into the borehole disposer.
17. The Cloudifier Stage: Data patterns are compared to an online satellite photo database of clouds to see if there might be any correlation. Any matches are sent back to stage 16 for disposal in the borehole.
18. The SOL Stage: Effects of sunlight on the data are removed.
19. The data is run through the final AlGoreithm, the CLOud and Weather Neutralizer (CLOWN), to ensure it has no remaining “weather not climate” residuals, is given a happy demeanor, and is sent on to the final stage.
20. Output Stage: This cabinet, identical to the Input Stage 1, ejects the data in a composted form, suitable for academic consumption.
Gaea is NOAA’s prime supercomputing resource, and it will become the third petascale machine housed at ORNL. Jaguar, soon to be morphed into Titan, and Kraken, a National Science Foundation machine, are the others.
Cray XK6 Brochure (PDF)