Wyoming experiences that “giant sucking sound” as new coal-fired climate supercomputer is turned on

Did your lights dim yesterday? It might have been Trenberth’s “travesty calculator”. Readers may recall my coverage of the announcement that this new supercomputer would be built: NCAR’s dirty little secret

Measuring 108,000 square feet in total with 15,000-20,000 square feet of raised floor, it will be built for 8 megawatts of power, with 4-5 megawatts for computing and 3-4 for cooling.
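
For readers who like to check the numbers, here is a minimal sketch in Python of what that split implies for power usage effectiveness (PUE), the usual data-center efficiency metric. It assumes the 8 megawatts is the total facility load and the 4-5 megawatts is the IT load, as the figures above suggest:

```python
# Back-of-the-envelope look at the quoted power budget:
# 8 MW total, 4-5 MW for computing, 3-4 MW for cooling.
# PUE (power usage effectiveness) = total facility power / IT power.

TOTAL_MW = 8.0

for it_mw in (4.0, 5.0):          # MW devoted to computing
    pue = TOTAL_MW / it_mw
    print(f"IT load {it_mw} MW -> implied PUE {pue:.1f}")
```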

Maybe Trenberth will find his missing heat being vented.

According to figures from the Energy Information Administration and Department of Energy (EIA/DOE), electricity is significantly cheaper in Wyoming.

Why? Coal, that dirty fuel both environmentalists and climate-scientist activists like Jim Hansen hate. According to the EPA, Wyoming is king coal of the nation, even beating out West Virginia. They even have a Carbon County (horrors!). But what’s some dirty coal to run a supercomputer if you are saving the planet? Watch the video from NCAR below:

Harold Ambler writes on his blog today:

Somewhat ironically, the computational power comes at a price in moral standing, if one equates having a small carbon footprint with having a high moral ranking. That’s because supercomputers of this scale slurp up electricity at staggering rates. The climate scientists using them will tell you that the end justifies the means in their case, and they could be right. But there’s no getting away from the fact that these individuals are using more electricity than you could ever dream of doing.

You and a hundred friends could run around your town or city, let yourselves in unlocked doors every time you found one, and turn on all the lights, all the appliances, all the computers, all the televisions, and all the stereos, day after day, week after week, month after month, year after year, and you wouldn’t touch, you wouldn’t come close, to emitting what these scientists are now emitting, in the name of fighting climate change.

Indeed, NCAR’s supercomputer was constructed in Wyoming specifically because the electricity in the state cost less than that in Colorado. Thus, dozens of scientists and their families moved across the state line because doing so would allow them to emit more carbon dioxide than even NCAR could afford in its original location.

Read his essay here

Here’s the press release:

NCAR-Wyoming Supercomputing Center opens

October 15, 2012

CHEYENNE—The NCAR-Wyoming Supercomputing Center (NWSC), which houses one of the world’s most powerful supercomputers dedicated to the geosciences, officially opens today.

Scientists at the National Center for Atmospheric Research (NCAR) and universities across the country are launching a series of initial scientific projects on the center’s flagship, a 1.5-petaflop IBM supercomputer known as Yellowstone. These first projects focus on a wide range of Earth science topics, from atmospheric disturbances to subterranean faults, that will eventually help to improve predictions of tornadoes, hurricanes, earthquakes, droughts, and other natural hazards.

Side view of a few of Yellowstone's 100 computer racks

A handful of the Yellowstone supercomputer’s 100 racks. An iconic scene from its namesake national park is featured mosaic-style on the ends of each rack. The image by Michael Medford, licensed to National Geographic, centers on Fountain Geyser. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.) View more images in the Multimedia Gallery.

“This center will help transform our understanding of the natural world in ways that offer enormous benefits to society,” says Thomas Bogdan, president of the University Corporation for Atmospheric Research (UCAR), which manages NCAR on behalf of the National Science Foundation (NSF). “Whether it’s better understanding tornadoes and hurricanes, or deciphering the forces that lead to geomagnetic storms, the Yellowstone supercomputer and the NWSC will lead to improved forecasts and better protection for the public and our economy.”

Bogdan took part this morning in a formal opening ceremony with Wyoming Gov. Matt Mead, NSF director Subra Suresh, University of Wyoming (UW) vice president of research relations William Gern, NCAR director Roger Wakimoto, and other political and scientific leaders.

“This is a great day for scientific research, for the University of Wyoming, and for Wyoming,” says Mead. “Wyoming is proud to be part of the collaboration that has brought one of the world’s fastest computers to the state. The center will have a positive impact on our future, through the research done here and by sending the message that Wyoming is honored and equipped to be the home of this amazing facility.”

“The NCAR-Wyoming Supercomputing Center will offer researchers the opportunity to develop, access, and share complex models and data at incredibly powerful speeds,” says Suresh. “This is the latest example of NSF’s unique ability to identify challenges early and make sure that the best tools are in place to support the science and engineering research communities.”

Public-private partnership

Exterior view of the NWSC entrance area

Located on the western fringe of Cheyenne, Wyoming, the NCAR-Wyoming Supercomputing Center officially opened on October 15, 2012. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.) View more images in the Multimedia Gallery.

The NWSC is the result of a broad public-private partnership among NCAR, NSF, UW, the state of Wyoming, Cheyenne LEADS, the Wyoming Business Council, and Cheyenne Light, Fuel & Power. NCAR’s Computational and Information Systems Laboratory (CISL) will operate the NWSC on behalf of NSF and UCAR.

“We are delighted that this successful public-private partnership has delivered a major supercomputing center on time and on budget,” says NCAR director Roger Wakimoto.

Through the NWSC partnership, which will also seek to advance education and outreach, UW will have research use of 20 percent of NWSC’s main computing resource. In turn, UW will provide $1 million each year for 20 years in support of the program. The state of Wyoming also contributed $20 million toward the construction of the center.

“Our access to Yellowstone will allow the university to reach new heights in our educational and research endeavors in engineering; atmospheric, hydrological, and computational sciences; Earth system sciences; and mathematics,” says UW President Tom Buchanan. “The supercomputer is a huge draw for students and faculty. It opens the door to scientific innovation and discovery that will benefit our state, the nation, and the world.”

Located in Cheyenne’s North Range Business Park, near the intersection of I-80 and I-25, the 153,000-square-foot supercomputing center will provide advanced computing services to scientists across the United States. Most researchers will interact with the center remotely, via a laptop or desktop computer and the Internet.

Relative to the most recent ranking of the world’s fastest supercomputers, issued in June, the 1.5 petaflops peak system ranks in the top 20. The rankings constantly change as new and increasingly powerful supercomputers come online.

The main components consist of a massive central file and data storage system, a high performance computational cluster, and a system for visualizing the data.

Scientists will use the new center’s advanced computing resources to understand complex processes in the atmosphere and throughout the Earth system, and to accelerate research into severe weather, geomagnetic storms, climate change, carbon sequestration, aviation safety, wildfires, and other critical geoscience topics.

Future-proof design

Some of the pipes and other systems in the NWSC's mechanical room

Some of the NWSC’s complex mechanical systems, which were designed with performance, flexibility, and energy efficiency in mind.  (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.) View more images in the Multimedia Gallery.

CISL has operated supercomputers at NCAR’s Mesa Laboratory in Boulder since the 1960s, even though the building was not designed with supercomputing in mind. In recent years, new research questions have required more powerful computers to run increasingly complex computer simulations. The Mesa Lab has now reached the limits of its ability to provide the necessary energy and cooling capacity essential for the next generation of supercomputers.

The NWSC is expected to advance scientific discovery for the next several decades. Its design and construction have been “future proofed” by providing the scope to expand as supercomputing technology that does not exist today becomes available in the future.

Raised floors are key to the facility’s flexible design, allowing the computing systems, electrical supply, and cooling to be positioned and controlled for optimal energy use and ease of maintenance. The raised floor is also vented, so air can be circulated as needed to computing systems and servers.

The NWSC was awarded LEED Gold certification for its sustainable design. The center takes full advantage of Cheyenne’s elevation and cool, dry climate by employing ambient air to cool the facility nearly year round. This will significantly reduce the facility’s energy use.

A minimum of 10 percent of the power provided to the facility will be wind energy from the nearby Happy Jack Wind Farm. NCAR and UCAR will continue to explore options to increase the percentage of renewable energy provided to the facility in future years.

tty

“A minimum of 10 percent of the power provided to the facility will be wind energy from the nearby Happy Jack Wind Farm.”
Does that mean that they will turn the computer off when there is no wind (admittedly not often the case in Wyoming)?

D. Patterson

We experienced a power outage in Illinois at about 1330 hours Central Time. You don’t suppose…?

Roger Knights

Couldn’t they do this more cheaply in the cloud?
I bet that wasn’t considered when this purchase proposal was set in motion, maybe five or so years ago.

Eric N. WY

We get that sound often here, but only because Utah sucks and Nebraska blows

D. Patterson

In the hands of NOAA and NCAR it becomes the world’s largest GIGOmputer. Someday it may be truly useful in the hands of a reformed DOE.

Eric Dailey

The video failed to mention the coal power used for the cooling of the giant heat source computer. What a surprise.

Dennis Gaskill

So! Does a supercomputer lend credibility to the AGW thesis?
Garbage data and flawed modeling software in, then garbage out!
Oh boy! It’s going to predict tornadoes and hurricanes and everything for us … wow!
It would seem to be a great way to cause a huge information catastrophe, by putting all your eggs in one basket. This thing will be quite vulnerable to natural as well as man-made disasters.
All grousing aside, I do hope some REAL scientists (wherever they may be) will be allowed to scrounge some computer time for research.

Mark

Have to say I’d be happy too if I owned a wind farm like Happy Jack.
Not that I’d live anywhere near it, of course.

Tim Ball

The really sad part is it’s wasted energy and money, because the computer still cannot model the climate, as I explained here:
http://drtimball.com/2012/computers-incapable-of-modeling-climate-billions-wasted-to-perpetuate-deception/

DaveA

Trenberth will stand in front of it 1000 years from now and it will spit out 42!
(OK, Trenberth’s descendants)

Alan the Brit

Zzzzzzzzzzzzzzz……………..zzzzzzzzzzzzzzz…………………zzzzzzzzzzzzzzzz……………….zzzzzzzzzzzzzzz……………..zzzzzzzzzzzzzzzz……………………uh, can I wake up now?
Why oh why do they keep treating everyone as stupid? Does anyone know? Why does the speed at which calculations are done by their version of Deep Thought mean that the answer they get out at the blunt end is the right one? Again there is someone telling us how important sea ice is, concentrating almost exclusively on NH Arctic Circle sea ice and how important it is for Earth’s albedo, when at the low angle at which the solar insolation reaches it, it cannot possibly have such a powerful effect on warming: six months the land of the midnight sun, six months pitch darkness, and freezing cold in both cases!

Gary

No geothermally-generated electricity in Yellowstone? Tsk Tsk.

SteveB

Larger, faster computer. Same flawed models. They’ll be able to reach the same conclusions, only faster.

RockyRoad

So we get their lies faster.
I’m not impressed.

Can the supercomputer distinguish real data from homogenized data?

Doesn’t matter what size it is; it’s still going to be GIGO if it’s for ‘The Team’.

If Happy Jack’s wind doesn’t blow, they’ll just adjust the models and, presto! Virtual wind.

DesertYote

“Raised floors are key to the facility’s flexible design…” Wow, what innovation … 60 years ago!

SasjaL

… a 1.5-petaflop IBM supercomputer known as Yellowstone. 

Sooner or later it has to explode …

DesertYote

Roger Knights says:
October 16, 2012 at 8:06 am
Couldn’t they do this more cheaply in the cloud?
###
The “Cloud” isn’t some mystical power that we have learned to tap into. It is basically just a set of methods to organize and access computing power. The computing power is still needed, meaning the hardware is still needed. The type of problems this system is designed to handle can be distributed, but if the distribution is over the internet, then seconds would be added to each calculation that would take nanoseconds locally. I could say more, but I’ve got to get to my morning SCRUM.

Using cheap energy to calculate how evil cheap energy is …
Do they even realize how strange they are?
Shouldn’t these kinds of researchers be bound to use energy from solar and wind?
It would give them a chance to think about how reality and green dreams go together.

wayne

What is that, about 16,000 homes’ worth of electricity? A city of about 40,000 people. Can we get a UHI reading around that single building? ☺
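
A rough check of that figure, hedged because it hinges entirely on the assumed average household load (the 16,000-home figure works out to roughly 0.5 kW per home):

```python
# How many "average homes" does 8 MW correspond to? The answer depends
# on what you assume for the average household draw.
FACILITY_MW = 8.0

for avg_home_kw in (0.5, 1.0, 1.25):   # assumed average household load, kW
    homes = FACILITY_MW * 1000 / avg_home_kw
    print(f"At {avg_home_kw} kW per home: ~{homes:,.0f} homes")
```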

3x2

8MW is OK so long as they power it from a wind turbine. Is there a “Green Screen of Death” in whatever OS it uses?

Glacierman

What’s the need for such computer power? I thought they already knew the answer(s). Just insert the following statement…….”We must reduce our emissions of GHGs or……..”
They don’t even do science. I’m sure an XBox would have been perfect for producing their scary scenarios.

John F. Hultquist

The whiff of wind power is touching, but should they not run the thing only on wind and solar power?
I laughed at the part about the “Raised floors are key to the facility’s flexible design . . .”
I think that might have been news about 1965 with the first delivery of the IBM S/360s.

I think that “giant sucking sound” should be attributed to Colorado, or any other state that lost talent to Wyoming’s lower energy prices.
“dozens of scientists and their families moved across the state line”

Reg Nelson

With this kind of computing power they’ll be able to homogenize the data out to six decimal places.

alan

OT:
“A123 Systems Inc. (AONE), the electric car battery maker that received a $249 million federal grant, filed for bankruptcy protection after failing to make a debt payment that was due yesterday.”
“President Barack Obama called A123 Chief Executive Officer David Vieau and then-Michigan Governor Jennifer Granholm during a September 2010 event celebrating the opening of the plant in Livonia, Michigan, that the company received the U.S. grant to help build.”
http://www.bloomberg.com/news/2012-10-16/electric-car-battery-maker-a123-systems-files-bankruptcy.html?cmpid=yhoo

This looks like another political boondoggle. Alternatively, it could be used for an entirely different purpose under the guise of environmental science. Actually, it is far from the most advanced and fastest supercomputer. Communist China has three of them. Each is almost 100 times faster than this beast. They are powered by thousands of Intel processors diverted by Lenovo to the Chinese military. I do not believe that they are using them to study climatology.

David L.

They should be required to run that thing 100% off of wind power.

Nice pretty video of the earth’s weather. Yes, a huge mega computer. Why is it that today’s models are really not much more accurate than those of 24 years ago? Could it be shite in, shite out? If I were a computer modeler I too might be an alarmist if it meant that an institution/government would buy me one of those to play with. In another 24 years that computer too will be considered a dinosaur. Will people laugh then? Did he say that the first supercomputers were no more powerful than today’s cell phone? A typical laptop today has more raw CPU power than a Cray supercomputer from 1988. So what computer was Hansen using in 1980-1988? Oh, hang on a minute, here it is: http://pubs.giss.nasa.gov/abs/ha04600x.html No computer used here, but that was in 1981. Moot point, maybe?

Ric Werme

alan says:
October 16, 2012 at 9:49 am
> OT:
I consider it very rude to post OT stuff on a < 2 hour old post. This belongs on Tips & Notes or on an Open Thread. It's not like this is very surprising news….

Gary Pearse

Their work so far has been back-of-the-envelope in terms of knowing what parameters are involved and their weights. This is another bunch of monkeys with expensive typewriters hoping to end up with Hamlet.

DGH

They should have built it in Antarctica and moved the people down there. There’s plenty of solar power for six months of the year and plenty of free cooling year round.

Ric Werme

Harold Ambler writes on his blog today:

You and a hundred friends could run around your town or city, let yourselves in unlocked doors every time you found one, and turn on all the lights, all the appliances, all the computers, all the televisions, and all the stereos, day after day, week after week, month after month, year after year, and you wouldn’t touch, you wouldn’t come close, to emitting what these scientists are now emitting, in the name of fighting climate change.

Sorry, this is bogus. A cheap space heater draws 15 kW. Works fine on 15 or 20 amp circuits.
8 MW, call it 9 MW, is 6,000 space heaters. 60 per friend, and I’ll check for goodies in the fridge.

Pamela Gray

That’s a lot of coins for computers that run models that apparently may not be worth gnat poop. The venerable ENSO models, representing millions of research tax dollars collectively, still need human input. So much so that this ENSO update was issued in September of this year, and the next update in October will include seat-of-the-pants input because the models may have missed “recent changes” and have “known specific model biases”. If that is the case, why build expensive computing towers, with expensive castles to house them in, when the predictions just need more salt?
“The probabilities derived from the 25 or more models on the IRI/CPC plume describe, on average, development (or maintenance) of weak El Niño during the present time (September 2012). Such conditions have already developed in the SSTs, but the atmospheric response to the SSTs has been ambiguous so far. Without active participation of the atmosphere, the changes in atmospheric circulation, and the consequent climate teleconnections, cannot occur. Implicit in the model predictions of an approximately 80% chance of a weak El Niño persisting into northern winter 2012-13 is an assumption that the atmosphere will eventually respond to the warmed SSTs so that the event will sustain itself for several months. Following this latest model-based ENSO prediction plume, factors such as known specific model biases and recent changes that the models may have missed will be taken into account in the next official outlook to be generated and issued in early October by CPC and IRI, which will include some human judgement in combination with the model guidance.”
http://iri.columbia.edu/climate/ENSO/currentinfo/technical.html

NoAstronomer

WIDIMITWEED* at its most poignant.
Mike.
* What I Do Is More Important Than What Everyone Else Does

kwik

You don’t need such a big computer to multiply a few numbers with Val_Adj[ ].

Ric Werme

If you want a real Supercomputer, LLNL is home to an IBM system named Sequoia. #1 on the June http://top500.org list, it delivers 16 PFlops and consumes 8 MW. Runs Linux.
1.5 PFlops (I assume theoretical, not delivered to a benchmark) would place at #13 or so on the June Top 500 list. A new list comes out next month at the Supercomputer show in Salt Lake City.

Wijnand

@Ric Werme, October 16, 2012 at 10:26 am
15 kW for a space heater? Don’t you mean 1.5 kW?

To those who suggested the computer run only on wind power, that is my vote too. Living just 150 miles north of the new computing facility, I can tell you that if they did run just off wind, it’s fortunate they waited until October to open. All those 90-plus-degree days this summer had virtually no wind. It would have been a slow summer on the computing front.
Wyoming just loves to lure in any business it can, especially if it comes with federal money attached. Of course, if Mr. Billionaire in Colorado and Ken Salazar manage to get the 1,000 turbines by Rawlins, the supercomputer can run off that. Except that defeats the purpose of coming to Wyoming for cheap electricity. In Wyoming, one gets to make the completely false claim of using wind energy (like there’s a separate grid, NOT) by paying $1.95 more per 100 kWh if you are a Rocky Mountain Power customer. I’m guessing that an extra $1.95 per 100 kWh would significantly raise the cost of power for a facility drawing 8 MW, if they are an RMP customer. Let’s see how much “on paper only” wind power the facility can actually afford.
Also, Cheyenne is building a natural gas power plant to back up that earth-friendly wind. Let’s hope the supercomputer adds in all that CO2 when making its dire predictions.

Reg Nelson

Can you run Excel on it? ~ Phil Jones

Roger Knights

DesertYote says:
October 16, 2012 at 8:58 am

Roger Knights says:
October 16, 2012 at 8:06 am
Couldn’t they do this more cheaply in the cloud?
The “Cloud” isn’t some mystical power that we have learned to tap into. It is basically just a set of methods to organize and access computing power. The computing power is still needed meaning the hardware is still needed. The type of problems this system is designed to handle can be distributed, but if the distribution is over the internet, then seconds would be added to each calculation that would take nanosecond locally.
I realized that. What I should have added was that, if this computer is going to be running climate models only, it will be idle most of the time, so it would be more cost-effective to use only the computer power they need.
However, I now realize that the most common use for this is likely going to be to assist with prediction of current weather, for which speed is needed, and which will keep the machine occupied.

Ric Werme says:
October 16, 2012 at 10:21 am
alan says:
October 16, 2012 at 9:49 am
> OT:
I consider it very rude to post OT stuff on a < 2 hour old post. This belongs on Tips & Notes or on an Open Thread. It's not like this is very surprising news….

I agree. (In fact, I posted a note myself there on this news item (a battery-maker’s bankruptcy)). But the blame mostly belongs to the moderators who regularly let these OT comments through. And to WordPress, for not allowing the moderators to redirect comments to other threads.

Berényi Péter

“A minimum of 10 percent of the power provided to the facility will be wind energy from the nearby Happy Jack Wind Farm.”
Well, Happy Jack Wind Farm, owned by Duke Energy Commercial, has fourteen 123 m tall wind turbines installed; the nominal capacity of each is 2.1 MW. However, as the capacity factor there is 37-40%, that makes for about 11 MW of average power output. The area of the farm is 750 acres, that is, the power flux collected is 3.6 W/m². Duke Energy has a 20-year power purchase agreement with Cheyenne Light to sell it 15 MW of the facility’s output. In other words, the product is oversold (by 36%). I wonder how NWSC could get any more over this –4 MW surplus.
It is quite possible Duke Energy can buy cheap coal-generated power on the market and pass it on as costly wind power. It is cheating, of course, and probably against the law as well, but hey, who would check it if it makes one feel good? Anyway, electrons are so similar they can’t even recognize themselves (see: Fermi-Dirac statistics).
Renewable Energy at Cheyenne Light, Fuel & Power
HAPPY JACK WINDPOWER, CHEYENNE, WYOMING
– Wyoming Pipeline Authority: Happy Jack Wind Farm
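
For readers who want to reproduce Berényi Péter’s arithmetic, here is a minimal Python sketch using only the figures quoted above; the results land close to his, with the exact values depending on where in the 37-40% capacity-factor range you sit:

```python
# Reproducing the Happy Jack arithmetic from the comment above.
TURBINES = 14
NAMEPLATE_MW = 2.1          # per turbine
AREA_ACRES = 750
CONTRACTED_MW = 15          # sold to Cheyenne Light under the PPA
ACRE_TO_M2 = 4046.86

nameplate = TURBINES * NAMEPLATE_MW             # ~29.4 MW installed
area_m2 = AREA_ACRES * ACRE_TO_M2

for cf in (0.37, 0.40):                         # quoted capacity-factor range
    avg_mw = nameplate * cf
    flux = avg_mw * 1e6 / area_m2               # W per square metre
    oversold = (CONTRACTED_MW / avg_mw - 1) * 100
    print(f"CF {cf:.0%}: avg {avg_mw:.1f} MW, "
          f"flux {flux:.1f} W/m^2, oversold {oversold:.0f}%")
```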

Alan Watt, Climate Denialist Level 7

Ric Werme says:
October 16, 2012 at 10:46 am

If you want a real Supercomputer, LLNL is home to an IBM system named Sequoia. #1 on the June http://top500.org list, it delivers 16 PFlops and consumes 8 MW. Runs Linux.
1.5 PFlops (I assume theoretical, not delivered to a benchmark) would place at #13 or so on the June Top 500 list. A new list comes out next month at the Supercomputer show in Salt Lake City.

Normally I don’t reference the US Air Force as a way to do things cheaply, but if your climate models can run on just the 33rd-largest supercomputer (500 TFlops), you can copy the USAF and build one out of 1760 Sony PlayStation 3s: see here.
Some notable features of the PS3 array:

The Condor Cluster project began four years ago, when PlayStation consoles cost about $400 each. At the same time, comparable technology would have cost about $10,000 per unit. Overall, the PS3s for the supercomputer’s core cost about $2 million. According to AFRL Director of High Power Computing Mark Barnell, that cost is about 5-10% of the cost of an equivalent system built with off-the-shelf computer parts.
Another advantage of the PS3-based supercomputer is its energy efficiency: it consumes just 10% of the power of comparable supercomputers.

I note I can get PS3s from Amazon for $285. I don’t know what kind of discount you’d get if you ordered 1760 of them.
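
For the curious, a rough tally of the prices quoted above (the “about $2 million” for the core presumably covers more than the bare consoles); this is just the multiplication, nothing more:

```python
# Rough cost arithmetic for the USAF Condor Cluster figures quoted above.
CONSOLES = 1760
PS3_PRICE_2008 = 400        # dollars, at project start
PS3_PRICE_2012 = 285        # dollars, Amazon price quoted in the comment
OFF_THE_SHELF = 10_000      # dollars per comparable unit, per the article

print(f"PS3s at $400 each   : ${CONSOLES * PS3_PRICE_2008:,}")
print(f"PS3s at $285 each   : ${CONSOLES * PS3_PRICE_2012:,}")
print(f"Off-the-shelf equiv.: ${CONSOLES * OFF_THE_SHELF:,}")
```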

John Bell

It does not matter how much computing power is used, because if the models are faulty or rigged then they will give wrong answers. Climate models are worthless.

NoAstronomer

Berényi Péter says:
“It is quite possible Duke Energy can buy cheap coal generated power on the market and pass it on as costly wind power.”
But that would be wrong. Very, very wrong. But, fortunately for some, almost impossible to track.
Mike.

Ric Werme

Wijnand says:
October 16, 2012 at 11:57 am
> @Ric Werme, October 16, 2012 at 10:26 am
> 15 kW for a space heater? Don’t you mean 1.5 kW?
Oops. Yep. Looks like the 60 per friend is right, though.
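
With the corrected 1.5 kW figure, the arithmetic does come out to roughly 60 heaters per friend; a minimal check, using only the numbers from the exchange above:

```python
# Checking the corrected space-heater arithmetic: 1.5 kW per heater,
# 100 friends, facility load of 8 MW (call it 9 MW).
HEATER_KW = 1.5
FRIENDS = 100

for facility_mw in (8, 9):
    heaters = facility_mw * 1000 / HEATER_KW
    print(f"{facility_mw} MW ~= {heaters:,.0f} heaters "
          f"({heaters / FRIENDS:.0f} per friend)")
```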

Alan Watt, Climate Denialist Level 7 says:
October 16, 2012 at 2:01 pm

Normally I don’t reference the US Air Force as a way to do things cheaply, but if your climate models can run on just the 33rd largest supercomputer (500 TFlops), you can copy the USAF and build one out of 1760 Sony Play Station 3s:

So far I’ve bitten my tongue when people dismissively refer to climate models run on game consoles, but there’s a tremendous amount of floating-point horsepower available on high-end graphics chips. Game consoles often have a low (or negative!) price markup, so if your application doesn’t need state-of-the-art interconnects and you have a fair amount of grunt labor handy, you can build some remarkable systems out of the toys.