Wyoming experiences that “giant sucking sound” as new coal fired climate supercomputer is turned on

Did your lights dim yesterday? It might have been Trenberth’s “travesty calculator”. Readers may recall when I covered the announcement of the building of this new supercomputer: NCAR’s dirty little secret

Measuring 108,000 square feet in total with 15,000-20,000 square feet of raised floor, it will be built for 8 megawatts of power, with 4-5 megawatts for computing and 3-4 for cooling.
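For context, those figures imply a power usage effectiveness (PUE, total facility power divided by computing power) of roughly 1.6 to 2.0. A quick sketch, using only the numbers quoted above:

```python
# PUE (power usage effectiveness) implied by the quoted figures:
# total facility power divided by power delivered to the computers.
def pue(total_mw, computing_mw):
    return total_mw / computing_mw

best_case = pue(8, 5)    # 8 MW total, 5 MW computing -> 1.6
worst_case = pue(8, 4)   # 8 MW total, 4 MW computing -> 2.0
print(f"implied PUE: {best_case:.1f} to {worst_case:.1f}")
```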

Maybe Trenberth will find his missing heat being vented.

According to figures from the Energy Information Administration and Department of Energy (EIA/DOE), electricity is significantly cheaper in Wyoming.

Why? Coal, that dirty fuel both environmentalists and climate-scientist activists like Jim Hansen hate. According to the EPA, Wyoming is king coal of the nation, even beating out West Virginia. They even have a Carbon County (horrors!). But what’s some dirty coal to run a supercomputer if you are saving the planet? Watch the video from NCAR below:

Harold Ambler writes on his blog today:

Somewhat ironically, the computational power comes at a price in moral standing, if one equates having a small carbon footprint with having a high moral ranking. That’s because supercomputers of this scale slurp up electricity at staggering rates. The climate scientists using them will tell you that the end justifies the means in their case, and they could be right. But there’s no getting away from the fact that these individuals are using more electricity than you could ever dream of doing.

You and a hundred friends could run around your town or city, let yourselves in unlocked doors every time you found one, and turn on all the lights, all the appliances, all the computers, all the televisions, and all the stereos, day after day, week after week, month after month, year after year, and you wouldn’t touch, you wouldn’t come close, to emitting what these scientists are now emitting, in the name of fighting climate change.

Indeed, NCAR’s supercomputer was constructed in Wyoming specifically because the electricity in the state cost less than that in Colorado. Thus, dozens of scientists and their families moved across the state line because doing so would allow them to emit more carbon dioxide than even NCAR could afford in its original location.

Read his essay here

Here’s the press release:

NCAR-Wyoming Supercomputing Center opens

October 15, 2012

CHEYENNE—The NCAR-Wyoming Supercomputing Center (NWSC), which houses one of the world’s most powerful supercomputers dedicated to the geosciences, officially opens today.

Scientists at the National Center for Atmospheric Research (NCAR) and universities across the country are launching a series of initial scientific projects on the center’s flagship, a 1.5-petaflop IBM supercomputer known as Yellowstone. These first projects focus on a wide range of Earth science topics, from atmospheric disturbances to subterranean faults, that will eventually help to improve predictions of tornadoes, hurricanes, earthquakes, droughts, and other natural hazards.

A handful of the Yellowstone supercomputer’s 100 racks. An iconic scene from its namesake national park is featured mosaic-style on the ends of each rack. The image by Michael Medford, licensed to National Geographic, centers on Fountain Geyser. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.) View more images in the Multimedia Gallery.

“This center will help transform our understanding of the natural world in ways that offer enormous benefits to society,” says Thomas Bogdan, president of the University Corporation for Atmospheric Research (UCAR), which manages NCAR on behalf of the National Science Foundation (NSF). “Whether it’s better understanding tornadoes and hurricanes, or deciphering the forces that lead to geomagnetic storms, the Yellowstone supercomputer and the NWSC will lead to improved forecasts and better protection for the public and our economy.”

Bogdan took part this morning in a formal opening ceremony with Wyoming Gov. Matt Mead, NSF director Subra Suresh, University of Wyoming (UW) vice president of research relations William Gern, NCAR director Roger Wakimoto, and other political and scientific leaders.

“This is a great day for scientific research, for the University of Wyoming, and for Wyoming,” says Mead. “Wyoming is proud to be part of the collaboration that has brought one of the world’s fastest computers to the state. The center will have a positive impact on our future, through the research done here and by sending the message that Wyoming is honored and equipped to be the home of this amazing facility.”

“The NCAR-Wyoming Supercomputing Center will offer researchers the opportunity to develop, access, and share complex models and data at incredibly powerful speeds,” says Suresh. “This is the latest example of NSF’s unique ability to identify challenges early and make sure that the best tools are in place to support the science and engineering research communities.”

Public-private partnership

Located on the western fringe of Cheyenne, Wyoming, the NCAR-Wyoming Supercomputing Center officially opened on October 15, 2012. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.) View more images in the Multimedia Gallery.

The NWSC is the result of a broad public-private partnership among NCAR, NSF, UW, the state of Wyoming, Cheyenne LEADS, the Wyoming Business Council, and Cheyenne Light, Fuel & Power. NCAR’s Computational and Information Systems Laboratory (CISL) will operate the NWSC on behalf of NSF and UCAR.

“We are delighted that this successful public-private partnership has delivered a major supercomputing center on time and on budget,” says NCAR director Roger Wakimoto.

Through the NWSC partnership, which will also seek to advance education and outreach, UW will have research use of 20 percent of NWSC’s main computing resource. In turn, UW will provide $1 million each year for 20 years in support of the program. The state of Wyoming also contributed $20 million toward the construction of the center.

“Our access to Yellowstone will allow the university to reach new heights in our educational and research endeavors in engineering; atmospheric, hydrological, and computational sciences; Earth system sciences; and mathematics,” says UW President Tom Buchanan. “The supercomputer is a huge draw for students and faculty. It opens the door to scientific innovation and discovery that will benefit our state, the nation, and the world.”

Located in Cheyenne’s North Range Business Park, near the intersection of I-80 and I-25, the 153,000-square-foot supercomputing center will provide advanced computing services to scientists across the United States. Most researchers will interact with the center remotely, via a laptop or desktop computer and the Internet.

Relative to the most recent ranking of the world’s fastest supercomputers, issued in June, the 1.5 petaflops peak system ranks in the top 20. The rankings constantly change as new and increasingly powerful supercomputers come online.

The main components consist of a massive central file and data storage system, a high performance computational cluster, and a system for visualizing the data.

Scientists will use the new center’s advanced computing resources to understand complex processes in the atmosphere and throughout the Earth system, and to accelerate research into severe weather, geomagnetic storms, climate change, carbon sequestration, aviation safety, wildfires, and other critical geoscience topics.

Future-proof design

Some of the NWSC’s complex mechanical systems, which were designed with performance, flexibility, and energy efficiency in mind.  (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.) View more images in the Multimedia Gallery.

CISL has operated supercomputers at NCAR’s Mesa Laboratory in Boulder since the 1960s, even though the building was not designed with supercomputing in mind. In recent years, new research questions have required more powerful computers to run increasingly complex computer simulations. The Mesa Lab has now reached the limits of its ability to provide the necessary energy and cooling capacity essential for the next generation of supercomputers.

The NWSC is expected to advance scientific discovery for the next several decades. Its design and construction have been “future proofed” by providing the scope to expand as supercomputing technology that does not exist today becomes available in the future.

Raised floors are key to the facility’s flexible design, allowing the computing systems, electrical supply, and cooling to be positioned and controlled for optimal energy use and ease of maintenance. The raised floor is also vented, so air can be circulated as needed to computing systems and servers.

The NWSC was awarded LEED Gold certification for its sustainable design. The center takes full advantage of Cheyenne’s elevation and cool, dry climate by employing ambient air to cool the facility nearly year round. This will significantly reduce the facility’s energy use.

A minimum of 10 percent of the power provided to the facility will be wind energy from the nearby Happy Jack Wind Farm. NCAR and UCAR will continue to explore options to increase the percentage of renewable energy provided to the facility in future years.

74 thoughts on “Wyoming experiences that “giant sucking sound” as new coal fired climate supercomputer is turned on”

  1. “A minimum of 10 percent of the power provided to the facility will be wind energy from the nearby Happy Jack Wind Farm.”

    Does that mean that they will turn the computer off when there is no wind (admittedly not often the case in Wyoming)?

  2. We experienced a power outage in Illinois about 1330 Hours Central Time. You don’t suppose…?

  3. In the hands of NOAA and NCAR it becomes the world’s largest GIGOmputer. Someday it may be truly useful in the hands of a reformed DOE.

  4. The video failed to mention the coal power used for the cooling of the giant heat source computer. What a surprise.

  5. So! Does a supercomputer lend credibility to the AGW thesis?
    Garbage data and flawed modeling software IN, then garbage OUT!!!!!!
    Oh boy! It’s going to predict tornadoes and hurricanes and everything for us… wow!
    It would seem to be a great way to cause a huge information catastrophe, putting all your eggs in one basket. This thing will be quite vulnerable to natural as well as man-made disasters.

    All grousing aside , I do hope some REAL (wherever they may be) scientists will be allowed to scrounge some computer time for research.

  6. Have to say I’d be happy too if I owned a wind farm like Happy Jack.

    Not that I’d live anywhere near it, of course.

  7. Trenberth will stand in front of it 1000 years from now and it will spit out 42!
    (OK, Trenberth’s descendants)

  8. Zzzzzzzzzzzzzzz……………..zzzzzzzzzzzzzzz…………………zzzzzzzzzzzzzzzz……………….zzzzzzzzzzzzzzz……………..zzzzzzzzzzzzzzzz……………………uh, can I wake up now?

    Why oh why do they keep treating everyone as stupid? Does anyone know? Why does the speed at which calculations are done by their version of Deep Thought mean that the answer out at the blunt end is the right one? Again someone is telling us how important sea ice is, concentrating almost solely on Arctic sea ice and its effect on Earth’s albedo, when at the low angle at which solar insolation reaches it, it cannot possibly have such a powerful effect on warming: six months the land of the midnight sun, six months pitch darkness, and freezing cold in both cases!

  9. Larger, faster computer. Same flawed models. They’ll be able to reach the same conclusions, only faster.

  10. “Raised floors are key to the facility’s flexible design…” Wow, what innovation … 60 years ago!

  11. Roger Knights says:
    October 16, 2012 at 8:06 am

    Couldn’t they do this more cheaply in the cloud?

    ###

    The “Cloud” isn’t some mystical power that we have learned to tap into. It is basically just a set of methods for organizing and accessing computing power. The computing power is still needed, meaning the hardware is still needed. The type of problems this system is designed to handle can be distributed, but if the distribution is over the Internet, then seconds would be added to each calculation that would take nanoseconds locally. I could say more, but I have to get to my morning SCRUM.

  12. Using cheap energy to calculate how evil cheap energy is …
    Do they even realize how strange they are?
    Shouldn’t these kinds of researchers be bound to use energy from solar and wind?
    It would give them some pause to think about how reality and green dreams go together.

  13. What is that, about 16,000 homes’ worth of electricity? A city of about 40,000 population. Can we get some UHI reading off that single building? ☺

  14. 8MW is OK so long as they power it from a wind turbine. Is there a “Green Screen of Death” in whatever OS it uses?

  15. What’s the need for such computer power? I thought they already knew the answer(s). Just insert the following statement…….”We must reduce our emissions of GHGs or……..”

    They don’t even do science. I’m sure an XBox would have been perfect for producing their scary scenarios.

  16. The whiff of wind power is touching, but should they not run the thing only on wind and solar power?

    I laughed at the part about the “Raised floors are key to the facility’s flexible design . . .”

    I think that might have been news about 1965 with the first delivery of the IBM S/360s.

  17. I think that “giant sucking sound” should be attributed to Colorado, or any other state that lost talent to Wyoming’s lower energy prices.

    “dozens of scientists and their families moved across the state line”

  18. With this kind of computing power they’ll be able to homogenize the data out to six decimal places.

  19. OT:
    “A123 Systems Inc. (AONE), the electric car battery maker that received a $249 million federal grant, filed for bankruptcy protection after failing to make a debt payment that was due yesterday.”

    “President Barack Obama called A123 Chief Executive Officer David Vieau and then-Michigan Governor Jennifer Granholm during a September 2010 event celebrating the opening of the plant in Livonia, Michigan, that the company received the U.S. grant to help build.”

    http://www.bloomberg.com/news/2012-10-16/electric-car-battery-maker-a123-systems-files-bankruptcy.html?cmpid=yhoo

  20. This looks like another political boondoggle. Alternately, it could be used for an entirely different purpose under the disguise of environmental science. Actually, it is far from the most advanced and fastest super-computer. Communist China has three of them. Each is almost 100 times faster than this beast. They are powered by thousands of Intel processors diverted by Lenovo to the Chinese military. I do not believe that they are using them to study climatology.

  21. Nice pretty video of the Earth’s weather. Yes, a huge mega computer. Why is it that the models of 24 years ago are really not much more accurate than today’s? Could it be shite in, shite out? If I were a computer modeler I too might be an alarmist if it meant that an institution/government would buy me one of those to play with. In another 24 years that computer too will be considered a dinosaur. Will people laugh then? Did he say that the first supercomputers were no more powerful than today’s cell phone? A typical laptop today has more raw CPU power than a Cray supercomputer from 1988. So what computer was Hansen using in 1980-1988? Oh, hang on a minute, here it is: http://pubs.giss.nasa.gov/abs/ha04600x.html No computer used here, but that was in 1981. Moot point maybe?

  22. alan says:
    October 16, 2012 at 9:49 am

    > OT:

    I consider it very rude to post OT stuff on a < 2 hour old post. This belongs on Tips & Notes or on an Open Thread. It's not like this is very surprising news….

  23. Their work so far has been back-of-the-envelope in terms of knowing what parameters are involved and their weights. This is another bunch of monkeys with expensive typewriters hoping to end up with Hamlet.

  24. They should have built it in Antarctica and moved the people down there. There’s plenty of solar power for six months of the year and plenty of free cooling year round.

  25. Harold Ambler writes on his blog today:

    You and a hundred friends could run around your town or city, let yourselves in unlocked doors every time you found one, and turn on all the lights, all the appliances, all the computers, all the televisions, and all the stereos, day after day, week after week, month after month, year after year, and you wouldn’t touch, you wouldn’t come close, to emitting what these scientists are now emitting, in the name of fighting climate change.

    Sorry, this is bogus. A cheap space heater draws 15 kW. Works fine on 15 or 20 amp circuits.

    8 MW, call it 9 MW, is 6,000 spaceheaters. 60 per friend, and I’ll check for goodies in the ‘fridge.

  26. That’s a lot of coins for computers that run models that apparently may not be worth gnat poop. Apparently, the venerable ENSO models, representing millions of dollars in research tax dollars collectively, need human input. So much so that this ENSO update was issued in September of this year. Apparently the next update in October will include seat of the pants input because the models may have missed “recent changes”, and have “known specific model biases”. If that is the case, why build expensive computing towers with expensive castles to house them in when predictions just need more salt?

    “The probabilities derived from the 25 or more models on the IRI/CPC plume describe, on average, development (or maintenance) of weak El Niño during the present time (September 2012). Such conditions have already developed in the SSTs, but the atmospheric response to the SSTs has been ambiguous so far. Without active participation of the atmosphere, the changes in atmospheric circulation, and the consequent climate teleconnections, cannot occur. Implicit in the model predictions of an approximately 80% chance of a weak El Niño persisting into northern winter 2012-13 is an assumption that the atmosphere will eventually respond to the warmed SSTs so that the event will sustain itself for several months. Following this latest model-based ENSO prediction plume, factors such as known specific model biases and recent changes that the models may have missed will be taken into account in the next official outlook to be generated and issued in early October by CPC and IRI, which will include some human judgement in combination with the model guidance.”

    http://iri.columbia.edu/climate/ENSO/currentinfo/technical.html

  27. WIDIMITWEED* at its most poignant.

    Mike.

    * What I Do Is More Important Than What Everyone Else Does

  28. If you want a real Supercomputer, LLNL is home to an IBM system named Sequoia. #1 on the June http://top500.org list, it delivers 16 PFlops and consumes 8 MW. Runs Linux.

    1.5 PFlops (I assume theoretical, not delivered to a benchmark) would place at #13 or so on the June Top 500 list. A new list comes out next month at the Supercomputer show in Salt Lake City.

  29. To those who suggested the computer run only on wind power, that is my vote too. Living just 150 miles north of the new computing facility, I can tell you that if they did run just off of wind, it’s fortunate that they waited until October to open. All those 90 plus degree days this summer had virtually no wind. It would have been a slow summer on the computing front.
    Wyoming just loves to lure in any business it can, especially if it comes with federal money attached. Of course, if Mr. Billionaire in Colorado and Ken Salazar manage to get the 1,000 turbines by Rawlins, the supercomputer can run off that. Except that defeats the purpose of coming to Wyoming for cheap electricity. In Wyoming, one gets to make the completely false claim of using wind energy (like there’s a separate grid, NOT) by paying $1.95 more per 100 kWh if you are a Rocky Mountain Power customer. I’m guessing that an extra $1.95 per 100 kWh would significantly raise the cost of power for a facility using 8 MW, if they are an RMP customer. Let’s see how much “on paper only” wind power the facility can actually afford.
    Also, Cheyenne is building a natural gas power plant to backup that earth-friendly wind. Let’s hope the super computer adds in all that CO2 when making their dire predictions.

  30. DesertYote says:
    October 16, 2012 at 8:58 am

    Roger Knights says:
    October 16, 2012 at 8:06 am

    Couldn’t they do this more cheaply in the cloud?

    The “Cloud” isn’t some mystical power that we have learned to tap into. It is basically just a set of methods for organizing and accessing computing power. The computing power is still needed, meaning the hardware is still needed. The type of problems this system is designed to handle can be distributed, but if the distribution is over the Internet, then seconds would be added to each calculation that would take nanoseconds locally.
    I realized that. What I should have added was that, if this computer is going to be running climate models only, it will be idle most of the time, so it would be more cost-effective to use only the computer power they need.

    However, I now realize that the most common use for this is likely going to be to assist with prediction of current weather, for which speed is needed, and which will keep the machine occupied.

  31. Ric Werme says:
    October 16, 2012 at 10:21 am
    alan says:
    October 16, 2012 at 9:49 am

    > OT:

    I consider it very rude to post OT stuff on a < 2 hour old post. This belongs on Tips & Notes or on an Open Thread. It's not like this is very surprising news….

    I agree. (In fact, I posted a note myself there on this news item (a battery-maker’s bankruptcy)). But the blame mostly belongs to the moderators who regularly let these OT comments through. And to WordPress, for not allowing the moderators to redirect comments to other threads.

  32. “A minimum of 10 percent of the power provided to the facility will be wind energy from the nearby Happy Jack Wind Farm.”

    Well, Happy Jack Wind Farm, owned by Duke Energy Commercial, has fourteen 123 m high wind turbines installed; nominal capacity of each is 2.1 MW. However, as the capacity factor there is 37-40%, that makes about 11 MW average power output. The area of the farm is 750 acres; that is, the power flux collected is 3.6 W/m². Duke Energy has a long-term (20-year) contract to sell Cheyenne Light 15 MW of the facility’s energy output under a power purchase agreement. In other words, the product is oversold (by 36%). I wonder how NWSC could get any more out of this –4 MW surplus.

    It is quite possible Duke Energy can buy cheap coal-generated power on the market and pass it on as costly wind power. It is cheating, of course, and probably against the law as well, but hey, who would check it if it makes one feel good? Anyway, electrons are so similar, they can’t even recognize themselves (see: Fermi-Dirac statistics).

    Renewable Energy at Cheyenne Light, Fuel & Power
    HAPPY JACK WINDPOWER, CHEYENNE, WYOMING
    – Wyoming Pipeline Authority: Happy Jack Wind Farm
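The arithmetic in the comment above can be reproduced with a quick sketch. All inputs (turbine count, nameplate rating, capacity factor, acreage, contracted power) are the commenter’s figures, not independently verified:

```python
# Sanity check of the Happy Jack figures quoted in the comment above.
turbines = 14
nameplate_mw = 2.1          # per turbine, as quoted
capacity_factor = 0.375     # midpoint of the quoted 37-40%
area_acres = 750

nominal_mw = turbines * nameplate_mw        # 29.4 MW nameplate
average_mw = nominal_mw * capacity_factor   # ~11 MW average output

area_m2 = area_acres * 4046.86              # 1 acre = 4046.86 m^2
flux_w_per_m2 = average_mw * 1e6 / area_m2  # ~3.6 W/m^2

contracted_mw = 15
oversold_pct = (contracted_mw / average_mw - 1) * 100   # ~36%

print(f"average output: {average_mw:.1f} MW")
print(f"power flux: {flux_w_per_m2:.1f} W/m^2")
print(f"oversold by: {oversold_pct:.0f}%")
```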

  33. Ric Werme says:
    October 16, 2012 at 10:46 am

    If you want a real Supercomputer, LLNL is home to an IBM system named Sequoia. #1 on the June http://top500.org list, it delivers 16 PFlops and consumes 8 MW. Runs Linux.

    1.5 PFlops (I assume theoretical, not delivered to a benchmark) would place at #13 or so on the June Top 500 list. A new list comes out next month at the Supercomputer show in Salt Lake City.

    Normally I don’t reference the US Air Force as a way to do things cheaply, but if your climate models can run on just the 33rd-largest supercomputer (500 TFlops), you can copy the USAF and build one out of 1,760 Sony PlayStation 3s: see here.

    Some notable features of the PS3 array:

    The Condor Cluster project began four years ago, when PlayStation consoles cost about $400 each. At the same time, comparable technology would have cost about $10,000 per unit. Overall, the PS3s for the supercomputer’s core cost about $2 million. According to AFRL Director of High Power Computing Mark Barnell, that cost is about 5-10% of the cost of an equivalent system built with off-the-shelf computer parts.

    Another advantage of the PS3-based supercomputer is its energy efficiency: it consumes just 10% of the power of comparable supercomputers.

    I note I can get PS3s from Amazon for $285. I don’t know what kind of discount you’d get if you ordered 1760 of them.
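Putting the quoted prices side by side (a sketch using only the numbers in this comment; note the $2 million figure for the cluster’s core evidently covers more than the bare consoles):

```python
# Cost comparison from the comment above; all prices are the quoted ones.
units = 1760
price_at_launch = 400           # USD per PS3 when the project began
price_now = 285                 # USD per PS3 (Amazon, per the comment)
equivalent_unit_cost = 10_000   # quoted cost of comparable conventional hardware

core_cost = units * price_at_launch          # $704,000 for the consoles alone
conventional = units * equivalent_unit_cost  # $17.6M for equivalent capacity
today = units * price_now                    # $501,600 at current pricing

print(f"console cost at launch: ${core_cost:,}")
print(f"equivalent conventional hardware: ${conventional:,}")
print(f"at today's price: ${today:,}")
```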

  34. It does not matter how much computing power is used, because if the models are faulty or rigged then they will give wrong answers. Climate models are worthless.

  35. Berényi Péter says:

    “It is quite possible Duke Energy can buy cheap coal generated power on the market and pass it on as costly wind power.”

    But that would be wrong. Very, very wrong. But, fortunately for some, almost impossible to track.

    Mike.

  36. Wijnand says:
    October 16, 2012 at 11:57 am

    > @Ric Werme, October 16, 2012 at 10:26 am

    > 15 kW for a space heater? Don’t you mean 1.5 kW?

    Oops. Yep. Looks like the 60 per friend is right, though.

  37. Alan Watt, Climate Denialist Level 7 says:
    October 16, 2012 at 2:01 pm

    Normally I don’t reference the US Air Force as a way to do things cheaply, but if your climate models can run on just the 33rd largest supercomputer (500 TFlops), you can copy the USAF and build one out of 1760 Sony Play Station 3s:

    So far I’ve bitten my tongue when people dismissively refer to climate models run on game consoles, but there’s a tremendous amount of floating-point horsepower available on high-end graphics chips. Game consoles often have a low (or negative!) price markup, so if your application doesn’t need state-of-the-art interconnects and you have a fair amount of grunt labor handy, you can build some remarkable systems out of the toys.

  38. Let’s see….
    1 kWh = about 2 lbs of CO2 output
    1 MWh = about 1 short ton
    8 MW = about 8 tons per hour = about 200 tons per day
    200 tons per day = about 73 kilotons of CO2 per year
    Of course, we don’t know if that’s rated capacity or expected usage, although one of the linked articles cites 12 MW as the theoretical capacity.

    @Alan Watt,
    That would be extremely cool, except Sony has rewritten the firmware of the PS3 to prevent, among other things, exactly this use. The Air Force has to maintain their own version of the PS3 firmware, and has to overwrite the installed firmware when replacing failed units.
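As a cross-check of the emissions arithmetic above, using the comment’s rule of thumb of roughly 2 lb of CO2 per kWh of coal generation (a rough figure; actual usage rather than nameplate capacity would determine the real total):

```python
# Rough CO2 tally for an 8 MW coal-powered load, using the comment's
# rule of thumb of ~2 lb CO2 per kWh of coal generation.
lb_per_kwh = 2.0
power_mw = 8.0

lb_per_hour = power_mw * 1000 * lb_per_kwh   # 16,000 lb/h
tons_per_hour = lb_per_hour / 2000           # 8 short tons/h
tons_per_day = tons_per_hour * 24            # 192 tons/day, ~200
tons_per_year = tons_per_day * 365           # ~70,000 short tons/year

print(f"{tons_per_day:.0f} tons/day, {tons_per_year / 1000:.0f} kilotons/year")
```

Rounding to 200 tons per day, as the comment does, gives about 73 kilotons per year.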

  39. With Intel’s planned 5 nm process node technology due in 2019, a competitor could produce a supercomputing facility with 10 times the computing power using a tenth of the energy of this current facility, in 8 years’ time.

  40. ****
    Measuring 108,000 square feet in total with 15,000-20,000 square feet of raised floor, it will be built for 8 megawatts of power, with 4-5 megawatts for computing and 3-4 for cooling.
    ****

    As a power engineer, I’d say 4-5 MW for computing is a scandalous waste. Only real prime movers warrant that scale of power usage. :)

  41. Ric Werme says (October 16, 2:35 pm)
    “… there’s a tremendous amount of floating point horsepower available on high-end graphics chips.”

    Sure is. A pal of mine has one – his graphics card alone draws twice the power of my entire (desktop) PC when it’s doing its thing. There’s a price to be paid though – the impressive copper heatsink plus fan is so big he can’t plug any cards into any of the 3 adjacent sockets on his motherboard. :-)

    On the upside, he generously foots the electricity bill run up by donating its spare time to folding@home, a very decent thing to do.

  42. garymount says:
    October 16, 2012 at 3:11 pm
    With Intel’s planned 5 nm process node technology due in 2019, a competitor could produce a supercomputing facility with 10 times the computing power using a tenth of the energy of this current facility, in 8 years’ time.

    Don’t worry. They’ll have an appropriations request and purchase order for that one too, seeing as this one will be obsolete in their eyes within 18 months.

  43. With luck the supercomputers will not be obsolete by the time it is realized that the current CAGW paradigm is as full of holes as a wicker bedpan.

    For then they could be put to some useful purpose.

  44. All that to run a few spreadsheets. Or does it come with AI-on-a-chip? Does it design and build its own software, because what humans have produced up till now is not without problems.

  45. Alan Watt, Climate Denialist Level 7 says:
    October 16, 2012 at 2:01 pm

    ###

    The PS3 is a cheaply built and simplified version of one of IBM’s reference designs for systems built specifically to be used as components in supercomputers. The PlayStation thing was more of a gimmick. Just think about the MTBF of 2,000 boxen!

  46. I attended the grand opening of the Wyoming Supercomputing Center yesterday. It was amazing to walk down the rows of Yellowstone’s server stacks and hear and feel the power within that room. Yellowstone will be used for much more than climate models. You can see the initial seven University of Wyoming research projects that will utilize Yellowstone here: http://www.uwyo.edu/uw/news/2012/06/seven-uw-research-projects-chosen-for-first-cycle-of-supercomputer-use.html
    Also, the wind turbines at the Happy Jack Wind Farm were churning in stout Wyoming wind yesterday.

  47. When the wind power stops, a wail arises from basement as a weeping system administrator screams “No! No!! They promised me I’d never have to run fsck on this machine. They promised!”

  48. I do hope that it will be used for things other than “climate”; I would not trust wind to power a single elevator in the building… unless elevators come with UPSes… An 8 MW UPS? That would be cool.

  49. Primal Outdoors says:
    October 16, 2012 at 7:07 pm

    It’s good to hear that the servers will be used for much more than climate “research” – quite frankly, climate models are a BIG waste of time and money compared to other computational/numerical work that can be done in such a facility.

  50. I thought you meant NASCAR.
    The UK Met Office boasts that their latest supercomputer is the world’s largest. It still can’t get the weather correct.
    It is still Garbage In Garbage Out.

  51. I’ve been reading WUWT ever since Climategate I, but never felt compelled to comment because most of the subject matter is miles over my head. As it happens, I live about 5 miles from NCAR on Happy Jack Road. One thing I can tell you with certainty is, they will NEVER have to shut those computers down for lack of wind power; we consider 20mph a mild breeze out in these parts. The only time those turbines ever stop is when the wind is blowing too hard for them to keep up.

    From a political viewpoint, I’m just happy to see more money flowing into my tiny backwoods part of the world.

  52. Statements like the one Peter made are why our state is littered with bird-killing monstrosities. People think there is usable wind constantly. Last summer, wind generation dropped off to half of what it was over the colder months (from Wikipedia; did not have time to independently verify). So in the hottest part of the year, production drops to half. Correct me if I am wrong, but don’t they need the most power during the summer?
    Also, it is NOT true that the wind blows all the time in Wyoming. Turbines here exceed the typical 30%-or-less capacity factor, but they by no means make electricity ALL the time; it’s less than 50%. And as noted, the wind may exceed 50 mph, which is very tough on turbines. Monday, with high wind, 9 out of 11 turbines were turning north of Casper. Bet that cuts the output percentage even more.
    I would agree that many people in Wyoming share Peter’s sentiment; they never met a federal subsidy or business they didn’t like. Again, I recommend prostitution and gambling. They are huge moneymakers.

  53. Maybe now they could calculate the “Bird Kill Rate per Day” of those wind monstrosities, calculate why those towers are making people physically sick, and calculate the additional treatment required and other health factors involved.
    Better still, calculate how much people lose in the value of their property when those “Mobile Wildlife Shredding Machines” are built on top of hills within view of their property.
    Now that would be putting those taxpayer-funded teraflops to good use!

    One can dream.

  54. During the bird flu scare, I suggested, in a letter to the Denver Post, a series of wind turbines built along all Colorado state borders. Two benefits:

    1 – Prevent the entry of birds, carriers of ‘bird flu’.

    2 – ‘Free’ energy.

    I observed this project would kill two birds with one stone.

    Or would it have been two birds with one turbine?

    The letter was not published.

Comments are closed.