Meet 'Cheyenne', the coal-powered climate-predicting supercomputer

There’s been some buzz in the last week as it was announced that NCAR’s new climate supercomputer was finally online. Back in 2010, I ran a story called “NCAR’s Dirty Little Secret,” which outlined the new supercomputing center being built in Wyoming while NCAR itself is in “green” Boulder, Colorado. From that article:


“Having an NCAR supercomputing facility in Wyoming will be transformative for the University of Wyoming, will represent a significant step forward in the state’s economic development, and will provide exceptional opportunities for NCAR to make positive contributions to the educational infrastructure of an entire state,” says William Gern, the university’s vice president for research and economic development.

Gosh, what an opportunity for Wyoming. But why give the opportunity away? Colorado doesn’t want this opportunity? None of the politicians in Colorado want to be able to say to their constituents that they brought “economic development” and “positive contributions to the educational infrastructure of an entire state”? That doesn’t seem right.

The answer may very well lie in economics, but not the kind they mention in feel good press releases.

You see, as we know, supercomputers need a lot of energy to operate. And because they operate in enclosed spaces, they need a lot of energy to keep them cool so they don’t burn up from the waste heat they generate.

For all their sophistication, without power for operation and cooling, a supercomputer is just dead weight and space.

Electricity is king.

Interestingly, in the press releases and web pages, NCAR provides no answers (at least none that were easy to find) as to how much electricity the new supercomputer might use for operation and cooling. It also provides no explanation as to why Colorado let this opportunity go to another state. I had to dig into NCAR’s interoffice staff notes to find the answer.

The answer is: electricity.

Measuring 108,000 square feet in total with 15,000-20,000 square feet of raised floor, it will be built for 8 megawatts of power, with 4-5 megawatts for computing and 3-4 for cooling.

8 megawatts! Yowza.

(I noted then that electricity is significantly cheaper in Wyoming.)

[Chart of state electricity prices; click for source data]

So besides the fact that NCAR abandoned “green” Colorado for its cheaper electricity rates and bond program, what’s the “dirty little secret”?

Coal, the “dirtiest of fuels”, some say.

According to Sourcewatch, Wyoming is quite something when it comes to coal. Emphasis mine.

Wyoming is the nation’s highest coal producer, with over 400 million tons of coal produced in the state each year. In 2006, Wyoming’s coal production accounted for almost 40% of the nation’s coal.[1] Currently Wyoming coal comes from four of the State’s ten major coal fields. The Powder River Coal Field has the largest production in the world – in 2007, it produced over 436 million short tons.[2]

Wyoming coal is shipped to 35 other states. The coal is highly desirable because of its low sulfur levels.[3] On average Wyoming coal contains 0.35 percent sulfur by weight, compared with 1.59 percent for Kentucky coal and 3 to 5 percent for other eastern coals. Although Wyoming coal may have less sulfur, it also has a lower “heat rate,” or fewer Btu’s of energy. On average Wyoming coal has 8600 Btu’s of energy per pound, while Eastern coal has heat rates of over 12,000 Btu’s per pound, meaning that plants have to burn 50 percent more Wyoming coal to equal the power output from Eastern coal.[4]

Coal-fired power plants produce almost 95% of the electricity generated in Wyoming. Wyoming’s average retail price of electricity is 5.27 cents per kilowatt hour, the 2nd lowest rate in the nation[5]
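To put those figures together, here is a rough back-of-envelope sketch in Python. The 8 MW design load and the 5.27¢/kWh Wyoming rate come from the material quoted above; the Colorado rate is a hypothetical round number added purely for comparison, not a figure from the article.

```python
# Rough annual electricity cost for an 8 MW facility running flat out.
# The 8 MW and 5.27 cents/kWh figures are from the article; the
# Colorado rate is an assumed value for illustration only.

design_load_kw = 8_000        # 8 MW: 4-5 MW for computing, 3-4 MW for cooling
hours_per_year = 8_760

rates_usd_per_kwh = {
    "Wyoming (5.27 c/kWh, quoted above)": 0.0527,
    "Colorado (9 c/kWh, assumed)": 0.09,
}

annual_kwh = design_load_kw * hours_per_year   # ~70 million kWh per year

for label, rate in rates_usd_per_kwh.items():
    print(f"{label}: ${annual_kwh * rate:,.0f} per year")
# Roughly $3.7M/yr at the Wyoming rate versus ~$6.3M/yr at the assumed
# Colorado rate: a multi-million-dollar annual difference.
```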


[Photo: the NCAR-Wyoming Supercomputing Center building]

CHEYENNE, Wyoming — The National Center for Atmospheric Research (NCAR) is launching operations this month of one of the world’s most powerful and energy-efficient supercomputers, providing the nation with a major new tool to advance understanding of the atmospheric and related Earth system sciences.

Named “Cheyenne,” the 5.34-petaflop system is capable of more than triple the amount of scientific computing performed by the previous NCAR supercomputer, Yellowstone. It also is three times more energy efficient.

[Photo: the Cheyenne supercomputer]

Scientists across the country will use Cheyenne to study phenomena ranging from wildfires and seismic activity to gusts that generate power at wind farms. Their findings will lay the groundwork for better protecting society from natural disasters, lead to more detailed projections of seasonal and longer-term weather and climate variability and change, and improve weather and water forecasts that are needed by economic sectors from agriculture and energy to transportation and tourism.

“Cheyenne will help us advance the knowledge needed for saving lives, protecting property, and enabling U.S. businesses to better compete in the global marketplace,” said Antonio J. Busalacchi, president of the University Corporation for Atmospheric Research. “This system is turbocharging our science.”

UCAR manages NCAR on behalf of the National Science Foundation (NSF).

Cheyenne currently ranks as the 20th fastest supercomputer in the world and the fastest in the Mountain West, although such rankings change as new and more powerful machines begin operations. It is funded by NSF as well as by the state of Wyoming through an appropriation to the University of Wyoming.

Cheyenne is housed in the NCAR-Wyoming Supercomputing Center (NWSC), one of the nation’s premier supercomputing facilities for research. Since the NWSC opened in 2012, more than 2,200 scientists from more than 300 universities and federal labs have used its resources.

“Through our work at the NWSC, we have a better understanding of such important processes as surface and subsurface hydrology, physics of flow in reservoir rock, and weather modification and precipitation stimulation,” said William Gern, vice president of research and economic development at the University of Wyoming. “Importantly, we are also introducing Wyoming’s school-age students to the significance and power of computing.”

The NWSC is located in Cheyenne, and the name of the new system was chosen to honor the support the center has received from the people of that city. The name also commemorates the upcoming 150th anniversary of the city, which was founded in 1867 and named for the American Indian Cheyenne Nation.

Visualization created using Yellowstone

Contour lines and isosurfaces provide valuable information about turbulence and aerodynamic drag in this visualization of air flow through the blades of a wind turbine, the product of a simulation on the NCAR-Wyoming Supercomputing Center’s Yellowstone system. (Image courtesy Dimitri Mavriplis, University of Wyoming.)

 

INCREASED POWER, GREATER EFFICIENCY

Cheyenne was built by Silicon Graphics International, or SGI (now part of Hewlett Packard Enterprise Co.), with DataDirect Networks (DDN) providing centralized file system and data storage components. Cheyenne is capable of 5.34 quadrillion calculations per second (5.34 petaflops, or floating point operations per second).

The new system has a peak computation rate of more than 3 billion calculations per second for every watt of energy consumed. That is three times more energy efficient than the Yellowstone supercomputer, which is also highly efficient.
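Those two press-release numbers can be cross-checked with a one-line calculation; a sketch only, since the actual facility power draw is not given here:

```python
# Implied peak power draw: peak flops divided by the quoted flops-per-watt.
peak_flops = 5.34e15      # 5.34 petaflops
flops_per_watt = 3e9      # "more than 3 billion calculations per second per watt"

implied_mw = peak_flops / flops_per_watt / 1e6
print(f"Implied draw at peak: {implied_mw:.2f} MW")   # ~1.78 MW
```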

The data storage system for Cheyenne provides an initial capacity of 20 petabytes, expandable to 40 petabytes with the addition of extra drives.  The new DDN system also transfers data at the rate of 220 gigabytes per second, which is more than twice as fast as the previous file system’s rate of 90 gigabytes per second.
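For a sense of scale, a short sketch of what those transfer rates mean in practice, assuming decimal units (the press release does not specify):

```python
# Time to stream the full 20 PB initial capacity at each quoted rate.
capacity_bytes = 20e15                     # 20 petabytes
rates_bytes_per_sec = {
    "new DDN file system": 220e9,          # 220 GB/s
    "previous file system": 90e9,          # 90 GB/s
}

for name, rate in rates_bytes_per_sec.items():
    hours = capacity_bytes / rate / 3600
    print(f"{name}: {hours:.0f} hours to move 20 PB")
# About 25 hours at 220 GB/s versus about 62 hours at 90 GB/s.
```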

Cheyenne is the latest in a long and successful history of supercomputers supported by the NSF and NCAR to advance the atmospheric and related sciences.

“We’re excited to provide the research community with more supercomputing power,” said Anke Kamrath, interim director of NCAR’s Computational and Information Systems Laboratory, which oversees operations at the NWSC. “Scientists have access to increasingly large amounts of data about our planet. The enhanced capabilities of the NWSC will enable them to tackle problems that used to be out of reach and obtain results at far greater speeds than ever.”

MORE DETAILED PREDICTIONS

High-performance computers such as Cheyenne allow researchers to run increasingly detailed models that simulate complex events and predict how they might unfold in the future. With more supercomputing power, scientists can capture additional processes, run their models at a higher resolution, and conduct an ensemble of modeling runs that provide a fuller picture of the same time period.
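To make the ensemble idea concrete, here is a minimal toy sketch in Python, with the chaotic Lorenz (1963) system standing in for a weather model; this illustrates the principle only and is not NCAR code. Tiny perturbations to the initial state diverge, and the spread across ensemble members carries uncertainty information that a single run cannot provide.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

rng = np.random.default_rng(0)
n_members, n_steps = 20, 2000

# Ensemble: one base state plus tiny random perturbations per member.
states = np.array([1.0, 1.0, 1.0]) + 1e-6 * rng.standard_normal((n_members, 3))

for _ in range(n_steps):
    states = np.array([lorenz_step(s) for s in states])

# The members have diverged; the spread conveys the forecast uncertainty.
print("ensemble mean of x:  ", states[:, 0].mean())
print("ensemble spread of x:", states[:, 0].std())
```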

“Providing next-generation supercomputing is vital to better understanding the Earth system that affects us all, ” said NCAR Director James W. Hurrell. “We’re delighted that this powerful resource is now available to the nation’s scientists, and we’re looking forward to new discoveries in climate, weather, space weather, renewable energy, and other critical areas of research.”

Some of the initial projects on Cheyenne include:

Long-range, seasonal to decadal forecasting: Several studies led by George Mason University, the University of Miami, and NCAR aim to improve prediction of weather patterns months to years in advance. Researchers will use Cheyenne’s capabilities to generate more comprehensive simulations of finer-scale processes in the ocean, atmosphere, and sea ice. This research will help scientists refine computer models for improved long-term predictions, including how year-to-year changes in Arctic sea ice extent may affect the likelihood of extreme weather events thousands of miles away.

Wind energy: Projecting electricity output at a wind farm is extraordinarily challenging as it involves predicting variable gusts and complex wind eddies at the height of turbines, which are hundreds of feet above the sensors used for weather forecasting. University of Wyoming researchers will use Cheyenne to simulate wind conditions on different scales, from across the continent down to the tiny space near a wind turbine blade, as well as the vibrations within an individual turbine itself. In addition, an NCAR-led project will create high-resolution, 3-D simulations of vertical and horizontal drafts to provide more information about winds over complex terrain. This type of research is critical as utilities seek to make wind farms as efficient as possible.

Space weather: Scientists are working to better understand solar disturbances that buffet Earth’s atmosphere and threaten the operation of satellites, communications, and power grids. New projects led by the University of Delaware and NCAR are using Cheyenne to gain more insight into how solar activity leads to damaging geomagnetic storms. The scientists plan to develop detailed simulations of the emergence of the magnetic field from the subsurface of the Sun into its atmosphere, as well as gain a three-dimensional view of plasma turbulence and magnetic reconnection in space that lead to plasma heating.

Extreme weather: One of the leading questions about climate change is how it could affect the frequency and severity of major storms and other types of severe weather. An NCAR-led project will explore how climate interacts with the land surface and hydrology over the United States, and how extreme weather events can be expected to change in the future. It will use advanced modeling approaches at high resolution (down to just a few miles) in ways that can help scientists configure future climate models to better simulate extreme events.

Climate engineering: To counter the effects of heat-trapping greenhouse gases, some experts have proposed artificially cooling the planet by injecting sulfates into the stratosphere, which would mimic the effects of a major volcanic eruption. But if society ever tried to engage in such climate engineering, or geoengineering, the results could alter the world’s climate in unintended ways. An NCAR-led project is using Cheyenne’s computing power to run an ensemble of climate engineering simulations to show how hypothetical sulfate injections could affect regional temperatures and precipitation.

Smoke and global climate: A study led by the University of Wyoming will look into emissions from wildfires and how they affect stratocumulus clouds over the southeastern Atlantic Ocean. This research is needed for a better understanding of the global climate system, as stratocumulus clouds, which cover 23 percent of Earth’s surface, play a key role in reflecting sunlight back into space. The work will help reveal the extent to which particles emitted during biomass burning influence cloud processes in ways that affect global temperatures.

120 Comments
co2isnotevil
February 22, 2017 9:40 am

Low-level simulations of the kind this supercomputer is targeted at are counterproductive for predicting the future climate. Weak solutions to Navier-Stokes quantifying the mean behavior are shown to exist, and relative to the climate, all that matters is the mean behavior of the turbulence, not the low-level details of that turbulent behavior.

Mike O
Reply to  co2isnotevil
February 23, 2017 1:12 pm

It makes it a lot easier when they know the answer before they run the simulation. Makes it a lot more accurate. /s

catweazle666
Reply to  co2isnotevil
February 24, 2017 1:46 pm

The output is extremely useful to determine precisely how much the temperature databases need to be Mannipulated to match the output from the models.

richard verney
February 22, 2017 9:42 am

What a waste of money.

MarkW
Reply to  richard verney
February 22, 2017 9:58 am

Why?

Reply to  MarkW
February 22, 2017 10:13 am

MarkW,
“Why?”
See the first comment. While in general supercomputers are useful for some things, they are far from useful when trying to predict climate change, since the number of unknowns is far too large, especially when simulating at the level of detail required. While you may be able to predict a climate, you will not be predicting our climate, and most likely will not even predict a possible climate.

MarkW
Reply to  MarkW
February 22, 2017 10:25 am

Climate is only one of the things this supercomputer will be used for.

emsnews
Reply to  MarkW
February 22, 2017 11:51 am

It is the old ‘garbage in/garbage out’ problem.

goldminor
Reply to  richard verney
February 22, 2017 10:38 am

Agreed, that was my first thought as well. They build this fine machine, but are going to use it to prove their dysfunctional logic that CO2 drives the climate of this planet.

Sheri
Reply to  goldminor
February 22, 2017 10:57 am

Living in Wyoming, I also found it ironic that the computer is trying to prove Wyoming coal should be outlawed, followed by oil and gas.

Reply to  goldminor
February 22, 2017 11:05 am

Fortunately, the Wyoming Legislature has a plan to tax wind power. That’ll keep Wyoming power reliable and not waste taxpayers’ money on a useless feel-good liberal project.

Sheri
Reply to  goldminor
February 22, 2017 12:31 pm

joelobryan: There is a $1.00 per MWh tax right now. The legislature talked about increasing this, but that did not pass, so far as I know. Sadly, Wyomingites can be very liberal in many cases. There is no real objection to federal handouts for feel-good projects if the money goes to Wyomingites or might go here. One senator ran this year on a platform of how much money he could get the feds to send to Wyoming. He won.
Wyoming Power Company had a fit about the tax thing—as noted, they want to put in 1000 turbines and send the electricity to California. There was also a bill to require Wyoming power companies to buy coal and gas since that’s what pays the bills here, but I don’t think that one went through either.

Reply to  goldminor
February 22, 2017 2:10 pm

Wyoming has its Jackson liberals, just as Texas has its Austin liberals, just as Arizona has its Tucson liberals. Thankfully, for now they are a political minority in the electorate.

Reply to  richard verney
February 22, 2017 11:14 am

At the least all that power can be used to heat the surrounding buildings during Cheyenne’s cold winters.

Tom Halla
February 22, 2017 9:43 am

And Wyoming’s power grid is presumably less likely to crash, given less penetration by “renewables”?

Sheri
Reply to  Tom Halla
February 22, 2017 11:04 am

Actually, there are an annoying number of wind turbines in this state. The computer center claims to run on wind energy, but that is deceptive. They buy CREDITS; they run on the same electricity as everyone else, MOSTLY from coal.
Wyoming’s wind is extremely unpredictable and pretty much useless for wind “energy.” We went 5 months last year with nearly no wind in the Casper area. Now the wind is over 40 mph. It’s insane to think this is useful in any way, except for tax credits and for those using RPSs to lock in profit, even if the turbines get turned off. No rational person should think this is anything but a government income-redistribution plan that does great environmental damage.

Reply to  Sheri
February 22, 2017 11:19 am

Sheri:
Having ridden completely across Wyoming on horseback, I can say without doubt that windmills are very useful. You know, the very old mechanical water-pump type used to fill remote watering troughs far from electrical services. Wyoming is a beautiful state.
The electrical kind, not so much.
On the other side, good computers are essential, but what they are used for is an issue. GCMs are just one minor issue. Think about medical research, space exploration, and all the things listed in the article. For example: http://www.cbc.ca/natureofthings/episodes/%20%20cracking-cancer Computing is essential in research. Good computers are neither good nor bad. The research done on them may be useful … or wasteful. However, what one thinks depends on a person’s perspective/bias.
Have a good day.

Sheri
Reply to  Sheri
February 22, 2017 12:25 pm

I have nothing against windmills—only wind turbines. Since I’ve worked with computers off and on for 35 years, I really don’t have a dislike of them, either. They are frustrating at times and can cause problems, but today I was checking the webcams to see how much snow there is across I-80 because I’ll be going to Utah in a few days. They are wonderful for road conditions. Also, as you mention, medical research. I really wouldn’t call them great for social networking such as Facebook, but blogging—I’m in for that too. If I sounded negative about computers, that was not my intent. My negativity is toward wind turbines and claims of using wind energy when one is not doing so.

MarkW
Reply to  Sheri
February 22, 2017 12:56 pm

I interviewed for a position in Cheyenne a couple of years ago. Sadly, I didn’t get it.

Dinsdale
February 22, 2017 9:45 am

The People’s Republic of Boulder has an electric plant that runs partly on coal (from western CO, not WY) and partly on natural gas:
https://bouldercolorado.gov/energy-future/valmont-generating-station
The coal-burning side will be shut down this year to “reduce emissions.” But no new generation is planned to replace it, so Boulder will just import electricity from other coal-sourced plants in the area. Typical short-sighted thinking of the greenies.
NCAR is situated up on a bluff by the Flatirons (just like a Greek temple). It probably didn’t have the room to add a new supercomputer and its attendant cooling system, so they had to put it somewhere else. NCAR already had some facilities in Cheyenne so that seems like a logical place to put it.

February 22, 2017 9:48 am

The Met Office got a new computer, and look what it did for their results… nothing, apparently 😀
NCAR is where very strange things happen to data. So very strange things will happen to obs data with even greater speed. Wonderful.
Choosing the site because of cheaper coal power: oh boy, do these scam artists really push it.

Butch
February 22, 2017 9:50 am

..Does not matter how much computation a computer can do…Garbage in, Garbage out ! This fact will never change ….IMHO….

emsnews
Reply to  Butch
February 22, 2017 11:52 am

We think alike.

troe
February 22, 2017 9:52 am

We have one of these at ORNL named Titan. Pretty sure it was used to produce the Marcott paleoclimate reconstruction. In other words, a more powerful machine won’t produce better science on a rotten foundation.

AleaJactaEst
February 22, 2017 9:54 am

GIGO, they just get there an awful lot quicker.

MarkW
Reply to  AleaJactaEst
February 22, 2017 10:00 am

We get you wrong answers twice as fast as the other guys.

MarkW
February 22, 2017 9:56 am

“Emphasis mine.”
As in coal mine?

MarkW
February 22, 2017 9:57 am

Year round, Cheyenne is a bit cooler than Colorado. I suspect that will help to make the air conditioning a little more efficient.

Reply to  MarkW
February 22, 2017 11:08 am

Staff costs (salaries for maintenance and technicians) are lower in Cheyenne than Boulder too.

UK Marcus
February 22, 2017 10:06 am

The 8 GW needed to operate this computer is the same as the total power produced by nuclear, and twice the total produced by wind in the UK today. Some machine… Hope it’s worth the cost.

MarkW
Reply to  UK Marcus
February 22, 2017 10:09 am

Megawatts, not gigawatts.

UK Marcus
Reply to  UK Marcus
February 22, 2017 10:11 am

Whoops, got that wrong – should be MW not GW – so comparison does not apply…

Butch
Reply to  UK Marcus
February 22, 2017 1:41 pm

UK Marcus, don’t be like Canada Marcus…..He got fired !! LOL

February 22, 2017 10:07 am

Confession: I do love super-computers. That first class, trying to wrestle with an Ill-Begotten Monstrosity, followed by the string of Control Data Corp. super-computers that allowed us to do real-world-affecting work to make better cars, trucks, cranes, power tools, baby baths…hooked me. Even after porting to the nifty SGI, Sun, AT&T… work-stations, it was never the same. Micro-computers and tablets are like banging rocks together again.
But the tax-victim funding of most of them is thoroughly embarrassing.

Crispin in Waterloo but really in Bishkek
Reply to  Mib8
February 22, 2017 2:53 pm

How much of the computer’s capacity will be devoted to sorting through all the emails, text messages, phone calls, VOIP discussions, walkie talkie traffic, radio broadcasts, computer backups and baby cams pilfered from across the world?

Neal A Brown
Reply to  Mib8
February 22, 2017 3:18 pm

“Ill Begotten Monstrosity” – I love it! LMAO!!
ps: I’m old enough to appreciate that.

RockyRoad
Reply to  Mib8
February 22, 2017 10:34 pm

We always said IBM meant “It’s Better Manually.” I was able to acquire the third IBM PC that hit the state of Nevada: it had 640 KB of memory and a 15-megabyte external hard drive. The thing was loaded with $10,000 worth of geostatistical software, and we eventually expanded to several PCs to do real-time grade control and mine planning.
Those were so much better than the “pay-to-play” VAX 11/780 I had previously used that I was in seventh heaven! And we were able to deep-six the MIS group at corporate headquarters, which was too costly, slow, and hence useless.

Mark Luhman
Reply to  RockyRoad
February 23, 2017 7:51 am

No, IBM does not stand for “It’s Better Manually”; it really meant “Imitation Burroughs Machine.” Funny, I once picked up a book in the eighties on how IBM invented virtual memory in the 1960s. That was completely wrong: IBM “invented” virtual memory when the Burroughs patent ran out in the 1960s. Burroughs computers had been using virtual memory since the fifties; after all, their computer scientists really did figure out you could swap memory out to some other storage medium. At the time I am not certain it was even disk, and since most machines then used core memory it was not really RAM they were saving, but the process of using virtual memory was indeed invented by Burroughs. In reality, most computer innovations were done by someone else; IBM took them after the patent ran out, passed them off as its own, and the blue-shirt press gave it the credit.

Reply to  RockyRoad
February 23, 2017 11:56 pm

For sale: Cheap. Slightly used Commodore 64. With software, tape drive, and other goodies. Includes b/w monitor. I don’t remember them all because I haven’t opened the box for 30 years.

TomM
February 22, 2017 10:07 am

The joke is on them: today’s commercial-sector electricity rates in Boulder and Cheyenne differ by only about 0.14¢/kWh. That’s about a tenth of what the commercial rate difference was in 2009. I hope there were other driving reasons behind the location choice.

Tom in Denver
February 22, 2017 10:09 am

In 2004 Colorado passed a law that 20% of all electricity had to come from renewables. It is that renewable mandate that created the difference in electricity costs between Colorado and Wyoming.

Dinsdale
Reply to  Tom in Denver
February 22, 2017 10:21 am

And wind power doesn’t work well where all the population is (just east of the Rockies). The winds are either dead calm or blasting 30+ MPH like this week. Solar works fairly well as we get a lot of sunny days, but of course is still not economical. But the rich greenies like to cover their roofs with solar subsidized by the working poor just to brag to their Progressive friends.

James Francisco
Reply to  Tom in Denver
February 22, 2017 10:59 am

Too bad they can’t just mandate against being stupid.

Mark
February 22, 2017 10:09 am

So we will get garbage predictions now faster?

Tim Hoskins
February 22, 2017 10:19 am

In the latter half of this post, which I think is an NCAR press release, shouldn’t “Named ‘Cheyenne’,’ the 5.34-petaflop system…” read “the 5.34-petaflops system…” since flops isn’t a plural but rather an acronym for “floating-point operations per second”? Does anyone with more knowledge of computers know what the convention is?

Dinsdale
Reply to  Tim Hoskins
February 22, 2017 10:23 am

I’ve seen it stated with several conventions. My personal choice is peta-FLOP’s but that is more rarely used.

Reply to  Dinsdale
February 22, 2017 1:55 pm

I prefer peta-FLOPS.

MarkW
Reply to  Tim Hoskins
February 22, 2017 10:28 am

“Operations” within the acronym is already plural.

ferd berple
February 22, 2017 10:29 am

Long-range, seasonal to decadal forecasting:
===============
Such a forecast is doomed to failure. It will show no more skill than taking the statistical average of past weather.
Quite simply, the mathematics does not exist to reliably predict the future beyond a very short window. The problem size expands exponentially, quickly consuming all resources no matter how fast the computer, so that it takes longer to calculate the future than to simply wait for the future.
You can shrink the problem size through approximation, but then the accuracy of the result falls off so that it is no more accurate than a simple, low cost statistical analysis of past weather.

emsnews
Reply to  ferd berple
February 22, 2017 12:05 pm

Exactly, and worse, it cannot predict the radical changes in climate that have happened over and over again during the last three million years.

Samuel C Cogar
Reply to  ferd berple
February 23, 2017 3:46 am

“no matter how fast the computer, so that it takes longer to calculate the future than to simply wait for the future.”
Ferd, a beautiful statement that was.

nn
February 22, 2017 10:42 am

An electronic prophet. Perhaps long-term in a semi-stable environment with limited variance. But on Earth, they should qualify that its predictions are valid only in a limited frame of reference (i.e., a scientific domain), and that its accuracy is inversely proportional to the product of the time and space offsets from its inputs and calculation.

J Mac
February 22, 2017 10:42 am

Wow! There isn’t a door mat big enough to wipe off that 8 megawatt, 5.34-petaflop ‘carbon foot print’!

Gamecock
February 22, 2017 10:45 am

Hardware can’t fix software problems.

RockyRoad
Reply to  Gamecock
February 22, 2017 10:44 pm

True, but the problems go by so fast they’re unrecognizable.

Mark Luhman
Reply to  Gamecock
February 23, 2017 8:02 am

Yes, but most of the human race does not know that. Years ago I shot in an indoor archery league. There were people who could put 299 or 300 out of a possible 300 into a 2-inch circle at 20 yards. Us lesser souls were lucky to hit 200. There was a group always changing their tackle to improve their scores, but they never considered that the problem was not the bow, the arrows, the sights, or the release; it was them. So the shop that ran the range made lots of money from the people blaming the equipment rather than whoever was holding it. The same goes for supercomputers: it is never the limitations of the program or the programmer’s assumptions, it is always that the hardware is not fast enough to give the answer they want.

Mark Luhman
Reply to  Mark Luhman
February 23, 2017 8:05 am

Correction to my comment above: I meant “most of the human race does not know that.” I should also have included the quote I was replying to: “Hardware can’t fix software problems.”

Reply to  Gamecock
February 23, 2017 11:58 pm

That depends on the size of the hammer.

Chimp
February 22, 2017 10:46 am

Both the NCAR and GISS swamps need to be drained, and Trenberth and Schmidt deported to their native Misty Isles.

February 22, 2017 10:51 am

Supercomputers are really cool because they do not need any programming. They automatically produce better results without building in any global warming assumptions or fudge factors. The climate models run on this supercomputer will “magically” produce 3.25C of warming by 2100, verifying the results of all other climate model simulations run before. The good thing is it will come to this conclusion at a much faster pace and with much higher resolution results. Even more proof that global warming is real and it is happening now (say that three times out loud; just say it at a really fast pace this time).
Definitely worth it.

Reply to  Bill Illis
February 22, 2017 11:21 am

Now they just need a supercomputer that will fix the observations… 😉

Butch
Reply to  David Middleton
February 22, 2017 2:03 pm

..Virtual Reality Goggles ??

Mark Luhman
Reply to  David Middleton
February 23, 2017 8:09 am

They do! After all, supercomputers produce data, and good data at that. You can’t believe your lying eyes when you look at the observed data, since the computed data came from the all-knowing computer.

Alan Robertson
February 22, 2017 10:56 am

The quoted efficiency of roughly 3 gigaflops per watt (3 billion calculations per second per watt) is very impressive. SGI lists E5 Xeon Broadwell processors as components for the SGI ICE XA nodes. I’d have to see some more details about the ICE XA nodes to take that quoted efficiency at face value, with any processor.
In the early days of nVidia’s CUDA GPU rollout, I built a personal 1+TFLOPS machine and only ran it for a couple of winters. Helped warm my house. Such a machine would be trivial now, at much reduced power consumption.

Alan Robertson
Reply to  Alan Robertson
February 22, 2017 11:15 am

typo edit: Meant to say- with any Xeon processor.

Mark T
Reply to  Alan Robertson
February 22, 2017 11:34 am

A single Kepler K-80 has nearly 9 TFLOPS now. I’ve got a few dozen K-10s in the room next to me as well. Our little lab pushes 25+ kW when we get everything humming, though a single K-80 is only 300 W.

Richard G
February 22, 2017 11:00 am

I’ll give them props for running it on coal power. At least they know where reliable, cost-efficient power comes from.

Dr. W. Z
February 22, 2017 11:02 am

Lots of unnecessary spending, and not really helping mankind in underdeveloped states.

February 22, 2017 11:06 am

gusts that generate power at wind farms
**********************************************
lol using coal to study wind power. delicious.

Bob Burban
February 22, 2017 11:06 am

Such a fine piece of machinery should run exclusively on solar and/or wind power, guided by its own awesome predictive brainpower. To cement its green credentials, it should be completely disconnected from the grid and have no back-up power running on polluting diesel/gasoline/gas generators. That’d be called ‘putting your money where your mouth is’.

Sheri
Reply to  Bob Burban
February 22, 2017 12:35 pm

You’re a funny man! 🙂

Reply to  Bob Burban
February 22, 2017 2:00 pm

Even they know that ain’t happening.

garymount
February 22, 2017 11:17 am

I am preparing an epic 50-part series of articles for WUWT themed on “Building Wattson”. It will take me two more years before I have finished the first drafts and begin publishing the first article.
Wattson is an all encompassing climate science software project covering all things climate related.
The Building Wattson series will step through the process of building the software from scratch. Using Test Driven Development and Domain Driven Design principles. We will drive the development through from UI to Database, using various design patterns and best practices of today, as well as the modern tooling for software development now available.
I will write an introductory article about the series and about Wattson itself sometime this year.
I will also elaborate on my expertise on the subjects necessary to carry out this task.
Climate models will also be part of the series.
The completed software code as a whole will not be released publicly but instead will be … yet to be determined, but likely handed over in ownership to Anthony Watts.
Meanwhile, computer chips more than 100 times more powerful than today’s best are scheduled for commercial production in 2022:
http://www.eetimes.com/document.asp?doc_id=1330971

David Middleton (Editor)
February 22, 2017 11:19 am

Irony is soooo… Ironic…

QUICK FACTS

  • Wyoming produced 42% of all coal mined in the United States in 2015.
  • In 2015, 32 states received coal from Wyoming mines, with 10 states, including Wyoming, obtaining more than 90% of their domestic coal from Wyoming.
  • Wyoming accounted for 6.2% of U.S. marketed natural gas production in 2015.
  • In 2015, almost 88% of net electricity generation in Wyoming came from coal and nearly 11% came from renewable energy resources, primarily wind.
  • Wyoming had the third lowest average retail electricity rates of any state in 2015.
Last Updated: December 15, 2016

http://www.eia.gov/state/?sid=WY

Mark Luhman
Reply to  David Middleton
February 23, 2017 8:12 am

The sixteen-foot coal seams in the Powder River Basin help a lot when elsewhere most are about four feet.

Walter Sobchak
February 22, 2017 11:29 am

They will be able to run their models faster and faster, and produce more and more meaningless piles of irrelevant numbers. Whoopee.

Mark T
February 22, 2017 11:37 am

My understanding of these supercomputers is that they are really designed so multiple problems can be worked on simultaneously, not for one problem. It’s cloud computing. No one project really uses the whole thing at any given time. I can’t imagine even the GCMs require that much power, but when you want to run all of the GCMs along with the programs from 1000 other users, it makes sense.

MarkW
Reply to  Mark T
February 22, 2017 1:02 pm

GCMs divide the atmosphere into grids; to model the entire earth, there can be hundreds of thousands to millions of blocks in the grid.
The GCM then solves the same equations for each block, calculates how the result of each block influences the surrounding blocks, then solves the same equations for each block again.
A GCM could easily use thousands of processors; in fact, it is the type of problem for which massively parallel computing was designed.
Other types of models do the same thing, dividing their computational world into lots of little blocks that each get resolved individually and in parallel.
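For readers who want to see that pattern in code, here is a minimal toy sketch in Python/NumPy: every cell is updated from its neighbors by the same rule, which is exactly why the work splits so naturally across processors. A 2-D diffusion stencil with periodic boundaries is assumed for simplicity; real GCMs solve far richer equations, but the neighbor-exchange structure is the same.

```python
import numpy as np

ny, nx = 180, 360   # a coarse latitude/longitude-like grid
field = np.random.default_rng(1).random((ny, nx))   # stand-in for, say, temperature

def step(f, alpha=0.1):
    """Update every cell from its four neighbors (simple explicit diffusion)."""
    north = np.roll(f, -1, axis=0)
    south = np.roll(f, 1, axis=0)
    east = np.roll(f, -1, axis=1)
    west = np.roll(f, 1, axis=1)
    return f + alpha * (north + south + east + west - 4 * f)

for _ in range(100):
    field = step(field)

# On a real machine, each processor (MPI rank) owns one block of this grid
# and exchanges only its edge rows/columns ("halos") with its neighbors
# between steps: the massively parallel pattern described above.
print("field mean:", field.mean())
```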

Samuel C Cogar
Reply to  MarkW
February 23, 2017 4:06 am

…… there can be hundreds of thousands to millions of blocks in the grid.
The GCM then solves the same equations for each block, calculates how the result of each block influences the surrounding blocks, then solves the same equations for each block again.

HA, ’tis no wonder, to one’s amazement, that the output data (results) are FUBAR from the get-go.
Methinks the above is a good description of a PCCP (perpetual computing computer program).

Mark T
Reply to  MarkW
February 23, 2017 4:05 pm

I am aware of this. Their implementation is not just for GCMs, however.

Resourceguy
February 22, 2017 11:42 am

Hand it over to the IRS. They always complain about not having enough computing resources.

Reply to  Resourceguy
February 22, 2017 2:04 pm

They’d have more computing power if they didn’t destroy their hard drives.

Rachelle
February 22, 2017 11:43 am

I suspect that claims for its ability to predict climate were made to ease funding . . . so many otherwise worthwhile projects have been hung on climate change to get the funds they need. No doubt climate will be studied, but so will many other things not yet imagined.

February 22, 2017 11:56 am

Will this fantastic machine now be able to produce the scientifically conclusive, empirical evidence that CO2 does actually cause global warming?
That is to say, will it, and its associated sensors in ARGO buoys, balloons (no, not the human type), Stevenson screens, satellites, etc., be able to quantify precisely how human emissions of 0.0004 ppm CO2 of the entire atmosphere compete with atmospheric water vapour, not to mention oceanic water itself, which thankfully dwarfs the atmospheric content?
Will it also be able to accurately determine the hitherto undeterminable effects of clouds on our planet’s atmosphere?
Will it be able to produce predictions of earth temperatures that actually mimic observable measurements?
Well, I guess we won’t know the answers to any of these questions for another 20 years. Hey ho.
Meanwhile, US taxpayers are fleeced for more barmy climatic pseudo-science; the world’s nations are herded down the route of subjugation at the altar of the green human-haters; and the “climate change community” has closed ranks against any criticism of its settled science.
I have probably said this before, so sorry to bore you guys, but I was an AGW believer and went looking for the evidence. I headed for the usual alarmist websites/blogs and was horrified at the reception. Browbeating and indoctrination at the reception desk doesn’t begin to describe the welcome.
When I discovered WUWT and Paul Homewood’s Notmanypeopleknowthat, I was met by an enthusiastic crowd who only want answers to questions. No browbeating, no propaganda, no preconceived ideas, nothing like the alarmists.
Thank you guys. I occasionally have my doubts, I guess as we all do, but I’ll get informed answers from the sceptics and indoctrination from the alarmists.
Super computers can’t compete with the human capacity for curiosity.

EricN.WY
February 22, 2017 12:03 pm

The coal industry in Wyoming has been seriously hurt by Obama EPA rules the last couple of years. Some mines were forced out of business, causing the state massive loss of revenue. Hope that’ll change now, I’d like to keep my $44/month electricity bill (equal payment plan).

tty
February 22, 2017 12:12 pm

A really powerful supercomputer would be able to improve climate models a very great deal by explicitly calculating important processes like turbulence, convection, thunderstorms, tornadoes, orographic precipitation, etc., that are unavoidably “parameterized” today. However, that would require going from the 100 km grids common today to 1 km grids (or finer). This requires 100^4 = 100,000,000 times more computing power (since we are dealing with three spatial dimensions plus time).
Ain’t gonna happen soon.
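tty’s arithmetic checks out; as a quick sketch:

```python
# Refining the grid by a factor r costs r in each of three spatial
# dimensions, plus roughly r in the time step (CFL condition): r**4 total.
refinement = 100                  # 100 km grid -> 1 km grid
print(f"{refinement ** 4:,}x")    # 100,000,000x more computing power
```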

MarkW
Reply to  tty
February 22, 2017 1:03 pm

Each increase in computing power gets us an eensy bit closer. (That’s a technical term.)

RockyRoad
Reply to  MarkW
February 22, 2017 10:53 pm

…I thought nearly imperceptible increments were measured in smidgens.

Mark T
Reply to  tty
February 23, 2017 4:07 pm

It’s actually more efficient to wait for a few orders of magnitude more computing power.

manfredkintop
February 22, 2017 12:15 pm

I heard all that it can do is churn out “42”.

The Original Mike M
Reply to  manfredkintop
February 22, 2017 1:04 pm

When Trump pulls the climate funding maybe they can use it to get a more precise value for Pi out to say, 10^50 digits? That’ll keep it “busy” for a little while…

brians356
Reply to  manfredkintop
February 22, 2017 1:50 pm

Yeah, I hear it. “Di-di-di-di-dah Di-di-dah-dah-dah”, over and over again. Stop! Make it stop!

The Original Mike M
February 22, 2017 12:53 pm

With that much computing power you could predict the orbits of every asteroid over the size of a Volkswagen found in our solar system several years into the future. You might say that the risk of a big one hitting earth is very small and that’s true but the risk to earth from getting a little warmer with more CO2 is far far less than that.

brians356
Reply to  The Original Mike M
February 22, 2017 1:44 pm

I believe they can already calculate orbits for objects we can find. The problem is detecting all the dangerously-large objects while far enough away. I think.

Gary Pearse
Reply to  brians356
February 22, 2017 4:00 pm

Brians356: The orbits aren’t independent of multi-body effects, and the three-body gravitational problem is still looking for a general solution, I believe.

Samuel C Cogar
Reply to  brians356
February 23, 2017 4:25 am

The problem is detecting all the dangerously-large objects while far enough away. I think.

Oh my my, t’aint no such problem exists.
Now iffen they can “see” and measure six (6) earth size planets at a distance of 40 light years ….. then they can surely “see” all asteroids bigger than a golf ball residing within the orbit of Pluto.

tty
Reply to  brians356
February 23, 2017 8:41 am

February 22, 2017 1:09 pm

Coal Train — A Reporter At Large by John McPhee. Published in The New Yorker in 2005.
Beyond the detriments of Powder River Basin coal was the signal fact that it was as much as five times lower in sulfur than Appalachian coal. With the Clean Air Act, power plants were required to scrub sulfur out or burn low sulfur coal. The five hundred power plants that use coal to light, heat, cool, and compute fifty-two per cent of just about everything in the United States were suddenly swiveling their attention to Powder River coal. A combination of companies built the Orin Line — the longest new rail line in the United States since the nineteen-thirties. At various sites along the Orin Line, large machines removed a hundred feet of overburden to begin an invasion of the planet unprecedented in scale. Belle Ayr, Black Thunder, North Antelope, Jacobs Ranch — in fewer than twenty years, mines of the Powder River Basin were the largest coal mines in the history of the world.
Coal trains go into the Powder River Basin like tent caterpillars up a tree.

https://yaleunion.org/wp-content/uploads/2013/12/CoalTrain_McPhee.pdf

RockyRoad
Reply to  rovingbroker
February 22, 2017 11:02 pm

I guess whoever coined the phrase “an invasion of the planet unprecedented in scale” has never seen the Grand Canyon or Hell’s Canyon (the most scenic and deepest river gorges on the continent, respectively).
I suspect “tent caterpillars up a tree” was just meant to be derogatory.

PaulH
February 22, 2017 1:28 pm

Cheyenne is so powerful it can execute an infinite loop in a mere 24 hours. 😀
(I know, it’s an old joke..)

brians356
February 22, 2017 1:37 pm

Wait, what’s HAL going to think when he finds out he’s being used to provide justification for cutting off his own source of cheap power? “I’m sorry, Dave, I simply cannot let you do that.”

willhaas
February 22, 2017 1:38 pm

“To counter the effects of heat-trapping greenhouse gases.” But there is no evidence that CO2 has any effect on climate. There is no such evidence in the paleoclimate record, and there is plenty of scientific rationale to support the idea that the climate sensitivity of CO2 is zero. Furthermore, there is also no real evidence that a radiometric greenhouse effect caused by the LWIR absorption properties of greenhouse gases even exists. If CO2 really affected climate, then the increase in CO2 over the past 30 years would have caused at least a measurable increase in the dry lapse rate in the troposphere, but that has not happened. It does not matter how powerful the computer running the models is if the models are wrong.

Gary Pearse
February 22, 2017 4:06 pm

“long and successful history of supercomputers supported by the NSF and NCAR to advance the atmospheric and related sciences.”
What successes are these, pray tell? All they’ll end up getting is a prediction of a series of BBQ summers, which, I understand, they already have in Wyoming. The Met Office will be called in as consultants. One way to make these predictions come true, at least in the UK, would be to sell BBQs with 100-square-metre umbrellas built in. I’ll bet they ordered this equipment believing HRC would be going to the Oval Office. They are putting the best face on it they can, I guess.

February 22, 2017 6:06 pm

This is like the quantum measurement problem disturbing the system. Computing the climate requires so much energy that it will alter the climate.

February 22, 2017 7:12 pm

I sure hope they send the following warning to all climate scientists approved to run code on this new computer: “Remember: garbage in = garbage out. Using a more powerful computer does not change this most basic equation.”

February 22, 2017 7:39 pm

Scientists across the country will use Cheyenne to study phenomena ranging from wildfires and seismic activity to gusts that generate power at wind farms.
No wind farms in Wyoming –
Coal-fired power plants produce almost 95% of the electricity generated in Wyoming. Wyoming’s average retail price of electricity is 5.27 cents per kilowatt hour, the 2nd lowest rate in the nation[5]

Curious George
February 22, 2017 7:46 pm

Supercomputers are great – if you have a problem they can help with. A weather forecast for 100 hours would be great, and it may be within reach. 100 days? We are not there yet. 100 years? You are fooling yourself.

February 22, 2017 8:11 pm

One supercomputer – one air flow through the blades of a wind turbine :
Visualization created using Yellowstone
Contour lines and isosurfaces provide valuable information about turbulence and aerodynamic drag in this visualization of air flow through the blades of a wind turbine, the product of a simulation on the NCAR-Wyoming Supercomputing Center’s Yellowstone system. (Image courtesy Dimitri Mavriplis, University of Wyoming.)
What’s next – climate change?

Patrick MJD
February 22, 2017 8:12 pm

Reminds me of this from The Hitchhiker’s Guide to the Galaxy:
“Yes, so anyway,” he resumed, “the idea was that into the first ship, the ‘A’ ship, would go all the brilliant leaders, the scientists, the great artists, you know, all the achievers; and into the third, or ‘C’ ship, would go all the people who did the actual work, who made things and did things, and then into the `B’ ship – that’s us – would go everyone else, the middlemen you see.”
Climate “scientists”, and their computers, should be loaded on to the “B” ship.

MrD
February 22, 2017 8:48 pm

Supercomputers are AWESOME. I’m less worried about the energy they’re going to use than what we can get out of them.

cm
February 22, 2017 9:10 pm

In talking to some folks directly involved in the project, one of the more legit reasons they cited for siting it in Wyoming was the passive cooling available for the CRAC units, thanks to the Wyoming weather. On the other hand, their air floor is something like 10 feet high, supposedly because the vents need to be above the snow drifts.

Brook HURD
February 22, 2017 9:47 pm

A 5.34-petaflop computer does not reduce the errors in the input data, nor does it reduce the numerous unknowns that are part of the climate system. This supercomputer will be able to run unverified models much more rapidly than they could be run in the past, producing much more garbage output in much less time. A better use of this supercomputer would be to add more climate inputs to the models, so that at some point we could expect to have at least one climate model that is able to reliably hindcast.

Griff
February 23, 2017 12:34 am

“Projecting electricity output at a wind farm is extraordinarily challenging as it involves predicting variable gusts and complex wind eddies at the height of turbines, which are hundreds of feet above the sensors used for weather forecasting”
No it isn’t.
UK national grid has 95% accurate forecasting up to 24 hours in advance and better after that time.
The advances in computing and networks in the last few decades mean that the software to cope is there.

E Mendes
Reply to  Griff
February 23, 2017 3:05 pm

Has anybody told you today Griff, that you’re an idiot? Well- you are.
And it’s evident you have a personality disorder
or you couldn’t be so thoroughly wrong,
so often.

Patrick MJD
Reply to  Griff
February 23, 2017 3:44 pm

“Griff February 23, 2017 at 12:34 am
No it isn’t.
UK national grid has 95% accurate forecasting up to 24 hours in advance and better after that time.”
Really? Remember the wind storms of the mid-80’s? Weather forecasts are wrong roughly 50% of the time.

catweazle666
Reply to  Griff
February 24, 2017 1:58 pm

“UK national grid has 95% accurate forecasting up to 24 hours in advance and better after that time.”
LIAR.

Reply to  catweazle666
February 26, 2017 6:49 am

That really is the only conclusion about Griff (and many others on the reality-hating Left). I’ve seen continental-scale graphs of the extreme variation in wind output.
India, I think it is, implemented penalties for failing to predict power output 24 hours in advance.

Resourceguy
February 23, 2017 6:08 am

Can it predict the rate of AMO decline?

LewSkannen
February 24, 2017 2:42 am

NCAR?
I am more of a NASCAR fan, myself.

tabnumlock
February 28, 2017 8:49 pm

They should have it predict the stock market so it can be self funding.