by Anthony Watts
WUWT readers have, of course, heard about the Met Office and its giant new supercomputer, dubbed “deep black”, which it uses for climate simulation and short-term forecasts.
Not to be outdone, the National Center for Atmospheric Research (NCAR) in Boulder, CO has commissioned a new supercomputer project of their own: The NCAR-Wyoming Supercomputing Center (NWSC) shown in artist rendering below.
[Artist’s rendering of the NCAR-Wyoming Supercomputing Center.]
In the initial press release they state the location and purpose:
January 23, 2007
BOULDER—The National Center for Atmospheric Research (NCAR) and its managing organization, the University Corporation for Atmospheric Research (UCAR), announced today that they will form a partnership with the University of Wyoming, the State of Wyoming, and the University of Colorado at Boulder to build a new supercomputing data center for scientific research in Cheyenne. The center will house some of the world’s most powerful supercomputers in order to advance understanding of climate, weather, and other Earth and atmospheric processes.
…
The center’s supercomputers, which will be upgraded regularly, will initially achieve speeds of hundreds of teraflops (trillion floating-point operations per second).
The Met Office wrote in their initial press release:
By 2011, the total system is anticipated to have a total peak performance approaching 1 PetaFlop — equivalent to over 100,000 PCs and over 30 times more powerful than what is in place today.
We found out later that the Met Office supercomputer would have an electrical power consumption of 1.2 megawatts.
So with that in mind, we’d expect the new NCAR-Wyoming Supercomputing Center (NWSC) to have a similar sort of power consumption. Right?
On the masthead of the NWSC page they say they are all about energy efficiency.
The NWSC project encompasses the design and construction of a world class center for high performance scientific computing in the atmospheric and related geosciences. Consistent with its mission, the facility will be a leader in energy efficiency, incorporating the newest and most efficient designs and technologies available. The center will provide new space to enable the advancement of scientific knowledge, education, and service through high-performance computing.
And on the right sidebar:
Focus on Sustainability
Maximum energy efficiency, LEED certification, and achievement of the smallest possible carbon footprint are all goals of the NWSC project. In the coming weeks and months, check this section of the site for updates on project sustainability efforts and outcomes.
That’s great, I’m all for sustainability and energy efficiency, even the “smallest possible carbon footprint” doesn’t sound too bad. Surely it will be more energy efficient and “greener” than the Met Office Supercomputer, right?
There’s an interesting unanswered question though. Why put this new facility in Wyoming rather than “green” Colorado? Isn’t Boulder, where NCAR is headquartered, the greenest of Colorado cities, and in the US top five too?
In the initial press release announcing the project, there’s this bit of political feel good prose:
“Having an NCAR supercomputing facility in Wyoming will be transformative for the University of Wyoming, will represent a significant step forward in the state’s economic development, and will provide exceptional opportunities for NCAR to make positive contributions to the educational infrastructure of an entire state,” says William Gern, the university’s vice president for research and economic development.
Gosh, what an opportunity for Wyoming. But why give the opportunity away? Colorado doesn’t want this opportunity? None of the politicians in Colorado want to be able to say to their constituents that they brought “economic development” and “positive contributions to the educational infrastructure of an entire state”? That doesn’t seem right.
The answer may very well lie in economics, but not the kind they mention in feel good press releases.
You see, as we know, supercomputers need a lot of energy to operate. And because they operate in enclosed spaces, they need a lot of energy to keep them cooled so they don’t burn up from the waste heat they generate.
For all their sophistication, without power for operation and cooling, a supercomputer is just dead weight and space.
Electricity is king.
Interestingly, in the press releases and web pages, NCAR provides no answers (at least none that were easy to find) to how much electricity the new supercomputer might use for operation and cooling. They also provide no explanation as to why Colorado let this opportunity go to another state. I had to dig into NCAR’s interoffice staff notes to find the answer.
The answer is: electricity.
Measuring 108,000 square feet in total with 15,000-20,000 square feet of raised floor, it will be built for 8 megawatts of power, with 4-5 megawatts for computing and 3-4 for cooling.
8 megawatts! Yowza.
It’s really about economics. Electricity is getting expensive, and likely to be more expensive in the future. Candidate Obama said that under his leadership, “electricity rates would necessarily skyrocket“. Clearly NCAR is planning for a more expensive energy future.
In the interoffice staff notes, NCAR outlines its decision logic.
NCAR considered partnerships for the data center with a number of organizations along the Front Range, giving CU-Boulder and the University of Wyoming particularly close scrutiny. NCAR also looked into leasing space and retrofitting an existing data center.
With support from NSF and the UCAR Board of Trustees, NCAR chose to locate the center in Wyoming after a rigorous evaluation, concluding that this partnership would facilitate getting the greatest computing capability for the regional and national scientific community at the earliest possible time.
“The Wyoming offer provides more computing power, sooner, and at lower cost,” Tim explained during an all-staff town hall meeting on January 31. “We’ve secured the future of NCAR’s role in leadership computing.”
The Wyoming offer consists of a 24-acre “shovel-ready” site for construction in the North Range Business Park in Cheyenne near the intersection of I-80 and I-25, along with physical infrastructure for fiber optics and guaranteed power transmission of 24 megawatts. The University of Wyoming will provide $20 million in endowment funds for construction, as well as $1 million annually for operations. NCAR will utilize the State of Wyoming’s bond program to fund construction, with the state treasurer purchasing bonds that will be paid off by NCAR.
Although CU-Boulder’s offer would have given the new center greater proximity to other NCAR facilities, it would have left NCAR with a mortgage of $50 million rather than $40 million, and less long-term financial savings. The Cheyenne site offers cheaper construction costs and lends itself to future expansion. It also brings a transformative partnership to a state that has traditionally lacked opportunities in technology and research.
Indeed, according to the latest figures from the Energy Information Administration and Department of Energy (EIA/DOE), electricity is significantly cheaper in Wyoming.
[Chart: EIA/DOE average retail electricity prices by state.]
So besides the fact that NCAR abandoned “green” Colorado for its cheaper electricity rates and bond program, what’s the “dirty little secret”?
Coal, the “dirtiest of fuels”, some say.
According to Sourcewatch, Wyoming is quite something when it comes to coal. Emphasis mine.
Wyoming is the nation’s highest coal producer, with over 400 million tons of coal produced in the state each year. In 2006, Wyoming’s coal production accounted for almost 40% of the nation’s coal.[1] Currently Wyoming coal comes from four of the State’s ten major coal fields. The Powder River Coal Field has the largest production in the world – in 2007, it produced over 436 million short tons.[2]
Wyoming coal is shipped to 35 other states. The coal is highly desirable because of its low sulfur levels.[3] On average Wyoming coal contains 0.35 percent sulfur by weight, compared with 1.59 percent for Kentucky coal and 3 to 5 percent for other eastern coals. Although Wyoming coal may have less sulfur, it also has a lower “heat rate,” or fewer Btu’s of energy. On average Wyoming coal has 8600 Btu’s of energy per pound, while Eastern coal has heat rates of over 12,000 Btu’s per pound, meaning that plants have to burn 50 percent more Wyoming coal to equal the power output from Eastern coal.[4]
Coal-fired power plants produce almost 95% of the electricity generated in Wyoming. Wyoming’s average retail price of electricity is 5.27 cents per kilowatt hour, the 2nd lowest rate in the nation[5]
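As a quick sanity check on that last claim, here are the heat-content numbers from the excerpt run through the arithmetic (a rough sketch in Python; the exact figure depends on which Eastern coal you compare against):

```python
# Heat-content figures from the Sourcewatch excerpt above
wyoming_btu_per_lb = 8_600
eastern_btu_per_lb = 12_000

# Extra Wyoming coal needed to match the energy in a pound of Eastern coal
extra_fraction = eastern_btu_per_lb / wyoming_btu_per_lb - 1
print(f"~{extra_fraction:.0%} more Wyoming coal by weight")  # ~40% more
```

With the 12,000 Btu figure the answer is about 40 percent; the quoted “50 percent” evidently assumes a hotter Eastern coal, closer to 12,900 Btu per pound.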
It’s so bad that Wyoming’s coal plants earned the coveted “Coal Swarm” badge on that page.
Gosh.
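To put that retail rate together with the 8 megawatt design figure, here is a back-of-envelope estimate (a minimal sketch, assuming the facility draws its full design load around the clock; the real average draw would be somewhat lower):

```python
# Figures quoted above: 8 MW design load (NCAR staff notes) and
# 5.27 cents/kWh average retail electricity in Wyoming (Sourcewatch)
design_load_kw = 8_000
wyoming_rate_usd_per_kwh = 0.0527

hours_per_year = 24 * 365
annual_kwh = design_load_kw * hours_per_year           # 70,080,000 kWh
annual_cost = annual_kwh * wyoming_rate_usd_per_kwh

print(f"{annual_kwh:,} kWh/yr -> about ${annual_cost:,.0f}/yr at Wyoming rates")
```

That’s roughly $3.7 million a year, and the bill scales linearly with the rate: every extra cent per kilowatt-hour is worth about $700,000 a year at that load. That’s the kind of economics that doesn’t make it into feel-good press releases.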
But not to worry, NCAR has a plan to “clean up” that dirty coal use to power their supercomputer climate modeling system.
Again, from the interoffice staff notes:
The new center will be the first NCAR facility to earn LEED (Leadership in Energy and Environmental Design) certification for its design, construction, and operation. Measuring 108,000 square feet in total with 15,000-20,000 square feet of raised floor, it will be built for 8 megawatts of power, with 4-5 megawatts for computing and 3-4 for cooling. The power will be generated primarily from “clean” coal (coal that has been chemically scrubbed to reduce emissions of harmful pollutants) via Cheyenne Light Fuel and Power. NCAR is also aggressively working to secure the provision of alternative energy (wind and solar) for the facility, hoping to attain an initial level of 10%.
“We’re going to push for environmentally friendly solutions,” Tim says.
Clean Coal? Hmmm. NASA GISS’ Dr. Jim Hansen says Clean Coal is a decade away:
James Hansen, one of the world’s best-known global warming researchers and a recent vocal opponent of proposed coal plants, says clean coal technology used on a full-scale coal-fired plant could be at least a decade away. He expressed the sentiment in a media briefing organized by clean energy group RE-AMP, arguing against a proposed coal plant in Marshalltown, Iowa.
Hansen also said that:
“The trains carrying coal to power plants are death trains. Coal-fired power plants are factories of death. When I testified against the proposed Kingsnorth power plant, I estimated that in its lifetime it would be responsible for the extermination of about 400 species – its proportionate contribution to the number that would be committed to extinction if carbon dioxide rose another 100 ppm.”
Don’t worry, the University of Wyoming, NCAR’s partner in the new Cheyenne supercomputing center, is already on top of the situation. This is from their press release of May 26, 2008:
The University of Wyoming is ready to research clean coal and wants proposals from both academic and industry organizations. With the help of the Wyoming state government, they’ve arranged for up to $4.5 million in research funds — which can be matched by non-state funds.
And, Wyoming already has its hand out to President Obama:
Colorado, Utah, Wyoming Seek Clean Coal Funding
DENVER (AP) ―
The governors of Colorado, Utah and Wyoming are asking President Barack Obama to fund the development of clean-coal technologies in the West.
Yup, clean coal will power that new NCAR supercomputer any day now, and we’ll be paying for it.
In the meantime:
I’m sure NCAR will let us know how those wind turbines work out for that other 10% of the power.
h/t to Steve Goddard in comments
The IPCC may be Eating More Crow soon…
The Times, January 17, 2010
World misled over Himalayan glacier meltdown
http://www.timesonline.co.uk/tol/news/environment/article6991177.ece
Jonathan Leake and Chris Hastings
A WARNING that climate change will melt most of the Himalayan glaciers by 2035 is likely to be retracted after a series of scientific blunders by the United Nations body that issued it.
Two years ago the Intergovernmental Panel on Climate Change (IPCC) issued a benchmark report that was claimed to incorporate the latest and most detailed research into the impact of global warming. A central claim was the world’s glaciers were melting so fast that those in the Himalayas could vanish by 2035.
In the past few days the scientists behind the warning have admitted that it was based on a news story in the New Scientist, a popular science journal, published eight years before the IPCC’s 2007 report.
It has also emerged that the New Scientist report was itself based on a short telephone interview with Syed Hasnain, a little-known Indian scientist then based at Jawaharlal Nehru University in Delhi.
Hasnain has since admitted that the claim was “speculation” and was not supported by any formal research. If confirmed it would be one of the most serious failures yet seen in climate research. The IPCC was set up precisely to ensure that world leaders had the best possible scientific advice on climate change.
Professor Murari Lal, who oversaw the chapter on glaciers in the IPCC report, said he would recommend that the claim about glaciers be dropped: “If Hasnain says officially that he never asserted this, or that it is a wrong presumption, then I will recommend that the assertion about Himalayan glaciers be removed from future IPCC assessments.”
The IPCC’s reliance on Hasnain’s 1999 interview has been highlighted by Fred Pearce, the journalist who carried out the original interview for the New Scientist. Pearce said he rang Hasnain in India in 1999 after spotting his claims in an Indian magazine. Pearce said: “Hasnain told me then that he was bringing a report containing those numbers to Britain. The report had not been peer reviewed or formally published in a scientific journal and it had no formal status so I reported his work on that basis. Since then I have obtained a copy and it does not say what Hasnain said. In other words it does not mention 2035 as a date by which any Himalayan glaciers will melt. However, he did make clear that his comments related only to part of the Himalayan glaciers, not the whole massif.”
The New Scientist report was apparently forgotten until 2005 when WWF cited it in a report called An Overview of Glaciers, Glacier Retreat, and Subsequent Impacts in Nepal, India and China. The report credited Hasnain’s 1999 interview with the New Scientist. But it was a campaigning report rather than an academic paper so it was not subjected to any formal scientific review. Despite this it rapidly became a key source for the IPCC when Lal and his colleagues came to write the section on the Himalayas.
When finally published, the IPCC report did give its source as the WWF study but went further, suggesting the likelihood of the glaciers melting was “very high”. The IPCC defines this as having a probability of greater than 90%. The report read: “Glaciers in the Himalaya are receding faster than in any other part of the world and, if the present rate continues, the likelihood of them disappearing by the year 2035 and perhaps sooner is very high if the Earth keeps warming at the current rate.”
However, glaciologists find such figures inherently ludicrous, pointing out that most Himalayan glaciers are hundreds of feet thick and could not melt fast enough to vanish by 2035 unless there was a huge global temperature rise. The maximum rate of decline in thickness seen in glaciers at the moment is 2-3 feet a year and most are far lower.
Professor Julian Dowdeswell, director of the Scott Polar Research Institute at Cambridge University, said: “Even a small glacier such as the Dokriani glacier is up to 120 meters [394ft] thick. A big one would be several hundred meters thick and tens of kilometers long. The average is 300 meters thick so to melt one even at 5 meters a year would take 60 years. That is a lot faster than anything we are seeing now so the idea of losing it all by 2035 is unrealistically high.”
Some scientists have questioned how the IPCC could have allowed such a mistake into print. Perhaps the most likely reason was lack of expertise. Lal himself admits he knows little about glaciers. “I am not an expert on glaciers, and I have not visited the region so I have to rely on credible published research. The comments in the WWF report were made by a respected Indian scientist and it was reasonable to assume he knew what he was talking about,” he said.
Rajendra Pachauri, the IPCC chairman, has previously dismissed criticism of the Himalayas claim as “voodoo science”. Last week the IPCC refused to comment so it has yet to explain how someone who admits to little expertise on glaciers was overseeing such a report. Perhaps its one consolation is that the blunder was spotted by climate scientists who quickly made it public.
The lead role in that process was played by Graham Cogley, a geographer from Trent University in Ontario, Canada, who had long been unhappy with the IPCC’s finding. He traced the IPCC claim back to the New Scientist and then contacted Pearce. Pearce then re-interviewed Hasnain, who confirmed that his 1999 comments had been “speculative”, and published the update in the New Scientist.
Cogley said: “The reality, that the glaciers are wasting away, is bad enough. But they are not wasting away at the rate suggested by this speculative remark and the IPCC report. The problem is that nobody who studied this material bothered chasing the trail back to the original point when the claim first arose. It is ultimately a trail that leads back to a magazine article and that is not the sort of thing you want to end up in an IPCC report.”
Pearce said the IPCC’s reliance on the WWF was “immensely lazy” and the organisation needs to explain itself or back up its prediction with another scientific source. Hasnain could not be reached for comment.
The revelation is the latest crack to appear in the scientific consensus over climate change. It follows the climate-gate scandal, where British scientists apparently tried to prevent other researchers from accessing key data. Last week another row broke out when the Met Office criticised suggestions that sea levels were likely to rise 1.9m by 2100, suggesting much lower increases were likely.
***************************************************************************
Climate change experts clash over sea-rise ‘apocalypse’
Critics say an influential prediction of a 6ft rise in sea levels is flawed
http://www.timesonline.co.uk/tol/news/environment/article6982299.ece
Jonathan Leake
Climate science faces a new controversy after the Met Office denounced research from the Copenhagen summit which suggested that global warming could raise sea levels by 6ft by 2100.
The research, published by the Potsdam Institute for Climate Impact Research in Germany, created headline news during the United Nations summit on climate change in Denmark last month. It predicted an apocalyptic century in which rising seas could threaten coastal communities from England to Bangladesh and was the latest in a series of studies from Potsdam that has gained wide acceptance among governments and environmental campaigners.
Besides underpinning the Copenhagen talks, the research is also likely to be included in the next report of the Intergovernmental Panel on Climate Change. This would elevate it to the level of global policy-making.
However, the studies, led by Stefan Rahmstorf, professor of ocean physics at Potsdam, have caused growing concern among other experts. They say his methods are flawed and that the real increase in sea levels by 2100 is likely to be far lower than he predicts.
Jason Lowe, a leading Met Office climate researcher, said: “These predictions of a rise in sea level potentially exceeding 6ft have got a huge amount of attention, but we think such a big rise by 2100 is actually incredibly unlikely. The mathematical approach used to calculate the rise is simplistic and unsatisfactory.”
The row comes just weeks after the so-called climategate affair when emails leaked from the University of East Anglia’s Climate Research Unit revealed how scientists tried to withhold data from public scrutiny.
The new controversy, which has no connection with Climategate, dates back to January 2007, when Science magazine published a research paper by Rahmstorf linking the 7in rise in sea levels from 1881-2001 with a 0.7ºC rise in global temperature over the same period.
Most scientists accept those data and agree that sea levels will continue to rise. However, Rahmstorf then parted company from colleagues by extrapolating the findings to 2100 — when the world is projected to have warmed by up to 6.4ºC unless greenhouse gas emissions can be reduced. Based on the 7 inch increase in 1881-2001, Rahmstorf calculated that such a spike in temperature would raise sea levels by up to 74 inches — a jump that stunned other experts.
They say it is unsafe to use the relatively small increases in sea levels seen in the 19th and 20th centuries to predict such extreme changes in future. Another critic is Simon Holgate, a sea-level expert at the Proudman Oceanographic Laboratory, Merseyside. He has written to Science magazine, attacking Rahmstorf’s work as “simplistic”.
“Rahmstorf is very good at publishing extreme papers just before big conferences like Copenhagen when they are guaranteed attention,” said Holgate. “The problem is that his methods are biased to generate large numbers for sea-level rise which cannot be justified but which attract headlines.”
One key problem cited by Holgate is that much of the 1881-2001 sea-level rise came from small glaciers melting in regions such as the Alps and Himalayas. Such glaciers are, however, disappearing fast and will be largely gone by 2050. It means further rises in sea levels would have to come from increased melting of the Antarctic and Greenland ice sheets.
These hold enough water to raise global sea levels by more than 200ft, but their recent contribution to sea-level rise has been negligible. Jonathan Gregory, a sea-level specialist at the Met Office, said: “We do not know enough about the physics of large ice sheets to predict how global temperature rise will affect them. My concern about these extreme predictions is that they could discredit the whole process because they are not backed up by solid science and that is vital in such a political area of research.”
Rahmstorf said he accepted the criticisms but his work was “the best system we have got”. He added: “I agree that there has been too little research into the behaviour of ice sheets but that is exactly why I did this research. It uses simple measurements of historic changes in the real world to show a direct relationship between temperature rise and sea level increase and it works stunningly well.”
Rahmstorf said the last decade had, however, seen preliminary evidence suggesting that the ice sheets of Greenland and West Antarctica were becoming unstable. He said: “In my heart I hope my critics are right because a rise of the kind my work predicts would be catastrophic. But as a scientist I have to look at the evidence . . . my figures for sea-level rise are likely to be an underestimate of what the world will face by 2100.”
Could that heat be used to warm the building?
Is there a way to make cool super computers? I appreciate that you jam as many processors into boxes as possible to make them take up less space and communicate with each other as quickly as possible. Could they be more spread out, accepting a slower rate of communication between processors but, by not having such a large cost for cooling, deploying more processors instead? Somewhere between traditional super computing and distributed computing.
Nigel S (11:45:21) :
The circular tiled building on the left looks a lot like CRU at UEA too!
My eyes are bleary, and, at first, I read that as “the circular *file* building on the left looks a lot like CRU at UEA too!” — and I’m wondering how much data may actually be going into the circular file in both locations…
“Gareth (06:11:47) :
Is there a way to make cool super computers?”
…Yes. Google has relocated some of their hardware to Iceland to reduce their air conditioning bill. And much of the “Cloud’s” computing is also cooled by Iceland’s naturally cold air.
Dear MB,
NCAR’s supercomputer will be 8 megawatts:
“I’m holding a Denver Post article that tells the story of an 8.2 MW solar-power plant built on 82 acres in Colorado. The Post proudly hails it “America’s most productive utility-scale solar electricity plant”. But when you account for the fact that the sun doesn’t always shine…”
From : http://wattsupwiththat.com/2009/05/14/now-thats-a-commencement-speech/
That’s 82+ acres of solar power to run this computer!!!!
That is the entire output of the largest state-of-the-art solar power station in the US… 8 megawatts…
to run a climate modeling program,
which, as any programmer knows, is Garbage In, Garbage Out…
to “prove” global warming.
“Kate (05:48:06) :
The IPCC may be Eating More Crow soon…
The Times, January 17, 2010
World misled over Himalayan glacier meltdown”
Ah well they’re slow. A rogue element inside Gordon’s propaganda machine (a.k.a. BBC) already leaked it on Dec 05 2009:
http://news.bbc.co.uk/2/hi/south_asia/8387737.stm
Guess they sent him to re-education…
Anthony, I don’t have time to read the other comments this a.m., so I am certain that I am repeating others… Excellent, excellent investigative reporting. Deserving of the Pulitzer. What our now corporate-owned-and-corrupted “mainstream media” used to do at least some of the time.
More subscriptions should be coming your way.
Gareth (06:11:47) :
With 4- and 8-core processors, the heat load is spread out over the chip die, and advances in shrinking the process make for more gigaflops, but in the end you still have to supply the power to the things, which are very hungry.
Spreading out the computers only adds latency (increasing the time between request and reply), which is why they shrink the die and put more processors on the same die: to overcome latency.
The more power you use, the more heat must be dissipated, due to resistive losses and leakage.
In the end, one must ask why burn all that power to study something that stopped happening 10 years ago. No matter how much power you apply to stupidity, it fails to make it any smarter. Fools rush in faster than previously imagined.
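To put the latency point in rough numbers (a sketch; the fiber propagation speed and the 3 GHz clock are my assumptions, not figures from the comment):

```python
# Why spreading processors out costs so much: signal propagation time
fiber_speed_km_s = 300_000 * 2 / 3           # light in glass, ~200,000 km/s

one_way_us_per_km = 1e6 / fiber_speed_km_s   # ~5 microseconds per km of fiber
clock_ghz = 3.0
idle_cycles = one_way_us_per_km * 1e3 * clock_ghz   # cycles spent waiting, per km

print(f"1 km of fiber: ~{one_way_us_per_km:.0f} us one way, "
      f"~{idle_cycles:,.0f} clock cycles idle at {clock_ghz:.0f} GHz")
```

Five microseconds per kilometer is roughly 15,000 clock cycles of waiting, which is exactly why designers shrink the die and pack nodes densely rather than spreading them out.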
It’s a good thing this supercomputer was not built to run on nuclear power priced as in the state shown in this graph (I believe it is Connecticut): 19 cents per kWh, from 2007 EIA data. Operating costs would be almost 4 times greater had they done so.
http://1.bp.blogspot.com/_6aSqRkDo2y4/SgY6zX-z1lI/AAAAAAAAACE/ydphQb2R0w4/s1600-h/Image.JPG
This graph (generated by me) is included in my post on Nuclear Nuts – which was argued at some length on WUWT several months ago.
I find it fascinating that the new nuclear project that is closest to reality, in the USA, is South Texas Nuclear Project expansion; and the prospective project owners have filed lawsuits against each other. Not a happy camp, nor happy campers.
The need of society for electric power continues to increase, and that need will be met by natural gas. Natural gas power is cheap and clean, the plants are easy to permit and quick to construct with essentially zero delays and zero cost over-runs. And no toxic byproducts left behind.
DirkH (07:02:12) :
Yes, you are correct. And look at the date…Before Copenhagen.
The India Times has also got “Glacier Wars” going back about two years, but most of the MSM are ignoring the subject and only report the most ludicrous assertions from alarmists with vested interests.
Number one consideration for running large compute clusters is availability of power. The second consideration for running large compute clusters is availability of cooling, either air or water.
The other considerations like price of land and manpower are all secondary. These compute clusters are a huge deal for power companies because they put a huge load on the electrical grid and the power supply needs to be extremely reliable.
Any anomalous blip may cause huge problems across the compute systems and could result in losing upwards of weeks of work.
The story about Himalayan glaciers melting by 2035 is (apparently) in error. The year should have been 2350. See: http://news.bbc.co.uk/2/hi/8387737.stm
IanM
From Kate (05:48:06) :
[And what I think Kate is doing – thank you, Kate – is the same kind of thing Kate is showing other people doing, and is the same kind of thing Christina Hoff Sommers did in her book, “Who Stole Feminism”, where she chased down the, eventually absent, basis for various claims the “gender feminist”, Feminazis were making about “domestic violence” and the “Patriarchy”. It’s a common, purposeful “Progressive” propaganda m.o. to essentially just make things up and throw in some “references” which don’t support what is claimed.]
The problem is that nobody who studied this material [on Himalayan Glaciers] bothered chasing the trail back to the original point when the claim first arose. It is ultimately a trail that leads back to a magazine article and that is not the sort of thing you want to end up in an IPCC report.”
In practice, by now I simply dismiss everything the ipcc says as “fact” because they are not proceeding scientifically in the first place, and they did in fact ~”want this sort of thing to end up in an ipcc report”; and because it’s going to be much easier to just start over from “scratch” in analyzing the “climate”, as opposed to giving the ipcc any credit whatsoever in having contributed a net useful piece of work.
The ipcc didn’t intend to do real science to begin with, and they even admit it. Their idea was to disasterize what human activity could possibly do in respect of “harm”, totally unbalanced by any counter argument, science, or fact. That ain’t Science.
For the ipcc, the possible automatically becomes the actual, then automatically even an epidemic which we must do something really stupid about, or “we’re all gonna die”.
Whopeeee! So now we can save the World either by committing suicide or else by becoming slaves! The Islamofascists are actually on the leading edge of progress!
This is OT, but the rate for Colorado electricity seems off to me, or there must be a great range of electricity prices in Colorado. On the western slope, with Holy Cross Energy, the residential rate is 8¢/kWh. This is only 75% of the cost in the chart above, from my latest monthly statement, and it has been the same price for over two years.
Pls, don’t over-sell it.
Battery plants and standby generators mitigate interruptions, since there is _no_ such thing as 100% reliability in electric power! (At this point one starts referring to the number of ‘nines’ in the reliability factor: 99.99%, 99.999%, etc. reliability; see the quick conversion sketched below.)
Our co. even had a FAILURE of the DC bus in the mass-storage bldg from the batt plant and that is NEVER supposed to happen.
And another thing, in a properly run facility NO ONE loses ‘weeks of work’ – incremental daily backups and weekly full backups were a way of life even in lowly distributed compute centers running the TI/990 and DEC/VAX minicomputers in the 80’s (never mind the IBM iron in the CIC).
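For reference, the ‘nines’ convert to downtime per year like this (a quick sketch):

```python
# Downtime per year implied by each "nines" availability level
minutes_per_year = 365 * 24 * 60   # 525,600

for availability in (0.999, 0.9999, 0.99999):
    downtime_min = minutes_per_year * (1 - availability)
    print(f"{availability:.3%} available -> ~{downtime_min:,.1f} min/yr of downtime")
```

So even five nines still concedes about five minutes a year, which is why the battery plant and the standby generators both matter.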
Eight megawatts of electricity from a coal-fired power plant will produce about 59,000 tons of CO2 per year. In the interests of moral purity, this will doubtless require the purchase of “offsetting carbon credits”, thus adding another cost item to the ledger.
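That tonnage checks out on the back of an envelope (a rough sketch; the emission factor is my assumption for low-Btu subbituminous coal, not a figure from this thread):

```python
# Back-of-envelope check of the ~59,000 tons CO2/yr figure above
load_mw = 8
annual_mwh = load_mw * 24 * 365      # 70,080 MWh if running flat out
co2_tons_per_mwh = 0.85              # assumed emission factor, subbituminous coal

print(f"~{annual_mwh * co2_tons_per_mwh:,.0f} tons CO2/yr")  # ~59,600 tons
```

Anything between 0.8 and 1.0 tons per megawatt-hour lands in the same neighborhood.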
I am an engineer at the University of Wyoming. We love our coal. We burn coal to heat the campus at our central energy plant. You can’t beat coal. The NCAR facility deal was done during the last boom, when we could offer all kinds of sweetheart deals to NCAR that Boulder couldn’t match.
I lived in Cheyenne, Wyoming for three years.
Why put a supercomputer in Wyoming?
Why indeed.
J.H., models could run perfectly on an XT personal computer with 640K RAM running BASIC; it’s just CO2 vs. temperature and a convenient adjustment, that’s all.
“JonesII (16:07:26) :
J.H., models could run perfectly on an XT personal computer with 640K RAM running BASIC; it’s just CO2 vs. temperature and a convenient adjustment, that’s all.”
What? You can’t produce pretty pictures and animations on one of these, even with 4-colour CGA graphics.
catmman (12:39:49) :
I Lived in Cheyenne, Wyoming for three years.
Why put a supercomputer in Wyoming?
Why indeed.
Kevin Kilty (21:31:53) :
(snip) Third, there is all the communications bandwidth in the world running east-west and north-south through Cheyenne. If you surf the internet or watch TV it’s a sure bet the data passed through this town.
The best part of the project, if it ever comes to pass, is that the research staff will stay in Boulder.
My reply:
Where else are they going to store the original RAW data, and process it with new programs that don’t have to rely on the crap found online everywhere else?
Before you can control the spin, you have to know where the center really is. Keeping it separate will help, and keeping the “climate” researchers out will allow them to forecast weather unencumbered.
With all of the internet infrastructure passing through town, any leftover idle time can be used to keep up with all of the bloggers and their subscribers.
Label it NOAA or anything else, it is still Guberment computer time being used for whatever “they think” it needs to be used for.
Maybe they don’t mention it, but Boulder is a very difficult place to build anything. The city intentionally keeps itself small with aggressive bans on development and city planning. I can see that the greenfield site plus easier permitting in Wyoming would be attractive.
Plus, Cheyenne is a bit cooler than Boulder, so the amount of energy consumed to dissipate the waste heat should be smaller.
“Clean coal” is variously defined as anything from “EPA-approved modern scrubbed coal” to “the only clean coal is the coal that stays in the ground.” It is kind of like “smart grid” in that regard – everybody wants it, but nobody knows what it is. Possibly relatedly, Boulder has a very expensive new smart grid system that may mean even higher local power rates than the state average.
Well, this man is certainly smiling.
http://www.ucar.edu/communications/staffnotes/0311/fellow.html
Tom Wigley has been at NCAR, Boulder, since 1993, and he was appointed a senior scientist one year later. But before coming here to Colorado, he was, of course, the director of the Climatic Research Unit in the School of Environmental Sciences at the University of East Anglia in Norwich, United Kingdom, from 1978-1993. He was succeeded by Phil Jones.
The CRU data base, as Patrick Michaels reminds us in “The Dog Ate Global Warming” http://www.cato.org/pub_display.php?pub_id=10578 “is known in the trade as the ‘Jones and Wigley’ record for its authors, Phil Jones and Tom Wigley.” The data deletion practices recently described by Jones, and allegedly necessitated by computer storage limitations, were probably started by Wigley during the ’80s. Remember, in “Dog…”, that when Roger Pielke, Jr. asked for raw data from CRU, Jones’ now-famous reply was…
Old King Cole = Tom Wigley.
Empire building…on our citizens’ precious tax dollars…
Hell no!
In fairness, at least a significant percentage of the cores in the NCAR cluster are reputed to be either Opteron Istanbul CPUs or “Coolthreads” SPARC CPUs. Both are pretty good in the energy department. No link, this is “street knowledge.”