Weather supercomputer used to predict climate change is one of Britain’s worst polluters
Excerpts from the story by the Daily Mail. See WUWT’s original story on this.
The Met Office has caused a storm of controversy after it was revealed that its £30million supercomputer designed to predict climate change is one of Britain’s worst polluters.
The massive machine – the UK’s most powerful computer with a whopping 15 million megabytes of memory – was installed in the Met Office’s headquarters in Exeter, Devon.
It is capable of 1,000 billion calculations every second to feed data to 400 scientists and uses 1.2 megawatts of energy to run – enough to power more than 1,000 homes.
The machine was hailed as the ‘future of weather prediction’, with the ability to produce more accurate forecasts and climate change modelling.
However the Met Office’s HQ has now been named as one of the worst buildings in Britain for pollution – responsible for more than 12,000 tonnes of carbon dioxide a year.
It says 75 per cent of its carbon footprint is produced by the supercomputer, making the machine officially one of the country’s least green.
Green campaigners say it is ‘ironic’ that a computer designed to help stave off climate change is responsible for such high levels of pollution.
But Met Office spokesman Barry Grommett said the computer was ‘vital’ to British meteorology and to help predict weather and environmental change.
He said: ‘We recognise that it is big but it is also necessary. We couldn’t do what we do without it.
‘We would be throwing ourselves back into the dark ages of weather forecasting if we withdrew our reliance on supercomputing, it’s as simple as that.’
The figures have been published by the Department of Communities and Local Government which calculated the ratings and emissions of every public building in the country.
————————————-
“We couldn’t do what we do without it.” – like botch the BBQ summer forecast?
1,000 billion calculations per second is not a very impressive number; I think that number should be 1,000 trillion calculations per second, or 1 petaflop.
WattsUp is a fantastic example of the death of science. Every comment here assumes they already know the truth. Any research that produces answers that contradict that truth must be fraudulent. No space for the possibility that the POV of WUWT could possibly be wrong.
Reading the comments on this thread I am struck by how much the ‘skeptics’ are everything they claim the ‘warmists’ are.
Research is so neatly divided into two categories: that which is correct and produces the answers you already know are true, and fraud.
No, you could actually get a better long range forecast back in the Dark Ages. People would either watch signs in nature for weather ahead, or they would employ astronomical observations. Later, Copernicus, Tycho Brahe, Galileo, Newton and Kepler all practiced weather astrology; Kepler gained much fame for predicting the cold winter of 1594/5. Me, I have a 233 MHz Pentium II PC that cost £50 that I run an astronomy program on; it’s all I need for reliable LRF.
With every announcement like this one, I am reminded again of the brilliance of the script of the movie “Dr Strangelove”.
Here is the Doomsday Machine being discussed in the Pentagon War room, with its terminal function to obliterate the world if there is a nuclear bomb attack anywhere. General Turgidson (George C Scott) marvels at the power of such a machine and muses “Gee, I wish we had one of them doomsday machines, Stainsy. ”
As for the impressive photo of the computer and its contents, my sick mind jumps to the contents of the B-52 survival pack being checked by the Captain “Survival Kit contents check. In them you will find: one 45 caliber automatic, two boxes of ammunition, four days concentrated emergency rations, one drug issue containing antibiotics, morphine, vitamin pills, pep pills, sleeping pills, tranquilizer pills, one miniature combination Rooshan phrase book and Bible, one hundred dollars in rubles, one hundred dollars in gold, nine packs of chewing gum, one issue of prophylactics, three lipsticks, three pair of nylon stockings — shoot, a fellah could have a pretty good weekend in Vegas with all that stuff…. “
Maybe it will become self aware:
“I’m sorry Dave, I can’t predict climate.”
“Tomcat (04:49:18) :
1,000 billion calculations per second is not a very impressive number; I think that number should be 1,000 trillion calculations per second, or 1 petaflop.”
Labour Prime Minister Harold Wilson decided to change from one French system (1 billion = 1 million million) to the other French system (1 billion = 1,000 million). Seems the Daily Mail remains conservative and uses 1 billion = 10^12, as do most continental Europeans:
1 million=10^6
1 milliard=10^9
1 billion=10^12
1 trillion=10^18
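For what it’s worth, the two conventions for “billion” discussed above can be spelled out in a few lines; this is just an illustrative sketch applying both readings to the article’s “1,000 billion calculations every second” figure:

```python
# Two conventions for "billion", as discussed above.
SHORT_SCALE_BILLION = 10**9   # modern British/American usage: 1,000 million
LONG_SCALE_BILLION = 10**12   # older British/continental usage: 1 million million

# The Daily Mail's "1,000 billion calculations every second", read both ways:
short_scale_ops = 1000 * SHORT_SCALE_BILLION  # 10^12 ops/s = 1 teraflop
long_scale_ops = 1000 * LONG_SCALE_BILLION    # 10^15 ops/s = 1 petaflop

print(f"short scale: 10^{len(str(short_scale_ops)) - 1} ops/s")
print(f"long scale:  10^{len(str(long_scale_ops)) - 1} ops/s")
```

So the same phrase is a teraflop on the short scale and a petaflop on the long scale, which is exactly the ambiguity Tomcat stumbled over.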
Well played on the limerick, Neil. Well played indeed. 😉
Andrew
I like the green racing stripe. It must be fast.
Retired Engineer (08:21:05) : 1.2 MW powering 1,000 homes? The usual value is 2 kW/home
I dunno. My house has 12 kW service, and 12 kW * 1,000 homes = 12 MW, ten times the computer’s draw. 2 kW sounds a bit low: just enough to run a few lights, a refrigerator, a TV and maybe the microwave. In my house, there are three computers on most of the time, which is well on the way to 1 kW right there.
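The disagreement over homes-per-megawatt comes down entirely to the assumed average household draw; here is the arithmetic as a quick sketch (the per-home figures are assumptions for illustration, not measurements):

```python
# How many homes 1.2 MW covers, under two assumed average household loads.
supercomputer_kw = 1.2 * 1000  # the Met Office machine's 1.2 MW, in kW

for per_home_kw in (1.2, 2.0):  # assumed average continuous draw per home
    homes = supercomputer_kw / per_home_kw
    print(f"at {per_home_kw} kW/home: {homes:.0f} homes")
```

At 1.2 kW per home the Mail’s “more than 1,000 homes” works; at 2 kW per home it is only 600. Note that a 12 kW service rating is peak capacity, not average draw, which is why both figures can be true at once.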
JER0ME (18:17:56) : “That implies that someone believes that climate models will actually do something….”
Reminds me of “The Laws” at http://www.numberwatch.co.uk/laws.htm:
“The law of computer models
The results from computer models tend towards the desires and expectations of the modellers.
Corollary
The larger the model, the closer the convergence.”
All of those teraflops of processor speed and gigabytes of memory will mean that even more complicated (incomprehensible) models will be possible, the basic laws of physics will be buried under a heap of modelling assumptions, and there will be a marked improvement in the rate of convergence.
Just for good measure (again h/t to numberwatch) – have a think about this in the context of the development of the AGW hypothesis:
“Le Chatelier-Braun Principle
If any change is imposed on a system in equilibrium, the system will change in such a way as to counteract the imposed change.”
A trillion ops per second isn’t a lot given the size of that thing.
At OMSI in Portland they have a nifty “computer” that demonstrates the normal distribution of golf balls falling into a bell curve pattern. And it is accurate every time. I wonder if the MET would like to borrow it.
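The OMSI exhibit is a Galton board: balls bounce left or right at each row of pegs, and the bins fill into a bell curve. A small simulation (parameters entirely made up for illustration) shows why it is “accurate every time”:

```python
import random
from collections import Counter

def galton_board(balls=10_000, rows=12, seed=0):
    """Drop `balls` through `rows` of pegs, bouncing left or right at each.

    A ball's final bin is its count of rightward bounces, so the bin
    counts follow a Binomial(rows, 0.5) distribution -- the bell-curve
    pile the museum exhibit reproduces on every run.
    """
    rng = random.Random(seed)
    bins = Counter()
    for _ in range(balls):
        bins[sum(rng.random() < 0.5 for _ in range(rows))] += 1
    return bins

bins = galton_board()
# The fullest bin sits at or near the middle column (rows / 2 = 6).
```

The joke lands because the board’s “forecast” of the distribution is reliable precisely because it models a process whose statistics are actually understood.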
It should not surprise anyone that the author of this blog would independently choose to ‘hedge his bets’ by deploying the trendy, albeit moronic, verbiage of “carbon pollution.” Humans these days typically take in a mixture that is 380–400 ppm carbon dioxide and exhale in the range of 30,000–40,000 ppm. So one might ask this author, at a merely rudimentary level, what it is about this stupid planetary genome that would program self-destruction (again, asked in the context of his trendy, albeit moronic, use of the verbiage of “carbon pollution”).
REPLY: I always get a laugh out of analysis like this. I’m not “hedging my bets”, it’s satire. As you point out, it’s “trendy, albeit moronic.” – Anthony
My desktop computer runs nearly 1 billion calculations per second, has 2.5 GB of memory and 1.2 TB of disk. It runs on less than 100 W, including a fancy digital LCD display, and cost well under $1K (it’s not even state of the art). I think the Met got ripped off.
Perhaps they did this to restore their credibility: “Our prediction is based on the output of our new huge supercomputer.” After all, a new huge supercomputer can’t possibly make a mistake. And they can run the model with much finer granularity. (of course, as no one has input data of fine granularity, the model is still worthless, but we don’t talk about that.)
So now they have 42.0000000000
Since they still don’t understand the question, not much help.
“Daisy, Daisy, …”
That big xbox just winked at me.
Too many times lately I wonder whether you are immune enough to green propaganda.
WUWT a few years ago was nice to read. Surfacestations is a nice and necessary project, but lately this blog is just a tabloid blog looking for an audience and making the whole denialist crowd happy.
By the way, on computer models: forget climate change for now and look only at weather prediction. Computer models have saved many, many thousands of lives in the last 20 years. And it’s good we have different models from different institutions (GFS, ECMWF, UKMET, CMC, etc.) to balance the predictions in weather forecasting.
~snip~
Supercomputer or not, the UK Met Office got it completely wrong again. Only 48 hours ago they were forecasting sunshine and 21–24 degree temperatures in south-east England for Sunday 30th August, and 24–27 degrees and sunshine for Monday 31st August. Sunday is cold and overcast and 17 degrees in Horsham, West Sussex, with rain in the north and west of the country, and Monday is now predicted to be no better. They’d be better off using seaweed.
Here in New Zealand, NIWA today say they’re getting a supercomputer 100 times bigger than the one they have now. They say it will help New Zealanders make up their minds what to do when a storm is coming, hehehehe.
Here’s the NIWA story for anyone interested. Not only are we getting a big, big computer, we’re going to make it twice as big in a couple of years.
http://tvnz.co.nz/business-news/niwa-buys-supercomputer-2956328
May, June and July were very cold but August has been pretty mild, although the north and west have seen some big storms, the latest happening now.
It’s far, far worse than we thought and I’m sure the computer will prove it.
Let’s try this again.
This supercomputer uses 1.2 megawatts.
Britain produces about 100,000 megawatts.
So, if it produces models that aid in passing legislation that reduces Britain’s electrical generation by 10%, it would pay for the power it consumes by about 8000 times.
One point two megawatts is not significant, but the modeling this supercomputer does is vital, IMO.
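The “pay for itself about 8000 times” arithmetic in the comment above can be made explicit; this sketch just restates the commenter’s own quoted figures (the 10% reduction is their hypothetical, not a measured saving):

```python
# Payback-ratio sketch, using the figures quoted in the comment above.
computer_mw = 1.2            # supercomputer draw, as reported
uk_generation_mw = 100_000   # rough UK generation, as quoted by the commenter
cut_fraction = 0.10          # the hypothetical 10% reduction in generation

saving_mw = cut_fraction * uk_generation_mw  # 10,000 MW
payback_ratio = saving_mw / computer_mw      # roughly 8,333x
print(f"hypothetical saving covers the computer {payback_ratio:,.0f} times over")
```

So the quoted “about 8000 times” is a rounded-down 8,333; the contested premise, of course, is whether the models cause any such saving at all.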
One or two megawatts is not significant? Tell that to people in Zimbabwe who have no power.
What is significant is that those who program these computers have no handle on what the quality is of the data being used. All the computing horsepower in the world can’t make up for bad programming or bad data, or flawed premises.
Now don’t reply with more touchy-feely garbage like “we’re all in this together”. The point is that the Met Office has a bad track record for predictions and for transparency.
They won’t share the data or the code for replication saying “trust us”.
Yeah right-o. What amazes me is that many people on the left don’t trust the government when it comes to reporting on the Iraq war, but they’ll trust climate data and predictions without any questions at all.
When Hadley/Met shares the data and the methods, then they’ll be doing “vital” work. As it stands, it is questionable, since it cannot be replicated.
I’m not interested in a debate about it with you Mr. Palmer. I know your mission.
Hi all-
Good people doing good modeling, IMO.
The info is well worth the 1.2 megawatts, especially since it is insignificant relative to Britain’s total power production, and could significantly impact public policy.
My mission is to tell the truth.
REPLY: What rubbish. Your mission is most definitely not “truth” because if it was, you’d be concerned about the fact that Hadley is withholding data and methods that would allow replication. Yet you give Hadley a free pass on this issue. You can’t even bring yourself to question “why”.
No, your mission is to spout your point of view everywhere you can.
So then, since you don’t mention it or appear to care, I assume you’re 100% OK with Hadley withholding data and code that would allow independent verification, like any other branch of science provides? You’re OK with whatever results that 1.2 megawatt computer puts out, even though they could be wrong because they haven’t been independently verified? You’re OK with using energy without verification that it is being used correctly? You’re OK with shutting down electricity production on those answers without independent verification?
In science, “good people” don’t hide things Mr. Palmer. – A
Hi all-
Good people doing good modeling, IMO.
The info is well worth the 1.2 megawatts, especially since it is insignificant relative to Britain’s total power production, and could significantly impact public policy.
My mission is to tell the truth.
REPLY: Mr. Palmer, PLEASE learn to use the refresh button. – A
Hi Anthony-
Well, I might fail in my mission once in a while, but really, it is my mission.
I think that the climate is going out of control, and the only way to bring it back into control is to apply the proper corrective action, which must be based on the truth to succeed.
Regarding the Hadley center, and their transparency or lack of it, most of the information I have about that comes from this blog.
Because I don’t believe most of the stuff I read on this blog, I have tended to discount it as irrelevant or exaggerated.
The Hadley Center consists of hundreds of scientists, modeling the weather and climate, just doing their jobs, I think.
Regarding the computer code, those models have millions of lines of code, I think.
Who is going to be able to understand it or evaluate it other than other climate modelers?
REPLY: Well then let’s set you up with homework, since you won’t seek out that information yourself.
http://www.climateaudit.org/?p=6797
Nature, the world’s most “prestigious” journal, reported on it here:
http://www.nature.com/news/2009/090812/full/460787a.html
So explain why “good people” must withhold data. No, their argument for denying the FOIs doesn’t hold water. And there’s a bunch of other questions you left unanswered.
And newsflash Leland, the climate was never “in” control. We’ve never “had” control of it and never will. Such is the folly of alarmists. – A
Hi Anthony-
Actually, I believe Lovelock’s hypothesis, that it is a self-regulating system, is correct.
It’s a very good, robust, self-regulating system, with massive reserves of control, and it has apparently gone out of control spontaneously only a couple of times in the past several hundred million years.
But I think our “geologically instantaneous” release of something like 300 billion tons of carbon from fossil fuels has destabilized it, and we are seeing the initial stages of runaway global warming right now.
The Hadley Center consists of good people, IMO, doing their jobs.
Their supercomputer is vitally necessary to this effort, and its electricity usage is trivial compared to the value of the information gained, IMO.
REPLY: You’re wrong. Runaway global warming… not possible. Note the earth had 6,000 ppm or more of CO2 in the past and didn’t turn into Venus. “Good people” at Hadley hide data, preventing replication? Follow the links, educate yourself, pull yourself up out of the Gore. – A
Hi Anthony-
In the past, the climate has had one thing it doesn’t have now: time to adapt.
Lovelock, the author of the “climate as a self regulating system” hypothesis, believes it is not only possible, but happening right now – after a visit to the Hadley Center:
I think Lovelock’s right, and you’re wrong.
We’ve used up the resiliency of the system with 300 billion tons of carbon, mostly from fossil fuels, added geologically instantaneously to the climate system.
REPLY: I think you’ve been brainwashed. CO2 has a logarithmic response to LWIR in the atmosphere, see this graph; it can’t cause a tipping point, ever.
It’s not about feelings, it’s not about it being “terrifying”, it’s about a physical law. CO2 can’t cause a tipping point, period.
Since you’ve moved on to religion, aka Gaia, your welcome has expired. – A