Met Office – COPing to predictions

People send me stuff. Today I got a document dump, not quite on the scale of ClimateGate but interesting nonetheless.

These are the Met Office’s past reports to the COP climate conferences going back to 1998, containing predictions about climate change. WUWT readers will surely be interested in examining how accurate those predictions have turned out to be. For example, here’s a figure from the COP4 report in 1998:

There is too much information here to take in today, so this seems like a perfect opportunity for crowd-sourcing by WUWT readers. In comments, make your points with references to and excerpts from the documents, and compare them to what we know today, with appropriate references and citations.

I’ll publish another thread on what you’ve found.

These are all PDF files, some as large as 5MB.

Reblogged this on Climate Ponderings and commented:
Researchers needed

I tried to do something similar with IPCC predictions recently. Finding the raw data for predictions can be hard, but I found this handy tool, which allows the reverse-engineering of charts.

–“The majority of the research work carried out within the Met Office Hadley Centre
uses a world-leading global climate model that includes many different components of the climate system.”– OOH, “World-Leading” model. Must be good.


Just like with psychics, ignore the failed predictions and concentrate on the 1-2% they get right.


A ring through the nose is bull leading, which mainly means it goes where you want him to go.

Frumious Bandersnatch

“A ring through the nose is bull leading, which mainly means it goes where you want him to go.”
Yeah, but be careful you don’t get (Al) Gored by the data.

Werner Brozek

This is with reference to the first graph, which is also in COP4, page 5, where the land temperature was expected to increase greatly over this past decade. CRUTEM4 is now on woodfortrees, but there is a slight problem: it only goes to the end of 2010. Mathematicians may wish to improve on my crude analysis, but for what it is worth, here is what I did. I took the slope of CRUTEM3 from 2002 to December 2010 and found it to be -0.00200314. Then I found the slope from 2002 to January 2012, its latest value, and found it to be -0.00793462. The drop for the additional 13 months was 0.00593. The slope of CRUTEM4 from 2002 to December 2010 was 0.00585. So if I am allowed to assume that when CRUTEM4 is completely updated there would be a similar drop, there will be NO temperature change over land for the past decade.
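For what it's worth, the extrapolation above can be laid out as a few lines of arithmetic (slope values copied from the comment; the assumption that CRUTEM4 would see a "similar drop" when updated is the commenter's, not an established result):

```python
# Toy reproduction of the extrapolation in the comment above.
crutem3_to_dec2010 = -0.00200314   # CRUTEM3 slope, 2002 to Dec 2010 (deg C / yr)
crutem3_to_jan2012 = -0.00793462   # CRUTEM3 slope, 2002 to Jan 2012
crutem4_to_dec2010 = 0.00585       # CRUTEM4 slope, 2002 to Dec 2010

# How much the extra 13 months of data lowered the CRUTEM3 slope
drop = crutem3_to_dec2010 - crutem3_to_jan2012  # about 0.00593

# Assume (as the comment does) that CRUTEM4 drops by a similar amount
crutem4_estimate = crutem4_to_dec2010 - drop
print(f"estimated CRUTEM4 slope to Jan 2012: {crutem4_estimate:+.5f}")
```

The estimated slope comes out essentially indistinguishable from zero, which is the comment's point.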


Oh if I only had some time…on a drop-dead rush job but just great material to analyse.
Ben – thanks for the link – good stuff.

Stephen Singer

These reports' future temperature-change graphs do not use the same baseline years for the baseline average temperature.
COP4 and COP6 use a baseline of 1860-1890. By COP8 it looks like 1960-1990, perhaps; the rest, maybe 1980-2010. Mostly the baseline year range for the average temperature is not specifically stated.
Even with the changing baseline years, the projected temperature change out to 2100 is fairly consistent at 3-5 deg C.
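The baseline issue is worth spelling out: switching baseline periods shifts every anomaly value by the same constant, so the projected *change* out to 2100 is unaffected even though the plotted values differ. A minimal sketch, using made-up anomaly numbers:

```python
# Converting an anomaly series to a new baseline: subtract the series'
# mean over the new baseline period. All values below are invented.
anoms = [0.40, 0.42, 0.38, 0.45, 0.43, 0.47, 0.50, 0.48, 0.52, 0.55]

# Re-baseline to the period these ten values cover
offset = sum(anoms) / len(anoms)
anoms_rebased = [a - offset for a in anoms]

# Every value moves by the same constant, so differences between any
# two years (and therefore trends) are unchanged.
print(round(offset, 3))
```

This is why a changing baseline alters where the curves sit on the page but not the 3-5 deg C change the charts project.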

You don’t even have to look past 1980 to see it is garbage.
None of the curves show the ~1950-~1980 decline. The curves aren’t real data prior to the date of the graph. That makes the entire graph fake.
The curves look like they were taken straight off the supposedly “real time” but actually artificial linear Mauna Loa CO2 curve.

Peter Whale

Until the models can replicate the past, future predictions are a biased guess that maintains funding.


Between 2000 and today the AR4 models predicted 0.33 °C of warming. Actual warming has been ZERO, so the error is infinite.


Ben Pile says:
April 14, 2012 at 7:37 am
I tried to do something similar with IPCC predictions recently. Finding the raw data for predictions can be hard, but I found this handy tool, which allows the reverse-engineering of charts.
Cool link Ben. What a great idea – reverse-engineering visual data to its numerical form (allowing for further numerical analysis beyond that which can be achieved by a human staring at a chart). Nice. Thanks.
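Chart-digitizing tools of that kind generally work by calibrating pixel coordinates against two known axis points and mapping everything else linearly between them. A minimal sketch of that mapping, with invented pixel coordinates:

```python
def pixel_to_data(px, px0, px1, v0, v1):
    """Linearly map a pixel coordinate to a data value, given two
    calibration points: pixel px0 corresponds to value v0, px1 to v1."""
    return v0 + (px - px0) * (v1 - v0) / (px1 - px0)

# Calibrate the y-axis of a scanned chart: suppose pixel row 400 lies on
# the 0 degC gridline and pixel row 100 on the 2 degC gridline (made-up
# numbers; note pixel rows count downward, which the formula handles).
value = pixel_to_data(250, px0=400, px1=100, v0=0.0, v1=2.0)
print(value)  # halfway between the gridlines -> 1.0
```

Do the same for the x-axis and every clicked point on a curve becomes a (year, anomaly) pair ready for numerical analysis.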

David L. Hagen

“Probably the greatest uncertainty in future projections of climate arises from clouds and their interactions with radiation … even the sign of this feedback remains unknown” — IPCC (TAR 2001)
“Cloud effects “remain the largest source of uncertainty” in model based estimates of climate sensitivity” — IPCC (AR4 2007)
“Uncertainty in the sign of projected changes in climate extremes over the coming two to three decades is relatively large because climate change signals are expected to be relatively small compared to natural climate variability”. IPCC Special Report on Extremes 2012
See further quotes at: Roger Pielke Jr. & WUWT
COP4 p 5

The reason why climate predictions are so uncertain is that, once climate change begins, consequential changes will feed back, either positively or negatively, on the original warming. These feedbacks are poorly understood.

However, in COP12 there is no mention of “clouds”, “feedback” or “feed back” or “sensitivity” or “sign” or “hindcast”, or “validate” or “verify” or “verified” . However, there are 13 occurrences of “likely”. e.g.,

observations show that the fraction of the planet’s land surface in drought has risen
sharply since the start of the 1980s. Comparison with computer model simulations suggests this is likely to be due to human induced climate change.

How do we know that the Met Office did not get their predictions backwards, like the CSIRO as found by David Stockwell in Tests of Regional Climate Model Validity in the Drought Exceptional Circumstances Report?

The most worrying failure was that simulations showed increases in droughted area over the last century in all regions, while the observed trends in drought decreased in five of the seven regions identified in the CSIRO/Bureau of Meteorology report. Therefore there is no credible basis for the claims of increasing frequency of Exceptional Circumstances declarations made in the report.


I noticed that the predictions made in 2000 were much lower than those made in 1998.
[They were still way wrong, but lower.]

warren knott

The Met Office have just been given a £60 million grant to improve their computer system for climate change research. Money makes the world go round.

Net Dr: Good one. Infinity is pretty big, is it not?


COP12 from 2006 devotes four pages to PRECIS, Providing REgional Climates for Impacts
It “is a regional climate modelling system allowing Hadley Centre regional climate models to be set up over any region and run on a PC (under the Linux operating system) with a simple user interface.
“It also includes a suite of data processing, analysis and display tools so experiments can easily be set up, run, analysed and data made available for wider application and dissemination.
“PRECIS is designed for researchers (with a focus on developing countries) to construct high-resolution climate change scenarios for their region of interest.
“These scenarios can be used in impact, vulnerability and adaptation studies, and to aid in the preparation of National Communications, as required under Articles 4.1 and 4.8 of the United Nations Framework Convention on Climate Change (UNFCCC).”
COP12 focused on its use in India, China and Africa. It would be interesting if WUWT could obtain PRECIS, if at all possible, to give an insight into how the Hadley Centre are creating regional climate models and related studies to present to the UN.


Kaboom says:
“A ring through the nose is bull leading, which mainly means it goes where you want him to go.”
I suppose you could make the case that a ring on the finger is miss leading. Misleading seems like a better term to describe the global climate model from the Hadley Centre rather than world-leading.

Crispin in Johannesburg

60 million pounds? Have they considered buying a couple of copies of Piers Corbyn’s laptop? That would save a lot and provide much more valuable and accurate forecasts.

Pamela Gray

By the time the 10th report comes out, the predictions will have morphed into exactly what occurred. And then they can say they were right.



COP 12 – 2006 – Met Office Hadley Centre
“Even though (globally) total rainfall will increase as the climate warms, the proportion of land in
drought is projected to rise throughout the 21st century because some areas are likely to experience less rainfall, and evaporation will be enhanced in a warmer climate.”

But won’t other areas of land get more rainfall? Does all this extra rainfall have a particular preference for the open sea? I have no idea.

Detection of a direct carbon dioxide effect in continental river runoff records
Continental runoff has increased through the twentieth century [1,2] despite more intensive human water consumption [3]. Possible reasons for the increase include: climate change and variability, deforestation, solar dimming [4], and direct atmospheric carbon dioxide (CO2) effects on plant transpiration [5].

Brief Communications Arising
Continental Runoff: A quality-controlled global runoff data set
Gedney et al. [1] attribute an increase in the twentieth-century continental runoff to the suppression of plant transpiration by CO2-induced stomatal closure, by replicating a continental runoff data set [2]. However, we have concerns about this data set and the methods used to construct it, in addition to those already raised [3], which we believe may undermine their conclusions.


I find it staggering how much the Met Office relies on models to make its projections. No wonder so many people died of cold in the UK after being snowed in over the past few years.

Pat Frank

I downloaded the 1998 COP4.
This document was produced by a literal Who’s Who of British climate scientists: UK Met. Office Hadley Centre: Chris Folland, David Parker, Briony Horton, John Mitchell, Tim Johns, Christine Coughlan, Anne Keen, Nick Rayner, David Roberts, Andy Jones, Paul Jacobs, Simon Tett and Geoff Jenkins
Climatic Research Unit, University of East Anglia: Keith Briffa, Phil Jones, Mike Hulme and David Viner.
The Figure on page 4 shows the UK Met/UEA predicted temperature anomalies from 1859 through to 2100. These predictions were made by “the new Hadley Centre Climate model.” Experience leads one to suspect that the even newer Hadley Centre climate models constructed since 1998 would predict pretty much the same trends.
On top of these predictions, I’ve plotted HADCRUT3 global anomalies out to 2012, normalized to their 1851-1900 mean. The result is here: or, if they can be made to appear: The original Legend is included.
Both plots are on the same scale and were adjusted to the same frame dimensions. So, the lines are directly comparable just by inspection. If you look hard, you’ll see the original tick-marks below mine. For the following comparisons I estimated the mean trends, ignoring all the random temperature spikes.
Comparison shows that the UEA measured global anomalies consistently undershoot the predicted anomalies after about 1950. By 2012, the measured global temperature is about 0.3 C cooler than the predicted global temperature.
I then normalized the GISS land anomalies to their 1880-1900 mean and again plotted them over the UEA predictions. The disparity in land temperature anomalies is much worse than the global disparity. The result is here: or, if they can be made to appear: The original Legend is again included.
The GISS land surface anomalies again consistently depart from the predicted land surface anomalies after about 1950. By 2011, the measured land anomalies are a full 1 C cooler than the predicted land surface anomalies.
It would look like CRU GCM falsification to me, if I believed that the surface air temperatures could be measured to better than (+/-)0.5 C accuracy and/or that GCMs could actually produce a valid air temperature prediction.
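The comparison procedure described above (re-baseline the measured series to the early-period mean, then difference it against the prediction on the same baseline) can be sketched as follows; all numbers are invented placeholders, not the actual HadCRUT3, GISS, or Hadley-model values:

```python
# Sketch of the baseline-and-compare procedure described in the comment.
# All anomaly values below are made up for illustration.
measured = {1880: 0.10, 1890: 0.08, 1900: 0.12, 2011: 0.85}
predicted_2011 = 1.85   # hypothetical model anomaly on the same baseline

# Mean of the measured series over the reference period
baseline = [measured[y] for y in (1880, 1890, 1900)]
base_mean = sum(baseline) / len(baseline)

# Re-baseline the end-point measurement, then difference against the model
measured_2011_rebased = measured[2011] - base_mean
disparity = predicted_2011 - measured_2011_rebased
print(round(disparity, 2))
```

Only once both series share a baseline does the measured-minus-predicted gap mean anything; comparing anomalies on different baselines just recovers the baseline offset.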

James Ard

The Met Office must realize these predictions are going to be cut to shreds. Why did they release them? Sorry I can’t help with the breakdown; I’m only here for the geeky snark.


There have been no negative consequences; in fact, quite the opposite: these people have prospered very well from their data manipulation and production of scary models.
So this will continue until they face negative consequences.
Terminating a couple of careers and stopping the financing for all the nice new toys would send a strong signal.

The first thing you would have to do is plot the ASSUMED emissions scenario (BAU) versus the ACTUAL emissions data. If that’s close, then you can compare the prediction to the actual.
It’s like this: suppose I have a model that predicts miles to empty given an assumed fuel flow rate.
So, if you run your car at the current throttle setting for the next 2 hours, your MTE (miles to empty) will be Xt, Xt2, etc. You run that model based on an assumed throttle setting.
Now you get to test it. But instead of holding the throttle setting you assumed, the driver sometimes runs above that setting and sometimes below it.
Problem: you can’t simply compare the model run (which assumed condition X) against a test run where the test conditions were not controlled.
Since we can’t control the experiment (the emissions), we can’t SIMPLY compare a prediction to reality. You first must check the emissions scenario and see how well it tracks with reality.
If you want controlled experiments then your only hope is to install a world government that controls emissions. And we don’t want that. Or you can do a range of emissions scenarios.
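The check Mosher describes can be sketched in a few lines: score the prediction only if the assumed scenario stayed within some tolerance of the actual forcing path. The emissions figures and the tolerance below are invented placeholders:

```python
# Before comparing a model prediction to observations, verify that the
# forcing scenario the model assumed actually tracked reality.
def scenario_tracks(assumed, actual, tolerance=0.05):
    """True if every actual value is within `tolerance` (as a fraction)
    of the assumed scenario value for the same year."""
    return all(abs(act - asm) / asm <= tolerance
               for asm, act in zip(assumed, actual))

assumed_emissions = [7.0, 7.3, 7.6, 7.9]   # hypothetical scenario, GtC/yr
actual_emissions  = [7.1, 7.2, 7.7, 8.0]   # hypothetical outcome, GtC/yr

if scenario_tracks(assumed_emissions, actual_emissions):
    print("scenario tracked reality; prediction can be compared")
else:
    print("scenario diverged; compare against a matching scenario instead")
```

The point of the sketch is only that the comparison is conditional: if the tolerance test fails, the prediction and the observations were answering different questions.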

Werner Brozek

Which data set should be used when analysing these predictions versus reality? I was given the impression that CRUTEM3 was not good because it did not cover the polar regions well. Presumably, we now have the cream of the crop with both CRUTEM4 and BEST. Is one better than the other? I checked out the year 1996 with both and there are HUGE differences! For example, with BEST, January 1996 was 0.282 and August was 1.095 for a rise of 0.813 between January and August of the same year. However with CRUTEM4, January 1996 was 0.208 and August was 0.220 for a rise of only 0.012 between January and August of that same year. The net difference is 0.801 C, which is supposedly the total warming since 1750. See
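Laying the quoted figures out as arithmetic (values copied from the comment as given; whether they are correctly quoted from BEST and CRUTEM4 is not checked here):

```python
# January and August 1996 anomalies as quoted in the comment above
best_jan, best_aug = 0.282, 1.095
crutem4_jan, crutem4_aug = 0.208, 0.220

best_rise = best_aug - best_jan            # 0.813 within the same year
crutem4_rise = crutem4_aug - crutem4_jan   # 0.012 within the same year

# The discrepancy between the two data sets for the same eight months
net_difference = best_rise - crutem4_rise
print(round(net_difference, 3))
```

A 0.8 C disagreement between two data sets over eight months of the same year is the comment's punchline: that is about the size of the total warming usually claimed since 1750.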


For the posted chart, how can global temperature change be more than sea or land? Shouldn’t it be between the two, about 70-30 towards the sea end?

Phil's Dad

Steven Mosher says:
April 14, 2012 at 3:27 pm
the first thing you would have to do…

Of course Mr M is right about looking at the assumptions in the models first. Perhaps we could ask them to release their data and code so we could do that. What do you think Mr M?
In the meantime – as far as I can tell – emissions seem to be about the same as or slightly higher than their A1FI assumptions, and temperatures well below.

Allan MacRae

“I am always happy to be in the minority. Concerning the climate models, I know enough of the details to be sure that they are unreliable. They are full of fudge factors that are fitted to the existing climate, so the models more or less agree with the observed data. But there is no reason to believe that the same fudge factors would give the right behavior in a world with different chemistry, for example in a world with increased CO2 in the atmosphere.”
– Freeman Dyson
COP10 (2004) excerpt:
The Hadley Centre has developed a method to estimate the uncertainty in climate models, the largest source of uncertainty in climate predictions over the next 50 years. Preliminary results suggest that:
– the most likely global average temperature rise for a doubling of the concentration of atmospheric carbon dioxide is predicted to be 3.5 °C, with a 90% probability that the warming will be between 2.4 °C and 5.4 °C;
Not a word about aerosols in this COP10 document – I presume their FABRICATED aerosol numbers are buried within the climate computer models.
How do I know the aerosol numbers (pre-1970) are FABRICATED? Because if they used real aerosol numbers, the “most likely global average temperature rise for a doubling of the concentration of atmospheric carbon dioxide” (Climate Sensitivity to CO2) would not be 3.5C – it would probably be 1.0C or less.
They use the phony aerosol data to fudge their climate computer models so that they can falsely claim to hindcast (model the past) accurately, and voila – can therefore claim to accurately model the future.
It is all nonsense – their models grossly over-predict global warming due to many fatal flaws in their input assumptions – the phony aerosol data is needed to then “adjust” the more obvious resulting consequences of these false inputs (particularly exaggerated Sensitivity) – one lie begets another.
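The trade-off being alleged here (strong aerosol cooling offsetting high sensitivity so that both fit the same hindcast) can be illustrated with a deliberately trivial toy. This is in no way the Hadley model; every number below is invented:

```python
# Toy illustration of the sensitivity/aerosol trade-off alleged above.
# Two parameter pairs: high sensitivity offset by strong aerosol cooling,
# and low sensitivity with weak aerosol cooling. Both "hindcast" the same
# past warming, but diverge once CO2 forcing grows and aerosols don't.
def warming(sensitivity, co2_forcing, aerosol_cooling):
    return sensitivity * (co2_forcing - aerosol_cooling)

past_co2 = 1.0                     # arbitrary forcing units
high = warming(3.0, past_co2, 0.75)   # high sensitivity, strong aerosols
low  = warming(1.0, past_co2, 0.25)   # low sensitivity, weak aerosols
assert high == low   # indistinguishable in the hindcast: both give 0.75

# Future: CO2 forcing doubles, aerosol terms held fixed
future_high = warming(3.0, 2.0, 0.75)
future_low  = warming(1.0, 2.0, 0.25)
print(future_high, future_low)
```

A hindcast match therefore cannot, by itself, discriminate between the two parameter pairs; that is the structure of the objection being made, whatever one thinks of its merits.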

Gail Combs

Steven Mosher says:
April 14, 2012 at 3:27 pm
If you want controlled experiments then your only hope is to install a world government that controls emissions. And we dont want that…..
Yea, but the UN, World Bank and WTO do.
The 2012 PhD Training School on the Human Dimensions of Global Environmental Change: Earth System Governance will be held at the University of Twente in Enschede, on 18-21 June 2012…
New Working Papers
The latest (downloadable) Global Governance Working Papers:
* The Effectiveness of Transnational Rule-Setting Organisations in Global Sustainability Politics: An Analytical Framework
* Technology Transfer through Water Partnerships. A Radical Framework of Assessment for Legitimacy
* Global Democracy Without Global Justice? Why a Procedural Account is Flawed
* Towards a World Environment Organisation: Identifying the Barriers to International Environmental Governance Reform

That was just the first listed in a search on “Global Governance”

George E. Smith

So I looked at just one of their graphed predictions; the green one, since it is the most prominent, and it raised a question. Starting at around 2050, and going to 2100, the climate is predicted to start undergoing much larger temperature swings, than anything in the history up to 2000.
So what is it in their green model that causes everything to go pear-shaped in 2050? Do they switch to a different random-number algorithm in 2050?

Werner Brozek

MikeN says:
April 14, 2012 at 5:14 pm
For the posted chart, how can global temperature change be more than sea or land? Shouldn’t it be between the two, about 70-30 towards the sea end?

They are. The red (global) is between the green (land) and blue (sea). And in about the ratio you say. But over the last 11 years, neither the land, sea, nor globe warmed. See


I just threw a dart at
“There is no practical hope of saving small island states like the Maldives Islands (in the Indian Ocean),” Griggs told IPS. “This is a hugely important new issue.”
On the other side of the world, rising waters will drown many beaches on Canada’s Atlantic coast by 2030, says John Shaw, a research scientist with the Geological Survey of Canada. Adding to the problem is the fact that much of the eastern North American continent is subsiding as a result of the last ice age.
Meanwhile back at the ranch…$


“Gail Combs says:
April 14, 2012 at 5:43 pm
Steven Mosher says:
April 14, 2012 at 3:27 pm
If you want controlled experiments then your only hope is to install a world government that controls emissions. And we dont want that…..
Yea, but the UN, World Bank and WTO does. ”
Dread the thought.
So the Global Dictatorship would have to be in power for 800 years of ‘controlled emissions’ to see if there really is anything to CO2 AGW/CC, due to the 800-year lag-time effect. Am I right?

son of mulder

Just looking at the 1998 COP unperturbed model, they seem to have temperature going down in 1998, as opposed to the El Nino. Are they implying that the 1998 El Nino was caused by anthropogenic CO2, or is their unperturbed model rubbish, not being able to predict a very significant climate event in its year of production? And if their unperturbed model is rubbish, what do we expect the others to be?

Stephen Richards

Pat Frank says:
April 14, 2012 at 1:31 pm
Pat, I suspect that it has been taken down from the UK Met Office site by now, but in about 2007 they announced a brand new model for their brand new £30m computer which would PREDICT the daily weather and climate to 10 years out, and that they would be selling these forecasts commercially. They used that model to predict that the majority of years from 2009 would be hotter globally than 1998. That’s why they had to invent HadCru4. They needed it to make 2010 the warmist year ever, ever ever. They are crooks and thieves. They take money from UK pensioners (£482/month maximum) to play their dangerous climate games. Richard Betts has been putting himself around the internet with his “be polite and you will be alright” method in support of his friends at the CRU and their climate model which, incidentally, in case you didn’t know, is accurate and has predicted everything that has happened.

Stephen Richards

Mosher says
Since we cant control the experiment ( the emissions ) we cant SIMPLY compare a prediction to reality. You first must check the emissions scenario and see how well it tracks with reality.
So, Steven, we should stop all model research in favour of real, empirical science, yeh?

David A

Steven Mosher says:
April 14, 2012 at 3:27 pm
Well then, we can take their most dire predictions, as emissions have been about as great as the worst-case do-nothing-to-control-CO2 scenarios, what with India and China as well as many other nations developing. Therefore the accuracy of the predicted result is worse than we thought.


The worm has turned —
What was a trickle of dissent has become a stream. 16 scientists here, 49 astronauts there, and treasure troves of documents. The release of this UK Met Office material is another component of what will soon become a torrent of rebuke, as scientists and policy makers flee the sinking ship that is CAGW dogma. Soon people will begin to brag about having left first — the ”I knew it was all a load of ____ when . . .” crowd. For the money/power grubbers in the CAGW camp, I hope they ride it into the depths. For those easily misled, I hope that they will learn to exert some independent thought next time. And for a generation of young scientists with degrees in climate studies and computer modeling, I feel sorry. Populist band-wagons may be great social moves, but they often leave one’s career in disarray.

Pat Frank

Stephen, coincidentally, I recently made a comparison between the retrodiction of that particular brand new UK Met climate model, and the 2011 and 2012 CRU global temperature anomalies.
Here’s the plot:
All the lines are zeroed at 1985. The white line is the UK Met model retrodiction with 95% confidence intervals in fading red. The dark blue line is the CRU global temperature anomalies to 2011, and the dark green line is the CRU global temperature anomalies to 2012.
Darned if in the 2011 data, the observed anomaly trend wanders away from the retrodicted anomalies after about 1996. By 2011, the measured anomalies are threatening the 95% confidence limit.
But somehow by 2012 the anomalies were corrected by the Ministry of Truth in Climate Science, and voila!, they are properly right on target over the model line. Climate science, by Winston Smith, et al.