Met Office pushes a surface temperature data "do-over"

From Fox News, word that the Met Office has circulated a proposal to start over completely with raw surface temperature data in a transparent process.

Here’s the proposal from the Met Office: metoffice_proposal_022410 (PDF). Unfortunately it is not searchable, as they still seem to be living in the typewriter age, having photoscanned the printed document.

I’d feel better about it, though, if they hadn’t used the word “robust”. Every time I see that word in the context of climate data it makes me laugh. It seems, though, that they have already concluded the effort will find no new information. Given that they are apparently only interested in ending the controversy over transparency, and because GHCN (the source for GISS and HadCRUT) originates at NCDC, with its own set of problems, under the control of one man, Dr. Thomas Peterson, we’ll have our work cut out for us again. In my opinion, this proposal is CYA and does not address the basic weaknesses of the data collection.

Britain’s Weather Office Proposes Climate-Gate Do-Over

By George Russell.

At a meeting on Monday of about 150 climate scientists, representatives of Britain’s weather office quietly proposed that the world’s climatologists start all over again to produce a new trove of global temperature data that is open to public scrutiny and “rigorous” peer review.

After the firestorm of criticism called Climate-gate, the British government’s official Meteorological Office apparently has decided to wave a white flag and surrender.

At a meeting on Monday of about 150 climate scientists in the quiet Turkish seaside resort of Antalya, representatives of the weather office (known in Britain as the Met Office) quietly proposed that the world’s climate scientists start all over again on a “grand challenge” to produce a new, common trove of global temperature data that is open to public scrutiny and “rigorous” peer review.

In other words, conduct investigations into modern global warming in a way that the Met Office bureaucrats hope will end the mammoth controversy over the world temperature data they collected, a controversy stirred up by their secretive and erratic ways.

The executive summary of the Met Office proposal to the World Meteorological Organization’s Committee for Climatology was obtained by Fox News. In it, the Met Office defends its controversial historical record of temperature readings, along with similar data collected in the U.S., as a “robust indicator of global change.” But it admits that “further development” of the record is required “in particular to better assess the risks posed by changes in extremes of climate.”

As a result, the proposal says, “we feel that it is timely to propose an international effort to reanalyze surface temperature data in collaboration with the World Meteorological Organization (WMO), which has the responsibility for global observing and monitoring systems for weather and climate.”

The new effort, the proposal says, would provide:

–“verifiable datasets starting from a common databank of unrestricted data”

–“methods that are fully documented in the peer reviewed literature and open to scrutiny;”

–“a set of independent assessments of surface temperature produced by independent groups using independent methods,”

–“comprehensive audit trails to deliver confidence in the results;”

–“robust assessment of uncertainties associated with observational error, temporal and geographical inhomogeneities.”

Click here to read the executive summary.

The Met Office proposal asserts that “we do not anticipate any substantial changes in the resulting global and continental-scale … trends” as a result of the new round of data collection. But, the proposal adds, “this effort will ensure that the data sets are completely robust and that all methods are transparent.”

Despite the bravado, those precautions and benefits are almost a point-by-point surrender by the Met Office to the accusations that have been leveled at its Hadley Climate Centre in East Anglia, which had stonewalled climate skeptics who demanded to know more about its scientific methods. (An inquiry established that the institution had flouted British freedom of information laws in refusing to come up with the data.)

When initially contacted by Fox News to discuss the proposal, its likely cost, how long it would take to complete, and its relationship to the Climate-gate scandal, the Met Office declared that no press officers were available to answer questions. After a follow-up call, the Office said it would answer soon, but did not specify when. At the time of publication, Fox News had not heard back.

The Hadley stonewall began to crumble after a gusher of leaked e-mails revealed climate scientists, including the center’s chief, Phil Jones, discussing how to keep controversial climate data out of the hands of the skeptics, keep opposing scientific viewpoints out of peer-reviewed scientific journals, and bemoaning that their climate models failed to account for more than a decade of stagnation in global temperatures. Jones later revealed that key temperature datasets used in Hadley’s predictions had been lost, and could not be retrieved for verification.

Jones stepped down temporarily after the British government announced an ostensibly independent inquiry into the still-growing scandal, but that only fanned the flames, as skeptics pointed out ties between several panel members and the Hadley Centre. In an interview two weeks ago, Jones also admitted that there has been no “statistically significant” global warming in the past 15 years.

The Met Office’s shift in position could be a major embarrassment for British Prime Minister Gordon Brown, who as recently as last month declared that climate skeptics were “flat-earthers” and “anti-science” for refusing to accept that man-made activity was a major cause of global warming. Brown faces a tough election battle for his government, perhaps as early as May.

It is also a likely blow to Rajendra Pachauri, head of the United Nations-backed Intergovernmental Panel on Climate Change (IPCC), whose most recent report, published in 2007, has been exposed by skeptics as rife with scientific errors, larded with unreviewed and non-scientific source materials, and marred by other failings.

As details of the report’s sloppiness emerged, the ranks of skeptics swelled to include larger numbers of the scientific community, including weather specialists who worked on the sprawling IPCC report. Calls for Pachauri’s resignation have come from organizations as normally opposed as the Competitive Enterprise Institute and the British chapter of Greenpeace. So far, he has refused to step down.

The Met Office proposes that the new international effort to recalibrate temperature data start at a “workshop” hosted by Hadley. The Met Office would invite “key players” to start the “agreed community challenge” of creating the new datasets.

Then, in a last defense of its old ways, the Met proposal argues that its old datasets “are adequate for answering the pressing 20th Century questions of whether climate is changing and if so how. But they are fundamentally ill-conditioned to answer 21st Century questions such as how extremes are changing and therefore what adaptation and mitigation decisions should be taken.”

Those “21st Century questions” are not small, and they are very far from cheap. At Copenhagen, wealthy nations were being asked to spend trillions of dollars on answering them, a deal that fell through only when China, India, and other near-developed nations refused to join.

The question after the Met Office’s shift in stance may be whether environmentalists eager to move those mountains of cash are also ready to stand down until the 21st century questions get 21st century answers.

=========================

h/t to Dr. Richard North, EU Referendum

R. de Haan
February 23, 2010 9:01 am

I wonder if the tax payers will get a refund for all the money wasted on the old data?

John
February 23, 2010 9:02 am

Who are these 150 climate scientists?
Who paid for them to convene in Turkey and under what umbrella?
Is this meeting a reaction to Climategate and to discuss reactions?

JJ
February 23, 2010 9:03 am

Who is going to give the Met the money to do this? Why would that entity give that money to the Met? Why are the people that effed it up in the first place being given the opportunity to do it again?

February 23, 2010 9:03 am

If we really wanted anyone to “start all over again to produce a new trove of global temperature data”, why use those who already have a record of bias?
With kind regards,
Oliver K. Manuel

Tony Hall
February 23, 2010 9:05 am

The “back to square one” reassessment sounds very hopeful but is it:
1. A delaying tactic to keep the Government research funding rolling for as long as possible?
2. A delaying tactic to ensure that National policies to combat AGW proceed so far down the line that it would be even more difficult (and possibly politically suicidal) to do a huge U-turn than if it was done in the near future?

Chuckles
February 23, 2010 9:09 am

I’m with John (09:02), what are 150 staff of the British Met Office doing in Turkey, and on whose dime are they there?
It better not be mine.

February 23, 2010 9:13 am

Since it was the Met Office’s lack of oversight and incompetence that resulted in the existing datasets being flawed, any new dataset should be completely removed from the hands of the Met Office and CRU. It should be put out to competitive tender. It is not a job that needs Met Office or UEA staff to do at great expense. A few competent statisticians and software developers could do the job.
Why should the taxpayers pay the Met Office and CRU again because of their incompetence? The Met Office and CRU should have their funding reduced to pay for the new database.

R.S.Brown
February 23, 2010 9:14 am

This “linkable” proposal is actually a superb response to the flurry of anticipated FOIA requests its announcement will otherwise spawn.
However, the discussions that went into drafting the proposal are not so transparent.
Does this mean the old “raw” data is available somewhere, in some format?

Philip Thomas
February 23, 2010 9:17 am

They know now that the value-added calculations will have to occur before the temperatures leave the Stevenson screen.

Gary
February 23, 2010 9:22 am

It will be important to establish the criteria for the temperature dataset do-over at the beginning. Otherwise it will be a double waste of money. Whoever is charged with the mission should get input from all interested parties before approving a plan.
Just off the top of my head:
1. archive of original records (digital images of the forms and photographs/diagrams of the sites)
2. meta-data on station siting and changes
3. full description of raw data audit rules
4. processing algorithms and code
5. equipment specifications and effects of instruments on temperatures (CRS paint, sensor drift, etc.)

February 23, 2010 9:23 am

That’s one helluva big mulligan they get!!!!

thethinkingman
February 23, 2010 9:24 am

The travesty is that it has taken four months to get to this point.

E.M.Smith
Editor
February 23, 2010 9:26 am

If genuine, it would be welcomed.
But it sounds to me like partisans trying to justify their position with a flag wrapping party… Some buzz words / phrases that got my attention:
The executive summary of the Met Office proposal to the World Meteorological Organization’s Committee for Climatology was obtained by Fox News. In it, the Met Office defends its controversial historical record of temperature readings, along with similar data collected in the U.S., as a “robust indicator of global change.”
There is that “robust” red flag again. Nothing wrong with the word, except it has been used too often by too many to mean “Our guesses are strong, trust me”. What’s wrong with just “find the truth”?
But yes, I must agree that the data sets as presently composed are “robust indicators of global change”… accurate temperatures, not so much…
But it admits that “further development” of the record is required “in particular to better assess the risks posed by changes in extremes of climate.”
OK, already has a conclusion in mind. We also see the newest dodge being trotted out “extremes of climate”. Well, unless we have a dinosaur pleasing tropical jungle in Antarctica or a mile thick ice sheet in New York, it just isn’t an ‘extreme of climate’. So how about dumping that idea.
How about “better and ACCURATE assessment of present weather trends, including 30 year and 60 year cycles of weather such as those driven by ocean oscillations”? You know, things like the PDO, which is twice as long as your broken definition of climate as “average of 30 years weather”. (It isn’t, by the way. Climate is your latitude, altitude, and distance from the ocean, with honorable mention for what major atmospheric and ocean currents you are near. Not 30 years of temperature data.)
The Sahara is a desert. It was a desert 100 years ago. It will be a desert 100 years from now. Tundra stays tundra. “Mediterranean Climate” has been Mediterranean for a few thousand years and will stay that way too. It was in the Iron Age Cold Period, Roman Optimum, The Little Ice Age, and even now, in the Modern Warm Period.
So drop the broken biased phrasing about changes of climate when you really mean 30 year weather cycles. Ok?
And “further development” is the last thing we need. I’d much rather see some un-developed raw data… Somehow I always thought a record simply “was” and did not require any “development”… One “develops” a disease or tin mine or even a “bad attitude”, but a record simply ought to be recorded…

As a result, the proposal says, “we feel that it is timely to propose an international effort to reanalyze surface temperature data

“Analysis” is a code word for ‘cook the books’ as near as I can tell. The raw data are run through the NCDC “analysis” and come out with a warming trend. The NCDC “unadjusted” (they avoid the “raw” word) data are run into the GIStemp “analysis” and come out completely mangled. What the ClimateGate CRU crew did was also an “analysis”…
Please, sir, might I have my temperature data set non-analyzed? And certainly not re-analyzed … once through the pooch was one time too many.
in collaboration with the World Meteorological Organization (WMO),
Oh GAK! So they will be consulting with the UN gang of powermongers and central state control advocates on what to do to the planet data series? I’m sure that will work out well and unbiased /sarcoff>
Please, Sirrah, might I have my data un-influenced by the WMO and related political hangers on?

which has the responsibility for global observing and monitoring systems for weather and climate.”

Pardon? I was under the impression that we had a global system composed of independent sovereign nations each running their own weather services. Has NOAA now been put under the “responsibility” of the UN?
IMHO, anything that comes out of a WMO driven process will be obtuse and untrustworthy, at best.
–”verifiable datasets starting from a common databank of unrestricted data”
So an “analyzed” dataset, but not the “unrestricted data” directly… Will the data be “unrestricted” to everyone, or only to the WMO-approved agencies?

–”methods that are fully documented in the peer reviewed literature and open to scrutiny;”

That would be more comforting if we knew that the “peers” doing the reviewing were not going to be just another re-grouping of the same “peers” who suborned the process to begin with.
How about a “public reviewed” process instead? Or even, just make the data freely available and stand back. Let a free market of ideas sort out the good approaches from the bad.

–”a set of independent assessments of surface temperature produced by independent groups using independent methods,”

Great. But who gets chosen by whom as the ‘approved’ independent voices? If it is a “Y’all Come!” and anyone can have the data, fine. If it’s another “Circle Of Three” like NCDC, CRU, and GISS all working hand in glove and claiming independence, well, no thanks.

–”comprehensive audit trails to deliver confidence in the results;”

That would be good.

–”robust assessment of uncertainties associated with observational error, temporal and geographical in homogeneities.”

There is that “robust” flag again. But if, by this phrase, they mean:
“Well-designed and fully reported error-bounds calculations and code review / data audit”, then great! If they just mean “our group of friends all agreed we can ‘homogenize’ the data and fabricate fill-in by our favorite ways, since it’s already ‘in the literature’ that we self-reviewed”, then no, please don’t waste time and money on the circus…
Basically, while I like the idea of a ‘do over’; this just smells like a “do show” for dealing with the “raised awareness” after ClimateGate…
We don’t need “bread and circuses” no matter how well peer reviewed and reanalyzed. We just need verifiable UN-processed data. We’ll work out the rest…

geo
February 23, 2010 9:30 am

I am cautiously –very cautiously– pleased. Fighting over analysis is healthy and expected. Let’s just all have access to the same data and knowledge of how it was collected and what was done to it before we get to the knife work on “what does it mean?”.

RockyRoad
February 23, 2010 9:36 am

Neil Hampshire (08:56:47) :
Who are the “Key Players”?
I hope it is not the “Hockey Team”
——-
Reply:
I’ve often said that a scientist without his data is no scientist. He is a soothsayer.
And who would ever hire a soothsayer to be a scientist?
I certainly wouldn’t.
Time to hire a completely new staff–people who are actually scientists.

TerrySkinner
February 23, 2010 9:36 am

How’s this for a radical suggestion: How about admitting that there is at the moment no scientific way to measure the temperature of the Earth (except perhaps to the extent satellites can be used).
Unless and until there is a worldwide grid of temperature measuring stations providing simultaneous real-time measurements covering ocean and land, mountain and desert, town and country, etc., any process and any measurement is flawed. It depends far too much on the underlying assumptions loaded into the process rather than the raw measurements.
Quite frankly at the moment all that can be measured is temperature in time series in particular places, not for the world as a whole. If something cannot be measured accurately it cannot be measured accurately.
With human temperature we can get a rough idea of whether somebody ‘has a temperature’ by putting a hand on their forehead. But this doesn’t give us degrees C or F and it certainly doesn’t give us measurements to a tenth of a degree and it would be bogus and unscientific to pretend otherwise. You wouldn’t measure world records without a high precision watch or other measuring device.
Same for the temperature of the Earth. If somebody wants to do this they have to set up a valid (and very expensive) way to do it. Otherwise they should come clean and admit that anything they report is at best very very approximate and likely to be biased in a number of ways by underlying assumptions and measurement locations.
And any ‘temperatures’ depending on tree ring or other proxies should be dumped.

ditmar
February 23, 2010 9:38 am

Perhaps a do-over will answer questions like the adjustments made to Anchorage, Siberia, CET (UK), CET (Prague), Finland, Darwin, NZ and all the others. Why is there no CO2-induced temperature rise in the U.S., despite it being the main CO2 emitter for about a century? Darwin was bad enough, but have you all seen Anchorage??? Explain that!

IsoTherm
February 23, 2010 9:42 am

This is really great news …. for the global warming believers!
Why? … Because it will drive a wedge into the sceptic community and sort between those who are sceptical because it is right to be sceptical, and those who would be sceptical of global warming even if lava were flowing down the road.
But seriously, I would have given anything to have been a fly on the wall as these guys realised that they had no choice but to do something like this!

RockyRoad
February 23, 2010 9:50 am

E.M.Smith (09:26:39) :
(…)
And “further development” is the last thing we need. I’d much rather see some un-developed raw data… Somehow I always thought a record simply “was” and did not require any “development”… One “develops” a disease or tin mine or even a “bad attitude”, but a record simply ought to be recorded…
——————
Reply:
Yet even if you’re developing a tin mine (or gold mine or any other type of mine), your raw sample data is the ultimate qualifier when making stepwise development decisions and particularly when you’re putting the project up for capitalization. At that point, the funding banks have independent consultants take the raw data and everything else you’ve done and do a full-blown model/plan/economic analysis for comparison before funds are allocated.
In climate science as in mining, raw data is the key.

February 23, 2010 9:52 am

IsoTherm (09:42:19),
Thanx for that example of alarmist thinking.
Yes, I would be a skeptic of CAGW even if lava were flowing down the road. But I wouldn’t be a skeptic of volcanic activity…
…while the alarmists would be blaming lava on global warming.

February 23, 2010 9:59 am

Unfortunately it is not searchable, as they still seem to be living in the typewriter age, having photoscanned the printed document.
Fortunately, the rest of us live in the OCR age. Output below. Have at it.
~~~~~~~~~~~~~~
PROPOSAL FOR A NEW INTERNATIONAL ANALYSIS OF LAND SURFACE AIR TEMPERATURE DATA
CONTENT OF DOCUMENT:
UK Met Office submits this document for consideration by the CCI session
Appendix:
• Proposal for a new international analysis of land surface air temperature data
~~
PROPOSAL FOR A NEW INTERNATIONAL ANALYSIS OF LAND SURFACE AIR
TEMPERATURE DATA
Submitted by UK Met Office
Executive summary
Surface temperature datasets are of critical importance for detecting, monitoring and communicating climate change. They are also essential for testing the validity of the climate models that are used to produce predictions of future climate change. The current datasets, constructed in the UK and US using different methodologies, agree in showing that the world is warming. Taken together these records provide a robust indicator of global change and form part of the evidence base that led the IPCC Fourth Assessment Report to conclude that “warming of the climate system is unequivocal”.
To meet future needs to better understand the risks of dangerous climate change and to adapt to the effects of global warming, further development of these datasets is required, in particular to better assess the risks posed by changes in extremes of climate. This will require robust and transparent surface temperature datasets at finer temporal fidelity than current products.
The current surface temperature datasets were first put together in the 1980s to the best standards of dataset development at that time; they are independent analyses and give the same results, thereby corroborating each other.
In the case of the CRU land surface temperature dataset (CRUTEM3, which forms the land component of the HadCRUT dataset) there are substantial IPR issues around the raw station data that underpin the dataset; we are actively pursuing resolution of these issues so that the base data can be made openly available. We know that several stations have already been explicitly forbidden from release by the rights’ holders so we will not be able to release all the under-pinning station data.
Consequently we have been considering how the datasets can be brought up to modern standards and made fit for the purpose of addressing 21st Century needs. We feel that it is timely to propose an international effort to reanalyze surface temperature data in collaboration with the World Meteorological Organization (WMO), which has the responsibility for global observing and monitoring systems for weather and climate.
The proposed activity would provide:
1. Verifiable datasets starting from a common databank of unrestricted data at both monthly and finer temporal resolutions (daily and perhaps even sub-daily);
2. Methods that are fully documented in the peer reviewed literature and open to scrutiny;
3. A set of independent assessments of surface temperature produced by independent groups using independent methods;
4. Robust benchmarking of performance and comprehensive audit trails to deliver confidence in the results;
5. Robust assessment of uncertainties associated with observational error, temporal and geographical inhomogeneities.
It is important to emphasize that we do not anticipate any substantial changes in the resulting global and continental-scale multi-decadal trends. This effort will ensure that the datasets are completely robust and that all methods are transparent.
Background
In many respects HadCRUT has been the default choice of surface dataset in all 4 IPCC Assessment Reports. However we must stress that other independent datasets are used which support the HadCRUT data. There are three centres which currently calculate global average temperature each month:
• Met Office, in collaboration with the Climatic Research Unit (CRU) at the University of East Anglia (UK);
• Goddard Institute for Space Studies (GISS), which is part of NASA (USA);
• National Climatic Data Center (NCDC), which is part of the National Oceanic and Atmospheric Administration (NOAA) (USA).
These groups work independently and use different methods in the way they process data to calculate the global average temperature. Despite this, the results of each are similar from month to month and year to year, and there is robust agreement on temperature trends from decade to decade.
All existing surface temperature datasets are homogenized at the monthly resolution, and are therefore suitable for characterizing multi-decadal trends. These are adequate for answering the pressing 20th Century questions of whether climate is changing and if so how. But they are fundamentally ill-conditioned to answer 21st Century questions such as how extremes are changing and therefore what adaptation and mitigation decisions should be taken. Monthly resolution data cannot verify model projections of extremes in temperature which by definition are (sub-) daily resolution events.
Through collaboration with NCDC we have two quality controlled, but not homogenized products at the daily and sub-daily resolution (HadGHCND and HadISD – the latter about to be submitted to peer review), spanning 1950 onwards and 1973 onwards respectively. However, because these are not homogenized, they may retain time-varying biases. It is an open scientific question as to whether homogenization is feasible at these timescales whilst retaining the true temporal characteristics of the record. In particular, seasonally invariant adjustments which are adequate for monthly timescale data will be grossly inadequate at the daily or sub-daily resolution. Clearly homogenization of these data is highly desirable but some detailed research is needed to define the best approach.
The way forward
Recognizing that no single institution can undertake such a fundamental data collection, re-analysis and verification process single-handedly, we would envisage this as a broad community effort – a ‘grand challenge’ so to speak – involving UK and international partners.
The UK would convene a workshop to be hosted by the Met Office Hadley Centre and invite key players who could plausibly create such datasets with the aim of initiating an agreed community challenge to create an ensemble of open source land temperature datasets for the 21st Century both at monthly temporal resolution and also at the daily and sub-daily timescales needed to monitor extremes. Such an approach would help distribute many of the basic tasks, ensuring that the most appropriate parties were responsible for each part as well as providing a focused framework and timeline. This effort would ideally have involvement from, and be coordinated under, the umbrella of one or more of the Commission for Climatology, the Global Climate Observing System, or the World Climate Research Programme, with assistance from other WMO constituent bodies as appropriate.
Activities that would be required within any overall programme are:
1. Creation of an agreed international databank of surface observations to be made available without restriction, akin to the ICOADS databank in the ocean domain. Note that NCDC already have substantial efforts in this regard and would be a key participant and likely host as the designated world data bank. Data to be available at monthly, daily and sub-daily resolutions;
2. Multiple independent groups undertake efforts to create datasets at various temporal resolutions based upon this data-bank. Participants will be required to create a full audit trail and publish their methodology in the peer-reviewed literature. Strong preference will be given to automated systems and creations of ensembles that reflect the uncertainties in the observations and methods;
3. One or more groups to create realistic test-cases of the spatio-temporal observational availability by sampling output from a range of climate simulations from a number of models, adding realistic error structures;
4. Groups to run their algorithms against the test-cases and one or more groups, preferably completely independent, to undertake a holistic assessment based upon the results of this verification exercise from all groups.
###
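The verification exercise in activities 3 and 4 of the proposal can be sketched in a few lines. To be clear, everything below is invented for illustration and is not from the proposal: the trend value, the error model (a single undocumented station step change), and the naive anomaly-averaging "analysis" are all assumptions, chosen only to show the shape of a test-against-known-truth benchmark.

```python
# Toy benchmark: build synthetic station records with a known trend,
# inject an error structure, then see how well a simple averaging
# algorithm recovers the truth. All numbers here are illustrative.
import random
import statistics

random.seed(42)

N_STATIONS = 50
N_YEARS = 60
TRUE_TREND = 0.02  # degrees C per year, an arbitrary assumed value

def make_station(trend):
    """Synthetic annual series: trend + station offset + weather noise."""
    offset = random.gauss(0.0, 5.0)  # the station's own "climatology"
    return [offset + trend * yr + random.gauss(0.0, 0.5)
            for yr in range(N_YEARS)]

def corrupt(series):
    """Inject one undocumented step change (e.g. a station move)."""
    step_year = random.randrange(10, N_YEARS - 10)
    step = random.gauss(0.0, 0.3)
    return [t + (step if yr >= step_year else 0.0)
            for yr, t in enumerate(series)]

def anomaly_average(stations):
    """Naive analysis: anomalies vs. each station's own mean, then average."""
    anoms = [[t - statistics.mean(s) for t in s] for s in stations]
    return [statistics.mean(col) for col in zip(*anoms)]

def fitted_trend(series):
    """Ordinary least-squares slope against year index."""
    years = range(len(series))
    my, ms = statistics.mean(years), statistics.mean(series)
    num = sum((y - my) * (s - ms) for y, s in zip(years, series))
    den = sum((y - my) ** 2 for y in years)
    return num / den

truth = [make_station(TRUE_TREND) for _ in range(N_STATIONS)]
observed = [corrupt(s) for s in truth]
recovered = fitted_trend(anomaly_average(observed))
print(f"true trend: {TRUE_TREND:.3f} C/yr, recovered: {recovered:.3f} C/yr")
```

The point of the design is that the "truth" is known by construction, so any candidate homogenization or averaging method can be scored objectively on how well it recovers it, which is what the proposal's holistic assessment step would do across groups.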

Steve M. from TN
February 23, 2010 10:01 am

EM Smith: a great analysis as usual.
I’m wondering how long this “do over” is going to take.

February 23, 2010 10:02 am

We can write our own code to analyze the data, right?
My view is that this is what needs to be done urgently. Start the project now and release it under an open source license. The code doesn’t need to incorporate the adjustments and assumptions, but could include the ability to see the results based on loading different sets of adjustments and assumptions.
This will clearly allow us to see the effects of different modelling techniques, and see which results hold under which assumptions.
Surely no one with a genuine interest in science, whether currently holding warmist, lukewarmist or skeptical views, would oppose such a project?
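For what it's worth, here is a toy sketch of what such pluggable-adjustment code might look like. The station numbers, the UHI offset, and the drop rule are all made up for illustration; the idea is just that the raw data stays untouched while every adjustment is an explicit, swappable function that can be audited and re-run.

```python
# Toy sketch: raw data stays fixed; adjustments are explicit, pluggable
# steps that anyone can inspect, replace, or remove. All values invented.
from statistics import mean

# A toy "raw" station record (degrees C); purely illustrative numbers.
raw = [14.1, 14.3, 13.9, 14.6, 14.8, 15.0, 14.7, 15.2]

def no_adjustment(series):
    """Identity: analyze the raw record as-is."""
    return list(series)

def urban_heat_offset(series, offset=-0.1):
    """Assumed example adjustment: subtract a constant UHI estimate."""
    return [t + offset for t in series]

def drop_first_two(series):
    """Assumed example rule: discard years flagged as unreliable."""
    return series[2:]

def run_analysis(series, adjustments):
    """Apply each documented adjustment in order, then summarize."""
    for adj in adjustments:
        series = adj(series)
    return round(mean(series), 3)

baseline = run_analysis(raw, [no_adjustment])
adjusted = run_analysis(raw, [urban_heat_offset, drop_first_two])
print("no adjustments:", baseline)
print("with adjustments:", adjusted)
```

Swapping in a different list of adjustment functions re-runs the whole analysis under a different set of assumptions, which is exactly the side-by-side comparison the comment asks for.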

martyn
February 23, 2010 10:03 am

“We feel that it is timely to propose an international effort to reanalyze surface temperature data”……
Reanalyze temperature data from what start point? Halfway up the ladder, I suspect; more fudging in the pipeline! Ah, what the hell, it’s only a proposal.

IsoTherm
February 23, 2010 10:16 am

Don’t worry Smokey, the believers are also having their problems. Manmade global warming has lost its scare factor, and has even become a bit comical (what did the actress say to the global warming… want a snowjob?).
It doesn’t have the kudos they need to tug at the environmental guilt strings and pull in the cash. Likewise, the wind lobby have seen the writing on the wall for manmade global warming, and are right this minute opening up the file marked “plan B — it’s the end of oil”.
Moreover, a lot of environmental campaigns have been completely sidelined by the guys running the warming alarmism campaign. These environmentalists are getting pretty sick and tired of losing the limelight to this stupid manmade warming, and there are probably more environmentalists ready to stick the knife into the AGW campaign than there are sceptics.
This next year is going to be fun!