Met Office pushes a surface temperature data “do over”

From Fox News, word that the Met Office has circulated a proposal to start over completely with raw surface temperature data in a transparent process.

Here’s the proposal from the Met Office metoffice_proposal_022410 (PDF). Unfortunately it is not searchable, as they still seem to be living in the typewriter age, having photoscanned the printed document.

I’d feel better about it, though, if they hadn’t used the word “robust”. Every time I see that word in the context of climate data it makes me laugh. It seems, though, that they have already concluded the effort will find no new information. Given that they are apparently only interested in ending the controversy over transparency, and because GHCN (the source for GISS and HadCRUT) originates at NCDC, which has its own set of problems and is controlled by one man, Dr. Thomas Peterson, we’ll have our work cut out for us again. In my opinion, this proposal is CYA and does not address the basic weaknesses of the data collection.

Britain’s Weather Office Proposes Climate-Gate Do-Over

By George Russell.

At a meeting on Monday of about 150 climate scientists, representatives of Britain’s weather office quietly proposed that the world’s climatologists start all over again to produce a new trove of global temperature data that is open to public scrutiny and “rigorous” peer review.

After the firestorm of criticism called Climate-gate, the British government’s official Meteorological Office apparently has decided to wave a white flag and surrender.

At a meeting on Monday of about 150 climate scientists in the quiet Turkish seaside resort of Antalya, representatives of the weather office (known in Britain as the Met Office) quietly proposed that the world’s climate scientists start all over again on a “grand challenge” to produce a new, common trove of global temperature data that is open to public scrutiny and “rigorous” peer review.

In other words, conduct investigations into modern global warming in a way that the Met Office bureaucrats hope will end the mammoth controversy over the world temperature data they collected, a controversy stirred up by their secretive and erratic ways.

The executive summary of the Met Office proposal to the World Meteorological Organization’s Committee for Climatology was obtained by Fox News. In it, the Met Office defends its controversial historical record of temperature readings, along with similar data collected in the U.S., as a “robust indicator of global change.” But it admits that “further development” of the record is required “in particular to better assess the risks posed by changes in extremes of climate.”

As a result, the proposal says, “we feel that it is timely to propose an international effort to reanalyze surface temperature data in collaboration with the World Meteorological Organization (WMO), which has the responsibility for global observing and monitoring systems for weather and climate.”

The new effort, the proposal says, would provide:

–“verifiable datasets starting from a common databank of unrestricted data”
–“methods that are fully documented in the peer reviewed literature and open to scrutiny;”
–“a set of independent assessments of surface temperature produced by independent groups using independent methods,”
–“comprehensive audit trails to deliver confidence in the results;”
–“robust assessment of uncertainties associated with observational error, temporal and geographical inhomogeneities.”


The Met Office proposal asserts that “we do not anticipate any substantial changes in the resulting global and continental-scale … trends” as a result of the new round of data collection. But, the proposal adds, “this effort will ensure that the data sets are completely robust and that all methods are transparent.”

Despite the bravado, those precautions and benefits are almost a point-by-point surrender by the Met Office to the accusations that have been leveled at its Hadley Climate Centre in East Anglia, which had stonewalled climate skeptics who demanded to know more about its scientific methods. (An inquiry established that the institution had flouted British freedom of information laws in refusing to come up with the data.)

When initially contacted by Fox News to discuss the proposal, its likely cost, how long it would take to complete, and its relationship to the Climate-gate scandal, the Met Office declared that no press officers were available to answer questions. After a follow-up call, the Office said it would answer soon, but did not specify when. At the time of publication, Fox News had not heard back.

The Hadley stonewall began to crumble after a gusher of leaked e-mails revealed climate scientists, including the center’s chief, Phil Jones, discussing how to keep controversial climate data out of the hands of skeptics and opposing scientific viewpoints out of peer-reviewed journals, and bemoaning that their climate models failed to account for more than a decade of stagnation in global temperatures. Jones later revealed that key temperature datasets used in Hadley’s predictions had been lost, and could not be retrieved for verification.

Jones stepped down temporarily after the British government announced an ostensibly independent inquiry into the still-growing scandal, but that only fanned the flames, as skeptics pointed out ties between several panel members and the Hadley Centre. In an interview two weeks ago, Jones also admitted that there has been no “statistically significant” global warming in the past 15 years.

The Met Office’s shift in position could be a major embarrassment for British Prime Minister Gordon Brown, who as recently as last month declared that climate skeptics were “flat-earthers” and “anti-science” for refusing to accept that man-made activity was a major cause of global warming. Brown faces a tough election battle for his government, perhaps as early as May.

It is also a likely blow to Rajendra Pachauri, head of the United Nations-backed Intergovernmental Panel on Climate Change (IPCC), whose most recent report, published in 2007, has been exposed by skeptics as rife with scientific errors, larded with un-reviewed and non-scientific source materials, and marred by other failings.

As details of the report’s sloppiness emerged, the ranks of skeptics swelled to include larger numbers of the scientific community, including weather specialists who worked on the sprawling IPCC report. Calls for Pachauri’s resignation have come from organizations as normally opposed as the Competitive Enterprise Institute and the British chapter of Greenpeace. So far, he has refused to step down.

The Met Office proposes that the new international effort to recalibrate temperature data start at a “workshop” hosted by Hadley. The Met Office would invite “key players” to start the “agreed community challenge” of creating the new datasets.

Then, in a last defense of its old ways, the Met proposal argues that its old datasets “are adequate for answering the pressing 20th Century questions of whether climate is changing and if so how. But they are fundamentally ill-conditioned to answer 21st Century questions such as how extremes are changing and therefore what adaptation and mitigation decisions should be taken.”

Those “21st Century questions” are not small, and answering them is very far from cheap. At Copenhagen, wealthy nations were being asked to spend trillions of dollars on them, a deal that fell through only when China, India, and other nearly developed nations refused to join the mammoth climate-control pact.

The question after the Met Office’s shift in stance may be whether environmentalists eager to move those mountains of cash are also ready to stand down until the 21st century questions get 21st century answers.

=========================

h/t to Dr. Richard North, EU Referendum


150 thoughts on “Met Office pushes a surface temperature data ‘do over’”

  1. “a new trove of global temperature data that is open to public scrutiny and “rigorous” peer review”

    If that means the data points can be resolved to the raw historic record then that sounds great but it won’t be a trivial task.

    “Where did I put those fudge factors?”

    And remembering the fudge factors is only the start, then they have to document them – plenty of money required to make this happen …

  2. Thank God for this.

    Its the only way to settle things and get everybody back on track and on the same side (where I think we all want to be).

  3. I have no doubt that their new findings will prove beyond a shadow of doubt that it’s much worse than we thought. I’d love to be optimistic about this, but I’ve seen too much from these “scientists” to get my hopes up.

  4. –”a set of independent assessments of surface temperature produced by independent groups using independent methods,”

    Independence must be strictly defined or it will be more of the same. It cannot be the scientific peer review process. More structured and rigorous oversight is needed, similar to what is done in the nuclear industry.

  5. High fives all around!

    I am hopeful that the new data set and analysis will actually live up to the stated goals and set the new standard for transparency in the field.

  6. well…if they really start from original (not adjusted) temp data, really include all the stations, really publish the adjustment algorithms, really honestly evaluate the results and really make the process transparent, it will be a good thing.

    Otherwise it’s just another whitewash.

  7. Not a bad thing at all really. The record will now be “peer reviewed” by the community as it should be. Any dodgy adjustments duly noted. All this is assuming that the raw data is made available and that the entire process is transparent. If not it is a waste of time.

  8. lol, I have the same reaction to the word. Robust now has a whole new connotation.

    While the fight is far from over, this is a win. Transparent scientific inquiry. Whodda thunk it?

  9. They have already pre-determined that the new dataset will be little different from the old ones which clearly are robust. A pre-determined outcome after several years and millions of funding.

  10. I’m not sure it will settle things but at least everyone will be arguing over what the data means and what should be done with it as opposed to arguing over who is lying about what. A step in the right direction if you ask me since I think the fight has gotten entirely too personal.

  11. Isn’t this what it should have been all along? I mean a publicly available, documented, replicable, open and free database. The worry is that there will be a huge wrangling over who holds the final authority over what is the “right” number for any given observation.

    It is just as important to establish an impartial (as possible) panel to adjudicate the final numbers, which reflect observations adjusted to account for as many of the variables affecting them as practicable, for comparative purposes. In ALL cases the raw dataset and details of the related collection sites should be available. Reminds me of a lyric in a song: “Who do you trust?”.

  12. So they are going to investigate every weather station on a regular basis to determine how valid the data is? And when they don’t get the results that they desire, what happens then? I doubt that everything will be as transparent as they are making it out to be.

  13. “Jones later revealed that key temperature datasets used in Hadley’s predictions had been lost, and could not be retrieved for verification.”

    “the Met proposal argues that its old datasets ‘are adequate for answering the pressing 20th Century questions’”

    So which one is it?

  14. “…The Met Office proposal asserts that “we do not anticipate any substantial changes in the resulting global and continental-scale … trends” as a result of the new round of data collection…”

    So they already have a preconceived notion of the results. Nice.

    Is this going to be the same so-called “transparent” process we saw with the IPCC?

  15. Open source it if they want real transparency, just like Anthony’s Surface Stations. Then leave it for review and criticism on blogs. Then peer review it with a public debate between peers prior to the next step.

  16. This has to be a good development if it reveals the raw data for each and every station and if it properly details the extent of and explains the reasons behind any and all adjustment/harmonisation made to the raw data. It will be interesting to see how they deal with station drop outs.

  17. Perhaps, Mr Watts, you should tender with a few other ‘independent practitioners’ to ensure ‘appropriate amounts of independent monitoring’?

    Seems to me that there is a role for:
    1. Monitoring and grading current stations.
    2. Determining optimal location and frequency of stations.
    3. Installing new stations according to global best practice.
    4. Monitoring data collection methods and execution.

  18. This shows the new meaning of “robust” is flimsy and the new meaning of “transparent” is well hidden. I expect more of the same, with more people on the payroll! This is actually a good way to receive more funding and increase the staff.

  19. A “relook” or a “new raw data open access methodology” accomplishes nothing without a major housecleaning of the MET, the NCDC, and every other “official” agency involved. Sorry to be a bucket of cold water, but those in charge now have shown by their prior action that they are not to be trusted again and “their” solution is no solution to the fundamental problem of untrustworthiness. I vote “NO CONFIDENCE”!! Throw the bums out and use a clean broom to sweep their offices spic n’ span.

    PS: Brown & Co. should also get the heave-ho!

  20. Who are these 150 climate scientists?
    Who paid for them to convene in Turkey and under what umbrella?
    Is this meeting a reaction to Climategate and to discuss reactions?

  21. Who is going to give the Met the money to do this? Why would that entity give that money to the Met? Why are the people that effed it up in the first place being given the opportunity to do it again?

  22. If we really wanted anyone to “start all over again to produce a new trove of global temperature data”, why use those who already have a record of bias?

    With kind regards,
    Oliver K. Manuel

  23. The “back to square one” reassessment sounds very hopeful but is it:

    1. A delaying tactic to keep the Government research funding rolling for as long as possible?

    2. A delaying tactic to ensure that National policies to combat AGW proceed so far down the line that it would be even more difficult (and possibly politically suicidal) to do a huge U-turn than if it was done in the near future?

  24. I’m with John (09:02), what are 150 staff of the British Met Office doing in Turkey, and on whose dime are they there?
    It better not be mine.

  25. Since it was the Met Office’s lack of oversight and incompetence that resulted in the existing datasets being flawed, any new dataset should be completely removed from the hands of the Met Office and CRU. It should be put out to competitive tender. It is not a job that needs Met Office or UEA staff to do at great expense. A few competent statisticians and software developers could do the job.

    Why should the taxpayers pay the Met Office and CRU again because of their incompetence? The Met Office and CRU should have their funding reduced to pay for the new database.

  26. This “linkable” proposal is actually a superb response to the flurry of anticipated FOIA requests its announcement will otherwise spawn.

    However, the discussions that went into drafting the proposal are not so transparent.

    Does this mean the old “raw” data is available somewhere, in some format?

  27. They know now that the value-added calculations will have to occur before the temperatures leave the Stevenson screen.

  28. It will be important to establish the criteria for the temperature dataset do-over at the beginning. Otherwise it will be a double waste of money. Whoever is charged with the mission should get input from all interested parties before approving a plan.

    Just off the top of my head:
    1. archive of original records (digital images of the forms and photographs/diagrams of the sites)
    2. meta-data on station siting and changes
    3. full description of raw data audit rules
    4. processing algorithms and code
    5. equipment specifications and effects of instruments on temperatures (CRS paint, sensor drift, etc.)
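
    The five items above amount to a per-station audit record. A minimal sketch of what such a record might look like (with invented field names and values purely for illustration, not any agency’s actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class StationRecord:
    """One station's audit-trail entry (hypothetical schema, illustration only)."""
    station_id: str
    raw_scans: list = field(default_factory=list)        # 1. images of original forms and sites
    siting_metadata: dict = field(default_factory=dict)  # 2. location, moves, surroundings
    audit_rules: list = field(default_factory=list)      # 3. raw-data QC rules applied
    code_ref: str = ""                                   # 4. processing algorithm / code version
    equipment: dict = field(default_factory=dict)        # 5. sensor and screen specifications

# Example entry (all values invented):
rec = StationRecord(
    station_id="EX-0001",
    raw_scans=["scans/EX-0001/1936-07.png"],
    siting_metadata={"lat": 50.72, "lon": -3.47, "moves": ["1961-04-01"]},
    audit_rules=["reject T > 60 C", "flag runs of identical values"],
    code_ref="homogenize.py@v0.1",
    equipment={"screen": "Stevenson", "sensor": "mercury-in-glass", "paint": "whitewash"},
)
print(rec.station_id, len(rec.audit_rules))  # → EX-0001 2
```

    The exact format matters less than the principle: every published temperature should be traceable back through such a record to the original observation.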

  29. If genuine, it would be welcomed.

    But it sounds to me like partisans trying to justify their position with a flag wrapping party… Some buzz words / phrases that got my attention:

    The executive summary of the Met Office proposal to the World Meteorological Organization’s Committee for Climatology was obtained by Fox News. In it, the Met Office defends its controversial historical record of temperature readings, along with similar data collected in the U.S., as a “robust indicator of global change.”

    There is that “robust” red flag again. Nothing wrong with the word, except it has been used too often by too many to mean “Our guesses are strong, trust me”. What’s wrong with just “find the truth”?

    But yes, I must agree that the data sets as presently composed are “robust indicators of global change”… accurate temperatures, not so much…

    But it admits that “further development” of the record is required “in particular to better assess the risks posed by changes in extremes of climate.”

    OK, already has a conclusion in mind. We also see the newest dodge being trotted out: “extremes of climate”. Well, unless we have a dinosaur-pleasing tropical jungle in Antarctica or a mile-thick ice sheet in New York, it just isn’t an ‘extreme of climate’. So how about dumping that idea.

    How about “better and ACCURATE assessment of present weather trends, including 30 year and 60 year cycles of weather such as those driven by ocean oscillations”? You know, things like the PDO, which is twice as long as your broken definition of climate as “average of 30 years weather”. (It isn’t, by the way. Climate is your latitude, altitude, and distance from the ocean, with honorable mention for which major atmospheric and ocean currents you are near. Not 30 years of temperature data.)

    The Sahara is a desert. It was a desert 100 years ago. It will be a desert 100 years from now. Tundra stays tundra. “Mediterranean Climate” has been Mediterranean for a few thousand years and will stay that way too. It was in the Iron Age Cold Period, Roman Optimum, The Little Ice Age, and even now, in the Modern Warm Period.

    So drop the broken biased phrasing about changes of climate when you really mean 30 year weather cycles. Ok?

    And “further development” is the last thing we need. I’d much rather see some un-developed raw data… Somehow I always thought a record simply “was” and did not require any “development”… One “develops” a disease or tin mine or even a “bad attitude”, but a record simply ought to be recorded…

    As a result, the proposal says, “we feel that it is timely to propose an international effort to reanalyze surface temperature data

    “Analysis” is a code word for ‘cook the books’ as near as I can tell. The raw data are run through the NCDC “analysis” and come out with a warming trend. The NCDC “unadjusted” (they avoid the “raw” word) data are run into the GIStemp “analysis” and come out completely mangled. What the ClimateGate CRU crew did was also an “analysis”…

    Please, sir, might I have my temperature data set non-analyzed? And certainly not re-analyzed … once through the pooch was one time too many.

    in collaboration with the World Meteorological Organization (WMO),

    Oh GAK! So they will be consulting with the UN gang of powermongers and central state control advocates on what to do to the planet data series? I’m sure that will work out well and unbiased /sarcoff>

    Please, Sirrah, might I have my data un-influenced by the WMO and related political hangers on?

    which has the responsibility for global observing and monitoring systems for weather and climate.”

    Pardon? I was under the impression that we had a global system composed of independent sovereign nations each running their own weather services. Has NOAA now been put under the “responsibility” of the UN?

    IMHO, anything that comes out of a WMO driven process will be obtuse and untrustworthy, at best.

    –”verifiable datasets starting from a common databank of unrestricted data”

    So an ‘analyzed’ dataset, but not the “unrestricted data” directly… Will the data be “unrestricted” to everyone, or only to the WMO-approved agencies?

    –”methods that are fully documented in the peer reviewed literature and open to scrutiny;”

    That would be more comforting if we knew that the “peers” doing the reviewing were not going to be just another re-grouping of the same “peers” who suborned the process to begin with.

    How about a “public reviewed” process instead? Or even, just make the data freely available and stand back. Let a free market of ideas sort out the good approaches from the bad.

    –”a set of independent assessments of surface temperature produced by independent groups using independent methods,”

    Great. But who gets chosen by whom as the ‘approved’ independent voices? If it is a “Y’all Come!” and anyone can have the data, fine. If it’s another “Circle Of Three” like NCDC, CRU, and GISS all working hand in glove and claiming independence, well, no thanks.

    –”comprehensive audit trails to deliver confidence in the results;”

    That would be good.

    –”robust assessment of uncertainties associated with observational error, temporal and geographical inhomogeneities.”

    There is that “robust” flag again. But if, by this phrase, they mean:

    “Well designed and fully reported error-bounds calculations and code review / data audit”, then great! If they just mean “our group of friends all agreed we can ‘homogenize’ the data and fabricate fill-ins by our favorite ways, since it’s already ‘in the literature’ that we self-reviewed”, then no, please don’t waste time and money on the Circus…

    Basically, while I like the idea of a ‘do over’, this just smells like a “do show” for dealing with the “raised awareness” after ClimateGate…

    We don’t need “bread and circuses” no matter how well peer reviewed and reanalyzed. We just need verifiable UN-processed data. We’ll work out the rest…

  30. I am cautiously –very cautiously– pleased. Fighting over analysis is healthy and expected. Let’s just all have access to the same data and knowledge of how it was collected and what was done to it before we get to the knife work on “what does it mean?”.

  31. Neil Hampshire (08:56:47) :

    Who are the “Key Players”?
    I hope it is not the “Hockey Team”
    ——-
    Reply:
    I’ve often said that a scientist without his data is no scientist. He is a soothsayer.

    And who would ever hire a soothsayer to be a scientist?

    I certainly wouldn’t.

    Time to hire a completely new staff–people who are actually scientists.

  32. How’s this for a radical suggestion: How about admitting that there is at the moment no scientific way to measure the temperature of the Earth (except perhaps to the extent satellites can be used).

    Unless and until there is a world wide grid of temperature measuring stations providing simultaneous real time measurements covering ocean and land, mountain and desert, town and country etc any process and any measurement is flawed. It depends far too much on the underlying assumptions loaded into the process rather than the raw measurements.

    Quite frankly at the moment all that can be measured is temperature in time series in particular places, not for the world as a whole. If something cannot be measured accurately it cannot be measured accurately.

    With human temperature we can get a rough idea of whether somebody ‘has a temperature’ by putting a hand on their forehead. But this doesn’t give us degrees C or F and it certainly doesn’t give us measurements to a tenth of a degree and it would be bogus and unscientific to pretend otherwise. You wouldn’t measure world records without a high precision watch or other measuring device.

    Same for the temperature of the Earth. If somebody wants to do this they have to set up a valid (and very expensive) way to do it. Otherwise they should come clean and admit that anything they report is at best very very approximate and likely to be biased in a number of ways by underlying assumptions and measurement locations.

    And any ‘temperatures’ depending on tree ring or other proxies should be dumped.

  33. Perhaps a do over will answer questions like the adjustments made to Anchorage, Siberia, CET (UK), CET (Prague), Finland, Darwin, NZ and all the others. Why is there no CO2-induced warming in the U.S. despite it being the main CO2 emitter for about a century? Darwin was bad enough, but have you all seen Anchorage??? Explain that!

  34. This is really great news …. for the global warming believers!

    Why? … Because it will drive a wedge into the sceptic community and sort between those who are sceptical because it is right to be sceptical, and those who would be sceptical of global warming even if lava were flowing down the road.

    But seriously, I would have given anything to have been a fly on the wall as these guys realised that they had no choice but to do something like this!

  35. E.M.Smith (09:26:39) :
    (…)
    And “further development” is the last thing we need. I’d much rather see some un-developed raw data… Somehow I always thought a record simply “was” and did not require any “development”… One “develops” a disease or tin mine or even a “bad attitude”, but a record simply ought to be recorded…
    ——————
    Reply:
    Yet even if you’re developing a tin mine (or gold mine or any other type of mine), your raw sample data is the ultimate qualifier when making stepwise development decisions, and particularly when you’re putting the project up for capitalization. At that point, the funding banks have independent consultants take the raw data and everything else you’ve done and do a full-blown model/plan/economic analysis for comparison before funds are allocated.

    In climate science as in mining, raw data is the key.

  36. IsoTherm (09:42:19),

    Thanx for that example of alarmist thinking.

    Yes, I would be a skeptic of CAGW even if lava were flowing down the road. But I wouldn’t be a skeptic of volcanic activity…

    …while the alarmists would be blaming lava on global warming.

  37. Unfortunately it is not searchable, as they still seem to be living in the typewriter age, having photoscanned the printed document.

    Fortunately, the rest of us live in the OCR age. Output below. Have at it.

    ~~~~~~~~~~~~~~

    PROPOSAL FOR A NEW INTERNATIONAL ANALYSIS OF LAND SURFACE AIR TEMPERATURE DATA

    CONTENT OF DOCUMENT:
    UK Met Office submits this document for consideration by the CCI session

    Appendix:

    • Proposal for a new international analysis of land surface air temperature data

    ~~

    PROPOSAL FOR A NEW INTERNATIONAL ANALYSIS OF LAND SURFACE AIR
    TEMPERATURE DATA

    Submitted by UK Met Office

    Executive summary

    Surface temperature datasets are of critical importance for detecting, monitoring and communicating climate change. They are also essential for testing the validity of the climate models that are used to produce predictions of future climate change. The current datasets, constructed in the UK and US using different methodologies, agree in showing that the world is warming. Taken together these records provide a robust indicator of global change and form part of the evidence base that led the IPCC Fourth Assessment Report to conclude that “warming of the climate system is unequivocal”.

    To meet future needs to better understand the risks of dangerous climate change and to adapt to the effects of global warming, further development of these datasets is required, in particular to better assess the risks posed by changes in extremes of climate. This will require robust and transparent surface temperature datasets at finer temporal fidelity than current products.

    The current surface temperature datasets were first put together in the 1980s to the best standards of dataset development at that time; they are independent analyses and give the same results, thereby corroborating each other.

    In the case of the CRU land surface temperature dataset (CRUTEM3, which forms the land component of the HadCRUT dataset) there are substantial IPR issues around the raw station data that underpin the dataset; we are actively pursuing resolution of these issues so that the base data can be made openly available. We know that several stations have already been explicitly forbidden from release by the rights holders, so we will not be able to release all the underpinning station data.

    Consequently we have been considering how the datasets can be brought up to modern standards and made fit for the purpose of addressing 21st Century needs. We feel that it is timely to propose an international effort to reanalyze surface temperature data in collaboration with the World Meteorological Organization (WMO), which has the responsibility for global observing and monitoring systems for weather and climate.

    The proposed activity would provide:

    1. Verifiable datasets starting from a common databank of unrestricted data at both monthly and finer temporal resolutions (daily and perhaps even sub-daily);

    2. Methods that are fully documented in the peer reviewed literature and open to scrutiny;

    3. A set of independent assessments of surface temperature produced by independent groups using independent methods;

    4. Robust benchmarking of performance and comprehensive audit trails to deliver confidence in the results;

    5. Robust assessment of uncertainties associated with observational error, temporal and geographical inhomogeneities.

    It is important to emphasize that we do not anticipate any substantial changes in the resulting global and continental-scale multi-decadal trends. This effort will ensure that the datasets are completely robust and that all methods are transparent.

    Background

    In many respects HadCRUT has been the default choice of surface dataset in all 4 IPCC Assessment Reports. However we must stress that other independent datasets are used which support the HadCRUT data. There are three centres which currently calculate global average temperature each month:

    • Met Office, in collaboration with the Climatic Research Unit (CRU) at the University of East Anglia (UK);

    • Goddard Institute for Space Studies (GISS), which is part of NASA (USA);

    • National Climatic Data Center (NCDC), which is part of the National Oceanic and Atmospheric Administration (NOAA) (USA).

    These groups work independently and use different methods in the way they process data to calculate the global average temperature. Despite this, the results of each are similar from month to month and year to year, and there is robust agreement on temperature trends from decade to decade.

    All existing surface temperature datasets are homogenized at the monthly resolution, and are therefore suitable for characterizing multi-decadal trends. These are adequate for answering the pressing 20th Century questions of whether climate is changing and if so how. But they are fundamentally ill-conditioned to answer 21st Century questions such as how extremes are changing and therefore what adaptation and mitigation decisions should be taken. Monthly resolution data cannot verify model projections of extremes in temperature which by definition are (sub-) daily resolution events.

    Through collaboration with NCDC we have two quality controlled, but not homogenized products at the daily and sub-daily resolution (HadGHCND and HadISD – the latter about to be submitted to peer review), spanning 1950 onwards and 1973 onwards respectively. However, because these are not homogenized, they may retain time-varying biases. It is an open scientific question as to whether homogenization is feasible at these timescales whilst retaining the true temporal characteristics of the record. In particular, seasonally invariant adjustments which are adequate for monthly timescale data will be grossly inadequate at the daily or sub-daily resolution. Clearly homogenization of these data is highly desirable but some detailed research is needed to define the best approach.
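The point about seasonally invariant adjustments is easy to illustrate with a toy calculation (my own construction, not anything from the proposal): give a station a bias that appears only on hot days, then "correct" it with a single annual offset. The annual mean comes out right while the daily extremes stay wrong.

```python
# Toy illustration (invented numbers, not the Met Office's method):
# a warm-season-only bias "corrected" with a seasonally invariant offset.
import math

DAYS = 365
true_daily = [15 + 10 * math.sin(2 * math.pi * d / DAYS) for d in range(DAYS)]

# Hypothetical screen bias: +1 °C, but only when the true reading is hot.
biased = [t + (1.0 if t > 20 else 0.0) for t in true_daily]

# Seasonally invariant correction: subtract the mean annual bias everywhere.
mean_bias = sum(b - t for b, t in zip(biased, true_daily)) / DAYS
adjusted = [b - mean_bias for b in biased]

annual_error = sum(adjusted) / DAYS - sum(true_daily) / DAYS
extreme_error = max(adjusted) - max(true_daily)

print(round(annual_error, 3))   # ~0: the annual mean is repaired
print(round(extreme_error, 3))  # still positive: hottest days remain too hot
```

Monthly-mean products are untouched by this problem, which is exactly the proposal's point: adequate for trends, inadequate for extremes.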

    The way forward

    Recognizing that no single institution can undertake such a fundamental data collection, re-analysis and verification process single-handedly, we would envisage this as a broad community effort – a ‘grand challenge’ so to speak – involving UK and international partners.

    The UK would convene a workshop to be hosted by the Met Office Hadley Centre and invite key players who could plausibly create such datasets with the aim of initiating an agreed community challenge to create an ensemble of open source land temperature datasets for the 21st Century both at monthly temporal resolution and also at the daily and sub-daily timescales needed to monitor extremes. Such an approach would help distribute many of the basic tasks, ensuring that the most appropriate parties were responsible for each part as well as providing a focused framework and timeline. This effort would ideally have involvement from, and be coordinated under, the umbrella of one or more of the Commission for Climatology, the Global Climate Observing System, or the World Climate Research Programme, with assistance from other WMO constituent bodies as appropriate.

    Activities that would be required within any overall programme are:

    1. Creation of an agreed international databank of surface observations to be made available without restriction, akin to the I-COADS databank in the ocean domain. Note that NCDC already have substantial efforts in this regard and would be a key participant and likely host as the designated world data bank. Data to be available at monthly, daily and sub-daily resolutions;

    2. Multiple independent groups undertake efforts to create datasets at various temporal resolutions based upon this data-bank. Participants will be required to create a full audit trail and publish their methodology in the peer-reviewed literature. Strong preference will be given to automated systems and creation of ensembles that reflect the uncertainties in the observations and methods;

    3. One or more groups to create realistic test-cases of the spatio-temporal observational availability by sampling output from a range of climate simulations from a number of models, adding realistic error structures;

    4. Groups to run their algorithms against the test-cases and one or more groups, preferably completely independent, to undertake a holistic assessment based upon the results of this verification exercise from all groups.

    ###
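Activities 3 and 4 in the proposal amount to a benchmarking exercise: hide a known error in synthetic data and check whether the candidate algorithms recover it. A minimal sketch of the idea, with the numbers and the deliberately naive detector invented here for illustration:

```python
# Toy benchmark in the spirit of activities 3-4 (all details invented):
# synthesize a station record with a known step bias, then test a detector.
import random

random.seed(42)
N, TRUE_BREAK, STEP = 240, 150, 0.8   # 240 months; bias begins at month 150

truth = [0.002 * m for m in range(N)]                 # mild underlying trend
observed = [t + random.gauss(0, 0.1) for t in truth]  # measurement noise
observed = [x + (STEP if m >= TRUE_BREAK else 0.0)    # simulated station move
            for m, x in enumerate(observed)]

def detect_break(series):
    """Return the index maximizing the before/after difference of means."""
    best, best_gap = None, 0.0
    for k in range(12, len(series) - 12):             # skip the short edges
        before = sum(series[:k]) / k
        after = sum(series[k:]) / (len(series) - k)
        if abs(after - before) > best_gap:
            best, best_gap = k, abs(after - before)
    return best

found = detect_break(observed)
print(found)  # should land near TRUE_BREAK if the detector is any good
```

A real benchmark would sample model output with realistic spatio-temporal gaps and error structures, as the proposal says; the scoring step is the part that has to be done by an independent group.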

  38. We can write our own code to analyze the data, right?

    My view is that this is what needs to be done urgently. Start the project now and release it under an open source license. The code doesn’t need to incorporate the adjustments and assumptions, but could include the ability to see the results based on loading different sets of adjustments and assumptions.

    This will clearly allow us to see the effects of different modelling techniques, and see which results held under which assumptions.

    Surely no one with a genuine interest in science, whether currently holding warmist, luke-warmist or skeptical views, would oppose such a project?
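Comment 38's idea can be sketched in a few lines (the station values and the "urban" correction below are entirely invented): keep the raw data fixed and pass each set of adjustments in as a pluggable function, so anyone can rerun the global series under different assumptions.

```python
# Sketch of pluggable adjustment sets over fixed raw data (all data invented).
raw_stations = {
    "A": [14.1, 14.3, 14.2, 14.6],   # hypothetical annual means, degrees C
    "B": [11.8, 11.9, 12.1, 12.0],
}

def no_adjustment(name, series):
    return series

def urban_adjustment(name, series):
    # Example assumption: station "A" is urban; strip 0.05 C/year of UHI.
    if name == "A":
        return [t - 0.05 * i for i, t in enumerate(series)]
    return series

def global_mean(stations, adjust):
    """Average all stations after applying the supplied adjustment set."""
    adjusted = {n: adjust(n, s) for n, s in stations.items()}
    years = len(next(iter(adjusted.values())))
    return [sum(s[y] for s in adjusted.values()) / len(adjusted)
            for y in range(years)]

for adjust in (no_adjustment, urban_adjustment):
    print(adjust.__name__,
          [round(v, 3) for v in global_mean(raw_stations, adjust)])
```

The point is the interface, not the numbers: with every adjustment set published as code, the effect of each assumption on the final series becomes a one-line comparison.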

  39. “We feel that it is timely to propose an international effort to reanalyze surface temperature data”……

    Reanalyze temperature data from what start point? Half way up the ladder I suspect, more fudging in the pipeline! Ah what the hell it’s only a proposal.

  40. Don’t worry Smokey, the believers are also having their problems. Manmade global warming has lost its scare factor, and has even become a bit comical (What did the actress say to the global warming … want a snowjob?)

    It doesn’t have the kudos they need to tug at the environmental guilt strings and pull in the cash. Likewise the wind lobby have seen the writing on the wall for manmade global warming, and are right this minute opening up the file marked: “plan B — it’s the end of oil”.

    Moreover, there are a lot of environmental campaigns that have been completely sidelined by the guys running the warming alarmism campaign. These environmentalists are also getting pretty sick and tired of losing the limelight to this stupid manmade warming, and there are probably more environmentalists ready to stick the knife into the AGW campaign than there are sceptics.

    This next year is going to be fun!

  41. This just sounds like a typical UK civil servant’s way of getting more funding before the AGW train finally hits the buffers.

    I quite agree. When I see the words ‘robust’, ‘vigorous’ or, for that matter, ‘stakeholder’ (very popular with the UK Labour government), you can be sure it means exactly the opposite.

  43. If they are smart they will invite McIntyre and Watts to the workshop.

    I don’t think that it’s wise to question the honesty of this process before it starts.

    THE SMART THING TO DO PEOPLE IS THIS.

    campaign for things that will give you TRUST in this process, like the inclusion of Watts and McIntyre in that workshop.

    Don’t criticize. Suggest additional measures that will give you trust. Take part in creating a science you can then trust

    I know Anthony and Steve. Both will use the opportunity to move science forward.

    But somebody on the other side has to suggest this, perhaps Judith Curry.

    If you get an opportunity to ask her nicely, please do so

  44. I thought the raw data had been thrown away. Funny. I wonder where it turned up?

    As for their remark about not expecting to find any differences this time round, then they either believe their own spin, in which case they may well approach this in an honest and transparent manner, or they don’t believe it. Then expect every kind of trick, to “hide the decline.”

  45. “… a meeting on Monday of about 150 climate scientists in the quiet Turkish seaside resort of Antalya …”

    Off to a good boondoggling start. There’s money to be spent!

    Recreating the data isn’t likely to accomplish anything. They still don’t know how to identify the portion of the record attributable to CO2 and what parts are just nature doing her thing. That’s why the other guys are using models – and ignoring the results that don’t match reality.

  46. I have zero confidence in their honesty to produce an unbiased dataset of any kind.
    I think they’ll rig it again. Leopards don’t change their spots.

  47. Let’s call a spade a spade here.

    The British Met Office has become a joke, and when it comes to doom-laden predictions nobody believes a word they say anymore. They are also faced in six weeks’ time with the humiliating prospect of being dropped by the BBC as the national weather forecasters of choice.

    Yes, indeed, extinction from the British public’s consciousness is staring the Met Office in the face. Being part of the global warming lying propaganda machine has its price, after all.

    So now they say they want to create a “common trove of global temperature data that is open to public scrutiny and “rigorous” peer review.” Well, frankly, I don’t believe them. We all know what happened to the last “common trove of global temperature data” they had. And let’s not forget how they fought tooth and nail to keep everyone away from what should have been publicly-accessible data. Broke the Freedom of Information laws, too, though I don’t notice any prosecutions over this shameful behaviour.

    As for being “open to public scrutiny and “rigorous” peer review.”, people should note that this access can be revoked at any time for any reason they want to give, such as when it doesn’t support their climate models. Also notice how they’ve left out any mention of the public having access to the methods by which they have been processing the input data, tweaking it, pulling it, pushing it, deleting bits of it, and generally torturing the data until it confesses and says what they want it to say.

  48. Chuckles (09:09:18) :

    I’m with John (09:02), what are 150 staff of the British Met Office doing in Turkey, and on whose dime are they there?
    It better not be mine.

    Don’t worry, mate, it’ll probably be ours:-)

    I welcome this news with a measure of caution, like those daredevil adventure movies when the hero says, “that was too easy, something’s wrong……..!” Do you get the idea? It could just be a put-up job, rather along the lines of the British comedy series Yes Minister when Civil Servant Sir Humphrey Appleby says to the Minister, “Never hold a Public Enquiry unless you know the outcome beforehand!”. As others have pointed out, they have pretty much concluded that the new dataset will be not much different from the old one, but then again I wouldn’t have expected it to be. Just what are those “adjustments” that they do to raw data that shows no significant warming, but does after the applied “adjustments”?

  49. I noticed Anthony mentions the PDF is not searchable. Does anyone have OCR software to convert the PDF to one with searchable text? It is probably easy enough to do that.

  50. Come on, can we stop complaining about the cost? Surely the reason we’ve got into this stupid mess is that we tried to get global temperature data on the cheap: we paid peanuts and got monkeys.

    And remember, although the Met has been part of the problem, the real scientists in the Met can be part of the solution. Again, the Met isn’t universally made up of paid-up members of the lunatic warming brigade. There are sensible people there, particularly the old hands who have seen the kinds of mistakes that can be made in forecasting, and who must be pretty miffed at the way the Met Office has been taken over by PR media consultants like the WWF head.

    Give these guys some decent temperature data and a management that aren’t trying to flog global warming, and I’ve no doubt they will see the error of their ways. Or as Michael Fish would put it:-

    “Earlier on today, apparently, a woman rang the BBC and said she heard there wasn’t global warming on the way… well, if you’re watching, don’t worry, there is”.

  51. R.S.Brown (09:14:31) :
    This “linkable” proposal is actually a superb response to the
    flurry of anticipated FOIA requests its announcement will
    otherwise spawn.

    However, the discussions that went into drafting
    the proposal are not so transparent.

    Does this mean the old “raw” data is available somewhere,
    in some format ?

    They always have been, unfortunately there are restrictions on access to some of them by the originating weather bureaux. That is why the Met Office is doing this via the WMO. Just because you’re granted the right to use some data, e.g. CRU, doesn’t give you the right to disseminate it (or even to keep it beyond the duration of the study).

  52. Alan the Brit (10:27:16) :
    Chuckles (09:09:18) :

    I’m with John (09:02), what are 150 staff of the British Met Office doing in Turkey, and on whose dime are they there?
    It better not be mine.

    Don’t worry, mate, it’ll probably be ours:-)

    Reading comprehension not John’s strong point apparently.

  53. Kate (10:27:14) :
    So now they say they want to create a “common trove of global temperature data that is open to public scrutiny and “rigorous” peer review.” Well, frankly, I don’t believe them. We all know what happened to the last “common trove of global temperature data” they had. And let’s not forget how they fought tooth and nail to keep everyone away from what should have been publicly-accessible data. Broke the Freedom of Information laws, too, though I don’t notice any prosecutions over this shameful behaviour.

    As far as I am aware none of your statements is true, so what you know isn’t very reliable.

  54. Oh, heck, we’re about to lose our funding to generate temperature data the old way. Let’s get more money to fix what we did with the last chunk of taxpayer money we got.

    A great scam if you can pull it off.

  55. “Britain’s Weather Office Proposes Climate-Gate Do-Over”

    Does that mean we’re going to get a brand new Climate-gate?

    Just a guess here, but will the new conclusion be: “wow, it’s worse than we thought”?

    To be legitimate the redo has to start with the real raw data, i.e. the handwritten temperature logs from the individual weather stations. That data then needs to be copied into databases without any adjustments.

  56. On the missing data.

    The story goes something like this. When Mc asked for the data, CRU responded that some of the data (the raw data) had gone missing. In the mails Jones says that he can recover the lost data by going back to the original sources. That’s my recollection.

    So it’s like this: Jones had received raw data from NWSs around the world. Over the course of years some of this data was lost. They will now go back to the original sources to request the data again.

    That’s a charitable reading of the situation.

  57. Pascvaks (08:50:43) :
    “A “relook” or a “new raw data open access methodology” accomplishes nothing without a major housecleaning of the MET, the NCDC, and every other “official” agency involved.”

    You are correct but you didn’t go far enough. Any remake is a waste of time and money.

    Surfacestations.org has established that the USHCN data is worthless and cannot be “cleaned up”. All Surfacestations.org did was show that about 10% of US stations are currently valid. There is no reason to assume that these same stations were valid 5, 10, 50, 100 or more years ago. We only have a snapshot, and to verify these stations were valid in the past we need to know what they looked like in the past, who took the measurements and how diligent they were. None of this is available. It may be an inconvenient truth but it is the truth.

    Let’s talk about the rest of the world. Sorry about being an American chauvinist, but if the US records are unacceptable today and, therefore, in the past, what makes you think that any other continent’s records were OK 100-plus years ago? This isn’t a court of law where you are presumed innocent until proven guilty. These records should be considered suspect until proven accurate, and that is simply not possible.

    What we are left with is satellite data that only goes back to 1979 and proxy data that needs to be proven accurate before it can be trusted. The fact that the “hide the decline trick” was used shows that the proxy data cannot be trusted. That’s the biggest issue with the “trick”. When the tree rings didn’t match the thermometer records they stopped using the tree rings, but ignored the fact that the divergence undermined the validity of using tree rings as an accurate proxy of past temperatures in the first place. Tree rings may be our best proxy, but that doesn’t make it acceptable, any more than I would bet my stock portfolio on women’s skirt lengths.

    Neither can the Medieval Warming Period or the Little Ice age be trusted. No one knows if they were worldwide. They certainly occurred in some places but that’s not good enough to project global temperatures.

    There is nothing wrong with admitting that we don’t know. It is a lot better than pretending that we do know something that we don’t really know.

  58. This looks like the 18 month review they announced in Dec but which was rejected later by the Prime Mentalist (old Gordon Flat Earther Broon). So either Gordon has changed his mind, or, with an election looming and a probable change of government, the Met Office’s Sir Humphrey Applebys are preparing for their new masters.

  59. Please see http://boballab.wordpress.com/2010/02/20/now-this-is-interesting-a-different-larger-dataset-at-ncdc/ (e.g. DATSAV2 is the official climatological database for surface observations).

    This link was originally posted at the Chiefio site (link on right bar). Apparently, about 13,000 stations worldwide are available with a “beginning” date of January 1, 1930, but this is not truly “raw” data. It has already gone through some quality assurance steps. See http://www1.ncdc.noaa.gov/pub/data/documentlibrary/tddoc/td9950.pdf for details.

    Apparently, Boballab has found the source for NCDC’s data (and probably everybody else’s as well). The Met Office needs to do nothing. The data collection work is already done. What the Met Office is going to do, and what climate scientists have been doing for 20 years, is cherry-picking data from DATSAV2. Why not try to use ALL of the data from DATSAV2? I would be willing to buy a copy to share with Anthony, if the purchase agreement allows, so that he can post it somewhere and make it publicly accessible. Doing this would be as monumental as finally making the Dead Sea Scrolls publicly accessible, after a generation of scientists carefully prevented anyone but a select few from doing scholarship on them.

  60. Yes, the word ‘robust’ has been forever tainted for me too.

    Others I cannot hear without associating them with AGW are ‘rigorous’ and ‘unequivocal’.

    You can also be sure there is either obfuscation or mendacity afoot when you hear a politician/AGWer use the words ‘clear’ or ‘clearly’.

  61. Copner (10:02:47) :

    If the raw data is available to everyone without restriction
    If the metadata is available to everyone without restriction
    If all processing code is made available without restriction

    Then people will have the ability to:

    1. Create their own series.
    2. Audit the work of others.
    3. Improve the work of others.

    Things to watch for:

    1. Raw data that really isn’t raw, but adjusted before it is sent to the Met.
    2. Pre-selection of raw data. Countries have many, many sources. One needs to be assured that the data supplied is in fact a random sample or a complete sample of all the data available. I can pick stations in the US from 1880 to present that show cooling.

    there’s more, but those are the biggest two.
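The pre-selection worry is easy to demonstrate with made-up numbers: a small network that warms as a whole can be made to "cool" simply by choosing which stations are admitted to the databank.

```python
# Invented toy network showing how station pre-selection can flip a trend.
def trend(series):
    """Ordinary least-squares slope per time step."""
    n = len(series)
    xm, ym = (n - 1) / 2, sum(series) / n
    num = sum((i - xm) * (y - ym) for i, y in enumerate(series))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

network = {
    "warming_1": [10.0, 10.2, 10.4, 10.7],
    "warming_2": [12.0, 12.1, 12.4, 12.5],
    "cooling_1": [15.0, 14.8, 14.7, 14.5],
}

# Trend of the full-network average vs. trend of one hand-picked station.
full = trend([sum(network[s][y] for s in network) / len(network)
              for y in range(4)])
picked = trend(network["cooling_1"])

print(round(full, 3), round(picked, 3))  # full network warms; picked one cools
```

This is why a complete (or demonstrably random) station sample, plus the metadata to audit it, matters as much as the raw values themselves.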

  62. “When initially contacted by Fox News to discuss the proposal, its likely cost, how long it would take to complete, and its relationship to the Climate-gate scandal, the Met Office declared that no press officers were available to answer questions”

    Does it not strike anyone else as slightly odd that “how long?” and “how much?” were not available as immediate answers? Or indeed had (I presume) not been mentioned in the original release? [Or is it just the cynic in me that says “how long?” is “until our pensions come through” and “how much?” is an excuse to ask for money which they won’t get, and thus provide the necessary delay to get to that pension date?!]

    Other than that, however much some of us might trust them only as far as we can throw them after all they’ve pulled in the past, the fact is it seems like they’re giving us everything we’ve asked for. Start with the raw data and then document everything you do to it. And think how unimaginable that would have been only a few short months ago. So, cut them no slack, but let’s try and do what we can to help speed this along, because I for one would like to know the truth behind those “adjustments”.

    I wonder what “Harry” is doing for the next 10 years… *grin*

  63. These are excellent ideas:

    We can write our own code to analyze the data, right?

    My view is that this is what needs to be done urgently. Start the project now and release it under an open source license.

    campaign for things that will give you TRUST in this process, like the inclusion of Watts and McIntyre in that workshop.

    Parallel analyses of the same source data can and should be done: it both helps verify the work and may answer questions that have been raised time and again by many here.

    And if the process is meant to counter loss of confidence in the integrity of the science, then reasonable requests for inclusion should be honored… and if they’re not, skeptics who have a high profile after the recent controversy should be in the media asking why.

  64. Phil. 10:59:02
    They always have been, unfortunately there are restrictions on access to some of them by the originating weather bureaux. That is why the Met Office is doing this via the WMO. Just because you’re granted the right to use some data, e.g. CRU, doesn’t give you the right to disseminate it (or even to keep it beyond the duration of the study).

    …which of course is why Phil Jones was briefing his freedom of information officer on why they shouldn’t release information to this bunch of fault-finding nitpickers because “they only want to try and find something wrong with it”. And why he was telling IPCC people to delete their emails. Absolutely, the only reason data wasn’t released was permissions – nobody was brazenly flouting the freedom of information law at all…

  65. They should just implement #1 and make all data available with comments about the source and what is known about its history.

    And even #1 specifies access to an “agreed databank” rather than access to all surface gathered observations. Granted the latter may be hard to organize.

    The other three steps are not clear. But it looks as if they will lead to a new international bureaucracy which will evolve into another IPCC. That is what bureaucracies do.

    Just organize the original data. Scientists – amateur and professional – will do the rest.

  66. The way I interpret it (but it may just be wishful thinking on my part), they are proposing something similar to the system set up for ARGO (http://www.argo.ucsd.edu/) where the raw data is collected and posted online and several different groups of researchers are tasked with the data analysis.

    If that’s the case, they should go further and bid out the data collection and storage part of the system to qualified data processing organizations and not to some group of academics that are totally ignorant of and couldn’t care less about data accuracy, data integrity, data backup, system reliability, system security, etc. All you have to do is look at the botched job the CRU has done the first go-round to justify keeping it out of academia. However, I have no argument with some university DP departments. There are a few competent ones; you just have to dig them out of the morass created by the majority.

    Following the raw data collection phase, I would hope that several independent, competing groups would be tasked with and funded for analyzing, correcting and adjusting the individual station records, with all code/calculations, intermediate data and results posted on-line.

    Concurrent with the data collection and station data analysis, several groups should be tasked and funded to perform physical audits of each weather station similar to those done by Anthony’s SurfaceStations.Org web site with a full report for each station posted on-line, to include any and all available station meta-data. This will also require the development of some form of non-subjective grading system for station siting and hardware reliability, repeatability, accuracy and calibration.

    While the above steps may solve the problem of recovering and verifying the historical climate data, they do nothing to fix the system for use in the 21st century. For the future, we can either upgrade the current system by slowly attacking each station and each problem, or perhaps a better, less expensive approach would be to design and build self powered, self contained ARGO type devices that would be inexpensive and capable of being both manually sited and air-dropped into remote locations. Perhaps they should even hire the original ARGO team to design and monitor a totally new system of weather stations. From what I have seen, the only money spent on climate research so far that has done anything other than provide jobs for the handicapped has been spent on the ARGO system.

    The Met proposal says that its old datasets “are adequate for answering the pressing 20th Century questions”.

    It’s a relief if that means they’re going to stop trying to answer stupid 21st Century questions.

    JonesII – Peer review doesn’t require Scotland Yard, it requires Arkham Asylum.

  68. “It could just be a put up job, rather along the lines of British comedy series Yes Minister when Civil Servant Sir Humphry Appleby says to the Minister, “Never hold a Public Enquiry unless you know the outcome beforehand!”.”

    I offer this interpretation of the nuanced language of the Met Office proposal, which, coming from a public body, should never be taken at face value. It appears that they are going to take action to enhance their assessment, but with a built-in assumption that there is still a substantial risk, i.e. “further development of the record is required in particular to better assess the risks posed by changes in extremes of climate”. No suggestion that one outcome might be that there is no risk and therefore no need for continuing funding.

    The other bullet points are also reinforcing the same sense that something is going to be done and in a fully open and accountable manner. The trigger words and phrases flow like warm honey:
    – verifiable datasets … unrestricted data
    – methods fully documented … peer reviewed literature … open to scrutiny
    – independent assessments … independent groups … independent methods
    – comprehensive audit trails …. confidence in the results

    The last point is a classic piece of civil service speak:
    – robust assessment of uncertainties associated with observational error, temporal and geographical inhomogeneities

    In other words it doesn’t matter what issues are thrown up by going over the data again, the spin on the data will be even stronger; strong enough to deflect the critics again.

    Reading between the lines of the Met Office proposal, we can see that the main intention might also be to prolong the study, so that ‘in the fullness of time and the due course of events’ people get tired of it and it drops off the horizon.

  69. Have Fox spiked this story? The link Britain’s Weather Office Proposes Climate-Gate Do-Over comes up with “No content item selected” on Fox.

    Also a bit concerned there’s no date, time etc. on the PDF.

  70. Phil. (11:06:00) :
    Kate (10:27:14) :
    “…As far as I am aware none of your statement is true, so what you know isn’t very reliable.”

    Not true? Not reliable?

    Oh, dear. Try this, for a start…

    ***************************************************************************
    How the Met Office blocked questions on its own man’s role in the “hockey stick” climate row

    The Meteorological Office is blocking public scrutiny of the central role played by its top climate scientist in a highly controversial report by the beleaguered United Nations Intergovernmental Panel on Climate Change.

    Professor John Mitchell, the Met Office’s Director of Climate Science, shared responsibility for the most worrying headline in the 2007 Nobel Prize-winning IPCC report – that the Earth is now hotter than at any time in the past 1,300 years. And he approved the inclusion in the report of the famous ‘hockey stick’ graph, showing centuries of level or declining temperatures until a steep 20th Century rise.

    By the time the 2007 report was being written, the graph had been heavily criticised by climate sceptics who had shown it minimised the ‘medieval warm period’ around 1000AD, when the Vikings established farming settlements in Greenland.

    In fact, according to some scientists, the planet was then as warm, or even warmer, than it is today. Early drafts of the report were fiercely contested by official IPCC reviewers, who cited other scientific papers stating that the 1,300-year claim and the graph were inaccurate.

    But the final version, approved by Prof Mitchell, the relevant chapter’s review editor, swept aside these concerns.

    Now, the Met Office is refusing to disclose Prof Mitchell’s working papers and correspondence with his IPCC colleagues in response to requests filed under the Freedom of Information Act.

    The block has been endorsed in writing by Defence Secretary Bob Ainsworth – whose department has responsibility for the Met Office.

    Documents obtained by The Mail on Sunday reveal that the Met Office’s stonewalling was part of a co-ordinated, legally questionable strategy by climate change academics linked with the IPCC to block access to outsiders.

    Last month, the Information Commissioner ruled that scientists from the Climatic Research Unit at the University of East Anglia – the source of the leaked ‘Warmergate’ emails – acted unlawfully in refusing FOI requests to share their data.

    Some of the FOI requests made to them came from the same person who has made requests to the Met Office.

    He is David Holland, an electrical engineer familiar with advanced statistics who has written several papers questioning orthodox thinking on global warming.

    The Met Office’s first response to Mr Holland was a claim that Prof Mitchell’s records had been ‘deleted’ from its computers.

    Later, officials admitted they did exist after all, but could not be disclosed because they were ‘personal’, and had nothing to do with the professor’s Met Office job.

    Finally, they conceded that this too was misleading because Prof Mitchell had been paid by the Met Office for his IPCC work and had received Government expenses to travel to IPCC meetings.

    The Met Office had even boasted of his role in a Press release when the report first came out.

    But disclosure, they added, was still rejected on the grounds it would ‘inhibit the free and frank provision of advice or the free and frank provision of views’.

    It would also ‘prejudice Britain’s relationship with an international organisation’ and thus be contrary to UK interests.

    In a written response justifying the refusal dated August 20, 2008, Mr Ainsworth – then MoD Minister of State – used exactly the same language.

    Mr Holland also filed a request for the papers kept by Sir Brian Hoskins of Reading University, who was the review editor of a different chapter of the IPCC report.

    When this too was refused, Mr Holland used the Data Protection Act to obtain a copy of an email from Sir Brian to the university’s information officer.

    The email, dated July 17, 2008 – when Mr Holland was also trying to get material from the Met Office and the CRU – provides clear evidence of a co-ordinated effort to hide data. Sir Brian wrote:

    ‘I have made enquiries and found that both the Met Office/MOD and UEA are resisting the FOI requests made by Holland. The latter are very relevant to us, as UK universities should speak with the same voice on this. I gather that they are using academic freedom as their reason.’

    At the CRU, as the Warmergate emails reveal, its director, Dr Phil Jones (who is currently suspended), wrote to an American colleague: “We are still getting FOI requests as well as Reading. All our FOI officers have been in discussions and are now using the same exceptions – not to respond.”

    Last night Benny Peiser, director of the Global Warming Policy Foundation, said the affair further undermined the credibility of the IPCC and those associated with it. He said: “It’s of critical importance that data such as this should be open. More importantly, the questions being raised about the hockey stick mean that we may have to reassess the climate history of the past 2,000 years. The attempt to make the medieval warm period disappear is being seriously weakened, and the claim that now is the warmest time for 1,300 years is no longer based on reliable evidence.”

    Despite repeated requests, the MoD and Met Office failed to comment.
    ***************************************************************************
    …Happy now?

  71. In a political world, open science is the only science.

    Maybe they could just start with some of the online work already done. I bet you could pay the groups a paltry few million and get it completed quickly.

    Yeah for open science.

  72. “To meet future needs to better understand the risks of dangerous climate change and to adapt to the effects of global warming, further development of these datasets is required, in particular to better assess the risks posed by changes in extremes of climate. This will require robust and transparent surface temperature datasets at finer temporal fidelity than current products.”

    Can someone interpret for me what they mean by “to better assess the risk posed by changes in extremes of climate?”

  73. If the Met Office is serious about having a rethink and starting again, it will only have credibility if leading sceptic scientists are included, such as Plimer and Carter in Australia, Lindzen and Singer in the US, etc., assuming they would agree to be involved.

  74. “At a meeting on Monday of about 150 climate scientists in the quiet Turkish seaside resort of Antalya, representatives of the weather office (known in Britain as the Met Office)” of course they couldn’t have the discussion in their own country, it had to be at a quiet foreign resort. I assume they took the Chunnel to get over the Channel.

  75. steven mosher (11:39:44) :

    I am very concerned with the state of data in the US from 1882 to 1914 for the West Coast. The records have that ‘patchwork’ look to them. They are in dire need of review with referees present. This is not the only period that looks tainted, but it is the most grievous.

  76. Pete Ballard (12:25:19) :

    The ‘risks’ to climate are the intended ‘remedies’ that are on the table to correct for a perceived condition that might not exist at all.
    The question is: you are in a vehicle going downhill, and you don’t know how it works. A curve is approaching. You have 3 pedals on the floor and only 1 chance to apply a correction. To which pedal will you now stick your foot: pedal 1, 2, 3, or will you try to ride it out?

    “Then, in a last defense of its old ways, the Met proposal argues that its old datasets ‘are adequate for answering the pressing 20th Century questions of whether climate is changing and if so how.’”

    The 21st Century shows no statistical warming, but that’s what they’ll review, just leave the 20th Century warming as is.

  78. I posted this info. quite a while ago at Climate Audit, but for anyone interested here, almost all of the old station records are available in book form at large university libraries in the Gov’t Docs section as Smithsonian Miscellaneous Collections Vol. 79, 90, and 105. From 1941 on, they are available as World Weather Records from the U.S. Dept. of Commerce in 10 year increments. The Vol. 79 is the only one I know of which is available online, and it is scanned in at archive.org. All the corrections are in the errata and stations notes of each volume.

    If you read Warwick Hughes, he has put the books containing the stations used by CRU online at his website. They are TR022 and TR027.

    A coherent online collation of all records would certainly be appreciated, but I suspect there are records being added all the time as old collections are uncovered and digitized.

  79. In addition to the suggestions above regarding making any code open source, I’d like to see:

    1. A database of scanned images of original records. Most of the old records are going to need to be transcribed into an electronic format and this will provide a cross check on the accuracy of that transcription.

    2. A set of databases of temperature data, starting with the original unmodified data, and then going through whatever steps are required to produce the ‘final’ data. This allows researchers to examine the consequences of alternative treatments of the data (e.g. due to site shifts, changes in instrumentation, etc.).

    3. A database of site information to assist with analysing site changes and the impact on that site’s records.

    This provides close to full transparency. The original data and all manipulations will be open for scrutiny and open for researching alternative analyses.

    For data that cannot be released publicly due to non-disclosure agreements, these can be documented as such and negotiations can be undertaken to see what CAN be released. E.g. maybe those countries will allow the release of records up to 1970 (say), but want to keep the data after that point on a for-sale basis as a source of income. Even there, they may be open to providing summary data to the public in the spirit of cooperation.
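A minimal sketch of how such a layered archive might be structured, using SQLite; all table and column names here are invented for illustration, not taken from any actual proposal:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a persistent archive
conn.executescript("""
-- 1. Scanned images of the original paper records
CREATE TABLE scan (
    scan_id    INTEGER PRIMARY KEY,
    station_id TEXT NOT NULL,
    year       INTEGER,
    image_path TEXT NOT NULL            -- pointer to the archived image file
);

-- 2. Temperature values kept at every processing stage
CREATE TABLE reading (
    station_id TEXT NOT NULL,
    year       INTEGER NOT NULL,
    month      INTEGER NOT NULL CHECK (month BETWEEN 1 AND 12),
    stage      TEXT NOT NULL,           -- 'raw', 'qc', 'homogenized', ...
    temp_c     REAL,
    PRIMARY KEY (station_id, year, month, stage)
);

-- 3. Station metadata, versioned so site moves stay visible
CREATE TABLE station_history (
    station_id  TEXT NOT NULL,
    valid_from  TEXT NOT NULL,          -- ISO date this record takes effect
    lat REAL, lon REAL, elevation_m REAL,
    notes TEXT
);
""")

# Each adjustment step writes a new 'stage' row instead of overwriting 'raw',
# so any two stages of the same series can be compared later.
conn.execute("INSERT INTO reading VALUES ('TEST001', 1900, 1, 'raw', -2.3)")
conn.execute("INSERT INTO reading VALUES ('TEST001', 1900, 1, 'homogenized', -2.1)")
rows = conn.execute(
    "SELECT stage, temp_c FROM reading WHERE station_id = 'TEST001' ORDER BY stage"
).fetchall()
print(rows)
```

Because raw values are never overwritten, examining "the consequences of alternative treatments" in point 2 reduces to a simple join between two stages of the same station series.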

  80. Phil., where’d ya go, Phil.? Yoo-Hoo, Phi-i-i-i-l!!

    Kate (12:05:47) put the ball right back in your court. Too bad you’re not around to return the serve.

  81. It’s the cause of temp change that’s important not necessarily the change itself.
    The planetary systems controlling this are many and complex, and we don’t understand their interactions yet; that plainly comes across from this whole mess. The Met Office really hasn’t got a clue, for all their computer power. History, geology, climatology, oceanography, solar and planetary interaction, to name but a few, all come into it; even a thick, uneducated layman like me can work that out from reading all the arguments from both sides. The science is certainly not past dispute. It’s only just beginning.

  82. “…several stations already have been forbidden from release by the rights’ holders…”

    Therefore the data from those stations should be excluded from the analysis as it is not verifiable.

    Anything less is just another fraud.

    Folks, I understand the reason that everyone is looking on this overture with a jaundiced eye – and rightly so. But this is an amazing opportunity. Given the recent events it shouldn’t take too much to make sure that among those funded for the “rigorous” peer review would be folks like Pielke, Sr., McKitrick, etc. If they use the funding to perform an exhaustive analysis of the data, including the biases, starting with the nearly complete Watts database, we should be very confident in the quality of those results – no matter what the results are. And, again, in the current climate, I think it would be EXCEPTIONALLY difficult for the usual suspects to deny publication, as long as the work meets the required quality and is as complete as it should now be possible for such an analysis to be, given adequate funding to support the effort.

  84. rbateman (12:45:06) :
    steven mosher (11:39:44) :

    I am very concerned with the state of data in the US from 1882 to 1914 for the West Coast.

    Doing this right will be a huge task. It could rival the Florida recount.

  85. steven mosher (10:20:06) :

    If they are smart they will invite Mcintyre and Watts to the workshop.

    I’m available, if there’s a seaside resort involved!

    Joking aside, there must / should be an institutionalized “devil’s advocate” division. It will enhance the credibility of the outcome. (This is what the IPCC should have done too. It would have been a wise move in the long run.)

    IsoTherm (10:52:37) :

    Come on, can we stop complaining about the cost? Surely the reason we’ve got into this stupid mess is that we tried to get global temperature data on the cheap: we paid peanuts and got monkeys.

    Yep. Let’s get a worldwide real-time weather-monitoring set-up installed.

  86. In order best to understand the language and the coded meanings behind such language as is used by the Met Office one must have lived through the oppressive rule of the UK’s present government (lower case ‘g’ is intentional). The Met’s press release is crafted in the finest socialist doublespeak; the Labour Party has used Orwell’s ‘1984’ as a political Manual of Practice.
    Please remember that no-one has risen to any position of power or prestige in any organisation funded by this government without acting to please the government and to back its every ill-informed policy proposal. This government had Sir David (Reds under the Beds) King as its Chief Scientific Advisor. We now have John Beddington, Professor of Applied Population Biology (what’s that?), who trained as an economist (there’s THAT qualification again), as the country’s leading Scientific Advisor, and who just happens to be “an expert in leading green issues”.
    The Met has survived by kow-towing to every governmental whim insofar as climate matters have been concerned. It is likely, but not certain, that the incumbent government will soon be replaced. Alas, it will be replaced by another political party which is indistinguishable from the present one other than by the colour of the rosettes they all wear. “Call Me Dave”, the leader of the Conservatives has pledged to continue pouring tax-funded money into the “greening of Britain”.
    Despite the apparently fine words and equally fine sentiments emanating from the Met Office, I can assure you that all that will be produced is more of the same. We will NOT get the honesty, transparency and access to the ‘raw’ data that have been promised. Why? Simply because it is the Met Office doing the promising.
    We in Britain have become all too accustomed to lies couched in such ‘robust’ terms.

  87. “The proposed activity would provide:….”

    How about this:

    “6. A survey of the spatial and temporal dynamics of the temperature field and a demonstration of what is required to comply with the requirements of Shannon’s Sampling Theorem. Data which does not meet these requirements may not be used for temperature reconstructions.”

    Sorry to harp on about it, but there really is an elephant in the corner, and if nobody is prepared to face up to it, I struggle to see how all the proposed work can be given any credence.

    Until we have convincingly determined how to sample it, we cannot trust the digital data to hold sufficient information to reproduce the signal.

    So basic … so simple.
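The sampling worry can be made concrete with a toy example (entirely my own illustration, not from the proposal): a signal sampled below its Nyquist rate is indistinguishable from a much slower one, which is exactly the failure Shannon’s theorem warns about.

```python
import math

N = 8  # samples per window: well below the Nyquist rate for a frequency-9 signal
fast = [math.sin(2 * math.pi * 9 * t / N) for t in range(N)]  # true fast signal
slow = [math.sin(2 * math.pi * 1 * t / N) for t in range(N)]  # apparent slow signal

# At this sampling rate the frequency-9 oscillation aliases onto frequency 1:
# the two sampled records are identical, so a spurious slow "signal" appears.
alias_error = max(abs(f - s) for f, s in zip(fast, slow))
print(alias_error)  # effectively zero
```

Nothing in the undersampled record distinguishes the real oscillation from its alias, which is the commenter’s point: without first establishing the spatial and temporal scales of the temperature field, a sparse station network can manufacture spurious low-frequency structure.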

  88. Deutsche Demokratische Republik….probably not so Democratic
    Peoples Republic of China…….probably not the peoples republic
    Democratic Republic of the Congo…probably not so Democratic

    Robust data………………………probably not so robust
    Raw data………………………….probably not so raw

  89. I read the executive summary. It appears to me they want to be in charge of a new database construction that will prove once and for all that they were correct all along about global warming. Just clean up the presentation under a new name and make it the base of all the other databases.
    New face, same old crap.
    We have the same thing being presented in the U.S. A new database organization, same people in charge, to prove the same conclusions.
    Tell the rubes that WE will fix the problems that they have found, give it a new face and con them for a few more years and a lot more money.
    No one who had any management control should have anything to do with a new database, as they have proved to be either incompetent or cheats.

  90. RockyRoad (09:50:13): Re: ‘In climate science as in mining, raw data is the key.’

    Yes, raw-data, not ‘adjusted’ raw-data. Confirming ‘raw-data’ requires the confirmation of a documented chain of custody of the data.

    Remember Bre-X?

    Bre-X was a group of companies in Canada. A major part of the group, Bre-X Minerals Ltd., based in Calgary, was involved in a major gold mining scandal when it was reported to be sitting on an enormous gold deposit at Busang, Indonesia (on Borneo). Bre-X bought the Busang site in March 1993, and in October 1995 announced that significant amounts of gold had been discovered, sending its stock price soaring. Originally a penny stock, its stock price reached a peak of CAD $286.50 on the Toronto Stock Exchange (TSX), with a total capitalization of over CAD $6 billion. Bre-X Minerals collapsed in 1997 after the gold samples were found to be a fraud.

    Exposure of the fraud

    On March 26 the American firm Freeport-McMoRan, a prospective partner in developing Busang, announced that its own due-diligence core samples showed ‘insignificant amounts of gold’. A frenzied sell-off of shares ensued and Suharto postponed signing the mining deal. Bre-X demanded more reviews and commissioned a review of the test drilling. Results were not favorable to them, and on April 1, Bre-X refused to comment. David Walsh blamed the whole affair on web ‘ghost writers’ who had spread rumors on the Internet and damaged the company’s reputation. Canadian gold analyst Egizio Bianchini considered the rumors ‘preposterous’. A third-party independent company, Strathcona Minerals, was brought in to make its own analysis. They published their results on May 4: the Busang ore samples had been salted with gold dust. The lab’s tests showed that gold in one hole had been shaved off gold jewelry though it has never been proved at what stage this gold had been added to those samples. This gold also occurs in quantities that do not support the actual original assays.


    “David Walsh blamed the whole affair on web ‘ghost writers’ who had spread rumors on the Internet and damaged the company’s reputation”

    “Canadian gold analyst Egizio Bianchini considered the rumors ‘preposterous’”

    “A third-party independent company, Strathcona Minerals, was brought in to make its own analysis.”

    Now substitute heat for gold; the CRU for Bre-X; The BBC for David Walsh; the IPCC for Egizio Bianchin; and Climate Audit for Strathcona Minerals.

    Sound familiar?

  91. Rhys Jaggar (08:36:05) :

    Perhaps, Mr Watts, you should tender with a few other ‘independent practitioners’ to ensure ‘appropriate amounts of independent monitoring’?

    Seems to me that there is a role for:
    1. Monitoring and grading current stations.
    2. Determining optimal location and frequency of stations.
    3. Installing new stations according to global best practice.
    4. Monitoring data collection methods and execution.

    =============================================

    Is it possible to plot a global map of stations for all to use – as a Google Earth map – that will highlight the gaping holes of missing stations as well as creating a reference point for how far apart stations are in a region? Local knowledge can then play a part in spotting excessive variations. Just a thought…..
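Generating such an overlay is straightforward once station metadata is in hand. Below is a minimal sketch that writes Google Earth placemarks as KML; the station list is invented, and real coordinates would come from whatever station inventory is eventually published:

```python
# Minimal KML generator for viewing stations in Google Earth.
# The station list here is invented for illustration.
stations = [
    ("EXAMPLE_RURAL", 45.5, -117.5),
    ("EXAMPLE_URBAN", 45.3, -118.1),
]

def stations_to_kml(stations):
    placemarks = "\n".join(
        "  <Placemark><name>{0}</name>"
        "<Point><coordinates>{2},{1}</coordinates></Point></Placemark>"
        .format(name, lat, lon)  # KML expects longitude,latitude order
        for name, lat, lon in stations
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
            + placemarks + "\n</Document>\n</kml>\n")

kml = stations_to_kml(stations)
print(kml)
```

Saved as a `.kml` file, this opens directly in Google Earth, and gaps in coverage show up at a glance.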

  92. Yup. The word “rigorous” is as banal as “robust”. Both laughable, only more so now since “Return to Almora”. I can’t get THAT juxtaposition out of my head. What were those turkeys doing at the seaside resort?

  93. “PROPOSAL FOR A NEW INTERNATIONAL ANALYSIS OF LAND SURFACE AIR
    TEMPERATURE DATA
    Submitted by UK Met Office
    Executive summary
    Surface temperature datasets are of critical importance for detecting, monitoring and communicating climate change. They are also essential for testing the validity of the climate models that are used to produce predictions of future climate change.”

    Interpretation
    We got to justify the 2.5 Amplification number one way or another.

  94. Want to bet that they’ll do the process the wrong way? They’ll try to create the whole thing before unveiling it, instead of starting by making the raw data available and letting us monitor the code in SVN, along with documentation, as they build the system, with suitable comment tools for the outside world.

  95. I am reading through the comments and snorting as often as just smiling.

    Much redundancy in the following, and pardons to any and all whose comments I repeat:

    1. Yes, starting over from scratch is THE right thing to do. Jones et al should consider this a slap in the face. It clearly says, “We have seen enough that we won’t be able to shut anybody up without doing this right. Now WHERE in the HELL did you put all this crap?”

    2. It is clear that the emails and the subsequent brouhaha are having an effect. Those who say, “WE WON!” are not entirely incorrect. “We have had an effect,” would be more like it.

    3. If they FIND all the data, then it wasn’t lost, after all.

    4. Like the New Zealand Parliament exchange the other day, “We DO want to see the schedule of the adjustments made.” THAT WILL BE HUGE. THESE ARE COMPLETELY CHALLENGE-ABLE. We will win on UHI adjustments, even if it takes a long process. That alone will drop the forecasts, once it is seen that the current adjustment schedule(s) are inadequately allowing for this.

    5. Gaps in the met stations – and the referencing of them to nearby stations (pegging urban ones to rural ones, for example – should be revealed. (The Antarctica situation should be totally addressable, too.)

    6. If independent means someone besides CRU tackles it, then the processing – in a transparent and well-documented and well-archived process – cannot be any worse. As long as Mann is not involved there won’t be able to be any “hiding the decline.” If the usual suspects are included, expect it to be less than it could be.

    7. Recall that lots of climatologists and meteorologists who were intimidated before are not intimidated anymore. Their voices will be heard loud and clear, if and when less than due diligence is followed.

    8. Listen folks, they won’t be able to hide behind, “The dog ate my homework,” anymore. We will know WHAT databases are included. We will know which algorithms are being used. We will know which adjustments are applied. AND THEY WILL KNOW THIS TIME THAT SOMEONE IS WATCHING. Jones et al for the last 20 years has operated as if everyone else was idiots; Mann thought he could intimidate everyone into submission. It will be a whole new ballgame.

    9.

    “In the case of… CRUTEM3… there are substantial IPR [presumably intellectual property rights] issues around the raw data that underpin the dataset.”

    THAT IS A WIN, FOLKS. Even the Met Office sees the huge holes in what was being done, and that the peer review process broke down.

    10. The five proposed activities essentially spell out the scientific method which should have been used in the original work, and if followed will likely, after all voices are heard, produce adjusted data that shows the warming (which is real) to be much smaller than Jones et al and Mann et al produced.

    11. If Hockey Stick Team members are allowed to participate, we would all like to be flies on the wall when they:
    – Try to find their data, with others looking on
    – Try to justify their cherry picking of data to others
    – Try to justify their low UHI adjustments (ALL of which should, in itself, be peer reviewed)
    – Try to explain their algorithms and stepped met station movement adjustments

    12. This – in spite of all the caveats and bows to the existing keepers of the databases – is what should have been done back in 1988.

    I’ve gone on long enough, especially for someone who isn’t a stat man or meteorologist/climatologist.

  96. Gidday Anthony, I do not think you are right to say “..GHCN (source for GISS and HadCRUT)..”.
    Also- we are talking land data in these great issues – so the “Had” which refers to SST can be left off too. It is a fact that CRUT was generated in the 1980’s before GHCN was published.
    I am not aware that in later Jones et al iterations with increased station numbers – that they ever turned to GHCN data. I do know that in recent years as Jones/CRU have come under pressure to release data – they have used this excuse saying in effect, “..go and get the GHCN to work with, it is public..”.
    I can assure you that analysing the GHCN will not get you within a country mile of understanding the poor science that has riddled the Jones et al compilations from the 1980’s.
    I suppose there might be a few cases of CRU station data being EXACTLY the same month by month as a GHCN version. I have never come across one – if anybody ever finds any examples, please let me know.

    REPLY: Well, as I’ve seen it, based on the station lists provided, most of the GHCN stations (the few that are left) are in the CRU set also. – Anthony

  97. “We feel that it is timely to propose an international effort to reanalyze surface temperature data”

    Verifiable datasets starting from a common databank of unrestricted data

    Ah, reanalyze. Not a databank of raw records as transcribed from original sources and supplementary information where available? The databank to be “reanalyzed” every few years, one presumes, so that after a decade nobody can ever re-construct prior sub-sets except in spirit. Sounds a bit familiar.

    What is wrong with these people? What do they have against one single open source of RAW data that everybody can work from? Can you imagine historians accepting a system where all the original sources were locked away safely, but you could work from any of the ten approved standard volumes of publicly available “summary”? Just as a bonus, history changes in subtle ways with each new release.

    Welcome to the information age – much like other ages but faster.

    How about this basic system outline – Jones et al 2017 create their data set “on the fly” from a standard bank of RAW data.

    .. The sub-set used came from RAW using the SQL query …”SELECT * FROM monthly_mean WHERE … ”

    .. After homogenisation [specific details here] We then applied the First difference method [Peterson 98] [implementation code (FORTRAN) available here] ….

    Then in 2100 some poor soul could accurately replicate the entire paper without FOI requests or trying to find a reasonable replica of the data sets used or trying to figure out if there was an error in the code.

    While I welcome the effort in principle, the whole thing sounds [admittedly - from the very basic outline available] like a very long way to get exactly where we are now.
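As a toy illustration of that “on the fly” idea (raw records in, a logged selection, a named method out), here is a sketch using the first-difference method of Peterson et al. (1998); the station names and the tiny three-station dataset are invented:

```python
# Sketch: derive a regional series from "raw" station records on the fly,
# printing the selection so the exact subset can be reproduced later.
# Station IDs and values are invented for illustration.
raw = {
    "STN_A": {1900: 10.0, 1901: 10.4, 1902: 10.1},
    "STN_B": {1900: 12.0, 1901: 12.5},            # record ends early
    "STN_C": {1901: 8.0, 1902: 8.3},              # record starts late
}

def first_difference_series(stations, years):
    """First-difference method: average the year-to-year changes over
    whatever stations report in both years, then accumulate the mean
    differences into a single series (anchored at 0 in the first year)."""
    level = 0.0
    series = {years[0]: level}
    for prev, cur in zip(years, years[1:]):
        diffs = [s[cur] - s[prev] for s in stations.values()
                 if prev in s and cur in s]
        level += sum(diffs) / len(diffs)
        series[cur] = round(level, 4)
    return series

years = [1900, 1901, 1902]
print("selection: all stations, years", years)  # the audit-trail 'query'
series = first_difference_series(raw, years)
print(series)
```

The point is not this particular method but the shape of the workflow: because the selection and the algorithm are both recorded alongside an immutable raw bank, anyone later can rerun the exact computation without FOI requests.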

    ___________________________________________________________

    Through collaboration with NCDC we have two quality controlled, but not homogenized products at the daily and sub-daily resolution (HadGHCND and HadISD – the latter about to be submitted to peer review), spanning 1950 onwards and 1973 onwards respectively.

    My ears pricked up at this one until reaching the last sentence. Hm .. any particular reason to choose 1950 as your start date?

    They just can’t stop themselves.

  98. Take the climate zone Meacham, Oregon sits in (up high and in a bowl) and adjust all other similarly situated monitors and we will get nothing but freezing your butt off temps. Let’s hope they ask a weather man how to do this grid by grid “fill in” business. And let’s hope they ask those that have determined our agricultural climate zones how to do this. But forget railroad engineers. Not that I don’t like them. The crew that works the railroad around Wallowa County are swell guys. They are our local drinkin buddies most Friday nights at the Lostine Tavern and readily admit they don’t know a damn thing about the science behind weather or climate. But they know lots and lots about trains, and they eat red meat. Lots and lots of red meat. But veggies, not so much. My kinda guy.

  99. As I posted at Climate Audit:

    http://docs.lib.noaa.gov/rescue/cso/data_rescue_signal_corps_annual_reports.html

    for “official” U.S. location/temperature reports from
    1861 – 1942.

    None of the major data sets have bothered to use this huge body
    of data.

    REPLY: There was no standardized thermometer exposure before about 1892, when the Stevenson Screen was put into use. Thus early data is suspect unless the Signal Corps had some system I’m unfamiliar with.

  100. I have long argued that temperature alone has NO relevance.

    I will again argue this point quite dramatically!

    I will regularly place my hands, unprotected, into an oven, pre-heated to 200 or 250ºC

    I suffer no ill effects!

    If I inadvertently place my hands into a stream of steam, (from a pan of boiling water), I quickly get scalded.

    In advance I answer: in the case of the oven, everything is done as quickly as possible to avoid heat loss from the oven; in the case of the steam, a matter of seconds.

    DaveE.

  101. Re: REPLY

    The Signal Corps (part of the U.S. Army) had standardized procedures and standard-issue equipment/barometers for their stations before 1898. I’m now looking for the operating manuals they issued.

    My point is that the information found in:

    http://docs.lib.noaa.gov/rescue/cso/data_rescue_signal_corps_annual_reports.html

    can be used as a stand-alone data set.

    The Stevenson screen material was contiguous with a goodly number of years/places with “old” Signal Corps reports, beginning in 1898 and running until 1949.

    To my knowledge, no one has ever done a data plot of the “old” data, let alone one comparing the resulting reports between the two methods.

    That doesn’t mean the “old” historical data isn’t useful.

    However, I can’t recommend splicing the old fashioned
    reports onto them new fangled ones… unless you’re named
    Jones or Mann.

  102. Smokey (13:25:56) :
    Phil., where’d ya go, Phil.? Yoo-Hoo, Phi-i-i-i-l!!

    Kate (12:05:47) put the ball right back in your court. Too bad you’re not around to return the serve.

    Some of us have work to do Smokey! Actually you have misread Kate’s post, to continue the tennis analogy she missed the ball and tried to hit another ball back hoping I wouldn’t notice.

    Kate (12:05:47) :
    Phil. (11:06:00) :
    “…As far as I am aware none of your statement is true, so what you know isn’t very reliable.”

    Not true? Not reliable?

    Oh, dear. Try this, for a start…

    Not good enough Kate, you referred to a “common trove of global temperature data” that the Met Office had had custody of, no mention of that in your rebuttal, instead you went off at a tangent about the Hockey stick!

    You also said that they had tried “to keep everyone away from what should have been publicly-accessible data”, yet in your rebuttal you don’t mention this.

    You also state that the Met Office broke the Freedom of Information laws, again you fail to substantiate this.

    Your problem appears to be a reliance on the Daily Mail as a source, which is demonstrably unreliable.
    For example, the Mail says: “Last month, the Information Commissioner ruled that scientists from the Climatic Research Unit at the University of East Anglia – the source of the leaked ‘Warmergate’ emails – acted unlawfully in refusing FOI requests to share their data” – not true; that hasn’t happened.
    They also say that Phil Jones has been suspended, also not true.

  103. Personally, it doesn’t matter what they try to do to clean up their image. They are not to be trusted – a tiger does not change its stripes. Climate science credibility is dead.

  104. ” in the quiet Turkish seaside resort of Antalya, ”

    Why are these meetings always held in posh places?

    For the British Met Office, why not Cleethorpes? If they absolutely must go abroad, Omsk might be an option. And if they are desperate for a bit of sun, I’m sure we could put them up in Oodnadatta.

  105. As long as the government sector has control of virtually all the funds going to science then there will exist a large scale risk of broad influence on the scientific outcome by the government. This then is the situation we are faced with:

    1) If we are confident going forward that private citizens can do a much better job watchdogging the govt & science interaction, then we would keep on funding science almost solely through the government.

    AND/OR

    2) If we do not have high confidence that our government will refrain in future from influencing the outcome of science that they are funding, then we must wisely start to establish many simultaneous alternate private/voluntary funding vehicles for science outside government control.

    I am personally not confident that large government can ever be adequately watchdogged regarding possible influencing of scientific outcome.

    Maybe a small government could be watchdogged, but I see no realistic downsizing of government in the near future. In fact, unfortunately, I see a trend toward even larger government.

    Note: I am doing this from my Blackberry while on public transportation in Taipei, so spelling & grammar are out the window (no pun intended).

    John

  106. Seriously, how much in total did the Copenhagen beanfeast cost the taxpayers of the world?

    And how many water purification systems, sewage systems, vaccination and other public health programs, or plain old schools could that money have paid for?

  107. A top priority must be for a cleanup of acronyms and speciously over-used words such as “ROBUST.” The past reports from IPCC have been oft-declared “robust.” Well, we come to find they are rife with error. Thus, “robustness” is now a liability in science.

    The first Encarta definition of robust is: “strong, healthy, and hardy in constitution.” All characteristics we dare say are NOT reflected in the climate science to date.

  108. The Met Office will never, ever compromise their relationship with the government, we must all realise that, surely?

    Is it reasonable to expect that by “starting again” they are going to admit that they were wrong all along? Somehow I doubt it. The outcome must fit the plan, and there’s more than one way of skinning a cat. The result is known, the trick will be to get the data to fit seamlessly – they’ll be more careful the second time around. Leopards don’t change their spots…

  109. Met Office:
    “We do not anticipate any substantial change in the resulting global and continental scale-trends” as a result of the new round of data, but “this effort will ensure that the data sets are completely rubbery…oops..robust, and that all methods are transcendental…er…what we mean is…

  110. Long delays by the Information Commissioner’s Office in investigating freedom of information complaints are undermining the effectiveness of Britain’s Freedom of Information Act.

    This poorly-drafted law has allowed so many ludicrous exemptions for government departments and public bodies such as the UEA and the Met Office that the question has to be raised of why anyone would bother trying to enforce it at all. All a body has to do, once an inconvenient request under the Freedom of Information Act has been made, is delay any action for six months or more, at which point the request expires, and the data itself can also be destroyed without penalty.

    Nearly 500 formal decision notices were issued by the Information Commissioner’s Office in the 18 months to 31 March 2009. On average, it took 19.7 months from the date of a complaint to the ICO to the date on which the ICO’s decision was issued. 46% of cases took between 1 and 2 years from complaint to decision notice, and 30% took more than 2 years to reach a decision.

    The ICO’s investigation into a complaint did not begin, on average, until 8 months after the complaint had been received. In 28% of cases, there was a delay of more than a year before the investigation even began.

    In July 2009 the Campaign for Freedom of Information drafted an amendment, which Lord Dubs attempted to make to the Coroners and Justice Bill. This would have amended the Freedom of Information Act to allow a section 77 prosecution to be brought within 3 years of the offence being committed, provided it was within 6 months of the ICO obtaining evidence of the offence. The Information Commissioner’s Office supported this amendment.

    The government did not accept the amendment because – it claimed – there was no evidence that the 6 month limit was causing systemic problems. It did say that if such evidence arose, it would look for ways to put the matter right, and if necessary amend the FOI Act. If the government accepted that the 6 month limit only ran from the time when the ICO became aware of the offence, it would have said the amendment was unnecessary for that reason.

    That has also tended to confirm that section 127(1) does not at present allow a prosecution to be brought more than 6 months after the offence itself has occurred, and that the FOI Act should be amended so that prosecutions can be brought after that 6 month period is over.

    In the Sunday Telegraph on January 30 2010, Christopher Booker suggested that a prosecution for conspiracy to commit an offence under section 77 of the FOI Act could be brought under the Criminal Law Act 1977, even if the 6 month period had expired.

    However, it appears that any proceedings for conspiracy to commit an offence would be subject to the same time limits as those applying to the offence itself. Section 4(4) of the 1977 Act states:

    “Where (a) an offence has been committed in pursuance of any agreement; and (b) proceedings may not be instituted for that offence because any time limit applicable to the institution of any such proceedings has expired, proceedings under section 1 above for conspiracy to commit that offence shall not be instituted against any person on the basis of that agreement.”

    This suggests that, even if a conspiracy charge were possible, it would not provide a way round the problem created by the 6 month limit on prosecutions in the Magistrates Court Act.

    http://www.cfoi.org.uk/pdf/foidelaysreport.pdf

    http://www.cfoi.org.uk/pdf/foidelaystable.pdf

    http://www.cfoi.org.uk/foi030709pr.html

    ***************************************************************************

    By the way, Phil, the discontent about the Met Office is not about me or the Daily Mail. You will find the same information all over the British media and elsewhere. I merely posted the relevant details from the Mail on Sunday’s article to illustrate a point, the sentiments of which I believe reflect those of the entire country.

  111. I think it would be timely to revisit the whole process of how temperature is best measured. Up until now people have been using whatever data they were provided, not going out and seeking the data they actually required. It is time to design the experiment properly – not just collect data in an ad-hoc fashion and try to use modelling to extract from inadequate data the results required.

    Changes I’d like to see are:

    1. Measuring temperature at a height much higher above the surface. A station will be considerably less susceptible to local environmental effects if you stick it on the top of a 10m pole.

    2. Quantity has a quality all its own. Use hundreds of cheap sensors and average – not one or two expensive high quality ones.

    3. Replace single stations with clusters. Not a single instrument at a discrete location but a cluster of 20 or 30 mounted on poles scattered across a couple of square kilometers. That way the effect of disturbance to one instrument can be measured and accounted for in a much more robust fashion.

    4. Spread it around. If your aim is to measure average temperatures across a wide region – then you need sensors spread across a wide region. Measuring at one or two discrete locations and trying to extrapolate is just ridiculous.

    Let’s design the experiment properly. The results might not be directly comparable to the historical record if we change the method of measurement, but they will certainly be a lot more accurate going forward.
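    The “quantity has a quality all its own” claim in point 2 rests on the standard error of a mean shrinking with the square root of the number of sensors. A minimal sketch with hypothetical numbers (the noise levels and sensor counts are illustrative, not taken from any real network):

    ```python
    import random
    import statistics

    random.seed(42)

    TRUE_TEMP = 15.0  # hypothetical true air temperature in degrees C

    def read_sensor(noise_sd):
        """Simulate one reading: true value plus Gaussian measurement noise."""
        return TRUE_TEMP + random.gauss(0.0, noise_sd)

    # One expensive, well-calibrated sensor (small noise)...
    single = read_sensor(noise_sd=0.1)

    # ...versus the mean of 100 cheap, noisy sensors.
    cheap_readings = [read_sensor(noise_sd=1.0) for _ in range(100)]
    cluster_mean = statistics.mean(cheap_readings)

    # The cluster's standard error is roughly 1.0 / sqrt(100) = 0.1 C,
    # comparable to the single good sensor -- but only if the cheap
    # sensors' errors are independent and unbiased, which is exactly
    # the point the next commenter disputes.
    print(f"single good sensor : {single:.2f}")
    print(f"mean of 100 cheap  : {cluster_mean:.2f}")
    ```

    The caveat in the comment is the important part: averaging beats down random noise, but it does nothing about a shared bias (e.g. every cheap sensor siting suffering the same local heating).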

  112. I hope that there are people around who have some parts of the original unadjusted raw data so that when this new “do over” data is released, we can check parts of it against the original raw data to verify it hasn’t been adjusted in any way.

  113. Ian H: “I think it would be timely to revisit the whole process of how temperature is best measured. ”

    I have to disagree, because from a quality point of view it is easier to maintain a few good sensors than to discover, for example, that your whole multitude of sensors have ants’ nests in them!

    I used to be responsible for what must have been 16,000 sensors, and it was a godforsaken job trying to keep them reading anything like the proper value – and that was in one factory. Imagine what that is like when all those sensors are dispersed to the four corners of the globe (lovely phrase!)

    Sensors go wrong, and the current HadCrut3 sensors appear to be recalibrated about once a year, and it is quite noticeable that this forces quite significant changes in the data up to a year previously.

    Yes, let’s improve the way we monitor global temperature – even work out a meaningful way to average temperature (it’s technical, but a simple average isn’t necessarily right) – but quantity is no substitute for quality; otherwise all those climate scientists voting for Christmas (turkeys voting for Christmas) would be worth something.

  114. Congratulations, Anthony. Your efforts are bearing fruit.

    UN weather meeting agrees to refine climate data AP

    GENEVA — Experts at a U.N. climate meeting have agreed to collect more precise temperature data as part of a global effort to monitor climate change.

    The World Meteorological Organization says delegates at a meeting in Antalya, Turkey, approved in principle a British proposal to gather land surface temperature data more frequently and ensure the process is transparent. . . .

  115. Sorry – I had the configuration wrong; the figure was only 3,000 sensors! And I also have experience with meteorological temperature measurement in the “field” – and to let you in on a secret, I never trusted the readings, because I couldn’t see how the sensors wouldn’t be drenched in water by our Scottish horizontal rain!

  116. Kate (02:24:45) :
    By the way, Phil, the discontent about the Met Office is not about me or the Daily Mail. You will find the same information all over the British media and elsewhere. I merely posted the relevant details from the Mail on Sunday’s article to illustrate a point, the sentiments of which I believe reflect those of the entire country.

    Whether there’s discontent is not the issue; that doesn’t give you or the Daily Mail the right to make things up! As I said before, none of your statements was true, and you’ve failed to address any of those points. The standards of the British media in their reporting on this subject have been abysmal; the Mail in particular has simply made up statements by scientists.
    Your discontent would be better directed in that direction.

  117. @feet2thefire (15:20:59) & IanH:

    Congratulations.

    woodNfish (19:22:02) :

    They are not to be trusted – a tiger does not change its stripes.

    “A zebra cannot change his spots” — Al Gore.

  118. So, let me get this right: the same “robust” data fed into the same “models” with the same fudge factors?
    Is the Met Office still chaired by Robert Napier, a former global warming activist and previously head of WWF UK?
    Anyone smell a rat? More money down the drain!

  119. First post – but we are ardent followers; undecided on AGW, but we love these posts.
    Our fear is that science in universities etc. is not being taught from first principles. We do not wish to name names, but please see this post from the most expensive global science experiment, which we find rather shocking. If this is being instilled in our junior scientists, then who can help us?

    Another problem is that our data is still “sensitive.” In order to make full use of our friends’ computers, we would want to give them full access to our data. But we want to be the first to publish results with that data! So it is a bit nervous-making to just send the data to whoever asks for it. More likely, there would be someone out there who would try to use the data, but wouldn’t really understand it, and so would end up misidentifying something interesting. Then we’d have to spend our time trying to fix the things they’d done wrong. There was an interesting discussion about that at a conference I attended a few years ago. Someone asked that all LHC data be made publicly available. Of course, we raised this objection then (that they wouldn’t understand what we were giving them). And then a person asked a very nice question: “The data from the previous experiment at CERN (called LEP) is publicly available. Has anyone looked at it?” No one outside the experiments had. So one more reason to not try to make our data public.

Comments are closed.