Just over a month after Climategate started, we have breaking news from Climate Audit.
Steve McIntyre writes:
The UK Met Office has released a large tranche of station data, together with code.
Only last summer, the Met Office had turned down my FOI request for station data, saying that the provision of station data to me would threaten the course of UK international relations. Apparently, these excuses have somehow ceased to apply.
Last summer the Met Office stated:
The Met Office received the data information from Professor Jones at the University of East Anglia on the strict understanding by the data providers that this station data must not be publicly released. If any of this information were released, scientists could be reluctant to share information and participate in scientific projects with the public sector organisations based in the UK in future. It would also damage the trust that scientists have in those scientists who happen to be employed in the public sector and could show the Met Office ignored the confidentiality in which the data information was provided.
However, the effective conduct of international relations depends upon maintaining trust and confidence between states and international organisations. This relationship of trust allows for the free and frank exchange of information on the understanding that it will be treated in confidence. If the United Kingdom does not respect such confidences, its ability to protect and promote United Kingdom interests through international relations may be hampered…
The Met Office are not party to information which would allow us to determine which countries and stations data can or cannot be released as records were not kept, or given to the Met Office, therefore we cannot release data where we have no authority to do so…
Some of the information was provided to Professor Jones on the strict understanding by the data providers that this station data must not be publicly released and it cannot be determined which countries or stations data were given in confidence as records were not kept. The Met Office received the data from Professor Jones on the proviso that it would not be released to any other source and to release it without authority would seriously affect the relationship between the United Kingdom and other Countries and Institutions.
The Met Office announced the release of station records that "were produced by the Climatic Research Unit, University of East Anglia, in collaboration with the Met Office Hadley Centre."
The station data zipfile here is described as a “subset of the full HadCRUT3 record of global temperatures” consisting of:
a network of individual land stations that has been designated by the World Meteorological Organization for use in climate monitoring. The data show monthly average temperature values for over 1,500 land stations…
The stations that we have released are those in the CRUTEM3 database that are also either in the WMO Regional Basic Climatological Network (RBCN) and so freely available without restrictions on re-use; or those for which we have received permission from the national met. service which owns the underlying station data.
I haven’t parsed the data set yet to see what countries are not included in the subset and/or what stations are not included in the subset.
The release was previously reported by Bishop Hill and John Graham-Cumming, who’s already done a preliminary run of the source code made available at the new webpage.
We’ve reported on a previous incident where the Met Office had made untrue statements in order to thwart an FOI request. Is this change of heart an admission of error in their FOI refusal last summer, or has there been a relevant change in their legal situation (as distinct from bad publicity)?
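For anyone who wants to start parsing the subset themselves, here is a minimal sketch in Python that lists the countries represented. It assumes each station file in the zip begins with header lines such as "Country= NORWAY"; verify that layout against your own copy, and note the zip filename below is a placeholder:

```python
import io
import zipfile

# Minimal sketch: list the countries represented in the released subset.
# Assumes each station file starts with header lines like "Country= ...";
# check the layout against your own copy of the release first.
countries = set()
with zipfile.ZipFile("station_subset.zip") as zf:  # placeholder filename
    for name in zf.namelist():
        if name.endswith("/"):
            continue  # skip directory entries
        with zf.open(name) as f:
            for line in io.TextIOWrapper(f, errors="replace"):
                if line.startswith("Country="):
                    countries.add(line.split("=", 1)[1].strip())
                    break

for country in sorted(countries):
    print(country)
```

Comparing that list against a full WMO country list would show at a glance which countries were withheld.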

Well done, everyone who contributes here. Very, very well done.
Let’s keep going….
Maybe Bill Whittle at PJTV had something to do with this: http://www.pjtv.com/?cmd=video&video-id=2889
_Jim (15:23:55) :
It’s not the transcribed data that I really want to see, it’s the hand-written data.
The ones you get online in PDF form.
And if they did the transcribing, then they have the hand-written data.
Some of that is clearly missing.
Gregg E. (02:58:03) : “September 1995, Hewlett Packard produces the first CD-R drive priced under $1000, the Sure Store 4020i…”
I, for one, am not convinced that the original data files were deleted until recently.
Bill wrote,
“The data was not confidential it was commercial as Jones said about a year ago….Jones said the data would be released when…He also said that…”
At what point will this madness stop? At what point will there be a realization that a mere rebuttal is not enough to deny or explain away facts that are well established and not in controversy, facts that are instead completely ignored so as never to be acknowledged, let alone squarely addressed?
I am so tired of being patronized, and having my intelligence insulted.
Unfortunately for your defense of Jones and what he might have said publicly, we also have on record other things that he wrote privately, like the fact that he would rather delete files than send them, and that he would “hide behind” the confidentiality act that you so clearly, reasonably, calmly and patronizingly alluded to. Which he did.
Try to explain that away as “an unfortunate turn of phrase”, or boyz-will-be-boyz sentiment. There’s no trick to understanding this. He said, and I quote, “I WILL HIDE BEHIND IT”.
a) You live (or die) with it *after* the transcription (you move ON with life; you make a call as a professional instead of worrying over every bit of frayed, unreadable paper. There are deadlines in this life, and there aren’t enough hours to spend on a near-endless ‘personal project’).
b) Do you really think another transcription working off a ratty copy (or a COARSE image like the one Bill linked) is going to improve the interpretation the second time around? Of course not …
c) You aren’t (I would think not) doing archaeology with your ‘temperature’ dataset at this point; otherwise you’re going to be handling every data record the way they handled the punched-card ballots in Florida in the 2000 presidential election: ‘individual interpretation’ of chads, creases, extra punched holes, etc. I’ll take the efforts and ‘reliability’ of a trained keypunch operator any day … Once you’ve got XX MB of data and a time period of xx hours, it is a very simple process to assess what percentage of the ‘records’ you’ve got time to examine (see the back-of-envelope sketch after this list). 100% QA you’re NOT going to be able to do (one person, or a few people, anyway). Verification of a few select sites: sure. Does this make it ‘right’? Now you’re in the realm of making a value judgement on a situation that was controlled mostly by available resources, e.g. ‘time’.
d) I don’t think other things have been considered. We have not considered what age, rot, fading, fires, or sprinklers going off will do to these ‘paper’ records.
e) What is the ‘error rate’ on these records anyway? One percent (1%)? Five percent (5%)? There is no establishment at present of what the ‘error limits’ are.
Extra credit: can any of YOU guys transcribe Bill’s linked raw record image? I didn’t think so … so the point is moot if *this* is the form the ‘raw’ records are in.
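To put point (c) in numbers, here is a back-of-envelope sketch; every figure in it is an illustrative assumption, not anything from the record:

```python
# Back-of-envelope QA budget: what fraction of the records can one
# person realistically spot-check? All figures are illustrative guesses.
records_total = 1_500_000   # ~1,500 stations x ~100 years x 12 months
seconds_per_check = 30      # assumed time to eyeball one record
hours_available = 40        # assumed one working week

checked = hours_available * 3600 // seconds_per_check
print(f"records checked: {checked}")                      # 4800
print(f"coverage: {100 * checked / records_total:.2f}%")  # ~0.32%
```

On guesses like these, a lone checker covers a third of one percent of the archive in a week, which is _Jim’s point: full QA by hand is not on the table.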
I have converted the MET data for Sacramento, CA and added it plus a graph to the bottom of the page
http://www.robertb.darkhorizons.org/SacMonthlyAMS_COOP.htm
The graph doesn’t compare with the raw data at all, until I noticed the 1880 cool year was on the opposite side, so I flipped the graph around.
Now it looks like they read the tape backwards and spat the data out into the forward succession of years. And their correction makes Sacramento flat, as it is in real life (a very flat place).
Is this a frontier-of-science institution?
C’mon, MET, surely you can do better than this.
Or maybe not. Times are hard, I know, and good help is hard to come by.
Are you sure you Metters wouldn’t be better off outsourcing the drudge work to the capable hands of M&M? I’m sure they can get the job done right.
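rbateman’s read-the-tape-backwards hunch is easy to test numerically, for what it’s worth. A sketch, in which the file names and the one-value-per-year column layout are assumptions: if the Met series correlates far better with the raw series reversed than forwards, the time axis really has been flipped.

```python
import numpy as np

# Sketch: does the Met series match the raw record better forwards or
# reversed? File names and single-column yearly layout are assumptions.
met = np.loadtxt("sac_met_annual.txt")  # placeholder file
raw = np.loadtxt("sac_raw_annual.txt")  # placeholder file

n = min(len(met), len(raw))
met, raw = met[:n], raw[:n]

r_forward = np.corrcoef(met, raw)[0, 1]
r_reversed = np.corrcoef(met, raw[::-1])[0, 1]
print(f"forwards: {r_forward:.2f}  reversed: {r_reversed:.2f}")
```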
_Jim (21:14:48) :
I’ll take those frayed pieces of paper, all of ’em you can dig up, thank you.
Read my post on Sac, CA data.
Any questions about the frayed paper?
_Jim (15:23:55) :
No, Jim, they didn’t buy scanners, they built them.
That’s why you entrust your raw science data to higher learning, they invent what they need.
“Accordingly, in 1991 Caltech and the ST ScI completed a Memorandum of Understanding which defined The Palomar–ST ScI Digitized Sky Survey. Major features of this program are digitization based on scans of the original plates, processing of all fields in the three passbands (2682 plates total), the use of a 1″ sampling interval, and distribution of the full-plate pixel data to the community.
In support of this, the microdensitometers used for the original scanning for the Guide Star Catalog (Lasker et al. 1990) were rebuilt as laser-illuminated 5-channel systems capable of scanning rates well in excess of 1000 plates per year. The metrology is stable to 0.5 µm, and the densitometry extends three density units above the sky (note that typical sky values on the original plates are in the range 1.5–2.5). The scans are of dimension 23040², which corresponds to 1.1 Gbyte per plate (for a 2.8 Tbyte survey total).”
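For what it’s worth, the quoted figures are internally consistent, assuming 2 bytes per pixel:

```python
# Sanity check on the quoted survey figures, assuming 2 bytes per pixel.
bytes_per_plate = 23040 ** 2 * 2
print(bytes_per_plate / 1e9)           # ~1.06 -> the quoted "1.1 Gbyte"
print(bytes_per_plate * 2682 / 1e12)   # ~2.85 -> the quoted "2.8 Tbyte"
```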
But you never, ever, throw away the raw science data.
That is the equivalent of sacrilege.
Scientists DON’T do that.
And what’s all this garbage about data storage capacity?
Universities invented the darn computers and storage systems.
If they need a bigger hammer, they build it.
Simple as that.
rbateman (21:34:12) :
“I have converted the MET data for Sacramento, CA and added it plus a graph to the bottom of the page
http://www.robertb.darkhorizons.org/SacMonthlyAMS_COOP.htm
The graph doesn’t compare with the raw data at all”
You are certainly right; something odd there.
http://www.vukcevic.talktalk.net/ScrCa.gif
I have made a short program to convert all data (1890 – 2009) to a temperature anomaly (with a 10 year running mean), in order to compare locations.
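For anyone wanting to reproduce that kind of conversion, here is a minimal sketch; the input layout (a year followed by twelve monthly values), the base period, and the absence of missing-value handling are all assumptions, and vukcevic’s actual program may differ:

```python
import numpy as np

# Sketch: monthly station series -> anomalies vs. a base period, then a
# 10-year (120-month) running mean. The input layout is an assumption:
# each row is a year followed by its 12 monthly mean temperatures, with
# no missing values (real station files carry missing-value flags).
data = np.loadtxt("station.txt")          # placeholder filename
years, monthly = data[:, 0], data[:, 1:13]

# Anomaly: subtract each calendar month's mean over the base period.
base = (years >= 1961) & (years <= 1990)  # assumed base period
anomalies = (monthly - monthly[base].mean(axis=0)).ravel()

# 10-year running mean over the flattened monthly series.
window = 120
running = np.convolve(anomalies, np.ones(window) / window, mode="valid")
print(running[:5])
```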
P.S.
Anyone: is there a way of finding country and location without unzipping the folders?
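On the P.S.: in code, yes, at least one way. Python’s zipfile module can read each member in memory without extracting anything to disk; the header field names below are assumed from the station-file layout, and the zip name is a placeholder:

```python
import io
import zipfile

# Sketch: read Country/Name/Lat/Long straight out of the zip, with
# nothing extracted to disk. Field names and zip name are assumptions.
with zipfile.ZipFile("station_subset.zip") as zf:
    for name in zf.namelist():
        if name.endswith("/"):
            continue
        header = {}
        with zf.open(name) as f:
            for line in io.TextIOWrapper(f, errors="replace"):
                if "=" not in line:
                    break  # past the header block
                key, value = line.split("=", 1)
                header[key.strip()] = value.strip()
        print(name, header.get("Country"),
              header.get("Lat"), header.get("Long"))
```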
This is another total lie. All government agencies, including the Met Office at Bracknell, had huge ICL mainframes, some more than one, and each of these came with a large room holding perhaps 40 tape cabinets, another room full of tapes, and perhaps 20 or so hard disk drives. I do not remember the capacity, but as the taxpayer was picking up the tab, it was effectively endless. This lot is likely to have been backed up onto modern media as technology improved. Yes, the likelihood is that some of this was deleted, but more likely it is hidden in the mountain of data they hold on various storage methods. They are career liars.
Well, I checked. The UEA does have a history department, as well as an American Studies department with more than a few Ph.D.s wandering around. Why are they not consulting them for guidance? They might tell them that, historically speaking, the lie is often not nearly as damning as the subsequent cover-up, and that this is precisely the kind of thing that justifies putting the “gate” in Climategate.
Ground temp data from the UEA/CRU are worthless, so what are we left with?
The NOAA data sets aren’t even cited or used for conclusions without James Hansen’s “value-added” laundry services.
The NASA GISS data are now being checked against the raw data (that NASA has now attempted to hide), and what we find are station data that are so corrected, adjusted, homogenized and otherwise morphed into little hockey sticks that they bear VERY little resemblance to the raw ingredients that they were all cooked from.
Which leaves us with what… the satellite records that they couldn’t get their hands on, which have already been thoroughly checked and reviewed (and even objected to and subsequently corrected), which haven’t been tampered with, and which show nothing at all unusual?
How about those thousands of “robust”, “overwhelming” regional studies around the world that relied on some combination of the IPCC, NASA GISS and UEA data, studies that cite every catastrophic-warming effect under the sun but aren’t even qualified to attribute the changes they observe to human atmospheric CO2?
Likewise, we have a “putative” consensus that says, in effect, “The evidence may have been fabricated, but that doesn’t make the story untrue.”
Ask CBS or Dan Rather how well that line sells.
With regard to the physical evidence for recent Global warming, I consider the NOAA records of the Barrow melting date on Alaska’s North Slope to be convincing. See Fig. 3 in:
http://www.esrl.noaa.gov/gmd/grad/snomelt.html
I have visited Barrow and Sagwon and have experienced the fairly rapid snowmelt on the slope.
“With regard to the physical evidence for recent Global warming, I consider the NOAA records of the Barrow melting date on Alaska’s North Slope to be convincing.”
You do realize that you’ve cited something regional as evidence for something global, don’t you? That’s like citing local weather changes (Kennedy, I’m looking in your Virginian direction), mistaking weather for climate. Would you be equally convinced if you saw record freezing and ice accumulation and growth elsewhere? There’s plenty of that to go around, and not just in one hemisphere. If I showed only that, I could make a very strong case for catastrophic tipping into the next ice age.
North Atlantic temperature anomaly decreases with the latitude.
All locations are islands far from either urban areas or any large land mass.
http://www.vukcevic.talktalk.net/NA-TA.gif
Happy C’MASS and NEW YEAR to all of you
This, while seemingly nice, means nothing. Releasing homogenized temps and some code that corresponds to their desired output is garbage.
What’s needed for a true picture is the raw data and the code they use for the hocus-pocus adjustments.
This is nothing more than smoke and mirrors.
That’s exactly right. For just two out of literally hundreds of examples, here’s what you get with NASA GISS’ wonderful “USHCN adjusted” and then subsequently “homogenized” station plots:
http://examples.com/giss/santarosa_phased.gif
http://examples.com/giss/marysville_phased.gif
NASA took down its option to view the raw data that you see in the initial frame, leaving only their “adjusted” and then “homogenized” data.
That’s just two examples of the garbage, with temp data that we can look at and verify for ourselves. It’s bad enough when you can plainly see with a blink comparator what NASA is doing. With the CRU, they’re claiming that they can’t even give anyone the option to check.
Without the raw station plots, the data MUST be assumed to be as completely worthless as the NASA GISS data.
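For readers who haven’t used one, a blink comparator just alternates two images of the same thing so the differences jump out. A minimal sketch with matplotlib, where the two plot files are placeholders:

```python
import matplotlib.animation as animation
import matplotlib.image as mpimg
import matplotlib.pyplot as plt

# Sketch of a blink comparator: flip between two versions of the same
# station plot so the adjustments jump out. File names are placeholders.
frames = [mpimg.imread("raw_plot.png"), mpimg.imread("adjusted_plot.png")]

fig, ax = plt.subplots()
image = ax.imshow(frames[0])
ax.set_axis_off()

def blink(i):
    image.set_data(frames[i % 2])
    return (image,)

ani = animation.FuncAnimation(fig, blink, interval=500)  # flip every 0.5 s
plt.show()
```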
Brass Monkey, you seem not to have read the report you are referring to. I did. You conclude that the 10-day overall increase in early snowmelt is evidence of global warming. I did not see that conclusion in the report.
Steven Douglas and Pamela Gray: when I spent the summer of 1964 on the Slope, we could still land aircraft on ice-bound lakes in early June, and the snowmelt happened about mid-June, except in gullies and on north-facing hill slopes.
NOAA’s data seem to show a consistent trend towards earlier melt; although I haven’t been back to check, this seems significant. I don’t think it can be due to more open water along the north coast, as it is the rivers of melt water that start the coastal sea-ice melt, not vice versa.
The area is underlain by deep permafrost, so unlike the Arctic Ocean ice it cannot be affected by warm water melting from below. (Except in a few places where there are hot springs at the mountain front.)
My guess is that the earlier snow melt is caused by solar radiation and has nothing to do with increasing CO2. The very recent possible reversal could suggest that we are entering a renewed cooling phase.
Perhaps the area could be regarded as a ‘coal-mine canary’ candidate to provide early warning of Northern Hemisphere temperature trends.