Publicly available data being purged at UK's Hadley Climate Centre

While I’ve been getting lots of attention for the “take down” of a single file that infringed on my copyright, another, much broader and more serious event is unfolding in the UK at the Hadley Climate Centre.

It appears that the “mole” has caused a Centre-wide panic and they are purging publicly available climate data.

If their climate science is so solid, so unassailable, why would they need to do this? Why hide the climate data gathered from public domain sources worldwide such as NOAA and NCDC? Steve McIntyre tells the story and wonders also. I’m sure all of those who complained in my case will do the same about Hadley, since it will affect the climate community worldwide. I suppose we now have a new term: “Climate Data Deniers” – Anthony

“Unprecedented” Data Purge At CRU

by Steve McIntyre on July 31st, 2009

On July 31, 2009, the purge of public data at CRU reached levels “unprecedented” in its recorded history. Climate Audit reader Super-Grover said that the data purge was “worse” than we expected.

On Monday, July 27, 2009, as reported in a prior thread, CRU deleted three files pertaining to station data from their public directory ftp.cru.uea.ac.uk/.

The next day, on July 28, Phil Jones deleted data from his public file – see the screenshot with timestamp in the post here, leaving online a variety of files from the 1990s, as shown in the following screenshot taken on July 28, 2009.

The following day, the following listing of station data available since 1996 (discussed in my post CRU Then and Now) was deleted from public access: ftp.cru.uea.ac.uk/projects/advance10k/cruwlda2.zip, though other data in the file remained.

This morning, everything in Dr Phil’s directory had been removed.

This is part of a broader lockdown at CRU. Ian Harris, Dave Lister, Kate Willett, Tim Osborn, Dimitrios, Clive Wilkinson and Colin Harpham all altered their FTP directories this morning. Only one directory (Tim Osborn -see below) has added material.

Revisiting the Advance 10K webpage this morning, all Advance 10K data was deleted from their FTP site. None of the Advance 10K data links at www.cru.uea.ac.uk/advance10k/climdata.htm work any more.

If you go to the directory page ftp.cru.uea.ac.uk/projects which formerly hosted ftp.cru.uea.ac.uk/projects/advance10k directory, it now contains only two directories between Sept 1999 and the present, both dated 8/1/2008, but containing data from 2001.

On July 31, 2009 at 10:41 am, Tim Osborn published a webpage entitled “controversy.htm”. It is located in a folder entitled www.cru.uea.ac.uk/people/timosborn/censored/ and the webpage www.cru.uea.ac.uk/people/timosborn/censored/controversy.htm itself is of course censored.

I presume that the data has not been totally destroyed, only that, after many years of public availability, it has been put under lock and key. It’s as though CRU is having a collective temper tantrum.

M White
August 2, 2009 3:03 am

Philip_B (00:26:03) :
“Many public bodies make money from selling their data, which requires that it be proprietary, and I believe the UK’s Met office is one of them.”
I am a British taxpayer. That makes me, along with all the other British taxpayers, a proprietor of the Met Office (and the data). I can’t see any national security implications in this data being in the public domain, so what’s the problem?

fredo
August 2, 2009 3:13 am

Cleaning up FTP servers every once in a while is standard practice in every company and institute in the world.
The only thing this cleaning is good for is to demonstrate that climate skeptics indeed are not so much skeptical towards the science, but rather have a paranoid vision towards the world around them.
Which of course has been known for ages

Wade
August 2, 2009 4:43 am

Philp: Wade, this may surprise you, but the US Constitution has no legal standing in the UK.

I know that. But this is an American website, and they are being accused by Americans of actions for which they have no proof, just coincidences. They should know better. It was also to show how our rights are slowly being justified away. “Oh … it looks bad, therefore they are GUILTY!” Prove it! We ask the global warming believers to offer proof, but when we have to do it ourselves, we justify it away. Skeptics are always called apologists. Our rights should be upheld. The right to be innocent until PROVEN guilty is a fundamental human right. Don’t let the eco-Communists win by taking it away.
It is an easy trap to fall in to.

RunFromMadness
August 2, 2009 5:12 am

Found something interesting on CRU’s FTP server
First Dimitrios…
http://img269.imageshack.us/img269/5906/dimitrios.png
…who was working for a guy called Le Chiffre…
http://img510.imageshack.us/img510/8427/007dimitrios.jpg
…who was working for a secret organisation called Quantum which was run by an environmental group called Greene Planet which belonged to the Al Gore-like Dominic Greene
http://img197.imageshack.us/img197/5314/greene.jpg
Then another intriguing find is project SPECTRE…
http://img9.imageshack.us/img9/7429/spectre.png
…which has been known to stand for SPecial Executive for Counter-intelligence, Terrorism, Revenge and Extortion, which is what Greenpeace gets up to quite often. SPECTRE used to be led by this animal loving, anti-development, anti-industry tinpot…
http://img6.imageshack.us/img6/3357/blofeld.jpg
So there you have it folks!

Tim McHenry
August 2, 2009 5:47 am

Re: Philip_B (00:26:03):
Certainly, I was not arguing the present state of things but what they SHOULD be doing. All I hear is “confidentiality agreements” (though they are never explained – what agreements??) or that the government or institutions may indeed WANT to make money off of this data. In the latter case I am arguing that they SHOULD NOT be publicly funded!

wws
August 2, 2009 5:48 am

Oh yes, Fredo, Hadley makes a huge fuss about refusing FOI requests, refusing to hand over public data to anyone who might “just try to find something wrong with it”, then we find out that some of the “top secret” (i.e., public) data has sneaked out anyway and – OF COURSE! – it’s just a little cleaning, standard practice in every company in the world!
The people doing this really aren’t worth having their boots licked, Fredo. Think about that.

Mike Monce
August 2, 2009 6:03 am

“Dude, in science the hypothesis is always assumed guilty and does not have the right to remain silent or not have its data or methods searched.”
LOL!! Wes George: can I use your line in all my intro physics classes? I don’t think I’ve ever seen such a succinct, and humorous statement of the “scientific method”.

bill
August 2, 2009 6:10 am

It’s all about when the FTP site was created / laziness / students.
This FTP site was used by many people: students/researchers/staff/CRU.
It was used to transfer data from person to person, to home, etc.
It was used to receive/transmit data from/to other external sources.
The structure and directory names are standard on many setups and are not particularly meaningful.
When it was set up, the internet was a much more technical (and friendly) place where there were not many hackers trying to destroy your system. Security was not uppermost in people’s minds.
Students’ directories would be created when they arrive and should be archived and removed when they leave. Students are students and will store garbage in silly directory names if they are allowed!
The University of East Anglia, of which CRU is a part, is like all British universities – underfunded.
IT in the early days would have been handled by some student who knew about “these things”, not by a specialised department.
For nearly 30 years the FTP setup was adequate.
McIntyre with a bunch of followers then comes along and starts roaming the inadequately protected servers.
The UEA IT dept gets alerted by the activity of McIntyre and his cohorts and asks that what should be used for temporary files gets cleaned, and then only used for its designated purpose.
Unnecessary files get deleted – the current situation.
I would imagine McIntyre and his gang have scraped the data from each and every available directory, so there is no need for CRU to retain it!!
As fredo (03:13:37) says, to think directory cleaning is meant to hide data is paranoid.
In the 1980s it is quite possible that verbal gentlemen’s agreements were made that prevented passing on of commercial data:
Conditions of a Verbal Agreement
Under law there are two basic terms that constitute a binding agreement. The verbal agreement will be binding if there was an agreement on the services to be performed and an agreement was reached on remuneration for this service. This agreement can be reached by a verbal exchange in person, via telephone or via an email.
http://www.contractsandagreements.co.uk/law-and-verbal-agreements.html
Remember there was no internet as we know it. There were no PDF files, and operating system, data, and office applications all fitted on one floppy disk. Agreements, if not verbal, would have been on paper – not as easy to search for as on a hard disk!
The CRU has been around since the 70’s under different leadership:
“Hubert Lamb retired as Director in 1978. He was succeeded by Tom Wigley (to 1993), Trevor Davies (1993-1998), Jean Palutikof and Phil Jones (jointly from 1998 to 2004) and Phil Jones (to the present).”
http://www.cru.uea.ac.uk/cru/about/history/
The FTP site was created before security was an issue – few password-protected directories.
Laziness – users have not removed files transferred to them after they have retrieved them.
Students are notoriously bad at housekeeping and security, and make invalid use of storage.
Why make a conspiracy out of it?

Bill Hunter
August 2, 2009 6:30 am

Wade (16:52:02) :
“I think it is a little pre-mature to assume this is because “they” have ulterior motives.”
“I think of this like the right to remain silent and right to not allow search without a warrant. Some may say that if you have nothing to hide then you should allow your property to be freely searched by law enforcement. The US government was founded on preventing such broad assumptions.”
Nothing premature whatsoever. By definition their motives are ulterior as they have not stated nor revealed them.
As far as freedom from unreasonable search and seizure goes there are two problems.
One is it applies to individuals not government entities. There is an accountability issue here. Governments tend to hide science to achieve their political ambitions.
Second, we are not talking about law enforcement here. We are talking about a public wanting of supporting information for the science that they are going to be taxed over and their freedoms limited.

An Inquirer
August 2, 2009 7:22 am

fredo:
Periodic cleaning up of FTP servers, etc. is indeed often a standard practice, but it is not standard to start such clean-ups when your data is under a data request. In fact, Arthur Andersen was convicted of a felony when it followed its pre-established data-retention policy. While I do not believe that Jones and Hadley are subject to any investigation that would make their clean-ups a potential crime, there is still no evidence or suggestion that they are following a standard practice.
Indeed the paranoid behavior belongs not to the skeptics but to many leading Global Warming Pessimists who vehemently object to release of data, algorithms, and documentation that would support their conclusions.
I used to be a Global Warming Pessimist, but the refusal for proponents of CO2-based GW to be open about their studies was one of the reasons that I have departed from their camp.

pyromancer76
August 2, 2009 7:22 am

Anthony, you and Steve McIntyre are performing a great service to all science and scientists worthy of the name. All those apologists for the CRU purge of public data (with no explanation as to when it will be restored) will be quite chagrined when they find themselves impoverished by taxes and ever expanding fees from “cap-and-trade-and-sequestering” of the relatively puny human contribution to CO2 in the atmosphere. (Google Large Igneous Provinces and see their massive spouting of a variety of atmospheric gases.)
Raw data files gathered by tax-payer money must remain uncompromised and accessible to those who paid for it. All data necessary for scientific endeavors must be available to all scientists.
Christopher Booker has it about right: the 1990 head of the Met Office, Sir John Houghton, set up the Hadley Centre in Exeter; it was linked to the Climatic Research Unit (CRU) at the University of East Anglia; the result would be “a record of global temperatures based on surface weather stations across the world — a data set known as HadCrut”; Sir John and his CRU ally Professor Phil Jones programmed the HadCrut computer models to conclude ANTHROPOGENIC CO2 GLOBAL WARMING; “Sir John played a central role in running the IPCC, selecting many of the contributors to its reports”; he and Prof Jones were prominent champions of the IPCC’s notorious “hockey stick” graph. The stench arising from these malicious-to-science activities is overwhelming.
It is time to throw the blaggards out! Or, “Arrr, keelhaul the blaggards!” All “scientists” who are practicing the black arts of denying access to data, deleting data, or cooking the data should begin to wake up to their dark future. I hope Great Britain’s elections next year will do just that to prevaricating politicians. While we are at it, U.S. voters should end the far-leftist experiment that began here in January of this year. All leftists are authoritarians — there will be no science worthy of the name under their regime.

savethesharks
August 2, 2009 7:23 am

“Tim McHenry (05:47:50) :Certainly, I was not arguing the present state of things but what they SHOULD be doing. All I hear is “confidentiality agreements” (though they are never explained – what agreements??) or that the government or institutions may indeed WANT to make money off of this data. In the latter case I am arguing that they SHOULD NOT be publicly funded!”
Could not agree more.
Furthermore… I don’t care if it’s the UK… USA… wherever… it is a really bizarre, unacceptable curiosity in today’s world… namely:
Taxpayer-funded “public servants” who have unmerited and ill-gotten “private-sector egos”!!
Matters of national security notwithstanding, if you are funded by the public, then you belong to the public, period!
To put it another way… [and listen to this, all you big egos in public places… I won’t mention any names, JAMES HANSEN]… to put it another way:
If the public underwrites your research, or cuts your paycheck, then you are, in essence, the public’s bitch!
CHRIS
Norfolk, VA, USA

Robert Wood
August 2, 2009 7:37 am

Bill, I’m not claiming conspiracy or anything at all.
It’s simply amusing to watch HADCRU cover their arses in a hurry.

bill
August 2, 2009 8:06 am

pyromancer76 (07:22:10) :Raw data files gathered by tax-payer money must remain uncompromised and accessible to those who paid for it.
Was it paid for? data was “obtained” from institutions who would nomally charge. Should this be given freely over the internet when you do not own it?

rbateman
August 2, 2009 8:21 am

The whole thing smells of tape erasing/file shredding.
I am not a crook.
Fret not, the smaller fish will get thrown under the bus.
The smarter agencies who realized they were being used distanced themselves.
Next up: Who knew what/when and internal fingerpointing as things go very badly for the hucksters.
When natural phenomena turn hard against the agenda, the handwriting is on the wall as to the investigation.

ex-believer
August 2, 2009 8:24 am

Come on fredo, you have to admit the timing on this file deletion thing smells a little, I don’t know, fishy?
SMMOOOOOCHH!! I know it was you HADCRU, you broke my heart. You broke my heart…

Reed Coray
August 2, 2009 9:05 am

bill (06:10:41) wrote: “Why make a conspiracy out of it?”
Answer: The timing.

Kevin Kilty
August 2, 2009 9:26 am

John Edmondson (01:29:15) :
You wrote…
“I later asked if the model could be run from 1600 AD , but all I got back was some scientific paper sidestepping the question.
Can anybody answer my question as to whether running the model backwards in time or running from 1600 AD is a valid method of testing it’s accuracy?”
It appears that no one has answered your inquiry, so I shall try to do so.
First the issue of running forward from 1600 AD: running from 1600 AD might be a valid method of testing the accuracy (reliability might be a better term) of the simulation program, except that one needs much better information regarding initial conditions circa 1600 AD than one really has. In other words, if the modeling program results do not match the reality of the past 409 years, is the departure due to algorithm problems, or due to incorrect initial conditions?
The question about running in reverse is more complex: I presume you mean run in reverse and duplicate climate that we have already observed, and in doing so validate the algorithm or code in the same manner as running forward from 1600 AD? In the simplest sort of idea one might think to just input a negative delta time step with current conditions and let the program run. I doubt this could even work in practice, because the program was never intended to run thusly, and so quantities that in reverse should have a negative increment may have a positive one just the same.
Moreover, even if the program could run in such a manner, the idea is just plainly wrong in theory. If any portion of the simulation involves dissipation, then information is lost in the forward time step, and trying to simply reverse things will lead to completely wrong results. Turbulence, mixing, diffusion, and so forth are all dissipative processes that destroy information in the forward direction. Once information is irreversibly gone, trying to reverse the process and run backward in time generally results in oscillation. To suppress the oscillation one reduces resolution through smoothing, because the oscillation appears first at small spatial scales. Eventually smoothing eliminates all resolution completely.
In many disciplines in geophysics we try to go backward against irreversible processes as a matter of data analysis (analytic continuation of potential fields in space, time-reversing of acoustic fields in seismology, time-reversed heat conduction to retrieve past climate, which was really a worthless endeavor, and so forth). I have been associated with all these ideas in one way or another. The results? Resolution is often poor, results depend very greatly on noise in the data and are suspect as a result, results are ambiguous, and so forth.
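Kilty’s point about dissipation destroying information can be seen in a few lines of code. The sketch below is my illustration, not from the comment: it steps a simple one-dimensional explicit diffusion scheme forward in time, then tries to step the diffused field backward after adding a tiny amount of noise. Because reversed diffusion amplifies the smallest resolved scales fastest, the “recovered” field blows up.

```python
import numpy as np

def diffusion_step(u, alpha, reverse=False):
    """One explicit finite-difference step of du/dt = D * d2u/dx2 on a
    periodic grid; alpha plays the role of the diffusion number D*dt/dx^2.
    reverse=True flips the sign of the time step, i.e. tries to run
    the dissipative process backward in time."""
    lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)  # periodic Laplacian
    step = -alpha if reverse else alpha
    return u + step * lap

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u0 = np.sin(x)  # smooth initial state

# Forward in time: diffusion damps every wavelength, field stays bounded.
fwd = u0.copy()
for _ in range(200):
    fwd = diffusion_step(fwd, alpha=0.1)

# Backward in time: start from the diffused state plus tiny "measurement"
# noise and try to recover u0. High-frequency noise grows fastest, and
# after enough steps it swamps the signal entirely.
bwd = fwd + 1e-10 * rng.standard_normal(x.size)
for _ in range(200):
    bwd = diffusion_step(bwd, alpha=0.1, reverse=True)

print("forward  max |u|:", np.max(np.abs(fwd)))  # bounded by the initial field
print("backward max |u|:", np.max(np.abs(bwd)))  # grows by many orders of magnitude
```

The forward scheme is stable because its per-step amplification factor is below one for every mode; flipping the sign of the step makes the same factor exceed one, worst at the grid scale, which is exactly the small-scale oscillation the comment describes.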

Jim
August 2, 2009 9:33 am

*******************
bill (08:06:46) :
pyromancer76 (07:22:10) :Raw data files gathered by tax-payer money must remain uncompromised and accessible to those who paid for it.
Was it paid for? data was “obtained” from institutions who would nomally charge. Should this be given freely over the internet when you do not own it?
**************************
I think the question of who owns the data and computer code is a red herring. The real question here is whether such data should be used for scientific works. I believe the answer to that is a resounding NO!! No public scientific journal should publish works that use secret data and methods, digital or analog. Anyone should be able to access the data and methods so that they can be verified or refuted. So OK, let it remain a secret, but don’t allow it to be used for scientific endeavors or allow those works to be published as science. It isn’t science if it can’t be verified.

pyromancer76
August 2, 2009 9:33 am

Bill, this data is for a public record of global temperatures which will be (is being) used to make public policy. No secrecy, ever. If necessary, pay for the data. And I can imagine many freeby quid-pro-quos that might be arranged in order to acquire data. Or, you scratch my back and I’ll scratch yours. Public data. No deals.
Bill, this is about science, not “ownership”.

Pamela Gray
August 2, 2009 9:36 am

I am just thinking out loud here: If the raw data was subjected to a proprietary manipulation/adjustment/fudge factor and then tabulated or listed, it is no longer raw data and thus WOULD have copyright protection. I have noticed that each private/public group that gives us graphed temp data has their own version of adjustments. That process would result in copyrighted data files. What would then be done under normal published peer reviewed practice is to write an article describing the manipulation/model/algorithm so that others can critique and discuss its merits and application on the raw data.
To that end, in order to free up space for scientific discussion instead of all this feather ruffling, the publicly funded entity that collects temperature data from the sensors should have this raw, unadjusted temperature data available to the public in two forms: raw actual and raw anomaly. Then each of us can have a go at it with our own algorithm adjustment as we see fit and post it as copyrighted material.
To that end, I think we should be knocking at the door of the supplier of the raw data, not an end user of it.
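The “raw actual vs. raw anomaly” distinction above is easy to make concrete. The sketch below is a hypothetical illustration with synthetic numbers, not CRU’s actual procedure: an anomaly series is simply the raw series minus a per-calendar-month baseline climatology, so it carries no proprietary adjustment at all.

```python
import numpy as np

# Hypothetical station series: 30 years of monthly mean temperatures
# (deg C) with a seasonal cycle plus weather noise. In practice these
# values would come from the raw station files.
rng = np.random.default_rng(1)
years, months = 30, 12
seasonal = 10 + 8 * np.sin(2 * np.pi * np.arange(months) / months)
raw = seasonal + rng.normal(0, 1.5, size=(years, months))  # "raw actual"

# "Raw anomaly": subtract each calendar month's long-term mean. Using the
# full record as the baseline period is one common choice among several.
baseline = raw.mean(axis=0)   # per-month climatology, shape (12,)
anomaly = raw - baseline      # departures from that climatology

print(anomaly.shape)          # same shape as the raw series
```

By construction each month’s anomalies average to zero over the baseline period, which removes the seasonal cycle without applying any station adjustment, so the result is still “raw” in the sense Pamela describes.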

red432
August 2, 2009 9:46 am

As much as I’d like to see some of these guys paraded down the street in dunce caps like in the Cultural Revolution, I think this is part of a routine clean-up that may be highly commendable. Distributing undocumented, half-baked data is probably not a good idea. They should publish all their data from public sources, together with documentation about where the data came from and how it was derived.

PM
August 2, 2009 9:46 am

In my opinion, when someone or some institution wants to change our lives, our fiscal taxation and many more impositions on us, and uses data to impose those changes, they should give us all the data, freely and openly, to confirm their studies or their propaganda. Otherwise it is dictatorial imposition, as in any communist system.
We should not have to pay for that data, because it is used to impose their mantra on us, and we need to check their claims. Otherwise it is a dictatorship.
They do not have the right to impose their policies on us while using that data and hiding it from us, when we need to check the reliability of their claims.
It’s not a question of copyrights. It’s a question of Democracy, Freedom and basic Human Rights.
In my opinion, we’re suffering the abuses of a new modern totalitarianism. We’re living in Orwell’s times. This is the new communism or new national socialism. This is western civilization in decadence.
This is a brave new world: a world where the State has the power, the means and the desire to control us as they want, to enslave us, as they conspire to treat us like animals.
Sorry for my bad English.

MikeU
August 2, 2009 9:47 am

The most likely explanation for this is simply that they were negligent with their data, never intending for it to be public. None of their links pointed into those directories. They did have reason to want to keep it private, since Phil Jones apparently makes everyone he gives the data to promise that they won’t give it to anyone else. I agree that’s bogus in science that’s publicly funded and has no compelling reason (such as classified research) to keep it private. However, those are their current ground rules.
Once it became clear that this gargantuan “hole” had been found, they undoubtedly panicked, moving the data somewhere else until they can restrict access to it the way they want to. Without such steps, people like Phil Jones would stop providing them with data (or worse, demand that the data they already had be deleted, not just moved to a more secure area).
Long-term, the proper way to deal with this is of course to change the rules, so that such data cannot be hidden from fellow scientists and academics. Do that, and the gate-keepers of such data will either relent and abide by the rules, or else become irrelevant over time, replaced by researchers who will.

Steven G
August 2, 2009 9:47 am

CRU has just shot themselves in the foot. By making their data so secretive and unverifiable, they’ll also be taking themselves out of the loop when it comes to credible research. Researchers (even those with confidentiality agreements providing access to CRU data) will be forced to use other data sources, because no sane person in the scientific community should believe research conclusions based on data that is unverifiable. Consequently, if I were a climate researcher, I wouldn’t waste my time using this data. I’d use other sources that my peers can corroborate.