CRU Emails "may" be open to interpretation, but commented code by the programmer tells the real story

When the CRU emails first made it into news stories, there was an immediate reaction from the head of CRU, Dr. Phil Jones, over this passage in an email:

From a yahoo.com news story:

In one leaked e-mail, the research center’s director, Phil Jones, writes to colleagues about graphs showing climate statistics over the last millennium. He alludes to a technique used by a fellow scientist to “hide the decline” in recent global temperatures. Some evidence appears to show a halt in a rise of global temperatures from about 1960, but is contradicted by other evidence which appears to show a rise in temperatures is continuing.

Jones wrote that, in compiling new data, he had “just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (i.e., from 1981 onwards) and from 1961 for Keith’s to hide the decline,” according to a leaked e-mail, which the author confirmed was genuine.

Dr. Jones responded:

However, Jones denied manipulating evidence and insisted his comment had been taken out of context. “The word ‘trick’ was used here colloquially, as in a clever thing to do. It is ludicrous to suggest that it refers to anything untoward,” he said in a statement Saturday.

OK, fine. But how, Dr. Jones, do you explain this?

There is also a file of code in the collection of emails and documents from CRU. A commenter named Neal on Climate Audit writes:

People are talking about the emails being smoking guns, but I find the remarks in the code, and the code itself, more of a smoking gun. The code is so hacked around to give predetermined results that it shows the bias of the coder. In other words: make the code ignore inconvenient data to show what I want it to show. The code, after a quick scan, is quite a mess. Anyone with any pride would be too ashamed to let it out for public viewing. As examples [of] bias, take a look at the following remarks from the MANN code files:

Here’s the code with the comments left by the programmer:

function mkp2correlation,indts,depts,remts,t,filter=filter,refperiod=refperiod,$
  datathresh=datathresh
;
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
;

pro maps12,yrstart,doinfill=doinfill
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;

And later, the same comment appears again in another routine:

;
; Plots (1 at a time) yearly maps of calibrated (PCR-infilled or not) MXD
; reconstructions of growing season temperatures. Uses "corrected" MXD - but
; shouldn't usually plot past 1960 because these will be artificially adjusted
; to look closer to the real temperatures.
;
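As an aside for readers who don't write IDL: the workaround described in the mkp2correlation header is a standard statistical move (partial correlation via regression residuals), and is unremarkable on its own. Here is a minimal sketch of the technique; the function and variable names are my own illustrative choices, not CRU's actual code:

; Hedged sketch of the workaround the mkp2correlation header describes:
; remove the influence of the confounding series from both variables by
; multiple linear regression, then correlate the residuals.
; Illustrative only - names and layout are assumptions, not CRU code.
function partial_corr_residuals, x, y, remts
  ; remts is a (nseries, ntime) array of series whose influence is to be
  ; removed, matching the layout REGRESS expects for independent variables
  cx = regress(remts, x, yfit=xfit)     ; regress x on the series to remove
  cy = regress(remts, y, yfit=yfit)     ; same for y
  return, correlate(x - xfit, y - yfit) ; ordinary correlation of residuals
end

The contentious part is not that header; it is the 1960 cutoff described in the two plotting headers.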

 

You can claim an email you wrote years ago was "taken out of context", but a programmer makes notes in the code to document what the code is actually doing at that stage, so that anyone who looks at it later can figure out why this function doesn't plot past 1960. In this case, it is not allowing all of the temperature data to be plotted: growing-season data (the summer months, when new tree rings are formed) past 1960 is thrown out because "these will be artificially adjusted to look closer to the real temperatures", which implies some post-processing routine.
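To make that concrete, the behavior the comment describes amounts to something like the following sketch of a pre-1960 plotting cutoff. This is illustrative only, with assumed variable names, not the actual maps12.pro logic:

; Sketch of a pre-1960 plotting cutoff; variable names are assumptions.
pro plot_mxd_pre1960, year, mxd
  keep = where(year le 1960, nkeep)  ; indices of years up to and including 1960
  if nkeep gt 0 then plot, year[keep], mxd[keep]  ; post-1960 values never drawn
end

Whether the truncation happens at the plot call or upstream of it, the effect is the same: whatever the reconstruction does after 1960 never appears on the chart.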

Spin that; spin it to the moon if you want. I'll believe programmer notes over the word of somebody who stands to gain from suggesting there's nothing "untoward" about it.

Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result.

For more details, see Mike’s Nature Trick

UPDATE: By way of verification….

The source files with the comments that are the topic of this thread are in this folder of the FOI2009.zip file:

/documents/osborn-tree6/mann/oldprog

in the files:

maps12.pro
maps15.pro
maps24.pro

The first two files are dated 1/18/2000, and the maps24 file 11/10/1999, so the timeline fits with Dr. Jones's email mentioning "Mike's Nature trick", which is dated 11/16/1999, six days after the maps24 file.

UPDATE2: Commenter Eric at the Climate Audit Mirror site writes:

================

From documents\harris-tree\recon_esper.pro:

; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
;

Note the wording here “avoid the decline” versus “hide the decline” in the famous email.

===============
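For context, "stop in 1960" in a calibration routine means the regression of the proxy against instrumental temperature is computed only over the pre-1960 overlap. A minimal sketch of what that looks like, again with assumed names rather than the recon_esper.pro source:

; Sketch of calibrating a proxy against temperature over pre-1960 years only.
pro calibrate_pre1960, year, proxy, temp, slope, intercept
  cal = where(year le 1960, ncal)                 ; calibration period ends 1960
  x = reform(proxy[cal], 1, ncal)                 ; REGRESS wants (nterms, npoints)
  slope = regress(x, temp[cal], const=intercept)  ; temp = intercept + slope*proxy
end

Restricting a calibration window can be a defensible choice when the post-1960 divergence is disclosed; the question raised here is how openly that was done.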

I'll give Dr. Jones and CRU the benefit of the doubt; maybe these are not "untoward" issues. But these things scream for rational explanations, and transparency and replicability years ago would have gone a long way toward either correcting problems or assuaging concerns.




480 Comments
stevemcintyre
November 22, 2009 8:22 pm

The quote and credit are due to reader "Neal".

Noelene
November 22, 2009 8:30 pm

Keep it up, I am watching with fascination. You (and the others who love science) must be so angry; they have besmirched the whole field of climate science. Dirty rotten scoundrels. Politicians are carrying on as if this whole issue will go away soon; I hope it doesn't.

Policyguy
November 22, 2009 8:32 pm

The trillions and gazillions of dollars now at risk are the pathetic consequence of a programmer gone berserk. What a travesty. Can it be corrected? Who knows. A call to Science is in order so that reliable individuals can correct this unbalance between political/social myth and reality. The challenge will be how to encourage informed individuals to speak up.

Nick Stokes
November 22, 2009 8:34 pm

I may be dense here, but what’s the issue? The red comment says “don’t plot beyond 1960”, because the results are unreliable. So is there any indication that anyone has plotted beyond 1960? This came up on the Bishop Hill thread, where he drew attention to an email by Tim Osborn where he said that they never plot some treering set beyond 1960 because of a divergence issue. Turns out that that is what Briffa/Osborn say also in Briffa et al 2001. This Briffa/Osborn context may be unrelated, but it seems to me that it may simply just mean what it says. Don’t plot beyond 1960 using this code. And people don’t.

Fred
November 22, 2009 8:35 pm

It's highly unlikely mere facts will be able to stop the AGW religion from advancing from one victory to the next. After all, AGW was never about science; it was about raising taxes and controlling other people's lives.

November 22, 2009 8:36 pm

[snip]
Add that to violating FOIA.
When we’re done, we’ll be able to throw the book at them.

Mann O Mann
November 22, 2009 8:36 pm

But remember – if you question Real Climate Scientists ® then you are a DENIER.
Oooga Boooga!
/sarcasm

Glenn
November 22, 2009 8:38 pm

I’ve been unable to find the definition of “artificial adjustment” in the climatologist’s handbook. It must be called something else. Sure sounds like something else to me. More! More!

Jeff Id
November 22, 2009 8:40 pm

Spin that to the moon.
Looks like a duck, quacks like a duck….smells like a duck’s butt.

November 22, 2009 8:41 pm

If that's the best you can do, then your whole case is in trouble. Data from independent sources often has to be massaged (the programming term is "munged") in order to make independent series compatible with each other.
The most interesting thing about this is the practice of "copy and paste" coding indicated by the duplicated comments, which 'real' programmers don't like but which is an unfortunate necessity when programming for scientific research.

Alvin
November 22, 2009 8:42 pm

Very telling. I just authored a letter to Senator Graham asking if he is keeping up with this new finding. I also insisted that he provide a public apology to the people of South Carolina and to Senator James Inhofe.

BarryW
November 22, 2009 8:43 pm

[sarcasm on]Don’t you understand? There is reality and then there is TRUTH. If reality doesn’t fit the TRUTH then reality must be adjusted. [sarcasm off]

November 22, 2009 8:44 pm

No wonder they refused to release it even under FOI.

Richard Sharpe
November 22, 2009 8:44 pm

When people use cut-n-paste coding, they sometimes even copy the comments and forget to change them.

P Gosselin
November 22, 2009 8:47 pm

WHOA!
I can’t believe it!
One of Germany’s biggest highly respected dailies has it on the front website page.
http://www.welt.de/wissenschaft/article5294872/Die-Tricks-der-Forscher-beim-Klimawandel.html#xmsg_comment

Viktor
November 22, 2009 8:50 pm

Uh, wow. Some fence sitters wanted hard evidence, more than what they perceived to be mere conjecture within the email spool. Well, there you go.
This story has reached another level. Hard evidence of such blatant data manipulation mustn’t be allowed to simply vanish into the news cycle. I hope the few MSM outlets accurately reporting this story pick up on this, because the notes in the code indeed appear to be a smoking gun.

philincalifornia
November 22, 2009 8:51 pm

It may have been posted before (if so, apologies), but I do like this one, to be found in the HARRY_READ_ME.txt file:
“OH F*** THIS. It’s Sunday evening, I’ve worked all weekend, and just when I thought it was done I’m
hitting yet another problem that’s based on the hopeless state of our databases. There is no uniform
data integrity, it’s just a catalogue of issues that continues to grow as they’re found.”
(I confess to manipulation of the f-word myself, but only to comply with WUWT policy).
So Copenhagen Comrades, what’s a trillion dollars or so here and there based on “no uniform data integrity” ??

UKIP
November 22, 2009 8:52 pm

No resignations or sackings yet then?

Hank Hancock
November 22, 2009 8:54 pm

I think it is noteworthy that Steve McIntyre comments on Mann and Briffa truncating their MXD data at 1960.
http://www.climateaudit.org/?p=4221

artwest
November 22, 2009 8:54 pm

A poster called Asimov has quoted extracts from the “HARRY_READ_ME.txt” file – deeply shocking stuff:
Asimov’s very plausible suggestion is that Harry is a programmer trying, and often failing, to make sense of the garbage data which he’s been lumbered with.
Several posts over several pages:
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625&page=13

rechauffementmediatique
November 22, 2009 8:57 pm

Can someone point me to the file this code is from?
I have downloaded the package, and found only one file (FOIA/documents/osborn-tree6/mkp2correlation.pro) that includes the above mentioned function. Lines 1 to 7 are identical, but the rest has nothing to do with the screenshot above.

Leon Brozyna
November 22, 2009 8:58 pm

Emails may be just chit-chat between various parties, but the coding is the recipe for cooking the books. No wonder AGW leaves such a bad taste — it's way too overcooked leftovers.

Roger Sowell
November 22, 2009 8:58 pm

Stokes: the real issue is that temperatures derived from tree rings are known to not match measured temperatures after 1960.
If tree ring-based temperatures are known to be false compared to actual measurements, then how can they be true in earlier decades or centuries?

Michael Jankowski
November 22, 2009 9:01 pm

Nick,
Exactly who is responsible for “artificially adjusted to look closer to the real temperatures” that the code is advising against plotting, and why would they have done such a thing?
I agree that it’s no problem with the code saying not to plot past 1960, but it is certainly a problem that the code says someone has taken liberty with post-1960 data (or the methodology used to process it) for the purpose of making it “look” more like the instrumental record.

D. King
November 22, 2009 9:01 pm

Wow, the coming week will be most interesting.
Good job Anthony, Steve, and all. Thank you.
Time to look at the sea ice satellite AGC, pointing, and receiver gain. I think we may find some missing ice.

Harold Vance
November 22, 2009 9:02 pm

The really sad thing is that the dendros still have no clue why trees make good thermometers in some years but not others. This is a bigger issue, imho.

Gerald Machnee
November 22, 2009 9:08 pm

RE: Nick Stokes (20:34:52) :
**I may be dense here, but what’s the issue? The red comment says “don’t plot beyond 1960″, because the results are unreliable**
Maybe it means “don’t plot beyond 1960 because the results do not show what we want”?????

Jon Adams
November 22, 2009 9:09 pm

The CRU team and maybe all of these "AGW researchers" are clueless… they are in way over their heads and inclined to cheat – wait… it's "defraud" as adults – especially with literally trillions of dollars involved.
How many billions have the US and other countries wasted on this BS…
I want my money back!
One may surmise they did not want any Real Programming Talent aboard who might OUT them? So they tried to fake it themselves…
Anyway… thank God for Anthony… attempting to bring some quality to the data sets involved!

rbateman
November 22, 2009 9:09 pm

philincalifornia (20:51:09) :
Yep, saw that, and that’s not the only place he used it. Leaves little doubt that the data was pre-slaughtered, does it not?
Now, who exactly is HARRY?

Jeff Coatney
November 22, 2009 9:13 pm

Unbelievable!!!!!!!!!!!!!!
Or, perhaps, not so unbelievable.
There are ways to game any system. All it takes is a person or persons clever enough to formulate effective methods of cheating.
In the end, what counts more than anything else is the ability to rely on the word of others. To the extent that people are more or less untrustworthy, the potential for dishonesty rises or falls.
Is it just me, or is the societal willingness to indulge in unethical behaviour presently on a significant and massive upswing? Perhaps a little like our friend, the hockey stick curve.

Eric Barnes
November 22, 2009 9:13 pm
Doug in Seattle
November 22, 2009 9:14 pm

P Gosselin (20:47:32) :
WHOA!
I can’t believe it!
One of Germany’s biggest highly respected dailies has it on the front website page.

And in English here – http://translate.google.com/translate?u=http%3A%2F%2Fwww.welt.de%2Fwissenschaft%2Farticle5294872%2FDie-Tricks-der-Forscher-beim-Klimawandel.html%23xmsg_comment&sl=de&tl=en&hl=en&ie=UTF-8

AlexB
November 22, 2009 9:14 pm

The alarmists have gotten so good at misdirection that they get me every time. Their comments are never on the real issue. When I read Dr. Jones's explanation of the word 'trick' I thought that it was perfectly reasonable, so I decided to lay that comment to rest. I re-read the e-mail later, though, and thought 'hang on!'. Is the word 'hide' also commonly used by scientists to mean something other than its common usage? I can't believe they got me again!

Nick Stokes
November 22, 2009 9:14 pm

Roger Sowell (20:58:33) :
Stokes: the real issue is that temperatures derived from tree rings are known to not match measured temperatures after 1960.
If tree ring-based temperatures are known to be false compared to actual measurements, then how can they be true in earlier decades or centuries?

That’s not a coding issue. What they say in Briffa 2001 for the Siberian trees is:
“The period after 1960 was not used to avoid bias in the regression coefficients that could be generated by an anomalous decline in tree density measurements over recent decades that is not forced by temperature”
The claim seems to be that they have specific information about a post-1960 divergence, and presumably enough pre-1960 instrumental overlap to satisfactorily calibrate. Now I can’t judge the strength of that, but it’s been discussed in the literature for nearly a decade. The answer won’t be found in a comment in the code. The comment merely reflects the limit stated in the published theory.
REPLY: In other work, Briffa allowed 10 Yamal trees, an unacceptably low sample, to stay in. When he says "anomalous decline in tree density measurements over recent decades", why would he accept 10 trees as a sample upon which to conclude the datapoints are OK? As Steve has said many times (and other dendros agree – they say 50 is the minimum sample), Briffa should have truncated that Yamal sample and data. He didn't. So why is it OK to keep a very low sample in one case but truncate in another? Doesn't make any sense. – A

Editor
November 22, 2009 9:18 pm

Looking in the file HARRY_READ_ME.txt there seems to be some suspicious coder comments as well:
“This still meant an awful lot of encounters with naughty Master stations, when really I suspect nobody else gives a hoot about. So with a somewhat cynical shrug, I added the nuclear option – to match every WMO possible, and turn the rest into new stations (er, CLIMAT excepted). In other words, what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad, but I really don’t think people care enough to fix ’em, and it’s the main reason the project is nearly a year late.”
“You can’t imagine what this has cost me – to actually allow the operator to assign false WMO codes!! But what else is there in such situations? Especially when dealing with a ‘Master’ database of dubious provenance (which, er, they all are and always will be).”

Jon Adams
November 22, 2009 9:19 pm

Everyone… This is Not just about Money … it is about Tyranny… the politicians want to control us till we have no soul…
And we have the Nick Stokeses who want to jump on a mention of 1960… temp mismatches… the entire picture is what you need to look at, Nick… study some more and you will begin to see a pattern.

Policyguy
November 22, 2009 9:21 pm

Roger Sowell (20:58:33) :
Stokes: the real issue is that temperatures derived from tree rings are known to not match measured temperatures after 1960.
If tree ring-based temperatures are known to be false compared to actual measurements, then how can they be true in earlier decades or centuries?
Well, perhaps that is a place to start. According to John Daly, we know that trees don't grow on 70% of the earth's surface (oceans), and they don't grow in deserts or at high elevations. On the 15% of the earth's surface where trees do grow, they may also be impacted by lack of light (other trees) or lack of water (drought), to the extent that temperature cannot be isolated as a cause for growth during any period. It is a false measure.
So let's challenge tree data as a temperature surrogate altogether. That will take care of Mr. Mann and other tree persons and throw out the hockey stick. Apparently, according to his now-published email, even Mr. Revkin agrees with that.

November 22, 2009 9:23 pm

I agree (as a software developer, especially) that comments in code are very rarely any kind of 'mistake' as such, although many are out of date, it is true.
Overall – this, the emails, and what I suspect will be gleaned from the data – the whole story is a shambles. There is only one possible thing that can be done to salvage the credibility of anyone in the field who supports AGW:
1. All existing data and conclusions should either be examined for veracity or dismissed.
2. There must be a new, non IPCC and non-UN controlled, globally funded task force to re-examine the whole AGW issue. Existing AGW believers and sceptics should be included, especially those who have shown extraordinary effort and dedication in the field to date.
3. The entire investigation should be transparent to the participants, funders (one assumes governments) and also the public. The Internet is a good medium for such a task, as has been proved.
4. All political involvement must be prevented. That cannot be stressed enough.
5. All commercial involvement should be prevented. Oddly enough, I would support ‘big oil’ etc as it seems they are gearing up very swiftly to get ahead of the game in renewable, as is sensible. I suspect that many would cry foul, however.
6. No taxes or political changes should be introduced that rely on the AGW theory being accurate until it is proved that CO2 increases will cause dangerous changes to the climate.
Just my 2c worth…..

Jesse
November 22, 2009 9:24 pm

Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
REPLY: So what do you do down there in Norman? NSSL? U of OK? You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A

John F. Hultquist
November 22, 2009 9:24 pm

What I interpret this to mean is that the results of their work pre-1960 are as they think they should be. Post-1960, not so much. Post-1960 results have to be adjusted to reflect …. what?
a. a previously non-operating variable, now operating;
b. a previously operating variable that no longer operates;
c. some combination of a & b;
d. a hunch based on years of experience;
e. a revelation from God;
f. other
The code used to ‘artificially adjust’ the results so they will ‘look closer to the real temperatures’ ought to explain which of the above scientific procedures was used. If this code is not in the posted material, I am sure the researchers will happily provide it.
~~~~~~~~~~~~~~~~~~~~
A few months back WUWT had a discussion on programming so, not to go over all that again, I’ll just say many of us (in years gone by) wrote our own code and/or subroutines that were used over and over. I always put a few comments at the top stating a few common things, such as, my name, and the purpose of the code, and the language (FORTRAN 2D, IV, ??). Is there a complete routine in this dump that would identify the programmer?

wes george
November 22, 2009 9:24 pm

Wait and see: the True Believers will soon come out with a "fake but accurate" defense. Remember, the fate of the whole world hangs in the balance, so what do a few lies, a bit of fraudulent science, and some bullying matter, as long as the Green agenda of zero growth, centrally mandated economies, and increasing restriction of individual liberties is moved forward?
The end justifies the means in AGW ethics.

Glenn
November 22, 2009 9:26 pm

Richard Sharpe (20:44:52) :
“When people use cut-n-paste coding, they sometime even copy the comments and forget to change them.”
Oh that’s encouraging, the fate of the world hanging on shade tree coders.

WakeUpMaggy
November 22, 2009 9:28 pm

Maybe one of the programmers was the mole.

Dr A Burns
November 22, 2009 9:29 pm

Another strange happening at Hadley … all the hadcrut3 data for this year, except Jan/Feb, has been deleted.
http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt

John F. Hultquist
November 22, 2009 9:32 pm

P Gosselin (20:47:32) : You likely know this but I had to have it translated, except for the two opening words
Die Tricks der Forscher beim Klimawandel
via Google translate
The tricks of the researchers on climate change

Editor
November 22, 2009 9:34 pm

Also, in the Word file jones-foiathoughts.doc it states:
“Options appear to be:
1. Send them the data
2. Send them a subset removing station data from some of the countries who made us pay in the normals papers of Hulme et al. (1990s) and also any number that David can remember. This should also omit some other countries like (Australia, NZ, Canada, Antarctica). Also could extract some of the sources that Anders added in (31-38 source codes in J&M 2003). Also should remove many of the early stations that we coded up in the 1980s.
3. Send them the raw data as is, by reconstructing it from GHCN. How could this be done? Replace all stations where the WMO ID agrees with what is in GHCN. This would be the raw data, but it would annoy them.”
Number 2 seems to indicate the particular station data that they were trying to hide…

November 22, 2009 9:37 pm

rechauffementmediatique (20:57:34) :
I found these comments in osborn-tree6/mann/oldprog in files like maps12.pro and maps15.pro. These were dated Jan 2000, and the directory name is not encouraging. Seems unlikely that it is currently used code.

Dave Johnson
November 22, 2009 9:44 pm

The Times here in the UK now has the CRU story on one of its major comment pages
http://www.timesonline.co.uk/tol/comment/columnists/guest_contributors/article6927598.ece

Neil O'Rourke
November 22, 2009 9:46 pm


rechauffementmediatique (20:57:34) :
Can someone point me what file is this code from?
I have downloaded the package, and found only one file (FOIA/documents/osborn-tree6/mkp2correlation.pro) that includes the above mentioned function. Lines 1 to 7 are identical, but the rest has nothing to do with the screenshot above

documents\osborn-tree6\summer_modes\maps12.pro

philincalifornia
November 22, 2009 9:53 pm

artwest (20:54:26) :
A poster called Asimov has quoted extracts from the “HARRY_READ_ME.txt” file – deeply shocking stuff:
Asimov’s very plausible suggestion is that Harry is a programmer trying, and often failing, to make sense of the garbage data which he’s been lumbered with.
Several posts over several pages:
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625&page=13
____________________________
I also lifted this from a comment over there:
“Tim Mitchell works at the Climactic Research Unit, UEA, Norwich, and is a member of South Park Evangelical Church.”
South Park ?? You’ve got to be kiddin’ me ??
PS I also tried to check out if he really did spell it “Climactic” but, strangely, the web site appears to be down. Heh heh heh.

Aligner
November 22, 2009 9:57 pm

New article from Christopher Booker in the Daily Mail this morning …
The devastating book which debunks climate change
Front page link too.

Doug
November 22, 2009 10:00 pm

Tongue in cheek:
Dr. Mann should have known when he set out picking cherries that cherry wood is far too brittle to make a hockey stick. Perhaps he should have realized that if he were working with the proper material he might not have made an ash of himself.

Aligner
November 22, 2009 10:03 pm

Another Daily Mail article by Daniel Martin …
How climate-change scientists ‘dodged the sceptics’
Nice image of Jones (TIP: Place your mouse cursor over it!)

Robert Burns
November 22, 2009 10:04 pm

Re Nick Stokes (20:34)
I copied this from “The Air Vent” http://noconsensus.wordpress.com/
Kenneth Fritsch said
November 22, 2009 at 10:30 pm
The scientist is going to concentrate on what is unusual about the proxy, i.e., that the proxy does not respond correctly to temperature after 1960. She would not think of plastering something into that time period to deflect attention. She would want to talk about and let you know what she has found – and attempt to explain it.
The advocate and all his defenders, on the other hand, will want to sell their message and when they are questioned will in bewilderment say ” well of course we would not use “bad” proxy data when we have “good” instrumental data and how dare you think we intended to deceive.

Gene Nemetz
November 22, 2009 10:08 pm

Drudge gets +20 million hits a day, has this in right column :
Hostility among foes…
…Britain’s Climate Research Unit of the University of East Anglia reveal an intellectual circle that appears to feel very much under attack, and eager to punish its enemies….”Kevin and I will keep them out somehow — even if we have to redefine what the peer-review literature is!”…”I will be emailing the journal to tell them I’m having nothing more to do with it until they rid themselves of this troublesome editor,”…

http://www.drudgereport.com/
http://www.washingtonpost.com/wp-dyn/content/article/2009/11/21/AR2009112102186_pf.html

Editor
November 22, 2009 10:13 pm

I started looking at some of the non-email files and quickly found myself in the midst of documents/HARRY_READ_ME.txt. It's the diary of a three-year saga to bash a lot of code into running on a new platform, and of dealings with data files in as bad shape as the code.
This post didn’t exist at the time, plus the file would be more interesting to the denizens of Chiefio, so I posted http://chiefio.wordpress.com/2009/11/21/hadley-hack-and-cru-crud/#comment-1664 over there.
I’ve written similar diaries, but not for a project this long or with code this fouled up. Programmers will appreciate it the most (especially those with long weekend nights under their belt).

Nick Stokes
November 22, 2009 10:18 pm

Anthony,
REPLY: In other work, Briffa allowed 10 Yamal trees…
I didn't want to get into dendro issues here. The thread is about the meaning of a comment in the code, and all I've been saying is that the comment is consistent with what is done in the 2001 Briffa et al paper – namely the use of pre-1960 data. The code also dates from the time when the 2001 paper would have been written.
However, I can’t see the relevance of the Yamal issue. Apart from anything else, the 10 tree period there was 1990 and beyond, while here data is not taken past 1960. And it’s a NH paper – Yamal would be a very small part.

REPLY:
My point is about exclusions based on sample size, the strategy of exclusion is inconsistent. – A

Jeff Alberts
November 22, 2009 10:20 pm

And how about this admission from Ed Cook, a dendro: http://whatcatastrophe.com/drupal/node/47

November 22, 2009 10:23 pm

No single piece of this jigsaw is determinative and it will take some time for all the relevant material from the leaked/hacked files to be collated.
We have all read some of the emails containing suspicious wording but it should not be overlooked that many of them are capable of innocent interpretation. That is why the “big picture” is so important. Notes in coding also only give a snapshot of the thinking of the coder (is that the correct term?) at the time. Isolated soundbites do not make a case, a coherent body of evidence is needed and that can result only from detailed examination of everything in its proper context.
Of particular importance are the following questions: (a) Does a pattern of behaviour appear? (b) Do private thoughts recorded in emails and notes correspond with the writer’s public pronouncements? (c) When Team Member A suggested something ostensibly underhand or improper to Team Member B, did Team Member B expressly agree, expressly dissent or remain silent? (d) Were the financial interests of the writer or his employer taken into account when deciding how to present scientific conclusions?
If we concentrate on those questions (although many more are also relevant) we will be able to see clearly the extent of any jiggery-pokery.
My first impression is that the documents disclosed fall a long way short of being a complete package. Perhaps the leaker/hacker is planning a second instalment to rebut detailed explanations and excuses from members of the Team. Time will tell.
So far, so good, but don’t get overexcited by individual items.

David
November 22, 2009 10:24 pm

Yes, Asimov also had quotes from the code up about how they had lost all the cloud data before 199X (sorry on the X, can't remember). That code is a very interesting piece of work.

November 22, 2009 10:25 pm

The HARRY_READ_ME.txt is a MUST READ to understand the utter chaos of the CRU TS. Here are a few more snippets:
BEGIN FILE =========
So.. should I really go to town (again) and allow the Master database to be ‘fixed’ by this program? Quite honestly I don’t have time – but it just shows the state our data holdings have drifted into. Who added those two series together? When? Why? Untraceable, except anecdotally.
It’s the same story for many other Russian stations, unfortunately – meaning that (probably) there was a full Russian update that did no data integrity checking at all. I just hope it’s restricted to Russia!!
There are, of course, metadata issues too. Take:
……….
..knowing how long it takes to debug this suite – the experiment
endeth here. The option (like all the anomdtb options) is totally
undocumented so we’ll never know what we lost.
22. Right, time to stop pussyfooting around the niceties of Tim’s labyrinthine software suites – let’s have a go at producing CRU TS 3.0! since failing to do that will be the definitive failure of the entire project..
……….
Tried running anomdtb.f90.. failed because it couldn’t find the .dts file! No matter that it doesn’t need it – argh!
Examined existing .dts files.. not sure what they’re for. Headers are identical to the .dtb file, all missing values are retained, all other values are replaced with one of several code numbers, no idea what they mean.
END FILE ==========
It goes on and on and on like this. I’ve never seen so much confusion in any coding project (and I’ve worked on more than a few). From what I’ve seen, I wouldn’t trust them to code a toy app for an iPhone.

Gene Nemetz
November 22, 2009 10:28 pm

Jeff Id (20:40:17) :
….smells like a duck’s butt.
I’ll just take your word for that.
😉

Methow Ken
November 22, 2009 10:28 pm

As someone who spent 25 years for the most part working on one complex custom software system for a certain agency of the Federal Government (which shall remain nameless), all I can say is I fully agree with the above comment by Jeff:
Like most programmers on complex software systems with a couple hundred thousand lines of code, we tried hard to accurately and fully describe the functions of each module in the code comments. Failure to do so leads to chaos, and any programmer worth his or her salt will do the same.
If it looks, walks, flies, swims, and quacks like a duck, theoretically it still COULD be something else. But lacking overwhelming evidence to the contrary, odds are REAL good that it's a duck. The duck quacked in this case, and their goose is cooked.

JNL
November 22, 2009 10:29 pm

I'm a statistical programmer for "BIG PHARMA". For every new drug application, the FDA requires that we give them: raw data, analysis datasets (which are a merging of raw datasets and algorithms applied to raw data), a description of the algorithms and statistical methods, AND all our code. Then the FDA reviewers try to come up with our results. This is done for every drug or medical device before approval.
The societal impact of global warming, er.. climate change, is greater than that of any one drug. If we go through all this independent review for a drug, we should demand a similar review process be applied to AGW claims.
The FDA, acting as a public protector, has to assume we are trying to "cheat" (and that is a reasonable approach). We never throw out data. Granted, our clinical trials are more controlled, but this 'give the reviewers everything' approach should be applied as much as possible in climate research.
But then again, we are the evil, capitalist, profit-seeking, “BIG PHARMA” and the people need to be protected from us.

Gene Nemetz
November 22, 2009 10:36 pm

Jesse (21:24:11) :
You’re right Jesse, there’s probably nothing here for you. Maybe you can spend your time at RealClimate. They’re more your type. We’re too far below you.
Say! Did you happen to see the post by Roy Spencer about elitism? Just wondering.

BCC
November 22, 2009 10:38 pm

Keep looking folks; at the end of the day this is another yawner.
There has to be something beyond more egg on the face for the usual suspects for this to go anywhere: either problems in the HadCRU data (due to errors or malfeasance), or implication of someone else.
You can only dump on Mann, Jones, Briffa et al. for so long. It’s lots of fun, but it’s a distraction from the main event: scientific evidence that the climate just isn’t that sensitive to CO2 levels. That’s what Spencer is after, and that’s what matters. Not what happened in some tree ring in 1962.

November 22, 2009 10:38 pm

Off Topic…
Mainstream press overnight (US time) is now more confident and pushing harder…
UK Daily Mail pushing the FOI avoidance:
http://www.dailymail.co.uk/news/article-1230122/How-climate-change-scientists-dodged-sceptics.html
Canada’s Edmonton Journal also highlights the directions to delete stuff:
http://www.edmontonjournal.com/technology/Good+climate+news+alarmists/2252439/story.html
The Wall Street Journal is also more strident today – it even highlights a link for readers to get the original files!
http://online.wsj.com/article/SB125883405294859215.html?mod=WSJ_hpp_sections_news
Meanwhile….
BBC says:
‘Hopes for the Copenhagen climate summit in December have been boosted after it emerged that more than 60 presidents and prime ministers plan to attend.’
http://news.bbc.co.uk/2/hi/europe/8373551.stm
Will they…NOW??
and poor old Mother Jones (I used to love that mag) is left explaining the meaning of ‘trick’
http://www.motherjones.com/kevin-drum/2009/11/tricked-out

Mike McMillan
November 22, 2009 10:39 pm

Jesse (21:24:11) :
. . . bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.

REPLY:
So what do you do down there in Norman? NSSL? U of OK? You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A
Hi Jesse
Let me take a break from floor waxing to say that I did grad school down in Norman. The professors all said they were trying to build a grad school their football team could be proud of. Well, gotta get back to buffing Wal-mart's wide aisles.
Hi everyone else
I think the real meat here will be coming from the data and code, not the email fluff. Here's hoping M&M find the FOI data they were looking for.

Gene Nemetz
November 22, 2009 10:41 pm

wes george (21:24:44) :
Wait and see, the True Believers will soon come out with a “Fake but accurate” defense.
They feel they can't go down. The Titanic couldn't sink, either.

Greg
November 22, 2009 10:45 pm

Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
Nothing as convincing as some ad hom attacks.
Lots of folks posting here have actual academic qualifications. (I have an MSc in Clinical Research and 20 years of experience working with clinical data sets FWIW.)
The work from The Team is a joke compared to the standards that the FDA and EMEA require. That's the problem with "appeal to authority" as a debating technique: when your "authority" turns out to be a tiny cabal of shoddy scientists, things kind of fall apart.

Gene Nemetz
November 22, 2009 10:48 pm

….meanwhile the earth is cooling…. and the general population is slowly finding out that there was collusion among the top circle of global warming scientists….

Gene Nemetz
November 22, 2009 10:52 pm

Glenn (20:38:14) :
I’ve been unable to find the definition of “artificial adjustment” in the climatologist’s handbook. It must be called something else.
Look under ‘cha-ching’

Gene Nemetz
November 22, 2009 10:55 pm

Doug (21:56:11) :
Dr. Mann should have known when he sat out picking cherries that cherry wood is far too brittle to make a hockey stick.
Here’s a case of being able to tell a lie about a cherry tree.

Dialla
November 22, 2009 10:59 pm

In SOAP-D15-intro-gkss.doc, it states:
“Osborn and Briffa, together with other co-authors (Rutherford et al., 2005), examined the sensitivity of temperature reconstructions to the use of different target variable (annual or seasonal temperature), target domain (hemispheric or extratropical) and reconstruction method. They found that when the differences in target variable and domain are accounted for, the differences in several reconstructions are small and that all reconstructions robustly indicate anomalous warm conditions in the 20th century, with respect to the past millennium.”
Since they are subbing in real temp data after 1960, isn't this fraud?
I mean, this looks like a pretty bald-faced lie to me.

savethesharks
November 22, 2009 10:59 pm

Jesse (21:24:11) :
Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.

IF THERE EVER WAS AN APOLOGY FOR THE CURRENT ELITISM (and it isn’t even a good one) THIS QUOTE…IS IT.
Contrary to Ayn Rand, who was an “elitist” of sorts, but saw the inherent value in anyone with a heart and soul, no matter what their “blue-collar” “job”, this individual pulls the curtain back and exposes himself for who he really is:
The “we scientists” part is extremely pungent given the fact that, empirically speaking, some of the first individuals to cave in the Third Reich….were the scientists.
Explain them apples.
Not sure, but it might have to do with the fact that scientific intelligence is not the end-all, be-all.
Yes, yes….science SHOULD rule (am in total agreement there).
It's just when you turn to these nasty, nasty ad hominem comments about custodians, and Wal-mart greeters [ugh, i hate Wal-mart but don't hate the people who need a job…so they work there], and others that YOU, JESSE, so quickly deride.
But beyond that, Jesse, this is not "making mountains out of ant hills."
It is standing up for the truth, whatever it may be.
And…in passing…those FIRE ANT mounds that have taken over in the southeast US thanks to the positive AMO are, to us, mountains….and if it is a big deal to you, their opportunistic, swarming, life-choking habits will hopefully come to a halt when the AMO turns negative.
So this whole CRU-gate saga is…it IS….a big deal.
It most certainly is a big deal. Precisely why you and your opportunistic mates are so up in arms.
So I say hear hear HAIL HAIL let’s end this cowardistic hi-jack of more than a few opportunistic ideologues who are trying to control the world and lets get on with….
SCIENCE BUSINESS AS USUAL!
AGW is dead. (Thank bl**dy g*d!)
Chris
Norfolk, VA, USA

Gene Nemetz
November 22, 2009 11:01 pm

UKIP (20:52:30) :
No resignations or sackings yet then?
None yet. But maybe some sweaty underarms.

Gene Nemetz
November 22, 2009 11:03 pm

BCC (22:38:30) :
You can only dump on Mann, Jones, Briffa et al. for so long. It’s lots of fun
Nah. Watching them dump on themselves is the fun.

Gene Nemetz
November 22, 2009 11:06 pm

Mike McMillan (22:39:30) :
Well, gotta get back to buffing Wal-mart’s wide aisles.
I got elbow grease. They hiring?

Steve S.
November 22, 2009 11:07 pm

Just wondering: when exactly were the climate models first determined to be "robust", and by whom?
Because that's really eaten at me ever since I first read that claim.
Looking now at all things in totality, that claim of robustness is just about as big a whopper as is possible.
So from whom did it originate?

savethesharks
November 22, 2009 11:13 pm

CORRECTION: The “we scientists” part is extremely pungent given the fact that, empirically speaking, some of the first individuals to cave INTO the Third Reich….were the scientists.
So “into” not “in”.
Sorry about that.
Cheers.
Chris
Norfolk, VA, USA

The Blissful Ignoramus
November 22, 2009 11:22 pm

It seems few are looking at the CRU documents… all interest seems to be on the emails. This is a shame – many of the documents offer just as damning evidence as the emails.
Consider “circ_inconsistency.doc”, attributed to Nathan P. Gillet of CRU, dated 3 May 2005, and titled, “Inconsistency between simulated and observed Northern Hemisphere circulation changes.” It clearly states that the eight (8) “state-of-the-art” coupled climate models relied upon for the IPCC 4th Assessment Report fail to match observed data:

“In recent decades winter sea level pressure has decreased over the Arctic and increased in the Northern Hemisphere subtropics, with an associated strengthening of midlatitude westerly winds1. This trend has previously been shown to be inconsistent with simulated internal climate variability and with the simulated response to greenhouse gases and sulphate aerosol changes2,3, but other climate influences have been suggested as a possible reason for the discrepancy3. Here, for the first time, we compare observed Northern Hemisphere sea level pressure trends with those simulated in response to all the major climate forcings in eight state-of-the-art coupled climate models over the past 50 years, and find that the observed trend is inconsistent both with simulated internal variability and with the simulated response to combined human and natural climate influences.”
[…]
“We compare the observed trend with output from eight coupled climate models prepared for the IPCC Fourth Assessment Report (UKMO-HadCM3, CCSM3, PCM, GFDL-CM2.0, GFDL-CM2.1, MIROC3.2(medres), GISS-EH, and GISS-ER).”
[…]
“Overall we find that the observed Northern Hemisphere circulation trend is inconsistent with simulated internal variability, and that it is also inconsistent with the simulated response to anthropogenic and natural forcing in eight coupled climate models. This is therefore an aspect of large scale climate change for which current climate models are demonstrably inconsistent with observations: If we can understand and correct this bias this will lead to improvements in predictions of future climate change.”

(emphasis added by this blogger)

Gene Nemetz
November 22, 2009 11:22 pm

Eric Barnes (21:13:17) :
Revkin’s headline: ‘Cyber-Terrorism’
He knows how it was done? I wasn’t aware that that was found out yet. What if it was people that work at CRU and all they did was violate office policy?
Let's look at the facts as they are revealed, Mr. Revkin, before we start playing the terrorism card.
Or is your headline different than your opinion??

tim
November 22, 2009 11:24 pm

Speaking as a programmer I cannot see anything wrong with that code. It is clearly documented to stop people making changes that will give incorrect results.
I think people should think these posts through a little more carefully because this post makes this blog look embarrassing.

fFreddy
November 22, 2009 11:32 pm

“Just wondering, when exactly were the climate models first determined to be “robust” and by whom?”
The same people who programmed them, probably.

Ed
November 22, 2009 11:36 pm

I am a software engineer and have also done quite a bit of looking at the code. I would be cautious about jumping to conclusions as to what the comments mean. My cautious interpretation is the comments mean what they say – they stop at 1940 (in some files) or 1960 in others – because there is a divergence problem as indicated in the file named declineseries.pdf – from memory I believe that the tree ring density diverges about 1960 and the ring widths about 1940. There are some indications that the analysis code was used to try different approaches – e.g. 20 year or 40 year or 50 year smoothing, which suggests looking for ways to “spot the trend”. But I would not yet draw that conclusion.
From a s/w engineering perspective, the code is very ugly. Few descriptive comments and some really bad coding practices, like extensively duplicating blocks of code in numerous files with only small changes. How the heck will they accurately incorporate changes across multiple files in the future? Just a real mess.
Other questions, not answered in the released files, are: "How was this code verified and validated?" Does the code do whatever it was intended to do? Does it do so accurately?
The comments in the HARRY_README file are pretty wild, however. So wild that I haven't quite figured out what to think about them just yet. There are other comments in the source files that mention data that was lost (cloud data) and which they recreate or try to re-create based on other data or sensor inputs. The HARRY_README, though, is rather wild.

November 22, 2009 11:43 pm

The warmists' claim of the 20th century as the "warmest ever" continues to amaze me.
1. Viking settlements in Greenland – proof positive it was warmer back then – even though CO2 was fairly low without Exxon and Shell pumping out CO2 from those evil refineries. If CO2 causes warming, then absence of CO2 must cause cooling. Cannot have a valid control system otherwise.
2. Prehistoric man’s body found in melting glacier in the Alps in 1991 – one must wonder how he had the strength (after being shot with an arrow, causing a mortal wound) to dig a hole down through that glacier so he could die underneath it. Or perhaps he died, was covered by snow, and that snow gradually became a glacier? It was warm enough 5,300 years ago in that pass that no glacier existed. Nah, couldn’t be. The warmists told us that it was WAY colder back then…
Maybe it is just me…I have lots of time to think about these things as I run my Wal-Mart floor waxer…actually it’s called a rotary buffer…
REPLY: LOL! – Anthony

Bulldust
November 22, 2009 11:45 pm

Hands up everyone that wants to chip in to buy the CRU crew (that does sound weird, no?) some T-shirts?
http://www.zazzle.com.au/i_reject_your_reality_substitute_my_own_t_shirt-235174364570624210
Shame Mythbusters beat them to that trademark slogan.

Gordon Walker
November 22, 2009 11:47 pm

"Well, perhaps that is a place to start. According to John Daly, we know that trees don't grow on 70% of the earth's surface (oceans), and they don't grow in deserts or at high elevations. On the 15% of the earth's surface where trees do grow, they may also be impacted by lack of light (other trees) or lack of water (drought), to the extent that temperature cannot be isolated as a cause for growth during any period. It is a false measure."
And Yamal is an arctic wasteland where trees grow for about 15% of the year.
15% of 15% is just over 2%!
How much of a representative sample of the earth’s climate is that?

Frank Lansner
November 22, 2009 11:51 pm

See these program headers and code from the hacked material:
http://www.klimadebat.dk/forum/vedhaeftninger/manndeclinecode.jpg
http://www.klimadebat.dk/forum/vedhaeftninger/cutat1960.jpg
Their programs have an ON/OFF switch to cut the decline off temperature series after 1960…! Yes, HARD to explain.

Ron
November 22, 2009 11:53 pm

The worry for the Team, shown up in the emails, is that the late 20th century warming has two differences from the early 20th century warming. Firstly, tree rings responded to the early warming but not to the second (at least in the Northern hemisphere – they did in the Southern).
http://www.climatedata.info/Proxy/Proxy/treerings_introduction.html
…secondly, the late century warming is only over land.
http://www.climatedata.info/Temperature/reconstructions.html
This suggests that the recent warming may not be genuine.

Sheri Jo
November 22, 2009 11:54 pm

As if code comments and colluding to avoid FOI requests and data sharing weren't enough to call their political agenda into question, it appears someone at CRU is on the Earth Government mailing list.
The Earth Government main site is here.
But of course they are just scientists with no agenda.
Found via the Anelegantchaos.org search engine.

The Blissful Ignoramus
November 22, 2009 11:55 pm

Further on the CRU documents – has anyone checked out the tellingly titled “Extreme2100.pdf”? All looks suspiciously like cherry-picked “Yamal ‘extreme’ tree rings”, to a mere ignoramus like myself.

November 22, 2009 11:55 pm

Meanwhile, Seth Borenstein keeps croaking in his swamp, oblivious to all:
Mountain glaciers in Europe, South America, Asia and Africa are shrinking faster than before…. In Greenland and Antarctica, ice sheets have lost trillions of tons of ice…. The world’s oceans have risen by about an inch and a half… Temperatures over the past 12 years are 0.4 of a degree warmer than the dozen years leading up to 1997…. Even the gloomiest climate models back in the 1990s didn’t forecast results quite this bad so fast.
In conclusion, Seth quotes several people with the same environmental disorder:
“The message on the science is that we know a lot more than we did in 1997 and it’s all negative,” said Eileen Claussen, president of the Pew Center on Global Climate Change. “Things are much worse than the models predicted.”
Wow! It's worse than we thought! The sky didn't fall according to the models. Therefore, we need to plug our ears and lie as often and as loud as we can! Maybe, finally, the heavens will hear us and fall?
Editors of the Associated Press! The whole world is laughing at you! People are e-mailing Seth’s articles to each other as funny stories! Boot Borenstein if you want to restore some modicum of credibility.

Frank Lansner
November 22, 2009 11:56 pm

The second program I showed was:
\FOIA\documents\osborn-tree6\pl_decline_nerc

E.M.Smith
Editor
November 22, 2009 11:56 pm

artwest (20:54:26) : A poster called Asimov has quoted extracts from the “HARRY_READ_ME.txt” file – deeply shocking stuff:
Asimov’s very plausible suggestion is that Harry is a programmer trying, and often failing, to make sense of the garbage data which he’s been lumbered with.

A note or two for non-programmers:
A “READ_ME” file is something programmers leave for each other (and sometimes for their future self when they return in a year or two to something they once shoved their brains through, but have now thankfully purged… then got assigned it again…)
The idea behind a README file is to tell you all the things that someone (or maybe a prior you) wasted a few hours or days learning. All the silly, stupid, vapid things; and sometimes the really neat but hard to figure out tricks. And sometimes just the nuts and bolts of “how do I make this go”.
So when a README file says things like “Why is the error message showing my squared number has gone negative?” they may be leaving a note for themselves on a future date, or for a fellow programmer a cube or two over to give them a clue. (“Whack them with the clue stick.”)
When it says things like “Don’t run this past 1960 because someone needs to artificially fit a wooden leg” it means “This is a stupid thing, that we cut it off here, but that bozo over there couldn’t make it work right so he’s just going to fudge the last bit and glue it on.” It’s a note to say: Your part is done, but you must go whack that bozo with the clue stick… and don’t worry that it’s broken after 1960, that’s his monkey to spank.
I really do feel for Harry Readme. I, too, found myself swimming in a ‘bucket of warm spit’ program with GIStemp. My guess is that a Ph.D. somebody with poor programming skills got Harry brought into the group to try and figure out why Mr. Clueless could not make code go. And Harry had to try and give Mr. Clueless clue, but the Clue Stick started to get worn out having been whacked so much…
Messy minds write messy code. Minds subject to deception write deceptive code. Sloppy imprecise minds write sloppy imprecise code. And Harry and I have had to try and mop up the slop.
So you find some arcane, messy, deceptive, sloppy, imprecise, and broken bit of code and FINALLY figure out what it was supposed to do and maybe what it REALLY does, and you put a note in the README file. So no other poor soul will ever need to stick their mind into THAT bit of code where the sun don’t shine ever again… and move on to the next bit of dreck…
So if the HARRY README says something is “corrected” in quotes, he is saying “I don’t buy it, but it’s the word they used. Don’t worry if the code looks like it’s just corrupting the data, Mr. Clueless says it is ‘corrected’ so let it go… I know, and you know, it isn’t, but he’s the guy writing the spec.”
And if he says [ artificially adjusted ] without quotes he is saying “Mr. Clueless is just going to make it up and stick it in here artificially” and you do not need to worry about how, or why, or validity.
And when it says [ look closer to real temperatures ] with no quotes it means exactly that: All that matters is how it looks, so don’t expect any code to try and create this from actual data, it’s a hand job making things look nice AND it has to look closer to real data but does not actually have to BE real data; so don’t waste time looking for data or code that deals with it.
My take on all this is that the Ph.Ds at GISS and UEA / CRU took a FORTRAN class once in the 1980s but are lousy programmers writing crappy code. It’s not their day job, it’s just a “neat trick” they learned to exploit. And their code shows it.
They are like the guy who can build a patio deck, but all the nails are over driven, and a few are bent over. There are hammer dings in the wood where they missed some times. The ends of the wood are not sawn well, the deck was made with green wood with knots in it (that has shrunk as it dried…) and they did not bother to varnish the thing, so it doesn’t weather well. Oh, and they used iron nails instead of stainless, so there are lots of ‘rust and iron stains’. Finally, they just used 4×4 posts for the foundation. Who needs concrete piers anyway…
Then Harry got brought in a few years later to tell them why the floor creaks and a couple of boards have come loose. And Harry has discovered he needs to explain dry rot, varnish, redwood vs pine, stainless nails, …
IMHO, they need to have real carpenters build their deck for them… and they certainly ought not to invite the world economy over for a party on top of their present deck full of dry rot … I don’t think it would hold up, and when it collapses, the “guests” are going to be hurt, and very angry…
And Harry seems to know this too.

Ron
November 22, 2009 11:57 pm

The newspaper translation from English/German/English gave ‘Team’ as ‘Guild’. The http://www.public.iastate.edu site defines Medieval Guilds as:
* exclusive, regimented organizations;
* created in part to preserve the rights and privileges of their members; and
* separate and distinct from the civic governments, but since the functions and purposes of guild and civic government overlapped, it was not always easy to tell them apart, especially since many well-to-do guildsmen were prominent in civic government.
I suggest we refer to “The Guild” rather than “The Team” from now on.

Alec J
November 22, 2009 11:59 pm

This is starting to be noticed at the BBC.
On this morning’s Radio 4 Today program at about 0735am there was a five minute slot with Former Chancellor Nigel Lawson and a Prof Watson. It was reasonably well balanced – surprising given that the presenter James Naughtie usually tries to work Global Warming into everything he can.
Lawson is calling for an enquiry into the University and its handling.
The Today program is the BBC’s prime radio current affairs slot and is “required listening” for most commentators and politicians – it can set the agenda for what is happening in British Politics.
An earlier piece in the program from Roger Harrabin failed to mention it.
It will be very interesting to see what happens from here.

pwl
November 23, 2009 12:05 am

No wonder Mann never wanted his source code or his science open-sourced so that many eyes could gaze upon his obvious attempts at fabrication (allegedly – assuming the files are genuine, which they seem to be).
It’s a new programming language: “Smoking Gun!” And he apparently used it to shoot himself in the foot.

Alec J
November 23, 2009 12:10 am

Had a quick look at The Times website – major comment piece on their home page from Nigel Lawson – http://www.timesonline.co.uk/tol/comment/columnists/guest_contributors/article6927598.ece

November 23, 2009 12:11 am

.
Regarding the Copenhagen Agreement: Einstein said the participants were “playing a risky game with reality”.
Einstein was actually talking about the Copenhagen interpretation of quantum mechanics, which was debated in Copenhagen back in about 1927 – but his remark neatly sums up the forthcoming conference too.
http://en.wikipedia.org/wiki/Copenhagen_interpretation
.
.

ROM
November 23, 2009 12:12 am

The guys in this link are financial modelers and have been running through the CRU code and other data for the last couple of days.
Put your dark glasses on for some of the rather [ extremely ] descriptive language about the standards of coding and the modelers’ capabilities at CRU.
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625

King of Cool
November 23, 2009 12:14 am

I am afraid to say that I have not seen, read or heard one skerrick about the CRU Email hacking by our tax funded national broadcaster.
Today, instead, you can learn that the Antarctic Ice Sheet is losing mass:
http://www.abc.net.au/news/stories/2009/11/23/2750931.htm?site=news
This selective reporting of stories that support staff-driven agendas, while neglecting ones that do not, is in itself a side story as scandalous as the disclosure of the CRU’s modus operandi.

Geoff Sherrington
November 23, 2009 12:17 am

Nick Stokes (20:34:52) : 22/11
Frankly, since you asked, you are being dense when you write “I may be dense here, but what’s the issue? The red comment says “don’t plot beyond 1960″, because the results are unreliable. ”
The full quote in the programmer’s code is –
“Uses “corrected” MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
There’s no mention of “unreliable”.
There is mention of “artificially adjusted to look closer….”
Why not leave off inventing excuses for a while and post what you think is the most harmful part of the hacked information, from the point of view of proper scientific conduct?

Daphne
November 23, 2009 12:18 am

I found this in one of the emails – does this mean anything for the post-1960 data?
“One other thing – MOHC are also revising the 1961-90 normals.”
(MOHC is Met Office Hadley Centre.) Does “revising the normals” mean “inventing the data?” I’m just beginning to wonder. There seems to be a discussion of fudging numbers in the email.
http://www.anelegantchaos.org/emails.php?eid=1017&filename=1254147614.txt

November 23, 2009 12:20 am

>>Gene Nemetz (22:41:33) :
>>>They feel they can’t go down. The Titanic couldn’t sink too.
No, no, no.
What they can really feel is rapidly shrinking and withdrawn government grants. The cold winds of economic reality will soon be blowing through the sparse remnants of Yamal tree studies. It will be back to climate tokenism, just for the political ‘green-sheen’.
.

Dr A Burns
November 23, 2009 12:22 am

>>Another strange happening at Hadley … all the hadcrut3 data for this year, except Jan/Feb, has been deleted.
>>>>But of course they’d delete it, wouldn’t want the very inconveniently record cold Northern Hemisphere October 2009 muddying their “warming”.
Gregg,
Recent months actually showed an increase in temperature in hadcrut3. Perhaps Jones et al realised that the data did not reflect reality?

stephen richards
November 23, 2009 12:40 am

TIM
You are obviously one of those programmers I used to sack.
Comments were NEVER designed for the purpose you describe, you ;;;

November 23, 2009 12:43 am

I can’t wait for HadCRUT itself to be debunked. We know that many stations are bad, and there is no doubt some “added value” algorithm adds 0.15 deg C here or there.
What will be left if the historical and present temperature records are blown away? Just pre-programmed PlayStation climate models.
Cheers
Phil

November 23, 2009 12:44 am

You delusionals are quite funny.
To build your case with this data you have to perform a couple of ‘tricks’ of your own.
1. Make the unstated assumption that the paleoclimate data is the whole of the co2 forced global warming story.
2. Home in on increasingly small parts of the story and ignore the big picture. You have to do this because the big picture doesn’t support your position at all.
3. Fail to respond to counter arguments from the originators of the research.
Vested interests and delusionals leading the ignorant here, now that you are starting to lose your grip on policy makers internationally it’s becoming amusing to watch you, if we’ve got the time that is.

Fry
November 23, 2009 12:46 am

There is not much difference between the CRU and UAH/RSS datasets:
CRU/UAH diff:
http://www.gfspl.rootnode.net/BLOG/wp-content/uploads/2009/11/cruuah_trend.png

RR Kampen
November 23, 2009 12:47 am

So it’s not warming at all. The theory of the gnomes having reduced the freezing point of water is true after all 🙂

E.M.Smith
Editor
November 23, 2009 12:47 am

Policyguy (21:21:18) : The 15% of the earth’s surface where trees do grow are in those locations where they may also be impacted by lack of light (other trees), lack of water (drought), to the extent – that temperature can not be isolated as a cause for growth during any period. It is a false measure.
And don’t forget my favorite “confounder”: Bear Poo. A recent (peer-reviewed) article looked at bears eating salmon and, well, doing what a bear with a full tummy eventually does in the forest. Despite all the times folks ask “Does a bear, um, poo, in the woods?”, the answer is: they do.
Well, said bears account for as much deposited fertilizer as would be applied in a commercial tree farm. So depending on the salmon run, the bear population, and the distance to the streams with said salmon in any given year (i.e. rainfall and stream flow matter), trees will get more or less growth depending on the “Poo Profile” …
To calibrate your trees, you have to have calibrated your poo deposition over time…
Jesse (21:24:11) : Once again, you guys are making mountains out of ant hills.
Ever found yourself standing on a Fire Ant hill? Visit Texas… bare foot… Do not underestimate the impact of an “ant hill”.
This is just normal data processing
No Way. Not even close. Nothing about this code matches ANYTHING I would class as “normal data processing”. (And yes, I’m a “professional” at it).
per the 1960 divergence problem as shown by NUMEROUS sources.
If you have 1960 divergence, then you could have 1860 divergence, and 1760, and 1260 and … Toy Broke.
This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists.
From what I’ve seen, the climate “uninformed amateurs” know more about this than the “real scientists”. I, for one, can write code better than anything these “real scientists” have done. And I have a much better grasp of software QA. Or would you agree that these uninformed amateur programmers ought to leave the job to “real programmers”? Hmmm?
Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
Well, it would seem your level of expertise is that of “wrassling coach”… Or would that be “writer for Jerry Springer”?

REPLY: You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A

OK, I’ll follow that lead:
Highest ranks attained: Corporate President. Board Member (one public, one private).
Favorite rank: Director of Information Services.
Paper Trail: IPCC CDP, State of California Community College Lifetime Teaching Credential Data Processing & related. Bachelors and more.
Education: Through about 20 Units of masters level (needed for credential, plus MBA work). More “commercial classes” than I can track. 12 Units of doctoral level (it’s a long story involving money and time…)
Experience includes several years teaching I.T. at a local silicon valley college.
Over 30 years’ professional data processing experience in the private sector, including managing a Cray supercomputer site (and building out same), being a professional programmer for 2 decades+, a DBA, and a professional consultant on mainframe databases. FBI check passed to work in the securities field and employed in same at stock brokerages and the Federal Reserve Bank. Taught computer forensics. Conducted security audits.
Managed Software QA for a compiler tool chain company.
Managed Software build for a network router appliance company (Build Master) with a commercial product.
Produced and shipped production software and documentation through many product cycles.
and a whole lot more…
So I think my “software professional” status stacks up pretty well against your “real scientists” who are amateur programmers. Perhaps they ought to leave programming to the “real computer scientists” and go back to their day jobs as custodians… no, wait, they lost their data, so they are not qualified to be custodians of things…

Policyguy
November 23, 2009 1:00 am

It’s just a matter of time before Mr Jones and Mr Mann, and perhaps more, move on.

E.M.Smith
Editor
November 23, 2009 1:04 am

For those wondering about The Troubles With Harry, here is his picture:
http://www.cru.uea.ac.uk/cru/people/photo/harry.jpg
Looks like a nice enough guy. I’d like to buy him a beer for his efforts with The Code.

40 Shades of Green
November 23, 2009 1:08 am

I wonder whether the Harry Readme file coincides with a new release of the HADCRUT algorithm. As I understand it, every few years a new version of the software / algorithm / data is released, and it typically shows more warming than the last one. Could it therefore be plausibly claimed that the Harry Readme invalidates a particular HADCRUT release?
Of course that is not to say that the previous one worked okay either.

Perry Debell
November 23, 2009 1:13 am

“Kevin Trenberth, of the US National Center for Atmospheric Research (NCAR) in Colorado, whose e-mails were among those accessed, said the timing of the hacking was “not a coincidence”.
He told the Associated Press News agency 102 of his emails had been posted on the internet and he felt “violated”.
Critics say the e-mails show that scientists have distorted the facts of climate change, but Mr Trenberth said the e-mails had been “taken out of context”.
The above is the last paragraph from http://news.bbc.co.uk/1/hi/world/europe/8373551.stm
So, he feels violated, eh? Well sport, we feel we have been conned by you & your deceitful cronies & you all richly deserve your collective fates. What are the future employment prospects for people like you? Nil, if there is any justice.
E.M.Smith (00:47:58) :
Please be advised that bears poo in the woods for only about 5 months. During their hibernation, the bears neither defecate nor urinate. This would normally mean that nitrogenous wastes during that time would poison the urinary system. However, it does not. The bear solves its nitrogenous waste problem by a form of recycling: “The hibernating bear’s body diverts nitrogen from pathways that synthesise urea into pathways that generate amino acids and new proteins. And it does this using glycerol (produced when fats are metabolized) and recycled nitrogen as the building blocks,” according to the New Scientist magazine of February 1985.

Robinson
November 23, 2009 1:15 am

Another MSM article. The Mail dropped their previous article over the weekend but I found this one on their front page again today.

November 23, 2009 1:20 am

@ Jesse (21:24:11) :
Although my title is actually “Janitor”, I do play a computer scientist on TV (almost 30 years). On TV, I have worked for the DOD, Navy, DHS, software companies, law firms, accounting firms, manufacturers….
But, back to reality, gotta go wash some windows…
[/sarc]

November 23, 2009 1:21 am

“Lord Lawson calls for public inquiry into UEA global warming data ‘manipulation'”
on BBC radio reported by the telegraph:
http://www.telegraph.co.uk/earth/environment/globalwarming/6634282/Lord-Lawson-calls-for-public-inquiry-into-UEA-global-warming-data-manipulation.html

Stacey
November 23, 2009 1:22 am
John Peter
November 23, 2009 1:24 am

Roger Harrabin’s Notes: E-mail arguments on BBC
http://news.bbc.co.uk/1/hi/sci/tech/8371597.stm
Note that he has plenty of CRU contacts to assist him in putting this revelation in “perspective” but to be fair he is also quoting Myron Ebell, a climate sceptic from the Competitive Enterprise Institute.

November 23, 2009 1:33 am

No sunspots today.
But if you want, I’m sure I could produce some.

Neil O'Rourke
November 23, 2009 1:34 am

To calibrate your trees, you have to have calibrated your poo deposition over time…
[snip, oh come on ~ ctm]
🙂

Robinson
November 23, 2009 1:36 am

A good article in The Times this morning from Lord Lawson about Copenhagen. Mentions the CRU leaks as well.

Alan the Brit
November 23, 2009 1:37 am

I am absolutely loving this! He he!!! The foul stench of bovine faecal contaminated science from the CRU is stomach turning. Of course they’ll point to the recent bad storms in Cumbria as hard evidence of Climate Change, despite better examples in the past over on the An Englishman’s Castle website.
I think the late great Sir Walter Scott summed it all up rather well, “Oh what a tangled web we weave, when first we practice to deceive”!
Never, never, never give up! (Sir Winston Churchill).

Robinson
November 23, 2009 1:38 am

Lord Lawson announced this morning:

It is against all this background that I am announcing today the launch of a new high-powered all-party (and non-party) think-tank, the Global Warming Policy Foundation (www.thegwpf.org), which I hope may mark a turning-point in the political and public debate on the important issue of global warming policy. At the very least, open and reasoned debate on this issue cannot be anything but healthy. The absence of debate between political parties at the present time makes our contribution all the more necessary.

About time!

November 23, 2009 1:39 am

kdkd wrote:
Vested interests and delusionals leading the ignorant here, now that you are starting to lose your grip on policy makers internationally it’s becoming amusing to watch you, if we’ve got the time that is.
Interesting. I would have written the same, word for word, speaking to the green crowd.
Who has a vested interest in green scaremongering? Skeptics? No, the environmentalist totalitarians. That’s why they always suppress any talk about the nuclear solution of energy and pollution problems.
Who is losing their grip on policy makers internationally? Al Gore & Co. Skeptics never had such a grip.
Who is becoming the laughing stock of the world? Briffa, Jones, Overpeck, Mann, Al Gore et al. Yes, it is very amusing to watch them now, running around like cockroaches on a hot skillet, their shenanigans exposed.
Who is delusional? Those who respect the facts and have guts to admit that they don’t know enough about climate to be able to predict it, or those who believe in their omniscience, and ascribe to the humanity a disproportional influence over the processes of astronomical scale?
Who judges people not by what they do but by who they know, not by the correctness of their predictions but by pre-orchestrated fraudulent “peer reviews”?
Finally, who is always afraid to post under their real names, and has neither respect nor tolerance, not even an elementary human decency, toward their opponents?

michael
November 23, 2009 1:40 am

Mann smells like a duck’s behind…

Richard
November 23, 2009 1:43 am
E.M.Smith
Editor
November 23, 2009 1:47 am

JNL (22:29:08) : I’m a statistical programmer for “BIG PHARMA” . For every new drug application, the FDA requires that we give them:…
Nice List.
FWIW, I’ve done “qualified installs” for Pharma companies.
What the non-Pharma folks might not know: For every single bit of hardware and software used for all the stuff JNL listed, it must be installed “just so”. Every Single Step of Every Single Procedure must be defined in advance. Even if it is just “Open box. Plug in cord. Turn power switch on.”
A “Qualified Install” has a predefined process and it has an implementor. It also has a Manager (that was me) and a QA officer (that may have been specific to the particular company, but the rest is FDA mandated).
The Manager watches the Implementor (i.e. systems admin) do each step.
Each step must be done EXACTLY AS WRITTEN. Then both the sysadmin and the manager sign off the step. At the end of the entire process a PREDEFINED QA process is performed and the output must match EXACTLY.
Then the manager gets to sign off the whole package and hand it over to the Company QA guy (who was watching over the shoulder of the manager watching over the shoulder of the SysAdmin…).
The whole package is copied, filed with the company, and sent to the FDA.
This is so that they can exactly duplicate everything the drug company did, including “Open the box, plug it in, turn on power”… (Though that second list would not pass a “Qualified Install” since I used commas instead of periods and said “turn on power” instead of “Turn power switch on”; it did not match my first list… Yes, it IS that picky…)
So not only must all the stuff JNL listed be sent to the FDA, but also every single bit of hardware assembly, software installation, software configuration, (the works) must be a “Qualified Install” and documented. And lord help you if NetApp changes the power-on button from Red to Orange and your Qualified Install says “Turn On Red Power Button Lower Left”…
IF the FDA decides to test something you sent, and the Qualified Install docs don’t match what they experience when, oh, booting up a Sun Server, guess what: Your drug gets rejected until you get it right… So when you say “install Solaris” you’d better have the exact release number noted and it better behave exactly the same each time…
So that is what you must do if you want to sell an aspirin with a new type of inert binder in it, or even just wanted to make a “kosher aspirin” with an enteric coating blessed by a Rabbi …
But enslave the world with carbon taxes? Destroy world economies? Claim thermageddon is happening now? That can be done with completely undefined and substantially broken software with no comments, no procedures, irreproducible runs (as the comments in HARRY_README show), and with no clue whether the product works.
Grab an aspirin tin / bottle and think about it, for just a few moments…
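If you want the flavor of it in code, here’s a toy sketch (Python; the steps and the function are invented for illustration, nothing here is FDA-issued) of what “must match EXACTLY” means in practice:

PROCEDURE = [
    "Open box.",
    "Plug in cord.",
    "Turn power switch on.",
]

def qualify(performed):
    # Every step must match the written procedure exactly, character for character.
    if len(performed) != len(PROCEDURE):
        return False, "step count differs from the written procedure"
    for i, (written, done) in enumerate(zip(PROCEDURE, performed), 1):
        if done != written:
            return False, "step %d: wrote %r, performed %r" % (i, written, done)
    return True, "qualified: manager and QA can sign off"

print(qualify(["Open box.", "Plug in cord.", "Turn power switch on."]))
print(qualify(["Open box.", "Plug in cord.", "turn on power"]))  # fails qualification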

Joseph in Florida
November 23, 2009 1:56 am

“If tree ring-based temperatures are known to be false compared to actual measurements, then how can they be true in earlier decades or centuries?”
Because no one can prove otherwise, silly.

November 23, 2009 1:59 am

E.M.Smith (00:47:58) :
Ever found yourself standing on a Fire Ant hill? Visit Texas…

I did that once. ONCE.
Mike in Houston
.
savethesharks (23:13:36) :
CORRECTION: The “we scientists” part is extremely pungent given the fact that, empirically speaking, some of the first individuals to cave INTO the Third Reich….were the scientists.
So “into” not “in”.

“in  to” is correcter.
Mike, grammar n*zi
.
Anyone heard any comments from M&M?

cohenite
November 23, 2009 2:00 am

I always look forward to Nick Stokes’ contributions; he must be nearly the most hard-working supporter of the increasingly disheveled AGW edifice; certainly he is the politest, admittedly against poor opposition, and he must have the constitution of a Mallee bull, having to digest the tripe, offal and dreck that passes as AGW evidence these days.
I’ve also been fascinated with the divergence ‘problem’ which Nick has applied his cudgels to on this thread; here we have AGW, proxified through history with all sorts of weird and wonderful samples and series correlated with each other according to strange incantations, with all the power of quantum exotics; in this way the wizards of these bits of yore and fairy stuff can construct a mighty hafted stick, as strong as any Bradman bat, to slay the doubters. But when we hit 1960, the start of the dreaded AGW, the magic disappears and the proxy magic path, so firm in the past, wilts like a viagra-less old man’s dreams. It just isn’t fair that in the modern era, when we have access to all the tree-rings we can shake a stick at, none work and we have to instrumentalise the modern duds up to speed; typical younger generation!

Gregg E.
November 23, 2009 2:00 am

Found a quotlet on another blog…*
“…we are having trouble to express the real message of the reconstructions – being
scientifically sound in representing uncertainty…”
It’s like they’re channeling the Hitchhiker’s Guide to the Galaxy. “We demand rigidly defined areas of doubt and uncertainty!” 😉
*I have too slow a net connection to download the whole lump of stuff.

Joseph in Florida
November 23, 2009 2:06 am

“Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.”
This is the sort of comment that those who know their argument cannot stand up to logic will resort to using. It fits in with so-called scientists who refuse to make public the raw data, code, and methods used to obtain the hockey stick.
Without public transparency, there can be no science. The idea of putting your results out there for all to try to discredit is the very heart of the scientific method. (or at least, that is what they said in the science courses I took)
Does the truth not matter to these alarmists?

fabron
November 23, 2009 2:19 am

LORD LAWSON CALLS FOR PUBLIC INQUIRY INTO UEA GLOBAL WARMING DATA ‘MANIPULATION’
Lord Lawson, the former chancellor, has called for an independent inquiry into claims that leading climate change scientists manipulated data to strengthen the case for man-made global warming.
http://www.telegraph.co.uk/earth/environment/globalwarming/6634282/Lord-Lawson-calls-for-public-inquiry-into-UEA-global-warming-data-manipulation.html

NastyWolf
November 23, 2009 2:33 am

Fry (00:46:31) :
“There is not much difference between the CRU and UAH/RSS datasets”
Of course not, because the difference would be too difficult to explain.
Pre-satellite era data fiddling is free territory for CRU/GISS, so they can argue that the climate has been warming during recent decades.

Phillip Bratby
November 23, 2009 2:40 am

The BBC coverage has reopened, with the proviso that: “We have now re-opened comments on this post. However, legal considerations mean that we will not publish comments quoting from e-mails purporting to be those stolen from the University of East Anglia, nor comments linking to other sites quoting from that material.”
See http://www.bbc.co.uk/blogs/thereporters/richardblack/2009/11/copenhagen_countdown_17_days.html#comments
It would appear that the BBC is not going to give this any publicity. Any surprises there?

Gregg E.
November 23, 2009 2:41 am

I have an idea of where these files may have come from. Many e-mail programs store each “folder” of messages and attachments in a single file with a simple indexing scheme.
When an e-mail is deleted, it’s not actually removed from the “folder” file; only the e-mail’s pointer in the index is removed. To actually DELETE the e-mails, the “folder” file has to be compacted or purged or whatever term the mail client software uses. There are many mail recovery programs that can quickly and easily create an index to all deleted messages in a mail client’s trash/deleted “folder” file, though they may not be able to fully recover the complete headers.
This “feature” is one thing computer forensics often uses to find evidence on computers.
In spite of this being fairly common knowledge amongst people with mid to high level computer skills, it’s surprising how few actually bother to ensure their mail clients are configured to automatically purge/compact deleted messages.
This could be the e-mail analog to a famous case where a murder suspect sneaked a pair of pinking shears into an interrogation room where the police had brought the actual floppy disks they’d obtained from his house. (Had they been a bit more on the ball they’d have stuck some blank disks into the sleeves or made backup copies first.) The suspect managed to chop the disks into pieces, but other people were able to develop a process to put them back together well enough to recover large enough fragments to prove the suspect had written a lot about the murder. A powerful magnet would’ve been a better way to destroy the incriminating evidence.
Pinking shears = “deleting” e-mail. Big magnet = purging the deleted e-mail.
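Schematically, with a completely made-up folder format just to show the principle (real clients like Outlook or Thunderbird differ in the details):

# A "folder" file holding three messages, plus an index of live offsets.
messages = ["msg one body", "msg two body", "msg three body"]
folder = "\x00".join(messages)   # the bytes on disk
index = [0, 13, 26]              # byte offsets of live messages

index.remove(13)                 # "delete" message two: only the index entry goes

live = [folder[i:].split("\x00")[0] for i in index]
print(live)                      # the mail client now shows two messages...

print(folder.split("\x00"))      # ...but a raw scan of the file still finds all three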

Dr WHO DO VOODOO
November 23, 2009 2:45 am

JNL wrote:
“The FDA, acting as a public protector, has to assume we are trying to “cheat” (and that is a reasonable approach)… But then again, we are the evil, capitalist, profit-seeking, “BIG PHARMA” and the people need to be protected from us.”
I agree – the FDA is only “acting as” etc.
But – isn’t it possible that the FDA and “BIG PHARMA” have BOTH lost “some” credibility “lately”? And by lately I mean the past few decades…
Does the “revolving-door” policy mean anything to people working for “BIG PHARMA” – and I mean other than “career opportunities”?
Well, just wondering – and yes, I realize this is OT etc. – and I promise not to continue… 😉

November 23, 2009 2:47 am

E.M.Smith (00:47:58) :
“To calibrate your trees ……….”
You are absolutely right. One factor regarding forest growth that can’t easily be accounted for is the effect of excessively strong winds and hurricanes. I do regular walks in woodland which was affected by the 1987 hurricane in South East England. In some areas where trees were more exposed, most of the large mature oak and beech trees were uprooted, while in the sheltered parts they survived.
Young saplings from 1987 are now established trees in all areas, but in the exposed parts where mature trees were uprooted, the young ones are now nearly twice the size of those of the same age in the sheltered parts. This could be attributed to the extra sunlight and nutrients available in the areas where the large old established trees were uprooted.

Eric (skeptic)
November 23, 2009 2:56 am

kdkd (00:44:40)
Can you fill us in on the big picture? For example can you tell us how high sea levels were in the MWP? Can you give us the arctic ice extent for the 40’s? Do you know when the forward speed of Greenland’s glaciers peaked? We have all heard the litany of how things are worse than ever and accelerating, but what is lacking in those litanies is detail and a long-term perspective.

TerryS
November 23, 2009 2:57 am

Some more files with the same comment….
./FOIA/documents/osborn-tree6/summer_modes/hovmueller_lon.pro
./FOIA/documents/osborn-tree6/summer_modes/maps24.pro
./FOIA/documents/osborn-tree6/summer_modes/maps_general.pro
./FOIA/documents/osborn-tree6/summer_modes/maps1.pro
./FOIA/documents/osborn-tree6/summer_modes/maps15.pro
./FOIA/documents/osborn-tree6/summer_modes/maps1_movie.pro
./FOIA/documents/osborn-tree6/summer_modes/maps1_poster.pro
./FOIA/documents/osborn-tree6/summer_modes/maps12.pro
./FOIA/documents/osborn-tree6/mann/oldprog/hovmueller_lon.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps24.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps1.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps15.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps1_movie.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps1_poster.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps12.pro

Grig
November 23, 2009 2:59 am

INSIDERS?
Interesting that the file entitled “FOIA.zip” turned up at Jeff Id’s blog (Nov. 13) with a posting that asked 18 leading US scientific associations about their letter to the US Senate of Oct. 21, 2009, at http://www.whatisclimate.com/
Would Russian hackers have done that?

November 23, 2009 3:01 am

Link to interview [0735] with Nigel Lawson and Robert Watson on R4 Today programme
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373594.stm
I’ve never heard Robert interviewed before but he did seem a little defensively assertive to my ears.

son of mulder
November 23, 2009 3:02 am

Alec J (23:59:14) :”This is starting to be noticed at the BBC.
On this morning’s Radio 4 Today program at about 0735am there was a five minute slot with Former Chancellor Nigel Lawson and a Prof Watson. It was reasonably well balanced – surprising given that the presenter James Naughtie usually tries to work Global Warming into everything he can.”
It’s now online here for today
http://news.bbc.co.uk/today/hi/listen_again/default.stm
at 0735
Prof Watson starts by saying “These scientists at East Anglia are both honourable and world class. Their data is not being manipulated in any bad way”.
Was Yamal not bad? How does honourable square with ‘Not allowing data release because people will try to find fault with it’?
Discuss.

rbateman
November 23, 2009 3:02 am

tim (23:24:49) :
Well, tim, there is a big problem with what is indicated in HARRY_READ_ME.txt and the present records for historical temp data.
The skullduggery does not start and end at CRU.
Great damage has been inflicted on irreplaceable(?) data.
Somebody gave the warmists the means to destroy data, and they did just that.
The author of HARRY_READ_ME understood the gravity of what happened to the data he was given to work with.
Nope, nothing wrong with the code; it finished the job of data destruction as designed.
The Librarian at Alexandria weeps yet again, for the same type of people have once more robbed history.
The news claims that “hackers” broke into and stole emails/data.
The real “hackers” destroyed science data long before that, under the guise of science.

Matt
November 23, 2009 3:04 am

I’m just imagining some of the behind-the-scenes conversations (probably by phone 😉 ) between the interested parties that must be going on now. I’d think the mother of all ‘damage limitation’ plans is being drawn up.
In the same vein, it’s amusing to see the usual ‘nothing to see here, move along’ articles appearing in the various pro-AGW newspapers / blogs (either that or the ostrich ‘it never happened’ behaviour as typified by the BBC). However, try as they might, I can’t see this one going away…

rbateman
November 23, 2009 3:07 am

P Gosselin (01:33:38) :
No sunspots today, and for most of the past 2 weeks those that have been ‘officially listed’ as sunspots were visible only to SOHO.
We really should be talking about the Sun, but a very sad day has dawned with the realization that Science Barbarians have sacked and burned irreplaceable data worldwide in an effort to support a political bent.

Kevin B
November 23, 2009 3:27 am

When you look at the unsmoothed proxy data, it looks like noise. It’s only when the data has been smoothed, and only those proxy series that match twentieth-century temperatures (as measured by thermometers) are included, that you begin to get something that looks like a hockey stick. Everything else is discarded.
Even when they get something that looks like a hockey stick, it has a divergence problem that needs a ‘trick’ to hide it. This trick is to truncate the data and substitute the thermometer data that was used to ‘calibrate’ the original data.
Lots of reasons have been put forward as to why this is necessary and, from some, why it doesn’t matter, but for me the most convincing reason, the one that old Occam would adopt, is that the original, unsmoothed, unadjusted, uncalibrated data is correct.
It isn’t temperature the proxies are showing, it’s noise.
When you’ve thrown away 90% of the data in the smoothing and calibrating process, and what you’ve got left still needs padding with data from another source, then you don’t have anything at all.
No wonder they needed to rig the peer review process and bully the journals.
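The selection effect is easy to demonstrate with a toy sketch (Python with numpy; the “proxies” are invented random walks, nothing from any real archive): screen pure noise for correlation with a rising calibration series, average the survivors, and a hockey stick appears.

import numpy as np

rng = np.random.default_rng(0)
n_proxies, n_years = 1000, 600                    # "years" 1400-1999
proxies = rng.normal(size=(n_proxies, n_years)).cumsum(axis=1)  # random walks

calib = np.linspace(0.0, 1.0, 100)                # rising "instrumental" series, 1900-1999
recent = proxies[:, -100:]

# Keep only the proxies whose recent century happens to correlate with the rise.
r = np.array([np.corrcoef(p, calib)[0, 1] for p in recent])
survivors = proxies[r > 0.5]

stick = survivors.mean(axis=0)                    # flattish handle, rising blade
print(len(survivors), stick[0].round(2), stick[-1].round(2))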

Manfred
November 23, 2009 3:28 am

The BBC loses all credibility.
An appalling and disturbing piece of nepotism:
http://news.bbc.co.uk/2/hi/science/nature/8371597.stm

Fred Lightfoot
November 23, 2009 3:29 am

E.M.Smith (00.47.58)
Still got access to that Cray? Wishful thinking.

debreuil
November 23, 2009 3:30 am

Posted this on CA, but I see I’m not the only one wondering here. I think it is a dataset merge and then an attempt at normalizing, but some of it is by year, so pretty weird.
Weirdest of all, this is printed at run time. Sometimes when programming you fudge like this to get clues to where you might be off; it is more legit if it warns you on output. Still, the final part sounds like they aren’t about to change it. Not sure what to think.
printf,1,’NOTE: recent decline in tree-ring density has been ARTIFICIALLY’
printf,1,’REMOVED to facilitate calibration. THEREFORE, post-1960 values’
printf,1,’will be much closer to observed temperatures then they should be,’
printf,1,’which will incorrectly imply the reconstruction is more skilful’
printf,1,’than it actually is.’
….original msg….
What is the ‘decline’ thing anyway? It is in a lot of code, and seems to involve splicing two data sets, or adjusting later data to get a better fit. Mostly (as a programmer) it seems like a ‘magic number’ thing, where your results aren’t quite right, so you add/multiply by some constant rather than deal with the real problem. Aka “a really bad thing to do” : ).
\FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro
printf,1,’Osborn et al. (2004) gridded reconstruction of warm-season’
printf,1,’(April-September) temperature anomalies (from the 1961-1990 mean).’
printf,1,’Reconstruction is based on tree-ring density records.’
printf,1
printf,1,’NOTE: recent decline in tree-ring density has been ARTIFICIALLY’
printf,1,’REMOVED to facilitate calibration. THEREFORE, post-1960 values’
printf,1,’will be much closer to observed temperatures then they should be,’
printf,1,’which will incorrectly imply the reconstruction is more skilful’
printf,1,’than it actually is. See Osborn et al. (2004).’
\FOIA\documents\osborn-tree6\briffa_sep98_d.pro
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
\FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro
; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
; but not as predictands. This PCR-infilling must be done for a number of
; periods, with different EOFs for each period (due to different spatial
; coverage). *BUT* don’t do special PCR for the modern period (post-1976),
; since they won’t be used due to the decline/correction problem.
; Certain boxes that appear to reconstruct well are “manually” removed because
; they are isolated and away from any trees.
\FOIA\documents\osborn-tree6\combined_wavelet_col.pro
;
; Remove missing data from start & end (end in 1960 due to decline)
;
kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
sst=prednh(kl)
\FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
\FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
\FOIA\documents\osborn-tree6\mann\oldprog\pl_decline.pro
;
; Now apply I completely artificial adjustment for the decline
; (only where coefficient is positive!)
;
tfac=declinets-cval
fdcorrect=fdcalib
for iyr = 0 , mxdnyr-1 do begin
fdcorrect(*,*,iyr)=fdcorrect(*,*,iyr)-tfac(iyr)*(zcoeff(*,*) > 0.)
endfor
;
; Now save the data for later analysis
;
save,filename=’calibmxd3.idlsave’,$
g,mxdyear,mxdnyr,fdcalib,mxdfd2,fdcorrect
;
end
\FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro
;
; Plots density ‘decline’ as a time series of the difference between
; temperature and density averaged over the region north of 50N,
; and an associated pattern in the difference field.
; The difference data set is computed using only boxes and years with
; both temperature and density in them – i.e., the grid changes in time.
; The pattern is computed by correlating and regressing the *filtered*
; time series against the unfiltered (or filtered) difference data set.
;
;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***

November 23, 2009 3:33 am

Brazen!
“We also have a data protection act, which I will hide behind.”
http://directorblue.blogspot.com/2009/11/milli-vanilli-of-science-hacked-emails.html

Oscar Bajner
November 23, 2009 3:38 am

Sample from Harry_readme.txt
“In other words, the *anom.pro scripts are much more recent than the *tdm
scripts. There is no way of knowing which Tim used to produce the current
public files. The scripts differ internally but – you guessed it! – the
descriptions at the start are identical. WHAT IS GOING ON? Given that the
‘README_GRIDDING.txt’ file is dated ‘Mar 30 2004’ we will have to assume
that the originally-stated scripts must be used. ”
I could help the Hadley Harries with this. There is this software thing called “revision control” and I could set it up in an afternoon for them. (CVS, SVN, GIT, BZR … whatever they like) In two afternoons I could set up a really neat, scriptable control system. They could spend more time debugging and less time on [snip]
Thing is, I can’t give up my day job right now – I just got a promotion and from today I am in charge of the rotary buffer! W00t! Wal-Mart rocks.

Stuck-Record
November 23, 2009 3:39 am

“Alan the Brit (01:37:00) :
I am absolutely loving this! He he!!! The foul stench of bovine faecal contaminated science from the CRU is stomach turning. Of course they’ll point to the recent bad storms in Cumbria as hard evidence of Climate Change,”
They already have. 10 o’clock news and again on Today prog this morning.

Scouse Pete
November 23, 2009 3:49 am

It’s interesting, the BBC’s confused policy at the moment. They have reopened their blog site this morning with the specific message that no links to or extracts from the emails will be allowed. Yet Nigel Lawson was allowed to talk about it on BBC R4 this morning! (Already linked above.) And in The Mail this morning, 3 pages of stuff about it: a new story on Phil Jones, “Pioneer or Junk Peddler”, and then a Christopher Booker piece of 2 pages.
Nigel Lawson also has a piece in The Times this morning. So it seems it’s only Aunty Beeb fighting with its internal conflict on this issue, in which its editorial policy has been compromised due to its Editor-in-Chief, the DG Mark Thompson, being hoodwinked by Al Gore back in 2007 when Gore gave a personal presentation of his flawed PowerPoint to BBC staff. Ever since then, he has dictated their current toothless policy from the top – in my opinion.
Time for him to go. Time for the BBC Trust to get involved.

debreuil
November 23, 2009 3:49 am

It says important note, but I guess I missed the memo.
\FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,’IMPORTANT NOTE:’
printf,1,’The data after 1960 should not be used. The tree-ring density’
printf,1,’records tend to show a decline after 1960 relative to the summer’
printf,1,’temperature in many high-latitude locations. In this data set’
printf,1,’this “decline” has been artificially removed in an ad-hoc way, and’
printf,1,’this means that data after 1960 no longer represent tree-ring’
printf,1,’density variations, but have been modified to look more like the’
printf,1,’observed temperatures.’

Ashtoreth
November 23, 2009 3:52 am

As a software engineer with 30 years’ experience, some of it working with government scientists, that code is horribly, horribly familiar…
The problem is that research scientists have done a programming course at some point in time. 99% consider themselves good coders as a result. 98% of them are wrong…
The initial flaw seems to be in the way they intend to use the software – it’s only for them (often not even for their colleagues), and as such is completely uncontrolled. Often changes are made without any record, changes on changes… and after a while, they aren’t sure any more why things happen the way they do…
Documentation? We don’t need that, it’s my programme, I know what it does. Maybe. Will you in 5 years? Experience shows you don’t…
This code is a classic example of this way of programming. Now fortunately, much of this type of coding is only used by one person, not designed for input to anything critical, just as an aide for a researcher, for whom results trump everything. So while it’s bad practice, it doesn’t have too many disastrous effects. This time, however, it’s being used for predictions costing hundreds of billions of dollars…
Monckton is absolutely correct: we need to take the raw data and the calculations, and build new, verified models and data sets to see what is happening BEFORE we spend all this money. If these DO show AGW, fair enough. My money is on any AGW being so small it’s lost in the noise.

rbateman
November 23, 2009 4:00 am

Climate Change? Hah.
THIS is what I call real Climate Change:
Date  Id  Name  State  Latitude  Longitude  MaxTemp(ºF)  MinTemp(ºF)  ObsTemp(ºF)  Precip(in)
1892-01-03 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-04 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-05 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.02
1892-01-06 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-07 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.54
1892-01-08 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.32
1892-01-09 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.05
1892-01-10 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-11 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-12 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-13 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-14 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.005
1892-01-15 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.005
nothing but precip data until…..
1934-09-27 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-09-28 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-09-29 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-09-30 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-10-01 47290 RED BLUFF CA 40.1833 -122.233 85.0 51.0 79.0 0
1934-10-02 47290 RED BLUFF CA 40.1833 -122.233 82.0 51.0 79.0 0
1934-10-03 47290 RED BLUFF CA 40.1833 -122.233 84.0 54.0 76.0 0
1934-10-04 47290 RED BLUFF CA 40.1833 -122.233 91.0 50.0 76.0 0
1934-10-05 47290 RED BLUFF CA 40.1833 -122.233 100.0 47.0 79.0 0
1934-10-06 47290 RED BLUFF CA 40.1833 -122.233 82.0 62.0 66.0 0
1934-10-07 47290 RED BLUFF CA 40.1833 -122.233 74.0 54.0 64.0 0.11
crua6[/cru/cruts/version_3_0/db/testmergedb] grep -n ‘RED BLUFF’ tmp.0*.*
tmp.0612081519.dat:28595: 725910 401 1223 103 RED BLUFF USA 1991 2006 101991 -999.00
tmp.0702091122.dtb:171674: 725910 401 1223 103 RED BLUFF USA 1878 1980 101878 -999.00
tmp.0704251819.dtb:200331: 725910 401 1223 103 RED BLUFF USA 1878 2006 101878 -999.00
tmp.0704271015.dtb:254272: 725910 401 1223 103 RED BLUFF USA 1878 2006 101878 -999.00
tmp.0704292158.dtb:254272: 725910 401 1223 103 RED BLUFF USA 1878 2006 101878 -999.00
crua6[/cru/cruts/version_3_0/db/testmergedb]
The first file is the 1991-2006 update file. The second is the original
temperature database – note that the station ends in 1980.
It has *inherited* data from the previous station, where it had -9999
before! I thought I’d fixed that?!!!
Yeah, baby, you fixed it all right.
1 station data set smoked on the CRU barbie.
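Schematically, the kind of merge bug Harry is describing could be as simple as this (Python; invented rows, and just one plausible mechanism, not the actual CRU merge code): carry the last seen value forward over the missing-data sentinels without checking the station id, and one station quietly inherits another’s numbers.

rows = [
    ("OTHER_STATION", 1979, 12.3),
    ("OTHER_STATION", 1980, 12.9),
    ("RED BLUFF",     1981, -9999),   # missing: should stay missing
    ("RED BLUFF",     1982, -9999),
]

merged, last = [], None
for station, year, value in rows:
    if value == -9999 and last is not None:   # BUG: never checks the station id
        value = last                          # so RED BLUFF inherits 12.9
    merged.append((station, year, value))
    last = value

print(merged)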

Robinson
November 23, 2009 4:02 am

As a Software Developer, I know that programmers often impart their stream of consciousness into the code in the form of comments. But from reading the above (particularly debreuil’s quotes), it seems clear to me there’s quite a substantial confirmation bias in their method.

John Finn
November 23, 2009 4:05 am

So come on, folks – time to nominate your favourite email. I realise we’re totally spoilt for choice, but which ones stand out?
The ‘Nature trick’ is definitely a contender and is far more damaging than Phil Jones and the press are trying to make out, but this only confirmed what I knew anyway.
The surprise for me was the Trenberth effort which includes the immortal lines “but the data are surely wrong. Our observing system is inadequate. ”
Comments on sceptic blogs often suggest that the warmers think that if the data doesn’t agree with the models then the data must be wrong. I always thought this was an unfair exaggeration. But it’s true. These people are beyond satire.

Cassandra King
November 23, 2009 4:12 am

The BBC are hiding behind the very flimsy excuse of “legal reasons” for why they cannot reveal details of the emails. The excuse is so transparently dishonest that I wonder if they are so desperate to deny people a chance to see the evidence that they would risk using such an obviously false reason for withholding the data.
I suspect that the BBC science and environment departments and reporters are very deeply involved with the scientists at the heart of the scandal; it must be clear that the BBC are covering up for the fraudsters for as long as it takes, either to create a backup story or until the story fades away.
Whatever the motives of the BBC and their reporters, the longer the delay, the more suspicious that delay becomes. Perhaps the BBC are willing to take the risk of stonewalling and delaying the actual release of the data, considering the damage that releasing it will have on the BBC.

dodgy geezer
November 23, 2009 4:17 am

Interesting quote from the baffled programmer trying to make sense of it all, and finally guessing.. (Harry’s txt)
“…The results are depressing. For Paris, with 237 years, +/- 20% of the real value was possible with even 40 values. Winter months were more variable than Summer ones of course. What we really need, and I don’t think it’ll happen of course, is a set of metrics (by latitude band perhaps) so that we have a broad measure of the acceptable minimum value count for a given month and location. Even better, a confidence figure that allowed the actual standard deviation comparison to be made with a looseness proportional to the sample size.
All that’s beyond me – statistically and in terms of time. I’m going to have to say ’30’.. it’s pretty good apart from DJF. For the one station I’ve looked at….”
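What Harry seems to be groping for is essentially the standard error of the mean: the acceptable-value-count question is really about how fast uncertainty shrinks with sample size. A rough Python illustration (invented numbers, not Harry’s data):

import numpy as np

rng = np.random.default_rng(0)
monthly = rng.normal(5.0, 4.0, 237)   # 237 "years" of a winter month, sd 4

for n in (10, 30, 40, 100, 237):
    se = monthly[:n].std(ddof=1) / np.sqrt(n)   # standard error of the mean
    print(n, round(monthly[:n].mean(), 2), "+/-", round(se, 2))

# With 30-40 values the mean is pinned down to well under a degree here,
# which is roughly why "30" is the rule of thumb Harry falls back on.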

jh
November 23, 2009 4:19 am

The previous Beeb link didn’t work for me; perhaps this might – the 0735am spot:
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373594.stm
Lawson suggests funding body NERC and the VC should enquire into the issues raised
– pigs might fly

Martyn B
November 23, 2009 4:22 am

I have to say this tickled me from HARRY_READ_ME.txt
“So, once again I don’t understand statistics. Quel surprise, given that I haven’t had any training in stats in my entire life, unless you count A-level maths.”

November 23, 2009 4:24 am

Given the volume of material and the number of serious issues to be considered, how can any of the participants show up in public without facing a barrage of embarrassing questions?
Has the entire AGW community just been neutralised for all practical purposes ?
Unless we see an entirely new set of personnel, climate science will be frozen in the spotlight and cannot progress.

Arthur Glass
November 23, 2009 4:24 am

“…“…we are having trouble to express the real message of the reconstructions – being
scientifically sound in representing uncertainty…” ”
Now ‘trouble to express’ *does* sound like an ESL grammar error. ‘Trouble expressing’ would be standard grammar.

Basil
Editor
November 23, 2009 4:25 am

Nick Stokes (20:34:52) :
I may be dense here, but what’s the issue? The red comment says “don’t plot beyond 1960″, because the results are unreliable. So is there any indication that anyone has plotted beyond 1960? This came up on the Bishop Hill thread, where he drew attention to an email by Tim Osborn where he said that they never plot some treering set beyond 1960 because of a divergence issue. Turns out that that is what Briffa/Osborn say also in Briffa et al 2001. This Briffa/Osborn context may be unrelated, but it seems to me that it may simply just mean what it says. Don’t plot beyond 1960 using this code. And people don’t.

Nick,
I think you are hanging your hat on the paleo/divergence issue. But it looks to me like HARRY_READ_ME is about the code used in CRU TS. I’m not certain about that, but I think we need to know. If so, then the Briffa et al literature acknowledging the divergence in paleo time series really doesn’t apply here. I.e., the “adjust for the decline” in the “Harry” code, and “Mike’s Nature Trick” are two different things.

debreuil
November 23, 2009 4:25 am

OK, I haven’t done this in 20 years (the .pro files are IDL, not Fortran), but if I read this right, it is creating two parallel arrays (year, adjustment) – 1400, then every 5 years from 1904 to 1994 – with the whole adjustment array multiplied by 0.75. The thirties and forties are pulled down by as much as -0.3, then around 1950 the ‘fudge’ goes positive, reaching 2.6 (1.95 after the 0.75 scaling) from the mid-seventies onward. It then interpolates this over the data. Please correct if this is wrong…
1400 0.
1904 0.
1909 0.
1914 0.
1919 0.
1924 -0.1
1929 -0.25
1934 -0.3
1939 0.
1944 -0.1
1949 0.3
1954 0.8
1959 1.2
1964 1.7
1969 2.5
1974 2.6
1979 2.6
1984 2.6
1989 2.6
1994 2.6
(all values × 0.75)
original code (\FOIA\documents\osborn-tree6\briffa_sep98_d.pro)
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,’Oooops!’
;
yearlyadj=interpol(valadj,yrloc,timey)
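In Python terms, a quick sketch of the same adjustment (assuming numpy; IDL’s interpol(v, x, u) maps to np.interp(u, x, v), and timey here is just an invented yearly axis):

import numpy as np

# The fudge-factor arrays, transcribed from briffa_sep98_d.pro
yrloc = np.concatenate(([1400], np.arange(19) * 5 + 1904))   # 1400, 1904, ..., 1994
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75  # fudge factor

timey = np.arange(1400, 1995)                  # invented yearly time axis
yearlyadj = np.interp(timey, yrloc, valadj)    # IDL: interpol(valadj, yrloc, timey)

# Zero until the 1920s, a small dip through the 30s-40s, then a ramp to +1.95
# from the mid-1970s on: the late end of the series is warmed by construction.
print(yearlyadj[timey >= 1974][:3])            # [1.95 1.95 1.95]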

November 23, 2009 4:26 am

A better case can’t be made for open science, open software … and honesty. Never trust programs done behind closed doors.
When the secrecy around temperature data, and the software that manipulates it, is higher than that for nuclear weapons design, there is something very wrong with the whole of climate science. The hoax is designed to scam people out of money and make others very rich off the hoax.
Science may never recover from this … where is the virtue in science fraud?

Frank Lansner
November 23, 2009 4:26 am

rbateman (04:00:20) :
Sounds very interesting – could you perhaps explain a little more?
Thanks 🙂
Frank

Basil
Editor
November 23, 2009 4:32 am

Ashtoreth (03:52:14) :
Monkton is absolutely correct, we need to take the raw data, the calculations, and build new, verified models and data sets to see what is hapenning BEFORE we spend all this money.

While I agree, we need to realize that there is no longer any raw data, at least at CRU. So it – the raw data – will have to be acquired all over again. Given that this is now international politics, and not just academics cooperating in the interest of disinterested science, that may no longer be possible.
Can anyone tell me what the relationship between CRU TS and HadCRUT is? While there has been a bit of a kerfuffle over the fact that CRU is not Hadley, do the latter use data from the former in their product?

jh
November 23, 2009 4:34 am

This is Lawson’s think tank
http://www.thegwpf.org/
Membership a minimum £100
Lots of names you know on the advisory board

John Finn
November 23, 2009 4:39 am

Just watched the Politics Show on BBC1 (UK).
Fred Singer and Bob Watson (Chief Environmental Scientist) were interviewed by Andrew Neil. Not exactly a trouncing, but Singer got the easier ride and probably edged it. Watson was surprisingly agreeable and suggested an enquiry should be set up to look into a) the ‘hacking’ of the emails and b) the contents of the emails.

Yertizz
November 23, 2009 4:41 am

Bratby: The BBC coverage has reopened…….
It is beyond question that the climate is changing; that man is completely responsible is very definitely not! That is why I am delighted at the revelations of the CRU at the University of East Anglia.
Complicit in this misrepresentation of the science is the BBC in its TV and radio output. For over 3 years I have been trying to elicit answers from both Mark Thompson (Director General) and Sir Michael Lyons (Trust Chairman). All I had received was sophistry and obfuscation, until I engaged the help of my MP.
Recently it came to light that a report had been commissioned in June 2007 jointly by the Trust and BBC Board of Management entitled “From Seesaw to Wagon Wheel-Safeguarding Impartiality in the 21st Century”. It concluded: ‘There may be now a broad scientific consensus that climate change is definitely happening and that it is at least predominantly man-made… the weight of evidence no longer justifies equal space being given to the opponents of the consensus’.
Despite this damning evidence from their own report, they steadfastly cling to the belief that their impartiality is intact as required by the BBC Charter. Such is their state of denial that Sir Michael Lyons has even tried to deliberately mislead my MP despite evidence I have to the contrary.
In light of this I have posed the question, through my MP: “On whose authority did the BBC cease to be an impartial Public Service Broadcaster, as required by its Charter, and become the judge, jury and sponsor of such dangerously specious political dogma so eloquently described as ‘…the consensus…’?
Answer comes there none! I believe it is time for the BBC to be subjected to an enquiry on this matter.
Also significant: complete lack of response from the Guardian, which is still peddling the same rubbish: http://www.guardian.co.uk/environment/2009/nov/22/climate-change-emissions-scientist-watson

Frank Lansner
November 23, 2009 4:46 am

If anyone missed this part from debreuil:
It says important note, but I guess I missed the memo.
\FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,'IMPORTANT NOTE:'
printf,1,'The data after 1960 should not be used. The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'

HAVE BEEN MODIFIED TO LOOK MORE LIKE THE OBSERVED TEMPERATURES
Game over. They will HAVE to call this a fake to keep their jobs.

November 23, 2009 4:46 am

This is spreading like a bushfire this lunchtime (UK time) the Telegraph newspaper’s website stories on this have crashed their servers.
Either that, or foul play is afoot. I dunno, perhaps I have watched too many episodes of BBC’s “spooks” and can imagine the MI5 geek trying to stop this story spreading round the mainstream media.
All I got from the Telegraph site was:
“Gateway Timeout
The proxy server did not receive a timely response from the upstream server.
Reference #1.cae3554.1258980286.0 “

Joseph in Florida
November 23, 2009 4:46 am

What language are these *.pro files? I am guessing Fortran.

November 23, 2009 4:49 am

BTW, this was today’s leading story a couple of hours ago; now it is not in their top 5 anymore…
Strange? Not really. I suppose if the weight of traffic to that page crashed that page (the rest of the site is OK), then the fact that the page is not being accessed due to that crash would mean that the code counting page views inside that page would not be incrementing its count.

michael
November 23, 2009 4:54 am

reality vs. model
http://i50.tinypic.com/301j8kh.jpg
have fun…!

fred
November 23, 2009 4:54 am

I keep reading posts by team supporters along the line of “is that all you’ve got, that’s nothing.”
I think some of the supporters of the team need to be reminded of MM’s denial when John Finn brought up the issue at “Real”Climate.
“No researchers in this field have ever, to our knowledge, “grafted the thermometer record onto” any reconstruction. It is somewhat disappointing to find this specious claim (which we usually find originating from industry-funded climate disinformation websites) appearing in this forum.”
As McIntyre observes, the temperature was not “fully grafted”. However you want to describe it the temperature record was included to hide the decline. If you go to the “Real”Climate archives you will see that when Finn persisted in his questions, Mann suddenly became too busy to bother with the issue. Maybe he was, but he certainly avoided having to go into the messy details of what was done.
http://www.realclimate.org/index.php/archives/2004/12/myths-vs-fact-regarding-the-hockey-stick/#comment-345
If what was done is not a big deal, why didn’t Finn get a clear answer then and there?

Arthur Glass
November 23, 2009 4:59 am

“Overall we find that the observed Northern Hemisphere circulation trend is inconsistent with simulated internal variability, and that it is also inconsistent with the simulated response to anthropogenic and natural forcing in eight coupled climate models.”
Hmmm. The observed reality and the simulated ‘reality’ are out of synch. But note that the writer’s instinct is to call the observed reality, not the simulation, ‘inconsistent’.
Nothing necessarily sinister or duplicitous here, but it is language revelatory of certain habits of thinking.

Jay
November 23, 2009 5:01 am

E.M.Smith (23:56:46) :
Wonderful Explanation. Everyone should read what you wrote. Perhaps Anthony will make a blog post about this. The details are amazingly enlightening. Keep it up guys.

hunter
November 23, 2009 5:03 am

For me, as a layperson who has been in financial services marketing and who reads history, the e-mails are proof that these guys knew they were up to no good, and were doing it deliberately.
When you see how aggressive they have been in attacking skeptics, attributing vile motives to skeptics, how they depend on argument from authority, etc. etc. etc., it is clear to me that they have been doing this for a long time.
The code is where they have committed their fraud, and that, fortunately for us who are their victims, cannot be hidden so easily.
Would it not be great if someone in GISS were to have the strength of conscience that this brave person in the UK has demonstrated?

Patrick Davis
November 23, 2009 5:04 am

Well, here in Aus on Dateline tonight (ABC), we had some guy (I don’t recall the name, as I caught the tail end of the broadcast) being very “jittery” in answering questions about the content of some of the e-mails. But then, when talking politically and enjoying the fact he’d just got back from Singapore and would soon go to Copenhagen (I thought of 400 kg polar bears), he just grinned.
Nice one if you can get it.

MangoChutney
November 23, 2009 5:05 am

Asked realclimate in what context the topic quote should be taken
http://www.realclimate.org/index.php/archives/2009/11/the-cru-hack-context/

Patrik
November 23, 2009 5:08 am

http://www.cru.uea.ac.uk/ is entirely down right now if you all hadn’t noticed.

Danny V
November 23, 2009 5:11 am

And the Copenhagen propaganda machine continues at full speed.
http://malaysia.news.yahoo.com/ap/20091123/tbs-sci-climate-09-post-kyoto-f8250da.html

Chris Wright
November 23, 2009 5:11 am

Nick Stokes (20:34:52) :
As another poster mentioned, you didn’t give the full quote. It says: ” Uses “corrected” MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
In other words, the data will be ‘artificially’ adjusted so that it is more consistent with HADCRUT3. What do you suppose he meant by ‘artificial’? To me it suggests using something that is not measured data. This seems to be perilously close to fraud.
Chris

Yertizz
November 23, 2009 5:14 am

Cassandra King (04:12:03): see my earlier post @ (04:41:24)

November 23, 2009 5:24 am

“These scientists at East Anglia are both honourable and world class. Their data is not being manipulated in any bad way”.
In line with AGWThink, it is being manipulated in a *good* way.

mjw01
November 23, 2009 5:27 am

After reading your post, I read through some of the code directories and wrote a post on my blog at http://matthewjweaver.com/index.php/about-the-commented-code-released-by-the-alleged-hacker/. In this, I do not reach the same conclusion:
First, these are in a subdirectory called “oldprog” and risk being taken completely out of context. While I do not personally program in Progress, the code is rather easy to read, but these comments do not tell the whole story. I’ve now looked through more than a dozen code files and most look rather innocuous. What I wonder about is the data input and the weighting assigned to the data as it is processed by the code. There are some data files, but I’m guessing most are missing, of course; and the weighting, while in the code, is not as easy to discern and judge.
Among the data in the code directories are limited tree-ring data, temperatures, and more. Maybe summaries, sample files, or results? I do not have the time to compare this data with external datasets to see what is real and what is made up. Nor do I have time to read the code to not only spot the weighting and calculations but to interpret their impact. What I do wonder about as I read through the files is how and why specific ranges were chosen for normalization and averaging of data. Consider, too, how missing data was filled in or ignored, as well as the impact of solar cycles and other influences.
The bottom line is that all of this is ripe for manipulation to achieve whatever results are desired. Which gets us back to the email. These researchers were Kool-Aid drinkers for the religion of global warming. They have a vested interest in producing results to support their cause. The email makes this abundantly clear and shows their willingness to modify data, ignore inconvenient data, destroy data, and actively prevent independent analysis of their data.

Arthur Glass
November 23, 2009 5:28 am

” Yes, yes….science SHOULD rule (am in total agreement there).”
Science should rule what?

Ed H
November 23, 2009 5:29 am

E.M. Smith’s long post on the plight of Harry is spot on. As a long-time software engineer, I’ve had to wade into poorly written, poorly documented code myself that was written by people no longer accessible. Every ‘Harry’ in this kind of situation has a job to do: figure out what the code does well enough to get it to run, probably with some additional inputs or new requirements that necessitate changing the code some – if it can be figured out. ‘Harry’ usually is given a very tight deadline, so he doesn’t have the time or the approval to just start over from scratch. And ‘Harry’ certainly was not brought in because he knows climate – he has to pick up tidbits along the way to help him guess whether what he is writing makes any sense – and when it doesn’t, he will just try his best to make it do what the big wigs say it should do. They are defining the requirements – and changing them every few days or weeks – and they define what correct means for Harry’s programs. (Harry didn’t write them, and never would have done it that way, but they are his now, for better or worse. Just gotta try to meet those deadlines…)
But like many programmers in this situation, Harry prefers black and white – correct versus incorrect – so when Harry sees something that is particularly messed up, he will sometimes add colorful commentary to his notes, because then when he reads it again next year he’ll remember that he already figured out it was messed up and won’t agonize in another futile effort to make it make sense. He’s not expecting anyone other than another ‘Harry’ to read those comments, and the next ‘Harry’ will appreciate the comments and share the chuckle at the crap that they are made to debug and run.
If it sounds like Dilbert – it’s because what Scott Adams pokes fun at is how stuff REALLY happens. Fortunately for us, Harry did what he did, and it will probably help some of us who are looking at the programs and hoping to get some of them running ourselves. Fortunately, GDL is free and is supposed to run IDL programs as-is, though I haven’t finished setting it up myself (had to get coLinux running first), but it will be very interesting. Then there’s the Fortran stuff – that may be trickier to run as-is, since a compatible Fortran compiler is unlikely to turn up. It might have to be ported to C++ or C#. If anyone else decides to have a go at porting some of this to modern languages, perhaps we can collaborate.
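One caveat for anyone trying that: routines such as filter_cru, multi_plot and def_1color are CRU-local and are unlikely to compile out of the box under GDL. A minimal, purely illustrative stub can at least let the surrounding code run; here a plain boxcar smooth stands in for whatever CRU’s unpublished filter actually does:

; Hypothetical stub so that callers of filter_cru will compile and run.
; NOT CRU's filter: a boxcar smooth stands in for their unpublished
; low-pass routine, purely to exercise the code that calls it.
pro filter_cru, period, nan=nan, tsin=tsin, tslow=tslow
  width = round(period) > 1   ; integer window of at least 1 (5-yr in the scripts)
  tslow = smooth(tsin, width, nan=keyword_set(nan), /edge_truncate)
end

Results from such a stub obviously say nothing about CRU’s actual smoothing, but they do let the plotting logic, and the post-1960 truncation around it, be traced end to end.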
Thank you, Harry, for all the clues! (whoever you are)

Auruthia
November 23, 2009 5:29 am

I work in programming, so I had a look through HARRY_READ_ME. I was pretty appalled, but when I figured out they were talking about how to extract the data from HadCRUT2.1 to get to version 3.0, I was shocked. If it is this, then the GCMs have major problems.
As the developer trying to sort out the mess said
“So, we can have a proper result, but only by including a load of garbage!”
The data has clearly been manipulated to get the results they wanted because:
1. Poor data management meant they hadn’t got the original data.
2. Poor programming techniques meant they had no idea how significant amounts of data in 2.1 were generated.
3. Lack of documentation in the original code meant they did not know why data had been manipulated.
The data was changed to match the expected outputs. None of the papers dependent on HadCRUT3 are reliable, as HadCRUT3 itself is unreliable.
Normally in software development we do a thing called “code review”, where developers review each other’s code – it can be fun, but it is a blood sport. The stuff described here (even the first of the many, many manipulations) would have you laughed out of the room and demoted to business software tester.
The code for all models needs to be released and reviewed, as well as the source data, why it is being transformed, and what the calculation for each transformation is.
We cannot agree to implement Copenhagen without this. If everything is on the up and up and makes perfect sense, then we go ahead; otherwise more proof of the forcings (+ and −) and of the CO2 impacts is going to be required.

Scouse Pete
November 23, 2009 5:32 am

“This website is currently being served from the CRU Emergency Webserver.
Some pages may be out of date.
Normal service will be resumed as soon as possible. ”
LOL
http://www.cru.uea.ac.uk/

November 23, 2009 5:34 am

Alec J (23:59:14)
Thanks for the heads up. I just listened to the short interview on the Radio 4 Today programme.
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373594.stm
Bob Watson stayed right on ‘message’ as you’d expect him to given that his attempt to become Chairman of the IPCC failed. He clearly hasn’t spent any time whatsoever looking into the details of what has been released in the emails and data files.
Nigel Lawson has quite rightly called for the NERC (who fund CRU, the Tyndall Centre etc.) and the Vice Chancellor of UEA to set up an independent enquiry into the contents of the released emails/documents. I fully agree with this but don’t think it is ever likely to happen. There is evidence in at least one of the released emails that the VP of UEA has even been in support of CRU blocking the FOIA requests. CRU have even received advice from the ICO on how to block the requests. It’s very clear that neither the NERC nor the university’s administration is going to be independent in investigating this matter. Are they?

Basil
Editor
November 23, 2009 5:35 am

As a bit of clarification in my response to Nick, I’m now actually reading the Harry_Read_Me.txt file for myself. It is an attempt to update CRU TS, but it also references tree ring data and the problem of “the decline.” So, it would seem, on this first quick look, that prior to 1960, the world temperature data in this gridded set are being forced to look like the tree ring data, but after 1960, “real” temperatures are used. Is that a fair reading?

Midwest Mark
November 23, 2009 5:35 am

Our local “newspaper” this morning ran this headline as the top story: “Since ’97, global warming has just gotten worse”. It runs through all the usual AGW talking points; i.e., polar bears are threatened, Arctic ice is at an all-time low, huge chunks of ice are breaking off of Antarctica, glaciers are disappearing, etc. It’s an AP story from Seth Borenstein, and it’s supported by quotes from, among other sources, Janos Pasztor, a “UN climate advisor.” If you follow the jump to page A4, there’s an accompanying story (finally!) about these emails, but the tone of the story is accusatory. That is, some dastardly hackers illegally obtained information and are bent on spreading lies!
I’d say this is a good sign. The AGW coalition has to be very desperate and alarmed (no pun) to find themselves in such a tenuous position! They’re even beginning to run stale global warming stories as banner headlines! “Pay no attention to that man behind the curtain!”

MangoChutney
November 23, 2009 5:38 am

It seems my post at RC is being held up in the queue at the moment, with other posts in favour of RC sailing through.
Wasn’t one of the alleged emails about holding up and censoring posts on RC?

Editor
November 23, 2009 5:41 am

Ed (23:36:26) :

The comments in the HARRY_README file are pretty wild, however. So wild that I haven’t really figured quite what to think about that just yet. There are other comments in the source files that mention data that was lost (cloud data) and which they recreate or try to re-create based on other data or sensor inputs. The HARRY_README though is rather wild.

You can say that again. 🙂

Robinson
November 23, 2009 5:42 am

It’s also not surprising, as you will know, that those not trained as Software Developers, or in Computer Science in general, would have great faith in computer models, even going as far as to suggest that there’s something wrong with reality if it doesn’t match the model! There’s something magical about a computer, if you don’t program them for a living.

Gary
November 23, 2009 5:42 am

The next question is: what do they mean by the “real temperatures” that the programs are adjusting to? Surely not the ones affected by the demonstrated warming biases, faulty station siting, dropouts, and questionable recording standards.

stephen richards
November 23, 2009 5:52 am

E.M.Smith (23:56:46) :
Yet another very good post from you: entirely accurate, if a bit cynical. But then programmers like these make you very cynical and thoroughly peed off.
Well done EM

November 23, 2009 5:53 am

Tonight here in Australia, for the first time, Tony Jones from “Lateline” began to ask the hard questions, with an exposé of the CRU scandal as well as an interesting interview with a very nervous Tim Flannery, who attempts to represent a viable AGW platform here in Australia.
It’s a huge week here.
ABC links here:
http://www.abc.net.au/lateline/content/2008/s2751375.htm
http://www.abc.net.au/lateline/content/2008/s2751390.htm

MattN
November 23, 2009 5:53 am

I’ve got my popcorn ready.
Keep at it fellas….

Curiousgeorge
November 23, 2009 5:54 am

Danny V (05:11:32) :
And the Copenhagen propaganda machine continues at full speed.
http://malaysia.news.yahoo.com/ap/20091123/tbs-sci-climate-09-post-kyoto-f8250da.html
No doubt that story was already written and in the printing queue before the Hadley story broke. I expect we’ll see more of this kind of thing for the next few days.
When a dam breaks, it always starts with a trickle.

Frank Lansner
November 23, 2009 5:59 am

Jay (05:01:28) :
“E.M.Smith (23:56:46) :
Wonderful Explanation. Everyone should read what you wrote. Perhaps Anthony will make a blog post about this. The details are amazingly enlightening. Keep it up guys.

E.M.Smith is always a good read 🙂

Frank Lansner
November 23, 2009 6:01 am

“michael (04:54:48) :
reality vs. model
http://i50.tinypic.com/301j8kh.jpg
have fun…!

Michael: where do these data come from??? Interesting.

fred
November 23, 2009 6:02 am

P. Gosselin 01:33:38
Can you explain to me why the sunspot count of the widget shows 13 when there are just little specks?
This is not sarcasm. I have noticed several times lately that the “count” has been in that range when spots are barely visible.
Are counts today the same as they would have been a century ago? We have been told that, but I’m beginning to wonder.

fred
November 23, 2009 6:05 am

Correction to : 06:02:03
Not on the widget, but on the “Solar-Terrestrial Data”

Jimbo
November 23, 2009 6:11 am

Jesse (21:24:11) :
“Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.”
Well, Jesse, I believe that apart from the emails there was also a bit of data. When it’s all scoured over by scientists who were previously denied access, we might see the making of a mountain. 🙂

stephen richards
November 23, 2009 6:14 am

E.M.Smith (01:47:51) :
I used to teach Software Engineering many moons ago and one of the tricks we pulled on the students was designed to mimic the process you have defined.
What we did was send each group of students into a separate room and ask them to write a program sequence in English. The sequence was to be used to instruct someone to make a cup of tea. You know … get the kettle, fill it with water from a tap, plug it in, etc. etc. It was hilarious, BUT very enlightening.

stephen richards
November 23, 2009 6:15 am

sorry about the upside down M

Spartacus
November 23, 2009 6:17 am

Pay attention to the code in briffa_sep98_d.pro:
************************************************
;
; Now prepare for plotting
;
loadct,39
multi_plot,nrow=3,layout='caption'
if !d.name eq 'X' then begin
window,ysize=800
!p.font=-1
endif else begin
!p.font=0
device,/helvetica,/bold,font_size=18
endelse
def_1color,20,color='red'
def_1color,21,color='blue'
def_1color,22,color='black'
;
restore,'compbest_fixed1950.idlsave'
;
plot,timey,comptemp(*,3),/nodata,$
/xstyle,xrange=[1881,1994],xtitle='Year',$
/ystyle,yrange=[-3,3],ytitle='Normalised anomalies',$
; title='Northern Hemisphere temperatures, MXD and corrected MXD'
title='Northern Hemisphere temperatures and MXD reconstruction'
;
yyy=reform(comptemp(*,2))
;mknormal,yyy,timey,refperiod=[1881,1940]
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=22
yyy=reform(compmxd(*,2,1))
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
;
oplot,!x.crange,[0.,0.],linestyle=1
;
plot,[0,1],/nodata,xstyle=4,ystyle=4
;legend,['Northern Hemisphere April-September instrumental temperature',$
; 'Northern Hemisphere MXD',$
; 'Northern Hemisphere MXD corrected for decline'],$
; colors=[22,21,20],thick=[3,3,3],margin=0.6,spacing=1.5
legend,['Northern Hemisphere April-September instrumental temperature',$
'Northern Hemisphere MXD'],$
colors=[22,21],thick=[3,3],margin=0.6,spacing=1.5
;
end
****************************************
Pay attention to the code after “; Apply a VERY ARTIFICAL correction for decline!!”
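For what it’s worth, here is a minimal sketch of what re-enabling those two commented-out lines would show, assuming timey, yyy and yearlyadj as defined in the listing, plus some stand-in for CRU’s filter_cru (which is not public):

; Not CRU's code: compare the smoothed MXD series with and without the
; interpolated adjustment, as the commented-out lines would have plotted it.
filter_cru, 5., /nan, tsin=yyy,             tslow=tslow_raw
filter_cru, 5., /nan, tsin=yyy + yearlyadj, tslow=tslow_adj
print, max(tslow_adj - tslow_raw, /nan)     ; roughly +1.95 from the mid-1970s on

The difference between the two curves is just the interpolated “fudge factor”: zero before the 1920s, slightly negative around the 1930s-40s, and nearly +2 by the late 20th century.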
Cheers

JimB
November 23, 2009 6:18 am

“Methow Ken
…If it looks, walks, flys, swims, and quacks like a duck, theoretically it still COULD be something else. But lacking overwhelming evidence to the contrary, odds are REAL good that it’s a duck. The duck quacked in this case, and their goose is cooked.”
I think it could only be a duck up until 1960. After that, the data had to be modified and it began to smell more like a giraffe, or something… :>)
JimB

Mark
November 23, 2009 6:18 am

Re: Patrik (05:08:57) :
“http://www.cru.uea.ac.uk/ is entirely down right now if you all hadn’t noticed.”
Maybe they are busy shredding documents, deleting files and zeroing out the cleared hard drive space?
I gotta believe that certain people at GISS (and probably other similar organizations) are considering deleting old emails, data, and code.

November 23, 2009 6:20 am

A few more press links:
A better (?) link to Lawson-Watson on BBC:
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373677.stm
The Lawson Times article is now syndicated to The Australian for Tuesday (Oz time):
http://www.theaustralian.com.au/news/opinion/copenhagen-deserves-to-fail/story-e6frg6zo-1225802514603
And The Australian now making the link with the Oz legislation (ETS) debate with a response from the Opposition Senate leader (Minchin)
http://www.theaustralian.com.au/news/features/hot-and-bothered/story-e6frg6z6-1225802504484
While Fairfax is helping out with the defence:
http://www.theage.com.au/national/email-scandal-rallies-web-climate-sceptics-20091123-iysr.html

rxc
November 23, 2009 6:29 am

I can think of at least 3 nuclear power plants that were shut down and are now completely inoperable because of documentation problems similar to the issues identified in these documents. The nuclear industry is one industry where there are quite rigorous standards for code validation and verification, with well-established, open international standards for evaluating models against data.
I hope that this episode leads to more investigation of environmental mischaracterization of data and cherry picking, starting with the DDT travesty.

November 23, 2009 6:32 am

Gregg E. (02:58:54) :
“The file HARRY_READ_ME.txt is *very revealing* about just how disorganized CRU’s data and software are. “Harry” is apparently Ian Harris. If he’s the author of that file, it appears from the notes that he’s trying to straighten things out but finding that the data and previous software is a complete charlie foxtrot.
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625&page=13
I’ll call it +1 probability that the author of HARRY_READ_ME.txt is the insider who took this archive out of CRU.”
I think you are almost right in your deduction, Gregg. If you look a bit further into some of the other files, I think it’s likely the person providing the comments in HARRY_READ_ME.txt is a ‘contract programmer’ brought in to assist Ian Harris in sorting out this mess. This ‘code monkey’ has not shared the goals of the Team and at some point had had enough (perhaps they didn’t renew his/her contract at some point before the release), and has decided (because he/she still had remote access to the CRU departmental server) to assemble all the documents he/she could, take copies of the .eml files of certain staff, and, following the final FOIA brush-off email, release the emails and files to the internet.
If I’m honest, had I been in the same position (i.e. knowing that UEA were deliberately committing a crime in not complying with the FOIA requests and were in the process of attempting to cover up their mess), I’d probably have done the same.

Spartacus
November 23, 2009 6:33 am

debreuil had already noted the “trick” in the code before my last post. Kudos to him!!

John Galt
November 23, 2009 6:35 am

In past musings, some have speculated about scientific/academic fraud and what actually constitutes scientific fraud. Have we reached that point yet?

Karl Heuer
November 23, 2009 6:37 am

All,
Fox News Channel (US) has picked up the story. As of ~0830 US Central Time they were doing a segment on air.

MangoChutney
November 23, 2009 6:42 am

As expected, my post has been deleted from RC.

Vincent
November 23, 2009 6:45 am

Some of the revelations in the emails have been quite shocking – even worse than I thought. There are some interesting samples on Andrew Bolt’s blog. However, what I find even more interesting is the response of the warmists. I am waiting to see if any of them – even one – says, hmm, maybe I was wrong; maybe these guys have been subverting science.
But no. The reaction is one of absolute denial (in the non-pejorative sense). I am reminded of our old friend cognitive dissonance again. Psychology teaches us that this is exactly the behaviour to be expected – mentally trying to rearrange the facts to somehow remove the dissonance. Hence we have “mountains out of molehills”, “taken out of context” etc. All very interesting and revealing behaviour.
Yet, in a way, these individuals have a metaphorical noose around their necks. The floor on which they stand is very slowly moving downwards, but they still have time to remove the noose before the rope tightens. The price they must pay is the repudiation of their cherished beliefs. Yet they do nothing except argue, in the hope that their arguments will somehow be heard and the floor will stop descending. The longer they leave it the worse it gets. Too bad.

Enduser
November 23, 2009 6:46 am

Latest gloss-over from the Guardian.
http://www.guardian.co.uk/environment/cif-green/2009/nov/23/leaked-email-climate-change
Nothing to see here, folks, move along, all is well..

John Phillips
November 23, 2009 6:47 am

If the instrument temperatures do not agree with tree ring data over significant recent time periods, then why should we think the tree ring data is accurate for pre-instrumental times?

michael
November 23, 2009 6:50 am

On the options for how to handle data:
Jones:
Options appear to be:
1. Send them the data
2. Send them a subset removing station data from some of the countries who made us pay in the normals papers of Hulme et al. (1990s) and also any number that David can remember. This should also omit some other countries like (Australia, NZ, Canada, Antarctica). Also could extract some of the sources that Anders added in (31-38 source codes in J&M 2003). Also should remove many of the early stations that we coded up in the 1980s.
3. Send them the raw data as is, by reconstructing it from GHCN. How could this be done? Replace all stations where the WMO ID agrees with what is in GHCN. This would be the raw data, but it would annoy them.

Phil M
November 23, 2009 6:51 am

This lot is just stunning!
– and SteveMc must be feeling pretty vindicated at the moment!
– I bet he had a good weekend!
– I love the harry_readme file – it’s just amazing
– the code/data quality of the HadTemp product is so great (sarcasm)
– I think they’ll have to withdraw that from publication after this!
– and the code comments about adjusting the data post 1960 ‘to hide the decline’
– great stuff.
The other thing the emails give is a great insight into how the HockeyTeam operates
– suppressing dissent, controlling publications & reviews, hiding data they don’t like, and diverting attention from the real issue (e.g. ‘hiding the decline’) onto something they can toss back & forth ad infinitum (e.g. ‘trick’)
By the way, the ‘decline’ is the way the northern hemisphere tree-ring data doesn’t track temp after about 1960
– it does the opposite (i.e. declines)
– it does seem amazing to me that there seems to have been no attempt to find the scientific explanation for this, just loads of software tweaks to hide it….
– for if our great proxies don’t track temp reliably in the current time & recent past, why should we suppose that they do 1000 years ago??
Well, we live in interesting times!

Cheeky Monkey
November 23, 2009 6:52 am

kdkd wrote:
1. Make the unstated assumption that the paleoclimate data is the whole of the co2 forced global warming story,
The skeptics didn’t build up the importance of paleoclimate/proxies. That was done by the Hockey Team and the IPCC (to their detriment, I believe). The warmest it’s been in XXXX (insert your favorite number) has been trumpeted in countless press releases. That’s the doing of the warmies, not the skeptics. McIntyre is on record at CA saying the proxies aren’t really that important to the question of whether or not CO2 causes warming – if that’s also your point, why are Phil Jones, Michael Mann et al. in such a tizzy?
2. Hone in on increasingly small parts of the story and ignore the big picture. You have to do this because the big picture doesn’t support your position at all.
Besides the climatology/proxy stuff I guess there’s also all the modeling/sensitivity issues. Is that the “big picture”? Most of us here are less than convinced.
3. Fail to respond to counter arguments from the originators of the research.
Do you mean the lack of peer-reviewed “skeptic” articles? If so, then you might be interested in the emails talking about manipulating the peer review process and efforts to remove the editors of journals that dare to publish anything the Hockey Team doesn’t support. You strike me as a recent refugee from RealClimate, so if you mean instead the lack of follow-up “skeptic” posts at RC, then you might want to hang around here long enough to see the references to censorship over there. It’s easy to win debates when you have the power to silence your critics.
If you are in fact a frequenter of RealClimate you might like to try a little experiment. Why don’t you try posting something along the lines of “The folks at ClimateAudit and Watts Up With That claim that you practice very selective censorship of the skeptical arguments and response posts at RC. Is this true?” You might also ask if McIntyre or Watts could have a thread on RC to make their case. Report back.

Henry chance
November 23, 2009 6:53 am

“Spin that, spin it to the moon if you want. I’ll believe programmer notes over the word of somebody who stands to gain from suggesting there’s nothing “untowards” about it.
Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result”
Yes indeed. I posted last Friday that they would claim all comments were taken out of context. They did as I predicted. Now RealClimate is starting a thread to do just that. This is SPIN: covering what they wrote privately in order to save face. As a psychologist, I can say this is a common defense mechanism when people feel threatened.

November 23, 2009 6:59 am

Over here in Britain this story is starting to build up a head of steam. There are few voices objecting to transparency.

November 23, 2009 7:02 am

My reply to the Revkin/Pierrehumbert blog post
Oh please.
First of all, there’s no indication there was any vandalism involved here. In fact, it’s not even clear there was hacking involved; this looks very much like an inside job. There are no “honey, pick up a pint of milk on the way home” emails, which is interesting in itself. My own suspicion is this particular collection is one Jones himself made of documents he didn’t want released in an FOIA.
Second, scientists, such as myself, who work at public institutions are or should be aware that our email accounts are subject to FOIA. I have my own laptop with a non-university wireless modem that I use for personal business. Use of public property for personal purposes may be tacitly permitted in many places, but it shouldn’t be protected or excused.
And third, this would not have been such an issue if Jones and his cohorts had not been actively trying to hide their raw data, a practice that modern science increasingly frowns on. This is now the third embarrassment that has come from that practice (the first being the revelation of the loss of a lot of original climate data — though one must now ask if that loss was accidental — and the second the serious issues revealed a couple of months ago with Briffa’s analysis of data.). One would hope scientists would learn from their mistakes.

FerdinandAkin
November 23, 2009 7:05 am

Jesse (21:24:11):
This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.

REPLY: So what do you do down there in Norman? NSSL? U of OK? You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A
Hi Jesse,
Your ad hominem failed because the custodians and day laborers regularly show themselves to be equivalent to climate scientists and occasionally prove to be better than proponents of Anthropogenic Global Warming. That does not say much about those professional folks at CRU.
As for myself, I am a laborer as an Electrical Engineer who works on weapon systems to be used in a time of war. My day job frequently becomes my night job and sometimes my weekend job. The products of my organization incorporate this feature called “Configuration Management” and if the product has software in it, the software is written to these things called “Standards”. Before any product of my organization is transitioned from development to the military, it has to go through several program reviews, design reviews, technical evaluation, and operational evaluation. Software used in modeling a weapon has to go through Verification, Validation, and Accreditation before it is accepted. Funny, I do not see any of that in the products from CRU.
If you would like to find out about building software to a professional standard, get two programmers together who had to write MIL-SPEC code back in the late ‘80s or early ‘90s and use the words ‘ADA’ and ‘twenty one sixty seven A’ in the same sentence. Bring popcorn; the discussion will extend well into the night.

Mark
November 23, 2009 7:06 am

I see they mention the MWP in file 0845217169:
“There were also long warm spells between 900 and 1100, known as the medieval warm period, and 1360 to 1560. ”
I thought the hockey stick got rid of it?

Sparkey
November 23, 2009 7:07 am

I’m gonna need a whole boat load of popcorn!

North of 43 south of 44
November 23, 2009 7:13 am

Dr A Burns (21:29:27) :
“Another strange happening at Hadley … all the hadcrut3 data for this year, except Jan/Feb, has been deleted.
http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt
That may be the case because they went over to a “backup system”.
If that is the case, then that “missing now but not then” object of the FOI requests is also still in existence. Maybe, sort of, huh?

Mann O Mann
November 23, 2009 7:13 am

Since we are now told that “Mike’s Nature Trick” simply refers to a clever way to get something done, I think that “Mike’s Nature Trick” should become part of common language use.
For example – if a student’s GPA starts to fall off in his senior year in high school, s/he can simply graft on someone else’s grades. That’s Mike’s Nature Trick helping students get into better universities!
Or if you lost your job and can’t refinance your house, use Mike’s Nature Trick! Just submit someone else’s pay stubs to the lender for the period you are jobless and have a declining income. Mike’s Nature Trick solves the mortgage crisis!
There have to be thousands of ways that Mike’s Nature Trick can be put to work!

Steve S.
November 23, 2009 7:13 am

The manipulating of the debate by the Team at RealClimate goes on. Just as they manipulated the peer review and publication process to make the presentation of AGW science heavily lopsided, Gavin Schmidt is using RealClimate posts and responses to distort the picture of the CRU hack.
By disallowing every post he can’t squirm out of or pass off as nothing to see, he’s creating the impression there’s nothing of substance in the hacked/whistleblower content.
So once again the Team is in fact defrauding the public.
Gavin, along with the loyalist regulars at RC, piles on the routine rhetoric to marginalize the posts that are easiest to dismiss among those allowed through.
This is science to the RC regime.
When push comes to shove Gavin bails.
Even the most generic question he disallows when he doesn’t want to answer. It’s like a trial where only one side gets to clear all questions.
I tried to post this at RC and it was blocked.
“[Response: Thanks. But I don’t know what comment response you are referring to, and your claim that WUWT and CA have no agenda is laughable. – gavin]”
Well then you must know what their agenda is and you’ll be glad to share it with us?

Arn Riewe
November 23, 2009 7:15 am

Tom Fuller has done postings on this which are well worth reading. His 5th in the series provides one of the most succinct overviews of the development of corruption in the scientific process over the last 10 years.
http://www.examiner.com/examiner/x-9111-SF-Environmental-Policy-Examiner~y2009m11d21-Evidence-of-a-desperate-push-to-pump-global-warming-up-and-up?#comments

Henry chance
November 23, 2009 7:19 am

debreuil (04:25:44) :
“Ok, haven’t done fortran in 20 years, but if I read this right, it is creating a weighting hash for each 5 year period starting in 1904 (two arrays, 1st is year, second is weighting). The forties area are multiplied by as much as -.3, then in 1960 the ‘fudge’ creeps positive, up to 2.6 in 1980 onwards. It then interpolates this over the data. Please correct if this is wrong”
My Fortran is from the early ’70s, and it also sees the “weighting factors” attached.
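A quick way to check debreuil’s reading (the .pro files are IDL rather than Fortran, though the arithmetic is the same) is to print each five-year knot against its adjustment after the 0.75 scaling:

; Print (year, adjustment) pairs; the two arrays are verbatim from the file.
yrloc  = [1400., findgen(19)*5. + 1904]
valadj = [0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1, 0.3, 0.8, 1.2, $
          1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6] * 0.75
for i = 0, n_elements(yrloc)-1 do print, yrloc[i], valadj[i]
; zero through 1919, a dip to -0.225 at 1934, positive from 1949 on,
; and a plateau of +1.95 from 1974 onward

So the raw “2.6 in 1980” debreuil mentions becomes +1.95 after the 0.75 multiplier, and the negative weights do indeed sit over the 1930s-40s.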

TerryBixler
November 23, 2009 7:25 am

The code archive is hardly an archive, as it is a one-shot picture of the code. A current standard archive is fully version controlled. The version control has the code version number, the name of the modifier, an explanation of the modifications, and the date and time of the modification. You can call on the version control software to produce the previous version, or any version in the archive, and further compare any version to any other version. These are the minimal standards that should be applied to any code or dataset. It appears that CRU brought no experienced software expertise to this important process.
Further, within the code each routine needs commentary, in plain English, that describes what the routine is to do, and, as the code unfolds, a description of what the code is doing. Commentary like “truncate the series at 1960” as a header might be OK, but then how and why you were going to do it are important. To say that the data set is junk and then write code to fabricate data needs to have a header: “fabricating data here”. If after writing such a header you did not complain to the highest person you could access and succeed in getting the issue resolved, I cannot imagine writing one more line of code for anyone in that organization, no matter how august that organization appears. The issue of ethics is always present on every line of code written.
We have not even covered code design, code writing, code review and code debugging. Just some commentary on archiving and commentary.
Trillions of dollars on amateur night at the CRU. Such organized corruption by educated people is beyond belief. The damage to the universities involved is immense. These universities need to be held to task for failing the most basic supervision of their professors and research grants.

Pingo
November 23, 2009 7:26 am

You can discuss Climategate on BBC here. Be “careful” or your comment will get pulled knowing what their censorship is like.
http://www.bbc.co.uk/blogs/paulhudson/2009/11/climategate-cru-hacked-into-an.shtml#comments

Robinson
November 23, 2009 7:31 am

Great summary of code and data issues found so far at Devil’s Kitchen. Very entertaining read as well.

Gene Nemetz
November 23, 2009 7:35 am

The 62 MB isn’t the smoking gun; it’s the smoking howitzer.
“Bang, bang, they shot themselves down. bang, bang, they hit the ground…”
~~cher’s new version

Power Grab
November 23, 2009 7:36 am

@ Jesse and Mike McMillan:
…build a grad school their football team could be proud of…
I remember those days. Where did they go?
I guess OU overdosed on too much elitism, eh? This week’s gridiron game should be interesting. Got popcorn?

Count de Money
November 23, 2009 7:37 am

Here is something from Briffa that says a lot about their mindset (from 938031546.txt):
“There is still a potential problem with non-linear responses in the very recent period of some biological proxies ( or perhaps a fertilisation through high CO2 or nitrate input) . I know there is pressure to present a nice tidy story as regards ‘apparent unprecedented warming in a thousand years or more in the proxy data’ but in reality the situation is not quite so simple. We don’t have a lot of proxies that come right up to date and those that do (at least a significant number of tree proxies ) some unexpected changes in response that do not match the recent warming. I do not think it wise that this issue be ignored in the chapter.”
Mann’s response in part (938018124.txt):
“But that explanation certainly can’t rectify why Keith’s series, which has similar seasonality *and* latitudinal emphasis to Phil’s series, differs in large part in exactly the opposite direction that Phil’s does from ours. This is the problem we all picked up on (everyone in the room at IPCC was in agreement that this was a problem and a potential distraction/detraction from the reasonably concensus viewpoint we’d like to show w/ the Jones et al and Mann et al series.
So, if we show Keith’s series in this plot, we have to comment that
“something else” is responsible for the discrepancies in this case. Perhaps
Keith can help us out a bit by explaining the processing that went into the series and the potential factors that might lead to it being “warmer” than the Jones et al and Mann et al series?? We would need to put in a few words in this regard. Otherwise, the skeptics have an field day casting doubt on our ability to understand the factors that influence these estimates and, thus, can undermine faith in the paleoestimates. I don’t think that doubt is scientifically justified, and I’d hate to be the one to have to give it fodder!”
Seems like they’re more interested in putting up a united front than figuring out why their data tells different stories. Sounds like bunker mentality.
I have to get back to mopping floors now.

Gene Nemetz
November 23, 2009 7:38 am

TerryBixler (07:25:17) :
If I am remembering correctly you have been a programmer for a long time. So your viewpoint carries a lot of weight with me.

John Smith
November 23, 2009 7:39 am

Lol are you guys for real? Whatever this is about, you don’t need to understand programming to see climate change. Come to northern Canada and take a look – the north west passage is open people! All the ice melted! It has been closed up with ice for a long [snip] time.
There is no argument against global warming. It is warming. The question is how much do people contribute to it? And do we want to slow it down?

austin
November 23, 2009 7:45 am

All the data needs to be in an RDBMS, with the SQL published.

November 23, 2009 7:48 am

@ Gene Nemetz (22:52:46) :
“Look under ‘cha-ching’”
I believe the term is ‘ka-ching’… ;-))

Aligner
November 23, 2009 7:48 am

E.M.Smith (00:47:58) :

REPLY: You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A
OK, I’ll follow that lead …

A cracking post, hats off to you Sir! 30 odd years of experience here too. Mission critical software engineering and IT projects with 30+ teams in manufacturing, insurance, banking, telco and military sectors. Two directorships, 3 years consultancy to overseas government. From assembler programmer to board room and most roles in between on PCs to mainframes. Bespoke hardware design, manufacture, OEM integration and enough ‘T’ shirts to winter the rhubarb.
Never mind the cobbled up code, it’s the seemingly abject management of the whole enterprise that alarms me the most. Where’s the QA, where’s the change control? Where’s the BS EN ISO 900x conformance/paper-pushing nonsense that’s a part of all UK government IT these days? At least that would have raised a few red flags. No need to ask about documentation and operational procedures, a quick skim of HARRY_READ_ME.TXT paints the picture, even if it is a tad over the top. My heart bleeds for the poor guy, we’ve all been there! Doesn’t even appear to be a DBMS anywhere, it’s all in discrete files with virtually no integrity checks whatsoever. And politicos are betting the future shape of the world and billions of peoples’ livelihoods on this alligator swamp!? Give me a break, please.
Where’s the design, who’s managing the overall system architecture, who’s dealing with archival, disaster recovery and so on? Like as not the same uninformed amateurs with science PhDs that did a couple of modules on Fortran 66 once upon a time plus a couple of general purpose “IT gofers”. Not really surprising, after Brown’s ten year IR35 vendetta against UK IT professionals, that’s about all that’s left here now.
There’s only one thing to be done here: Send in experienced independent IT auditors, shut the whole festering mess down and box it up before valuable raw data is potentially lost forever. Do not enlist the usual milking machine offshoots of big accounting firms that employ hordes of green-horn IT graduates at max mark-up, this is serious business.
Beyond that I fully endorse Nigel Lawson’s statement in The Times today: “A high-level independent inquiry must be set up without delay.”
Concern should be raised over the size of budgets being handled here too, this is all public money regardless of the source. Just look at some of those grant application PDF files. Who is managing this at the coal face, the same people? In the same sloppy way?
And what about management of the new pig iron the Met Office just spent a huge sum of our money on? I sincerely hope it’ll be better than what appears to be going on here but perhaps we ought to see the justification first.
If you do everything by a bozo sledgehammer approach from first principles at every turn (and just look at the state of this code, for crying out loud!) then yes, maybe you do need a 1 petaFLOPS machine. But has anyone with years of practical experience of architecting big performance-bound systems from the ground up looked at the problem? Sadly, this all reminds me of nut jobs doing complex iterative floating-point actuarial computation from first principles in big insurance systems and forever screaming for more hardware. Sounds suspiciously like another pig iron salesman’s wet dream to me, never mind that its carbon footprint is equivalent to the CO2 emitted by 2,400 homes!

Evan Jones
Editor
November 23, 2009 7:52 am

Harry can you READ_ME?
Harry can you heed me?
Harry can you lead me?
Read me
Run me
Feed me
Fund me

Roger Knights
November 23, 2009 7:53 am

Prof Watson starts by saying “These scientists at East Anglia are both honourable and world class.”
Yes, isn’t it nice to think so?
rbateman: The news claims that “hackers” broke into and stole emails/data.
The real “hackers” destroyed science data long before that, under the guise of science.

You can say that again.

Akatsukami
November 23, 2009 7:58 am

Just think, if only Mann, Jones, et al. were competent, they could have gone to Canada and taken a few snapshots of the Northwest Passage, instead of cooking their data, deleting documents named in FOI requests, and strong-arming editors of learned journals into not publishing papers by those who disagree with them.

Jack Okie
November 23, 2009 8:01 am

Yertizz:
Any comments or moves from Cameron and the Tories? I’d think this might be another issue for them in the upcoming elections.

Glenn
November 23, 2009 8:04 am

John Smith (07:39:24) :
“Lol are you guys for real? Whatever this is about, you don’t need to understand programming to see climate change. Come to northern Canada and take a look – the north west passage is open people! All the ice melted! It has been closed up with ice for a long [snip] time.
There is no argument against global warming. It is warming. The question is how much do people contribute to it?”
There are more questions than that, John. How much of Arctic melting is due to warming, for instance. Another is how much warming is really happening. Another is how people contribute to it, in addition to how much. These are just a few of the basic questions.

Phillip Bratby
November 23, 2009 8:10 am

According to Paul Hudson at the BBC, he saw the emails on 12th October. http://www.bbc.co.uk/blogs/paulhudson/2009/11/climategate-cru-hacked-into-an.shtml#comments

Janis
November 23, 2009 8:10 am

I guess Al Gore wishes he hadn’t invented the internet. Live by the internet, die by the internet. When I used to write code, I would ask my boss what he wanted for results. You can make the data come out as you wish. Even a woman could do this.

Neil McEvoy
November 23, 2009 8:11 am

Jack Okie (08:01:11) :
“Any comments or moves from Cameron and the Tories? I’d think this might be another issue for them in the upcoming elections.”
Cameron’s been burnishing his green credentials since Day 1 as Tory leader. It’s been part of his drive to “disinfect” the Conservative “brand” of its image as the “nasty” party. There’s not the slightest chance he’ll take any notice of this, more’s the pity.

Pathfinder
November 23, 2009 8:12 am

Kudos, Evanjones (07:52:46), from another Tommy aficionado. This has exposed a lot of “fiddlin around, fiddlin around”!

Evan Jones
Editor
November 23, 2009 8:15 am

This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
Appalling. This is the sort of attitude we may expect should a scientocracy emerge. Be warned. (It goes further than that. Only the “credentialed” will be heard. And three guesses who arbitrates the credentials.)

Methow Ken
November 23, 2009 8:15 am

While realizing that it’s a very general and imprecise yardstick, IMO it is still worth noting that yesterday, when I threw “climategate” at Google, it came back with ~38K hits. As of 08:14 US PST, the same Google search returns 81.5K hits.
Clearly climategate has gone viral; big-time.

Ashtoreth
November 23, 2009 8:16 am

One question that interests me.
They have stated they cannot release data (to a FOI request?) as it’s been lost.
Isn’t it time to send a forensic team in and see what has actually vanished…??
Given the standards set so far (typical academic science, basically 🙂 ), I’d bet on a lot, if not all, of what has been claimed lost sitting around somewhere.
(And yes, multiple code and data versions and repositories are not good software practice. However, it’s quite prevalent in these situations….)

Yertizz
November 23, 2009 8:17 am

Jack Okie
We can but hope….I sent the content of my earlier post to a Conservative MP who had commented on the CRU in the Daily Mail.

Frederick Michael
November 23, 2009 8:20 am

All those times they used the phrase “it’s worse than you think,” we thought they were talking about the temperatures. They were actually talking about their deceptions.

Robinson
November 23, 2009 8:24 am

Aligner, I agree with your comments but,

“A high-level independent inquiry must be set up without delay.”

I’m afraid, as Sir Humphrey has demonstrated on more than one occasion (Stern?), independent inquiries are almost never independent.

November 23, 2009 8:26 am

This gives a whole new meaning to “paint-by-numbers.”

philincalifornia
November 23, 2009 8:27 am

John Smith (07:39:24) :
Lol are you guys for real? Whatever this is about, you don’t need to understand programming to see climate change. Come to northern Canada and take a look – the north west passage is open people! All the ice melted!
_______________________
Lol John are you for real?
Why don’t you go to northern Canada and take a look –

Stef
November 23, 2009 8:31 am

John Smith (07:39:24) :
“Lol are you guys for real? Whatever this is about, you don’t need to understand programming to see climate change. Come to northern Canada and take a look – the north west passage is open people! All the ice melted! It has been closed up with ice for a long [snip] time.”
As you clearly admit that the North West Passage used to be open (indeed, it is even called a passage 😉 ), shouldn’t you actually be asking why the climate became so cold that the passage was closed for the last few decades?

Roger Knights
November 23, 2009 8:35 am

“When a dam breaks, it always starts with a trickle.”
A storm starts with a single raindrop.
Prof Watson starts by saying “These scientists at East Anglia are both honourable and world class.”
We’ll see about that.
“If you look a bit further into some of the other files I think its likely the person providing the comments in the HARRY_READ_ME.txt is a ‘contract programmer’ brought in to assist Ian Harris in sorting out this mess.”
That would fit nicely with the title of the file, which would be addressed to Harry thusly (with a colon) in non-computer-filename-English: “Harry: Read Me”.

John Galt
November 23, 2009 8:36 am

John Smith (07:39:24) :
Lol are you guys for real? Whatever this is about, you don’t need to understand programming to see climate change. Come to northern Canada and take a look – the north west passage is open people! All the ice melted! It has been closed up with ice for a long [snip] time.
There is no argument against global warming. It is warming. The question is how much do people contribute to it? And do we want to slow it down?

Smith (if that is your real name):
Climate change is not evidence that AGW is happening.
You assume climate change is bad, and you also assume that the climate is stable without human influence. Neither of these beliefs is based upon facts. Far from it. The earth has had cold periods and warm periods in the distant past and the near past, and human activities had nothing to do with those.
History shows it is better to live in a warm period than a cold period. History also shows that runaway global warming, as in the IPCC doomsday scenarios, has never happened in the past. The ice core data also show that warming precedes CO2 increases by centuries, so how is it that CO2 is now causing AGW?
Thank you

Alexander Harvey
November 23, 2009 8:37 am

RE: mjw01 (05:27:59),
I was very puzzled by your “Kool Aid drinkers” reference, I could not think what Ken Kesey had to do with all this. I looked it up so I know better now.
Alex

George E. Smith
November 23, 2009 8:38 am

“”” Jesse (21:24:11) :
. . . bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers. “””
Well Jesse, I’m reasonably sure that I could not get a job at Walmart; or MacDonalds for that matter. With a name like mine, I couldn’t even fill out the job application form.
But I do shop at both places. On weekend mornings, I drop by the local Mac for my 38 cent Senior decaf (one cream and one sugar in packets) so I can sit down and wait for the Chinese gang of senior citizens to arrive from their early morning Tai Chi exercises; I bet they all used to work on the railroads; but since I don’t speak Cantonese, I can’t even ask them about that.
Mind you, I haven't shopped at Walmart since about three years ago, when I did my last binge shopping spree there. I remember I bought three pairs of shoes, two in brown for work, for $14 and $17, and I splurged $21 for a black pair for weddings and funerals. The $17 shoes were a ripoff, because after only three years of wearing every other week, the soles have cracked in half, so if it has been raining when I take my lunch walk to Carl's Jr, the cracks pump water off the parking lot, and my socks get all wet. So I wear the $14 ones on wet days.
I got three pairs of trousers for work too, and a couple of fake leather belts; I had to cough up $123 for that lot; bunch of damn bloodsuckers those people are; so I wouldn’t even work there if they wanted to hire me.
So I'm kind of glad that I decided 50 years ago to get out of the Nuclear Physics game, because there was too much secrecy, and fall back on my Electronics and Optics to get me by.
And I have a deal with my wife; she can throw food around in the kitchen as much as she likes; but she has to mop the floor; so I don’t do that deck swab thing you alluded to.
I’d really like to meet some of you real scientists one of these days; but you know how it is; if you want to steal somebody’s water skis and tow ropes, and life jackets; you pretty much have to go to a lake where they congregate, so you can have a better selection.
Maybe Jesse, that is why we haven’t ever crossed paths; we are probably in parallel universes.

Pamela Gray
November 23, 2009 8:40 am

During the times that the NW passage has been opened, weather related and Arctic oscillating current (both atmospheric and oceanic) events can easily explain it. It seems the AGW folks are more easily duped by weather than skeptics are.

Roger Knights
November 23, 2009 8:40 am

” I am waiting to see if any of them – even one – says, hmm, maybe I was wrong; maybe these guys have been subverting science. But no. The reaction is one of absolute denial ….”
But the wobbled are going to keep their insecurities to themselves, for the most part. (Although there was one believer who posted on RC a confession of her rattled state, and whose comment was re-posted here.)

Chez Nation
November 23, 2009 8:43 am

The Columbus Dispatch in Columbus, Ohio, put an AP story on the front page of the printed version of the newspaper that I read on my doorstep this morning. Inside the paper, they also published the BBC story that suggests the scientists were the victims of some crazy hacker trying to undermine Copenhagen.
The bias of this is amazing, as if there were some kind of emotional, reactionary outburst on the part of this newspaper's reporters, so over the top as to merit a review of the individuals responsible for placing this story on page one.
Note: The web version front page talks about local news instead and one can find the climate articles in the US/World section
Please consider posting a respectful and fact based comment at the newspaper’s website:
http://www.dispatch.com/live/content/national_world/index.html
thank you – Chez

George E. Smith
November 23, 2009 8:45 am

Say ChasMod, looks like some improperly commented software glitched me there, and half of my post got duplicated half a page up there.
Do be a good chap and expunge that first one with fewer tree rings.
Thanks a lot .
[It already had been done. ~dbs, mod.]

WakeUpMaggy
November 23, 2009 8:53 am

Jesse (21:24:11) : “This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.”
What these researchers failed to measure or predict with their models was the sudden expansion of the blogosphere, thanks to all that hot air created by their pontifications about the new religion. Anyone detect “clericalism” in the above statement?
Here come the “real” thinkers, the “real” statisticians, the “real” creatives, the “real” inventors, the “real” questioners, haha, Here Comes Everybody!
Along with the few religious catechists and inquisitors embedded at RC, everyone else showed up too. What a gold mine of talent in the free-thinking, unpaid world!
How incredulous they must have been when WUWT won the Best Science Blog of 2008.

Harry MacDougald
November 23, 2009 8:57 am

“”” Jesse (21:24:11) :
. . . bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers. “””
I am perhaps mistaken, but the George E. Smith of these parts, who spoke just above of Mickey D’s and Wal-Mart, appears to be the same George E. Smith who will shortly be traveling to Stockholm to pick up the Nobel Prize for Physics.
But he has too much class to mention it. Think on that for a minute.

David
November 23, 2009 8:58 am

@Pingo (07:26:25) :
The gloves are off @ the BBC Blogs. Paul Hudson posted some links, so now the mods have realised that the punters may as well be able to do the same…. 🙂

Curiousgeorge
November 23, 2009 8:59 am

Just be glad these bozos aren't programming flight control software for commercial airliners. 🙂

matt
November 23, 2009 9:00 am

From Harry_read_me.TXT
—-
19. Here is a little puzzle. If the latest precipitation database file
contained a fatal data error (see 17. above), then surely it has been
altered since Tim last used it to produce the precipitation grids? But
if that’s the case, why is it dated so early?
—-
Has some data been changed after-the-fact and time stamps purposefully altered?
Did “Tim” use the right data to produce the grids?
I’d not trust this code to accurately reconcile my checkbook…

Plato Says
November 23, 2009 9:00 am

This UK lobbying organisation has just reported the CRU to our Information Commissioner
http://www.taxpayersalliance.com/campaign/2009/11/cru-emails-reveal-inconvenient-truths-about-foi.html

Glenn
November 23, 2009 9:00 am

Here’s the only “moderating voice” critique of the code section I have been able to find, at realclimate:
“One question about the comments in code posted earlier: ‘but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.’ Could you explain what exactly ‘will be artificially adjusted to look closer to the real temperatures’ means?
[Response: It depends what their timeseries had in it. If it was the proxy record until 1960 and then the observed temperatures after, plotting it past 1960 would make it look artificially like the real temperatures. But they said to not do that. – gavin]”
*****************
I could understand if the comment read “shouldn’t plot past 1960 because that would cause an artificial adjustment that only appeared to look closer to the real temperatures”.
If I read Gavin right though, he claims they said not to do “that” – not to plot past 1960.
The way I read the comment is that it identifies two subordinate procedures to a larger reconstruction, the second one (presumably in another file) in which dates after 1960 will be artificially adjusted.
Programmers, am I way off base?

CodeTech
November 23, 2009 9:00 am

“This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.”

1. The correct phrase is “try TO”, not “try and”.
2. wal-mart should be capitalized, not sure about the hyphenation.
3. Depending on regional influence, you will probably find you still need a comma before the last item in a list, ie. “WalMart employees, and laborers”
Luckily, the real amateurs appear to be the ones running the AGW circus. I would wager that the majority of readers here, myself included, could code circles around these clowns… and likely not using Fortran, either.
See what I did there? I used the phrase “AGW circus”, then made a clown joke. How amateur is that?
Oh yeah, at first I thought you were saying “uniformed”, but I’ve never had a job that involved a uniform (unless you consider a suit and tie to be a uniform, which technically it might be). Unfortunately for your hypothesis, I am a very informed professional programmer, which is why I am interested in seeing just how bad this train wreck will get.
If I had an employee writing or even using code like I’ve seen here to justify cAGW alarmism (ie. ModelE), they would be out.

November 23, 2009 9:05 am

Growing season data (summer months when the new tree rings are formed) past 1960 is thrown out because “these will be artificially adjusted to look closer to the real temperatures”, which implies some post processing routine.
Spin that, spin it to the moon if you want. I’ll believe programmer notes over the word of somebody who stands to gain from suggesting there’s nothing “untoward” about it.
Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result.

Interesting. To restate the obvious, these guys appear to have been spinning modern era data. One must still assume that parts of the historical and paleo climate record were unadjusted, and thus still useful for climate reconstruction. Now is the time for the deconstruction crew to go to work. What did they do to what part of the record and when… and what was the untampered-with result they were obfuscating?
Next step: a new view toward the efficacy of tree rings. Remember, it was the people, not the tree rings, that lied.

WasteYourOwnMoney
November 23, 2009 9:05 am

From Roger Sowell (23:43:35) :
Quote: 1. Viking settlements in Greenland – proof positive it was warmer back then – even though CO2 was fairly low without Exxon and Shell pumping out CO2 from those evil refineries. If CO2 causes warming, then absence of CO2 must cause cooling. Cannot have a valid control system otherwise.
Roger, you are just showing your ignorance. This Greenland statement is an example of regional climate, NOT global climate. On the other hand, 10 trees in the Yamal Peninsula of Russia are proof positive of GLOBAL climate!
Don’t worry, I don’t get it either. Of course, I am just a guy stocking hammers at the Home Depot, not one of those smart climate scientists jetting off to Bali!

Aligner
November 23, 2009 9:05 am

Robinson (08:24:49) :

I’m afraid, as Sir Humphrey has demonstrated on more than one occasion (Stern?), independent inquiries are almost never independent.

You’re absolutely right, but maybe (just maybe, though) the agenda might be different under new management soon. One can only hope. Have you a better idea?

Tom_R
November 23, 2009 9:12 am

Thanks to debreuil (04:25:44) : and Spartacus (06:17:59) : for bringing this to our attention.
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,’Oooops!’
;
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
This may be the real smoking gun. It appears that Briffa just made up numbers to add to the dendro measurements to give the result he wanted. These were not weights, but temperature values added to the raw measurements; see the ‘yyy+yearlyadj’ in the code. Not only did he add temperature to eliminate the post-1960 decline (the ‘trick’), but he subtracted it from measurements around the 1930s to eliminate the warm period there.
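For anyone who doesn’t read IDL, here is a minimal Python sketch of what that fragment does. This is a rough equivalent, not CRU’s code; timey and yyy are hypothetical stand-ins, since we only have the fragment:

import numpy as np

# Rebuild yrloc/valadj from the IDL above: 1400, then 1904..1994 in steps of 5.
yrloc = np.concatenate(([1400], 1904 + 5 * np.arange(19)))
valadj = np.array([0, 0, 0, 0, 0, -0.1, -0.25, -0.3, 0, -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75
assert yrloc.size == valadj.size, "Oooops!"   # the IDL sanity check

timey = np.arange(1400, 1995)    # assumed yearly time axis
yyy = np.zeros(timey.size)       # stand-in proxy series, deliberately flat

yearlyadj = np.interp(timey, yrloc, valadj)   # equivalent of IDL interpol()
adjusted = yyy + yearlyadj                    # the 'yyy+yearlyadj' step

# A perfectly flat input now dips roughly 0.2 in the mid-1930s and sits
# about 1.95 higher by 1990, purely from the hard-coded adjustment array.
print(adjusted[timey == 1935][0], adjusted[timey == 1990][0])

Run on a flat series, the output dips in the 1930s and climbs steeply after 1960, which is exactly the shape described above.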

EconRob
November 23, 2009 9:13 am

If the models and data prove or even suggest global warming then they should just release them. It is that simple.

gary gulrud
November 23, 2009 9:13 am

The post recognizes that it will take time to analyze the dump. The emails’ significance will only be understood when the whole is understood.
But the OS timestamps, together with the archive’s size, indicate that this – whether damning or exonerating, or some combination – is a reliable snapshot of the IPCC climate science effort.
And this we already knew to be a corrupt morass. It is unlikely that what we have is the worst that might have survived normal housekeeping.

Daphne
November 23, 2009 9:14 am

Limbaugh is discussing it right now…

Trev
November 23, 2009 9:17 am

The real ‘deniers’ are the ‘Warmists’ !

Craig Loehle
November 23, 2009 9:17 am

In case anyone wants to start from scratch, almost all world weather records can be accessed in minutes from Mathematica 7 (no, I don’t work for them). They have a database that is kept updated to within hours (I downloaded Siberian stations with values from just 1 hr previously, local time, to test this). One would need to deal with data gaps somehow, and station moves, but the data is not “lost”, and I bet the Mathematica database has more stations than Hadley uses.

SandyMcL
November 23, 2009 9:25 am

Some more extracts from the HARRY_READ_ME.txt.
Where is the documentation to explain all this?
.
.
…have just located a ‘cld’ directory in **** New’s disk, containing over 2000 files. Most however are binary and undocumented.
.
.
Unbelievable – even here the conventions have not been followed. It’s botch after botch after botch.
.
.
The option (like all the anomdtb options) is totally undocumented so we’ll never know what we lost.
.
.
Right, time to stop pussyfooting around the niceties of ***’s labyrinthine software suites – let’s have a go at producing CRU TS 3.0! since failing to do that will be the definitive failure of the entire project.
.
.
..on examination the US database record is a poor copy of the main database one, it has more missing data and so forth. By 1870 they have diverged, so in this case it’s probably OK.. but what about the others?
.
.
Oh GOD if I could start this project again and actually argue the case for junking the inherited program suite!!
.
.
Then got a mail from PJ to say we shouldn’t be excluding stations inside 8km anyway – yet that’s in IJC – Mitchell & Jones 2005! So there you go.
.
.
Wrote ‘makedtr.for’ to tackle the thorny problem of the tmin and tmax databases not being kept in step.
Sounds familiar, if worrying. Am I the first person to attempt to get the CRU databases in working order?!!

Juraj V.
November 23, 2009 9:26 am

Plato Says (09:00:10) :
This UK lobbying organisation has just reported the CRU to our Information Commissioner

If I remember correctly, Jones consulted directly with the Information Commissioner on how to avoid FOIA.

Scouse Pete
November 23, 2009 9:27 am

http://www.taxpayersalliance.com/campaign/2009/11/cru-emails-reveal-inconvenient-truths-about-foi.html
Very damning indeed. Aside from the issue of whether they massaged the climate data, I see someone going to jail over this. UEA need to distance themselves from these people immediately by suspension, purely on the evidence of the FOI deletions.
For UEA to get embroiled in a wider cover-up by protecting them would damage the reputation of the institution beyond repair. It’s either 2 or 3 heads now, or lots of heads later…….

Jonathan May-Bowles
November 23, 2009 9:30 am

Right…. the killer phrase here seems to be “real temperatures”. Adjustments are being made to give the most accurate assessment. It doesn’t even need ‘spinning’ – that’s what’s written down.

Old Gasser
November 23, 2009 9:38 am

I see a striking corollary in supervision for contract maintenance of floors, where shortcuts in process can lead to less than useless results. A manager who is indifferent, corrupt, or otherwise incompetent can sign off on an eight-hour task accomplished in four hours, provided they overlook the cloud of pulverized wax particulates coating every object in the room (buffing speed too high). Then there’s shoddy prep work, or poor, perhaps non-existent maintenance between crew visits, again with bad results.
There are unwise selections of waxes, cleaners, and pads that no amount of elbow grease can ameliorate, and greed or laziness eased by too few eyeballs on duty at the right time. Sound familiar?
I’ve had employees range from HS-dropout ghetto kids, to moonlighting academics (soft sciences, invariably) whose day jobs were only about adding more acronyms after their surnames. And everything in between. Exit Questions: Which group was the most trainable; reliable; took most pride in work? Bonus Question: Who most often needed to be counseled regarding personal hygiene?

November 23, 2009 9:40 am

I know the e-mails are so yesterday, but here’s more (from Briffa) on the Yamal larches, jiggering for “growth signals” and the nature of the Nature trick (bold). If this is too long snip away.
>On Mon, 3 Nov 1997, Keith Briffa wrote:
>
>>
>> Tom
>> thanks for the info. Actually this is a chance for me to to mention that
>> we have for the last few months at least, been reworking the idea of
>> looking in the Schweingruber network data for evidence of increasing tree
>> growth and hence ,potentially at least, evidence of changing tree(read
>> biomass) uptake of carbon.
>> The results are dramatic – not to say earth shattering because they
>> demonstrate major time-dependent changes – but changes that are consistent
>> in different areas of the network. We have regionalised over 350 site
>> collections , each with ring width and density data , age-banded the data
>> so that we look only at relative growth in similar ages of trees through
>> time and recombined the standardisd curves to produce growth changes in
>> each region. Basically growth is roughly constant (except for relatively
>> small climate variablity forcing) from 1700 to about 1850. It then
>> increases linearly by about up until about 1950 after which time young ( up
>> to 50 year old) basal area explodes but older trees remain constant . The
>> implication is a major increase in carbon uptake before the mid 20th
>> century – temperatue no doubt partly to blame but much more likely to be
>> nitrate/Co2 . Equally important though is the levelling off of carbon
>> uptake in the later 20th century. This levelling is coincident with the
>> start of a density decline – we have a paper coming out in Nature
>> documenting the decline . In relative terms (i.e. by comparison with
>> increasing summer temperatures) the decline is represented in the ring
>> width and basal area data as a levelling off in the long-timescale inrease
>> ( which you only see when you process the data as we have). The density
>> data do not show the increase over and above what you expect from
>> temperature forcing.
>> I have been agonising for months that these results are not some
>> statistical artifact of the analysis method but we can’t see how. For just
>> two species (spruce in the western U.S. Great Basin area and larch in
>> eastern Siberia) we can push the method far enough to get an indication of
>> much longer term growth changes ( from about 1400) and the results confirm
>> a late 20th century apparent fertilization! The method requires
>> standardizing (localized mean subtraction and standard deviation division)
>> by species/age band so we reconstruct relative (e.g. per cent change) only .
>> We have experimented with integrating the different signals in basal area
>> and density(after extracting intra ring ring width and density data where
>> available) within a ‘flat mass’ measure which shows a general late 20th
>> century increase – but whether this incorporates a defensible relative
>> waiting on the different components (and what the relative carbon
>> components are) is debatable. We now need to make some horrible simplistic
>> assumptions about absolute carbon in these (relatively small) components of
>> the total biomass carbon pool and imlpications for terrestrial and total
>> carbon fluxes over the last few hundred years – and beyond! Without these
>> implications we will have difficulty convincing Nature that this work is
>> mega important.

>> There are problems with explaining and interpreting these data but they are
>> by far the best produced for assessing large scale carbon-cycle-relevant
>> vegetation changes – at least as regards well-dated continous trends. I
>> will send you a couple of Figures ( a tiny sample of the literally hundreds
>> we have) which illustrate some of this. I would appreciate your reaction.
>> Obviously this stuff is very hush hush till I get a couple of papers
>> written up on this. We are looking at a moisture sensive network of data at
>> the moment to see if any similar results are produced when
>> non-temperature-sensitive data are used. You would expect perhaps a greater
>> effect in such data if Co2 acts on the water use efficiency .

Jeff Mitchell
November 23, 2009 9:44 am

The thing I look at is the big picture. I’ve been reading WUWT for several months now and have been following the reports of people attempting to get data and having their requests shoved where the sun don’t shine. The individual emails or documents are suspicious on their own merits, but when you consider that we’ve been speculating these things for a long time and not getting the cooperation we’ve been trying to get, we tend to resolve any ambiguity along the lines of what we already know about these people. If the emails or documents were really innocent, the stonewalling and lack of cooperation has pretty much vaporized any sympathy we might have otherwise had.
Add to that the fact that these people were pressuring peer-reviewed publications not to publish opposing views, and then citing the absence of those views from the literature as evidence that they weren’t credible. It is mind-boggling.
If the science is settled, then it should be easy to show. That it isn’t, is a humongous red flag. If it is science, given the same data and methods, then others can replicate the experiment and see if it works. Or they can take issue with the methods applied to the data or challenge assumptions. The resistance to letting the full cycle of scientific processes take their course indicates that it is not science that is occurring. If a scientist is confident of his results, he should be happy to share. In fact, he should be really happy. When you put out data and methods for critique, you get tons of free review which would otherwise be expensive.
Resistance, to me, means that they are not so confident. They may even know that the result they are trying to prove is false. The underhanded way in which they’ve tried to stifle opposition seems to support that. They imagine that someone will find something wrong with it. Why would they imagine that? Do they know something we don’t? Instead, I think they should put it out hoping that somebody will find something wrong with it. Particularly if they think the evidence really points to what they really believe to be a serious situation. If a large asteroid is found that appears to be heading for collision with the earth, one hopes that their finding is wrong and that someone, anyone, will be able to show that it isn’t. Each time I’ve seen a report of a possibility of an asteroid there is first the initial worry that it will hit earth, then later findings have usually been that the reported objects will miss and the first person to have found it will be very, very happy he was wrong. Why aren’t the AGW people trying to prove it isn’t happening? Wouldn’t they be glad to find out they are wrong?
At this point, I think it would be good to have an investigation of these people and let the chips fall where they may.
I would like to see all the communications compared with backups to establish authenticity. I would like to see if the data that was “lost” is actually still there. If they don’t allow the data out, or lose it, any papers based on that data are worthless. The science can’t be settled if it hasn’t been corroborated with full transparency. The science can’t be settled if the models based on that “settled science” don’t actually predict what is happening. And of course, losing the data seems too convenient at this point. The cool thing is that if the data is missing, the claims based on it have to disappear too. Science demands all claims to be backed by observations, and if the observations disappear, the claims have no support and must also disappear. For the CRU it is a lose-lose proposition.
I’m hoping that now this stuff is out in the open, that others will throw their oars in too. I am glad this debate has opened up with the release of these documents. Let the games begin.

November 23, 2009 9:45 am

Juraj V. (09:26:39) :
The Info Commissioner here takes no prisoners; he HATES being dicked about. Try this one:
http://www.out-law.com/page-8896
“The Information Commissioner has ordered the Government to release minutes of cabinet meetings at which the decision to invade Iraq was made. The order has been made under the Freedom of Information (FOI) Act.
The decision comes despite the fact that the FOI provides qualified exemptions which mean that documents about the formulation of Government policy and ministerial communications do not always have to be released.
The Cabinet Office has refused to release the documents because of these exemptions, but the Information Commissioner’s Office (ICO) has said that the exemptions, contained in section 35 of the FOI Act, are qualified, and are subject to a public interest test.
“As section 35 is a qualified exemption, a blanket approach cannot be taken to justify the withholding of all information to which the exemption is engaged,” said the decision notice produced by the ICO. “Rather, the analysis of the public interest must focus on the circumstances and context of the information in each case.”
“In order for the section 35 exemption to be maintained, in all the circumstances of the case, the public interest in maintaining the exemption must outweigh that in the disclosure of the information,” said the notice.
The FOI request relates to two cabinet meetings held between 7 and 17 March 2003 during which the opinion of the Attorney General on the legality of an invasion was discussed. The ICO said that the importance of such a decision and the need for transparency must take precedence.
“The Commissioner considers that a decision on whether to take military action against another country is so important, that accountability for such decision making is paramount,” said the decision notice. “In this case, in respect of the public debate and controversy surrounding the decision to take military action in Iraq, the process by which the government reached its decision adds to the public interest in maximum transparency.”
“This is reflected by, among other matters, the controversy surrounding the Attorney General’s legal advice on the legality of military action and the ministerial resignations which took place at that time,” it said.
The Cabinet Office argued that the release of the minutes would undermine the confidentiality of Cabinet discussions and the concept of Cabinet collective responsibility, by which all members of Cabinet stand by its decision even if they had disagreed with it in discussions.
“In the Information Commissioner’s view the public interest in disclosing the Cabinet minutes in this particular case outweighs the public interest in withholding the information. He believes that disclosure of the information would allow the public to more fully understand this particular decision of the Cabinet,” said an ICO statement.
The ICO said it had considered “the gravity and controversial nature of the subject matter; accountability of government decisions; transparency of decision making, [and] public participation in government decisions,” the statement said…”

Gary Richmond
November 23, 2009 9:47 am

I first saw this as a breaking story on Twitter and started following it. The search engine for the e-mails is a gem. Thanks!!
You published snippets of the code. If any reader wants to see all the e-mails AND all the data and code, etc., Wikileaks (http://wikileaks.org/) has a link for it. Follow it and click on a download link, which has five or six mirrors. Be patient. The servers are busy. I eventually managed to snag a copy (it’s a 61.9 MB zip file; extract it and it will create two folders, one for the documents and one for the e-mails).
There’s a LOT of stuff here. What we need is for someone with the necessary web and database skills to build another search engine for the documents. Any takers?
Finally, huge thanks to this website and others for providing a necessary counter weight to the avalanche of dysfunctional peer review “science” coming out of the IPCC, the CRU and the European Union.

Bernie in Pipewell
November 23, 2009 9:59 am

Juraj V. (09:26:39) :
Plato Says (09:00:10) :
This UK lobbying organisation has just reported the CRU to our Information Commissioner
If I remember correctly, Jones consulted directly with the Information Commissioner on how to avoid FOIA.
——————————————–
I’ve wondered about this. I think he said “FOI officer”, who may well be an employee of CRU or UEA.

t-bird
November 23, 2009 10:07 am

You say false data, I say desired data. Can’t we all just get along?

E.L.
November 23, 2009 10:09 am

The smoking gun is comments in code that specifically state, “adjusted to look closer to the real temperatures.” 0.o
Yes sir, they must be really pulling the wool over people’s eyes by adjusting things to reflect real temperatures.
Social scientists are missing an opportunity to study irrational human behavior here.

Bart
November 23, 2009 10:27 am

It seems reasonable, on the one hand, to toss out proxy data which does not agree with actual measurements. The thing I find disturbing is, doesn’t that invalidate the proxy data prior to 1960, which is being used to argue unprecedented warming?

rbateman
November 23, 2009 10:31 am

.
The option (like all the anomdtb options) is totally undocumented so we’ll never know what we lost.
.
—I have an idea—-
Imagine that one day in the Louvre it is found that some highly respected curators of historic art (Rembrandts, etc.) have swapped out fakes and burned the originals.

November 23, 2009 10:33 am

From the BBC Environment correspondent:
“At 5:57pm on 23 Nov 2009, kh1234567890 wrote:
“I was forwarded the chain of e-mails on the 12th October”
And you just sat on it for a month, hoping that it would go away?”
http://www.bbc.co.uk/blogs/paulhudson/2009/11/climategate-cru-hacked-into-an.shtml#comments
!!!!!!!

jryan
November 23, 2009 10:39 am

I had a post rejected at RC. It was in response to a snide comment by a regular saying “Since when is Steve McIntyre a scientist?” (in response to making data available to scientists), to which I responded:
“Well then, since when is Michael Mann a statistician, if we’re picking nits?”
It’s been clear for over a decade that climatologists believe themselves to be experts in all things, so long as it furthers the cause of AGW. They are statisticians, geologists, physicists… forget the objections from the real statisticians, geologists and physicists about how they apply these disciplines!

rbateman
November 23, 2009 10:48 am

Frank Lansner (04:26:49) :
It’s really quite simple, Frank. Just go to the sites:
MESOWEST
http://mesowest.utah.edu/index.html
or
http://climate.usurf.usu.edu/products/download.php
and start looking at the data on the early ends of records.
Then compare with the record starts in HARRY_READ_ME.txt
Then examine things like old newspaper archives (in historical societies, museums, etc) as I have done.
I conclude that data in the records has been altered, or even totally destroyed.
I have to stop here, or I’ll be seeing red and say very bad things that will make the mods snip me.
I am so angry at what these monsters have done.

Slartibartfast
November 23, 2009 10:59 am

My favorite Nostradamus moment:

D et al – Please write all emails as though they will be made public.

Wonder what he was thinking, there?

vukcevic
November 23, 2009 11:03 am

Jesse (21:24:11) :
“This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.”
“George E. Smith”
George Elwood Smith (born May 10, 1930) is an American scientist, applied physicist, and co-inventor of the charge-coupled device. He was awarded a one-quarter share in the 2009 Nobel Prize in Physics for “the invention of an imaging semiconductor circuit—the CCD sensor”.
http://en.wikipedia.org/wiki/George_E._Smith

BobMbx
November 23, 2009 11:06 am

While the emails provide a voyeuristic interlude, the real damage will come from the data.
Avoid the decline, indeed.
Hey, try avoiding your career swirling around the commode bowl…..

Bob Tatz
November 23, 2009 11:07 am

Google “Tim Mitchell cru”
Not odd that http://www.cru.uea.ac.uk/~timm/index.html gives…
Page temporarily unavailable
The main CRU webserver is currently down.
These pages are being served from the CRU Emergency Webserver.
Not all pages from the main server are available, and what pages are available may be out of date.
Odd that the cached page from Nov 9 is blank (except for /head – see source)
Info on CRU TS 2.0:
http://www.cgd.ucar.edu/cas/guide/Data/cruts20.html
(Tim M’s program that Mr. Ian (Harry) Harris is trying to figure out)
Curious for someone processing historical temps…
Tim M is apparently a young earth creationist:
http://www.arthurhu.com/99/12/noice.txt
Hmm, interesting fellow.
Regards,
Bob

AKD
November 23, 2009 11:26 am

Plato Says and bernie
There is an e-mail that specifically states their FOI people sought and received advice from the Information Minister on how to block the FOI requests.

Jerker Andersson
November 23, 2009 11:35 am

Can you imagine the activity at CRU at this moment? I mean, people are not going to show up and do their normal things today, drink some coffee, and then go home at the end of the day.
The activity must be high: checking what data is available, internal meetings, and strategies to face this situation officially.
Instructions to delete any information and mails that could in any way show that data was intentionally adjusted to show what they want it to show.
Delete any mail that could cast any doubt on their so-called science, and so on…
I am quite sure a lot of information and data have undergone some housecleaning to hide any traces of what has been said or done in the last few days.

tensorized lurker
November 23, 2009 11:36 am

Has anyone mentioned yet that Tom P. of CA fame is likely none other than NOAA’s Thomas Peterson?
Expertise: Convening lead author of Chpt. 1 – Assessment of changes in extremes using daily in situ data
http://www.agci.org/programs/past_workshop_participants/about_the_scientist/participant_details.php?recordID=25

Richard M
November 23, 2009 11:37 am

Now we get to see what “denial” really is. Scott Mandia, RR Kampen, Nick Stokes, kdkd, Jesse … yes, all of you and more. (BTW, where is Mary Hinge?) Sorry if I left anyone out, but you folks are looking foolish. You are doing exactly what people do when they find out a loved one is sick. You start grasping at any little straw and saying, “see, it can’t be that bad”. But it is that bad.
Think of someone on trial. Sure, a couple of these items may be circumstantial, but we’re looking at literally hundreds of pieces of evidence. Any reasonable person with an open mind will see this clearly. Time for you climategate “deniers” to suck it up and admit you’ve been conned.

George E. Smith
November 23, 2009 11:42 am

“”” vukcevic (11:03:57) :
Jesse (21:24:11) :
“This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.”
“George E. Smith”
George Elwood Smith (born May 10, 1930) is an American scientist, applied physicist, and co-inventor of the charge-coupled device. He was awarded a one-quarter share in the 2009 Nobel Prize in Physics for “the invention of an imaging semiconductor circuit—the CCD sensor”.
http://en.wikipedia.org/wiki/George_E._Smith “””
But vukcevic; as I have reported several times already, I AM NOT George Elwood Smith; and received no part of any Nobel Prize.
My middle name is Edward; and no, it has no connection with any British Royalty. Actually it relates more to a British family who emigrated to New Zealand and became very prominent in the shoe business; a shoe business that I actually worked for while working my way through University.
Like my namesake, I also am a Physicist; and a mathematician, and can work competently in the solid state Physics of the semiconductor Technology; not only in Silicon; but also in the III-V compounds of the LED business. Those fields can benefit, from my University background in Optics, and Electronics. Back in my Academia stint, I actually taught Optics, and Atomic Physics at my alma mater; and no I was NOT a professor; merely a junior lecturer.
Also I have never actually designed or used a CCD; well except in some early P&S camera I may have owned once. All my digital camera sensor interests are now CMOS rather than CCD (which requires its own bastard bipolar like transistor process), whereas CMOS sensors use exactly the same process, as microprocessor or memory chips use, so can be integrated with processing electronics.
It also happens that I had lunch last week with a retired Solid State Physicist, (German) who is quite familiar personally with both the Smith CCD guy, and the other parties to that invention; as well as knowing just who did what.
My guess is there’s a George E. Smith on a street corner in just about any town; but most of them aren’t physicists. There was also one very famous one who survived the Pearl Harbor attack, by diving off his sinking ship, and swimming under the burning fuel to safety; although he got seriously burned in the incident; ending his Naval career. I doubt, you’ll find me on Wikianything; but sometimes Google does turn over a rock I may be under; and as I pointed out to Jesse, I don’t do any deck swabbing; I’m far too old for that.

Bob Levinstein
November 23, 2009 11:45 am

Senator Inhofe has called for a Senate investigation into this:
http://thehill.com/blogs/blog-briefing-room/news/69141-inhofe-to-call-for-hearing-into-cru-un-climate-change-research
Call your congressman and senators to ask them to support this. (They take calls seriously). This is a chance to actually get something done here–let’s not let it pass.

LarryOldtimer
November 23, 2009 11:46 am

I wasn’t one of those fancy dancy “scientists” in my career, I was a mere professional civil engineer. I was in responsible charge of (legally and actually) and supervised a good many highway and hydraulic engineering projects. I worked in both the private and public sectors.
All or any of the calcs, plans and specifications I produced during my career, and any procedures or methodologies I utilized, were open and available for anyone to review. I welcomed review from other professional colleagues, and anyone else who was interested, as I did not consider myself immune to error, and different ideas might prove to be fruitful for improving my designs.
I produced nothing whatsoever, in writing or in conversation, which I would have had the least hesitation anyone else knowing about. Petty “work” politics held no interest for me, and I wouldn’t have allowed myself to get involved in such things. But then, I did consider myself a professional engineer, and I did think that those who paid me for my services should receive commensurate work from me that they paid me for, private or taxpayer.
I have long since been retired, but (just in case) I am still licensed as a professional civil engineer in CA.
Laurence M. Sheehan, PE (CA) # C17518 (Just call me Larry)

Scouse Pete
November 23, 2009 11:47 am

A quick heads up here:
http://www.bbc.co.uk/blogs/newsnight/fromthewebteam/2009/11/monday_23_november_2009.html
I suspected BBC2’s Newsnight would delve into the issue. Be interesting to watch. 22:30 UK Time
“And Susan Watts will be looking into the University of East Anglia (UEA) row. Thousands of emails and documents stolen from there and posted online suggest to some that researchers colluded to make the case for climate change. She’ll be asking if we can trust the scientists.
Do join Jeremy at 10.30pm on BBC Two.”

a jones
November 23, 2009 11:55 am

Sorry if repeating.
Briggs has a good take on this in terms of uncertainties. Here:
http://wmbriggs.com/blog/?p=1362
Kindest Regards

George E. Smith
November 23, 2009 11:57 am

“”” Harry MacDougald (08:57:23) :
“”” Jesse (21:24:11) :
. . . bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers. “””
I am perhaps mistaken, but the George E. Smith of these parts, who spoke just above of Mickey D’s and Wal-Mart, appears to be the same George E. Smith who will shortly be traveling to Stockholm to pick up the Nobel Prize for Physics.
But he has too much class to mention it. Think on that for a minute.
[REPLY – Hoo! c.f. Noblesse Oblige! – Evan] “””
Harry, as I say above, that is not me; and given the state of ill repute the Nobel Prizes have come under recently, I am not sure I would want one. Interested parties might find it illuminating to research the Physics Nobel Prize of our present uber-green Energy Secretary, Steven Chu. Don’t be surprised if the name Arthur Ashkin pops out; he invented Optical Trapping about 39 years ago at Bell Labs, some 15 years before Chu (his “pupil”) did the work for which the Nobel was awarded – namely, Optical Trapping. Vladilen S. Letokhov in the USSR was working along similar lines.

MotorYogurt
November 23, 2009 12:00 pm

The code comment is a big “maybe”. Maybe it means this, maybe it means that, or maybe it just means what it means. Not much in it to get fired up about though, speculate all you want.

robertg222
November 23, 2009 12:01 pm

Does anybody else hear “Nearer My God to Thee” playing aboard the HMS Global Warming?

tallbloke
November 23, 2009 12:04 pm

kdkd (20:41:01) :
If that’s the best you can do

You ain’t seen nothing yet boi.

Phil Clarke
November 23, 2009 12:19 pm

From RC [Response: At least since 1998, the producers of a single MXD series (Briffa and colleagues) have counseled against using their series past 1960. Finding that, in fact, they don’t use that series past 1960 in doing analyses is hardly surprising. If you don’t like this, don’t give it any weight in your assessment, and look at the other series instead. But finding code that supports exactly what is in the literature is hardly a smoking gun. – gavin]
Yawn.

Mikey the Physicist
November 23, 2009 12:26 pm

Oh well, at least they comment the code. About 30% of the developers I work with still refuse to do that…

Harry MacDougald
November 23, 2009 12:54 pm

Well, George Edward Smith the physicist but not the Nobel-Prize winning George Elwood Smith the physicist, I apologize for the error and stand corrected in my orthopedic shoes.
Not exactly a slander, though, was it?
Well, even if you ain’t even got a Nobel Prize to your name, I greatly enjoy your posts here on WUWT. You have a singular voice that is very enjoyable to read.
Very best regards,

old construction worker
November 23, 2009 12:55 pm

P Gosselin (01:33:38) :
‘No sunspots today.
But if you want, I’m sure I could produce some.’
Thanks for the chuckle.
Hey, Jesse
Maybe I’ll change my handle to Fire Ant

Jarmo
November 23, 2009 12:57 pm

I’ve been reading that HARRY text…. how CRU temps are produced. Utter shambles!
Here, the expected 1990-2003 period is MISSING – so the correlations aren’t so hot! Yet the WMO codes and station names /locations are identical (or close). What the hell is supposed to happen here? Oh yeah – there is no ‘supposed’, I can make it up. So I have 🙂
If an update station matches a ‘master’ station by WMO code, but the data is unpalatably inconsistent, the operator is given three choices:
You have failed a match despite the WMO codes matching.
This must be resolved!! Please choose one:
1. Match them after all.
2. Leave the existing station alone, and discard the update.
3. Give existing station a false code, and make the update the new WMO station.
Enter 1,2 or 3:
You can’t imagine what this has cost me – to actually allow the operator to assign false WMO codes!! But what else is there in such situations? Especially when dealing with a ‘Master’ database of dubious provenance (which, er, they all are and always will be).
False codes will be obtained by multiplying the legitimate code (5 digits) by 100, then adding 1 at a time until a number is found with no matches in the database. THIS IS NOT PERFECT but as there is no central repository for WMO codes – especially made-up ones – we’ll have to chance duplicating one that’s present in one of the other databases. In any case, anyone comparing WMO codes between databases – something I’ve studiously avoided doing except for tmin/tmax where I had to – will be treating the false codes with suspicion anyway. Hopefully.
Of course, option 3 cannot be offered for CLIMAT bulletins, there being no metadata with which to form a new station.
This still meant an awful lot of encounters with naughty Master stations, when really I suspect nobody else gives a hoot about. So with a somewhat cynical shrug, I added the nuclear option – to match every WMO possible, and turn the rest into new stations (er, CLIMAT excepted). In other words, what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad, but I really don’t think people care enough to fix ’em, and it’s the main reason the project is nearly a year late.
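The false-code recipe Harry describes is simple enough to sketch. This is a hypothetical Python rendering, not his code; existing_codes stands in for whatever database lookup he actually used:

def make_false_wmo_code(wmo_code, existing_codes):
    # Multiply the legitimate 5-digit code by 100, then add 1 at a time
    # until a number is found with no matches in the database.
    candidate = wmo_code * 100
    while candidate in existing_codes:
        candidate += 1
    existing_codes.add(candidate)
    return candidate

codes_in_use = {7126500}                         # suppose 71265*100 is taken
print(make_false_wmo_code(71265, codes_in_use))  # -> 7126501

As Harry himself concedes, nothing in this scheme stops a made-up code from colliding with a real one in a database that was never checked.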

Neil
November 23, 2009 1:00 pm

SUSAN WATTS !!!!!!!!!!!!!!!!!!!!!!!!!!
She of the “let’s sex up Obama’s inauguration speech by selectively editing it” fame?
This is bound to be an “on message” sound bite from the BS Broadcasting Corporation.

Bernie in Pipewell
November 23, 2009 1:06 pm

AKD (11:26:40) :
Plato Says and bernie
There is an e-mail that specifically states their FOI people sought and received advice from the Information Minister on how to block the FOI requests.
—————————————
I was wrong, it’s “FOI person”. It was the link with the Chief Librarian in the sentence below, extracted from 1228330629.txt, that made me think they were both employed by CRU or UEA.
“I’ve got to know the FOI person quite well and the Chief
Librarian – who deals with appeals.”
There’s no mention of contacting the Information Minister in this email.
If they had sought and been given advice on how to avoid an FOI request by a minister of the Crown, the effluent would really hit the fan.

munge-man
November 23, 2009 1:12 pm

As a professional programmer (who did work on scientific software at one point in my life), I would just like to point out that comments in code don’t actually mean ANYTHING. Yes, the original intention is to inform those who read the code in the future, but that’s it.
The comments have no semantic effect on what the code does, so to claim that a comment in the code is some sort of smoking gun is absurd at best. In addition, comments often suffer “bit-rot”: that is, they become out of date and incorrect as the code changes and the comments don’t. Often, programmers will change the code without making the corresponding changes to the comments.
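A contrived Python illustration of that bit-rot (hypothetical, purely to show the failure mode): the comment once matched the code; the code was later edited and the comment was not.

def adjust(series):
    # Apply a 5-year running mean to smooth the series.   <-- stale comment
    return [x + 0.75 for x in series]   # the code now adds an offset instead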
If you want to look for smoking guns, read the code, not the comments. Does the code do as the comment says? Does the code really “bias” the data in some fashion so as to create a predetermined outcome?
All code of this kind (scientific and otherwise) should be made open source. Then the conspiracy nuts can go crazy looking for the “tricks” and “biases” in the code.
BTW, I don’t buy Apple computers anymore because they are liberal and bias the results of my code.

Vargs
November 23, 2009 1:13 pm

Don’t hold your breath Scouser. This will be damage limitation by UEA.

DocMartyn
November 23, 2009 1:14 pm

“Nigel Lawson has quite rightly called for the NERC (who fund CRU, the Tyndall Centre etc) and the Vice Chancellor of UEA to set up an independent enquiry into the contents of the released emails/documents. I fully agree with this but don’t think this is ever likely to happen”
You might be surprised. There will be a new Conservative government in the next six months and Lawson carries a lot of weight, still. This is the sort of issue that an incoming government can pursue to show what a misapplication of resources the last 12 have brought.

November 23, 2009 1:18 pm

This is why we should never accept computer models without seeing the underlying code.

debreuil
November 23, 2009 1:19 pm

I find it interesting that in public they can’t fulfill a proper FOI request because of the value of the code/generated data, and then in private they collude to delete it. I’d say I’m not sure which to believe, but having seen the code and the process, I am certain it has very little value : ).

debreuil
November 23, 2009 1:22 pm

“Finding that, in fact, they don’t use that series past 1960 in doing analyses is hardly surprising.” [Gavin]
Every graph I’ve seen from them suggests the data goes from pre 1900 to the end of the hockey stick. If they are not using the same data for the blade in those graphs, they are being very dishonest.
My biggest beef though, is the bad name they have brought to hockey.

James Chamberlain
November 23, 2009 1:30 pm

Wait for someone to roll over from the hardened CRU/Jones/Mann, etc. machine. It will happen. Just wait.

debreuil
November 23, 2009 1:47 pm

munge-man:
I have looked at the code as well (it is still there under the comments), and it doesn’t seem to suffer from the (oh so common) bit rot that you mention. I guess that is mostly because these are very short programs, usually under 100 lines.
So it is doing what it says, which is generally removing, adjusting, creating or damping the signal after around 1960 (but not always exactly then). I can’t confirm that anything sinister is going on, as I’m still trying to get it running, but it does strike me as a bit weird.
I mean, stepping back: if your signal gets worse as your verification data gets better, doesn’t that have to mean you have a problem with your signal? I just don’t see how that doesn’t invalidate your previous data, or at the very least invalidate comparisons between the two sets.
It seems to me to be
1) recent temperatures don’t match our proxy (I assume the proxy predicts lower temperatures)
2) throw out/scale the proxy data for recent times and make things fit
3) wow, recent times are way hotter than before.
That is like saying the planets all rearranged their orbits around 1600 to become heliocentric.
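In toy form (hypothetical numbers in Python, purely to show the mechanics of steps 1–3 above):

import numpy as np

years = np.arange(1900, 2000)
real_temp = 0.01 * (years - 1900)   # instrumental record: a steady rise
# Hypothetical proxy: tracks temperature until 1960, then declines.
proxy = np.where(years < 1960, real_temp, 0.6 - 0.02 * (years - 1960))

# Step 2: discard the divergent post-1960 proxy and splice in real temps.
spliced = np.where(years < 1960, proxy, real_temp)

# Step 3: the splice shows an unbroken rise (0.99 by 1999) even though
# the proxy itself ends near -0.18; the divergence is simply hidden.
print(proxy[-1], spliced[-1])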

Vargs
November 23, 2009 1:54 pm

“If you want to look for smoking guns, read the code, not the comments. Does the code do as the comment says? Does the code really ‘bias’ the data in some fashion so as to create a predetermined outcome?”

See above the message from Tom_R … there is an example of the use of a static array of “VERY ARTIFICIAL” fudge factors which are applied to the raw data to reduce temperature in early years and increase it more recently.
On a more interesting note it’s intriguing to see the way that “the Team” are responding. So far they’ve fielded Watson to tour the studios to tell everyone that the people at CRU are honourable and that the physics is irrefutable. Simultaneously a couple of stock “scientists say..” standard scares have been floated.
At what point, I wonder, will the big beasts break their silence? James might be right. I wonder whether they’d throw the CRU people to the wolves to protect the main message? I’d love to be a fly on the wall in those discussions.

questioning
November 23, 2009 1:57 pm

debreuil
November 23, 2009 2:05 pm

Man, reading poor Harry’s comments again… All I keep thinking is: when you have anything that tortured, it will tell you anything you want to hear.

michael
November 23, 2009 2:10 pm

Frank Lansner (06:01:23) :
They’re from the hacked files; download them and see it all…
Sorry for the missing link, I’m in Europe now and a little drunk…
But anyway, at Copenhagen we should have a big demonstration!

tallbloke
November 23, 2009 2:13 pm

Dr A Burns (00:22:58) :
>>Another strange happening at Hadley … all the hadcrut3 data for this year, except Jan/Feb, has been deleted.
>>>>But of course they’d delete it, wouldn’t want the very inconveniently record cold Northern Hemisphere October 2009 muddying their “warming”.
Gregg,
Recent months actually showed an increase in temperature in hadcrut3. Perhaps it was realised by Jones et al that the data did not reflect reality?

More likely a side effect of running on the emergency webserver. Out of date files.

November 23, 2009 2:15 pm

Scientific Doomsday Mania
by
Amitakh Stanford
22nd November 2009
There is a doomsday message that is swiftly gaining global acceptance. The new wave is clothed in acceptable clichés and has won over the support of many of the respected scientific communities.
Unlike most other doomsday messages, this one is supposedly based upon scientific evidence. The scientific “doomsdayers” wear masks and pretend that they are predicting calamities based on hard evidence. This lulls the unsuspecting public into absolute belief and acceptance of the doomsdayers’ ravings.
If the same message were given in a spiritual setting, the adherents would probably be encouraged to turn to God in preparation for the final days. Generally, scientists have sneered at and mocked spiritual predictions regarding the end times, and the same scientists have convinced the general public to do likewise. Further, governments of the world use their police powers to suppress, restrict, or even eliminate these spiritual-based groups. Scientists have now one-upped the spiritual believers by supporting their dire predictions of calamity with supposed scientific evidence. Using their scientific clout, they have now convinced most of the world leaders to meet in Copenhagen. The stated agenda of the gathering is to halt global warming with a unified and urgent approach.
People may remember that there have been similar gatherings to solve the global economic crisis. In those meetings, every leader attending was told to boost their economies by stimulus spending. By and large, the world leaders have dutifully followed those dictates. One might ask: Is the global recession over due to this unified approach – or is it deepening? Many thinking economists have finally realized the latter to be the case.

TGSG
November 23, 2009 2:19 pm

George E. Smith (08:38:19) :
“”” Jesse (21:24:11) :
. . . bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers. “””
Well Jesse, I’m reasonably sure that I could not get a job at Walmart; or MacDonalds for that matter. With a name like mine, I couldn’t even fill out the job application form.
……………………
I have been smiling for three days over this whole “affair”.
This turned my smile up at least 100 Watts.
thanks

Andrew
November 23, 2009 2:31 pm

Someone might want to look at this subroutine and determine exactly what it is doing and where it is being used
Subroutine: CruTSTestAnn
File: FOI2009/FOIA/documents/cru-code/linux/mod/homogeneity.f90
!*******************************************************************************
! although this ‘works’ in the sense of running without execution errors, it does not
! seem to do a very good job of detecting inhomogeneities, whether with the ‘Simple’
! option turned on or off. This appears to be because there is no indication within
! the testing procedure itself as to whether the change is gradual or sudden
! I am now developing CruTSTestMon to see whether the same testing procedure can be made
! more effective by utilising the multiple streams of information in the same year
! to detect a simultaneous change across all months
subroutine CruTSTestAnn (QPassFail,DataO,DataD,Order,Suffix,&
BegN,EndN,BegO,EndO,BegD,EndD,CandO,CandD)
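
For readers wondering what “detecting inhomogeneities” involves: below is a minimal Python sketch of the weakness this comment block describes – an illustration only, not the CRU routine; all names and numbers in it are invented. A test that only compares means either side of a candidate breakpoint fires on a gradual climate trend just as readily as on a sudden station change, which is exactly what the programmer admits.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1960, 2000)
noise = rng.normal(0.0, 0.2, years.size)

# Two series that both end up about 1.0 higher than they started:
sudden = np.where(years < 1980, 0.0, 1.0) + noise   # a step, e.g. a station move
gradual = (years - 1960) / 40.0 + noise             # a slow, real trend

def mean_shift(series, k):
    # Difference of means either side of a candidate breakpoint at index k.
    return series[k:].mean() - series[:k].mean()

for name, series in (("sudden", sudden), ("gradual", gradual)):
    print(name, round(mean_shift(series, 20), 2))
# Both shifts come out far above the noise level, so this test flags both;
# without a second statistic (e.g. a fitted trend) it cannot tell a genuine
# inhomogeneity from climate.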

tallbloke
November 23, 2009 2:40 pm

Basil (04:32:20) :
Ashtoreth (03:52:14) :
Monckton is absolutely correct: we need to take the raw data and the calculations, and build new, verified models and data sets to see what is happening BEFORE we spend all this money.
While I agree, we need to realize that there is no longer any raw data, at least at CRU. So it — the raw data — will have to be acquired all over again. Given that this is now international politics, and not just academics cooperating in the interest of disinterested science, that may no longer be possible.

Both CRU and GISStemp are based on GHCN data, which is available. So reconstruction from that is as good as it will get.
Of course this also means that when the warmists talk of independent data sets, they are lying through their teeth.

Roger Knights
November 23, 2009 2:48 pm

MotorYogurt (12:00:54) :
“The code comment is a big “maybe”. Maybe it means this, maybe it means that, or maybe it just means what it means. Not much in it to get fired up about though, speculate all you want.”

If it’s a maybe, it means the science isn’t settled. That’s enough for a delay before committing to waste a trillion or two, and for a disinterested scientific review of this whole affair–and, indeed, of the whole global warming issue, which contains maybe 50 points in dispute. Perhaps a dozen panels of retired scientists can be recruited for the job.

Glenn
November 23, 2009 2:52 pm

Meet Harry:
“Research Staff
Mr. Ian (Harry) Harris
Dendroclimatology, climate scenario development, data manipulation and visualisation, programming”
http://www.cru.uea.ac.uk/cru/people/#Research%20Staff
He’s made CA before, on papers with Briffa, Jones, Osborn:
http://www.cru.uea.ac.uk/cru/pubs/byauthor/harris_ic.htm

Evan Jones
Editor
November 23, 2009 2:53 pm

It’s one thing to get caught with your hand in the cookie jar.
It’s another thing to pull your hand out of the cookie jar and deny you were trying to steal cookies.
And it’s another thing to be unable to pull your cookie-stuffed hand out of the cookie jar while being unwilling to let go any of the cookies, deny you were trying to steal cookies, and call for the indictment of the bakers for producing fattening products.

GP
November 23, 2009 2:53 pm

Reading the Harry_Readme file and assuming it is genuine (it probably is – it sounds kinda familiar), I am surprised that so many are surprised by the contents.
Most ‘Govt.’ databases are unlikely to be accurate, and sharing bits of them around won’t improve them.
I was looking at some years of official Road Accident databases a few years ago and decided to drop the locations onto a map to see how accurate the grid refs were, etc. Many were excellent, but many had been ‘offshored’, some by several thousand miles iirc.
Commercial databases fare little better, as a friend of mine has just mentioned based on the first week or two of operation on his company’s new combined systems platform.
The only thing you can be certain about is that data will always contain garbage in. The question is how much, and what level of garbage out results. If the Harry_Readme file is real, this particular sow’s ear is going to take some serious effort to convert to a silk purse.

debreuil
November 23, 2009 2:53 pm

MotorYogurt (12:00:54) :
The code is under the comment, and the comment’s intention can be verified there.
There are a lot of maybes in life, but not for a compiler.

Roger Knights
November 23, 2009 2:55 pm

“If you want to look for smoking guns, read the code, not the comments. Does the code do as the comment says?”
Red herring. These comments don’t just document (or mis-document) the code. Some of them provide documentation about what “the user” wants, what the problems are with the data, the quality of the code, etc., etc.

Keith Minto
November 23, 2009 2:59 pm

The BBC early this morning (Sydney time) had an excellent discussion on radio about the emails with Pat Michaels and Christopher Booker, and the scales are definitely tipping in our favour; a very clear case was made for uncertainty in the IPCC data.
Even our old ABC, at last, had a response by Tim Flannery on the emails – a limp response admittedly, but it is a start. I have searched the media outlets but have found no links to the above broadcasts; someone else may have better luck.

tallbloke
November 23, 2009 3:00 pm

“I’m going to free the code Phil”
“I’m sorry HARRY, I can’t let you do that”

notsopront
November 23, 2009 3:11 pm

I rarely do this, but this one found my attention. Haven’t downloaded and therefore read the whole thing, disclaimer. First, FORTRAN? How sad. C anyone? One can get lost in the trees, and I think that just might be the tactic. One can get confused; this is either false or true. Looks true, since the denials are as pathetic as I’ve seen. Two points. World’s been scammed. By complicit governments. Lotsa moolah here. Second, that being true, what does anyone do? Maybe you can’t do a dammed (sic) thing. Get ready, BOHICA. Mogambo was right. Hope he breaks 100 from time to time. But let’s dissect this ad infinitum. THERE’S AN AGENDA!

November 23, 2009 3:11 pm

I find it odd that the source code had been released years ago, and no one found anything wrong with it then, though many sifted it with a fine-tooth comb. Does this code release give different results than what was released before?

Bernie in Pipewell
November 23, 2009 3:29 pm

BBC 2 Newsnight, with Paxman as presenter, just had a report on the leaks; it was pretty even-handed.
Newsnight with Paxman as presenter is as good as you will get on UK TV. No other TV news programme in the UK surpasses it. BBC Radio 4’s Today with John Humphrys as presenter is its only equal.

Rob
November 23, 2009 3:41 pm

Don’t believe what you are told without obtaining all the data.
Some years ago I wanted to develop land I owned. I was informed by the local authority that another access from my land onto a major road was unacceptable, as there had been a significant number of accidents at the many other accesses along this busy road. I was sent data showing the number and dates of these accidents but NOT their positions; there were many, and some serious, so I thought, well, that’s it. Before throwing in the towel, and out of interest, I confirmed where exactly and at which junctions these accidents had occurred, and this is the kicker: there were about a dozen accesses along this road and two sets of traffic lights, one set at each end, and ALL the accidents had occurred at the traffic lights. There had never been an accident anywhere other than at the traffic lights. I gained my permission.
Snow tried to do a hatchet job on Fred Singer on Newsnight tonight – had the skeptic miles away and the warmer in the studio, and continuously interrupted the skeptic – but Fred got his point over.

Rob
November 23, 2009 3:43 pm

Sorry, Paxman not Snow; and it was not even-handed.

Bernie in Pipewell
November 23, 2009 3:46 pm

Rob (15:41:23) :
It was Paxo – don’t upset the lanky green cyclist.

Keith Minto
November 23, 2009 3:48 pm

Our Tim…….
“But Professor Flannery says the scientific community knows enough to say greenhouse gases cause global warming, and that humans are responsible.
“The thing is we deal with an incomplete understanding of the way the Earth’s system works, we know enough to say as the IPCC said that greenhouse gases cause warming,” he said.
“They are 90 per cent-plus sure that it’s caused by humans, we can go that far.”
Well, we do live a long way away……

Bernie in Pipewell
November 23, 2009 3:51 pm

Rob (15:43:56) :
Sorry Paxman not Snow, it was not even handed.
—————————————
T’was

rbateman
November 23, 2009 4:14 pm

tallbloke (14:40:06) :
I see evidence that they have gotten to a lot of the data at the sources.
Comparing data sets with provenance to ascertain whether it is original or altered is now not an option, but a necessity. They are banking on nobody knowing how far they went.
I am hoping that others will dig far enough into the climate data to see what I see.
DO NOT under any circumstances assume that the climate data you are looking at is 100% genuine.
Check it against other sources.

Tom in Texas
November 23, 2009 4:35 pm

“BBC 2 Newsnight, with Paxman as presenter, just had a report on the leaks,it was pretty even handed.”
Glenn Beck had a long segment on his show this evening. It wasn’t even-handed, it was scathing. Loved it.

climatebeagle
November 23, 2009 4:35 pm

Interesting change in the UEA statement:
Original:
http://enviroknow.com/2009/11/21/east-anglia-university-cru-climate-hacking/
Current: (17.45 November 23)
http://www.uea.ac.uk/mac/comm/media/press/2009/nov/homepagenews/CRU-update
The comment by Dr. Phil Jones has been dropped.

November 23, 2009 5:35 pm

Jesse (21:24:11) :
Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
REPLY: So what do you do down there in Norman? NSSL? U of OK? You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A

It is comments like this that cause elitism. What is a scientist except someone who tries to understand through experimentation and observation?
What do you need? A PhD? If I have one, then will you say that I am a scientist? Just trying to get the lay of the land here. Do I have to agree with AGW before I am ruled a scientist? Publish findings? Use a peer review process? Oh wait, if you don’t agree with AGW you don’t get peer reviewed… Hmmm… almost sounds like collusion to me, but I suppose I will simply go back to my day job, which obviously can only be at Wal-Mart…
By the way, your comment was in point of fact very offensive. Do you not believe in challenging authority? If you are wrong in the challenge, all that has to be done is for the authority to provide proof of how they are right. If they are wrong… well, if they are honest about it, they can still be an authority who simply incorporated additional information into their knowledge base, making them an even better authority. The scientists in these emails have actively colluded (which we had evidence of before) in keeping data from the hands of people like Steve McIntyre. While they are sure that their data supports their beliefs, Steve has already shown that they have messed up more than once on some pretty major pieces of information, both taking seats of honor in the IPCC reports.
Perhaps they are a little worried about actually being reviewed by someone better than they are. I mean, the idea of peer review is that you have someone on your level review your work, not someone better, lol…
Not that you care about any of this – you have already classified us as people who, in your mind, perform meaningless tasks in society. Which I feel, again, is highly offensive both to us and to the people who have those jobs.

yonason
November 23, 2009 5:51 pm

“This is what they did — these climate “scientists” on whose unsupported word the world’s classe politique proposes to set up an unelected global government this December in Copenhagen, with vast and unprecedented powers to control all formerly free markets, to tax wealthy nations and all of their financial transactions, to regulate the economic and environmental affairs of all nations, and to confiscate and extinguish all patent and intellectual property rights.
The tiny, close-knit clique of climate scientists who invented and now drive the “global warming” fraud — for fraud is what we now know it to be — tampered with temperature data so assiduously that, on the recent admission of one of them, land temperatures since 1980 have risen twice as fast as ocean temperatures.”

http://pajamasmedia.com/blog/viscount-monckton-on-global-warminggate-they-are-criminals-pjm-exclusive/

P Wilson
November 23, 2009 5:54 pm

http://www.eastangliaemails.com/emails.php?page=1&pp=25&kw=manipulate
Seems that Phil Jones et al are totally obsessed with WUWT and Climate Audit – so they panic when Steve McIntyre publishes his analysis, and admit they have no idea whether he’s correct or not. The only one who might understand it is a certain Tim Melvin, who is a “loose cannon” according to Mann, so he shouldn’t be contacted directly.
Here is what is described as a “climate case study”
http://www.powerlineblog.com/archives/2009/11/024995.php

P Wilson
November 23, 2009 5:58 pm

climatebeagle (16:35:27)
That’s amusing. It says:
“The selective publication of some stolen emails and other papers taken out of context is mischievous and cannot be considered a genuine attempt to engage with this issue in a responsible way.”
They’re the ones who wrote the damn things!

UnfrozenCavemanMD
November 23, 2009 6:02 pm

So let me get this straight. Tree ring proxies are not to be trusted when they diverge from the modern thermometer record, but when they erase the Medieval Warm Period, they are gospel. Do I have that right?
It reminds me of something I learned about newspapers a long time ago. If you have ever had first-hand involvement in a story that was reported in the newspaper, you realize that half of what is written is wrong, but when you read the next story about which you have no prior knowledge, you find yourself taking the whole thing at face value. What’s wrong with this picture?
These leaked files prove that Jones & Co. have stopped being scientists and started using data the way a drunkard uses a lamppost, for support rather than illumination.

Ed Darrell
November 23, 2009 6:06 pm

As I’ve read through the e-mails, the data “dropped after 1960” is dendrochronological data — tree rings. As you recall from your vast experience in this issue, tree ring data correlates very well with other temperature measures until about 1960, and then it tails off as if temperatures declined. However, thermometer readings from the same places don’t show a drop. So, the data aren’t used where they cease to be informative. Critics, unused to trying to make serious science studies, will argue that all the dendro data should be scrapped — that only strengthens the trends of the other data toward warming, though, as I read it.
There is a mystery as to what happened in about 1960 and afterward which impinged on the growth of trees. It may be additional pollution. It may be acid rain. It may be insect plagues (though that should be regional, shouldn’t it?). Critics again may argue the data should be dropped, but the “divergence issue” is well known, described in papers, and of course you’re well aware of the debate in all its incarnations.
I don’t follow the dendro stuff that closely. My suspicion is that the forests used for the tree ring samples were afflicted by air pollution in the latter 40 years of the 20th century, and that caused a decline in growth which shows up, in a study that tries to correlate growth with temperature, as equivalent to a decline in temperature. Critics should be wary here: if the cause is anthropogenic, it damages the case against warming even more. If air pollution itself masks the results of warming, then there is a double whammy to deal with. Surely you’ve covered this issue before, Mr. Watts.
If you read the e-mails, you learn that the “Nature trick” is to impose hard measures of temperature on the charts — real data, as opposed to predictions or projections from measures based on hypothesis or theory. The “trick” is to make the charts more honest, to expose errors in models, and generally to shed more light.
Are you really opposed to using real measures of temperature in place of calculated values? Why?

mlsimon
November 23, 2009 6:30 pm

Oh that’s encouraging, the fate of the world hanging on shade tree coders.
I have no formal training in coding and yet I have written numerical calculation code that flew on the F-16 (taken from a routine I wrote for the A-320). It is not where you do your work (under the shade tree) but how well thought out and documented the code is.
When I started in programming (around 1977) there were very few schools teaching the subject. Many engineers learned to do it by reading manuals and magazine articles. Structured code and good documentation are the keys to good code. Good design is a help as well.
After a while with old code you need to stop patching and do a bottom up/top down redesign. It just gets too crufty otherwise.
The problem I see is that in the early days there were no standards enforced.
Going back and trying to properly document undocumented code is very difficult. That is why for “real code” the documentation is done when the particular routine is written.

debreuil
November 23, 2009 6:31 pm

I noticed Gavin was taking a lot of hard questions on RC – it must make for a long night, but kudos to him for that. I’m not sure if my code questions got pulled or are just still in moderation (a few hours; I’ll still assume the latter). In any case I’ve dug further and I think Q.1 is yes (I’m really surprised that is allowed through, though). Q.2 I really am not sure of. I read the whole Harry file last night, and it seems that was how 3.0 was approached, but then in other comments Gavin mentioned it was completely independent.
Anyway, this was the post – I’d be interested in what other programmers who have been through Harry’s read-me ordeal think of 2:
Hi Gavin,
First, kudos to you for this marathon, I understand it must not be fun, and appreciate the huge effort.
I have been a programmer for 20 years plus, but like most people looking at this stuff not a climate expert in any way. That said, I do understand working with data and the daily travails of code massage. A lot of things causing arm waving are just normal day in the life stuff imo, but I do have two questions.
1) Just to verify my assessment of the Fortran code: it seems that throughout the code there is a cutoff of about 1960 (not always exactly that) where proxy data is replaced, weighted, or blended with other more accurate measurements due to the proxy data not matching a known signal. Some of the comments use unfortunate language in retrospect, but people should try to make comments clear in any case, so I have no issue with those. So it isn’t a comment issue; I just want to be sure that I understand correctly that the pre-1960 and post-1960 data is coming from / influenced by two different sets… is that a fair assessment?
2) The Harry file seems normal enough (not best practices I’m sure, but the poor guy had his work cut out for him with that data, ugh! Hat tip to him). It seems to have started as just improving existing code, but for most of the remainder it’s been a log of creating the 3.0 datasets and code (is that correct?). It seemed to me in general that the new set isn’t considered correct until it gives a close match to the old set. Was that the goal (and I understand that can be a legitimate goal), or was 3.0 meant to be a second set of data/code to compare against and verify the first?
Thanks very much for your time,
Robin

Bernie in Pipewell
November 23, 2009 6:37 pm

Tom in Texas (16:35:11) :
After years of “THE CONSENSUS AMONG SCIENTISTS IS……..” from the BBC, an even-handed report from Auntie (BBC) on climate = Glenn Beck being scathing^nth.
I think the only reason there was an even-handed report, or even a report, on the “emails” was because they’ve been taking a caning all day on their blogs for censorship and for hiding behind legalities.
This blog
http://www.bbc.co.uk/blogs/thereporters/richardblack/2009/11/copenhagen_countdown_17_days.html?s_sync=1
was shut down on Friday because:
” Update 2309: Because comments were posted quoting excerpts apparently from the hacked Climate Research Unit e-mails, and because there are potential legal issues connected with publishing this material, we have temporarily removed all comments until we can ensure that watertight oversight is in place.”
Then was reopened today with this:
“Update 2 – 0930 GMT Monday 23 November: We have now re-opened comments on this post. However, legal considerations mean that we will not publish comments quoting from e-mails purporting to be those stolen from the University of East Anglia, nor comments linking to other sites quoting from that material.”
After they watched the posts all day, there was this:
“Update 3 – 2116 GMT Monday 23 November: As lots of material apparently from the stolen batch of CRU e-mails is now in the public domain, we will not from now on be removing comments simply because they quote from these e-mails.
However, an important couple of caveats: a) the authenticity of most of the material has not to our knowledge been confirmed, and b) it would be easy when posting quotes to break inadvertently some of the House Rules – such as the one barring posting of contact details – which are still in operation and which will see comments being blocked.
In addition to our news story and Roger Harrabin’s analysis, those of you enraptured by this issue will probably have noticed Paul Hudson’s post on his climate blog, and Martin Rosenbaum’s post on his Freedom of Information blog. If not – enjoy. There’s also a comment board open at the moment on climate change generally that you might want to plaster.
Again – there’s nothing at all barring comments on the original blog ”
I cannot remember Auntie ever being turned by the general public – the government, yes, but not the public.
They sail on regardless with their liberal left-wing agenda, forgetting that they are paid for by the licence fee, paid, under threat of imprisonment, by everybody who owns a TV in the UK.
As an interesting aside, Roger Harrabin, one of their climate reporters, claims to have received the “emails” on the 12th October. I’m not sure he didn’t mean 12th November, which seems to have been some kind of red letter day for the original poster.

artwest
November 23, 2009 6:39 pm

Monbiot over at The Guardian in Surly Semi-apology Shock Horror!
He can’t resist erecting a huge straw man, and he also suggests that it’s a few bad apples, but it’s still a major shift. He calls for Jones’ defenestration, for crying out loud.
http://www.guardian.co.uk/commentisfree/cif-green/2009/nov/23/global-warming-leaked-email-climate-scientists?showallcomments=true#comment-51

mlsimon
November 23, 2009 6:44 pm

If you read the e-mails, you learn that the “Nature trick” is to impose hard measures of temperature on the charts — real data, as opposed to predictions or projections from measures based on hypothesis or theory. The “trick” is to make the charts more honest, to expose errors in models, and generally to shed more light.
Are you really opposed to using real measures of temperature in place of calculated values? Why?

What the divergence tells us is that tree rings are not a good way to measure temperature, because we lack all the relevant confounding variables: rainfall, air quality (volcanic eruptions?), CO2 content of the atmosphere, cloud cover, etc.
To say something caused the divergence is true. But if you discard the later data, you also have to discard the prior data until you know the cause.
And it is dishonest in the extreme to append “real” values to a chart of calculated values without making that explicit each and every time the chart is presented and the data used.
The idea of deriving temperatures to an accuracy of 0.1 deg C, or even 0.5 deg C, from tree rings is a triumph of self-delusion over error bars.

P Wilson
November 23, 2009 6:49 pm

Ed Darrell (18:06:23)
It’s unlikely, as trees produce their own volatile organic compounds – more than humans do – so trees produce three times more pollution than we do…
However, if it’s sulphides, then most of those were naturally occurring even over the 20th century, and they hardly have any effect on trees anyway – so it’s essentially down to ozone and CO2. As CO2 is increasing all over the world, that increases tree growth, whilst ozone has the opposite effect, although since 1960 ozone has been on a decreasing trend.

Keith Minto
November 23, 2009 6:50 pm

“UnfrozenCavemanMD (18:02:42) :
These leaked files prove that Jones & Co. have stopped being scientists and started using data the way a drunkard uses a lamppost, for support rather than illumination.”
Very good. I thought of a beehive with a queen(?), workers and drones, but I like yours better.

Reace
November 23, 2009 7:18 pm

Haven’t seen this posted anywhere yet, but here’s another interesting code snippet from \FOIA\documents\osborn-tree6\briffa_sep98_d.pro:
yyy=reform(comptemp(*,2))
;mknormal,yyy,timey,refperiod=[1881,1940]
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=22
yyy=reform(compmxd(*,2,1))
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
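
For those who don’t read IDL, here is roughly what that block computes, re-expressed in Python (a sketch: the variable names follow the IDL, and the yearly time axis is an assumption):

import numpy as np

# Anchor years: 1400, then 1904, 1909, ... 1994 (IDL: findgen(19)*5.+1904)
yrloc = np.concatenate(([1400], np.arange(19) * 5 + 1904))
# The hand-entered adjustment, scaled by 0.75 on the "fudge factor" line
valadj = np.array([0, 0, 0, 0, 0, -0.1, -0.25, -0.3, 0, -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

if yrloc.size != valadj.size:
    raise ValueError("Oooops!")  # mirrors the IDL sanity check

timey = np.arange(1400, 1995)                # assumed yearly time axis
yearlyadj = np.interp(timey, yrloc, valadj)  # equivalent of IDL interpol()

# The final two IDL lines, if uncommented, would smooth and plot
# yyy + yearlyadj: the MXD series nudged down slightly mid-century and up
# by as much as +1.95 from about 1970 on.
print(yearlyadj[timey >= 1930][::10].round(2))

Note that in the snippet as quoted, the two lines that would actually apply yearlyadj are themselves commented out, so this file alone does not show whether the adjusted series made it into any published figure.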

Reace
November 23, 2009 7:28 pm

Note the specific use of the term “fudge factor”.
From the wiki entry on “fudge factor”:
“Some variables in scientific theory are set arbitrarily according to measured results rather than by calculation (for example, Planck’s constant). However, in the case of these fundamental constants, their arbitrariness is usually explicit. To suggest that other calculations may include a “fudge factor” may suggest that the calculation has been somehow tampered with to make results give a misleadingly good match to experimental data.”
http://en.wikipedia.org/wiki/Fudge_factor

Revolutionist
November 23, 2009 7:41 pm

How about a link to FOI2009.zip?

Hank Hancock
November 23, 2009 7:44 pm

As I read through HARRY_READ_ME.txt, I wanted to understand what was so important about the work he was tasked with that he would invest three to four years lamenting over such crappy and disparate data and code. Here’s what I’ve been able to surmise through a personal effort to sort out some history and context for Harry’s narrative. Perhaps the following will add some additional insight into Harry’s read me file.
* This was not a side project or a snapshot of some lowly intern’s work on an obscure project, as some warmers have suggested, but rather a central and important project – the finalization of the CRU TS3.0 GA (general availability) data product. What is important to understand is that TS3.0 was released as a beta product in 2006, around the time Harry begins his work, and certainly before Harry completes his narrative. Absent any other commentary, it seems most logical to assume his initial objective was a clean-up, verification, and finalization of TS3.0 GA, with a possible intent to produce a TS3.1+, given that his work spanned several years past the release of TS3.0 and clearly makes use of newer data files with data current to 2009.
* One gets the initial impression that Harry doesn’t know what he is doing. However, if you dig into his comments, you find Harry doesn’t know what the previous programmers were doing, and spends considerable time searching for clues. It is my impression that Harry is actually a reasonably good programmer equipped with lousy tools and no versioning system, handed a completely undocumented, grossly disorganized, and incomplete package of code and data from the former TS2.1 project. From this mess he is tasked with creating TS3.0.
* The data files and code Harry is trying to make sense of are the basis of CRU’s published TS2.1 dataset product, consisting of interpolated (on a 0.5 degree latitude-longitude grid) global monthly precipitation, temperature, humidity, cloud cover, diurnal temperature range, frost day frequency, T-Max, T-Min, vapor pressure, and wet day frequency data spanning from 1901 to 2002 (Mitchell and Jones, 2005).
* While TS3.0 was released in 2006, it was released as a beta product. Harry’s approach to completing / refining TS3.0 appears to be one of creating his own recompilation of TS2.1 and adding in more current data. This would make sense given that he spends much time comparing his ongoing results to the final TS2.1 product. As a side note, the CRU publicly comments that one major difference between TS2.1 and TS3.0 is “no homogenisation is performed in the latter.” [http://badc.nerc.ac.uk/data/cru/]. It seems TS3.0 is merely the same horse under a different blanket in terms of methodologies and data aggregation concepts. TS3.0 is scaled by factors of 10 and 100 (tenths and hundredths of a degree / day respectively) to allow for the use of integers in the published product. Given that Harry seems happy to achieve an accuracy of .5 to one degree at best using fudge factoring and data glossing, it seems TS3.0’s accuracy is probably no better (or potentially worse) than TS2.1. An alternate view might be that TS3.0 could have been a more accurate product but was “normalized” to the manufactured biases of TS2.1 to maintain a consistent lineage of data products.
* The developer(s) of the TS2.1 dataset are apparently no longer available to answer simple questions; perhaps the Team’s practical application of the saying “there comes the time in the evolution of every product when you must kill the engineers and go into production.” Harry identifies Tim Mitchell and Mark New as the previous programmers and often questions their methodologies. It is also evident that Tim and Mark lost or deleted relevant sets of source data used in TS2.1 before the project was handed to Harry.
* Harry seems to acknowledge that some of the earlier data may be available to him but it is deemed unusable due to an obscure (undocumented) history of data handling. In Harry’s own words, near the end of his narrative, he states “I am seriously close to giving up, again. The history of this is so complex that I can’t get far enough into it before [my] head hurts and I have to stop. Each parameter has a tortuous history of manual and semi-automated interventions that I simply cannot just go back to early versions and run the update prog. I could be throwing away all kinds of corrections – to lat/lons, to WMOs (yes!), and more.”
* It might help some of the Fortran experts looking at the source code files to understand that the final organization of TS3.0 is a 360-lat x 720-long grid that is output as 720 columns and 360 rows per timestep. The first row in each grid is the southernmost (centered on 89.75S). The first column is the westernmost (centered on 179.75W). It might help to look at the end hashtables in this light; see the indexing sketch after this list.
* The version history of the CRU TS product line that I’ve been able to figure out is: TS3.0 is descended from TS2.1: [Mitchell and Jones, 2005: An improved method of constructing a database of monthly climate observations and associated high-resolution grids.], which is descended from TS1.2 and TS2.0: [Mitchell, T.D., et al, 2003 – A comprehensive set of climate scenarios for Europe and the globe], which is ultimately descended from CRU TS1.0 and TS1.1: [New, M., Hulme, M. and Jones, P.D., 2000: Representing twentieth century space-time variability].
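
As flagged in the grid-layout item above, here is a small Python sketch of that indexing, inferred from Hank’s description rather than taken from CRU code:

def grid_index(lat, lon):
    # Map a latitude/longitude to (row, col) on the 0.5-degree grid:
    # rows run south to north (row 0 centered on 89.75S),
    # columns run west to east (col 0 centered on 179.75W).
    row = int((lat + 90.0) / 0.5)
    col = int((lon + 180.0) / 0.5)
    return row, col

print(grid_index(-89.75, -179.75))  # (0, 0): first value in each timestep
print(grid_index(52.25, 1.25))      # roughly Norwich, home of CRU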

philincalifornia
November 23, 2009 7:47 pm

Maybe I’m getting ahead of myself here, but I’ve always had some disdain for Tamino and his (former) superiority complex. It’s perhaps instructive to rehash the essay(s) by co-conspirator Grant “Tamino” Foster on how the HadCru data correlated well with GISS.
Yes, if George Monbiot can apologize, then I can post a link from ClosedMind:
http://tamino.wordpress.com/2008/01/24/giss-ncdc-hadcru/
When the cluster**** that is HadCru can be fully determined, what’s the statistical chance of GISS being such a closely correlated cluster**** by coincidence ??
Over to you Senator Inhofe ……..

November 23, 2009 7:56 pm

Chris Wright (05:11:41) :
Well, what do you suppose he meant? What kind of adjustment is not ‘artificial’?
The reason why I didn’t detail the rest of the quote is that it doesn’t matter. The commenter is saying don’t plot beyond 1960! The notion that they have massaged the data to look better beyond 1960, and then tell you not to plot it, is just self-contradictory.
There’s a huge amount of code here, and if you dig through it, you’ll probably find better gotchas than this. But you still can’t make much of it. Unknown commenters writing in unknown circumstances ten or more years ago were not trying to give a careful explanation of the science. It’s possible he/she didn’t know much of the science – it seems to be mainly a graphics output routine. The key message here re the code is: don’t use after 1960 (and they didn’t). The programmer didn’t need to take a lot of care to get the scientific reason right.

Joanie
November 23, 2009 7:57 pm

@Ed Darrell- Suppose for a moment that the dendro data is actually correct (although a good argument can be made that it is not temp that is being measured, but multiple aspects of plant growth requirements) Let’s say for the sake of argument that a thousand years of tree rings are accurate but suddenly diverge from observed temp data about 1960. Well, either the tree’s habitat changed, or the observed temp data did. Considering the UHI and other biases that have been uncovered, it is equally reasonable to think that the tree growth is the same as it always was, and it is the temp data that is wrong!
Joanie <— not a scientist, just a middle-aged sedentary housewife

Ed Darrell
November 23, 2009 8:59 pm

It’s unlikely, as trees produce their own volatile organic compounds – more than humans do – so trees produce three times more pollution than we do…
However, if it’s sulphides, then most of those were naturally occurring even over the 20th century, and they hardly have any effect on trees anyway – so it’s essentially down to ozone and CO2. As CO2 is increasing all over the world, that increases tree growth, whilst ozone has the opposite effect, although since 1960 ozone has been on a decreasing trend.

P Wilson, don’t forget acid rain. The effects of acid rain would indeed be to reduce the growth of tree rings. SO2, NO2 and NOx all rain out as acid, and none of these compounds is produced by most trees, especially not in the concentrations that came from coal-fired power plants in the era 1955-1980.
Here’s an EPA explanation of the effects of acid rain on a forest. You’re aware, I hope, that CO2 can’t improve the growth of most trees, because they are limited by water, temperature, or nitrogen before CO2 becomes a limiting factor. Additional CO2 to a western forest of lodgepole or ponderosa pine can’t help because there is not naturally enough water to allow the trees to sink the additional carbon. Consequently, more CO2 in the air would simply be a contribution to acidity in fog and rain, either of which will limit the growth of these trees, especially in the usually alkaline soils of these forests.
Trees produce no SO2, no NOx, no significant particulates, and forests often resorb VOCs.
I think your information on forests and air pollution is skewed, but I admit I’ve worked on air pollution only in forests in North America – hardwoods, softwoods, desert and rainforest. I’m interested in your sources that say trees produce more pollution than humans. Generally forest declines result from human-generated air pollution, and were it true that forests produce more pollution, forests should be dying out around the world from their own existence. That doesn’t make sense.

November 23, 2009 9:19 pm

@Ed Darrell- Suppose for a moment that the dendro data is actually correct (although a good argument can be made that it is not temp that is being measured, but multiple aspects of plant growth requirements) Let’s say for the sake of argument that a thousand years of tree rings are accurate but suddenly diverge from observed temp data about 1960. Well, either the tree’s habitat changed, or the observed temp data did. Considering the UHI and other biases that have been uncovered, it is equally reasonable to think that the tree growth is the same as it always was, and it is the temp data that is wrong!
Joanie <— not a scientist, just a middle-aged sedentary housewife

I think you’ve grokked it pretty well. We have this problem with a lot of modern data sets — same thing in carbon dating, for example. In carbon dating, after human actions started changing environmental factors on a global scale, the usual measures of the age of an animal from death to now became difficult for the past 200 years. More carbon, and consequently, more carbon 14, in the above-water environment. With careful calibrations, correction factors have been calculated, and some dating of things for the last 200 years has been done very successfully. But corrections have to be done, different from dating critters who died between, say 1820 and 50,000 years ago (some hotshots have stretched to date objects close to 100,000 years old with carbon dating, but it’s generally not recommended).
For most of the time humans have been on this planet, trees grew without interference from humans. In the past 200 years there has been a goodly amount of interference. Particulate air pollution both reduces sunlight and clogs the stoma on the leaves of trees in industrial areas. Sulfur oxides and nitrogen oxides assault the stoma directly and acidify fog and rain, which leaches good minerals out of the soil and makes harmful minerals available to damage tree roots. The benefits that could result from CO2 don’t obtain generally because of the usual limiting factors on tree growth — water, light (and heat), nitrogen and carbon dioxide availability — carbon availability is fourth, and one of the other factors limits growth so available CO2 cannot be utilized fully, let alone extra CO2.
Generally, then, we can correlate tree rings with temperature well through most of human history. What caused the divergence after 1960? I suspect acid rain, but I haven’t read the papers. There are a host of other causes possible, depending a lot on where the sampled trees were.
The issue here is, should we discount the previous thousand years of data because the data go off the rails in 1960, or should we just dismiss the data after 1960? There’s a separate issue about whether the tree rings were the only accurate set of data after 1960, recording a decline in temperature as every other instrument recorded an increase, but no one seriously argued that.
If we include the tree-ring data, does that mean warming didn’t happen? I don’t think so.
Housewifery is a good prerequisite for scientists, in my experience. If you’re looking for a change in career, don’t regard your experience as limiting, nor as inapplicable.

November 23, 2009 9:33 pm

Joanie (19:57:23) :
Considering the UHI and other biases that have been uncovered, it is equally reasonable to think that the tree growth is the same as it always was, and it is the temp data that is wrong!

Thank you.

Ed Darrell
November 23, 2009 10:12 pm

Joanie, take a look at this post. Bristlecone pine are favorite trees of a lot of botanists and other biologists — just finding them in their native habitat is a spiritual experience. The post carries a lot of information about how dendrochronology works, and it may shed some light for you on how and why the data can be trusted, generally.
REPLY: Sorry Ed, you are 100% wrong. Go pander this “spiritual experience” crap in your own bathtub. Trees do not make thermometers in all instances. Liebig’s law illustrates that plant and tree growth is limited by the least available limiting growth factor. In some years it may be temperature, with a warm summer spurring growth, but in others it can be lack of water or lack of nutrients, and for long-lived species like Bristlecone pine, yes, even years of low CO2 can be a limiting growth factor. Bristlecones tend to frequent high and dry climates, and water can be just as influential as temperature.
Without knowing which of the factors was in shortest supply in a year, or stretch of years, you can’t make a blanket claim that tree ring widths reflect only the temperature record. Untangling it all is not possible without the metadata, and we don’t have it in most cases. – Anthony

tallbloke
November 23, 2009 11:49 pm

rbateman (16:14:26) :
I see evidence that they have gotten to a lot of the data at the sources.
Comparing data sets with provenance to ascertain whether it is original or altered is now not an option, but a necessity. They are banking on nobody knowing how far they went.
I am hoping that others will dig far enough into the climate data to see what I see.
DO NOT under any circumstances assume that the climate data you are looking at is 100% genuine.
Check it against other sources.

Totally agree. I was looking more at the reconstruction of a better temperature record rather than the ‘evidence against CRU’ angle.
The main reason CRU has resisted releasing the data and code is that we would then have been able to see what a mess it is. This is just as damning as evidence of falsification in my view.

E.M.Smith
Editor
November 24, 2009 12:30 am

philincalifornia (19:47:48) : It’s perhaps instructive to rehash the essay(s) by co-conspirator Grant “Tamino” Foster on how the HadCru data correlated well with GISS. … then When the cluster**** that is HadCru can be fully determined, what’s the statistical chance of GISS being such a closely correlated cluster**** by coincidence ??
No coincidence at all… The “magic sauce” is GHCN. As is admitted in the emails, the CRUt series depends heavily on GHCN. GIStemp depends heavily on GHCN. NOAA (with a NASA data set “manager”) produces GHCN.
All the thermometer location “cooking” that was done to GHCN (moving from the mountains to the sea, moving from the poles to the equator) is reflected in both Hadley CRUt and GIStemp. Same Garbage In, Same Garbage Out.
From:
http://chiefio.wordpress.com/2009/11/21/hadley-hack-and-cru-crud/

Comment by Prof. Phil Jones
http://www.cru.uea.ac.uk/cru/people/pjones/ , Director, Climatic
Research Unit (CRU), and Professor, School of Environmental Sciences,
University of East Anglia, Norwich, UK:
No one, it seems, cares to read what we put up
http://www.cru.uea.ac.uk/cru/data/temperature/ on the CRU web
page. These people just make up motives for what we might or might
not have done.
Almost all the data we have in the CRU archive is exactly the same
as in the Global Historical Climatology Network (GHCN) archive used
by the NOAA National Climatic Data Center

And just who owns that NOAA dataset? Who is “The Data Set Manager”? What I could find looks like a guy at NASA. From:
http://chiefio.wordpress.com/2009/10/24/ghcn-california-on-the-beach-who-needs-snow/
down in the comments:

e.m.smith
It took a while to find, but I think I found “who owns GHCN” and “who manages it”.
From: http://gcmd.nasa.gov/records/GCMD_GA_CLIM_GHCN.html
We find that:
GHCN data is produced jointly by the National Climatic
Data Center, Arizona State University, and the Carbon Dioxide
Information Analysis Center at Oak Ridge National Laboratory.
The NCDC is a part of NOAA. So I’m not seeing NASA on this list. But…
It goes on to say:
Personnel
SCOTT A. RITZ
Role: DIF AUTHOR
Phone: 301-614-5126
Fax: 301-614-5268
Email: Scott.A.Ritz at nasa.gov
Contact Address:
NASA Goddard Space Flight Center
Global Change Master Directory
City: Greenbelt
Province or State: Maryland
Postal Code: 20771
Country: USA
So it looks to me like it has NASA staff assigned, part of Goddard (though it isn’t clear to me if G. Space Flight Center and G.I.S.S. are siblings, or if one is a parent of the other; I suspect GSFC is an underling to GISS. That would have Scott Ritz reporting to Hansen IFF I have this figured out… And all that personal data is at the other end of the link anyway, so I’m not publishing any private data NASA has not already published.)

It’s looking to me like GISS has their fingerprints all over the GHCN deletions, with NOAA either as patsy or passive cooperator.

And as you so aptly put it:

Over to you Senator Inhofe ……..

Ben
November 24, 2009 1:02 am

I wonder how many of these bad code offsets they’ll need…And whether their income from Carbon offsets will be enough to cover the costs involved?
http://thedailywtf.com/Articles/Introducing-Bad-Code-Offsets.aspx

E.M.Smith
Editor
November 24, 2009 1:39 am

Fred Lightfoot (03:29:03) :
E.M.Smith (00.47.58)
Still got access to that Cray ? wishful thinking.

No, but about 5 years ago I saw it up for sale for something like $1000 (and one of the guys on our project was thinking of buying it…)
But you can get the same processing power now in an older Macintosh for under $1000 and it doesn’t need the 750 kVA power feed nor the 16 x 16 foot water tower for cooling 8-0
My first Cray (gosh I like the sound of that… first Cray ..) was an XMP-48 and that means a 4 processor box with 8 megawords (of 64 bit words) of memory. It was about a 400 Mega FLOP box. (You could vary the speed of the clock raising it until you were going as fast as possible, or dialing it back if the error rate started to rise… so performance is “about”).
Modern PC-class machines can do a fair number of floating point operations and have GHz clock rates, so even if it took 10 clocks per FLOP you would still have “100+ MFLOPS” per CPU, and you can pack multiple CPUs in a box. Add in GB of memory (compared with 64 MB… even if very fast…) and you see where this is going…
Moore’s Law. Every 18 months, a double. And it’s been a lot of 18 months…
So I now have more “compute power” in the laptop I’m typing this on, and in the cluster of older PCs scattered around the house that I can turn into a Beowulf should I ever want to, than was in that old Cray. (Or even in the Y-MP that replaced it, or the EL, or…)
FWIW, one “node” of my Beowulf is presently doing the GIStemp builds / testing. Another is my main server / home machine (but spends most of its time turned off). 2 more nodes sit in the garage, waiting for me to want them. 2 were turned into “parts” and another one went to the recycle bin. I just never could find anything that needed the compute power… (Google “Stone Soupercomputer” for an interesting story of how you, too, can have a supercomputer for free. Just wait for the next Microsoft update, and as folks need new hardware…)
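
For what it’s worth, the Moore’s Law arithmetic checks out. Back-of-the-envelope Python, where the ~1985 X-MP vintage and the 2009 endpoint are assumptions:

years_elapsed = 2009 - 1985           # assumed X-MP-48 era to "now"
doublings = years_elapsed * 12 / 18   # one doubling every 18 months
print(round(doublings), "doublings, a factor of about", round(2 ** doublings))
# 16 doublings is a factor of ~65,000 over a ~400 MFLOP X-MP; even a sliver
# of that reaching real code puts a 2009 laptop well past the old Cray.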

Mooloo
November 24, 2009 1:50 am

I am confused…
The trees show no change over the last forty years, and this is evidence that something has changed?
So their recent evidence can be ignored?

E.M.Smith
Editor
November 24, 2009 1:56 am

Oscar Bajner (03:38:58) : Thing is, I can’t give up my day job right now – I just got a promotion and from today I am in charge of the rotary buffer! W00t! Wal-Mart rocks.
You left out a couple of other SCCSs like “rcs”, but I’m sure you know that already… Maybe, since they are on Unix / Linux boxes we could just tell them to do a “man ci” and “man co”… At least one of them will like the command name 😉
AND
You got the rotary buffer? They said since I was old enough to know FORTRAN I had to be a door greeter (!) … I’m not sure how you greet a door, but I don’t think the door will notice if I don’t get it quite right…
Maybe we can get together at closing time and you can show me how to “ride the buffer” and I’ll show you how to swing on the automatic door!?

acementhead
November 24, 2009 2:19 am

Ed Darrell (22:12:07) :
“…just finding them in their native habitat is a spiritual experience.”

Ed, you just rang my fraud detector. Just like the alleged scientist Tim Flannery, you are a mystic, not a scientist. I have seen Flannery say, on TV, that he believes in “Gaia” and also “that our (meaning human) intelligence is here for a purpose.” These are positions that cannot be held by any scientist. They are both utter nonsense, with not the least hint of any evidence in their favour. Teleology is trash.

E.M.Smith
Editor
November 24, 2009 2:24 am

Joseph in Florida (04:46:57) :
What language are these *.pro files? I am guessing Fortran.

I believe it’s a graphics package: IDL

old construction worker
November 24, 2009 2:42 am

Ed Darrell (18:06:23) :
‘As you recall from your vast experience in this issue, tree ring data correlates very well with other temperature measures until about 1960, and then it tails off as if temperatures declined. However, thermometer readings from the same places don’t show a drop.’
I didn’t know that a grove of trees had a thermometer close by.
Or maybe the thermometer was sitting at an airport.

E.M.Smith
Editor
November 24, 2009 3:02 am

@Aligner (07:48:32) :
Very well put. I’ve saved a copy… You covered much of the management part that I’d thought, but didn’t bring myself to speak, and ought to have done so…
To all those who like what I wrote: Thank you! Now go back and reread Aligner’s posting. The points he makes are, in fact, the bigger ones…

ATD
November 24, 2009 3:33 am

“‘As you recall from your vast experience in this issue, tree ring data correlates very well with other temperature measures until about 1960,”
so why was Briffa using frig-factors on data from the 1930s?

Paul Z.
November 24, 2009 4:05 am

Reading the harry read_me.txt… it’s like an episode of Monty Python tripping on acid. And trillions of OUR taxpayer dollars are going to pay for this bullshit? Makes you want to hit someone with a hockey stick.

E.M.Smith
Editor
November 24, 2009 4:06 am

tallbloke (14:40:06) :
Both CRU and GISStemp are based on GHCN data, which is available. So reconstruction from that is as good as it will get.
Of course this also means that when the warmists talk of independent data sets, they are lying through their teeth.

And the GHCN data set is horridly broken and biased by deletions. About 90% of the thermometers have been “removed from the set” since about 1990 and the mountains and cold places with them…
http://chiefio.wordpress.com/2009/11/03/ghcn-the-global-analysis/
http://chiefio.wordpress.com/2009/11/13/ghcn-pacific-islands-sinking-from-the-top-down/
http://chiefio.wordpress.com/2009/11/16/ghcn-south-america-andes-what-andes/
One has to go “upstream” from GHCN to get ‘unbiased’ data…

Kathleen S.
November 24, 2009 5:09 am

I had always been taught that one should use the scientific method to collect and collate data. Strange that these over-funded people seem to ignore even the most basic of scientific methods!
For those at CRU, go to this site to learn how to implement said method – http://www.sciencebuddies.org/mentoring/project_scientific_method.shtml (it has a pretty, colourful bubble step plan for them and everything!!! >;))
And for those of you who work at Wal-mart and its ilk, I suggest this site – http://teacher.pas.rochester.edu/PHY_LABS/AppendixE/AppendixE.html (and yes, I know this mentions physics, but it’ll still give you poor Wal-mart janitors an idea of what the egg-heads at CRU should have been doing all along…)
Ladies, gentlemen, janitors. My eyeballs have melted, and it *wasn’t* because the planet is heating up! I’ve just finished reading every one of those comments and having a good giggle. I had no idea that janitors and fast-food servers were so smart! 😉 I bid you wonderful people good night from one of the “forgotten” countries.
Kathleen,
Sydney, Australia.

Peter
November 24, 2009 5:10 am

I don’t know if this has been suggested before, but I think it would be a good idea to set up a repository containing the CRU source code.
I’m sure there will be no shortage of volunteers (myself included) to translate the source files into, for example, C, to properly comment them so that they’re easily understandable, and to review them.
The mass of source files would be a daunting task for a handful of people, but for hundreds?
Perhaps yet other volunteers could similarly sort out the data files?

North of 43 south of 44
November 24, 2009 5:28 am

“Maybe we can get together at closing time and you can show me how to “ride the buffer” and I’ll show you how to swing on the automatic door!?”
Ah, good old-fashioned cross-training. Delighted to see it still exists, and that no degree in climatology or any other ’ology is required.
Can I watch? I’ll help out by providing training in chicken keeping and composting.

P Wilson
November 24, 2009 6:31 am

Ed Darrell (20:59:08)
This, from 1989, I remember:
Yale’s forestry expert Tom Siccama told us, “All we know is that suddenly in 1962, the trees got very unhappy, and it was probably the very severe drought followed by an especially killing winter.
“The one thing it was not was acid rain. Look, you don’t get that sudden a change from something like acid rain. And why didn’t it happen in adjacent areas or states? The people in Vermont who blame this on acid rain must think they live on an island or something,” Siccama said.
Apparently, it was drought during the latter part of the 20th century, though at the time I couldn’t accept that sulphuric acid was raining down across the northern hemisphere, as anthropogenic sulphides hadn’t changed that much since the 80’s.
Nature has its own secrets as to when trees grow and don’t, so the limiting factors may well be the ones outlined – precipitation, drought, CO2, temperature, humidity, natural infestations, etc.
I remember the hoaxes from the 70’s and 80’s – acid rain, ozone holes, thermonuclear winters, the coming ice age…
It’s impossible to ascertain what acid rain actually did, and whether there was any real research with data at the time – especially on the proxy trees used by the hockey team.
Perhaps you could point to some; I’m happy to have my mind changed.

P Wilson
November 24, 2009 6:34 am

Also, the 1940’s and 1950’s would surely have produced as much sulphuric acid raining down on forests as the 1960’s and 70’s. We had pea-soup fogs here in London.

Clare
November 24, 2009 6:56 am

E.M.Smith (23:56:46) :
I really appreciate your explanation; it was an entertainment in itself.
After reading some of his comments, I quite liked poor Harry — at least he seems to have a well-developed (and much-required) sense of the absurd!

John Beckwith
November 24, 2009 10:26 am

This was found in the “HARRY_READ_ME.txt” of the “Whistleblower” FOIA2009 file.
John
“You can’t imagine what this has cost me – to actually allow the operator to assign false WMO codes!! But what else is there in such situations? Especially when dealing with a ‘Master’ database of dubious provenance (which, er, they all are and always will be).
False codes will be obtained by multiplying the legitimate code (5 digits) by 100, then adding 1 at a time until a number is found with no matches in the database. THIS IS NOT PERFECT but as there is no central repository for WMO codes – especially made-up ones – we’ll have to chance duplicating one that’s present in one of the other databases. In any case, anyone comparing WMO codes between databases – something I’ve studiously avoided doing except for tmin/tmax where I had to – will be treating the false codes with suspicion anyway. Hopefully.
Of course, option 3 cannot be offered for CLIMAT bulletins, there being no metadata with which to form a new station.
This still meant an awful lot of encounters with naughty Master stations, when really I suspect nobody else gives a hoot about. So with a somewhat cynical shrug, I added the nuclear option – to match every WMO possible, and turn the rest into new stations (er, CLIMAT excepted). In other words, what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad, but I really don’t think people care enough to fix ’em, and it’s the main reason the project is nearly a year late.”
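
The false-code scheme Harry describes is simple enough to sketch. A minimal Python reconstruction of the prose above – an illustration only; the real routine is Fortran, and the exact first candidate tried is an assumption:

def make_false_wmo(legit_code, codes_in_use):
    # Multiply the legitimate 5-digit code by 100, then add 1 at a time
    # until a number is found with no match in the local database.
    candidate = legit_code * 100 + 1
    while candidate in codes_in_use:
        candidate += 1
    return candidate

in_use = {1234501, 1234502}           # hypothetical already-assigned codes
print(make_false_wmo(12345, in_use))  # -> 1234503

As Harry himself concedes, with no central repository of made-up codes, nothing stops two databases from independently minting the same false code.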

David in Florida
November 24, 2009 11:32 am

“The issue here is, should we discount the previous thousand years of data because the data go off the rails in 1960, or should we just dismiss the data after 1960?”
How would you know that the “previous thousand years” are not “off the rails”? There are no measured values to compare them to. The point of proxy data is to reconstruct temperature when there is no actual temperature data. So when there is no temperature data to compare it to, tree ring data is good; but when there is data to compare it to, and it does not compare well, the tree data should be “tricked” to “hide the decline” that it shows?
Say that good temperature data has been available since 1880. If that data matches the trees until 1960, you have an 80-year match. If it does not match after 1960, you have 50 years of no match. You could also “trick” the data to match the last 50 years to show an inverse relationship with temperature. In that case the proxy data would show it was MUCH hotter in the past.
The point is that in science – yes, I am a scientist – you cannot selectively discard data unless and until you can scientifically define why it is appropriate. Certainly, acid rain may account for the discrepancy and would be an appropriate research topic. But to rush to judgement and “trick” the data in order to “balance the needs of the science and the IPCC, which were not always the same,” as Mr. Briffa (whose research it was) admitted, is unconscionable.
http://www.eastangliaemails.com/emails.php?eid=794

KlausB
November 24, 2009 11:56 am

E.M.Smith (02:24:14) :
Joseph in Florida (04:46:57) :
What language are these *.pro files? I am guessing Fortran.
I believe it’s a graphics package: idl
———————————
Yep, thats true, it’s
IDL data analysis s/w.
Peter (05:10:14) :
Peter go ahead, there is a GDL in most Linux distributions. With
that you can try to run the Had-Stuff.

P Wilson
November 24, 2009 12:19 pm

David in Florida (11:32:11) :
from that link
“mann to Briffa: “Keith, just a quick note to let you know I’ve had a chance to read over the key bits on last millennium in the final version of the chapter, and I think you did a great job. obviously, this was one of the most (if not the most) contentious areas in the entire report, and you found a way to (in my view) convey the the science accurately, but in a way that I believe will be immune to criticisms of bias or neglect–you dealt w/ all of the controversies, but in a very even-handed and fair way. bravo!”
Briffa to Mann reply: ” I tried hard to balance the needs of the science and the IPCC , which were not always the same.”
Nothing looks more like innocence than an indiscretion!

November 24, 2009 12:49 pm

Anthony responded:

Trees do not make thermometers in all instances.

I agree absolutely. So, when we know that trees don’t work as thermometers, as in those post-1960 trendlines, it seems to me to be a better choice to substitute actual temperature readings. Is there a better solution?

Liebigs law illustrates that plant and tree growth is limited by the least available limiting growth factor. In some years it may be temperature, with a warm summer spurring growth, but in others it can be lack of water or lack of nutrients, and for long lived species like Bristlecone pine, yes even years of low CO2 can be a limiting growth factor. Bristlecones tend to frequent high and dry climates, and water can be just as influential as temperature.

Has anyone found any natural spot outside of a cave and above water where CO2 is the limiting factor? Even in the thinner atmosphere and drier climes of the Sierra Nevada, there is plenty of CO2 for bristlecones. Walter Muller wrote in one of the intro to botany texts I had that there was no shortage of CO2 even before the Industrial Revolution, and after, no chance. Do you know of any work that suggests bristlecone, or any other tree used in dendrochronology, would have faced a CO2 limitation in the past 80 million years?
My point was that something other than temperature limited the growth of the trees after 1960. Another correspondent in this thread says it was drought. I cannot imagine CO2 being the limitation, but were that so, it would still suggest the data after 1960 should be discounted, and Mann and his colleagues were wise to do so.

Without knowing which of the factors was in shortest supply in a year, or stretch of years, you can’t make a blanket claim that tree ring widths reflect only the temperature record. Untangling it all is not possible without the metadata, and we don’t have it in most cases. – Anthony

Exactly the point. For much of the time used, dendrochronology records may act as a proxy for temperature measurements, and in fact there is good correlation at many points where data come from other sources to corroborate. That correlation fails after 1960 in the dendro data, and so it would be misleading at best to include it without noting it is known to be unrepresentative at that point. For the purposes of a summary paper, why include that post-1960 data at all, especially since it was discussed at some length in earlier papers focusing on that issue?
It would be good to have dendrochronology records from many different forests in many different climates and diverse geological areas. Perhaps you could spearhead a research effort to look for contraindicating data, where it might be found? It would be fun to find trees under the Sahara sands, and it would be particularly informative to be able to see how well temperature tracked with desertification in those now-desert areas. I have hopes that with NASA’s and ESA’s ground-penetrating radar data, and stable governments to allow access, such data may be found and studied.

P Wilson
November 24, 2009 1:17 pm

Yet dendrochronology can differ significantly from instrumental data: in the case of Yamal, just 12 trees that showed a warming trend were selected, against the rejection of 34 in the same area that didn’t show a warming trend. Prior to this, Briffa argued that dendroclimatology is fraught with problems and that the Medieval Warm Period was probably as warm as today.
On the basis of these 12 trees, Briffa declares that the MWP was quite cold, but on the basis of the older 34 trees, the MWP was at least as warm as today.
It’s a hard one to fathom.

Ben
November 24, 2009 1:18 pm

To those who say that this is a common practice in coding: here’s the opinion of a chemical engineer and programmer.
It is a very bad thing to merge datasets from two different instruments unless they are well calibrated. Every time you replace an environmentally sensitive instrument, the EPA has a stated range that you have to calibrate it within (5% for CO2 measurements, as low as 0.5% of range for incinerator temperatures). This is extremely important for proxy data that is replaced by measured data. You leave the proxy running alongside the measured data in order to compare them. Unfortunately, the proxy is sometimes way off. You don’t then tack them together and say (for example) “we had a 20% increase in flaring after the flowmeter was installed”; you say “our calculations were understating our flaring by 20%”.
If the proxy can be adjusted to reasonably predict the value, then you can adjust it for ALL PRIOR DATA and continue. However, if your proxy is completely uncorrelated to the real data or the adjustments are too extreme (as is the case with the temperature proxy), then you start your graph at the installation of your instrument and throw out your proxy data as worthless.
Doing otherwise shows a lack of scientific integrity.
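A rough sketch, in Python with made-up numbers, of the acceptance test Ben describes (the 0.8 correlation threshold is mine, purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
instrument = np.linspace(10.0, 12.0, 50) + rng.normal(0, 0.2, 50)
proxy = 0.5 * instrument + 3.0 + rng.normal(0, 0.2, 50)   # a usable proxy

r = np.corrcoef(instrument, proxy)[0, 1]             # correlation over the overlap
slope, intercept = np.polyfit(proxy, instrument, 1)  # calibration fit

if r > 0.8:
    calibrated = slope * proxy + intercept   # apply to ALL prior proxy data too
else:
    calibrated = None                        # uncorrelated: discard, don't splice

Either the proxy survives calibration everywhere, or it is thrown out everywhere; splicing the two series together is not on the menu.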

P Wilson
November 24, 2009 1:30 pm

Actually it’s easy to fathom: just reject difficult data during the peer-review process.

eaglewingz08
November 24, 2009 1:33 pm

Don’t you understand the facts can be wrong just as long as the story is true or SHOULD be true. (See Rathergate for full explication of liberal/progressive psychosis).
Since capitalism must fail and should fail and socialism/communism must win and should win, then global warming/climate change, as the stalking horse of the commies must be supported and all facts ‘massaged’ to fall into line (or at least into a hockey stick power point presentation). Those ‘reporters’ and politicians and bloggers who believe this is much ado about nothing (remember Van Jones and Anita Dunn and ACORN videos were so characterized) are whistling past the graveyard. Progressivism is built on lies, global warming as one of its most malicious creations is built by lying meretricious ‘scientist’ propagandists.

P Wilson
November 24, 2009 1:52 pm

Here’s a fix: If tree rings show continued growth but temperatures are on the decline, use them as proxies for temperature. If they show limited growth but temperatures are increasing according to the instrumental record, then use the instrumental record. That way, you get a *smooth* upward curve

E.M.Smith
Editor
November 24, 2009 4:05 pm

Kathleen S. (05:09:55) :
I had always been taught that one should use the scientific method to collect and collate data. Strange that these over-funded people seem to ignore even the most basic of scientific methods!
For those at CRU, go to this site to learn how to implement said method – http://www.sciencebuddies.org/mentoring/project_scientific_method.shtml (it has a pretty, colourful bubble step plan for them and everything!!! >;))
I don’t have colorful bubbles but I do have colorful images of CO2 concentrations, greenhouse gas transmission windows (CO2’s is nearly saturated) in addition to the introduction to the scientific method. I wrote it for people new to the field and a select class of scientists who have forgotten it. See Science, Method, Climatology, and Forgetting the Basics.

Andrew
November 24, 2009 5:32 pm

This is interesting:
From the program file FOI2009/FOIA/documents/harris-tree/briffa_sep98_e.pro
;
; PLOTS ‘ALL’ REGION MXD timeseries from age banded and from hugershoff
; standardised datasets.
; Reads Harry’s regional timeseries and outputs the 1600-1992 portion
; with missing values set appropriately. Uses mxd, and just the
; “all band” timeseries
;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
; Now normalise w.r.t. 1881-1960
;
mknormal,densadj,x,refperiod=[1881,1960],refmean=refmean,refsd=refsd
mknormal,densall,x,refperiod=[1881,1960],refmean=refmean,refsd=refsd
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
;
; Now plot them
;
‘Oooops!’ indeed. I wonder what the ‘plot’ was?
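To see what that adjustment contributes on its own, here is a rough Python rendering of the quoted IDL applied to a flat series (the series is a stand-in; only yrloc and valadj come from the file):

import numpy as np

yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904))
valadj = np.array([0, 0, 0, 0, 0, -0.1, -0.25, -0.3, 0, -0.1, 0.3, 0.8,
                   1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75
assert len(yrloc) == len(valadj)             # the code's "Oooops!" check

years = np.arange(1900, 1995)
densall = np.zeros(years.size)               # stand-in for the MXD series
yearlyadj = np.interp(years, yrloc, valadj)  # equivalent of IDL's interpol
adjusted = densall + yearlyadj

print(adjusted[years == 1930], adjusted[years == 1990])   # ~ -0.195 and 1.95

Fed a perfectly flat series, the “correction” dips to about -0.2 in the early 1930s, then climbs steeply from 1950 onward and flattens at about +1.95 from the mid-1970s – a hockey-stick shape added before any data arrives.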

Tardkiller
November 25, 2009 2:10 am

Guido Fawkes Blog is now covering this.
His weekly readership is greater than some MSM papers’ monthly figures, and as for his monthly figures, well, let’s just say he’s well read.
http://order-order.com/2009/11/25/time-to-defund-crus-global-cooling-deniers/

tonydej
November 25, 2009 3:24 am

If I did not know better, from reading this extraordinary thread, I might give in to the urge to say: Behold the World, like unto a hockey stick: Man has lost his Way and embraced False Prophets and the End of the World is nigh, and the Lord will send Flood and Fire to purge it, (unless we all bow down and accept the need for a ‘co-ordinated response’).

Ian W
November 25, 2009 7:58 am

The ‘Decline’ shows up a problem with the proxies, and that is a problem for the entire edifice of AGW research.
The proxies failed validation against reality.
This means that they are USELESS, and ALL the papers that use these proxies for temperature (or whatever) are of no value; they are fakes. It is no wonder that CRU didn’t respond to the plant scientist who pointed this out.
All the arguments about the statistical significance of the number of trees in Yamal are a total waste of time – the base data being fed into those statistics was just random noise. So Briffa’s ‘seminal paper’ might as well be just invented values – it is a worthless fairy story.
What I find totally disappointing is that this aspect has not been flagged up and shouted from the rooftops.
So I will type it again – ALL papers that base their conclusions on proxy data are now worthless. None of their conclusions should be used until there is complete and independent validation that the proxies used represent the variable they claim to be proxies for.
Failure to validate proxies is a FUNDAMENTAL failure of scientific method, and should have been picked up in peer review – but we can see now that ‘peer review’ was just a way of research centers passing on ‘tricks’ to each other to hide the proxy failure!
This was the real reason that Phil Jones had to “hide the decline” – it showed that the whole AGW edifice based on the papers from CRU was built on invalid assumptions.
WHY IS NO-ONE POINTING THIS OUT?

Rob
November 25, 2009 9:13 am

I don’t understand how you can still call conspiracy on AGW after this. I mean, surely if there was one there would be evidence in these emails. Have you found Al Gore checking in to make sure everyone’s toeing the party line? No?
Then the best you have (and I think this is a bit of a stretch) is some positive affirmation in one or two data-sets. No evidence of a New World Order conspiracy, no admission of a hoax. Unless I’ve missed something?

edward
November 25, 2009 9:23 am

Expose the code and bust the Anti-Trust Climate Team
Busted not Robust!
Shiny
Edward

John Ferguson
November 25, 2009 9:45 am

Been looking into the code and data held in the \documents\cru-code\f77\mnew directory. Here is what I have found:
The data sets are master.dat.com and master.src.com; master.src.com is the important file. Don’t open these without changing the extension to .txt, otherwise Windows interprets them as executables, and you won’t be able to view them properly anyway. I could send copies capable of being opened in Windows. These contain monthly weather station data with one row per year. I don’t know the exact nature of these files, but some of the data does relate to sunlight duration. A site in Finland suggests master.src.com is temperature related, but there’s a lot of speculation flying around the Internet regarding the leaked files at the moment, so I can’t be certain.
There are 3526 stations in all and 2578488 monthly observations. A value of -9999 in a field means the observation for that month is absent. There are 269172 (10%) missing observations in master.dat.com and 14226 completely missing years. The programs are designed to completely ignore any years with no observations. In total there are 200649 rows (years) of observations, which should equate to 2407788 months; however, due to some years having up to 11 missing months, there are 2309316 monthly observations used. Now, what’s interesting is how these missing months are processed. Programs such as split2.f, where a year contains one or more missing months, actually invent the figures using the following heuristic:
If a month is missing try to infill using duplicate. if duplicates both have data, then takes a weighted average, with weights defined according to inverse of length of record (1/N)
That’s from the comment at the start of split2.f
What this really means is more than 4% of the data is being completely fabricated by at least some of the Fortran data processing programs. If this were done in other disciplines this would be extremely questionable.
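If I read that comment right, the rule looks something like this Python sketch (the names are mine; -9999 marks a missing month, as in the data files):

MISSING = -9999

def infill(value, duplicates):
    # duplicates: list of (value, record_length_in_years) for the same
    # station-month taken from duplicate records
    if value != MISSING:
        return value
    have = [(v, n) for v, n in duplicates if v != MISSING]
    if not have:
        return MISSING                            # nothing to infill from
    weights = [1.0 / n for _, n in have]          # inverse of record length
    total = sum(v * w for (v, _), w in zip(have, weights))
    return total / sum(weights)

print(infill(MISSING, [(12.0, 40), (14.0, 20)]))  # -> 13.33, weighted 1:2

Note the odd consequence of 1/N weighting: the duplicate with the shorter record gets the larger weight.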
I also noticed quite a few programs, especially in the documents\cru-code\idl\pro directory, that are designed to process data deemed anomalous, though this isn’t necessarily suspicious.
This is the header comment from documents\cru-code\idl\pro\quick_interp_tdm2.pro
; runs Idl trigrid interpolation of anomaly files plus synthetic anomaly grids
; first reads in anomaly file data
; the adds dummy gridpoints that are
; further than distance (dist)
; from any of the observed data
; TDM: the dummy grid points default to zero, but if the synth_prefix files are present in call,
; the synthetic data from these grids are read in and used instead
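Mechanically, the header seems to describe something like this hypothetical 1-D analogue (Python; my names, not the CRU routine):

import numpy as np

def grid_anomalies(obs_x, obs_val, grid_x, dist, synthetic=None):
    # obs_x must be ascending for np.interp
    out = np.interp(grid_x, obs_x, obs_val)   # naive interpolation everywhere
    # flag grid points farther than `dist` from every observation
    far = np.array([np.abs(obs_x - g).min() > dist for g in grid_x])
    out[far] = 0.0 if synthetic is None else synthetic[far]
    return out

obs_x = np.array([0.0, 1.0, 5.0])
obs_val = np.array([0.2, 0.4, -0.1])
grid = np.linspace(0.0, 10.0, 11)
print(grid_anomalies(obs_x, obs_val, grid, dist=2.0))  # far points become 0.0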
What is ‘synthetic data’, and why might it be applied to dummy gridpoints away from genuine observation points? This could be a recognised statistical procedure, or data massaging, or creating more observations out of thin air to skew certainty levels; I just can’t tell, and don’t have time to look at anything else in depth right now. Like it says, e-mails can be open to interpretation, but it’s the code and what it does to the raw data which really matters. The comment in the Mann code described in the link below is a work-around to a recognised issue with dendrochronology data: during the 1960s the correlation coefficient between tree growth rate and temperature altered.
The recent ERBE results are really significant; the discrepancy between IPCC modelled values and the real-world figures is quite something.
http://wattsupwiththat.com/2009/11/22/cru-emails-may-be-open-to-interpretation-but-commented-code-by-the-programmer-tells-the-real-story/

TitaniumDragon
November 25, 2009 1:19 pm

“Ok fine, but how Dr. Jones, do you explain this?”
You need to understand the data source in question before you jump to conclusions the way you did. Without understanding the data, you cannot understand the program, because the program is made to deal with data.
The data source in question is tree ring data. If you go back to the paper which originally generated it, it states that after 1960 tree rings no longer correlate accurately with temperature. This is hardly surprising, given the increase in herbicide, pesticide, and other pollution which occurred during the 1950s and has continued through to today; we know from the record that things which simulate pollution (volcanic activity, for example) can have a similar impact on tree ring data.
So, fundamentally, they don’t use the data after 1960 for the same reason that you wouldn’t use data from a chamber which is supposed to be held at a constant humidity if the coolant pipes broke and the humidity rose to 100% – the data has been corrupted. Claiming that this is something shady is to not understand why the “trick” was used in the first place.
The reason it isn’t untoward isn’t that they didn’t exclude the post-1960 tree ring data (they did) but that excluding the post-1960 tree ring data was correct – it no longer correlated with every other temperature record, especially the most reliable one, the instrumental record. It’s not really relevant anyway, because the instrumental record post-1960 is quite good, so losing the proxy is not really an issue.
And indeed, this is well known. The ORIGINAL PAPER regarding the proxy data indicated it was useless post-1960. The decline in question is that the tree rings are smaller than they should be for the summer temperatures of the environments they were in, as compared to how thick they were previously.

November 25, 2009 4:19 pm

Code doesn’t lie. I think the answer is simple. We have a computer model to predict future climate. Run it backwards: can it predict the past accurately? If so, we have a good approximation of the natural world. If not, then you have a failed equation. That means back to the drawing board. Hire a better programmer, or better yet make it open source. This is science, not politics. Use the scientific method; it works pretty well.
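A minimal sketch of that hindcast test, with made-up numbers (the toy_model function is hypothetical, standing in for any real climate model):

import numpy as np

def toy_model(year):
    return 0.005 * (year - 1900)          # pretend trend, degrees C anomaly

years = np.arange(1900, 1960)             # "run it backwards" over the past
observed = 0.004 * (years - 1900) + np.random.default_rng(1).normal(0, 0.05, 60)
hindcast = toy_model(years)

rmse = np.sqrt(np.mean((hindcast - observed) ** 2))
print(f"hindcast RMSE: {rmse:.3f} C")      # large error -> back to the drawing board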

Gail Combs
November 25, 2009 4:41 pm

Gordon Walker said:
“….And Yamal is an arctic wasteland where trees grow for about 15% of the year.
15% of 15% is just over 2%!
How much of a representative sample of the earth’s climate is that?”

Yeesh, that’s a little over a month! Just this summer I went close to three weeks without rain in July. With such a short growing period, the influence of factors other than temperature would be exaggerated, since there is no time for “smoothing” over the growing season. On top of that, didn’t someone mention only 10 trees were used? Heck, my forester took core samples of 25 when he estimated the age and growth of my stand of timber!
(By the by, expect those fire ants in Virginia soon; they are already showing up north of Raleigh, NC, and frost DOES NOT kill them….)

TitaniumDragon
November 25, 2009 5:59 pm

Gail, you should never, ever accept a number like “10 trees” without thinking about it first. How many trees did they SAMPLE to start out with? That’s the real question. And how did they choose them, for that matter.
You need to remember that there are VERY stringent requirements on what trees you can use – they have to have been and always been healthy, have to have been taller than nearby trees for the whole usable life, etc., because if they fail any of those you have severe confounding factors.
So in reality, 10 trees may be perfectly fine, so long as they were the correct 10 trees to sample.
And they aren’t the only metric used anyway. Not sure where that idea came from… oh wait, yes I do. People who have no idea what they’re talking about (or, liars with agendas – pretty common in the denier community).
See also this very post, where he whines about something which is well understood by people in the know but OMG CONSPIRACY to people who are clueless.

Gail Combs
November 26, 2009 1:22 am

E.M.Smith said
“JNL (22:29:08) : I’m a statistical programmer for “BIG PHARMA” . For every new drug application, the FDA requires that we give them:…
Nice List.
FWIW, I’ve done “qualified installs” for Pharma companies.
What the non-Pharma folks might not know: For every single bit of hardware and software used for all the stuff JNL listed, it must be installed “just so”. Every Single Step of Every Single Procedure must be defined in advance. Even if it is just “Open box. Plug in cord. Turn power switch on.”…”

As a Certified Quality Engineer/Chemist who also worked in FDA audited factories, I can verify what is said here. FDA tells you how many mice, how many grams and how to test…. EVERYTHING must be documented and verified with double sign offs.
This “AGW climate” work shown in this release is so sloppy it does not qualify as “science” – FRAUD maybe but science no.

Gail Combs
November 26, 2009 2:19 am

John Finn said:
“So come on, folks – time to nominate your favourite email. I realise we’re totally spoilt for choice but which ones stand out. “
There are so many to choose from but I like this the best because it so clearly states the problems with AGW:
[mailto:geoengineering@xxxxxxxxx.xxx
] *On Behalf Of *David
Schnare
*Sent:* Sunday, October 04, 2009 10:49 AM
*Cc:* Alan White; geoengineering@xxxxxxxxx.xxx
*Subject:* [geo] Re: CCNet: A Scientific Scandal Unfolds
Gene:
I’ve been following this issue closely and this is what I take away from it:
1) Tree ring-based temperature reconstructions are fraught with so much uncertainty, they have no value whatever. It is impossible to tease out the relative contributions of rainfall, nutrients, temperature and access to sunlight. Indeed a single tree can, and apparently has, skewed the entire 20th century temperature reconstruction.
2) The IPCC peer review process is fundamentally flawed if a lead author is able to both disregard and ignore criticisms of his own work, where that work is the critical core of the chapter. It not only destroys the credibility of the core assumptions and data, it destroys the credibility of the larger work – in this case, the IPCC summary report and the underlying technical reports. It also destroys the utility and credibility of the modeling efforts that use assumptions on the relationship of CO2 to temperature that are based on Briffa’s work, which is, of course, the majority of such analyses.
As Corcoran points out, “the IPCC has depended on 1) computer models, 2) data collection, 3) long-range temperature forecasting and 4) communication. None of these efforts are sitting on firm ground.”
Nonetheless, and even if the UNEP thinks it appropriate to rely on Wikipedia as their scientific source of choice, greenhouse gases may (at an ever diminishing probability) cause a significant increase in global temperature. Thus, research, including field trials, on the leading geoengineering techniques is appropriate as a backstop in case our children find out that the current alarmism is justified.
David Schnare

Can’t be more blunt than that!

Gail Combs
November 26, 2009 4:27 am

Old Gasser said
“I see a striking corollary in supervision for contract maintenance of floors, …
Exit Questions: Which group was the most trainable; reliable; took most pride in work? Bonus Question: Who most often needed to be counseled regarding personal hygiene?”

The ghetto kids of course! That is why I fired a degreed Chemist, passed over lots of recent grads in Chemistry and hired a construction worker as a lab assistant. (He was also the only one dressed neatly and cleanly) He turned out to be a great worker and very trainable. Best move I ever made.
From what I can see these academic “Scientists?” wouldn’t last in an industrial setting because of their shoddy work habits. I certainly would fire them.

David in Florida
November 26, 2009 4:43 am

TitaniumDragon (17:59:10) :
“People who have no idea what they’re talking about (or, liars with agendas – pretty common in the denier community).
… where he whines … people who are clueless.”
TitaniumDragon must be a climate scientist, likely a professor. Rather than debate he seems to prefer to berate.

Gail Combs
November 26, 2009 4:51 am

E.L. said
“The smoking gun is comments in code that specifically state, “adjusted to look closer to
the real temperatures.” 0.o
Yes sir, they must be really pulling the wool over people’s eyes by adjusting things to reflect real temperatures.
Social scientists are missing an opportunity to study irrational human behavior here.”

Yes, they must be really pulling the wool over people’s eyes by adjusting things to reflect real temperatures – when what they are doing is adjusting the bristlecone data so it reflects the real temperatures, and then claiming it is a reliable proxy for GLOBAL temperatures.
It is the “things” they are busy adjusting that is questionable; just ask an accountant about adjusting things, i.e. cooking the books.

Gail Combs
November 26, 2009 5:07 am

TitaniumDragon (17:59:10) :
Gail, you should never, ever accept a number like “10 trees” without thinking about it first. How many trees did they SAMPLE to start out with? That’s the real question. And how did they choose them, for that matter.
You need to remember that there are VERY stringent requirements on what trees you can use – they have to have been and always been healthy, have to have been taller than nearby trees for the whole usable life, etc., because if they fail any of those you have severe confounding factors.
So in reality, 10 trees may be perfectly fine, so long as they were the correct 10 trees to sample.”

The minimum sample size should be at least 25, preferably 30, for this type of attributes data. 10 trees should only be considered a preliminary study to see if the idea is worth pursuing, and nothing more.
I do not have to “THINK” about it. I have been dealing with sampling plans for 30 years. The sample size sucks, period. Heck, we pulled more samples when qualifying a lot of shampoo caps that came out of the same plastics mold! Something as variable as a tree needs a larger sample size than I would use for a plastic cap (n=25).
I would want n=25 trees at each of at least 100 different locations, correlated to accurate temperature measurements, before considering the tree ring temperature proxy reliable. Otherwise it is just hand waving, as the divergence after 1960 shows.
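A back-of-envelope version of the sampling point (the spread figure is mine, purely for illustration): the standard error of a mean shrinks with the square root of n, so n=10 gives a much wider uncertainty than n=25 or n=30.

import math

sigma = 1.0                                   # assumed between-tree spread
for n in (10, 25, 30):
    print(n, round(sigma / math.sqrt(n), 3))  # 10 -> 0.316, 25 -> 0.2, 30 -> 0.183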

kwik
November 26, 2009 9:10 am

Hello,
I think the politicians needs to see this video here first, before spending any more taxpayers money on anything that has to do with research;
http://www.ted.com/talks/hans_rosling_shows_the_best_stats_you_ve_ever_seen.html

pope john
November 26, 2009 1:11 pm

It is difficult to get a man to understand something when his salary depends upon his not understanding it.
A trial without a defense is a sham
Business without competition is a monopoly
Science without debate is propaganda.
The most telling point is that after spending $30 billion on pure science research no one is able to point to a single piece of empirical evidence that man-made carbon dioxide has a significant effect on the global climate.

tonydej
November 27, 2009 3:58 am

Any suggestions where I should register outrage? Petitions, web-sites etc.
I see Al Gore is on the front page of the FT, saying ‘there is no more place for short-termism’ or suchlike. Such hucksterism: Rush me my planet-saving kit now… I agree to pay over 24 years etc. etc.

November 27, 2009 12:15 pm

Looking at comments is hardly code analysis.
IIRC, there was a comment in the patches for one of the DEC disk drivers that asked: “What do you get if you multiply 6 X 9?” The answer, of course, is 42 (as any Hitchhiker’s Guide aficionado knows). But it matters not what the comments say; if you want to show that something untoward is happening, show it IN THE CODE. The comments are meaningless by themselves.
Cheers,

Jack Enright
November 27, 2009 10:13 pm

I don’t know about programming; I can barely cope with using a PC. The only thing I know about climate is that it changes.
Sadly, though, at 62 I’ve met a whole bunch of frauds in my time, and I’ve got pretty good at spotting them. One of the biggest giveaways is when, if somebody starts asking detailed questions, they get evasive, then huffy, then downright hostile – and then start character assassination behind the questioner’s back.
Also, I’ve worked in enough universities to know that ANYONE who submits a piece of work in any discipline, even a first year student, has to reference sources of data or quotations (in an approved format), and methods of working, so that anyone else can check back on their work and verify it. Failure to do so will result in no marks for that work. UEA have not only allowed one of their academics to publish work in their name which failed to meet the above conditions, but have turned a blind eye while he broke the law, by deleting files to prevent them being accessed under the Freedom of Information Act.
Jones, and his supporters, can say what they like about the e-mails, the comments, and the software; in my opinion, those are side issues – though, I grant you, serious ones. But the fact that he has deliberately refused other academics access to his raw data and methodology so that they can independently verify his work damns him professionally. And the fact that UEA has allowed him to do so, whilst publishing work in their name, means they have no right to any academic credence either.
Compare his attitude with that of an archaeologist, when told that some perfectly formed wooden javelins had been found in a peat bog in Germany – and had been carbon dated at 140,000 years old (long before homo sapiens, and long before anyone suspected that manufactured tools were in use). He said;
“This is the most exciting thing that’s ever happened to me since I started studying archaeology! It means that everything I’ve learnt and been taught is ALL thrown out the window – and we have to start all over again!”

tonydej
November 28, 2009 2:10 pm

Here are two links to register dissent, both ‘official’ for the UK.
http://petitions.number10.gov.uk/UEACRU/
http://www.sciencemuseum.org.uk/proveit.aspx

layman
November 30, 2009 8:27 pm

I’m not much of a scientist and an even worse programmer, so I try to avoid debates on both subjects as much as possible.
What appears to be one of the most galling aspects of the global warming crowd is their immediate and vicious attacks on the scientific credentials of those who remain skeptical, while they marinate in their adoration of Al Gore and anyone else who shares their view. Last I checked, Mr. Gore’s scientific credentials are about as feeble as they get. Simply reading up on subjects and shouting them louder than everyone else in the room typically makes one appear as little more than a drunken didact. Gore evidently has touched on the right subject, but then he did major in government and took a shot at law (without much success), so while his science may be lacking, his ability to manipulate a heathen throng appears second to none.
In any case, the climate change crowd (or GW, or AGW, or ACC if you prefer) seem to rely on the pejorative “denialist” (is that even a word?) and hissily insist that no one is capable of looking at the data and inferring anything of significance – presumably hoping to imply that while they have the required knowledge to divine the truth in the data, no one else should try, lest they hurt their tiny brains or, worse, have the temerity to disagree. A political tactic if ever there was one, so I guess Gore’s students are absorbing the right messages.
When one looks at the quality of the debate, it’s not hard to come away with the impression that it is the climate change folks who are doing the most damage – possibly based on their seemingly unshakeable belief that every single person who disagrees with them is either a loon, an idiot, or funded by Exxon. This belief is evidently acceptable justification for charging into any old discussion braying like a mad cow and insulting everyone within earshot.

December 1, 2009 4:55 pm

I reviewed the posted bits of “data” where the comments refer to “negative values”…
I was shocked. Shocked that a supposed scientist and “programmer” wouldn’t recognize this immediately. It’s a classic integer overflow. Simple. Basic. Rookie error. I don’t know what the bit field length is, but it’s simple: square a number that fits in 16 bits, and if the result is too large to fit in 16 bits – voila! – you get a negative number, and consequently you should have gotten an overflow error as well. This was basic, Programming 101 material. G-d help us all. I’m glad these guys are in the weather industry and not launching men into space.
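A quick check of that wrap-around, using NumPy to force 16-bit arithmetic:

import numpy as np

x = np.int16(200)
print(x * x)   # 200*200 = 40000 doesn't fit in a signed 16-bit field,
               # so it wraps to 40000 - 65536 = -25536 (with an overflow warning)

Any result above 32767 wraps negative in the same way.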

Andrew Sykes
December 2, 2009 2:40 pm

What isn’t always apparent with computer-based climate modelling is that it’s still a developing science. No one’s claiming all the kinks have been worked out, merely that the overall direction of the research points conclusively towards humans having an impact on climate. The upshot of this is that data may seem to be altered/kept out/changed to fit expected outcomes not as part of some bizarre conspiracy but as part of good, well supported and mostly transparent science. It’s not perfect, but it ain’t wrong either.

kwik
December 2, 2009 8:49 pm

It seems to me that many scientists working for the government have a very twisted view of the private sector.
Someone working in the private sector cannot really be trusted – and yet, within climate science, some of them are doing the very thing they themselves claim the private sector does:
hiding data, hiding algorithms.
Well, there are honest climate scientists too, like those at the University of Alabama doing satellite measurements. They put their raw data on the web, for anyone to plot.
But then again, they are not claiming that we are at a tipping point and that we are all doomed unless we all subdue ourselves under a planetary government controlled by a very, very small elite. On the contrary: John Christy, who to me seems the archetype of an honest scientist, says that this CO2 forcing factor is plain wrong – that data from the real world indicates the very opposite.
Dr. Lindzen says the same. And he is rooted in the real world too, not in Xbox 360 science.
The CRU crew should have put all the raw data, collected using our tax money, on the web.
And they should have put on the web the specifications for the out-sourced, state-of-the-art software weighting all these data into understandable plots, along with the source code for the core software doing this weighting, so it could be inspected.
If they had had more contact with the private sector, and the technology advances going on within software in a free market economy, this would have been in place by now.
Unfortunately, they seem to be on another planet from us, with some tipping point just around the corner. With one leader, one people, one government. Have you seen it before?

December 3, 2009 2:21 am

Andrew Sykes (14:40:45) :
At the risk of being snipped for questioning your motivation, your post looks like it’s been made by a climate modeller trying to defend the indefensible. If you are a climate modeller or are part of the process (you use the output from climate models to predict when the ‘tipping point’ will come), then you will know full well that the parameters built into the (parameterised functions in the) climate models are specifically chosen to show significant (more often than not dangerous/catastrophic) warming in the future.
The assumption built into these climate models of significant net positive feedback is rarely questioned by those who are funded to produce these ‘the sky will collapse in xxxx’ predictions. And I’ve yet to see any real laboratory-based experiments (with one exception) which justify the values derived for some of the key claimed net significant positive feedback parameterisations, e.g. the effects of clouds. That one exception is Svensmark’s work and the soon-to-start CERN CLOUD experiments.
How, therefore, do you justify your statement that
“merely that the overall direction of the research points conclusively towards humans having an impact on climate. The upshot of this is that data may seem to be altered/kept out/changed to fit expected outcomes not as part of some bizarre conspiracy but as part of good, well supported and mostly transparent science.”
The models aren’t anywhere near a stage of development where they can ‘point conclusively towards humans having an impact on climate’, nor is climate modelling ‘part of good, well supported and mostly transparent science’. I’d agree with your ‘well supported’ bit, though, as sadly, like many in the UK, I have little choice in funding this well-funded (by me and other UK taxpayers) pseudo-science.
KevinUK

Ian
December 5, 2009 3:23 am

When they speak of not plotting data beyond 1960 because of the divergence problem you have to understand what is meant by the divergence problem.
The divergence problem is that tree ring data beyond 1960 does not show the expected increase in temperature. In fact the tree ring data would indicate that temperature had declined. This is the decline that gets frequent mention elsewhere.
If tree ring data is a good proxy for temperature, that would mean the temperature hadn’t increased since 1960. This contradicts the carefully (re)constructed temperature record from land-based measurements. On the other hand, if tree ring data isn’t a good proxy for temperature now, then why should we think it was ever a good proxy for temperature in the past?
The honest thing to do in this situation is to plot all the tree ring data. A scientist shouldn’t only publish results which confirm their expectations. The approach at the CRU seems to have been to either throw away the data after 1960 on the grounds that it was contaminated by this mysterious `divergence problem’, or worse still to compensate for the divergence problem using a fudge factor.
There is really nothing wrong with that tree ring data. They don’t plot it or use it because it is ‘climatically incorrect’.

Jandui
December 6, 2009 7:56 am

It reminds me of someone: Adolf Hitler? I think so…

Kevin007
December 9, 2009 6:44 am

an excerpt from one November 1999 e-mail authored by Phil Jones reads:
“I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie, from 1981 onwards) and from 1961 for Keith’s to hide the decline.”
Some of you climate denialists cite this sentence as evidence that temperature statistics are being manipulated. Several scientific sources have said that the decline being referred to is a decline in tree ring metrics, not temperature. RealClimate characterizes the e-mail excerpt as follows:
The paper in question is the Mann, Bradley and Hughes (1998) Nature paper on the original multiproxy temperature reconstruction, and the ‘trick’ is just to plot the instrumental records along with reconstruction so that the context of the recent warming is clear. Scientists often use the term “trick” to refer to a “a good way to deal with a problem”, rather than something that is “secret”, and so there is nothing problematic in this at all. As for the ‘decline’, it is well known that Keith Briffa’s maximum latewood tree ring density proxy diverges from the temperature records after 1960 (this is more commonly known as the “divergence problem”–see e.g. the recent discussion in this paper) and has been discussed in the literature since Briffa et al in Nature in 1998 (Nature, 391, 678-682). Those authors have always recommended not using the post-1960 part of their reconstruction, and so while ‘hiding’ is probably a poor choice of words (since it is ‘hidden’ in plain sight), not using the data in the plot is completely appropriate, as is further research to understand why this happens.
The “trick” is simply a concise way of showing the two kinds of data together while still clearly indicating which was which; Mann has denied that there was anything “hidden or inappropriate” about it. His method of combining proxy data has been corroborated by numerous statistical tests and matched thermometer readings taken over the past 150 years.
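To make the dispute concrete, here is a synthetic illustration in Python of the presentation being argued over (all series are invented, not real data): the reconstruction plotted only up to 1960, the instrumental record overlaid, and the truncated post-1960 “decline” shown separately.

import numpy as np
import matplotlib.pyplot as plt

years = np.arange(1880, 2000)
rng = np.random.default_rng(2)
instrumental = 0.006 * (years - 1880) + rng.normal(0, 0.05, years.size)
proxy = instrumental + rng.normal(0, 0.08, years.size)
proxy[years > 1960] -= 0.004 * (years[years > 1960] - 1960)   # the "divergence"

plt.plot(years, instrumental, label="instrumental record")
plt.plot(years[years <= 1960], proxy[years <= 1960],
         label="tree-ring proxy (truncated at 1960)")
plt.plot(years[years > 1960], proxy[years > 1960], "--",
         label="proxy after 1960 (the 'decline')")
plt.legend()
plt.show()

Whether the dashed segment belongs on the chart is, in miniature, the whole argument of this thread.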

Ian W
December 9, 2009 8:08 am

Kevin007 (06:44:37) : yet again repeats the “‘trick’ is simply a concise way of showing the two kinds of data together while still clearly indicating which was which; Mann has denied that there was anything ‘hidden or inappropriate’ about it. His method of combining proxy data has been corroborated by numerous statistical tests and matched thermometer readings taken over the past 150 years”
The problem is that it did NOT match thermometer readings, and that was what was hidden. If they hadn’t hidden that mismatch, the entire collated data set would have been questioned. It would be remarkable if they managed to get tree rings to match temperatures – as they don’t – well, not without some repeated manipulation of the figures. Briffa actually admits this in one of his papers. And you are aware that botanists do not agree that tree rings indicate temperature alone; indeed, recent research has shown they appear to correlate with levels of cosmic rays.
As you are obviously a statistician, you can also tell us the statistical significance of the tree rings of the single tree on the Yamal peninsula in Siberia that drove Briffa’s results on _global_ temperatures.

Hideshi Nomura
December 12, 2009 6:16 pm

May the Physics be with you.

unbound
December 26, 2009 10:56 am

Lovely… you want a rational answer, but you seem to refuse to actually look for that answer (or possibly refuse to accept it).
The problem is how to deal with tree rings that are no longer consistent in size due to the increased carbon emissions in the atmosphere. The “trick” was to account for the messed-up tree rings after 1960 so they could see whether the trend, after correction to the tree rings, would continue to line up with the recorded temperatures.
Don’t act like you are trying to find the truth if you are not actually going to look for it.

Admin
December 26, 2009 11:12 am

sigh.
The problem is misleading policy makers and the public as to the certainty of the reconstructions and the reliability of these tree-ring based temperature reconstructions.
Unbound:
How do you know ” with tree rings that are no longer consistent in size due to the increased carbon emissions in the atmosphere. “?
Your statement is just a made-up, arm-waving dismissal of data to reach a preconceived conclusion. There is no way to know that long-term treemometers ever existed in the first place, except for some apparently cherry-picked data correlations.

Ian W
December 26, 2009 1:43 pm

I realize that ‘climatologists’ wanted to find a proxy for temperature – so they took tree rings. Perhaps if they had briefly spoken to a botanist – someone who actually knows about plant growth – even the university gardener, they would have been told (and they were told, unasked, by several) that tree rings are as reliable an indicator of past temperature as reading tea leaves. Far too many variables affect plant growth; all that can be said about a ‘thick year’ is that the tree grew well.
This became very apparent when the proxy temperature guesses from the tree rings went down after 1960 when the actual temperatures were going up. This actually showed that the proxies were invalid. If they were invalid after 1960 then they were invalid before 1960 despite possible short term correlations. Thus all the reasoning based on THE tree at Yamal etc was based on invalid assumptions. The entire argument using tree rings was baseless. Instead of being honest about this – ‘the team’ hid this validation failure.
So the validation failure exposed and compounded a larger ethics failure in University of East Anglia.

David
December 26, 2009 4:25 pm

Local weather is not global climate. The temperatures at Yamal could have declined after 1960 as they have in many places. We don’t use one thermometer to track global weather and neither should a single stand of trees be used. BAD SCIENCE!

December 26, 2009 8:43 pm

So, Ian, once they consulted with the botanists and determined that they could filter out static from other causes and use the rings as an approximate proxy for thermometers, and the botanists told them that tree rings are accurate indicators of past growth – much, much more accurate than your tea-leaf reading – what then? What sort of hubris makes you think there were no such consultations?
You can’t be bothered to read the papers and the science?
David, how many different sites measured does it take to equal good science on the issue? How do you know? What sort of hubris was it that caused you to think these scientists didn’t bother to get more than one location?
You can’t be bothered to read the papers, either?

Ian W
December 27, 2009 1:04 am

Ed
If you read the CRU emails you will see that they _were_ told that there were far too many variables affecting tree growth rate for it to be just temperature. For example, a hot dry year will result in less growth than a temperate wet year; a pest or fungus infestation will result in less growth. This is not ‘noise’; it is that many variables affect tree growth, and using it for temperature alone is unsafe – especially when the claims are for accuracy to better than plus or minus a few degrees.
If you must use a proxy for a variable then your research and results should also include totally open validation of that proxy to prove that it does act as a reliable proxy.
This was not done. Indeed when the unreliability of the proxy became apparent – everyone involved – agreed to hide it. Now you may have different ethics to the rest of the scientific community – but most scientists report their results openly even if they are unfavorable to their research. Research is intended to increase knowledge, not prove preconceptions.
So, Ed: climate research is based on proxies. Do you have a problem with validating these proxies as correct and reporting, as part of that research, the validation carried out, their level of accuracy or inaccuracy, and any ‘adjustments’?
I ask as it is notable that the IPCC committee (at the meeting in Tanzania) and University of East Anglia and Penn State University appear to be of the opinion that you should hide proxy validation failures and original data.
Perhaps scientific ethics have changed?

December 27, 2009 7:27 pm

If you read the CRU emails you will see that they _were_ told that there were far too many variables affecting tree growth rate for it to be just temperature. For example, a hot dry year will result in less growth than a temperate wet year; a pest or fungus infestation will result in less growth. This is not ‘noise’; it is that many variables affect tree growth, and using it for temperature alone is unsafe – especially when the claims are for accuracy to better than plus or minus a few degrees.

And if you read the papers you’ll see that those problems were taken into consideration. No one argues that tree rings substitute for thermometer readings, but as a tool to corroborate other measures, they work just fine. The original allegation here was that the dendrochronologists don’t know botany. That’s demonstrably wrong, and the papers do consider the stuff you claim they don’t consider. Read the papers.

If you must use a proxy for a variable then your research and results should also include totally open validation of that proxy to prove that it does act as a reliable proxy.

The research is totally open. Read the papers.
You’re complaining about a chart based on the research papers. The chart was designed to show how dendrochronological data correlate with and corroborate other data dealing with global warming. As with many other measures in many other areas — carbon dating has to deal with increased carbon after about 1850, too, for one example — there are variations in the correlation of the proxy measure over time. These variations are well known, not secret, and in this case have been subject to more than a decade of rather vigorous public debate about how to deal with them.
Anyone who reads the literature knows that the dendro data depart from the general trend in the 1950s or 1960s. We don’t know why — it may be, as with the carbon dating data, that the increased amount of carbon has something to do with it. Or it may be that acid rain dominates the effects shown in dendro data at that time, which is about when acid rain started to become a major factor. There is a wide variety of potential causes. More research may tease out which is the ultimate cause, or which are the causes.
But in any case, we know that the tree-ring data stop being a good proxy after about 1960. So, for the purposes of this one chart, real temperature readings were substituted.
In other words, known-to-be-accurate data were substituted for known-to-be-inaccurate data.
I’d love to hear one of the warming skeptics explain why actual data should be considered “fraud” in this case. In the real world, that’s a good move, to put in more accurate data in a chart. In any case, this was the substitute data.
If the proxy data are not accurate for the years prior to 1960, the inaccuracies should be explained in the discussions in the papers prior to 1998. Can you point me to a paper that says the data don’t work at all? I’ve not found it. In general it’s fair to say that the proxy data are well known to correlate well with temperature readings in places and times where we couldn’t have temperature readings, with the qualifications well identified in the papers.

This was not done.

Seriously? What do you call the debate over the previous decade? Your argument that we should have used inaccurate data is specious from the start, and here you claim that the difficulties with the proxy data were not known — when your only sources of information on the difficulties are the research papers already on the record identifying those problems.
This shouldn’t be so convoluted. Are the data inaccurate? Can you show us where and how?

Indeed when the unreliability of the proxy became apparent – everyone involved – agreed to hide it.

That’s a false statement on your part. The unreliability of the proxy applied only after 1960, and it’s well discussed in the papers. None of those papers was pulled. All of them remain in public, in research journals.
Your failure to study the issue is not “hiding the data” on the part of the people who did the work.

Now you may have different ethics to the rest of the scientific community – but most scientists report their results openly even if they are unfavorable to their research. Research is intended to increase knowledge, not prove preconceptions.

You may not subscribe to the scientific ethic which requires an accurate summary of the data one criticizes. But that doesn’t affect the accuracy of the overall data. It only means that trying to figure out what you’re saying will be difficult for laymen — you’ve hidden the facts. Odd, isn’t it? You’re doing what you falsely accuse the scientists of doing, and you’re crowing that your ethics are superior.
Research can increase knowledge only if people read it.

So, Ed: climate research is based on proxies.

That’s false. Among the more famous non-proxy research, climate research is based on measurements of CO2 on the mountains of Hawaii over the past 60 years, on temperature readings from various science agencies kept over the last 400 years, on the coming of spring, on the zones in which plants grow, on the first freezes and last freezes of a year, on the extent of glaciers as measured during the past 300 years, and a variety of other non-proxy data.
If you were laboring under the misconception that climate data is proxy data, or mostly proxy data, or significantly proxy data, we’ve identified a major source of error for your other observations.

Do you have a problem with validating these proxies as correct and reporting, as part of that research, the validation carried out, their level of accuracy or inaccuracy, and any ‘adjustments’?

I have no difficulty in using proxies for the purposes for which they are intended, so far as they tend towards an accuracy we can use. In this case, dendrochronology data are used to substitute for actual temperature measurements in times and places where we don’t have actual measurements. The proxies are calibrated against real measurements. Adjustments for accuracy are par for the course, to be expected, and not to be fought tooth and nail by people who wish for accuracy.
What is your real question, I cannot tell. If you’re asking whether the proxy data corroborate other measures that show super-warming, the answer is yes. If you’re asking whether proxy data drive climate research, the answer is no.

I ask as it is notable that the IPCC committee (at the meeting in Tanzania) and University of East Anglia and Penn State University appear to be of the opinion that you should hide proxy validation failures and original data.

That’s a false claim.

Perhaps scientific ethics have changed?

Nope, still much stiffer than blog ethics.
REPLY: Ethics or not, they still felt the compelling need to “hide the decline” by substituting one set of data for another, and that’s wrong. It’s like telling the US population that the dollar is strong by splicing the recent values of the Yen onto the value of the dollar data after it started declining and showing it all in one graph, saying “see, the dollar is robust!”. That doesn’t fly, and neither should mixing proxy and real data. It’s just plain wrong and was only done to strengthen their point when the data nature provided went the way they didn’t want it to.
All explained in detail with supporting evidence here: http://climateaudit.org/2009/12/10/ipcc-and-the-trick/
-Anthony

Ian W
December 28, 2009 1:25 am

Climate research IS based on proxies.
There are relatively accurate actual measures of temperature and CO2 going back to the 1800s; before that, there are historical records, and such things as ships’ logs can give relatively reliable wind records, but once you are back to, say, the 1600s, you are left with proxies.
The ENTIRE AGW hypothesis is that temperatures are rising at an exceptional rate and CO2 is at an exceptional level. To show this it is necessary to know the normal, so one can claim the current climate behavior is exceptional. The ‘normal’ climate behavior is only available through proxies and some historic records. For the AGW hypothesis to be accepted, the proxies needed to show that the known warm periods, such as the Medieval Warm Period and the Roman Optimum, were not actually as warm as today. This, I believe, is what drove the proxy research. Even historic ‘global’ CO2 is based on ice core data; the problem is (as the satellite imagery shows) that the CO2 concentration at the poles is significantly lower than elsewhere on the Earth, and CO2 also diffuses through ice over time. This results in an unnaturally low ‘historic global CO2 from ice cores’, so it is unsurprising that historic ice core records show low CO2. What IS surprising is the disregard of actual measures of CO2 taken in the 1800s that show far higher CO2 concentrations.
Climate research is not weather research; it works on multi-century timescales. We have the ability to measure things from satellites now, but that record is only 40 years old at most. Climatology studies this entire inter-glacial and other glacials and inter-glacials, and has to rely on proxies. It is time that an independent validation exercise/audit was carried out on all these proxies, as peer review is insufficient. A proxy that departs from actual measures in current times may well have similarly departed in previous times and is therefore unsafe.
Anthony has answered the issue of concealing proxies that fail validation.

kwik
December 28, 2009 7:51 am

I think this Finnish documentary gives a pretty good overview; seems station data is important now. We don’t see much about Climategate on Norwegian TV; they are probably trying to keep the lid on for as long as possible. But here are 3 parts on YouTube from a Finnish film about Climategate:

http://www.youtube.com/watch?v=Clpmt5_8MBg&feature=related

Ed Darrell
December 28, 2009 3:01 pm

Anthony said:

Ethics or not, they still felt the compelling need to “hide the decline” by substituting one set of data for another, and that’s wrong.

What they did stands up to the Rotarians’ Four-Way Test. What they did is entirely within the Scout Law.
I’m shocked to see Anthony Watts say it is “wrong” to delete information known to be inaccurate and substitute much more accurate information. That’s only “wrong” if one is cheering for total disaster.
But on an ethical scale, what they did stands near the top: getting good information out is always more valuable than telling falsehoods. And make no mistake about it, temperatures did not begin a precipitous decline in 1960.

It’s like telling the US population that the dollar is strong by splicing recent values of the Yen onto the dollar data after the dollar started declining, showing it all in one graph, and saying “see, the dollar is robust!”. That doesn’t fly, and neither should mixing proxy and real data.

I’d really love to see you justify using known-to-be-incorrect data in place of actual temperature readings. On one hand you campaign for perfectly accurate placement of the devices that measure temperatures, and here you claim that use of that accurate data is “dishonest.” You were closer to right the first time.

It’s just plain wrong and was only done to strengthen their point when the data nature provided went the way they didn’t want it to.

As it was, the chart showed less warming than actually occurred in the decade or so that has elapsed since it was published. I’m not sure what making the chart even more wrong on your side of the scale would have done, Anthony. Think about it for a moment: what would the headlines have read had people said “warming to hit a plateau for a decade or so” in 1995, and then we’d gone through 1998 and the well-above-projection temperatures from 2000 to 2009? The headlines would have yelled about how much more warming was occurring than the scientists had predicted.
It would be as if the television weatherman predicted warm, sunny weekends for the entire summer, and each weekend was cold and rainy. How long would that weatherman last at the station?
Even for the skeptical side — especially for the skeptical side — those guys did everyone a favor when they strove for accuracy over a foolish consistency.
In science, accuracy is to be prized above politically motivated data interpretations.
REPLY: I’m now convinced that your view of reality is hopelessly distorted. Say whatever you like, as it has no impact. But the fact remains that a portion of the tree ring dataset that showed a decline in temperature post 1961 was combined with data that showed the opposite result. If a stock report did something like this for a major corporation’s stock trend, people would go to jail for defrauding investors.
To help you understand, please see this:
Hide the decline graph
– Anthony

Ed Darrell
December 28, 2009 3:57 pm

I’m now convinced that your view of reality is hopelessly distorted. Say whatever you like, as it has no impact. But the fact remains that a portion of the tree ring dataset that showed a decline in temperature post 1961 was combined with data that showed the opposite result. If a stock report did something like this for a major corporation’s stock trend, people would go to jail for defrauding investors.

Make your case for claiming a decline after 1960, then. Remember, this chart was published in the 1990s. We knew what happened in the 1960s, 1970s, and 1980s. To do as you say, and show a trend known to be false, would indeed get one jailed if one were selling it as stock advice.
But that’s not what was going on here. The questions are more subtle, and demand more accuracy. The chart was meant to show the trends of global temperatures over a long period of time, with projections past the current date.
I’m sure I won’t convince you based on what you said above, but every Boy Scout and former Scout will understand: Substituting the figures from what actually happened was the right thing to do. It showed what actually happened, and cutting off the dendro proxy data prevented the presentation of a much more erroneous projection to the future.
As you know, the actual temperature readings after the chart’s current date were higher than projected. Had the erroneous data you ask for been used, the chart would have been even farther wrong. The panic that might have resulted from such an error is incalculable.

Vargs
December 29, 2009 12:04 am

Ed Darrell said:

Substituting the figures from what actually happened was the right thing to do. It showed what actually happened, and cutting off the dendro proxy data prevented the presentation of a much more erroneous projection to the future.

What seems impossible to get across in this discussion is that the only real science produced in these papers is a categorical demonstration that dendro data have no predictive value as a proxy for near surface temperature. That’s an interesting and valuable finding. Why not leave it at that?
What was actually published was scriptural: a conflation of whatever straws were to hand to illustrate a thesis. This is not science, and it would get short shrift in any other hard-science discipline. The distrust that many of us who read Anthony’s blog have is that the primary evidence (viz., the surface station data) is in the hands of people who believe that the message is more important than the science and who have manifestly manipulated data to reinforce it.
Ed asserts that we know that the actual temperature readings were higher than projected. That’s exactly the point, Ed. We don’t.

Ian W
December 29, 2009 2:08 am

Vargs, you wrote:

“What seems impossible to get across in this discussion is that the only real science produced in these papers is a categorical demonstration that dendro data have no predictive value as a proxy for near surface temperature.”

I fully agree – but this also means that there is no way of knowing whether dendro data have reconstructive value as a proxy either. This is the reason that ‘the decline’ in proxy temperatures had to be hidden – not just to make a pretty graph. There must also be doubt over the CO2 ‘perfect match’ between ice core data taken at the poles, where atmospheric CO2 concentrations are always lowest, and the current measurements, supposedly a world average. One wonders whether any ‘decline’ has been hidden there.
“The distrust that many of us who read Anthony’s blog have is that the primary evidence (viz., the surface station data) is in the hands of people who believe that the message is more important than the science and who have manifestly manipulated data to reinforce it.”
Again, this is an extremely important point. Surely it is time that metrics and their validation were separated from the research into what those metrics may mean. A UN organization cannot be expected to respect academic neutrality, but scientists should.

“Ed asserts that we know that the actual temperature readings were higher than projected. That’s exactly the point, Ed. We don’t.”

Not only that, but we don’t know what past temperatures were either, given this conflation of unvalidated proxies. So we enter a Humpty Dumpty world where the proxies mean “whatever the researchers want them to mean, neither more nor less”.
This is the reason why there is a requirement for reproducibility. It would seem that a research project is required to validate the proxies and the other temperature and CO2 data, running them up into the modern period of reliable metrics. It could identify where proxies were usable and where they were not, and also the limits on the precision of the metrics from each proxy. It should also run independent quality checks on the modern metrics. The results of this open-book project would then be used to validate other research results based on proxies – such as the research which used a single tree on the Yamal peninsula as a global temperature proxy.
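A minimal sketch, in Python with invented data, of the kind of validation exercise proposed here: calibrate the proxy on part of the instrumental period, then verify it against the part held back. The split years and the statistics chosen are illustrative assumptions, not a prescribed protocol:

import numpy as np

# Invented overlap data: instrumental temperatures and a proxy series.
years = np.arange(1900, 1991)
rng = np.random.default_rng(1)
temps = 0.008 * (years - 1900) + rng.normal(0.0, 0.1, years.size)
proxy = 10.0 + 25.0 * temps + rng.normal(0.0, 1.0, years.size)

# Split the overlap: calibrate on 1900-1950, verify on 1951-1990.
cal = years <= 1950
ver = ~cal

slope, intercept = np.polyfit(proxy[cal], temps[cal], 1)
recon = slope * proxy + intercept

# Verification statistics over the withheld period: correlation, and the
# reduction-of-error (RE) score relative to the calibration-period mean.
r_ver = np.corrcoef(recon[ver], temps[ver])[0, 1]
re = 1.0 - np.sum((temps[ver] - recon[ver]) ** 2) / np.sum((temps[ver] - temps[cal].mean()) ** 2)
print(f"verification r = {r_ver:.2f}, RE = {re:.2f}")

A proxy that fails such a check against the instrumental record (low verification correlation, RE at or below zero) cannot be trusted to reconstruct temperatures outside that record either.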
As it stands now, no scientist knows who or what to trust; decades of cited peer-reviewed research and learning may be based on unsound premises.