CRU emails "may" be open to interpretation, but the programmer's own comments in the code tell the real story

When the CRU emails first made it into news stories, there was an immediate reaction from the head of CRU, Dr. Phil Jones, over this passage in an email:

From a yahoo.com news story:

In one leaked e-mail, the research center’s director, Phil Jones, writes to colleagues about graphs showing climate statistics over the last millennium. He alludes to a technique used by a fellow scientist to “hide the decline” in recent global temperatures. Some evidence appears to show a halt in a rise of global temperatures from about 1960, but is contradicted by other evidence which appears to show a rise in temperatures is continuing.

Jones wrote that, in compiling new data, he had “just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (i.e., from 1981 onwards) and from 1961 for Keith’s to hide the decline,” according to a leaked e-mail, which the author confirmed was genuine.

Dr. Jones responded:

However, Jones denied manipulating evidence and insisted his comment had been taken out of context. “The word ‘trick’ was used here colloquially, as in a clever thing to do. It is ludicrous to suggest that it refers to anything untoward,” he said in a statement Saturday.

OK, fine. But how, Dr. Jones, do you explain this?

There is also a file of code in the collection of emails and documents from CRU. A commenter named Neal on Climate Audit writes:

People are talking about the emails being smoking guns, but I find the remarks in the code, and the code itself, more of a smoking gun. The code is so hacked around to give predetermined results that it shows the bias of the coder. In other words: make the code ignore inconvenient data to show what I want it to show. The code, after a quick scan, is quite a mess. Anyone with any pride would be too ashamed to let it out for public viewing. As examples of bias, take a look at the following remarks from the MANN code files:

Here’s the code with the comments left by the programmer:

function mkp2correlation,indts,depts,remts,t,filter=filter,refperiod=refperiod,$
datathresh=datathresh
;
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
;

pro maps12,yrstart,doinfill=doinfill
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;

and later, the same comment appears again in another routine:

;
; Plots (1 at a time) yearly maps of calibrated (PCR-infilled or not) MXD
; reconstructions of growing season temperatures. Uses "corrected" MXD - but
; shouldn't usually plot past 1960 because these will be artificially adjusted
; to look closer to the real temperatures.
;


You can claim an email you wrote years ago was "taken out of context," but a programmer leaves notes in the code to document what the code is actually doing at that stage, so that anyone who looks at it later can figure out why this function doesn't plot past 1960. In this case, the code is not allowing all of the temperature data to be plotted. Growing-season data (the summer months, when the new tree rings are formed) past 1960 is thrown out because "these will be artificially adjusted to look closer to the real temperatures," which implies some post-processing routine.
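To make the comment concrete, here is a minimal hypothetical sketch in Python (not the CRU code, which is IDL) of what a plotting routine that obeys "shouldn't usually plot past 1960" would do: simply drop every data point after the cutoff year. All names and values below are illustrative assumptions, not taken from the CRU files.

```python
# Hypothetical sketch of the cutoff the IDL comment describes: the plotting
# routine keeps only reconstruction values up to 1960 and discards the rest.

CUTOFF_YEAR = 1960

def mask_after_cutoff(years, values, cutoff=CUTOFF_YEAR):
    """Keep only the (year, value) pairs at or before the cutoff year."""
    return [(y, v) for y, v in zip(years, values) if y <= cutoff]

# Illustrative (made-up) MXD anomaly values around the cutoff.
years = list(range(1955, 1965))
mxd = [0.1, 0.2, 0.15, 0.3, 0.25, 0.2, 0.1, 0.0, -0.1, -0.2]

plotted = mask_after_cutoff(years, mxd)
# Everything after 1960, where the divergence ("the decline") appears,
# never reaches the plot.
```

The mechanics are trivial; the point is the effect: whatever the post-1960 data showed, it is simply absent from the figure.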

Spin that; spin it to the moon if you want. I'll believe the programmer's notes over the word of somebody who stands to gain from suggesting there's nothing "untoward" about it.

Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result.

For more details, see Mike’s Nature Trick

UPDATE: By way of verification….

The source files with the comments that are the topic of this thread are in this folder of the FOI2009.zip file

/documents/osborn-tree6/mann/oldprog

in the files

maps12.pro

maps15.pro

maps24.pro

The first two files are dated 1/18/2000, and the maps24 file 11/10/1999, so they fit the timeline of Dr. Jones's email mentioning "Mike's Nature trick," which is dated 11/16/1999, six days after the maps24 file.

UPDATE2: Commenter Eric at the Climate Audit Mirror site writes:

================

From documents\harris-tree\recon_esper.pro:

; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
;

Note the wording here “avoid the decline” versus “hide the decline” in the famous email.

===============

I'll give Dr. Jones and CRU the benefit of the doubt; maybe these are not "untoward" issues, but they scream for rational explanations. Transparency, and the ability to replicate all of this years ago, would have gone a long way toward either correcting problems or assuaging concerns.




Basil
Editor
November 23, 2009 5:35 am

As a bit of clarification in my response to Nick, I’m now actually reading the Harry_Read_Me.txt file for myself. It is an attempt to update CRU TS, but it also references tree ring data and the problem of “the decline.” So, it would seem, on this first quick look, that prior to 1960, the world temperature data in this gridded set are being forced to look like the tree ring data, but after 1960, “real” temperatures are used. Is that a fair reading?

Midwest Mark
November 23, 2009 5:35 am

Our local “newspaper” this morning ran this headline as the top story: “Since ’97, global warming has just gotten worse”. It runs through all the usual AGW talking points; i.e., polar bears are threatened, Arctic ice is at an all-time low, huge chunks of ice are breaking off of Antarctica, glaciers are disappearing, etc. It’s an AP story from Seth Borenstein, and it’s supported by quotes from, among other sources, Janos Pasztor, a “UN climate advisor.” If you follow the jump to page A4, there’s an accompanying story (finally!) about these emails, but the tone of the story is accusatory. That is, some dastardly hackers illegally obtained information and are bent on spreading lies!
I’d say this is a good sign. The AGW coalition has to be very desperate and alarmed (no pun) to find themselves in such a tenuous position! They’re even beginning to run stale global warming stories as banner headlines! “Pay no attention to that man behind the curtain!”

MangoChutney
November 23, 2009 5:38 am

It seems my post at RC is being held up in the queue at the moment, while other posts in favour of RC sail through.
Wasn't one of the alleged emails about holding up and censoring posts on RC?

Editor
November 23, 2009 5:41 am

Ed (23:36:26) :

The comments in the HARRY_README file are pretty wild, however. So wild that I haven’t really figured quite what to think about that just yet. There are other comments in the source files that mention data that was lost (cloud data) and which they recreate or try to re-create based on other data or sensor inputs. The HARRY_README though is rather wild.

You can say that again. 🙂

Robinson
November 23, 2009 5:42 am

It’s also not surprising, as you will know, that those not trained as Software Developers, or in Computer Science in general, would have great faith in computer models, even going as far as to suggest that there’s something wrong with reality if it doesn’t match the model! There’s something magical about a computer, if you don’t program them for a living.

Gary
November 23, 2009 5:42 am

The next question is: what do they mean by the “real temperatures” that the programs are adjusting to? Surely not the ones affected by the demonstrated warming biases, faulty station siting, dropouts, and questionable recording standards.

stephen richards
November 23, 2009 5:52 am

E.M.Smith (23:56:46) :
Yet another very good post from you: entirely accurate, if a bit cynical. But then, programmers like these make you very cynical and thoroughly peed off.
Well done, EM

November 23, 2009 5:53 am

Tonight here in Australia for the first time Tony Jones from “Lateline” began to ask the hard questions with an expose of the CRU scandal as well as an interesting interview with a very nervous Tim Flannery who attempts to represent a viable AGW platform here in Australia.
It’s a huge week here.
ABC links here:
http://www.abc.net.au/lateline/content/2008/s2751375.htm
http://www.abc.net.au/lateline/content/2008/s2751390.htm

MattN
November 23, 2009 5:53 am

I’ve got my popcorn ready.
Keep at it fellas….

Curiousgeorge
November 23, 2009 5:54 am

Danny V (05:11:32) :
And the Copenhagen propaganda machine continues at full speed.
http://malaysia.news.yahoo.com/ap/20091123/tbs-sci-climate-09-post-kyoto-f8250da.html
No doubt that story was already written and in the printing queue before the Hadley story broke. I expect we’ll see more of this kind of thing for the next few days.
When a dam breaks, it always starts with a trickle.

Frank Lansner
November 23, 2009 5:59 am

Jay (05:01:28) :
“E.M.Smith (23:56:46) :
Wonderful explanation. Everyone should read what you wrote. Perhaps Anthony will make a blog post about this. The details are amazingly enlightening. Keep it up, guys.”

E.M.Smith is always a good read 🙂

Frank Lansner
November 23, 2009 6:01 am

“michael (04:54:48) :
reality vs. model
http://i50.tinypic.com/301j8kh.jpg
have fun…!”

Michael: where do these data come from??? Interesting.

fred
November 23, 2009 6:02 am

P. Gosselin 01:33:38
Can you explain to me why the sunspot count on the widget shows 13 when there are just little specks?
This is not sarcasm. I have noticed several times lately that the “count” has been in that range when spots are barely visible.
Are counts today the same as they would have been a century ago? We have been told that, but I’m beginning to wonder.

fred
November 23, 2009 6:05 am

Correction to : 06:02:03
Not on the widget, but on the “Solar-Terrestrial Data”

Jimbo
November 23, 2009 6:11 am

Jesse (21:24:11) :
“Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.”
Well Jesse I believe that apart from emails there was also a bit of data. When its all scoured over by scientists who previously were denied access we might see the making of a mountain. 🙂

stephen richards
November 23, 2009 6:14 am

E.M.Smith (01:47:51) :
I used to teach Software Engineering many moons ago and one of the tricks we pulled on the students was designed to mimic the process you have defined.
What we did was send each group of students into a separate room and ask them to write a program sequence in English. The sequence was to be used to instruct someone to make a cup of tea. You know … get the kettle, fill it with water from a water tap, plug it in, etc., etc. It was hilarious, BUT very enlightening.

stephen richards
November 23, 2009 6:15 am

sorry about the upside down M

Spartacus
November 23, 2009 6:17 am

Pay attention to the code in briffa_sep98_d.pro:
************************************************
;
; Now prepare for plotting
;
loadct,39
multi_plot,nrow=3,layout='caption'
if !d.name eq 'X' then begin
window,ysize=800
!p.font=-1
endif else begin
!p.font=0
device,/helvetica,/bold,font_size=18
endelse
def_1color,20,color='red'
def_1color,21,color='blue'
def_1color,22,color='black'
;
restore,'compbest_fixed1950.idlsave'
;
plot,timey,comptemp(*,3),/nodata,$
/xstyle,xrange=[1881,1994],xtitle='Year',$
/ystyle,yrange=[-3,3],ytitle='Normalised anomalies',$
; title='Northern Hemisphere temperatures, MXD and corrected MXD'
title='Northern Hemisphere temperatures and MXD reconstruction'
;
yyy=reform(comptemp(*,2))
;mknormal,yyy,timey,refperiod=[1881,1940]
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=22
yyy=reform(compmxd(*,2,1))
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
;
oplot,!x.crange,[0.,0.],linestyle=1
;
plot,[0,1],/nodata,xstyle=4,ystyle=4
;legend,['Northern Hemisphere April-September instrumental temperature',$
; 'Northern Hemisphere MXD',$
; 'Northern Hemisphere MXD corrected for decline'],$
; colors=[22,21,20],thick=[3,3,3],margin=0.6,spacing=1.5
legend,['Northern Hemisphere April-September instrumental temperature',$
'Northern Hemisphere MXD'],$
colors=[22,21],thick=[3,3],margin=0.6,spacing=1.5
;
end
****************************************
Note the code after "; Apply a VERY ARTIFICAL correction for decline!!"
Cheers
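For readers who don't read IDL: the block after that comment builds a hand-written adjustment curve anchored at a handful of years, interpolates it onto the yearly time axis, and (in the commented-out lines) adds it to the MXD series before smoothing and plotting. A rough Python re-implementation of just the adjustment arithmetic, with NumPy's `np.interp` standing in for IDL's `interpol`, might look like this. It is a sketch of the quoted snippet only, not the full program:

```python
import numpy as np

# Anchor years for the hand-written adjustment (IDL: yrloc).
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))

# The adjustment values, scaled by 0.75; the source labels this "fudge factor".
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1, 0.3,
                   0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

# Mirrors the IDL sanity check (message,'Oooops!').
assert yrloc.size == valadj.size, "Oooops!"

# Interpolate the adjustment onto the yearly axis used for plotting
# (1881-1994 in the quoted code).
timey = np.arange(1881, 1995)
yearlyadj = np.interp(timey, yrloc, valadj)

# The commented-out "corrected" variant would then smooth and plot
# yyy + yearlyadj instead of yyy, lifting the post-1960 values by up to
# 2.6 * 0.75 = 1.95 normalised-anomaly units by 1994.
```

One caveat on the substitution: `np.interp` clamps outside the anchor range while IDL's `interpol` extrapolates linearly, but since 1881-1994 lies inside the anchors, that difference does not arise here.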

JimB
November 23, 2009 6:18 am

“Methow Ken
…If it looks, walks, flys, swims, and quacks like a duck, theoretically it still COULD be something else. But lacking overwhelming evidence to the contrary, odds are REAL good that it’s a duck. The duck quacked in this case, and their goose is cooked.”
I think it could only be a duck up until 1960. After that, the data had to be modified and it began to smell more like a giraffe, or something… :>)
JimB

Mark
November 23, 2009 6:18 am

Re: Patrik (05:08:57) :
“http://www.cru.uea.ac.uk/ is entirely down right now if you all hadn’t noticed.”
Maybe they are busy shredding documents, deleting files and zeroing out the cleared hard drive space?
I gotta believe that certain people at GISS (and probably other similar organizations) are considering deleting old emails, data, and code.

November 23, 2009 6:20 am

A few more press links:
A better (?) link to Lawson-Watson on BBC:
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373677.stm
The Lawson Times article is now syndicated to The Australian for Tuesday (Oz time):
http://www.theaustralian.com.au/news/opinion/copenhagen-deserves-to-fail/story-e6frg6zo-1225802514603
And The Australian now making the link with the Oz legislation (ETS) debate with a response from the Opposition Senate leader (Minchin)
http://www.theaustralian.com.au/news/features/hot-and-bothered/story-e6frg6z6-1225802504484
While Fairfax helping out with the defence:
http://www.theage.com.au/national/email-scandal-rallies-web-climate-sceptics-20091123-iysr.html

rxc
November 23, 2009 6:29 am

I can think of at least 3 nuclear power plants that were shut down and are now completely inoperable because of problems with their documentation similar to the issues identified in these documents. The nuclear industry is one industry with quite rigorous standards for code validation and verification, and well-established, open international standards for evaluating models against data.
I hope that this episode leads to more investigation of environmental mischaracterization of data and cherry picking, starting with the DDT travesty.

KevinUK
November 23, 2009 6:32 am

Gregg E. (02:58:54) :
“The file HARRY_READ_ME.txt is *very revealing* about just how disorganized CRU’s data and software are. “Harry” is apparently Ian Harris. If he’s the author of that file, it appears from the notes that he’s trying to straighten things out but finding that the data and previous software is a complete charlie foxtrot.
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625&page=13
I’ll call it +1 probability that the author of HARRY_READ_ME.txt is the insider who took this archive out of CRU.”
I think you are almost right in your deduction, Gregg. If you look a bit further into some of the other files, I think it's likely the person providing the comments in HARRY_READ_ME.txt is a 'contract programmer' brought in to assist Ian Harris in sorting out this mess. This 'code monkey' has not shared the goals of the Team and at some point has had enough (perhaps they didn't renew his/her contract at some point before the release). Because he/she still had remote access to the CRU departmental server, he/she decided to assemble all the documents he/she could, take copies of the .eml files of certain staff, and, following the final FOIA brush-off email, release the emails and files to the internet.
If I'm honest, had I been in the same position (i.e. knowing that UEA were deliberately committing a crime by not complying with the FOIA requests and were in the process of attempting to cover up their mess), I'd probably have done the same.

Spartacus
November 23, 2009 6:33 am

debreuil had already noted the "trick" in the code before my last post. Kudos to him!!

John Galt
November 23, 2009 6:35 am

In past musings, some have speculated about scientific/academic fraud and what actually constitutes it. Have we reached that point yet?
