When the CRU emails first made it into news stories, there was an immediate reaction from the head of CRU, Dr. Phil Jones, over this passage in an email:
From a yahoo.com news story:
In one leaked e-mail, the research center’s director, Phil Jones, writes to colleagues about graphs showing climate statistics over the last millennium. He alludes to a technique used by a fellow scientist to “hide the decline” in recent global temperatures. Some evidence appears to show a halt in a rise of global temperatures from about 1960, but is contradicted by other evidence which appears to show a rise in temperatures is continuing.
Jones wrote that, in compiling new data, he had “just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (i.e., from 1981 onwards) and from 1961 for Keith’s to hide the decline,” according to a leaked e-mail, which the author confirmed was genuine.
Dr. Jones responded.
However, Jones denied manipulating evidence and insisted his comment had been taken out of context. “The word ‘trick’ was used here colloquially, as in a clever thing to do. It is ludicrous to suggest that it refers to anything untoward,” he said in a statement Saturday.
OK, fine, but how, Dr. Jones, do you explain this?
There’s also a file of code in the collection of emails and documents from CRU. A commenter named Neal at Climate Audit writes:
People are talking about the emails being smoking guns, but I find the remarks in the code, and the code itself, more of a smoking gun. The code is so hacked around to give predetermined results that it shows the bias of the coder. In other words: make the code ignore inconvenient data to show what I want it to show. The code, after a quick scan, is quite a mess. Anyone with any pride would be too ashamed to let it out for public viewing. As examples of bias, take a look at the following remarks from the MANN code files:
Here’s the code with the comments left by the programmer:
function mkp2correlation,indts,depts,remts,t,filter=filter,refperiod=refperiod,$
datathresh=datathresh
;
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
;
pro maps12,yrstart,doinfill=doinfill
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
and later, the same comment appears again in another routine:
; Plots (1 at a time) yearly maps of calibrated (PCR-infilled or not) MXD
; reconstructions of growing season temperatures. Uses "corrected" MXD - but
; shouldn't usually plot past 1960 because these will be artificially adjusted
; to look closer to the real temperatures.
You can claim an email you wrote years ago was “taken out of context”, but a programmer making notes in the code does so to document what the code is actually doing at that stage, so that anyone who looks at it later can figure out why this function doesn’t plot past 1960. In this case, it is not allowing all of the temperature data to be plotted. Growing-season data (summer months, when the new tree rings are formed) past 1960 is thrown out because “these will be artificially adjusted to look closer to the real temperatures”, which implies some post-processing routine.
Spin that, spin it to the moon if you want. I’ll believe programmer notes over the word of somebody who stands to gain from suggesting there’s nothing “untoward” about it.
Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result.
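To make the programmer’s note concrete, here is a minimal, hypothetical sketch in Python (the leaked routines themselves are IDL) of the kind of post-1960 adjustment and plotting cutoff the comments describe. The series, the blending weight and the cutoff handling below are my own illustration, not anything taken from the CRU files:

import numpy as np

# Hypothetical stand-ins; the real inputs are the MXD reconstructions and
# instrumental temperatures handled by the IDL routines.
years = np.arange(1900, 1995)
mxd_recon = np.random.default_rng(0).normal(size=years.size)
observed_temps = np.random.default_rng(1).normal(size=years.size)

cutoff = 1960
post = years > cutoff

# "Artificially adjusted to look closer to the real temperatures":
# blend the post-1960 proxy values toward the observed series
# (the weight here is arbitrary and purely illustrative).
weight = 0.75
adjusted = mxd_recon.copy()
adjusted[post] = (1 - weight) * mxd_recon[post] + weight * observed_temps[post]

# "Shouldn't usually plot past 1960": drop the adjusted segment before plotting.
plot_years = years[~post]
plot_values = adjusted[~post]

Whatever the actual adjustment routine does, the comments tell us two things: the post-1960 values are no longer the raw proxy data, and the author knew they should not normally be shown.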
For more details, see Mike’s Nature Trick
UPDATE: By way of verification….
The source files with the comments that are the topic of this thread are in this folder of the FOI2009.zip file
/documents/osborn-tree6/mann/oldprog
in the files
maps12.pro
maps15.pro
maps24.pro
The first two files are dated 1/18/2000, and the maps24.pro file 11/10/1999, so it fits the timeline of Dr. Jones’ email mentioning “Mike’s Nature trick”, which is dated 11/16/1999, six days later.
UPDATE2: Commenter Eric at the Climate Audit Mirror site writes:
================
From documents\harris-tree\recon_esper.pro:
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
;
Note the wording here “avoid the decline” versus “hide the decline” in the famous email.
===============
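To make the “stop in 1960” comment concrete, here is a minimal sketch, in Python rather than IDL, of what restricting a calibration regression to pre-1960 data looks like. The series and the fitting step are my own made-up illustration, not the recon_esper.pro code:

import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1880, 1995)
proxy = rng.normal(size=years.size)        # stand-in tree-ring series
temperature = rng.normal(size=years.size)  # stand-in instrumental series

calib = years <= 1960                      # calibration window stops in 1960
slope, intercept = np.polyfit(proxy[calib], temperature[calib], 1)

# The fitted coefficients are then applied to the whole proxy series,
# even though the post-1960 data never constrained the fit.
reconstruction = slope * proxy + intercept

A calibration cutoff is not inherently wrong; the question is whether the choice, and the reason given in the comment, is disclosed when the results are presented.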
I’ll give Dr. Jones and CRU the benefit of the doubt; maybe these are not “untoward” issues, but these things scream for rational explanations. Transparency, and the ability to replicate all of this years ago, would have gone a long way towards either correcting problems or assuaging concerns.
@Philip Bratby: The BBC coverage has reopened…….
It is beyond question that the climate is changing; that man is completely responsible is very definitely not! That is why I am delighted at the revelations of the CRU at the University of East Anglia.
Complicit in this misrepresentation of the science is the BBC in its TV and radio output. For over 3 years I have been trying to elicit answers from both Mark Thompson (Director General) and Sir Michael Lyons (Trust Chairman). All I had received was sophistry and obfuscation, until I engaged the help of my MP.
Recently it came to light that a report had been commissioned in June 2007 jointly by the Trust and BBC Board of Management entitled “From Seesaw to Wagon Wheel: Safeguarding Impartiality in the 21st Century”. It concluded: ‘There may be now a broad scientific consensus that climate change is definitely happening and that it is at least predominantly man-made… the weight of evidence no longer justifies equal space being given to the opponents of the consensus’.
Despite this damning evidence from their own report, they steadfastly cling to the belief that their impartiality is intact as required by the BBC Charter. Such is their state of denial that Sir Michael Lyons has even tried to deliberately mislead my MP despite evidence I have to the contrary.
In light of this I have posed the question, through my MP: “On whose authority did the BBC cease to be an impartial Public Service Broadcaster, as required by its Charter, and become the judge, jury and sponsor of such dangerously specious political dogma so eloquently described as ‘…the consensus…’?”
Answer comes there none! I believe it is time for the BBC to be subjected to an enquiry on this matter.
Also significant: a complete lack of response from the Guardian, which is still peddling the same rubbish: http://www.guardian.co.uk/environment/2009/nov/22/climate-change-emissions-scientist-watson
If anyone missed this part from debreul:
It says important note, but I guess I missed the memo.
\FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,'IMPORTANT NOTE:'
printf,1,'The data after 1960 should not be used. The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'
HAVE BEEN MODIFIED TO LOOK MORE LIKE THE OBSERVED TEMPERATURES
Game over. They will HAVE to call this a fake to keep their jobs.
This is spreading like a bushfire this lunchtime (UK time); the Telegraph newspaper’s website stories on this have crashed its servers.
Either that, or foul play is afoot. I dunno, perhaps I have watched too many episodes of BBC’s “spooks” and can imagine the MI5 geek trying to stop this story spreading round the mainstream media.
All I got from the Telegraph site was:
“Gateway Timeout
The proxy server did not receive a timely response from the upstream server.
Reference #1.cae3554.1258980286.0”
What language are these *.pro files? I am guessing Fortran.
BTW, this was today’s leading story a couple of hours ago; now it is not in their top 5 anymore…
Strange? Not really. I suppose if the weight of traffic crashed that page (the rest of the site is OK), then while the page cannot be accessed the code inside it that counts page views would not be incrementing its count.
reality vs. model
http://i50.tinypic.com/301j8kh.jpg
have fun…!
I keep reading posts by team supporters along the lines of “is that all you’ve got? That’s nothing.”
I think some of the supporters of the team need to be reminded of MM’s denial when John Finn brought up the issue at “Real”Climate.
“No researchers in this field have ever, to our knowledge, “grafted the thermometer record onto” any reconstruction. It is somewhat disappointing to find this specious claim (which we usually find originating from industry-funded climate disinformation websites) appearing in this forum.”
As McIntyre observes, the temperature was not “fully grafted”. However you want to describe it, the temperature record was included to hide the decline. If you go to the “Real”Climate archives you will see that when Finn persisted in his questions, Mann suddenly became too busy to bother with the issue. Maybe he was, but he certainly avoided having to go into the messy details of what was done.
http://www.realclimate.org/index.php/archives/2004/12/myths-vs-fact-regarding-the-hockey-stick/#comment-345
If what was done is not a big deal, why didn’t Finn get a clear answer then and there?
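For readers wondering what “grafting the thermometer record onto” a reconstruction would mean in practice, here is a deliberately simplified, hypothetical Python sketch. It is not Mann’s or CRU’s procedure, just the simplest possible splice-then-smooth operation, for illustration only:

import numpy as np

rng = np.random.default_rng(7)

# Stand-in proxy reconstruction ending in 1980...
recon_years = np.arange(1000, 1981)
recon = rng.normal(size=recon_years.size)

# ...and stand-in instrumental anomalies "from 1981 onwards".
instr_years = np.arange(1981, 1999)
instr = rng.normal(loc=0.3, size=instr_years.size)

# Append the instrumental values to the end of the reconstruction, then
# smooth the combined series, so the end of the curve reflects the appended
# instrumental values rather than the proxies alone.
years = np.concatenate([recon_years, instr_years])
spliced = np.concatenate([recon, instr])
smoothed = np.convolve(spliced, np.ones(11) / 11, mode="same")

Whether the real procedure was a full splice or something subtler, the dispute is over exactly this kind of end-of-series handling and how clearly it was disclosed.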
See “When scientists assume the missionary position”:
http://vulgarmorality.wordpress.com/2009/11/21/when-scientists-assume-the-missionary-position/
“Overall we find that the observed Northern Hemisphere circulation trend is inconsistent with simulated internal variability, and that it is also inconsistent with the simulated response to anthropogenic and natural forcing in eight coupled climate models.”
Hmmm. The observed reality and the simulated ‘reality’ are out of synch. But note that the writer’s instinct is to call the observed reality, not the simulation, ‘inconsistent’.
Nothing necessarily sinister or duplicitous here, but it is language revelatory of certain habits of thinking.
E.M.Smith (23:56:46) :
Wonderful explanation. Everyone should read what you wrote. Perhaps Anthony will make a blog post about this. The details are amazingly enlightening. Keep it up, guys.
For me, as a layperson who has been in financial services marketing and who reads history, the e-mails are proof that these guys knew they were up to no good, and are doing so deliberately.
When you see how aggressive they have been in attacking skeptics, attributing vile motives to them, how they depend on argument from authority, etc., it is clear to me that they have been doing this for a long time.
The code is where they have committed their fraud, and that, fortunately for us who are their victims, cannot be hidden so easily.
Would it not be great if someone in GISS were to have the strength of conscience that this brave person in the UK has demonstrated?
Well, here in Aus on Dateline tonight (ABC) we had some guy (I don’t recall the name, as I caught only the tail end of the broadcast) being very “jittery” in answering questions about the content of some of the e-mails. But then, when talking politics and enjoying the fact that he’d just got back from Singapore and will soon go to Copenhagen (I thought of 400 kg polar bears), he just grinned.
Nice one if you can get it.
Asked RealClimate in what context the quote in this topic should be taken:
http://www.realclimate.org/index.php/archives/2009/11/the-cru-hack-context/
http://www.cru.uea.ac.uk/ is entirely down right now, in case you all hadn’t noticed.
And the Copenhagen propaganda machine continues at full speed.
http://malaysia.news.yahoo.com/ap/20091123/tbs-sci-climate-09-post-kyoto-f8250da.html
Nick Stokes (20:34:52) :
As another poster mentioned, you didn’t give the full quote. It says: “Uses ‘corrected’ MXD - but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
In other words, the data will be ‘artificially’ adjusted so that it is more consistent with HADCRUT3. What do you suppose he meant by ‘artificial’? To me it suggests using something that is not measured data. This seems to be perilously close to fraud.
Chris
Cassandra King (04:12:03) : …see my earlier post @ur momisugly (04:41:24) :
“These scientists at East Anglia are both honourable and world class. Their data is not being manipulated in any bad way”.
In line with AGWThink, it is being manipulated in a *good* way.
After reading your post, I read through some of the code directories and wrote a post on my blog at http://matthewjweaver.com/index.php/about-the-commented-code-released-by-the-alleged-hacker/. In this, I do not reach the same conclusion:
First, these are in a subdirectory called “oldprog” and risk being taken completely out of context. While I do not personally program in Progress, the code is rather easy to read, but these comments do not tell the whole story. I’ve now looked through more than a dozen code files and most look rather innocuous. What I wonder about is the data input and the weighting assigned to the data as it is processed by the code. There are some data files, but I’m guessing most are missing, of course, and the weighting, while in the code, is not as easy to discern and judge.
Among the data in the code directories are limited tree-ring data, temperatures, and more. Maybe summaries, sample files, or results? I do not have the time to compare this data with external datasets to see what is real and what is made up. Nor do I have time to read the code to not only spot the weighting and calculations but to interpret their impact. What I do wonder about as I read through the files is how and why specific ranges were chosen for normalization of data and averaging. Consider, too, how missing data was filled in or ignored, and, as well, the impact of solar cycles and other influences.
The bottom line is that all of this is ripe for manipulation to achieve whatever results are desired. Which gets us back to the email. These researchers were Kool-Aid drinkers for the religion of global warming. They have a vested interest in producing results to support their cause. The email makes this abundantly clear and shows their willingness to modify data, ignore inconvenient data, destroy data, and actively prevent independent analysis of their data.
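On the narrower point about how ranges are chosen for normalization: here is a small, made-up Python example of why the choice of reference period matters when data are expressed as anomalies. The series and the periods are arbitrary and purely illustrative:

import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1900, 2000)
series = np.cumsum(rng.normal(size=years.size))   # made-up series with drift

def anomalies(values, yrs, start, end):
    # Express the series relative to its mean over the chosen reference period.
    ref = (yrs >= start) & (yrs <= end)
    return values - values[ref].mean()

a = anomalies(series, years, 1961, 1990)   # one common choice of baseline
b = anomalies(series, years, 1900, 1930)   # an earlier baseline
print(a[-1] - b[-1])   # same data, different baseline, different final anomaly

The shape of the curve is unchanged, but the headline numbers depend on choices like this, which is exactly why they need to be documented and justified.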
” Yes, yes….science SHOULD rule (am in total agreement there).”
Science should rule what?
E.M. Smith’s long post on the plight of Harry is spot on. As a long-time software engineer, I’ve had to wade into poorly written, poorly documented code that was written by people who are no longer around. Every ‘Harry’ in this kind of situation has a job to do: figure out what the code does well enough to get it to run, probably with some additional inputs or new requirements that necessitate changing the code some, if it can be figured out at all. ‘Harry’ is usually given a very tight deadline, so he doesn’t have the time or the approval to just start over from scratch. And ‘Harry’ certainly was not brought in because he knows climate; he has to pick up tidbits along the way to help him guess whether what he is writing makes any sense, and when it doesn’t he will just try his best to make it do what the bigwigs say it should do. They define the requirements, change them every few days or weeks, and define what “correct” means for Harry’s programs. (Harry didn’t write them, and never would have done it that way, but they are his now, for better or worse. Just gotta try to meet those deadlines…)
But like many programmers in this situation, Harry prefers black and white, correct versus incorrect, so when he sees something that is particularly messed up he will sometimes add colorful commentary to his notes; then, when he reads it again next year, he’ll remember that he already figured out it was messed up and won’t agonize in another futile effort to make it make sense. He’s not even expecting anyone other than another ‘Harry’ to read those comments, and the next ‘Harry’ will appreciate them and share the chuckle at the crap they are made to debug and run.
If it sounds like Dilbert, it’s because what Scott Adams pokes fun at is how stuff REALLY happens. Fortunately for us, Harry did what he did, and it will probably help some of us who are looking at the programs and hoping to get some of them running ourselves. Fortunately, GDL is free and is supposed to run IDL programs as-is; I haven’t finished setting it up myself (had to get coLinux running first), but it will be very interesting. Then there’s the Fortran stuff, which may be trickier to run as-is, since a compatible Fortran compiler is unlikely to turn up; it might have to be ported to C++ or C#. If anyone else decides to have a go at porting some of this to modern languages, perhaps we can collaborate.
Thank you, Harry, for all the clues! (whoever you are)
I work in programming, so I had a look through Harry_readme. I was pretty appalled, but when I figured out they were talking about how to extract the data from HadCRUT2.1 to get to version 3.0, I was shocked. If that is what this is, then GCMs have major problems.
As the developer trying to sort out the mess said:
“So, we can have a proper result, but only by including a load of garbage!”
The data has clearly been manipulated to get the results they wanted because:
1. Poor data management meant they hadn’t got the original data.
2. Poor programming techniques meant they had no idea how significant amounts of data in 2.1 were generated.
3. Lack of documentation in the original code meant they did not know why data had been manipulated.
The data was changed to match the expected outputs. None of the papers dependent on HadCRUT3 are reliable, as HadCRUT3 itself is unreliable.
Normally in software development we do a thing called “code review”, where developers review each other’s code. It can be fun, but it is a blood sport. The stuff described here (even the first of the many, many manipulations) would have you laughed out of the room and demoted to business software tester.
The code for all models needs to be released and reviewed, along with the source data, the reasons it is being transformed, and the calculation used for each transformation.
We cannot agree to implement Copenhagen without this. If everything is on the up and up and makes perfect sense, then we go ahead; otherwise, more proof of the forcings (+ and -) and of the CO2 impacts is going to be required.
“This website is currently being served from the CRU Emergency Webserver.
Some pages may be out of date.
Normal service will be resumed as soon as possible. ”
LOL
http://www.cru.uea.ac.uk/
Mr Delingpole writes again
http://blogs.telegraph.co.uk/news/jamesdelingpole/100017546/climategate-why-it-matters/
Alec J (23:59:14)
Thanks for the heads-up. I just listened to the short interview on the Radio 4 Today programme.
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373594.stm
Bob Watson stayed right on ‘message’, as you’d expect him to, given that his attempt to become Chairman of the IPCC failed. He clearly hasn’t spent any time whatsoever looking into the details of what has been released in the emails and data files.
Nigel Lawson has quite rightly called for the NERC (who fund CRU, the Tyndall Centre, etc.) and the Vice Chancellor of UEA to set up an independent enquiry into the contents of the released emails and documents. I fully agree with this but don’t think it is ever likely to happen. There is evidence in at least one of the released emails that the VC of UEA has even been in support of CRU blocking the FOIA requests. CRU have even received advice from the ICO on how to block the requests. It’s very clear that neither the NERC nor the university’s administration is going to be independent in investigating this matter. Are they?