When the CRU emails first made it into news stories, there was an immediate reaction from the head of CRU, Dr. Phil Jones, over this passage in an email:
From a yahoo.com news story:
In one leaked e-mail, the research center’s director, Phil Jones, writes to colleagues about graphs showing climate statistics over the last millennium. He alludes to a technique used by a fellow scientist to “hide the decline” in recent global temperatures. Some evidence appears to show a halt in a rise of global temperatures from about 1960, but is contradicted by other evidence which appears to show a rise in temperatures is continuing.
Jones wrote that, in compiling new data, he had “just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (i.e., from 1981 onwards) and from 1961 for Keith’s to hide the decline,” according to a leaked e-mail, which the author confirmed was genuine.
Dr. Jones responded.
However, Jones denied manipulating evidence and insisted his comment had been taken out of context. “The word ‘trick’ was used here colloquially, as in a clever thing to do. It is ludicrous to suggest that it refers to anything untoward,” he said in a statement Saturday.
OK, fine, but how, Dr. Jones, do you explain this?
There is also a file of code in the collection of emails and documents from CRU. A commenter named Neal on Climate Audit writes:
People are talking about the emails being smoking guns, but I find the remarks in the code, and the code itself, more of a smoking gun. The code is so hacked around to give predetermined results that it shows the bias of the coder. In other words: make the code ignore inconvenient data to show what I want it to show. The code, after a quick scan, is quite a mess. Anyone with any pride would be too ashamed to let it out for public viewing. As examples [of] bias, take a look at the following remarks from the MANN code files:
Here’s the code with the comments left by the programmer:
function mkp2correlation,indts,depts,remts,t,filter=filter,refperiod=refperiod,$
datathresh=datathresh
;
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
;
pro maps12,yrstart,doinfill=doinfill
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
and later, the same comment appears again in another routine:
; Plots (1 at a time) yearly maps of calibrated (PCR-infilled or not) MXD
; reconstructions of growing season temperatures. Uses “corrected” MXD – but
; shouldn’t usually plot past 1960 because these will be artificially adjusted
; to look closer to the real temperatures.
You can claim an email you wrote years ago isn’t accurate by saying it was “taken out of context”, but a programmer leaves notes in code to document what the code is actually doing at that stage, so that anyone who looks at it later can figure out why this function doesn’t plot past 1960. In this case, it does not allow all of the temperature data to be plotted. Growing-season data (the summer months, when new tree rings are formed) past 1960 is thrown out because “these will be artificially adjusted to look closer to the real temperatures”, which implies some post-processing routine.
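Stripped of the IDL specifics, the operation the comment describes is simple. The following is a minimal Python sketch with made-up years and MXD values, not the CRU code: the proxy series is simply cut off at a cutoff year before it ever reaches the plotting routine.

```python
# Minimal sketch (hypothetical data, not the CRU IDL code): a proxy series
# is cut off at 1960, so post-1960 values never reach the plot.
def truncate_proxy(years, values, cutoff=1960):
    """Keep only the (year, value) pairs at or before the cutoff year."""
    return [(y, v) for y, v in zip(years, values) if y <= cutoff]

years = [1940, 1950, 1960, 1970, 1980]
mxd = [0.1, 0.2, 0.15, -0.3, -0.5]    # made-up densities; note the post-1960 decline
plotted = truncate_proxy(years, mxd)  # -> [(1940, 0.1), (1950, 0.2), (1960, 0.15)]
```

Nothing is actually plotted here; the point is only that the data handed onward ends in 1960, so the decline never appears on the map.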
Spin that; spin it to the moon if you want. I’ll believe a programmer’s notes over the word of somebody who stands to gain by suggesting there’s nothing “untoward” about it.
Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result.
For more details, see Mike’s Nature Trick
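For readers unfamiliar with what the email describes, “adding in the real temps to each series” from a given year onward can be reduced to a few lines. This is a hedged Python illustration with invented numbers, not the actual method or data:

```python
# Hedged sketch (hypothetical numbers, not the actual method or data) of the
# operation the email describes: from a given year onward, the series carries
# the instrumental temperature instead of the proxy value.
def splice(years, proxy, instrumental, from_year):
    """Replace proxy values with instrumental values for years >= from_year."""
    return [instrumental[y] if y >= from_year else p
            for y, p in zip(years, proxy)]

years = [1979, 1980, 1981, 1982]
proxy = [0.1, 0.0, -0.2, -0.4]   # made-up proxy values that decline
instr = {1981: 0.3, 1982: 0.35}  # made-up instrumental temperatures
spliced = splice(years, proxy, instr, 1981)  # -> [0.1, 0.0, 0.3, 0.35]
```

The only point of the sketch is that the combined series no longer shows the proxy’s post-1981 behavior.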
UPDATE: By way of verification….
The source files with the comments that are the topic of this thread are in this folder of the FOI2009.zip file
/documents/osborn-tree6/mann/oldprog
in the files
maps12.pro
maps15.pro
maps24.pro
The first two files are dated 1/18/2000 and the maps24 file 11/10/1999, so it fits, timeline-wise, with Dr. Jones’s email mentioning “Mike’s Nature trick”, which is dated 11/16/1999, six days later.
UPDATE2: Commenter Eric at the Climate Audit Mirror site writes:
================
From documents\harris-tree\recon_esper.pro:
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
;
Note the wording here “avoid the decline” versus “hide the decline” in the famous email.
===============
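To see why the calibration period matters, here is a hedged sketch of what “stop in 1960” means for a regression calibration: the proxy-to-temperature fit uses only pre-1960 years, so any later divergence never enters the fitted relationship. The data, names, and hand-rolled least-squares fit below are illustrative assumptions, not the recon_esper.pro code:

```python
# Hedged sketch (invented data, not the recon_esper.pro code): an ordinary
# least-squares calibration of proxy against temperature that uses only
# years before a stop year, so later "divergent" years never enter the fit.
def calibrate(years, proxy, temps, stop_year=1960):
    """Fit temps = slope * proxy + intercept using only years < stop_year."""
    pairs = [(p, t) for y, p, t in zip(years, proxy, temps) if y < stop_year]
    n = len(pairs)
    mean_p = sum(p for p, _ in pairs) / n
    mean_t = sum(t for _, t in pairs) / n
    slope = (sum((p - mean_p) * (t - mean_t) for p, t in pairs)
             / sum((p - mean_p) ** 2 for p, _ in pairs))
    return slope, mean_t - slope * mean_p

# Only 1940 and 1950 inform the fit; 1970 and 1980 are ignored.
years = [1940, 1950, 1970, 1980]
proxy = [1.0, 2.0, 1.5, 1.0]      # made-up proxy values
temps = [10.0, 12.0, 13.0, 14.0]  # made-up temperatures
slope, intercept = calibrate(years, proxy, temps)  # -> (2.0, 8.0)
```

The fitted slope and intercept are then typically applied to the whole reconstruction, including the very years that were excluded from the fit.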
I’ll give Dr. Jones and CRU the benefit of the doubt; maybe these are not “untoward” issues, but these things scream for rational explanations. Transparency, and the ability to replicate all this years ago, would have gone a long way toward correcting problems or assuaging concerns.
The guys in this link are financial modelers and have been running through the CRU code and other data for the last couple of days.
Put your dark glasses on for some of the rather [extremely] descriptive language describing the standards of coding and the modelers’ capabilities at CRU.
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625
I am afraid to say that I have not seen, read, or heard one skerrick about the CRU email hacking from our tax-funded national broadcaster.
Today, instead, you can learn that the Antarctic ice sheet is losing mass:
http://www.abc.net.au/news/stories/2009/11/23/2750931.htm?site=news
This selective reporting, running only stories that support staff-driven agendas while neglecting those that do not, is itself a side story as scandalous as the disclosure of CRU’s modus operandi.
Nick Stokes (20:34:52) : 22/11
Frankly, since you asked, you are being dense when you write “I may be dense here, but what’s the issue? The red comment says “don’t plot beyond 1960″, because the results are unreliable. ”
The full quote in the programmer’s code is –
“Uses “corrected” MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
There’s no mention of “unreliable”.
There is mention of “artificially adjusted to look closer….”
Why not leave off inventing excuses for a while and post what you think is the most harmful part of the hacked information, from the point of view of proper scientific conduct?
I found this in one of the emails – does this mean anything for the post-1960 data?
“One other thing – MOHC are also revising the 1961-90 normals.”
(MOHC is the Met Office Hadley Centre.) Does “revising the normals” mean “inventing the data”? I’m just beginning to wonder. There seems to be a discussion of fudging numbers in the email.
http://www.anelegantchaos.org/emails.php?eid=1017&filename=1254147614.txt
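For context on the term: a “normal” is just the mean over a baseline period (here 1961-90), and an “anomaly” is the departure from it. A small hypothetical sketch shows that revising the normals shifts every anomaly by the same constant; it does not, by itself, alter the shape or trend of the series. The numbers below are made up:

```python
# Hedged sketch (made-up numbers): "normals" as the 1961-90 baseline mean,
# anomalies as departures from it. Changing the baseline shifts every anomaly
# by the same constant and leaves year-to-year differences untouched.
def anomalies(data, base_start=1961, base_end=1990):
    """Return {year: temp - baseline mean} for a {year: temp} mapping."""
    base = [t for y, t in data.items() if base_start <= y <= base_end]
    normal = sum(base) / len(base)
    return {y: t - normal for y, t in data.items()}

data = {1961: 10.0, 1990: 12.0, 2000: 13.0}  # hypothetical temperatures
anoms = anomalies(data)  # baseline mean 11.0 -> {1961: -1.0, 1990: 1.0, 2000: 2.0}
```

Whether the Hadley Centre’s actual revision was this benign is exactly the question the commenter is asking; the sketch only shows what the jargon means.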
>>Gene Nemetz (22:41:33) :
>>>They feel they can’t go down. The Titanic couldn’t sink too.
No, no, no.
What they can really feel is rapidly shrinking and withdrawn government grants. The cold winds of economic reality will soon be blowing through the sparse remnants of Yamal tree studies. It will be back to climate tokenism, just for the political ‘green-sheen’.
>>Another strange happening at Hadley … all the hadcrut3 data for this year, except Jan/Feb, has been deleted.
>>>>But of course they’d delete it, wouldn’t want the very inconveniently record cold Northern Hemisphere October 2009 muddying their “warming”.
Gregg,
Recent months actually showed an increase in temperature in hadcrut3. Perhaps it was realised by Jones et al. that the data did not reflect reality?
TIM
You are obviously one of those programmers I used to sack.
Comments were NEVER designed for the purpose you describe, you ;;;
I can’t wait until HadCRUT itself is debunked. We know that many stations are bad, but there is no doubt some “added value” algorithm adds 0.15 °C here or there.
What will be left if the historical and present temperature records are blown away? Just pre-programmed PlayStation climate models.
Cheers
Phil
You delusionals are quite funny.
To build your case with this data you have to perform a couple of ‘tricks’ of your own.
1. Make the unstated assumption that the paleoclimate data is the whole of the co2 forced global warming story.
2. Home in on increasingly small parts of the story and ignore the big picture. You have to do this because the big picture doesn’t support your position at all.
3. Fail to respond to counter arguments from the originators of the research.
Vested interests and delusionals are leading the ignorant here. Now that you are starting to lose your grip on policy makers internationally, it’s becoming amusing to watch you, if we’ve got the time, that is.
There is not much difference between the CRU and UAH/RSS datasets:
CRU/UAH diff:
http://www.gfspl.rootnode.net/BLOG/wp-content/uploads/2009/11/cruuah_trend.png
So it’s not warming at all. The theory of the gnomes having reduced the freezing point of water is true after all 🙂
Policyguy (21:21:18): The 15% of the earth’s surface where trees do grow are in those locations where they may also be impacted by lack of light (other trees) or lack of water (drought), to the extent that temperature cannot be isolated as a cause of growth during any period. It is a false measure.
And don’t forget my favorite “confounder”: bear poo. A recent (peer-reviewed) article looked at bears eating salmon and, well, doing what a bear with a full tummy eventually does in the forest. Despite all the times folks ask, “Does a bear, um, poo in the woods?”, the answer is: they do.
Well, said bears account for as much deposited fertilizer as would be applied in a commercial tree farm. So depending on the size of the salmon run, the bear population, and the distance to the streams with said salmon in any given year (i.e., rainfall and stream flow matter), trees will get more or less growth depending on the “Poo Profile”…
To calibrate your trees, you have to have calibrated your poo deposition over time…
Jesse (21:24:11) : Once again, you guys are making mountains out of ant hills.
Ever found yourself standing on a fire ant hill? Visit Texas… barefoot… Do not underestimate the impact of an “ant hill”.
This is just normal data processing
No Way. Not even close. Nothing about this code matches ANYTHING I would class as “normal data processing”. (And yes, I’m a “professional” at it).
per the 1960 divergence problem as shown by NUMEROUS sources.
If you have 1960 divergence, then you could have 1860 divergence, and 1760, and 1260 and … Toy Broke.
This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists.
From what I’ve seen, the climate “uninformed amateurs” know more about this than the “real scientists”. I, for one, can write code better than anything these “real scientists” have done, and I have a much better grasp of software QA. Or would you agree that these uninformed amateur programmers ought to leave the job to “real programmers”? Hmmm?
Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
Well, it would seem your level of expertise is that of “wrassling coach”… Or would that be “writer for Jerry Springer”?
REPLY: You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A
OK, I’ll follow that lead:
Highest ranks attained: Corporate President. Board Member (one public, one private).
Favorite rank: Director of Information Services.
Paper Trail: IPCC CDP, State of California Community College Lifetime Teaching Credential Data Processing & related. Bachelors and more.
Education: Through about 20 Units of masters level (needed for credential, plus MBA work). More “commercial classes” than I can track. 12 Units of doctoral level (it’s a long story involving money and time…)
Experience includes several years teaching I.T. at a local silicon valley college.
Over 30 years professional data processing experience in the private sector including managing a Cray supercomputer site (and building out same) and including being a professional programmer for 2 decades+ including being a DBA and being a professional consultant on mainframe databases. FBI check passed to work in securities field and employed in same at stock brokerages and Federal Reserve Bank. Taught computer forensics. Conducted security audits.
Managed Software QA for a compiler tool chain company.
Managed Software build for a network router appliance company (Build Master) with a commercial product.
Produced and shipped production software and documentation through many product cycles.
and a whole lot more…
So I think my “software professional” status stacks up pretty well against your “real scientists” who are amateur programmers. Perhaps they ought to leave programming to the “real computer scientists” and go back to their day jobs as custodians… no, wait, they lost their data, so they are not qualified to be custodians of things…
It’s just a matter of time before Mr. Jones and Mr. Mann, and perhaps more, will be moving on.
For those wondering about The Troubles With Harry, here is his picture:
http://www.cru.uea.ac.uk/cru/people/photo/harry.jpg
Looks like a nice enough guy. I’d like to buy him a beer for his efforts with The Code.
I wonder whether the Harry Readme file coincides with a new release of the HadCRUT algorithm. As I understand it, every few years a new version of the software/algorithm/data is released, and it typically shows more warming than the last one. Could it therefore be plausibly claimed that the Harry Readme invalidates a particular HadCRUT release?
Of course that is not to say that the previous one worked okay either.
“Kevin Trenberth, of the US National Center for Atmospheric Research (NCAR) in Colorado, whose e-mails were among those accessed, said the timing of the hacking was “not a coincidence”.
He told the Associated Press News agency 102 of his emails had been posted on the internet and he felt “violated”.
Critics say the e-mails show that scientists have distorted the facts of climate change, but Mr Trenberth said the e-mails had been “taken out of context”.
The above is the last paragraph from http://news.bbc.co.uk/1/hi/world/europe/8373551.stm
So, he feels violated, eh? Well, sport, we feel we have been conned by you and your deceitful cronies, and you all richly deserve your collective fates. What are the future employment prospects for people like you? Nil, if there is any justice.
E.M.Smith (00:47:58) :
Please be advised that bears poo in the woods for only about 5 months. During their hibernation, bears neither defecate nor urinate. This would normally mean that nitrogenous wastes during that time would poison the urinary system. However, it does not. The bear solves its nitrogenous waste problem by a form of recycling. “The hibernating bear’s body diverts nitrogen from pathways that synthesise urea into pathways that generate amino acids and new proteins. And it does this by using glycerol (produced when fats are metabolized) and recycled nitrogen as the building blocks,” according to New Scientist magazine of February 1985.
Another MSM article. The Mail dropped their previous article over the weekend but I found this one on their front page again today.
@ur momisugly Jesse (21:24:11) :
Although my title is actually “Janitor”, I do play a computer scientist on TV (almost 30 years). On TV, I have worked for the DOD, Navy, DHS, software companies, law firms, accounting firms, manufacturers….
But, back to reality, gotta go wash some windows…
[/sarc]
“Lord Lawson calls for public inquiry into UEA global warming data ‘manipulation'”
on BBC radio reported by the telegraph:
http://www.telegraph.co.uk/earth/environment/globalwarming/6634282/Lord-Lawson-calls-for-public-inquiry-into-UEA-global-warming-data-manipulation.html
TRICK or TREAT
Just in case no one has posted it above: it’s hit the Daily Telegraph mainstream also.
http://www.telegraph.co.uk/earth/environment/globalwarming/6634282/Lord-Lawson-calls-for-public-inquiry-into-UEA-global-warming-data-manipulation.html
Roger Harrabin’s Notes: E-mail arguments on BBC
http://news.bbc.co.uk/1/hi/sci/tech/8371597.stm
Note that he has plenty of CRU contacts to assist him in putting this revelation in “perspective” but to be fair he is also quoting Myron Ebell, a climate sceptic from the Competitive Enterprise Institute.
No sunspots today.
But if you want, I’m sure I could produce some.
To calibrate your trees, you have to have calibrated your poo deposition over time…
[snip, oh come on ~ ctm]
🙂
A good article in The Times this morning from Lord Lawson about Copenhagen. It mentions the CRU leaks as well.
I am absolutely loving this! He he!!! The foul stench of bovine faecal contaminated science from the CRU is stomach-turning. Of course they’ll point to the recent bad storms in Cumbria as hard evidence of Climate Change, despite better examples in the past over on the An Englishman’s Castle website.
I think the late, great Sir Walter Scott summed it all up rather well: “Oh what a tangled web we weave, when first we practise to deceive”!
Never, never, never give up! (Sir Winston Churchill).