When the CRU emails first made it into news stories, there was an immediate reaction from the head of CRU, Dr. Phil Jones, over this passage in an email:
From a yahoo.com news story:
In one leaked e-mail, the research center’s director, Phil Jones, writes to colleagues about graphs showing climate statistics over the last millennium. He alludes to a technique used by a fellow scientist to “hide the decline” in recent global temperatures. Some evidence appears to show a halt in a rise of global temperatures from about 1960, but is contradicted by other evidence which appears to show a rise in temperatures is continuing.
Jones wrote that, in compiling new data, he had “just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (i.e., from 1981 onwards) and from 1961 for Keith’s to hide the decline,” according to a leaked e-mail, which the author confirmed was genuine.
Dr. Jones responded.
However, Jones denied manipulating evidence and insisted his comment had been taken out of context. “The word ‘trick’ was used here colloquially, as in a clever thing to do. It is ludicrous to suggest that it refers to anything untoward,” he said in a statement Saturday.
OK, fine. But how, Dr. Jones, do you explain this?
There is also a file of code in the collection of emails and documents from CRU. A commenter named Neal on Climate Audit writes:
People are talking about the emails being smoking guns, but I find the remarks in the code, and the code itself, more of a smoking gun. The code is so hacked around to give predetermined results that it shows the bias of the coder; in other words, make the code ignore inconvenient data to show what I want it to show. The code, after a quick scan, is quite a mess. Anyone with any pride would be too ashamed to let it out for public viewing. As examples [of] bias, take a look at the following remarks from the MANN code files:
Here’s the code with the comments left by the programmer:
function mkp2correlation,indts,depts,remts,t,filter=filter,refperiod=refperiod,$
datathresh=datathresh
;
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
;
pro maps12,yrstart,doinfill=doinfill
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
and later the same programming comment again in another routine:
; Plots (1 at a time) yearly maps of calibrated (PCR-infilled or not) MXD
; reconstructions of growing season temperatures. Uses “corrected” MXD – but
; shouldn’t usually plot past 1960 because these will be artificially adjusted
; to look closer to the real temperatures.
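For readers curious about the technique the mkp2correlation header describes (regress the confounding series out of both inputs, then correlate the residuals): this is a standard way to compute a partial correlation. A minimal Python sketch of the idea, not CRU's code, with illustrative names:

```python
import numpy as np

def partial_correlation(indts, depts, remts):
    """Correlate indts and depts after removing, by multiple linear
    regression, the influence of the series in remts (nseries x ntime),
    mirroring the workaround described in the IDL comment."""
    remts = np.atleast_2d(np.asarray(remts, float))
    # design matrix: intercept column plus each series to be held constant
    X = np.column_stack([np.ones(remts.shape[1]), remts.T])

    def resid(y):
        beta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
        return np.asarray(y, float) - X @ beta

    # ordinary Pearson correlation of the two residual series
    return np.corrcoef(resid(indts), resid(depts))[0, 1]
```

If two series correlate only because both track a common third series, the partial correlation (with that series regressed out) collapses toward zero while the raw correlation stays high.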
You can claim an email you wrote years ago was “taken out of context”, but a programmer makes notes in code to document what the code is actually doing at that stage, so that anyone who looks at it later can figure out why this function doesn’t plot past 1960. In this case, it is not allowing all of the temperature data to be plotted: growing-season data (summer months, when the new tree rings are formed) past 1960 is thrown out because “these will be artificially adjusted to look closer to the real temperatures”, which implies some post-processing routine.
Spin that, spin it to the moon if you want. I’ll believe programmer notes over the word of somebody who stands to gain from suggesting there’s nothing “untoward” about it.
Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result.
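To make concrete what is being objected to, here is a purely hypothetical sketch (invented names, not CRU's code) of what "shouldn't plot past 1960" amounts to in practice: proxy values after a cutoff year are swapped for the instrumental record, so any post-1960 divergence in the proxy never appears in the plot.

```python
CUTOFF = 1960  # hypothetical cutoff year, per the comments in maps12.pro

def plot_series(years, proxy, instrumental):
    """Hypothetical illustration only: past the cutoff, the proxy value is
    discarded and the instrumental reading is shown instead, so a
    post-cutoff decline in the proxy is invisible in the output."""
    return [(yr, p if yr <= CUTOFF else t)
            for yr, p, t in zip(years, proxy, instrumental)]
```

Run on a toy series where the proxy declines after 1960 while the instrumental record rises, the plotted values show only the rise.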
For more details, see Mike’s Nature Trick
UPDATE: By way of verification….
The source files with the comments that are the topic of this thread are in this folder of the FOI2009.zip file
/documents/osborn-tree6/mann/oldprog
in the files
maps12.pro
maps15.pro
maps24.pro
The first two files are dated 1/18/2000, and the maps24 file 11/10/1999, so it fits timeline-wise with Dr. Jones’ email mentioning “Mike’s Nature trick”, which is dated 11/16/1999, six days later.
UPDATE2: Commenter Eric at the Climate Audit Mirror site writes:
================
From documents\harris-tree\recon_esper.pro:
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
;
Note the wording here “avoid the decline” versus “hide the decline” in the famous email.
===============
I’ll give Dr. Jones and CRU the benefit of the doubt; maybe these are not “untoward” issues. But these things scream for rational explanations. Transparency, and the ability to replicate all this years ago, would have gone a long way towards either correcting problems or assuaging concerns.
Joanie, take a look at this post. Bristlecone pine are favorite trees of a lot of botanists and other biologists — just finding them in their native habitat is a spiritual experience. The post carries a lot of information about how dendrochronology works, and it may shed some light for you on how and why the data can be trusted, generally.
REPLY: Sorry Ed, you are 100% wrong. Go pander this “spiritual experience” crap in your own bathtub. Trees do not make thermometers in all instances. Liebig’s law says that plant and tree growth is limited by the least available growth factor. In some years it may be temperature, with a warm summer spurring growth, but in others it can be lack of water or lack of nutrients; and for long-lived species like Bristlecone pine, yes, even years of low CO2 can be a limiting growth factor. Bristlecones tend to frequent high and dry climates, and water can be just as influential as temperature.
Without knowing which of the factors was in shortest supply in a year, or stretch of years, you can’t make a blanket claim that tree ring widths reflect only the temperature record. Untangling it all is not possible without the metadata, and we don’t have it in most cases. – Anthony
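Liebig's law can be stated in one line: growth tracks the scarcest factor that year, not temperature alone. A toy illustration (not a real dendroclimatology model; the inputs and scaling are invented):

```python
def ring_width(temperature, water, nutrients, co2):
    """Toy Liebig's-law model: growth is limited by whichever factor is
    scarcest that year (all inputs scaled 0..1). A wide ring therefore
    tells you the minimum factor was high, not that temperature was."""
    return min(temperature, water, nutrients, co2)
```

A warm-but-dry year (temperature 0.9, water 0.3) produces the same ring as a cool-but-wet one (0.3, 0.9), which is exactly why ring width alone cannot be read as a thermometer without the metadata.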
Oops. I meant, look at this post:
http://www.realclimate.org/index.php/archives/2009/11/a-treeline-story/
rbateman (16:14:26) :
I see evidence that they have gotten to a lot of the data at the sources.
Comparing data sets with provenance to ascertain whether it is original or altered is now not an option, but a necessity. They are banking on nobody knowing how far they went.
I am hoping that others will dig far enough into the climate data to see what I see.
DO NOT under any circumstances assume that the climate data you are looking at is 100% genuine.
Check it against other sources.
Totally agree. I was looking more at the reconstruction of a better temperature record rather than the ‘evidence against CRU’ angle.
The main reason CRU has resisted releasing the data and code is that we would then have been able to see what a mess it is. This is just as damning as evidence of falsification in my view.
philincalifornia (19:47:48) : It’s perhaps instructive to rehash the essay(s) by co-conspirator Grant “Tamino” Foster on how the HadCru data correlated well with GISS. … then When the cluster**** that is HadCru can be fully determined, what’s the statistical chance of GISS being such a closely correlated cluster**** by coincidence ??
No coincidence at all… The “magic sauce” is GHCN. As is admitted in the emails, the CRUt series depends heavily on GHCN. GIStemp depends heavily on GHCN. NOAA (with a NASA data set “manager”) produces GHCN.
All the thermometer location “cooking” that was done to GHCN (moving from the mountains to the sea, moving from the poles to the equator) is reflected in both Hadley CRUt and GIStemp. Same Garbage In, Same Garbage Out.
From:
http://chiefio.wordpress.com/2009/11/21/hadley-hack-and-cru-crud/
Comment by Prof. Phil Jones
http://www.cru.uea.ac.uk/cru/people/pjones/ , Director, Climatic
Research Unit (CRU), and Professor, School of Environmental Sciences,
University of East Anglia, Norwich, UK:
No one, it seems, cares to read what we put up
http://www.cru.uea.ac.uk/cru/data/temperature/ on the CRU web
page. These people just make up motives for what we might or might
not have done.
Almost all the data we have in the CRU archive is exactly the same
as in the Global Historical Climatology Network (GHCN) archive used
by the NOAA National Climatic Data Center.
And just who owns that NOAA dataset? Who is “The Data Set Manager”? What I could find looks like a guy at NASA. From:
http://chiefio.wordpress.com/2009/10/24/ghcn-california-on-the-beach-who-needs-snow/
down in the comments:
e.m.smith
It took a while to find, but I think I found “who owns GHCN” and “who manages it”.
From: http://gcmd.nasa.gov/records/GCMD_GA_CLIM_GHCN.html
We find that:
GHCN data is produced jointly by the National Climatic
Data Center, Arizona State University, and the Carbon Dioxide
Information Analysis Center at Oak Ridge National Laboratory.
The NCDC is a part of NOAA. So I’m not seeing NASA on this list. But…
It goes on to say:
Personnel
SCOTT A. RITZ
Role: DIF AUTHOR
Phone: 301-614-5126
Fax: 301-614-5268
Email: Scott.A.Ritz at nasa.gov
Contact Address:
NASA Goddard Space Flight Center
Global Change Master Directory
City: Greenbelt
Province or State: Maryland
Postal Code: 20771
Country: USA
So it looks to me like it has NASA staff assigned, part of Goddard (though it isn’t clear to me if G. Space Flight Center and G.I.S.S. are siblings or if one is a parent of the other; I suspect GSFC is an underling to GISS). That would have Scott Ritz reporting to Hansen, IFF I have this figured out… (And all that personal data is at the other end of the link anyway, so I’m not publishing any private data NASA has not already published.)
…
It’s looking to me like GISS has their fingerprints all over the GHCN deletions, with NOAA either as patsy or passive cooperator.
And as you so aptly put it:
Over to you Senator Inhofe ……..
I wonder how many of these bad code offsets they’ll need…And whether their income from Carbon offsets will be enough to cover the costs involved?
http://thedailywtf.com/Articles/Introducing-Bad-Code-Offsets.aspx
Fred Lightfoot (03:29:03) :
E.M.Smith (00.47.58)
Still got access to that Cray ? wishful thinking.
No, but about 5 years ago I saw it up for sale for something like $1000 (and one of the guys on our project was thinking of buying it…)
But you can get the same processing power now in an older Macintosh for under $1000 and it doesn’t need the 750 kVA power feed nor the 16 x 16 foot water tower for cooling 8-0
My first Cray (gosh I like the sound of that… first Cray ..) was an XMP-48 and that means a 4 processor box with 8 megawords (of 64 bit words) of memory. It was about a 400 Mega FLOP box. (You could vary the speed of the clock raising it until you were going as fast as possible, or dialing it back if the error rate started to rise… so performance is “about”).
Modern PC class machines can do a fair number of Floating Point Operations and have Ghz clock rates, so even if it was 10 clocks per FLOP you would still have “100+ FLOPS” per CPU, and you can pack multiple CPUs in a box. Add in GB of memory (compared with 64 MB… even if very fast…) and you see where this is going…
Moore’s Law. Every 18 months, a double. And it’s been a lot of 18 months…
So I now have more “compute power” in the laptop I’m typing this on, and in the cluster of older PCs scattered around the house that I could turn into a Beowulf should I ever want to, than was in that old Cray. (Or even in the Y-MP that replaced it, or the EL, or…)
FWIW, one “node” of my Beowulf is presently doing the GIStemp builds / testing. Another is my main server / home machine (but spends most of its time turned off). 2 more nodes sit in the garage, waiting for me to want them. 2 were turned into “parts” and another one went to the recycle bin. I just never could find anything that needed the compute power… (Google “Stone Soupercomputer” for an interesting story of how you, too, can have a supercomputer for free. Just wait for the next Microsoft update, as folks need new hardware…)
I am confused ..
the trees show no change over the last forty years, and this is evidence that something has changed?
So their recent evidence can be ignored?
Oscar Bajner (03:38:58) : Thing is, I can’t give up my day job right now – I just got a promotion and from today I am in charge of the rotary buffer! W00t! Wal-Mart rocks.
You left out a couple of other SCCSs like “rcs”, but I’m sure you know that already… Maybe, since they are on Unix / Linux boxes we could just tell them to do a “man ci” and “man co”… At least one of them will like the command name 😉
AND
You got the rotary buffer? They said since I was old enough to know FORTRAN I had to be a door greeter (!) … I’m not sure how you greet a door, but I don’t think the door will notice if I don’t get it quite right…
Maybe we can get together at closing time and you can show me how to “ride the buffer” and I’ll show you how to swing on the automatic door!?
Ed Darrell (22:12:07) :
“…just finding them in their native habitat is a spiritual experience.”
Ed, you just rang my fraud detector. Just like the alleged scientist Tim Flannery, you are a mystic, not a scientist. I have seen Flannery say, on TV, that he believes in “Gaia” and also “that our (meaning human) intelligence is here for a purpose.” These are positions that cannot be held by any scientist. They are both utter nonsense with not the least hint of any evidence in their favour. Teleology is trash.
Joseph in Florida (04:46:57) :
What language are these *.pro files? I am guessing Fortran.
I believe it’s a graphics package: idl
Ed Darrell (18:06:23) :
‘As you recall from your vast experience in this issue, tree ring data correlates very well with other temperature measures until about 1960, and then it tails off as if temperatures declined. However, thermometer readings from the same places don’t show a drop.’
I didn’t know that a grove of trees had a thermometer close by.
Or maybe the thermometer was sitting at an airport.
@Aligner (07:48:32) :
Very well put. I’ve saved a copy… You covered much of the management part that I’d thought, but didn’t bring myself to speak, and ought to have done so…
To all those who like what I wrote: Thank you! Now go back and reread Aligner’s posting. The points he makes are, in fact, the bigger ones…
“‘As you recall from your vast experience in this issue, tree ring data correlates very well with other temperature measures until about 1960,”
so why was Briffa using frig-factors on data from the 1930s?
Reading the harry read_me.txt… it’s like an episode of Monty Python on acid. And trillions of OUR taxpayer dollars are going to pay for this bullshit? Makes you want to hit someone with a hockey stick.
tallbloke (14:40:06) :
Both CRU and GIStemp are based on GHCN data, which is available. So reconstruction from that is as good as it will get.
Of course this also means that when the warmists talk of independent data sets, they are lying through their teeth.
And the GHCN data set is horridly broken and biased by deletions. About 90% of the thermometers have been “removed from the set” since about 1990 and the mountains and cold places with them…
http://chiefio.wordpress.com/2009/11/03/ghcn-the-global-analysis/
http://chiefio.wordpress.com/2009/11/13/ghcn-pacific-islands-sinking-from-the-top-down/
http://chiefio.wordpress.com/2009/11/16/ghcn-south-america-andes-what-andes/
One has to go “upstream” from GHCN to get ‘unbiased’ data…
I had always been taught that one should use the scientific method to collect and collate data. Strange that these over-funded people seem to ignore even the most basic of scientific methods!
For those at CRU, go to this site to learn how to implement said method – http://www.sciencebuddies.org/mentoring/project_scientific_method.shtml (it has a pretty, colourful bubble step plan for them and everything!!! >;))
And for those of you who work at Wal-Mart and its ilk, I suggest this site – http://teacher.pas.rochester.edu/PHY_LABS/AppendixE/AppendixE.html (and yes, I know this mentions physics, but it’ll still give you poor Wal-Mart janitors an idea of what the egg-heads at CRU should have been doing all along…)
Ladies, gentlemen, janitors. My eyeballs have melted, and it *wasn’t* because the planet is heating up! I’ve just finished reading every one of those comments and having a good giggle. I had no idea that janitors and fast-food servers were so smart! 😉 I bid you wonderful people good night from one of the “forgotten” countries.
Kathleen,
Sydney, Australia.
I don’t know if this has been suggested before, but I think it would be a good idea to set up a repository containing the CRU source code.
I’m sure there will be no shortage of volunteers (myself included) to translate the source files into, for example, C, to properly comment them so that they’re easily understandable, and to review them.
The mass of source files would be a daunting task for a handful of people, but for hundreds?
Perhaps yet other volunteers could similarly sort out the data files?
“Maybe we can get together at closing time and you can show me how to “ride the buffer” and I’ll show you how to swing on the automatic door!?”
Ah, good old fashioned cross training. Delighted to see it still exists and that no degree in climatology or any other ‘ology required.
Can I watch, I’ll help out by providing training in chicken keeping and composting.
Ed Darrell (20:59:08)
This, from 1989 I remember:
Yale’s forestry expert Tom Siccama
told us, “All we know is that suddenly in 1962, the trees got
very unhappy, and it was probably the very severe drought
followed by an especially killing winter.
“The one thing it was not was acid rain. Look, you
don’t get that sudden a change from something like acid
rain. And why didn’t it happen in adjacent areas or
states? The people in Vermont who blame this on acid rain
must think they live on an island or something,” Siccama
said.
Apparently, it was drought during the latter part of the 20th century, though at the time I couldn’t accept that sulphuric acid was raining down across the northern hemisphere, as anthropogenic sulphides haven’t changed that much since the 80’s.
Nature has its own secrets as to when trees grow and don’t, so the limiting factor may well be among those outlined – precipitation, drought, CO2, temperature, humidity, natural infestations, etc.
I remember the hoaxes from the 70’s and 80’s – acid rain, ozone holes, thermonuclear winters, the coming ice age…
It’s impossible to ascertain what acid rain actually did and whether there was any real research with data at the time – especially for the proxy trees used by the hockey team.
Perhaps you could point to some; I’m happy to have my mind changed.
Also, the 1940’s and 1950’s would surely have produced as much sulphuric acid raining down on forests as the 1960’s and 70’s. We had pea-soup fogs here in London.
E.M.Smith (23:56:46) :
I really appreciate your explanation; it was an entertainment in itself.
After reading some of his comments, I quite liked poor Harry — at least he seems to have a well-developed (and much-required) sense of the absurd!
This was found in the “HARRY_READ_ME.txt” of the “Whistleblower” FOIA2009 file.
John
“You can’t imagine what this has cost me – to actually allow the operator to assign false
WMO codes!! But what else is there in such situations? Especially when dealing with a ‘Master’
database of dubious provenance (which, er, they all are and always will be).
False codes will be obtained by multiplying the legitimate code (5 digits) by 100, then adding
1 at a time until a number is found with no matches in the database. THIS IS NOT PERFECT but as
there is no central repository for WMO codes – especially made-up ones – we’ll have to chance
duplicating one that’s present in one of the other databases. In any case, anyone comparing WMO
codes between databases – something I’ve studiously avoided doing except for tmin/tmax where I
had to – will be treating the false codes with suspicion anyway. Hopefully.
Of course, option 3 cannot be offered for CLIMAT bulletins, there being no metadata with which
to form a new station.
This still meant an awful lot of encounters with naughty Master stations, when really I suspect
nobody else gives a hoot about. So with a somewhat cynical shrug, I added the nuclear option –
to match every WMO possible, and turn the rest into new stations (er, CLIMAT excepted). In other
words, what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to
become bad, but I really don’t think people care enough to fix ’em, and it’s the main reason the
project is nearly a year late.”
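The false-code scheme Harry describes is simple enough to sketch. A Python rendering (illustrative only; the actual CRU code is not shown in the leaked excerpt): multiply the legitimate 5-digit code by 100, then add 1 at a time until the result matches nothing in the local database.

```python
def make_false_wmo_code(real_code, existing_codes):
    """Sketch of the scheme described in HARRY_READ_ME.txt: multiply the
    legitimate 5-digit WMO code by 100, then increment until no match is
    found in the local database. As the note admits, with no central
    registry of made-up codes, the result can still collide with a code
    held in some *other* database."""
    candidate = real_code * 100
    while candidate in existing_codes:
        candidate += 1
    return candidate
```

The weakness Harry flags is visible in the sketch: uniqueness is only checked against the one database at hand, so nothing prevents a collision elsewhere.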
“The issue here is, should we discount the previous thousand years of data because the data go off the rails in 1960, or should we just dismiss the data after 1960?”
How would you know that the “previous thousand years” are not “off the rails”? There are no measured values to compare them to. The point of proxy data is to reconstruct temperature when there is no actual temperature data. So when there is no temperature data to compare it to, tree-ring data is good; but when there is data to compare it to, and it does not compare well, the tree data should be “tricked” to “hide the decline” that it shows.
Say that good temperature data has been available since 1880. If that data matches the trees until 1960, you have an 80-year match. If it does not match after 1960, you have 50 years of no match. You could also “trick” the data to match the last 50 years and show an inverse relationship with temperature; in that case the proxy data would show it was MUCH hotter in the past.
The point is that in science (yes, I am a scientist) you cannot selectively discard data unless and until you can scientifically define why it is appropriate. Certainly, acid rain may account for the discrepancy and would be an appropriate research topic. But to rush to judgement and “trick” the data in order to “balance the needs of the science and the IPCC, which were not always the same,” as Mr. Briffa (whose research it was) admitted, is unconscionable.
http://www.eastangliaemails.com/emails.php?eid=794
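The scientific safeguard the commenter is invoking has a name: split-sample calibration and verification. You fit the proxy-temperature relationship on part of the instrumental overlap and test it on the held-out remainder; a proxy that fails verification cannot be rescued by deleting the failing window. A hedged Python sketch (function and variable names are my own, not from any CRU code):

```python
import numpy as np

def calibrate_and_verify(proxy, temps, split):
    """Fit proxy -> temperature by least squares on the calibration
    window [0:split], then report the correlation of predictions against
    observations on the held-out verification window [split:]. A proxy is
    only trustworthy for the deep past if it survives the window it was
    NOT fitted on."""
    p, t = np.asarray(proxy, float), np.asarray(temps, float)
    slope, intercept = np.polyfit(p[:split], t[:split], 1)
    predicted = slope * p[split:] + intercept
    return float(np.corrcoef(predicted, t[split:])[0, 1])
```

With 80 years of matching data and 50 years of divergence, the verification statistic is exactly what exposes the problem; truncating at 1960 removes the test, not the flaw.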
E.M.Smith (02:24:14) :
Joseph in Florida (04:46:57) :
What language are these *.pro files? I am guessing Fortran.
I believe it’s a graphics package: idl
———————————
Yep, that’s true: it’s IDL data-analysis s/w.
Peter (05:10:14) :
Peter, go ahead; there is a GDL in most Linux distributions. With that you can try to run the Had-Stuff.
David in Florida (11:32:11) :
from that link
“mann to Briffa: “Keith, just a quick note to let you know I’ve had a chance to read over the key bits on last millennium in the final version of the chapter, and I think you did a great job. obviously, this was one of the most (if not the most) contentious areas in the entire report, and you found a way to (in my view) convey the the science accurately, but in a way that I believe will be immune to criticisms of bias or neglect–you dealt w/ all of the controversies, but in a very even-handed and fair way. bravo!”
Briffa to Mann reply: ” I tried hard to balance the needs of the science and the IPCC , which were not always the same.”
Nothing looks more like innocence than an indiscretion!