When the CRU emails first made it into news stories, there was an immediate reaction from the head of CRU, Dr. Phil Jones, over this passage in an email:
From a yahoo.com news story:
In one leaked e-mail, the research center’s director, Phil Jones, writes to colleagues about graphs showing climate statistics over the last millennium. He alludes to a technique used by a fellow scientist to “hide the decline” in recent global temperatures. Some evidence appears to show a halt in a rise of global temperatures from about 1960, but is contradicted by other evidence which appears to show a rise in temperatures is continuing.
Jones wrote that, in compiling new data, he had “just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (i.e., from 1981 onwards) and from 1961 for Keith’s to hide the decline,” according to a leaked e-mail, which the author confirmed was genuine.
Dr. Jones responded.
However, Jones denied manipulating evidence and insisted his comment had been taken out of context. “The word ‘trick’ was used here colloquially, as in a clever thing to do. It is ludicrous to suggest that it refers to anything untoward,” he said in a statement Saturday.
OK, fine, but how, Dr. Jones, do you explain this?
There's a file of code also in the collection of emails and documents from CRU. A commenter named Neal on Climate Audit writes:
People are talking about the emails being smoking guns, but I find the remarks in the code, and the code itself, more of a smoking gun. The code is so hacked around to give predetermined results that it shows the bias of the coder. In other words: make the code ignore inconvenient data to show what I want it to show. After a quick scan, the code is quite a mess. Anyone with any pride would be too ashamed to let it out for public viewing. As examples of bias, take a look at the following remarks from the MANN code files:
Here’s the code with the comments left by the programmer:
function mkp2correlation,indts,depts,remts,t,filter=filter,refperiod=refperiod,$
datathresh=datathresh
;
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
;
pro maps12,yrstart,doinfill=doinfill
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
and later, the same programming comment appears again in another routine:
; Plots (1 at a time) yearly maps of calibrated (PCR-infilled or not) MXD
; reconstructions of growing season temperatures. Uses “corrected” MXD –
; but shouldn’t usually plot past 1960 because these will be artificially
; adjusted to look closer to the real temperatures.
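As an aside, the workaround described in the mkp2correlation header quoted above (regress the confounding series out of both variables, then correlate the residuals) is the standard way to compute a partial correlation when the built-in routine can only hold one series constant. Here is a minimal IDL sketch of the idea; it is an illustration only, not the CRU routine itself:

function partial_corr_sketch, indts, depts, remts
  ; Illustrative sketch only, not CRU's code. remts is a 2D array
  ; (nseries, ntime) of series whose influence is to be removed.
  ; Regress remts out of each variable, then correlate the residuals.
  ; (The regression coefficients b1 and b2 are not used further.)
  b1 = regress(remts, indts, yfit=fit1)
  b2 = regress(remts, depts, yfit=fit2)
  return, correlate(indts - fit1, depts - fit2)
end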
You can claim an email you wrote years ago was "taken out of context," but a programmer writes notes in the code to document what the code is actually doing at that stage, so that anyone who looks at it later can figure out why this function doesn't plot past 1960. In this case, it is not allowing all of the temperature data to be plotted. Growing-season data (the summer months, when the new tree rings are formed) past 1960 is thrown out because "these will be artificially adjusted to look closer to the real temperatures," which implies some post-processing routine.
Spin that; spin it to the moon if you want. I'll believe the programmer's notes over the word of somebody who stands to gain from suggesting there's nothing "untoward" about it.
Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result.
For more details, see Mike’s Nature Trick
UPDATE: By way of verification….
The source files with the comments that are the topic of this thread are in this folder of the FOI2009.zip file
/documents/osborn-tree6/mann/oldprog
in the files
maps12.pro
maps15.pro
maps24.pro
The first two files are dated 1/18/2000, and the maps24 file is dated 11/10/1999, so it fits timeline-wise with Dr. Jones's email mentioning "Mike's Nature trick," which is dated 11/16/1999, six days later.
UPDATE2: Commenter Eric at the Climate Audit Mirror site writes:
================
From documents\harris-tree\recon_esper.pro:
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
;
Note the wording here “avoid the decline” versus “hide the decline” in the famous email.
===============
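For readers who don't write code: restricting the calibration period of a regression like this is a one-line choice. A hypothetical IDL sketch of the pattern (not the actual recon_esper.pro code; the 1960 endpoint is the only detail taken from the comment above):

function calib_regress_sketch, years, proxy, temps
  ; Illustrative sketch only. Calibrate the proxy against temperature
  ; using just the years up to 1960, per the comment quoted above.
  cal = where(years le 1960)
  return, linfit(proxy[cal], temps[cal])   ; returns [intercept, slope]
end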
I'll give Dr. Jones and CRU the benefit of the doubt; maybe these are not "untoward" issues, but these things scream for rational explanations. Transparency, and the ability to replicate all this years ago, would have gone a long way toward correcting problems and assuaging concerns.
Credit for the quote is due to reader "Neal".
Keep it up; I am watching with fascination. You (and the others who love science) must be so angry; they have besmirched the whole field of climate science. Dirty rotten scoundrels. Politicians are carrying on as if this whole issue will go away soon; I hope it doesn't.
The trillions and gazillions of dollars that are now at risk are the pathetic consequence of a programmer gone berserk. What a travesty. Can it be corrected? Who knows. A call to Science is in order so that reliable individuals can correct this imbalance between political/social myth and reality. The challenge will be how to encourage informed individuals to speak up.
I may be dense here, but what's the issue? The red comment says "don't plot beyond 1960," because the results are unreliable. So is there any indication that anyone has plotted beyond 1960? This came up on the Bishop Hill thread, where he drew attention to an email by Tim Osborn in which he said that they never plot some tree-ring set beyond 1960 because of a divergence issue. It turns out that is what Briffa/Osborn also say in Briffa et al 2001. This Briffa/Osborn context may be unrelated, but it seems to me that it may simply mean what it says: don't plot beyond 1960 using this code. And people don't.
It's highly unlikely mere facts will be able to stop the AGW religion from advancing from one victory to the next. After all, AGW was never about science; it was about raising taxes and controlling other people's lives.
[snip]
Add that to violating FOIA.
When we’re done, we’ll be able to throw the book at them.
But remember – if you question Real Climate Scientists ® then you are a DENIER.
Oooga Boooga!
/sarcasm
I’ve been unable to find the definition of “artificial adjustment” in the climatologist’s handbook. It must be called something else. Sure sounds like something else to me. More! More!
Spin that to the moon.
Looks like a duck, quacks like a duck….smells like a duck’s butt.
If that's the best you can do, then your whole case is in trouble. Data from independent sources often has to be massaged (the programming terminology is "munged") in order to make independent series compatible with each other.
The most interesting thing about this is the practice of “copy and paste” coding indicated by the duplicated comments, which ‘real’ programmers don’t like but which are an unfortunate necessity when programming for scientific research.
Very telling. I just authored a letter to Senator Graham asking if he is keeping up with this new finding. I also insisted that he provide a public apology to the people of South Carolina and to Senator James Inhofe.
[sarcasm on]Don’t you understand? There is reality and then there is TRUTH. If reality doesn’t fit the TRUTH then reality must be adjusted. [sarcasm off]
No wonder they refused to release it even under FOI.
When people use cut-n-paste coding, they sometimes even copy the comments and forget to change them.
WHOA!
I can’t believe it!
One of Germany's biggest and most highly respected dailies has it on the front page of its website.
http://www.welt.de/wissenschaft/article5294872/Die-Tricks-der-Forscher-beim-Klimawandel.html#xmsg_comment
Uh, wow. Some fence sitters wanted hard evidence, more than what they perceived to be mere conjecture within the email spool. Well, there you go.
This story has reached another level. Hard evidence of such blatant data manipulation mustn’t be allowed to simply vanish into the news cycle. I hope the few MSM outlets accurately reporting this story pick up on this, because the notes in the code indeed appear to be a smoking gun.
It may have been posted before; if so, apologies. But I do like this one, to be found in the HARRY_READ_ME.txt file:
“OH F*** THIS. It’s Sunday evening, I’ve worked all weekend, and just when I thought it was done I’m
hitting yet another problem that’s based on the hopeless state of our databases. There is no uniform
data integrity, it’s just a catalogue of issues that continues to grow as they’re found.”
(I confess to manipulation of the f-word myself, but only to comply with WUWT policy).
So, Copenhagen Comrades, what's a trillion dollars or so here and there, based on "no uniform data integrity"?
No resignations or sackings yet then?
I think it is noteworthy that Steve McIntyre comments on Mann and Briffa truncating their MXD data at 1960.
http://www.climateaudit.org/?p=4221
A poster called Asimov has quoted extracts from the “HARRY_READ_ME.txt” file – deeply shocking stuff:
Asimov’s very plausible suggestion is that Harry is a programmer trying, and often failing, to make sense of the garbage data which he’s been lumbered with.
Several posts over several pages:
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625&page=13
Can someone point me to the file this code is from?
I have downloaded the package, and found only one file (FOIA/documents/osborn-tree6/mkp2correlation.pro) that includes the above mentioned function. Lines 1 to 7 are identical, but the rest has nothing to do with the screenshot above.
Emails may be just chit-chat between various parties, but the code is the recipe for cooking the books. No wonder AGW leaves such a bad taste; it's way-overcooked leftovers.
@Nick Stokes: the real issue is that temperatures derived from tree rings are known to not match measured temperatures after 1960.
If tree ring-based temperatures are known to be false compared to actual measurements, then how can they be true in earlier decades or centuries?
Nick,
Exactly who is responsible for the data "artificially adjusted to look closer to the real temperatures" that the code advises against plotting, and why would they have done such a thing?
I agree that there's no problem with the code saying not to plot past 1960, but it is certainly a problem that the code says someone has taken liberties with the post-1960 data (or the methodology used to process it) for the purpose of making it "look" more like the instrumental record.
Wow, the coming week will be most interesting.
Good job Anthony, Steve, and all. Thank you.
Time to look at the sea ice satellite AGC, pointing,
and receiver gain. I think we may find some missing ice.
The really sad thing is that the dendros still have no clue why trees make good thermometers in some years but not others. This is a bigger issue, imho.
RE: Nick Stokes (20:34:52) :
**I may be dense here, but what's the issue? The red comment says "don't plot beyond 1960", because the results are unreliable**
Maybe it means "don't plot beyond 1960 because the results do not show what we want"?
The CRU team, and maybe all of these "AGW researchers," are clueless… they are in way over their heads and inclined to cheat (or rather, since they're adults, to defraud), especially with literally trillions of dollars involved.
How many billions have the US and other countries wasted on this BS…
I want my money back!
One may surmise they did not want any Real Programming Talent aboard who might OUT them, so they tried to fake it themselves…
Anyway… thank God for Anthony… attempting to bring some quality to the data sets involved!
philincalifornia (20:51:09) :
Yep, saw that, and that’s not the only place he used it. Leaves little doubt that the data was pre-slaughtered, does it not?
Now, who exactly is HARRY?
Unbelievable!!!!!!!!!!!!!!
Or, perhaps, not so unbelievable.
There are ways to game any system. All it takes is a person or persons clever enough to formulate effective methods of cheating.
In the end, what counts more than anything else is the ability to rely on the word of others. To the extent that people are more or less untrustworthy does the potential for dishonesty rise or fall.
Is it just me, or is the societal willingness to indulge in unethical behaviour presently on a significant and massive upswing? Perhaps a little like our friend, the hockey stick curve.
A. Revkin has stooped to a new low. This man has no shame.
http://dotearth.blogs.nytimes.com/2009/11/22/your-dot-on-science-and-cyber-terrorism/#bozoanchor
P Gosselin (20:47:32) :
WHOA!
I can’t believe it!
One of Germany's biggest and most highly respected dailies has it on the front page of its website.
And in English here – http://translate.google.com/translate?u=http%3A%2F%2Fwww.welt.de%2Fwissenschaft%2Farticle5294872%2FDie-Tricks-der-Forscher-beim-Klimawandel.html%23xmsg_comment&sl=de&tl=en&hl=en&ie=UTF-8
The alarmists have gotten so good at misdirection that they get me every time. Their comments are never on the real issue. When I read Dr. Jones's explanation of the word 'trick,' I thought it was perfectly reasonable, so I decided to lay that comment to rest. But I re-read the e-mail later and thought, 'hang on!' Is the word 'hide' also commonly used by scientists to mean something other than its common usage? I can't believe they got me again!
Roger Sowell (20:58:33) :
@Nick Stokes: the real issue is that temperatures derived from tree rings are known to not match measured temperatures after 1960.
If tree ring-based temperatures are known to be false compared to actual measurements, then how can they be true in earlier decades or centuries?
That’s not a coding issue. What they say in Briffa 2001 for the Siberian trees is:
“The period after 1960 was not used to avoid bias in the regression coefficients that could be generated by an anomalous decline in tree density measurements over recent decades that is not forced by temperature”
The claim seems to be that they have specific information about a post-1960 divergence, and presumably enough pre-1960 instrumental overlap to satisfactorily calibrate. Now I can’t judge the strength of that, but it’s been discussed in the literature for nearly a decade. The answer won’t be found in a comment in the code. The comment merely reflects the limit stated in the published theory.
REPLY: In other work, Briffa allowed 10 Yamal trees, an unacceptably low sample, to stay in. When he cites an "anomalous decline in tree density measurements over recent decades," why would he think 10 trees is a sample upon which to conclude that the data points are OK? As Steve has said many times (and other dendros agree; they say 50 is the minimum sample), Briffa should have truncated that Yamal sample and data. He didn't. So why is it OK to keep a very low sample in one case but necessary to truncate in another? It doesn't make any sense. – A
Looking in the file HARRY_READ_ME.txt, there seem to be some suspicious coder comments as well:
“This still meant an awful lot of encounters with naughty Master stations, when really I suspect nobody else gives a hoot about. So with a somewhat cynical shrug, I added the nuclear option – to match every WMO possible, and turn the rest into new stations (er, CLIMAT excepted). In other words, what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad, but I really don’t think people care enough to fix ’em, and it’s the main reason the project is nearly a year late.”
“You can’t imagine what this has cost me – to actually allow the operator to assign false WMO codes!! But what else is there in such situations? Especially when dealing with a ‘Master’ database of dubious provenance (which, er, they all are and always will be).”
Everyone… this is Not just about Money… it is about Tyranny… the politicians want to control us till we have no soul…
And we have the Nick Stokeses who want to jump on a mention of 1960 temp mismatches… the entire picture is what you need to look at, Nick… study some more and you will begin to see a pattern.
Roger Sowell (20:58:33) :
@Nick Stokes: the real issue is that temperatures derived from tree rings are known to not match measured temperatures after 1960.
If tree ring-based temperatures are known to be false compared to actual measurements, then how can they be true in earlier decades or centuries?
Well, perhaps that is a place to start. According to John Daly, we know that trees don't grow on 70% of the earth's surface (oceans), and they don't grow in deserts or at high elevations. The 15% of the earth's surface where trees do grow are locations where they may also be impacted by lack of light (other trees) or lack of water (drought), to the extent that temperature cannot be isolated as a cause of growth during any period. It is a false measure.
So let's challenge tree data as a temperature surrogate altogether. That will take care of Mr. Mann and the other tree persons and throw out the hockey stick. Apparently, according to his now-published email, even Mr. Revkin agrees with that.
I agree (especially as a software developer) that comments in code are very rarely any kind of 'mistake' as such, although it is true that many are out of date.
Overall, between this, the emails, and what I suspect will be gleaned from the data, the whole story is a shambles. There is only one possible thing that can be done to salvage the credibility of anyone in the field who supports AGW:
1. All existing data and conclusions should either be examined for veracity or dismissed.
2. There must be a new, non-IPCC and non-UN-controlled, globally funded task force to re-examine the whole AGW issue. Existing AGW believers and sceptics should be included, especially those who have shown extraordinary effort and dedication in the field to date.
3. The entire investigation should be transparent to the participants, funders (one assumes governments) and also the public. The Internet is a good medium for such a task, as has been proved.
4. All political involvement must be prevented. That cannot be stressed enough.
5. All commercial involvement should be prevented. Oddly enough, I would support 'big oil' etc., as it seems they are gearing up very swiftly to get ahead of the game in renewables, as is sensible. I suspect that many would cry foul, however.
6. No taxes or political changes should be introduced that rely on the AGW theory being accurate until it is proved that CO2 increases will cause dangerous changes to the climate.
Just my 2c worth…..
Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
REPLY: So what do you do down there in Norman? NSSL? U of OK? You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A
What I interpret this to mean is that the results of their work pre-1960 are as they think they should be. Post-1960, not so much. Post-1960 results have to be adjusted to reflect …. what?
a. a previously non-operating variable, now operating;
b. a previously operating variable that no longer operates;
c. some combination of a & b;
d. a hunch based on years of experience;
e. a revelation from God;
f. other
The code used to ‘artificially adjust’ the results so they will ‘look closer to the real temperatures’ ought to explain which of the above scientific procedures was used. If this code is not in the posted material, I am sure the researchers will happily provide it.
~~~~~~~~~~~~~~~~~~~~
A few months back WUWT had a discussion on programming so, not to go over all that again, I’ll just say many of us (in years gone by) wrote our own code and/or subroutines that were used over and over. I always put a few comments at the top stating a few common things, such as, my name, and the purpose of the code, and the language (FORTRAN 2D, IV, ??). Is there a complete routine in this dump that would identify the programmer?
Wait and see; the True Believers will soon come out with a "fake but accurate" defense. Remember, the fate of the whole world hangs in the balance, so what do a few lies, a bit of fraudulent science, and some bullying matter, as long as the Green agenda of zero growth, centrally mandated economies, and increasing restriction of individual liberties is moved forward?
The end justifies the means in AGW ethics.
Richard Sharpe (20:44:52) :
“When people use cut-n-paste coding, they sometime even copy the comments and forget to change them.”
Oh, that's encouraging: the fate of the world hanging on shade-tree coders.
Maybe one of the programmers was the mole.
Another strange happening at Hadley … all the hadcrut3 data for this year, except Jan/Feb, has been deleted.
http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt
P Gosselin (20:47:32) : You likely know this, but I had to have it translated, except for the two opening words:
Die Tricks der Forscher beim Klimawandel
via Google translate
The tricks of the researchers on climate change
Also, in the Word file jones-foiathoughts.doc it states:
“Options appear to be:
1. Send them the data
2. Send them a subset removing station data from some of the countries who made us pay in the normals papers of Hulme et al. (1990s) and also any number that David can remember. This should also omit some other countries like (Australia, NZ, Canada, Antarctica). Also could extract some of the sources that Anders added in (31-38 source codes in J&M 2003). Also should remove many of the early stations that we coded up in the 1980s.
3. Send them the raw data as is, by reconstructing it from GHCN. How could this be done? Replace all stations where the WMO ID agrees with what is in GHCN. This would be the raw data, but it would annoy them.”
Number 2 seems to indicate the particular station data that they were trying to hide…
rechauffementmediatique (20:57:34) :
I found these comments in osborn-tree6/mann/oldprog in files like maps12.pro and
maps15.pro. These were dated Jan 2000, and the directory name is not encouraging. Seems unlikely that it is currently used code.
The Times here in the UK now has the CRU story on one of its major comment pages
http://www.timesonline.co.uk/tol/comment/columnists/guest_contributors/article6927598.ece
rechauffementmediatique (20:57:34) :
Can someone point me to the file this code is from?
I have downloaded the package, and found only one file (FOIA/documents/osborn-tree6/mkp2correlation.pro) that includes the above mentioned function. Lines 1 to 7 are identical, but the rest has nothing to do with the screenshot above.
documents\osborn-tree6\summer_modes\maps12.pro
artwest (20:54:26) :
A poster called Asimov has quoted extracts from the “HARRY_READ_ME.txt” file – deeply shocking stuff:
Asimov’s very plausible suggestion is that Harry is a programmer trying, and often failing, to make sense of the garbage data which he’s been lumbered with.
Several posts over several pages:
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625&page=13
____________________________
I also lifted this from a comment over there:
“Tim Mitchell works at the Climactic Research Unit, UEA, Norwich, and is a member of South Park Evangelical Church.”
South Park ?? You’ve got to be kiddin’ me ??
PS I also tried to check out if he really did spell it “Climactic” but, strangely, the web site appears to be down. Heh heh heh.
New article from Christopher Booker in the Daily Mail this morning …
The devastating book which debunks climate change
Front page link too.
Tongue in cheek:
Dr. Mann should have known when he set out picking cherries that cherry wood is far too brittle to make a hockey stick. Perhaps he should have realized that if he were working with the proper material he might not have made an ash of himself.
Another Daily Mail article by Daniel Martin …
How climate-change scientists ‘dodged the sceptics’
Nice image of Jones (TIP: Place your mouse cursor over it!)
Re Nick Stokes (20:34)
I copied this from “The Air Vent” http://noconsensus.wordpress.com/
Kenneth Fritsch said
November 22, 2009 at 10:30 pm
The scientist is going to concentrate on what is unusual about the proxy, i.e., that the proxy does not respond correctly to temperature after 1960. She would not think of plastering something into that time period to deflect attention. She would want to talk about what she has found, let you know about it, and attempt to explain it.
The advocate and all his defenders, on the other hand, will want to sell their message, and when they are questioned will say in bewilderment, "Well, of course we would not use 'bad' proxy data when we have 'good' instrumental data; how dare you think we intended to deceive."
Drudge gets 20+ million hits a day and has this in the right column:
Hostility among foes…
…Britain’s Climate Research Unit of the University of East Anglia reveal an intellectual circle that appears to feel very much under attack, and eager to punish its enemies….”Kevin and I will keep them out somehow — even if we have to redefine what the peer-review literature is!”…”I will be emailing the journal to tell them I’m having nothing more to do with it until they rid themselves of this troublesome editor,”…
http://www.drudgereport.com/
http://www.washingtonpost.com/wp-dyn/content/article/2009/11/21/AR2009112102186_pf.html
I started looking at some of the non-email material and quickly found myself in the midst of documents/HARRY_READ_ME.txt. It's a diary of a three-year saga of bashing a lot of code into running on a new platform, and of dealings with data files in as bad a shape as the code.
This post didn’t exist at the time, plus the file would be more interesting to the denizens of Chiefio, so I posted http://chiefio.wordpress.com/2009/11/21/hadley-hack-and-cru-crud/#comment-1664 over there.
I've written similar diaries, but not for a project this long or with code this fouled up. Programmers will appreciate it the most (especially those with long weekend nights under their belts).
Anthony,
REPLY: In other work, Briffa allowed 10 Yamal trees…
I didn't want to get into dendro issues here. The thread is about the meaning of a comment in the code, and all I've been saying is that the comment is consistent with what is done in the 2001 Briffa et al paper, namely the use of pre-1960 data. The code also dates from the time the 2001 paper was being written.
However, I can't see the relevance of the Yamal issue. Apart from anything else, the 10-tree period there was 1990 and beyond, while here data is not taken past 1960. And it's an NH paper; Yamal would be a very small part.
REPLY: My point is about exclusions based on sample size, the strategy of exclusion is inconsistent. – A
And how about this admission from Ed Cook, a dendro: http://whatcatastrophe.com/drupal/node/47
No single piece of this jigsaw is determinative and it will take some time for all the relevant material from the leaked/hacked files to be collated.
We have all read some of the emails containing suspicious wording, but it should not be overlooked that many of them are capable of innocent interpretation. That is why the "big picture" is so important. Notes in code also give only a snapshot of the thinking of the coder (is that the correct term?) at the time. Isolated soundbites do not make a case; a coherent body of evidence is needed, and that can result only from detailed examination of everything in its proper context.
Of particular importance are the following questions: (a) Does a pattern of behaviour appear? (b) Do private thoughts recorded in emails and notes correspond with the writer’s public pronouncements? (c) When Team Member A suggested something ostensibly underhand or improper to Team Member B, did Team Member B expressly agree, expressly dissent or remain silent? (d) Were the financial interests of the writer or his employer taken into account when deciding how to present scientific conclusions?
If we concentrate on those questions (although many more are also relevant) we will be able to see clearly the extent of any jiggery-pokery.
My first impression is that the documents disclosed fall a long way short of being a complete package. Perhaps the leaker/hacker is planning a second instalment to rebut detailed explanations and excuses from members of the Team. Time will tell.
So far, so good, but don’t get overexcited by individual items.
Yes, Asimov also had quotes up from the code about how they had lost all the cloud data before 199X (sorry about the X, can't remember). That code is a very interesting piece of work.
The HARRY_READ_ME.txt is a MUST READ to understand the utter chaos of the CRU TS. Here are a few more snippets:
BEGIN FILE =========
So.. should I really go to town (again) and allow the Master database to be ‘fixed’ by this program? Quite honestly I don’t have time – but it just shows the state our data holdings have drifted into. Who added those two series together? When? Why? Untraceable, except anecdotally.
It’s the same story for many other Russian stations, unfortunately – meaning that (probably) there was a full Russian update that did no data integrity checking at all. I just hope it’s restricted to Russia!!
There are, of course, metadata issues too. Take:
……….
..knowing how long it takes to debug this suite – the experiment
endeth here. The option (like all the anomdtb options) is totally
undocumented so we’ll never know what we lost.
22. Right, time to stop pussyfooting around the niceties of Tim’s labyrinthine software suites – let’s have a go at producing CRU TS 3.0! since failing to do that will be the definitive failure of the entire project..
……….
Tried running anomdtb.f90.. failed because it couldn’t find the .dts file! No matter that it doesn’t need it – argh!
Examined existing .dts files.. not sure what they’re for. Headers are identical to the .dtb file, all missing values are retained, all other values are replaced with one of several code numbers, no idea what they mean.
END FILE ==========
It goes on and on and on like this. I’ve never seen so much confusion in any coding project (and I’ve worked on more than a few). From what I’ve seen, I wouldn’t trust them to code a toy app for an iPhone.
Jeff Id (20:40:17) :
….smells like a duck’s butt.
I’ll just take your word for that.
😉
As someone who spent 25 years for the most part working on one complex custom software system for a certain agency of the Federal Government (which shall remain nameless), all I can say is that I fully agree with the above comment by Jeff:
Like most programmers on complex software systems with a couple hundred thousand lines of code, we tried hard to accurately and fully describe the functions of each module in the code comments. Failure to do so leads to chaos, and any programmer worth his or her salt will do the same.
If it looks, walks, flies, swims, and quacks like a duck, theoretically it still COULD be something else. But lacking overwhelming evidence to the contrary, odds are REAL good that it's a duck. The duck quacked in this case, and their goose is cooked.
I'm a statistical programmer for "BIG PHARMA". For every new drug application, the FDA requires that we give them: raw data, analysis datasets (which are a merging of raw datasets and algorithms applied to raw data), a description of the algorithms and statistical methods, AND all our code. Then the FDA reviewers try to reproduce our results. This is done for every drug or medical device before approval.
The societal impact of global warming, er, climate change is greater than that of any one drug. If we go through all this independent review for a drug, we should demand that a similar review process be applied to AGW claims.
The FDA, acting as a public protector, has to assume we are trying to "cheat" (and that is a reasonable approach). We never throw out data. Granted, our clinical trials are more controlled, but this 'give the reviewers everything' approach should be applied as much as possible in climate research.
But then again, we are the evil, capitalist, profit-seeking, “BIG PHARMA” and the people need to be protected from us.
Jesse (21:24:11) :
You’re right Jesse, there’s probably nothing here for you. Maybe you can spend your time at RealClimate. They’re more your type. We’re too far below you.
Say! Did you happen to see the post by Roy Spencer about elitism? Just wondering.
Keep looking, folks; at the end of the day this is another yawner.
There has to be something beyond more egg on the faces of the usual suspects for this to go anywhere: either problems in the HadCRU data (due to errors or malfeasance), or the implication of someone else.
You can only dump on Mann, Jones, Briffa et al. for so long. It’s lots of fun, but it’s a distraction from the main event: scientific evidence that the climate just isn’t that sensitive to CO2 levels. That’s what Spencer is after, and that’s what matters. Not what happened in some tree ring in 1962.
Off Topic…
Mainstream press overnight (US time) is now more confident and pushing harder…
UK Daily Mail pushing the FOI avoidance:
http://www.dailymail.co.uk/news/article-1230122/How-climate-change-scientists-dodged-sceptics.html
Canada’s Edmonton Journal also highlights the directions to delete stuff:
http://www.edmontonjournal.com/technology/Good+climate+news+alarmists/2252439/story.html
The Wall Street Journal is also more strident today; it even highlights a link for readers to get the original files!
http://online.wsj.com/article/SB125883405294859215.html?mod=WSJ_hpp_sections_news
Meanwhile….
BBC says:
‘Hopes for the Copenhagen climate summit in December have been boosted after it emerged that more than 60 presidents and prime ministers plan to attend.’
http://news.bbc.co.uk/2/hi/europe/8373551.stm
Will they…NOW??
and poor old Mother Jones (I used to love that mag) is left explaining the meaning of ‘trick’
http://www.motherjones.com/kevin-drum/2009/11/tricked-out
Jesse (21:24:11) :
. . . bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
REPLY: So what do you do down there in Norman? NSSL? U of OK? You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A
Hi Jesse
Let me take a break from from floor waxing to say that I did grad school down in Norman. The professors all said they were trying to build a grad school their football team could be proud of. Well, gotta get back to buffing Wal-mart’s wide aisles.
Hi everyone else
I think the real meat here will come from the data and code, not from the email fluff. Here's hoping M&M find the FOI data they were looking for.
wes george (21:24:44) :
Wait and see, the True Believers will soon come out with a “Fake but accurate” defense.
They feel they can't go down. The Titanic couldn't sink, either.
Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
Nothing as convincing as some ad hom attacks.
Lots of folks posting here have actual academic qualifications. (I have an MSc in Clinical Research and 20 years of experience working with clinical data sets FWIW.)
The work from The Team is a joke compared to the standards that the FDA and EMEA require. That's the problem with "appeal to authority" as a debating technique: when your "authority" turns out to be a tiny cabal of shoddy scientists, things kind of fall apart.
….meanwhile the earth is cooling…. and the general population is slowly finding out that there was collusion among the top circle of global warming scientists….
Glenn (20:38:14) :
I’ve been unable to find the definition of “artificial adjustment” in the climatologist’s handbook. It must be called something else.
Look under ‘cha-ching’
Doug (21:56:11) :
Dr. Mann should have known when he sat out picking cherries that cherry wood is far too brittle to make a hockey stick.
Here’s a case of being able to tell a lie about a cherry tree.
In SOAP-D15-intro-gkss.doc, it states:
“Osborn and Briffa, together with other co-authors (Rutherford et al., 2005), examined the sensitivity of temperature reconstructions to the use of different target variable (annual or seasonal temperature), target domain (hemispheric or extratropical) and reconstruction method. They found that when the differences in target variable and domain are accounted for, the differences in several reconstructions are small and that all reconstructions robustly indicate anomalous warm conditions in the 20th century, with respect to the past millennium.”
Since they are subbing real temp data after 1960, isn’t this fraud?
I mean, this looks like a pretty bald-faced lie to me.
Jesse (21:24:11) :
Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
IF THERE EVER WAS AN APOLOGY FOR THE CURRENT ELITISM (and it isn't even a good one), THIS QUOTE…IS IT.
Contrary to Ayn Rand, who was an "elitist" of sorts but saw the inherent value in anyone with a heart and soul, no matter what their "blue-collar" job, this individual pulls the curtain back and exposes himself for who he really is:
The “we scientists” part is extremely pungent given the fact that, empirically speaking, some of the first individuals to cave in the Third Reich….were the scientists.
Explain them apples.
Not sure, but it might have to do with the fact that scientific intelligence is not the be-all and end-all.
Yes, yes….science SHOULD rule (am in total agreement there).
It's just that you then turn to these nasty, nasty ad hominem comments about custodians, Wal-Mart greeters (ugh, I hate Wal-Mart, but I don't hate the people who need a job, so they work there), and others whom YOU, JESSE, so quickly deride.
But beyond that, Jesse, this is not "making mountains out of ant hills."
It is standing up for the truth, whatever it may be.
And… in passing… those FIRE ANT mounds that have taken over in the southeast US thanks to the positive AMO are, to us, mountains… and if it is a big deal to you, their opportunistic, swarming, life-choking habits will hopefully come to a halt when the AMO turns negative.
So this whole CRU-gate saga is…it IS….a big deal.
It most certainly is a big deal. That is precisely why you and your opportunistic mates are so up in arms.
So I say hear hear, HAIL HAIL, let's end this cowardly hijack by more than a few opportunistic ideologues who are trying to control the world, and let's get on with…
SCIENCE BUSINESS AS USUAL!
AGW is dead. (Thank bl**dy g*d!)
Chris
Norfolk, VA, USA
UKIP (20:52:30) :
No resignations or sackings yet then?
None yet. But maybe some sweaty underarms.
BCC (22:38:30) :
You can only dump on Mann, Jones, Briffa et al. for so long. It’s lots of fun
Nah. Watching them dump on themselves is the fun.
Mike McMillan (22:39:30) :
Well, gotta get back to buffing Wal-mart’s wide aisles.
I got elbow grease. They hiring?
Just wondering, when exactly were the climate models first determined to be “robust” and by whom?
Because that has really eaten at me ever since I first read that claim.
Looking now at all things in totality, that claim of robustness is just about as big a whopper as is possible.
So from whom did it originate?
CORRECTION: The “we scientists” part is extremely pungent given the fact that, empirically speaking, some of the first individuals to cave INTO the Third Reich….were the scientists.
So “into” not “in”.
Sorry about that.
Cheers.
Chris
Norfolk, VA, USA
It seems few are looking at the CRU documents… all the interest seems to be in the emails. This is a shame; many of the documents offer evidence just as damning as the emails.
Consider “circ_inconsistency.doc”, attributed to Nathan P. Gillet of CRU, dated 3 May 2005, and titled, “Inconsistency between simulated and observed Northern Hemisphere circulation changes.” It clearly states that the eight (8) “state-of-the-art” coupled climate models relied upon for the IPCC 4th Assessment Report fail to match observed data:
(emphasis added by this blogger)
Eric Barnes (21:13:17) :
Revkin’s headline: ‘Cyber-Terrorism’
He knows how it was done? I wasn’t aware that that was found out yet. What if it was people that work at CRU and all they did was violate office policy?
Let's look at the facts as they are revealed, Mr. Revkin, before we start playing the terrorism card.
Or is your headline different than your opinion??
Speaking as a programmer, I cannot see anything wrong with that code. It is clearly documented to stop people making changes that will give incorrect results.
I think people should think these posts through a little more carefully, because this post makes this blog look embarrassing.
“Just wondering, when exactly were the climate models first determined to be “robust” and by whom?”
The same people who programmed them, probably.
I am a software engineer and have also done quite a bit of looking at the code. I would be cautious about jumping to conclusions as to what the comments mean. My cautious interpretation is that the comments mean what they say: the series stop at 1940 in some files, or at 1960 in others, because there is a divergence problem, as indicated in the file named declineseries.pdf. From memory, I believe the tree ring density diverges about 1960 and the ring widths about 1940. There are some indications that the analysis code was used to try different approaches, e.g., 20-year, 40-year, or 50-year smoothing (see the sketch below), which suggests looking for ways to "spot the trend". But I would not yet draw that conclusion.
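For illustration, the kind of smoothing experiment described above takes one loop. A hypothetical IDL sketch, assuming simple boxcar smoothing (the actual CRU routines may use a different filter):

pro try_smoothings_sketch, years, series
  ; Illustrative sketch only. Overplot the raw series with
  ; boxcar-smoothed versions at several window widths (in years).
  plot, years, series
  widths = [20, 40, 50]
  for i = 0, n_elements(widths) - 1 do $
    oplot, years, smooth(series, widths[i], /edge_truncate)
end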
From a s/w engineering perspective, the code is very ugly. Few descriptive comments and some really bad coding practices, like extensively duplicating blocks of code in numerous files with only small changes. How the heck will they accurately incorporate changes across multiple files in the future? Just a real mess.
Other questions, not answered in the released files, are: How was this code verified and validated? Does the code do whatever it was intended to do? Does it also do that accurately?
The comments in the HARRY_README file are pretty wild, however. So wild that I haven't really figured out quite what to think about them just yet. There are other comments in the source files that mention data that was lost (cloud data) and which they recreate, or try to re-create, based on other data or sensor inputs. The HARRY_README, though, is rather wild.
That warmists’ claim of 20th century as the “warmest ever” continues to amaze me.
1. Viking settlements in Greenland – proof positive it was warmer back then – even though CO2 was fairly low without Exxon and Shell pumping out CO2 from those evil refineries. If CO2 causes warming, then absence of CO2 must cause cooling. Cannot have a valid control system otherwise.
2. Prehistoric man’s body found in melting glacier in the Alps in 1991 – one must wonder how he had the strength (after being shot with an arrow, causing a mortal wound) to dig a hole down through that glacier so he could die underneath it. Or perhaps he died, was covered by snow, and that snow gradually became a glacier? It was warm enough 5,300 years ago in that pass that no glacier existed. Nah, couldn’t be. The warmists told us that it was WAY colder back then…
Maybe it is just me…I have lots of time to think about these things as I run my Wal-Mart floor waxer…actually it’s called a rotary buffer…
REPLY: LOL! – Anthony
Hands up, everyone who wants to chip in to buy the CRU crew (that does sound weird, no?) some T-shirts?
http://www.zazzle.com.au/i_reject_your_reality_substitute_my_own_t_shirt-235174364570624210
Shame Mythbusters beat them to that trademark slogan.
“Well, perhaps that is a place to start. According to John Daly, we know that trees don't grow on 70% of the earth's surface (oceans), and they don't grow in deserts or at high elevations. The 15% of the earth's surface where trees do grow are locations where they may also be impacted by lack of light (other trees) or lack of water (drought), to the extent that temperature cannot be isolated as a cause of growth during any period. It is a false measure.”
And Yamal is an arctic wasteland where trees grow for about 15% of the year.
15% of 15% is just over 2%!
How much of a representative sample of the earth’s climate is that?
See these program headers and code from the hacked material:
http://www.klimadebat.dk/forum/vedhaeftninger/manndeclinecode.jpg
http://www.klimadebat.dk/forum/vedhaeftninger/cutat1960.jpg
In their programs you can set a switch ON/OFF to cut the decline off temperature series after 1960…! Yes, HARD to explain.
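For readers who don't follow the links: in IDL, a switch like that is typically a keyword argument. A hypothetical sketch of the pattern (illustrative only, not the code in the screenshots above):

pro plot_series_sketch, years, vals, cut1960=cut1960
  ; Illustrative sketch only. When the cut1960 keyword is set,
  ; drop everything after 1960 before plotting.
  y = years & v = vals
  if keyword_set(cut1960) then begin
    keep = where(y le 1960, nkeep)
    if nkeep gt 0 then begin
      y = y[keep] & v = v[keep]
    endif
  endif
  plot, y, v
end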
The worry for the Team, shown up in the emails, is that the late 20th century warming differs from the early 20th century warming in two ways. Firstly, tree rings responded to the early warming but not to the later one (at least in the Northern Hemisphere; they did in the Southern).
http://www.climatedata.info/Proxy/Proxy/treerings_introduction.html
…and secondly, the late-century warming is only over land.
http://www.climatedata.info/Temperature/reconstructions.html
This suggests that the recent warming may not be genuine.
As if the code comments and the collusion to avoid FOI requests and data sharing weren't enough to call their political agenda into question, it appears someone at CRU is on the Earth Government mailing list.
The Earth Government main site is here.
But of course they are just scientists with no agenda.
Found via the Anelegantchaos.org search engine.
Further on the CRU documents: has anyone checked out the tellingly titled "Extreme2100.pdf"? It all looks suspiciously like cherry-picked "Yamal 'extreme' tree rings" to a mere ignoramus like myself.
Meanwhile, Seth Borenstein keeps croaking in his swamp, oblivious to all:
Mountain glaciers in Europe, South America, Asia and Africa are shrinking faster than before…. In Greenland and Antarctica, ice sheets have lost trillions of tons of ice…. The world’s oceans have risen by about an inch and a half… Temperatures over the past 12 years are 0.4 of a degree warmer than the dozen years leading up to 1997…. Even the gloomiest climate models back in the 1990s didn’t forecast results quite this bad so fast.
In conclusion, Seth quotes several people with the same environmental disorder:
“The message on the science is that we know a lot more than we did in 1997 and it’s all negative,” said Eileen Claussen, president of the Pew Center on Global Climate Change. “Things are much worse than the models predicted.”
Wow! It's worse than we thought! The sky didn't fall according to the models. Therefore, we need to plug our ears and lie as often and as loudly as we can! Maybe, finally, Heaven will hear us and fall?
Editors of the Associated Press! The whole world is laughing at you! People are e-mailing Seth’s articles to each other as funny stories! Boot Borenstein if you want to restore some modicum of credibility.
The second program I showed was:
\FOIA\documents\osborn-tree6\pl_decline_nerc
artwest (20:54:26) : A poster called Asimov has quoted extracts from the “HARRY_READ_ME.txt” file – deeply shocking stuff:
Asimov’s very plausible suggestion is that Harry is a programmer trying, and often failing, to make sense of the garbage data which he’s been lumbered with.
A note or two for non-programmers:
A “READ_ME” file is something programmers leave for each other (and sometimes for their future self when they return in a year or two to something they once shoved their brains through, but have now thankfully purged… then got assigned it again…)
The idea behind a README file is to tell you all the things that someone (or maybe a prior you) wasted a few hours or days learning. All the silly, stupid, vapid things; and sometimes the really neat but hard to figure out tricks. And sometimes just the nuts and bolts of “how do I make this go”.
So when a README file says things like “Why is the error message showing my squared number has gone negative?” they may be leaving a note for themselves on a future date, or for a fellow programmer a cube or two over to give them a clue. (“Whack them with the clue stick.”)
When it says things like “Don’t run this past 1960 because someone needs to artificially fit a wooden leg” it means “This is a stupid thing, that we cut it off here, but that bozo over there couldn’t make it work right so he’s just going to fudge the last bit and glue it on.” It’s a note to say: Your part is done, but you must go whack that bozo with the clue stick… and don’t worry that it’s broken after 1960, that’s his monkey to spank.
I really do feel for Harry Readme. I, too, found myself swimming in a 'bucket of warm spit' program with GIStemp. My guess is that some Ph.D. with poor programming skills got Harry brought into the group to try to figure out why Mr. Clueless could not make the code go. And Harry had to try to give Mr. Clueless a clue, but the Clue Stick started to get worn out, having been whacked so much…
Messy minds write messy code. Minds subject to deception write deceptive code. Sloppy imprecise minds write sloppy imprecise code. And Harry and I have had to try and mop up the slop.
So you find some arcane, messy, deceptive, sloppy, imprecise, and broken bit of code and FINALLY figure out what it was supposed to do and maybe what it REALLY does, and you put a note in the README file. So no other poor soul will ever need to stick their mind into THAT bit of code where the sun don’t shine ever again… and move on to the next bit of dreck…
So if the HARRY README says something is “corrected” in quotes, he is saying “I don’t buy it, but it’s the word they used. Don’t worry if the code looks like it’s just corrupting the data, Mr. Clueless says it is ‘corrected’ so let it go… I know, and you know, it isn’t, but he’s the guy writing the spec.”
And if he says [ artificially adjusted ] without quotes he is saying “Mr. Clueless is just going to make it up and stick it in here artificially” and you do not need to worry about how, or why, or validity.
And when it says [ look closer to real temperatures ] with no quotes, it means exactly that: all that matters is how it looks, so don't expect any code to try to create this from actual data. It's a hand-tuned job making things look nice, AND it has to look closer to real data but does not actually have to BE real data; so don't waste time looking for data or code that deals with it.
My take on all this is that the Ph.Ds at GISS and UEA / CRU took a FORTRAN class once in the 1980s but are lousy programmers writing crappy code. It’s not their day job, it’s just a “neat trick” they learned to exploit. And their code shows it.
They are like the guy who can build a patio deck, but all the nails are over-driven, and a few are bent over. There are hammer dings in the wood where they sometimes missed. The ends of the wood are not sawn well, the deck was made with green wood with knots in it (which has shrunk as it dried…), and they did not bother to varnish the thing, so it doesn't weather well. Oh, and they used iron nails instead of stainless, so there are lots of rust and iron stains. Finally, they just used 4×4 posts for the foundation. Who needs concrete piers anyway…
Then Harry got brought in a few years later to tell them why the floor creaks and a couple of boards have come loose. And Harry has discovered he needs to explain dry rot, varnish, redwood vs. pine, stainless nails, …
IMHO, they need to have real carpenters build their deck for them… and they certainly ought not to invite the world economy over for a party on top of their present deck full of dry rot … I don’t think it would hold up, and when it collapses, the “guests” are going to be hurt, and very angry…
And Harry seems to know this too.
The newspaper translation from English/German/English gave ‘Team’ as ‘Guild’. The http://www.public.iastate.edu site defines Medieval Guilds as:
* exclusive, regimented organizations;
* created in part to preserve the rights and privileges of their members; and
* separate and distinct from the civic governments, but since the functions and purposes of guild and civic government overlapped, it was not always easy to tell them apart, especially since many well-to-do guildsmen were prominent in civic government.
I suggest we refer to “The Guild” rather than the “The Team” from now on.
This is starting to be noticed at the BBC.
On this morning's Radio 4 Today program, at about 7:35 am, there was a five-minute slot with former Chancellor Nigel Lawson and a Prof. Watson. It was reasonably well balanced, which is surprising given that the presenter, James Naughtie, usually tries to work global warming into everything he can.
Lawson is calling for an enquiry into the University and its handling.
The Today program is the BBC’s prime radio current affairs slot and is “required listening” for most commentators and politicians – it can set the agenda for what is happening in British Politics.
An earlier piece in the program from Roger Harrabin failed to mention it.
It will be very interesting to see what happens from here.
No wonder Mann never wanted his source code or his science open-sourced so that many eyes could gaze upon his obvious attempts at fabrication (allegedly; assuming the files are genuine, which they seem to be).
It's a new programming language, "Smoking Gun!", and he apparently used it to shoot himself in the foot.
Had a quick look at The Times website; there's a major comment piece from Nigel Lawson on their home page – http://www.timesonline.co.uk/tol/comment/columnists/guest_contributors/article6927598.ece
Regarding the Copenhagen Agreement, Einstein said of its findings that the participants were "playing a risky game with reality".
Einstein was actually talking about the uncertainty principle of quantum mechanics, which was debated in Copenhagen back in about 1927, but his remark neatly sums up the forthcoming conference too.
http://en.wikipedia.org/wiki/Copenhagen_interpretation
The guys in this link are financial modelers and have been running through the CRU code and other data for the last couple of days.
Put your dark glasses on for some of the rather [ extremely ] descriptive language about the standards of coding and the modelers' capabilities at CRU.
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625
I am afraid to say that I have not seen, read, or heard one skerrick about the CRU email hacking from our tax-funded national broadcaster.
Today, in lieu of that, you can learn that the Antarctic Ice Sheet is losing mass:
http://www.abc.net.au/news/stories/2009/11/23/2750931.htm?site=news
This selective reporting of stories that support staff-driven agendas, while neglecting ones that do not, is in itself a side story as scandalous as the disclosure of the CRU's modus operandi.
Nick Stokes (20:34:52) : 22/11
Frankly, since you asked, you are being dense when you write, "I may be dense here, but what's the issue? The red comment says 'don't plot beyond 1960', because the results are unreliable."
The full quote in the programmer’s code is –
“Uses “corrected” MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
There’s no mention of “unreliable”.
There is mention of “artificially adjusted to look closer….”
Why not leave off inventing excuses for a while and post what you think is the most harmful part of the hacked information, from the point of view of proper scientific conduct?
I found this in one of the emails – does this mean anything for the post-1960 data?
“One other thing – MOHC are also revising the 1961-90 normals.”
(MOHC is Met Office Hadley Centre.) Does “revising the normals” mean “inventing the data”? I’m just beginning to wonder. There seems to be a discussion of fudging numbers in the email.
http://www.anelegantchaos.org/emails.php?eid=1017&filename=1254147614.txt
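For what it’s worth, a “normal” here is just a 30-year base-period mean (1961–90) that gets subtracted from the raw series to turn temperatures into anomalies, so “revising the normals” need not mean inventing data. A minimal Python sketch (my own toy illustration with made-up numbers, not MOHC code) of what a revision does and does not change:

# anomaly_demo.py - a "normal" is a base-period mean subtracted from raw
# temperatures to make anomalies. Revising the normals shifts every anomaly
# by a constant; by itself it rewrites neither the measurements nor the trend.
temps = {year: 14.0 + 0.01 * (year - 1950) for year in range(1950, 2010)}  # toy data

def normal(data, start, end):
    """Mean of the series over the base period [start, end]."""
    vals = [v for y, v in data.items() if start <= y <= end]
    return sum(vals) / len(vals)

old_normal = normal(temps, 1961, 1990)
new_normal = old_normal + 0.05          # a hypothetical revision of the normals

for year in (1960, 1990, 2009):
    print(year,
          round(temps[year] - old_normal, 3),    # anomaly against old normals
          round(temps[year] - new_normal, 3))    # anomaly against revised normals

Whether any particular revision is innocent is, of course, exactly the question being asked.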
>>Gene Nemetz (22:41:33) :
>>>They feel they can’t go down. The Titanic couldn’t sink too.
No, no, no.
What they can really feel is rapidly shrinking and withdrawn government grants. The cold winds of economic reality will soon be blowing through the sparse remnants of the Yamal tree studies. It will be back to climate tokenism, just for the political ‘green-sheen’.
>>Another strange happening at Hadley … all the hadcrut3 data for this year, except Jan/Feb, has been deleted.
>>>>But of course they’d delete it, wouldn’t want the very inconveniently record cold Northern Hemisphere October 2009 muddying their “warming”.
Gregg,
Recent months actually showed an increase in temperature in hadcrut3. Perhaps Jones et al realised that the data did not reflect reality?
TIM
You are obviously one of those programmers I used to sack.
Comments were NEVER designed for the purpose you describe, you ;;;
I can’t wait until HadCRUT itself is debunked. We know that many stations are bad, but there is no doubt some “added value” algorithm which adds 0.15 °C here or there.
What will be left if the historical and present temperature records are blown away? Just pre-programmed PlayStation climate models.
Cheers
Phil
You delusionals are quite funny.
To build your case with this data you have to perform a couple of ‘tricks’ of your own.
1. Make the unstated assumption that the paleoclimate data is the whole of the CO2-forced global warming story.
2. Home in on increasingly small parts of the story and ignore the big picture. You have to do this because the big picture doesn’t support your position at all.
3. Fail to respond to counter arguments from the originators of the research.
Vested interests and delusionals leading the ignorant here; now that you are starting to lose your grip on policy makers internationally, it’s becoming amusing to watch you – if we’ve got the time, that is.
There is not much difference between the CRU and UAH/RSS datasets:
CRU/UAH difference:
http://www.gfspl.rootnode.net/BLOG/wp-content/uploads/2009/11/cruuah_trend.png
So it’s not warming at all. The theory of the gnomes having reduced the freezing point of water is true after all 🙂
Policyguy (21:21:18) : The 15% of the earth’s surface where trees do grow is in those locations where they may also be impacted by lack of light (other trees) or lack of water (drought), to the extent that temperature cannot be isolated as a cause of growth during any period. It is a false measure.
And don’t forget my favorite “confounder”: bear poo. A recent (peer-reviewed) article found that bears eat salmon and, well, do what a bear with a full tummy eventually does in the forest. Despite all the times folks ask “Does a bear, um, poo in the woods?”, the answer is that they do.
Well, said bears account for as much deposited fertilizer as would be applied in a commercial tree farm. So depending on the size of the salmon run, the bear population, and the distance to the streams with said salmon in any given year (i.e. rainfall and stream flow matter), trees will get more or less growth depending on the “Poo Profile” …
To calibrate your trees, you have to have calibrated your poo deposition over time…
Jesse (21:24:11) : Once again, you guys are making mountains out of ant hills.
Ever found yourself standing on a Fire Ant hill? Visit Texas… bare foot… Do not underestimate the impact of an “ant hill”.
This is just normal data processing
No Way. Not even close. Nothing about this code matches ANYTHING I would class as “normal data processing”. (And yes, I’m a “professional” at it).
per the 1960 divergence problem as shown by NUMEROUS sources.
If you have 1960 divergence, then you could have 1860 divergence, and 1760, and 1260 and … Toy Broke.
This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists.
From what I’ve seen, the climate “uninformed amateurs” know more about this than the “real scientists”. I, for one, can write code better than anything these “real scientists” have produced, and I have a much better grasp of software QA. Or would you agree that these uninformed amateur programmers ought to leave the job to “real programmers”? Hmmm?
Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
Well, it would seem your level of expertise is that of “wrassling coach”… Or would that be “writer for Jerry Springer”?
REPLY: You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A
OK, I’ll follow that lead:
Highest ranks attained: Corporate President. Board Member (one public, one private).
Favorite rank: Director of Information Services.
Paper Trail: IPCC CDP, State of California Community College Lifetime Teaching Credential Data Processing & related. Bachelors and more.
Education: Through about 20 Units of masters level (needed for credential, plus MBA work). More “commercial classes” than I can track. 12 Units of doctoral level (it’s a long story involving money and time…)
Experience includes several years teaching I.T. at a local silicon valley college.
Over 30 years professional data processing experience in the private sector including managing a Cray supercomputer site (and building out same) and including being a professional programmer for 2 decades+ including being a DBA and being a professional consultant on mainframe databases. FBI check passed to work in securities field and employed in same at stock brokerages and Federal Reserve Bank. Taught computer forensics. Conducted security audits.
Managed Software QA for a compiler tool chain company.
Managed Software build for a network router appliance company (Build Master) with a commercial product.
Produced and shipped production software and documentation through many product cycles.
and a whole lot more…
So I think my “software professional” status stacks up pretty well against your “real scientists” who are amateur programmers. Perhaps they ought to leave programming to the “real computer scientists” and go back to their day jobs as custodians… no, wait, they lost their data, so they are not qualified to be custodians of things…
It’s just a matter of time before Mr Jones and Mr Mann, and perhaps others, move on.
For those wondering about The Troubles With Harry, here is his picture:
http://www.cru.uea.ac.uk/cru/people/photo/harry.jpg
Looks like a nice enough guy. I’d like to buy him a beer for his efforts with The Code.
I wonder whether the Harry readme file coincides with a new release of the HADCRUT algorithm. As I understand it, every few years a new version of the software/algorithm/data is released, and it typically shows more warming than the last one. Could it therefore plausibly be claimed that the Harry readme invalidates a particular HADCRUT release?
Of course, that is not to say that the previous one worked okay either.
“Kevin Trenberth, of the US National Center for Atmospheric Research (NCAR) in Colorado, whose e-mails were among those accessed, said the timing of the hacking was “not a coincidence”.
He told the Associated Press News agency 102 of his emails had been posted on the internet and he felt “violated”.
Critics say the e-mails show that scientists have distorted the facts of climate change, but Mr Trenberth said the e-mails had been “taken out of context”.
The above is the last paragraph from http://news.bbc.co.uk/1/hi/world/europe/8373551.stm
So, he feels violated, eh? Well, sport, we feel we have been conned by you and your deceitful cronies, and you all richly deserve your collective fates. What are the future employment prospects for people like you? Nil, if there is any justice.
E.M.Smith (00:47:58) :
Please be advised that bears poo in the woods for only about 5 months. During their hibernation, bears neither defecate nor urinate. Normally the nitrogenous wastes accumulating during that time would poison the urinary system, but they do not, because the bear solves its nitrogenous waste problem by a form of recycling: “The hibernating bear’s body diverts nitrogen from pathways that synthesise urea into pathways that generate amino acids and new proteins, and it does this by using glycerol (produced when fats are metabolised) and recycled nitrogen as the building blocks,” according to New Scientist magazine of February 1985.
Another MSM article. The Mail dropped their previous article over the weekend but I found this one on their front page again today.
@ Jesse (21:24:11) :
Although my title is actually “Janitor”, I do play a computer scientist on TV (almost 30 years). On TV, I have worked for the DOD, Navy, DHS, software companies, law firms, accounting firms, manufacturers….
But, back to reality, gotta go wash some windows…
[/sarc]
“Lord Lawson calls for public inquiry into UEA global warming data ‘manipulation'”
on BBC radio reported by the telegraph:
http://www.telegraph.co.uk/earth/environment/globalwarming/6634282/Lord-Lawson-calls-for-public-inquiry-into-UEA-global-warming-data-manipulation.html
TRICK or TREAT
Just in case no one has posted it above, it’s hit the Daily Telegraph mainstream also.
http://www.telegraph.co.uk/earth/environment/globalwarming/6634282/Lord-Lawson-calls-for-public-inquiry-into-UEA-global-warming-data-manipulation.html
Roger Harrabin’s Notes: E-mail arguments on BBC
http://news.bbc.co.uk/1/hi/sci/tech/8371597.stm
Note that he has plenty of CRU contacts to assist him in putting this revelation in “perspective” but to be fair he is also quoting Myron Ebell, a climate sceptic from the Competitive Enterprise Institute.
No sunspots today.
But if you want, I’m sure I could produce some.
To calibrate your trees, you have to have calibrated your poo deposition over time…
[snip, oh come on ~ ctm]
🙂
A good article in The Times this morning from Lord Lawson about Copenhagen. Mentions the CRU leaks as well.
I am absolutely loving this! He he!!! The foul stench of bovine faecal contaminated science from the CRU is stomach-turning. Of course they’ll point to the recent bad storms in Cumbria as hard evidence of Climate Change, despite better examples from the past over on the An Englishman’s Castle website.
I think the late, great Sir Walter Scott summed it all up rather well: “Oh what a tangled web we weave, when first we practise to deceive”!
Never, never, never give up! (Sir Winston Churchill).
Lord Lawson announced this morning:
About time!
kdkd wrote:
Vested interests and delusionals leading the ignorant here; now that you are starting to lose your grip on policy makers internationally, it’s becoming amusing to watch you – if we’ve got the time, that is.
Interesting. I would have written the same, word for word, speaking to the green crowd.
Who has a vested interest in green scaremongering? Skeptics? No, the environmentalist totalitarians. That’s why they always suppress any talk of the nuclear solution to energy and pollution problems.
Who is losing their grip on policy makers internationally? Al Gore & Co. Skeptics never had such a grip.
Who is becoming the laughing stock of the world? Briffa, Jones, Overpeck, Mann, Al Gore et al. Yes, it is very amusing to watch them now, running around like cockroaches on the hot skillet, their shenanigans exposed.
Who is delusional? Those who respect the facts and have the guts to admit that they don’t know enough about climate to be able to predict it, or those who believe in their own omniscience and ascribe to humanity a disproportionate influence over processes of astronomical scale?
Who judges people not by what they do but by whom they know, not by the correctness of their predictions but by pre-orchestrated, fraudulent “peer reviews”?
Finally, who is always afraid to post under their real names, and has neither respect nor tolerance, nor even elementary human decency, toward their opponents?
Mann smells like a duck’s behind…
http://www.timesonline.co.uk/tol/comment/columnists/guest_contributors/article6927598.ece
I think this is an important article.
JNL (22:29:08) : I’m a statistical programmer for “BIG PHARMA” . For every new drug application, the FDA requires that we give them:…
Nice List.
FWIW, I’ve done “qualified installs” for Pharma companies.
What the non-Pharma folks might not know: For every single bit of hardware and software used for all the stuff JNL listed, it must be installed “just so”. Every Single Step of Every Single Procedure must be defined in advance. Even if it is just “Open box. Plug in cord. Turn power switch on.”.
A “Qualified Install” has a predefined process and it has an implementor. It also has a Manager (that was me) and a QA officer (that may have been specific to the particular company, but the rest is FDA mandated).
The Manager watches the Implementor (i.e. systems admin) do each step.
Each step must be done EXACTLY AS WRITTEN. Then both the sysadmin and the manager sign off the step. At the end of the entire process a PREDEFINED QA process is performed and the output must match EXACTLY.
Then the manager gets to sign off the whole package and hand it over to the Company QA guy (who was watching over the shoulder of the manager watching over the shoulder of the SysAdmin…).
The whole package is copied, filed with the company, and sent to the FDA.
This is so that they can exactly duplicate everything the drug company did, including “Open the box, plug it in, turn on power”… (Though that second list would not pass a “Qualified Install” since I used commas instead of periods and said “turn on power” instead of “Turn power switch on”; it did not match my first list… Yes, it IS that picky…)
So not only must all the stuff JNL listed be sent to the FDA, but also every single bit of hardware assembly, software installation, software configuration, (the works) must be a “Qualified Install” and documented. And lord help you if NetApp changes the power-on button from Red to Orange and your Qualified Install says “Turn On Red Power Button Lower Left”…
IF the FDA decides to test something you sent, and the Qualified Install docs don’t match what they experience when, oh, booting up a Sun Server, guess what: Your drug gets rejected until you get it right… So when you say “install Solaris” you’d better have the exact release number noted and it better behave exactly the same each time…
So that is what you must do if you want to sell an aspirin with a new type of inert binder in it, or even just want to make a “kosher aspirin” with an enteric coating blessed by a rabbi …
But enslave the world with carbon taxes? Destroy world economies? Claim thermageddon is happening now? That can be done with completely undefined and substantially broken software with no comments, no procedures, irreproducible runs (as the comments in HARRY_READ_ME show), and with no clue whether the product works.
Grab an aspirin tin / bottle and think about it, for just a few moments…
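To make the “must match EXACTLY” rule concrete, here is a minimal Python sketch of the idea (my own illustration, not an FDA tool; the filename and expected digest are placeholders): the QA output of a qualified install is compared byte-for-byte against a reference recorded when the procedure was defined.

# qa_match.py - toy illustration of an exact-match QA gate: any drift in
# the install's QA output, even punctuation, fails the qualification.
import hashlib

def digest(path):
    """SHA-256 over the file's exact bytes."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

EXPECTED = "placeholder-digest-recorded-when-procedure-was-defined"

if digest("qa_output.txt") == EXPECTED:       # hypothetical QA output file
    print("QA output matches EXACTLY - implementor and manager sign off")
else:
    print("QA output differs - install is NOT qualified; start over")

Contrast that gate with code whose runs, per HARRY_READ_ME, cannot even be reproduced.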
“If tree ring-based temperatures are known to be false compared to actual measurements, then how can they be true in earlier decades or centuries?”
Because no one can prove otherwise, silly.
E.M.Smith (00:47:58) :
Ever found yourself standing on a Fire Ant hill? Visit Texas… .
I did that once. ONCE.
Mike in Houston
savethesharks (23:13:36) :
CORRECTION: The “we scientists” part is extremely pungent given the fact that, empirically speaking, some of the first individuals to cave INTO the Third Reich….were the scientists.
So “into” not “in”.
“in to” is correcter.
Mike, grammar n*zi
Anyone heard any comments from M&M?
I always look forward to Nick Stokes’ contributions; he must be nearly the hardest-working supporter of the increasingly dishevelled AGW edifice; certainly he is the politest, admittedly against poor opposition, and he must have the constitution of a Mallee bull, having to digest the tripe, offal and dreck that passes for AGW evidence these days.
I’ve also been fascinated by the divergence ‘problem’ to which Nick has applied his cudgels on this thread. Here we have AGW, proxified through history with all sorts of weird and wonderful samples and series correlated with each other according to strange incantations, with all the power of quantum exotics; in this way the wizards of these bits of yore and fairy stuff can construct a mighty hafted stick, as strong as any Bradman bat, to slay the doubters. But when we hit 1960, the start of the dreaded AGW, the magic disappears, and the proxy path, so firm in the past, wilts like an old man’s viagra-less dreams. It just isn’t fair that in the modern era, when we have access to all the tree-rings we can shake a stick at, none of them work and we have to instrumentalise the modern duds up to speed; typical younger generation!
Found a quotlet on another blog…*
“…we are having trouble to express the real message of the reconstructions – being
scientifically sound in representing uncertainty…”
It’s like they’re channeling the Hitchhiker’s Guide to the Galaxy. “We demand rigidly defined areas of doubt and uncertainty!” 😉
*I have too slow a net connection to download the whole lump of stuff.
“Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.”
This is the sort of comment that those who know their argument cannot stand up to logic will resort to. It fits in with so-called scientists who refuse to make public the raw data, code, and methods used to obtain the hockey stick.
Without public transparency, there can be no science. The idea of putting your results out there for all to try to discredit is the very heart of the scientific method. (or at least, that is what they said in the science courses I took)
Does the truth not matter to these alarmists?
LORD LAWSON CALLS FOR PUBLIC INQUIRY INTO UEA GLOBAL WARMING DATA ‘MANIPULATION’
Lord Lawson, the former chancellor, has called for an independent inquiry into claims that leading climate change scientists manipulated data to strengthen the case for man-made global warming.
http://www.telegraph.co.uk/earth/environment/globalwarming/6634282/Lord-Lawson-calls-for-public-inquiry-into-UEA-global-warming-data-manipulation.html
Fry (00:46:31) :
“There is not much difference between the CRU and UAH/RSS datasets”
Of course not, because the difference would be too difficult to explain.
Pre-satellite-era data fiddling is free territory for CRU/GISS, so they can argue that the climate has been warming over recent decades.
The BBC coverage has reopened, with the proviso that: “We have now re-opened comments on this post. However, legal considerations mean that we will not publish comments quoting from e-mails purporting to be those stolen from the University of East Anglia, nor comments linking to other sites quoting from that material.”
See http://www.bbc.co.uk/blogs/thereporters/richardblack/2009/11/copenhagen_countdown_17_days.html#comments
It would appear that the BBC is not going to give this any publicity. Any surprises there?
I have an idea of where these files may have come from. Many e-mail programs store each “folder” of messages and attachments in a single file with a simple indexing scheme.
When an e-mail is deleted, it’s not actually removed from the “folder” file; only the e-mail’s pointer in the index is removed. To actually DELETE the e-mails, the “folder” file has to be compacted or purged or whatever term the mail client software uses. There are many mail recovery programs that can quickly and easily build an index of all the deleted messages in a mail client’s trash/deleted “folder” file, though they may not be able to fully recover the complete headers.
This “feature” is one thing computer forensics often uses to find evidence on computers.
In spite of this being fairly common knowledge amongst people with mid to high level computer skills, it’s surprising how few actually bother to ensure their mail clients are configured to automatically purge/compact deleted messages.
This could be the e-mail analogue of a famous case where a murder suspect sneaked a pair of pinking shears into an interrogation room where the police had brought the actual floppy disks they’d obtained from his house. (Had they been a bit more on the ball, they’d have stuck blank disks into the sleeves or made backup copies first.) The suspect managed to chop the disks into pieces, but other people were able to develop a process to put them back together well enough to recover fragments large enough to prove the suspect had written a lot about the murder. A powerful magnet would have been a better way to destroy the incriminating evidence.
Pinking shears = “deleting” e-mail. Big magnet = purging the deleted e-mail.
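For the curious, the recovery really is that trivial. A minimal Python sketch (my own illustration; it assumes an mbox-style folder file where each message begins with a “From ” separator line, which is one common convention, not every client’s): a recovery tool walks the raw bytes instead of trusting the client’s index, so “deleted” messages turn up like any other.

# scan_folder.py - index every message boundary in an mbox-style folder file
# by scanning raw bytes; "deleted" messages are still present in the file
# until the client compacts it, so this finds them like any other message.
def index_messages(path):
    """Return (byte_offset, separator_line) for each message found."""
    found = []
    with open(path, "rb") as f:
        offset = 0
        for line in f:
            if line.startswith(b"From "):    # mbox message separator
                found.append((offset, line.rstrip().decode("ascii", "replace")))
            offset += len(line)
    return found

if __name__ == "__main__":
    for off, sep in index_messages("Trash.mbox"):    # hypothetical file name
        print(f"message at byte {off}: {sep}")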
JNL wrote:
“The FDA, acting as a public protector, has to assume we are trying to “cheat” (and that is a reasonable approach)… But then again, we are the evil, capitalist, profit-seeking, “BIG PHARMA” and the people need to be protected from us.”
I agree – the FDA is only “acting as” etc.
But isn’t it possible that the FDA and “BIG PHARMA” have BOTH lost “some” credibility “lately”? And by lately I mean over the past few decades…
Does the “revolving-door” policy mean anything to people working for “BIG PHARMA” – and I mean other than “career opportunities”?
Well, just wondering – and yes, I realize this is OT etc. – and I promise not to continue… 😉
E.M.Smith (00:47:58) :
“To calibrate your trees ……….”
You are absolutely right. One factor regarding forest growth that can’t easily be accounted for is the effect of excessively strong winds and hurricanes. I do regular walks in woodland that was affected by the 1987 hurricane in South East England. In some areas where trees were more exposed, most of the large mature oak and beech trees were uprooted, while in the sheltered parts they survived.
Young saplings from 1987 are now established trees in all areas, but in the exposed parts, where the mature trees were uprooted, the young ones are now nearly twice the size of those of the same age in the sheltered parts. This could be attributed to the extra sunlight and nutrients available where the large old established trees were uprooted.
kdkd (00:44:40)
Can you fill us in on the big picture? For example, can you tell us how high sea levels were in the MWP? Can you give us the Arctic ice extent for the ’40s? Do you know when the forward speed of Greenland’s glaciers peaked? We have all heard the litany of how things are worse than ever and accelerating, but what is lacking in those litanies is detail and long-term perspective.
Some more files with the same comment….
./FOIA/documents/osborn-tree6/summer_modes/hovmueller_lon.pro
./FOIA/documents/osborn-tree6/summer_modes/maps24.pro
./FOIA/documents/osborn-tree6/summer_modes/maps_general.pro
./FOIA/documents/osborn-tree6/summer_modes/maps1.pro
./FOIA/documents/osborn-tree6/summer_modes/maps15.pro
./FOIA/documents/osborn-tree6/summer_modes/maps1_movie.pro
./FOIA/documents/osborn-tree6/summer_modes/maps1_poster.pro
./FOIA/documents/osborn-tree6/summer_modes/maps12.pro
./FOIA/documents/osborn-tree6/mann/oldprog/hovmueller_lon.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps24.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps1.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps15.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps1_movie.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps1_poster.pro
./FOIA/documents/osborn-tree6/mann/oldprog/maps12.pro
INSIDERS?
Interesting that the file entitled “FOIA.zip” arrived at Jeff Id’s blog (Nov. 13) with a posting that asked 18 leading US scientific associations about their letter of Oct. 21, 2009 to the US Senate, at http://www.whatisclimate.com/
Would Russian hackers have done that?
Link to interview [0735] with Nigel Lawson and Robert Watson on R4 Today programme
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373594.stm
I’ve never heard Robert interviewed before but he did seem a little defensively assertive to my ears.
Alec J (23:59:14) :”This is starting to be noticed at the BBC.
On this morning’s Radio 4 Today program at about 0735am there was a five minute slot with Former Chancellor Nigel Lawson and a Prof Watson. It was reasonably well balanced – surprising given that the presenter James Naughtie usually tries to work Global Warming into everything he can.”
It’s now online here for today
http://news.bbc.co.uk/today/hi/listen_again/default.stm
at 0735
Prof Watson starts by saying “These scientists at East Anglia are both honourable and world class. Their data is not being manipulated in any bad way”.
Was Yamal not bad? How does honourable square with ‘Not allowing data release because people will try to find fault with it’?
Discuss.
tim (23:24:49) :
Well, tim, there is a big problem with what is indicated in HARRY_READ_ME.txt and the present records for historical temp data.
The skullduggery does not start and end at CRU.
Great damage has been inflicted on irreplaceable(?) data.
Somebody gave the warmists the means to destroy data, and they did just that.
The author of HARRY_READ_ME understood the gravity of what happened to the data he was given to work with.
Nope, nothing wrong with the code; it finished the job of data destruction as designed.
The Librarian at Alexandria weeps yet again, for the same type of people have once more robbed history.
The news claims that “hackers” broke in and stole emails/data.
The real “hackers” destroyed science data long before that, under the guise of science.
I’m just imagining some of the behind the scenes conversations (probably by phone 😉 ) between the interested parties that must be going on now. I’d think the mother of all ‘damage limitation’ plans is being drawn up.
In the same vein, it’s amusing to see the usual ‘nothing to see here, move along’ articles appearing in the various pro-AGW newspapers and blogs (either that or the ostrich ‘it never happened’ behaviour as typified by the BBC). However, try as they might, I can’t see this one going away…
P Gosselin (01:33:38) :
No sunspots today, and for most of the past 2 weeks those that have been ‘officially listed’ as sunspots were visible only to SOHO.
We really should be talking about the Sun, but a very sad day has dawned with the realisation that Science Barbarians have sacked and burned irreplaceable data worldwide in an effort to support a political bent.
When you look at the unsmoothed proxy data, it looks like noise. Only when the data has been smoothed, and only those proxy series that match twentieth-century temperatures (as measured by thermometers) are included, do you begin to get something that looks like a hockey stick. Everything else is discarded.
Even when they get something that looks like a hockey stick, it has a divergence problem that needs a ‘trick’ to hide it. This trick is to truncate the data and substitute the thermometer data that was used to ‘calibrate’ the original data.
Lots of reasons have been put forward as to why this is necessary and, from some, why it doesn’t matter, but for me the most convincing reason, the one that old Occam would adopt, is that the original, unsmoothed, unadjusted, uncalibrated data is correct.
It isn’t temperature the proxies are showing, it’s noise.
When you’ve thrown away 90% of the data in the smoothing and calibrating process, and what you’ve got left still needs padding with data from another source, then you don’t have anything at all.
No wonder they needed to rig the peer review process and bully the journals.
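You can demonstrate that last point to yourself in a few lines. A toy Python sketch (my own illustration, not anyone’s reconstruction code): generate pure random walks, keep only those whose final century correlates with a rising “instrumental” record, and average the survivors – out comes a hockey stick manufactured entirely from noise.

# screening_demo.py - averaging noise series selected for agreement with a
# rising calibration record yields a flat "shaft" and a rising "blade".
import random
random.seed(1)

N_SERIES, N_YEARS, CAL = 1000, 600, 100      # CAL = "instrumental" window

def random_walk(n):
    x, out = 0.0, []
    for _ in range(n):
        x += random.gauss(0, 1)
        out.append(x)
    return out

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

target = list(range(CAL))                    # steadily rising "temperature"
kept = [w for w in (random_walk(N_YEARS) for _ in range(N_SERIES))
        if corr(w[-CAL:], target) > 0.5]     # "calibration" screening step

mean = [sum(col) / len(kept) for col in zip(*kept)]
print(f"kept {len(kept)} of {N_SERIES} pure-noise series")
print("mean over first century:", round(sum(mean[:CAL]) / CAL, 2))
print("mean over last century: ", round(sum(mean[-CAL:]) / CAL, 2))

The shaft flattens because the unselected centuries of independent walks average toward zero; the blade rises because that is precisely what the screening selected for.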
The BBC loses all credibility.
An appalling and disturbing piece of nepotism:
http://news.bbc.co.uk/2/hi/science/nature/8371597.stm
E.M.Smith (00.47.58)
Still got access to that Cray? Wishful thinking.
Posted this on CA, but I see I’m not the only one wondering here. I think it is a dataset merge and then an attempt at normalizing, but some of it is by year, so it’s pretty weird.
Weirdest? This is printed when the program runs. Sometimes when programming you fudge like this to get clues to where you might be off; it’s more legit if it warns you on output. Still, the final part sounds like they aren’t about to change it. Not sure what to think.
printf,1,’NOTE: recent decline in tree-ring density has been ARTIFICIALLY’
printf,1,’REMOVED to facilitate calibration. THEREFORE, post-1960 values’
printf,1,’will be much closer to observed temperatures then they should be,’
printf,1,’which will incorrectly imply the reconstruction is more skilful’
printf,1,’than it actually is.
….original msg….
What is this ‘decline’ thing anyway? It is in a lot of code, and seems to involve splicing two data sets, or adjusting later data to get a better fit. Mostly (as a programmer) it looks like a ‘magic number’ thing: your results aren’t quite right, so you add or multiply by some constant rather than deal with the real problem. Aka “a real bad thing to do” : ).
\FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro
printf,1,’Osborn et al. (2004) gridded reconstruction of warm-season’
printf,1,’(April-September) temperature anomalies (from the 1961-1990 mean).’
printf,1,’Reconstruction is based on tree-ring density records.’
printf,1
printf,1,’NOTE: recent decline in tree-ring density has been ARTIFICIALLY’
printf,1,’REMOVED to facilitate calibration. THEREFORE, post-1960 values’
printf,1,’will be much closer to observed temperatures then they should be,’
printf,1,’which will incorrectly imply the reconstruction is more skilful’
printf,1,’than it actually is. See Osborn et al. (2004).’
\FOIA\documents\osborn-tree6\briffa_sep98_d.pro
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
\FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro
; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
; but not as predictands. This PCR-infilling must be done for a number of
; periods, with different EOFs for each period (due to different spatial
; coverage). *BUT* don’t do special PCR for the modern period (post-1976),
; since they won’t be used due to the decline/correction problem.
; Certain boxes that appear to reconstruct well are “manually” removed because
; they are isolated and away from any trees.
\FOIA\documents\osborn-tree6\combined_wavelet_col.pro
;
; Remove missing data from start & end (end in 1960 due to decline)
;
kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
sst=prednh(kl)
\FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
\FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
\FOIA\documents\osborn-tree6\mann\oldprog\pl_decline.pro
;
; Now apply I completely artificial adjustment for the decline
; (only where coefficient is positive!)
;
tfac=declinets-cval
fdcorrect=fdcalib
for iyr = 0 , mxdnyr-1 do begin
fdcorrect(*,*,iyr)=fdcorrect(*,*,iyr)-tfac(iyr)*(zcoeff(*,*) > 0.)
endfor
;
; Now save the data for later analysis
;
save,filename=’calibmxd3.idlsave’,$
g,mxdyear,mxdnyr,fdcalib,mxdfd2,fdcorrect
;
end
\FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro
;
; Plots density ‘decline’ as a time series of the difference between
; temperature and density averaged over the region north of 50N,
; and an associated pattern in the difference field.
; The difference data set is computed using only boxes and years with
; both temperature and density in them – i.e., the grid changes in time.
; The pattern is computed by correlating and regressing the *filtered*
; time series against the unfiltered (or filtered) difference data set.
;
;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
Brazen!
“We also have a data protection act, which I will hide behind.”
http://directorblue.blogspot.com/2009/11/milli-vanilli-of-science-hacked-emails.html
Sample from Harry_readme.txt
“In other words, the *anom.pro scripts are much more recent than the *tdm
scripts. There is no way of knowing which Tim used to produce the current
public files. The scripts differ internally but – you guessed it! – the
descriptions at the start are identical. WHAT IS GOING ON? Given that the
‘README_GRIDDING.txt’ file is dated ‘Mar 30 2004’ we will have to assume
that the originally-stated scripts must be used. ”
I could help the Hadley Harrys with this. There is this software thing called “revision control”, and I could set it up for them in an afternoon (CVS, SVN, Git, Bazaar… whatever they like). In two afternoons I could set up a really neat, scriptable control system. They could spend more time debugging and less time on [snip]
Thing is, I can’t give up my day job right now – I just got a promotion and from today I am in charge of the rotary buffer! W00t! Wal-Mart rocks.
“Alan the Brit (01:37:00) :
I am absolutely loving this! He he!!! The foul stench of bovine faecal contaminated science from the CRU is stomach-turning. Of course they’ll point to the recent bad storms in Cumbria as hard evidence of Climate Change,”
They already have. 10 o’clock news and again on Today prog this morning.
It’s interesting, the BBC’s confused policy at the moment. They reopened their blog site this morning with the specific message that no links to or extracts from the emails will be allowed. Yet Nigel Lawson was allowed to talk about it on BBC R4 this morning! (already linked above) And in The Mail this morning, 3 pages of stuff about it: a new story on Phil Jones, “Pioneer or Junk Peddler”, and then a Christopher Booker piece of 2 pages.
Nigel Lawson also has a piece in The Times this morning. So it seems it’s only Auntie Beeb fighting its internal conflict on this issue, its editorial policy having been compromised by its Editor-in-Chief, the DG Mark Thompson, being hoodwinked by Al Gore back in 2007, when Thompson attended Gore’s personal presentation of his flawed PowerPoint to BBC staff. Ever since then, he has dictated their current toothless policy from the top – in my opinion.
Time for him to go. Time for The BBC Trust to get involved.
It says important note, but I guess I missed the memo.
\FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,’IMPORTANT NOTE:’
printf,1,’The data after 1960 should not be used. The tree-ring density’
printf,1,’records tend to show a decline after 1960 relative to the summer’
printf,1,’temperature in many high-latitude locations. In this data set’
printf,1,’this “decline” has been artificially removed in an ad-hoc way, and’
printf,1,’this means that data after 1960 no longer represent tree-ring
printf,1,’density variations, but have been modified to look more like the
printf,1,’observed temperatures.’
As a software engineer with 30 years’ experience, some of it working with government scientists, I find that code horribly, horribly familiar…
The problem is that research scientists have done a programming course at some point in time. 99% consider themselves good coders as a result. 98% of them are wrong…
The initial flaw is in the way they intend to use the software: it is only for them (often not even for their colleagues), and as such is completely uncontrolled. Changes are made without any record, changes on changes… and after a while they aren’t sure any more why things happen the way they do…
Documentation? We don’t need that, it’s my programme, I know what it does. Maybe. Will you in 5 years? Experience shows you don’t…
This code is a classic example of that way of programming. Fortunately, much of this type of coding is used by only one person, is not designed as input to anything critical, and serves merely as an aide for a researcher for whom results trump everything. So while it’s bad practice, it usually doesn’t have too many disastrous effects. This time, however, it’s being used for predictions costing hundreds of billions of dollars…
Monckton is absolutely correct: we need to take the raw data and the calculations, and build new, verified models and data sets to see what is happening BEFORE we spend all this money. If these DO show AGW, fair enough. My money is on any AGW being so small it’s lost in the noise.
Climate Change? Hah.
THIS is what I call real Climate Change:
Date | Id | Name | State | Latitude | Longitude | Max Temp (ºF) | Min Temp (ºF) | Obs Temp (ºF) | Precipitation (inches)
1892-01-03 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-04 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-05 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.02
1892-01-06 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-07 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.54
1892-01-08 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.32
1892-01-09 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.05
1892-01-10 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-11 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-12 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-13 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-14 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.005
1892-01-15 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.005
nothing but precip data until…..
1934-09-27 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-09-28 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-09-29 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-09-30 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-10-01 47290 RED BLUFF CA 40.1833 -122.233 85.0 51.0 79.0 0
1934-10-02 47290 RED BLUFF CA 40.1833 -122.233 82.0 51.0 79.0 0
1934-10-03 47290 RED BLUFF CA 40.1833 -122.233 84.0 54.0 76.0 0
1934-10-04 47290 RED BLUFF CA 40.1833 -122.233 91.0 50.0 76.0 0
1934-10-05 47290 RED BLUFF CA 40.1833 -122.233 100.0 47.0 79.0 0
1934-10-06 47290 RED BLUFF CA 40.1833 -122.233 82.0 62.0 66.0 0
1934-10-07 47290 RED BLUFF CA 40.1833 -122.233 74.0 54.0 64.0 0.11
crua6[/cru/cruts/version_3_0/db/testmergedb] grep -n ‘RED BLUFF’ tmp.0*.*
tmp.0612081519.dat:28595: 725910 401 1223 103 RED BLUFF USA 1991 2006 101991 -999.00
tmp.0702091122.dtb:171674: 725910 401 1223 103 RED BLUFF USA 1878 1980 101878 -999.00
tmp.0704251819.dtb:200331: 725910 401 1223 103 RED BLUFF USA 1878 2006 101878 -999.00
tmp.0704271015.dtb:254272: 725910 401 1223 103 RED BLUFF USA 1878 2006 101878 -999.00
tmp.0704292158.dtb:254272: 725910 401 1223 103 RED BLUFF USA 1878 2006 101878 -999.00
crua6[/cru/cruts/version_3_0/db/testmergedb]
The first file is the 1991-2006 update file. The second is the original
temperature database – note that the station ends in 1980.
It has *inherited* data from the previous station, where it had -9999
before! I thought I’d fixed that?!!!
Yeah, baby, you fixed it all right.
1 station data set smoked on the CRU barbie.
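Those 9999s are missing-value sentinels, and the grep above shows how easily a merge can drag them (or their -9999 cousins) along as if they were data. A minimal Python sketch (my own illustration, not the CRU merge code) of the only safe way to treat them, using the Red Bluff numbers above:

# sentinels.py - missing-value markers must be masked before averaging or
# merging; otherwise a month of 9999s becomes a blisteringly hot month, and
# a merge can silently "inherit" placeholder values as real readings.
MISSING = {9999.0, -9999.0, -999.0}     # sentinel conventions seen above

def monthly_mean(daily):
    """Mean of a month's readings, ignoring sentinels; None if all missing."""
    real = [v for v in daily if v not in MISSING]
    return sum(real) / len(real) if real else None

october_1934 = [85.0, 82.0, 84.0, 91.0, 100.0, 82.0, 74.0]   # real maxima
january_1892 = [9999.0] * 7                                  # precip-only days

print(monthly_mean(october_1934))    # ~85.43 - genuine data
print(monthly_mean(january_1892))    # None - stays missing, never a number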
As a Software Developer, I know that programmers often impart their stream of consciousness into the code in the form of comments. But from reading the above (particularly debreuil’s quotes), it seems clear to me there’s quite a substantial confirmation bias in their method.
So come on, folks – time to nominate your favourite email. I realise we’re totally spoilt for choice, but which ones stand out?
The ‘Nature trick’ is definitely a contender, and is far more damaging than Phil Jones and the press are trying to make out, but it only confirmed what I knew anyway.
The surprise for me was the Trenberth effort, which includes the immortal lines “but the data are surely wrong. Our observing system is inadequate.”
Comments on sceptic blogs often suggest that the warmers think that if the data doesn’t agree with the models then the data must be wrong. I always thought this was an unfair exaggeration. But it’s true. These people are beyond satire.
The BBC are hiding behind the very flimsy excuse of “legal reasons” for not revealing details of the emails. The excuse is so transparently dishonest that I wonder whether they are so desperate to deny people a chance to see the evidence that they would risk using such an obviously false reason for withholding it.
I suspect that the BBC science and environment departments and reporters are very deeply involved with the scientists at the heart of the scandal; it looks as though the BBC are covering up for the fraudsters for as long as it takes either to construct a fallback story or to let this one fade away.
Whatever the motives of the BBC and their reporters, the longer the delay, the more suspicious that delay becomes. Perhaps the BBC are willing to risk stonewalling, given the damage the material will do to the BBC once it is fully aired.
Interesting quote from the baffled programmer trying to make sense of it all, and finally guessing.. (Harry’s txt)
“…The results are depressing. For Paris, with 237 years, +/- 20% of the real value was possible with even 40 values. Winter months were more variable than Summer ones of course. What we really need, and I don’t think it’ll happen of course, is a set of metrics (by latitude band perhaps) so that we have a broad measure of the acceptable minimum value count for a given month and location. Even better, a confidence figure that allowed the actual standard deviation comparison to be made with a looseness proportional to the sample size.
All that’s beyond me – statistically and in terms of time. I’m going to have to say ’30’.. it’s pretty good apart from DJF. For the one station I’ve looked at….”
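What Harry is groping toward is the ordinary trade-off between sample size and the spread of a mean. A toy Python sketch (my own illustration with synthetic numbers, not CRU data or code) of the very experiment he describes – subsample a long series and watch how far the subsample mean can stray:

# subsample_error.py - how badly can a mean computed from n values miss the
# "true" mean of the full series? This is the experiment Harry describes.
import random
random.seed(0)

series = [random.gauss(10.0, 3.0) for _ in range(237)]   # stand-in for 237 years
true_mean = sum(series) / len(series)

for n in (10, 30, 40, 100, 237):
    worst = max(abs(sum(random.sample(series, n)) / n - true_mean)
                for _ in range(1000))
    print(f"n = {n:3d}: worst error in 1000 subsample means = {worst:.2f}")

The error shrinks roughly as 1/sqrt(n), which is why a cutoff like “30” is defensible but essentially arbitrary – exactly the judgment call Harry ends up making by hand.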
The previous Beeb link didn’t work for me; perhaps this might – the 07:35 am spot:
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373594.stm
Lawson suggests the funding body NERC and the Vice-Chancellor should enquire into the issues raised
– pigs might fly
I have to say this tickled me from HARRY_READ_ME.txt
“So, once again I don’t understand statistics. Quel surprise, given that I haven’t had any training in stats in my entire life, unless you count A-level maths.”
Given the volume of material and the number of serious issues to be considered, how can any of the participants show up in public without facing a barrage of embarrassing questions?
Has the entire AGW community just been neutralised for all practical purposes?
Unless we see an entirely new set of personnel, climate science will be frozen in the spotlight and cannot progress.
“…“…we are having trouble to express the real message of the reconstructions – being
scientifically sound in representing uncertainty…” ”
Now ‘trouble to express’ *does* sound like an ESL grammar error. ‘Trouble expressing’ would be standard grammar.
Nick Stokes (20:34:52) :
I may be dense here, but what’s the issue? The red comment says “don’t plot beyond 1960″, because the results are unreliable. So is there any indication that anyone has plotted beyond 1960? This came up on the Bishop Hill thread, where he drew attention to an email by Tim Osborn where he said that they never plot some treering set beyond 1960 because of a divergence issue. Turns out that that is what Briffa/Osborn say also in Briffa et al 2001. This Briffa/Osborn context may be unrelated, but it seems to me that it may simply just mean what it says. Don’t plot beyond 1960 using this code. And people don’t.
Nick,
I think you are hanging your hat on the paleo/divergence issue. But it looks to me like HARRY_READ_ME is about the code used in CRU TS. I’m not certain about that, but I think we need to know. If so, then the Briffa et al literature acknowledging the divergence in paleo time series really doesn’t apply here. I.e., the “adjust for the decline” in the “Harry” code, and “Mike’s Nature Trick” are two different things.
OK, I haven’t done Fortran in 20 years (and this looks more like IDL anyway), but if I read it right, it is building an adjustment table (two arrays: the first is the year, the second the adjustment), anchored at 1400 and then at 5-year steps from 1904 to 1994. The mid-twenties through the forties are pulled down by as much as -0.3, then the ‘fudge’ turns positive from about 1950, reaching 2.6 from the mid-seventies onwards, with the whole lot scaled by 0.75. It then interpolates this over the data and applies it. Please correct me if this is wrong…
1400 0.
1904 0.
1909 0.
1914 0.
1919 0.
1924 -0.1
1929 -0.25
1934 -0.3
1939 0.
1944 -0.1
1949 0.3
1954 0.8
1959 1.2
1964 1.7
1969 2.5
1974 2.6
1979 2.6
1984 2.6
1989 2.6
1994 2.6
(values shown before the ×0.75 scaling)
original code (\FOIA\documents\osborn-tree6\briffa_sep98_d.pro)
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,’Oooops!’
;
yearlyadj=interpol(valadj,yrloc,timey)
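If you want to check that reading, the quoted IDL boils down to a single piecewise-linear interpolation. A Python re-expression (my own sketch of the same arithmetic, not the original code):

# fudge_check.py - reproduce the quoted IDL's yearlyadj in plain Python.
# yrloc pairs 1400 and 1904..1994 with the valadj entries; the result is
# ~0 early on, a small mid-century dip, then up to +1.95 (i.e. 2.6 * 0.75).
yrloc = [1400] + [1904 + 5 * i for i in range(19)]    # 1400, 1904, ..., 1994
valadj = [0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
          0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]
valadj = [v * 0.75 for v in valadj]                   # the 0.75 "fudge factor"
assert len(yrloc) == len(valadj)                      # the IDL's 'Oooops!' check

def interpol(v, x, t):
    """Piecewise-linear interpolation of v (at abscissae x) to point t,
    mirroring IDL's interpol(V, X, T) for a single scalar t."""
    for (x0, v0), (x1, v1) in zip(zip(x, v), zip(x[1:], v[1:])):
        if x0 <= t <= x1:
            return v0 + (v1 - v0) * (t - x0) / (x1 - x0)
    return v[0] if t < x[0] else v[-1]

for year in (1900, 1940, 1960, 1980):
    print(year, round(interpol(valadj, yrloc, year), 3))
# 1900 -> 0.0, 1940 -> -0.015, 1960 -> 0.975, 1980 -> 1.95

Whatever the adjustment was for, by the late 1970s it is adding nearly two units to the series – which is presumably what the “VERY ARTIFICAL” comment is apologising for.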
A better case can’t be made for open science, open software… and honesty. Never trust programs done behind closed doors.
When the secrecy surrounding temperature data, and the software that manipulates it, is greater than that surrounding nuclear weapons design, there is something very wrong with the whole of climate science. The hoax is designed to scam people out of money and make others very rich.
Science may never recover from this… Where is the virtue in science fraud?
rbateman (04:00:20) :
Sounds very interesting – could you perhaps explain a little more?
Thanks 🙂
Frank
Ashtoreth (03:52:14) :
Monckton is absolutely correct: we need to take the raw data and the calculations, and build new, verified models and data sets to see what is happening BEFORE we spend all this money.
While I agree, we need to realise that there is no longer any raw data, at least at CRU. So it (the raw data) will have to be acquired all over again. Given that this is now international politics, and not just academics cooperating in the interest of disinterested science, that may no longer be possible.
Can anyone tell me what the relationship between CRU TS and HadCRUT is? While there has been a bit of a kerfuffle over the fact that CRU is not Hadley, do the latter use data from the former in their product?
This is Lawson’s think tank
http://www.thegwpf.org/
Membership is a minimum of £100.
Lots of names you know on the advisory board
Just watched the Politics Show on BBC1 (UK).
Fred Singer and Bob Watson (Chief Environmental Scientist) were interviewed by Andrew Neil. Not exactly a trouncing, but Singer got the easier ride and probably edged it. Watson was surprisingly agreeable and suggested an enquiry should be set up which looks into a) the ‘hacking’ of the emails and b) the contents of the emails.
@Philip Bratby: The BBC coverage has reopened…….
It is beyond question that the climate is changing; that man is completely responsible very definitely is not! That is why I am delighted at the revelations from the CRU at the University of East Anglia.
Complicit in this misrepresentation of the science is the BBC in its TV and radio output. For over 3 years I have been trying to elicit answers from both Mark Thompson (Director-General) and Sir Michael Lyons (Trust Chairman). All I received was sophistry and obfuscation, until I engaged the help of my MP.
Recently it came to light that a report had been commissioned in June 2007 jointly by the Trust and the BBC Board of Management entitled “From Seesaw to Wagon Wheel: Safeguarding Impartiality in the 21st Century”. It concluded: ‘There may be now a broad scientific consensus that climate change is definitely happening and that it is at least predominantly man-made… the weight of evidence no longer justifies equal space being given to the opponents of the consensus’.
Despite this damning evidence from their own report, they steadfastly cling to the belief that their impartiality is intact as required by the BBC Charter. Such is their state of denial that Sir Michael Lyons has even tried to deliberately mislead my MP despite evidence I have to the contrary.
In light of this I have posed the question, through my MP: “On whose authority did the BBC cease to be an impartial Public Service Broadcaster, as required by its Charter, and become the judge, jury and sponsor of such dangerously specious political dogma so eloquently described as ‘…the consensus…’?”
Answer comes there none! I believe it is time for the BBC to be subjected to an enquiry on this matter.
Also significant: a complete lack of response from the Guardian… still peddling the same rubbish: http://www.guardian.co.uk/environment/2009/nov/22/climate-change-emissions-scientist-watson
If anyone missed this part from debreuil:
It says important note, but I guess I missed the memo.
\FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,’IMPORTANT NOTE:’
printf,1,’The data after 1960 should not be used. The tree-ring density’
printf,1,’records tend to show a decline after 1960 relative to the summer’
printf,1,’temperature in many high-latitude locations. In this data set’
printf,1,’this “decline” has been artificially removed in an ad-hoc way , and’
printf,1,’this means that data after 1960 no longer represent tree-ring
printf,1,’density variations, but have been modified to look more like the
printf,1,’observed temperatures.‘
HAVE BEEN MODIFIED TO LOOK MORE LIKE THE OBSERVED TEMPERATURES
Game over. They will HAVE to call this a fake to keep their jobs.
This is spreading like a bushfire this lunchtime (UK time); the Telegraph newspaper’s website stories on this have crashed their servers.
Either that, or foul play is afoot. I dunno; perhaps I have watched too many episodes of the BBC’s “Spooks” and can imagine the MI5 geek trying to stop this story spreading round the mainstream media.
All I got from the Telegraph site was:
“Gateway Timeout
The proxy server did not receive a timely response from the upstream server.
Reference #1.cae3554.1258980286.0 “
What language are these *.pro files? I am guessing Fortran.
BTW, this was today’s leading story a couple of hours ago; now it is not in their top 5 any more…
Strange? Not really. I suppose if the weight of traffic to that page crashed the page (the rest of the site is OK), then the fact that the page is no longer being accessed would mean the code counting page views inside it would stop incrementing its count.
reality vs. model
http://i50.tinypic.com/301j8kh.jpg
have fun…!
I keep reading posts by team supporters along the lines of “is that all you’ve got? That’s nothing.”
I think some of the supporters of the team need to be reminded of Mann’s denial when John Finn brought up the issue at “Real”Climate.
“No researchers in this field have ever, to our knowledge, “grafted the thermometer record onto” any reconstrution. It is somewhat disappointing to find this specious claim (which we usually find originating from industry-funded climate disinformation websites) appearing in this forum.”
As McIntyre observes, the temperature record was not “fully grafted”. However you want to describe it, the temperature record was included to hide the decline. If you go to the “Real”Climate archives you will see that when Finn persisted in his questions, Mann suddenly became too busy to bother with the issue. Maybe he was, but he certainly avoided having to go into the messy details of what was done.
http://www.realclimate.org/index.php/archives/2004/12/myths-vs-fact-regarding-the-hockey-stick/#comment-345
If what was done is not a big deal, why didn’t Finn get a clear answer then and there?
See “When scientists assume the missionary position”:
http://vulgarmorality.wordpress.com/2009/11/21/when-scientists-assume-the-missionary-position/
“Overall we find that the observed Northern Hemisphere circulation trend is inconsistent with simulated internal variability, and that it is also inconsistent with the simulated response to anthropogenic and natural forcing in eight coupled climate models.”
Hmmm. The observed reality and the simulated ‘reality’ are out of synch. But note that the writer’s instinct is to call the observed reality, not the simulation, ‘inconsistent’.
Nothing necessarily sinister or duplicitous here, but it is language revelatory of certain habits of thinking.
E.M.Smith (23:56:46) :
A wonderful explanation – everyone should read what you wrote. Perhaps Anthony will make a blog post about this. The details are amazingly enlightening. Keep it up, guys.
For me, as a layperson who has been in financial services marketing and who reads history, the e-mails are proof that these guys knew they were up to no good, and were doing it deliberately.
When you see how aggressive they have been in attacking skeptics, attributing vile motives to them, and how they depend on argument from authority, etc. etc. etc., it is clear to me that they have been doing this for a long time.
The code is where they have committed their fraud, and that, fortunately for us who are their victims, cannot be hidden so easily.
Would it not be great if someone in GISS were to have the strength of conscience that this brave person in the UK has demonstrated?
Well, here in Aus on Dateline tonight (ABC), we had some guy (I don’t recall the name, as I caught only the tail end of the broadcast) being very “jittery” in answering questions about the content of some of the e-mails; but then, talking politics and enjoying the fact that he’d just got back from Singapore and would soon be off to Copenhagen (I thought of 400 kg polar bears), he just grinned.
Nice work if you can get it.
I asked RealClimate in what context the topic quote should be taken:
http://www.realclimate.org/index.php/archives/2009/11/the-cru-hack-context/
http://www.cru.uea.ac.uk/ is entirely down right now if you all hadn’t noticed.
And the Copenhagen propaganda machine continues at full speed.
http://malaysia.news.yahoo.com/ap/20091123/tbs-sci-climate-09-post-kyoto-f8250da.html
Nick Stokes (20:34:52) :
As another poster mentioned, you didn’t give the full quote. It says: “Uses “corrected” MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
In other words, the data will be ‘artificially’ adjusted so that it is more consistent with HADCRUT3. What do you suppose he meant by ‘artificial’? To me it suggests using something that is not measured data. This seems perilously close to fraud.
Chris
Cassandra King (04:12:03): see my earlier post @ (04:41:24):
“These scientists at East Anglia are both honourable and world class. Their data is not being manipulated in any bad way”.
In line with AGWThink, it is being manipulated in a *good* way.
After reading your post, I read through some of the code directories and wrote a post on my blog at http://matthewjweaver.com/index.php/about-the-commented-code-released-by-the-alleged-hacker/. In it, I do not reach the same conclusion:
First, these files are in a subdirectory called “oldprog” and risk being taken completely out of context. While I do not personally program in Progress, the code is rather easy to read, but these comments do not tell the whole story. I’ve now looked through more than a dozen code files and most look rather innocuous. What I wonder about is the data input and the weighting assigned to the data as it is processed by the code. There are some data files, but I’m guessing most are missing, of course; and the weighting, while in the code, is not as easy to discern and judge.
Among the data in the code directories are limited tree-ring data, temperatures, and more – maybe summaries, sample files, or results? I do not have the time to compare this data with external datasets to see what is real and what is made up. Nor do I have time to read the code closely enough to spot the weightings and calculations and interpret their impact. What I do wonder about as I read through the files is how and why specific ranges were chosen for the normalization and averaging of data. Consider, too, how missing data was filled in or ignored, as well as the impact of solar cycles and other influences.
The bottom line is that all of this is ripe for manipulation to achieve whatever results are desired. Which gets us back to the email. These researchers were Kool-Aid drinkers for the religion of global warming. They have a vested interest in producing results to support their cause. The email makes this abundantly clear and shows their willingness to modify data, ignore inconvenient data, destroy data, and actively prevent independent analysis of their data.
” Yes, yes….science SHOULD rule (am in total agreement there).”
Science should rule what?
E.M. Smith’s long post on the plight of Harry is spot on. As a long-time software engineer, I’ve had to wade into poorly written, poorly documented code written by people who are no longer around. Every ‘Harry’ in this kind of situation has a job to do: figure out what the code does well enough to get it to run, probably with some additional inputs or new requirements that necessitate changing the code somewhat – if it can be figured out at all. ‘Harry’ is usually given a very tight deadline, so he doesn’t have the time or the approval to start over from scratch. And ‘Harry’ certainly was not brought in because he knows climate – he has to pick up tidbits along the way to help him guess whether what he is writing makes any sense – and when it doesn’t, he just tries his best to make it do what the bigwigs say it should do. They define the requirements – changing them every few days or weeks – and they define what ‘correct’ means for Harry’s programs. (Harry didn’t write them, and never would have done it that way, but they are his now, for better or worse. Just gotta try to meet those deadlines…)
But like many programmers in this situation, Harry prefers black and white – correct versus incorrect – so when Harry sees something that is particularly messed up, he will sometimes add colorful commentary to his notes, because then, when he reads it again next year, he’ll remember that he already figured out it was messed up and won’t agonize in another futile effort to make it make sense. He’s not expecting anyone other than another ‘Harry’ to read those comments, and the next ‘Harry’ will appreciate them and share the chuckle at the crap they are made to debug and run.
If it sounds like Dilbert, it’s because what Scott Adams pokes fun at is how stuff REALLY happens. Fortunately for us, Harry did what he did, and it will probably help those of us who are looking at the programs and hoping to get some of them running ourselves. Fortunately, GDL is free and is supposed to run IDL programs as-is, though I haven’t finished setting it up myself (had to get coLinux running first); it will be very interesting. Then there’s the Fortran stuff – that may be trickier to run as-is, since we’re not likely to find a compatible Fortran compiler. We might have to port the code to C++ or C#. If anyone else decides to have a go at porting some of this to modern languages, perhaps we can collaborate.
Thank you, Harry, for all the clues! (whoever you are)
I work in programming, so I had a look through HARRY_READ_ME. I was pretty appalled, but when I figured out they were talking about how to extract the data from HadCRUT2.1 to get to version 3.0, I was shocked. If that is what is going on here, then the GCMs have major problems.
As the developer trying to sort out the mess said:
“So, we can have a proper result, but only by including a load of garbage!”
The data has clearly been manipulated to get the results they wanted because:
1. Poor data management meant they hadn’t got the original data.
2. Poor programming techniques meant they had no idea how significant amounts of the data in 2.1 were generated.
3. Lack of documentation in the original code meant they did not know why data had been manipulated.
The data was changed to match the expected outputs. None of the papers dependent on HadCRUT3 are reliable, as HadCRUT3 itself is unreliable.
Normally in software development we do a thing called “code review”, where developers review each other’s code – it can be fun, but it is a blood sport. The stuff described here (even the first of the many, many manipulations) would have you laughed out of the room and demoted to business software tester.
The code for all the models needs to be released and reviewed, along with the source data, the reasons it is being transformed, and the calculation for each transformation.
We cannot agree to implement Copenhagen without this. If everything is on the up and up and makes perfect sense, then we go ahead; otherwise, more proof of the forcings (+ and –) and of the CO2 impacts is going to be required.
“This website is currently being served from the CRU Emergency Webserver.
Some pages may be out of date.
Normal service will be resumed as soon as possible. ”
LOL
http://www.cru.uea.ac.uk/
Mr Delingpole writes again
http://blogs.telegraph.co.uk/news/jamesdelingpole/100017546/climategate-why-it-matters/
Alec J (23:59:14)
Thanks for the heads up. I just listened to the short interview on the Radio 4 Today programme.
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373594.stm
Bob Watson stayed right on ‘message’ as you’d expect him to given that his attempt to become Chairman of the IPCC failed. He clearly hasn’t spent any time whatsoever looking into the details of what has been released in the emails and data files.
Nigel Lawson has quite rightly called for the NERC (who fund CRU, the Tyndall Centre, etc.) and the Vice-Chancellor of UEA to set up an independent enquiry into the contents of the released emails and documents. I fully agree with this but don’t think it is ever likely to happen. There is evidence in at least one of the released emails that the VC of UEA has even been supportive of CRU blocking the FOIA requests, and CRU have even received advice from the ICO on how to block the requests. It’s very clear that neither the NERC nor the university’s administration is going to be independent in investigating this matter. Are they?
As a bit of clarification in my response to Nick, I’m now actually reading the Harry_Read_Me.txt file for myself. It is an attempt to update CRU TS, but it also references tree ring data and the problem of “the decline.” So, it would seem, on this first quick look, that prior to 1960, the world temperature data in this gridded set are being forced to look like the tree ring data, but after 1960, “real” temperatures are used. Is that a fair reading?
Our local “newspaper” this morning ran this headline as the top story: “Since ’97, global warming has just gotten worse”. It runs through all the usual AGW talking points: polar bears are threatened, Arctic ice is at an all-time low, huge chunks of ice are breaking off of Antarctica, glaciers are disappearing, etc. It’s an AP story from Seth Borenstein, and it’s supported by quotes from, among other sources, Janos Pasztor, a “UN climate advisor.” If you follow the jump to page A4, there’s an accompanying story (finally!) about these emails, but the tone of the story is accusatory. That is: some dastardly hackers illegally obtained information and are bent on spreading lies!
I’d say this is a good sign. The AGW coalition has to be very desperate and alarmed (no pun intended) to find itself in such a tenuous position! They’re even beginning to run stale global warming stories as banner headlines! “Pay no attention to that man behind the curtain!”
It seems my post at RC is being held up in the queue at the moment, while other posts in favour of RC sail straight through.
Wasn’t one of the alleged emails about holding up and censoring posts on RC?
Ed (23:36:26) :
You can say that again. 🙂
It’s also not surprising, as you will know, that those not trained as software developers, or in computer science in general, would have great faith in computer models – even going so far as to suggest that there’s something wrong with reality if it doesn’t match the model! There’s something magical about a computer if you don’t program them for a living.
The next question is: what do they mean by the “real temperatures” that the programs are adjusting to? Surely not the ones affected by the demonstrated warming biases, faulty station siting, dropouts, and questionable recording standards.
E.M.Smith (23:56:46) :
Yet another very good post from you: entirely accurate, if a bit cynical. But then, programmers like these make you very cynical and thoroughly peed off.
Well done EM
Tonight here in Australia, for the first time, Tony Jones from “Lateline” began to ask the hard questions, with an exposé of the CRU scandal as well as an interesting interview with a very nervous Tim Flannery, who attempts to represent a viable AGW platform here in Australia.
It’s a huge week here.
ABC links here:
http://www.abc.net.au/lateline/content/2008/s2751375.htm
http://www.abc.net.au/lateline/content/2008/s2751390.htm
I’ve got my popcorn ready.
Keep at it fellas….
Danny V (05:11:32) :
And the Copenhagen propaganda machine continues at full speed.
http://malaysia.news.yahoo.com/ap/20091123/tbs-sci-climate-09-post-kyoto-f8250da.html
No doubt that story was already written and in the printing queue before the Hadley story broke. I expect we’ll see more of this kind of thing for the next few days.
When a dam breaks, it always starts with a trickle.
Jay (05:01:28) :
“E.M.Smith (23:56:46) :
Wonderful Explanation. Everyone should read what you wrote. Perhaps Anthony will make a blog post about this. The details are amazingly enlightening. Keep it up guys.
”
E.M.Smith is always a good read 🙂
“michael (04:54:48) :
reality vs. model
http://i50.tinypic.com/301j8kh.jpg
have fun…!
”
Michael: where do these data come from? Interesting.
P. Gosselin 01:33:38
Can you explain to me why the sunspot count on the widget shows 13 when there are just little specks?
This is not sarcasm. I have noticed several times lately that the “count” has been in that range when spots are barely visible.
Are counts today the same as they would have been a century ago? We have been told that, but I’m beginning to wonder.
Correction to 06:02:03:
Not on the widget, but on the “Solar-Terrestrial Data”.
Jesse (21:24:11) :
“Once again, you guys are making mountains out of ant hills. This is just normal data processing per the 1960 divergence problem as shown by NUMEROUS sources. This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.”
Well, Jesse, I believe that apart from the emails there was also a bit of data. When it’s all scoured over by scientists who were previously denied access, we might see the making of a mountain. 🙂
E.M.Smith (01:47:51) :
I used to teach Software Engineering many moons ago, and one of the tricks we pulled on the students was designed to mimic the process you have described.
What we did was send each group of students into a separate room and ask them to write a program sequence in English. The sequence was to be used to instruct someone to make a cup of tea. You know … get the kettle, fill it with water from the tap, plug it in, etc. etc. It was hilarious, BUT very enlightening.
sorry about the upside down M
Pay attention to the code in briffa_sep98_d.pro:
************************************************
;
; Now prepare for plotting
;
loadct,39
multi_plot,nrow=3,layout='caption'
if !d.name eq 'X' then begin
window,ysize=800
!p.font=-1
endif else begin
!p.font=0
device,/helvetica,/bold,font_size=18
endelse
def_1color,20,color='red'
def_1color,21,color='blue'
def_1color,22,color='black'
;
restore,'compbest_fixed1950.idlsave'
;
plot,timey,comptemp(*,3),/nodata,$
/xstyle,xrange=[1881,1994],xtitle='Year',$
/ystyle,yrange=[-3,3],ytitle='Normalised anomalies',$
; title='Northern Hemisphere temperatures, MXD and corrected MXD'
title='Northern Hemisphere temperatures and MXD reconstruction'
;
yyy=reform(comptemp(*,2))
;mknormal,yyy,timey,refperiod=[1881,1940]
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=22
yyy=reform(compmxd(*,2,1))
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
;
oplot,!x.crange,[0.,0.],linestyle=1
;
plot,[0,1],/nodata,xstyle=4,ystyle=4
;legend,['Northern Hemisphere April-September instrumental temperature',$
; 'Northern Hemisphere MXD',$
; 'Northern Hemisphere MXD corrected for decline'],$
; colors=[22,21,20],thick=[3,3,3],margin=0.6,spacing=1.5
legend,['Northern Hemisphere April-September instrumental temperature',$
'Northern Hemisphere MXD'],$
colors=[22,21],thick=[3,3],margin=0.6,spacing=1.5
;
end
****************************************
Pay attention to the code after “; Apply a VERY ARTIFICAL correction for decline!!”
Cheers
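For anyone who wants to see what that block actually does without an IDL licence, here is a minimal Python sketch of the same arithmetic. To be clear, this is my own translation, not code from the archive: yrloc and valadj are copied from the IDL above, timey is assumed to match the 1881–1994 plot range, and the MXD series is a flat, invented stand-in purely for illustration.

import numpy as np

# Anchor years: 1400, then 1904, 1909, ..., 1994 in 5-year steps
yrloc = np.concatenate(([1400], np.arange(19) * 5 + 1904))

# Adjustment at each anchor year, times the 0.75 "fudge factor"
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

if len(yrloc) != len(valadj):
    raise ValueError("Oooops!")  # mirrors the IDL sanity check

# Years being plotted (assumed from the 1881-1994 plot range) and a
# flat, invented stand-in for the MXD anomaly series
timey = np.arange(1881, 1995)
yyy = np.zeros(timey.size)

# Interpolate the adjustment onto every year and add it to the series
yearlyadj = np.interp(timey, yrloc, valadj)
corrected = yyy + yearlyadj

# The adjustment is zero until about 1920, dips slightly through the
# 1920s-40s, then ramps steeply to about +1.95 from roughly 1970 on,
# pushing the late end of the series sharply upward.
for yr, val in zip(timey[::10], corrected[::10]):
    print(yr, round(val, 2))

Note also that, in the file as quoted, the two lines that would actually plot the adjusted series (yyy+yearlyadj) are themselves commented out, so this particular version of the program computes the “correction” but plots the uncorrected curve.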
“Methow Ken
…If it looks, walks, flies, swims, and quacks like a duck, theoretically it still COULD be something else. But lacking overwhelming evidence to the contrary, odds are REAL good that it’s a duck. The duck quacked in this case, and their goose is cooked.”
I think it could only be a duck up until 1960. After that, the data had to be modified and it began to smell more like a giraffe, or something… :>)
JimB
Re: Patrik (05:08:57) :
“http://www.cru.uea.ac.uk/ is entirely down right now if you all hadn’t noticed.”
Maybe they are busy shredding documents, deleting files and zeroing out the cleared hard drive space?
I gotta believe that certain people at GISS (and probably other similar organizations) are considering deleting old emails, data, and code.
A few more press links:
A better (?) link to Lawson-Watson on BBC:
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373677.stm
The Lawson Times article, now syndicated to The Australian for Tues (Oz time):
http://www.theaustralian.com.au/news/opinion/copenhagen-deserves-to-fail/story-e6frg6zo-1225802514603
And The Australian now making the link with the Oz legislation (ETS) debate with a response from the Opposition Senate leader (Minchin)
http://www.theaustralian.com.au/news/features/hot-and-bothered/story-e6frg6z6-1225802504484
While Fairfax is helping out with the defence:
http://www.theage.com.au/national/email-scandal-rallies-web-climate-sceptics-20091123-iysr.html
I can think of at least 3 nuclear power plants that were shut down and are now completely inoperable because of documentation problems similar to the issues identified in these files. The nuclear industry is one industry with quite rigorous standards for code validation and verification, and with well-established, open international standards for evaluating models against data.
I hope that this episode leads to more investigation of environmental mischaracterization of data and cherry picking, starting with the DDT travesty.
Gregg E. (02:58:54) :
“The file HARRY_READ_ME.txt is *very revealing* about just how disorganized CRU’s data and software are. “Harry” is apparently Ian Harris. If he’s the author of that file, it appears from the notes that he’s trying to straighten things out but finding that the data and previous software is a complete charlie foxtrot.
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625&page=13
I’ll call it +1 probability that the author of HARRY_READ_ME.txt is the insider who took this archive out of CRU.”
I think you are almost right in your deduction, Gregg. If you look a bit further into some of the other files, I think it’s likely the person providing the comments in HARRY_READ_ME.txt is a ‘contract programmer’ brought in to assist Ian Harris in sorting out this mess. This ‘code monkey’ did not share the goals of the Team and at some point had had enough (perhaps they didn’t renew his/her contract at some point before the release). Because he/she still had remote access to the CRU departmental server, he/she decided to assemble all the documents he/she could, take copies of the .eml files of certain staff, and, following the final FOIA brush-off email, release the emails and files to the internet.
If I’m honest, had I been in the same position (i.e. knowing that UEA were deliberately committing a crime in not complying with the FOIA requests and were in the process of attempting to cover up their mess), I’d probably have done the same.
debreuil had already noted the “trick” in the code from my last post. Kudos to him!!
In past musings, some have speculated about scientific/academic fraud and what actually constitutes scientific fraud. Have we reached that point yet?
All,
Fox News Channel (US) has picked up the story. As of ~0830 US Central Time they were doing a segment on air.
As expected, my post has been deleted from RC.
Great post on NYT weaseling
http://www.weeklystandard.com/weblogs/TWSFP/2009/11/nytimes_we_wont_publish_statem.asp
Some of the revelations in the emails have been quite shocking – even worse than I thought. There are some interesting samples on Andrew Bolt’s blog. However, what I find even more interesting is the response of the warmists. I am waiting to see if any of them – even one – says: hmm, maybe I was wrong; maybe these guys have been subverting science.
But no. The reaction is one of absolute denial (in the non-pejorative sense). I am reminded of our old friend cognitive dissonance again. Psychology teaches us that this is exactly the behaviour to be expected – mentally trying to rearrange the facts to somehow remove the dissonance. Hence we have “mountains out of molehills”, “taken out of context”, etc. All very interesting and revealing behaviour.
Yet, in a way, these individuals have a metaphorical noose around their necks. The floor on which they stand is very slowly moving downwards, but they still have time to remove the noose before the rope tightens. The price they must pay is the repudiation of their cherished beliefs. Yet they do nothing except argue, in the hope that their arguments will somehow be heard and the floor will stop descending. The longer they leave it the worse it gets. Too bad.
Latest gloss-over from the Guardian.
http://www.guardian.co.uk/environment/cif-green/2009/nov/23/leaked-email-climate-change
Nothing to see here, folks, move along, all is well..
If the instrument temperatures do not agree with tree ring data over significant recent time periods, then why should we think the tree ring data is accurate for pre-instrumental times?
On the options for how to handle the data:
Jones:
Options appear to be:
1. Send them the data
2. Send them a subset removing station data from some of the countries who made us pay in the normals papers of Hulme et al. (1990s) and also any number that David can remember. This should also omit some other countries like (Australia, NZ, Canada, Antarctica). Also could extract some of the sources that Anders added in (31-38 source codes in J&M 2003). Also should remove many of the early stations that we coded up in the 1980s.
3. Send them the raw data as is, by reconstructing it from GHCN. How could this be done? Replace all stations where the WMO ID agrees with what is in GHCN. This would be the raw data, but it would annoy them.
This lot is just stunning!
– and SteveMc must be feeling pretty vindicated at the moment!
– I bet he had a good weekend!
– I love the harry_readme file – it’s just amazing
– the code/data quality of the HadTemp product is so great (sarcasm)
– I think they’ll have to withdraw that from publication after this!
– and the code comments about adjusting the data post 1960 ‘to hide the decline’
– great stuff.
The other thing the emails give is a great insight into how the HockeyTeam operates
– suppressing dissent, controlling publications & reviews, hiding data they don’t like, and diverting attention from the real issue (e.g. ‘hiding the decline’) onto something they can toss back & forth ad infinitum (e.g. ‘trick’)
By the way, the ‘decline’ is the way the northern hemisphere tree-ring data doesn’t track temp after about 1960
– it does the opposite (i.e. declines)
– it does seem amazing to me that there appears to have been no attempt to find the scientific explanation for this, just loads of software tweaks to hide it….
– for if our great proxies don’t track temp reliably in the current time & recent past, why should we suppose that they do 1000 years ago?? (See the sketch below.)
Well, we live in interesting times!
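To make that last point concrete, here is a minimal Python sketch of a split-period calibration/verification check, the standard way to test whether a proxy tracks temperature. All the numbers below are synthetic toys invented for illustration (no real MXD or instrumental data); the point is only the shape of the problem: a proxy that diverges after 1960 can correlate beautifully with temperature over the calibration period and still show no skill at all over the verification period.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1881, 1995)

# Toy instrumental series: a gentle warming trend plus noise
temp = 0.005 * (years - 1881) + rng.normal(0.0, 0.1, years.size)

# Toy proxy: tracks temperature until 1960, then declines (divergence)
proxy = np.where(years < 1960,
                 temp,
                 temp[years < 1960].mean() - 0.01 * (years - 1960))
proxy = proxy + rng.normal(0.0, 0.05, years.size)

cal = years < 1960   # calibration period
ver = ~cal           # verification period

r_cal = np.corrcoef(proxy[cal], temp[cal])[0, 1]
r_ver = np.corrcoef(proxy[ver], temp[ver])[0, 1]
print(f"calibration r = {r_cal:.2f}, verification r = {r_ver:.2f}")
# With this toy data: high positive r before 1960, low or negative after.
# A proxy that fails verification in the instrumental era gives us no
# basis for trusting its readings in 1000 AD.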
kdkd wrote:
1. Make the unstated assumption that the paleoclimate data is the whole of the co2 forced global warming story,
The skeptics didn’t build up the importance of paleoclimate/proxies. That was done by the Hockey Team and the IPCC (to their detriment, I believe). “The warmest it’s been in XXXX” (insert your favorite number) has been trumpeted in countless press releases. That’s the doing of the warmies, not the skeptics. McIntyre is on record at CA saying the proxies aren’t really that important to the question of whether or not CO2 causes warming – if that’s also your point, why are Phil Jones, Michael Mann, et al. in such a tizzy?
2. Hone in on increasingly small parts of the story and ignore the big picture. You have to do this because the big picture doesn’t support your position at all.
Besides the climatology/proxy stuff, I guess there are also all the modeling/sensitivity issues. Is that the “big picture”? Most of us here are less than convinced.
3. Fail to respond to counter arguments from the originators of the research.
Do you mean the lack of peer-reviewed “skeptic” articles? If so, then you might be interested in the emails talking about manipulating the peer-review process and the efforts to remove the editors of journals that dare to publish anything the Hockey Team doesn’t support. You strike me as a recent refugee from RealClimate, so if you mean instead the lack of follow-up “skeptic” posts at RC, then you might want to hang around here long enough to see the references to censorship over there. It’s easy to win debates when you have the power to silence your critics.
If you are in fact a frequenter of RealClimate, you might like to try a little experiment. Why don’t you try posting something along the lines of: “The folks at ClimateAudit and Watts Up With That claim that you practice very selective censorship of the skeptical arguments and response posts at RC. Is this true?” You might also ask if McIntyre or Watts could have a thread on RC to make their case. Report back.
“Spin that, spin it to the moon if you want. I’ll believe programmer notes over the word of somebody who stands to gain from suggesting there’s nothing “untowards” about it.
Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result”
Yes indeed. I posted last Friday that they would claim all comments were taken out of context. They did as I predicted, and now RealClimate is starting a thread to do just that. This is SPIN: covering what they wrote privately in order to save face. As a psychologist, I can say this is a common defense mechanism when people feel threatened.
Over here in Britain this story is starting to build up a head of steam. There are few voices objecting to transparency.
My reply to the Revkin/Pierrehumbert blog post
Oh please.
First of all, there’s no indication there was any vandalism involved here. In fact, it’s not even clear there was hacking involved; this looks very much like an inside job. There are no “honey, pick up a pint of milk on the way home” emails, which is interesting in itself. My own suspicion is that this particular collection is one Jones himself assembled of documents he didn’t want released under FOIA.
Second, scientists, such as myself, who work at public institutions are or should be aware that our email accounts are subject to FOIA. I have my own laptop with a non-university wireless modem that I use for personal business. Use of public property for personal purposes may be tacitly permitted in many places, but it shouldn’t be protected or excused.
And third, this would not have been such an issue if Jones and his cohorts had not been actively trying to hide their raw data, a practice that modern science increasingly frowns on. This is now the third embarrassment that has come from that practice (the first being the revelation that a lot of original climate data had been lost – though one must now ask whether that loss was accidental – and the second being the serious issues revealed a couple of months ago with Briffa’s analysis of data). One would hope scientists would learn from their mistakes.
Jesse (21:24:11):
This is what happens when a bunch of uninformed amateurs try and “debunk” real scientists. Leave the science to the scientists and go back to your day jobs as custodians, wal-mart employees and laborers.
REPLY: So what do you do down there in Norman? NSSL? U of OK? You might be surprised at the sort of professionals that frequent here. I invite them to sound off. – A
Hi Jesse,
Your ad hominem failed because the custodians and day laborers regularly show themselves to be the equals of climate scientists, and occasionally prove to be better than the proponents of Anthropogenic Global Warming. That does not say much for those professional folks at CRU.
As for myself, I am a laborer as an Electrical Engineer who works on weapon systems to be used in a time of war. My day job frequently becomes my night job and sometimes my weekend job. The products of my organization incorporate this feature called “Configuration Management” and if the product has software in it, the software is written to these things called “Standards”. Before any product of my organization is transitioned from development to the military, it has to go through several program reviews, design reviews, technical evaluation, and operational evaluation. Software used in modeling a weapon has to go through Verification, Validation, and Accreditation before it is accepted. Funny, I do not see any of that in the products from CRU.
If you would like to find out about building software to a professional standard, get two programmers together who had to write MIL-SPEC code back in the late ‘80s or early ‘90s and use the words ‘ADA’ and ‘twenty one sixty seven A’ in the same sentence. Bring popcorn; the discussion will extend well into the night.
I see they mention the MWP in file 0845217169:
“There were also long warm spells between 900 and 1100, known as the medieval warm period, and 1360 to 1560. ”
I thought the hockey stick got rid of it?
I’m gonna need a whole boat load of popcorn!