Climategate: The Smoking Code

NOTE: Part 2 of this story has been posted: see The Smoking Code, part 2

The Proof Behind the CRU Climategate Debacle: Because Computers Do Lie When Humans Tell Them To

From Cube Antics, by Robert Greiner

I’m coming to you today as a scientist and engineer with an agnostic stand on global warming.

If you don’t know anything about “Climategate” (does anyone else hate that name?), go ahead and read up on it before you check out this post. I’ll wait.

Back? Let’s get started.

First, let’s get this out of the way: Emails prove nothing. Sure, you can look like an unethical asshole who may have committed a felony using government-funded money; but all an email is, is talk, and talk is cheap.

Now, here is some actual proof that the CRU was deliberately tampering with their data. Unfortunately for readability, this code was written in Interactive Data Language (IDL) and is a pain to go through.

NOTE: This is an actual snippet of code from the CRU contained in the source file: briffa_Sep98_d.pro


;

; Apply a VERY ARTIFICAL correction for decline!!

;

yrloc=[1400,findgen(19)*5.+1904]

valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75   ; fudge factor

if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'

yearlyadj=interpol(valadj,yrloc,timey)



What does this mean? A line-by-line review of the code

Starting off Easy

Lines 1-3 are comments

Line 4

yrloc is a 20-element array containing:

1400, followed by 19 years from 1904 to 1994 in 5-year increments…

yrloc = [1400, 1904, 1909, 1914, 1919, 1924, 1929, … , 1964, 1969, 1974, 1979, 1984, 1989, 1994]

findgen() creates a floating-point array of the specified dimension. Each element of the array is set to the value of its one-dimensional subscript

F = findgen(6) ;F[0] is 0.0, F[1] is 1.0 ….. F[5] is 5.0

Pretty straightforward, right?

Line 5

valadj, or the “fudge factor” array as some arrogant programmer likes to call it, is the foundation for the manipulated temperature readings. It contains twenty seemingly arbitrary values. We’ll get back to this later.
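One mechanical detail worth spelling out: in IDL, multiplying an array by a scalar multiplies every element, so the trailing *0.75 scales the whole adjustment table down by a quarter. A minimal sketch (the printed values are plain arithmetic on the snippet above, not taken from any CRU output):

; Element-wise scaling: each entry in the bracketed list is multiplied by 0.75,
; so the largest adjustment becomes 2.6 * 0.75 = 1.95.
raw    = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]
valadj = raw*0.75
print, max(valadj)     ; -> 1.95
print, valadj[13:19]   ; -> 1.275  1.875  1.95  1.95  1.95  1.95  1.95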

Line 6

Just a check to make sure that yrloc and valadj have the same number of elements. This is important for line 8.
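For readers unfamiliar with IDL, message raises an error and halts the routine. A minimal sketch of what this guard protects against (the array names here are hypothetical stand-ins):

; If the year grid and the adjustment table ever had different lengths,
; execution would stop here rather than silently mis-aligning the two arrays.
a = findgen(20)   ; stands in for yrloc
b = findgen(19)   ; stands in for valadj, one element short
if n_elements(a) ne n_elements(b) then message, 'Oooops!'   ; halts with the error: Oooops!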

Line 8

This is where the magic happens. Remember that array we have of valid temperature readings? And, remember that random array of numbers we have from line 5? Well, in line 8, those two arrays are interpolated together.

The interpol() function will take each element in both arrays and “guess” at the points in between them to create a smoothing effect on the data. This technique is often used when dealing with natural data points, just not quite in this manner.

The main thing to realize here, is, that the interpol() function will cause the valid temperature readings (yrloc) to skew towards the valadj values.
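For reference, interpol(V, X, XOUT) in IDL linearly interpolates the values V, tabulated at the abscissa X, onto the new abscissa XOUT. A minimal sketch of how yearlyadj would be built, assuming (as the variable names elsewhere in the file suggest) that timey is a year-by-year time axis; the line that would add this adjustment to the data series appears, commented out, in the fuller listing quoted in the comments below:

yrloc     = [1400, findgen(19)*5. + 1904]
valadj    = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75
timey     = findgen(595) + 1400.               ; assumed yearly axis, 1400-1994
yearlyadj = interpol(valadj, yrloc, timey)     ; one adjustment value per year
; In the surrounding file the adjusted series would be formed as yyy + yearlyadj,
; e.g. filter_cru, 5., /nan, tsin=yyy+yearlyadj, tslow=tslow
; (that call is commented out in briffa_Sep98_d.pro).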

What the heck does all of this mean?

Well, I’m glad you asked. First, let’s plot the values in the valadj array.
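A minimal sketch of how such a plot can be produced from the snippet alone (the plot styling here is arbitrary):

; Plot the scaled adjustment values against the years at which they are defined.
yrloc  = [1400, findgen(19)*5. + 1904]
valadj = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75
plot, yrloc, valadj, psym=-4, xtitle='Year', ytitle='Adjustment'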

Artificial Hockeystick Graph

Look familiar? This closely resembles the infamous hockey stick graph that Michael Mann came up with about a decade ago. By the way, did I mention Michael Mann is one of the “scientists” (and I use that word loosely) caught up in this scandal?

Here is Mann’s graph from 1999

Mann’s hockey stick graph

As you can see, (potentially) valid temperature station readings were taken and skewed to fabricate the results the “scientists” at the CRU wanted to believe, not what actually occurred.

Where do we go from here?

It’s not as cut-and-try as one might think. First and foremost, this doesn’t necessarily prove anything about global warming as science. It just shows that all of the data that was the chief result of most of the environmental legislation created over the last decade was a farce.

This means that all of those billions of dollars we spent as a global community to combat global warming may have been for nothing.

If news station anchors and politicians were trained as engineers, they would be able to find real proof and not just speculate about the meaning of emails that only made it appear as if something illegal happened.

Conclusion

I tried to write this post in a manner that transcends politics. I really haven’t taken much of an interest in the whole global warming debate and don’t really have a strong opinion on the matter. However, being part of the Science Community (I have a degree in Physics) and having done scientific research myself makes me very worried when arrogant jerks who call themselves “scientists” work outside of ethics and ignore the truth to fit their pre-conceived notions of the world. That is not science, that is religion with math equations.

What do you think?

Now that you have the facts, you can come to your own conclusion!

Be sure to leave me a comment, it gets lonely in here sometimes.

hat tip to WUWT commenter “Disquisitive”

========================

NOTE: While there are some interesting points raised here, it is important to note a couple of caveats. First, the adjustment shown above is applied to the tree ring proxy data (proxy for temperature) not the actual instrumental temperature data. Second, we don’t know the use context of this code. It may be a test procedure of some sort, it may be something that was tried and then discarded, or it may be part of final production output. We simply don’t know. This is why a complete disclosure and open accounting is needed, so that the process can be fully traced and debugged. Hopefully, one of the official investigations will bring the complete collection of code out so that this can be fully examined in the complete context. – Anthony




276 Comments
INGSOC
December 4, 2009 5:38 am

Thanks for the great article! Lots to digest. Be prepared for the media blitz that is building however. AGW aint dead yet! I hope we see many more “agnostics” stand up for truth in science. We can beat back the drumbeat of lies by MSM.
Cheers!

pwl
December 4, 2009 5:40 am

Line 8 appears to be missing from the listing.
Excellent article.

pwl
December 4, 2009 5:41 am

Oh, you mean the 8th line, since they start with line “00”. Using the file’s line numbers might make the article more clear.

MangoChutney
December 4, 2009 5:43 am

I guess Phil Jones’ leave of absence could be a long one

pwl
December 4, 2009 5:43 am

I can’t comprehend their justification for this obvious blatant fraud.
What was Mann thinking when he manNipulated this data in this manner?

Medic1532
December 4, 2009 5:45 am

Thanks for the explanation for those of us who aren’t fluent in the various computer languages (I think I still remember how to write a BASIC program lets see 10 Print “Hello World”;20 goto 10 ) I figured the EMail uproar was just the first shot.
Medic1532

Anton
December 4, 2009 5:45 am

The clear explication provided here highlights a comment I made (as did many others) yesterday. Exclusive focus by the UEA investigation (or the MSM) on the emails would be a perfect way to miss the main point of the released material. Even if the emails had been entirely innocuous, the computer codes, etc would be more than enough to prove fraud.

Varco
December 4, 2009 5:46 am

I believe the emails are acting as a distraction for the media from a key topic of what was actually done with (to?) the data by way of programming. I found this explanation extremely lucid and hope it inspires passing journalists to review the Harry readme file. I’m not sure you need to be a programmer to understand that ‘Harry’ is not describing best scientific practice, let alone programming!
Hopefully work such as this will lead to more detailed investigation of data analysis (manipulation?) techniques being employed by all the major climate data providers?

December 4, 2009 5:47 am

So the Mann-made hockey stick proxy end was artificially elevated to match the “instrumental record”. The same story as Briffa’s Yamal, where the hockey blade was achieved by cherry-picked series.
The red “instrumental record” is GISS 1890-1999: http://www.woodfortrees.org/plot/gistemp/from:1890/to:1999
Can we use f- word yet?

Editor
December 4, 2009 5:52 am

pwl (05:40:23) :
> Line 8 appears to be missing from the listing.
Anthony needs to put a blank line at the top of the <pre> block to get the line numbers in sync. Currently everything is off by one.
REPLY: the WordPress formatting did that, I’ve switched to a different formatting tool that has a little popup toolbar in the upper right to allow window viewing of long lines of code – Anthony

MangoChutney
December 4, 2009 5:54 am

just a thought, if the emails and code were obtained by a hacker and not released by a whistle blower, would the evidence be inadmissible in a court of law?

December 4, 2009 5:55 am

“It just shows that all of the data that was the chief result of most of the environmental legislation created over the last decade was a farce.”
The code is awful. But do we know that the output of the code discussed here was actually published somewhere? I didn’t yet find an explanation of what the significance of this program is, and how it has been used.
(should “cut-and-try” not be “cut-and-dry”?)

SHIRAKAWA Akira
December 4, 2009 5:56 am

The source code should start from line 01, not 00.

December 4, 2009 5:58 am

The end!!!
The following short analysis might make it easy to understand why the ’so-called scientists’ would not release the code for scrutiny. If you can’t independently run the code and get the same results as the output the Fraudsters display, then something is wrong. If you can run the code and get the output displayed, then you only need analyze the software code to see what is wrong that produces the FAKED hockey stick. Easy for software engineers, not so easy for lay people — which is what this is based on, fooling lay people into paying massive new taxes.

bill
December 4, 2009 5:58 am

Hang on a moment! This code is from 1998.
If this fudge were included in the Briffa etc. documents then there would be no decline.
So all those “hide the decline emails” would be irrelevant.
The only conclusion is that this bodge was done to see what adjustments would have to be made to fit the temp record.
Having done that it was not used for any submitted work

Varco
December 4, 2009 6:00 am

Off topic, but can we expect an update to the surfacestations work anytime soon? Papers were mentioned at the time of the mid-term census report?
I think more than anything it was the early results from the Surfacestations project that convinced me descriptions of ‘settled science’ were horribly compromised. While Climategate is rightly grabbing attention at the moment perhaps passing journalists would care to check out surfacestations.org for examples of how data collection and reporting should be undertaken. The hard work of the volunteers in collating the damning (IMHO) evidence about the temperature measuring network, and the transparency in which the data has been recorded and presented is commendable and should be given public recognition – we owe these people a debt of thanks. Most importantly it shows there is a way forward for climate science that can be believed, a breath of fresh air in an otherwise fetid atmosphere?

durox
December 4, 2009 6:02 am

ONE international is sending emails all over the web. once you click the link in the invitation, you sign their petition. you can sign as many times as you want by just clicking, which i find to be in bad taste.
for more info visit http://one.org/international/actnow/copenhagen/index.html?rc=copenhagenconfemail
and pls write about this ongoing unfair effort. thanks

December 4, 2009 6:02 am

Well, we all know that the code was really the smoking gun. Hopefully the MSM will stop trying to gloss over the issue here and start truly attacking the guilty.

Leon Brozyna
December 4, 2009 6:08 am

And there we have, in simple form, a recipe for cooking the books.
Give a person weak in science and ethics some information on programming and statistics and they’ll be able to “prove” anything they want.

rbateman
December 4, 2009 6:08 am

Now there is a thought:
Re-write the IDL into BASIC.
Lots of people understand Visual Basic, and could get it.

Jeremy
December 4, 2009 6:09 am

Yes I saw this code about a week ago when it first appeared. Your article above is correct. Whether it was used and for what published journals may be difficult to prove, however, the similarity to the trend in the hockey stick plot is a “smoking gun” if I ever saw one.

slow to follow
December 4, 2009 6:10 am

bitbutter – yes, this should be said loud and clear. And investigated before too much is built on it. Has anyone seen anything establishing the provenance of this code?

Alberto
December 4, 2009 6:10 am

I’m missing some context here. I would like to know which series/reconstructions this program was used for, what the reasons for the adjustments were, etc.
Without the proper context, it’s difficult to determine the implications.
So a follow up would be nice.

Burch
December 4, 2009 6:10 am

Sorry, I don’t know IDL. In the initialization of the array valadj, there is a ‘*0.75’ at the end. What does that accomplish? Multiply each initializer by 0.75? If so, then why not just put the proper values into the initializer string? Is it possible that the ‘fudge factor’ comment refers only to the 0.75 constant and not the entire array? Is there anything in the code that explains where the constants in the initializer string came from?
There are two ways to view this. One is they were sincerely trying to correct for something, the other is they were cooking the books to show a trend that did not exist.

Chilled Out
December 4, 2009 6:12 am

Line 5 – “valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75 ; fudge factor”
I’d love to see any of the IPCC climate scientists try to explain these numbers – take an apparently arbitrary set of numbers, multiply by 0.75 and bingo, we have a “hockey stick” curve.
Note their explanation needs to be backed up by released data, and detailed analysis to explain how each of the factors has been derived and the independent quality assurance that has been applied to the calculations/methodology.
Will we get it? Not a f**king chance from the CRU, coz they have lost/mislaid/deleted the data 😉

Thomas J. Arnold.
December 4, 2009 6:13 am

It is precisely what data set scrutiny requires.
A type of public objectivity and, as you put it, an “uninterested” stance that leads to in-depth analysis and results in conclusions which are accurate and can be believed, Mr. Greiner.
People like yourself, a Physics grad’, familiar with pure science, with no axe to grind and with other tech’ skills, are needed. Fancy a job?…………There will be some going in East Anglia soon…………… I predict.
Trouble is……can you do it for altruistic purposes? After the scandal there will be no money in it, gov’ research funding has taken a big hit.

December 4, 2009 6:15 am

1. This code section was mentioned in a previous WUWT thread.
2. Note that the valadj array is multiplied by 0.75, so your graph should show 1.95 as the top value, not that it makes much difference in the primary argument.
3. One question I have is whether or not this code segment was actually used to create a published figure. As someone who has written a lot of code, I have often created phony data just to test certain routines. As far as I know, Briffa has not made this claim, so to make it now would be suspicious. If someone can show that this code was used to (say) create a graph for an IPCC report, then that would definitely be a smoking gun.

mikey
December 4, 2009 6:16 am

Wow everyday more and more is coming out.
slightly off-topic but this new oped on CIF is just shocking:
http://www.guardian.co.uk/commentisfree/libertycentral/2009/dec/04/climate-change-scepticism

wolfwalker
December 4, 2009 6:16 am

I saw what I think is the same chunk of code analyzed on another blog entry a few days ago. If I could remember which one I’d provide a link, but I don’t so I can’t. Sorry. In any case, one of the commenters there raised a very pertinent question. We can see that the variable ‘yearlyadj’ incorporates the fudged data. Is that variable used anywhere? If it isn’t, then there doesn’t appear to be any way for the fudge to actually infect the program output.

PhilW
December 4, 2009 6:16 am

It’s just gone mainstream here in UK!

slow to follow
December 4, 2009 6:17 am

Anthony – do you believe this code is
“actual proof that the CRU was deliberately tampering with their data.”?
If not I think you should caveat this article accordingly.

Douglas DC
December 4, 2009 6:20 am

“However, being part of the Science Community (I have a degree in Physics) and having done scientific research myself makes me very worried when arrogant jerks who call themselves “scientists” work outside of ethics and ignore the truth to fit their pre-conceived notions of the world. That is not science, that is religion with math equations.”
Don’t sugar coat it Pard, tell us what you think. I agree wholeheartedly…

John E.
December 4, 2009 6:20 am

I don’t see where this code actually obtains any temperature data. yrloc is just an array of years, right? Does interpol() access a database of temp records for the particular year and then apply the skewing fudge factor array to those temps?
I definitely see how the fudge factor array will create a hockey stick. But to reach the conclusion in the post we need to see how this connects to the temperature record that they published.

December 4, 2009 6:21 am

Hey, thanks for linking to my article 🙂
This website actually was what turned me on to the whole issue.
@Burch, the .75 and “fudge factor” array serve as kind of a guess-and-check hardcoded curve that forces legitimate data to look like whatever the hell the researchers wanted to. Likely the .75 and each individual data value in the fudge factor array were all tweaked little by little until the original data looked like a “hockey stick”, which is what Michael Mann hypothesized a decade ago.
It’s a simple case of falsifying data to fit a pre-conceived feeling.
excellent question.

EdB
December 4, 2009 6:22 am

It is simple. Put the programmer under oath, get a line by line explanation of what was done and who asked for the code, and why. Then put under oath the people that used the results of the code, and ask them what papers, reports, advice, was based on the coded program.
The truth can come out, if the authorities want it. I am a skeptic though.

hunter
December 4, 2009 6:27 am

Edb,
Aren’t we all?

John Galt
December 4, 2009 6:28 am

I think the whole AGW climate doomsday is a travesty of science and has always been a convenient tool for forcing a particular world-view upon us non-believers.
That said, can we verify the authenticity of this code? Perhaps the one thing that might come from this is UEA, GISS, etc., will finally be compelled to release their data and source code.
As someone said, if you can’t replicate the results, it ain’t science.

bill
December 4, 2009 6:29 am

Robert Greiner you state:
Line 8
This is where the magic happens. Remember that array we have of valid temperature readings? And, remember that random array of numbers we have from line 5? Well, in line 8, those two arrays are interpolated together.
The interpol() function will take each element in both arrays and “guess” at the points in between them to create a smoothing effect on the data. This technique is often used when dealing with natural data points, just not quite in this manner.
The main thing to realize here, is, that the interpol() function will cause the valid temperature readings (yrloc) to skew towards the valadj values.

Let’s look at a bit more of that code:
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,’Oooops!’
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
yearlyadj=interpol(valadj,yrloc,timey)
Does not this line give a yearly adjustment value interpolated from the 20 year points?
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
Does not this line plot data derived from yyy
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20

The smoking gun line!!!!
Does not this line plot data derived from yyy+yearlyadj, the FUDGED FIGURE?
BUT…………
IT’S COMMENTED OUT!!
This is further backed up by the end of file:
plot,[0,1],/nodata,xstyle=4,ystyle=4
;legend,[‘Northern Hemisphere April-September instrumental temperature’,$
; ‘Northern Hemisphere MXD’,$
; ‘Northern Hemisphere MXD corrected for decline’],$
; colors=[22,21,20],thick=[3,3,3],margin=0.6,spacing=1.5
legend,[‘Northern Hemisphere April-September instrumental temperature’,$
‘Northern Hemisphere MXD’],$
colors=[22,21],thick=[3,3],margin=0.6,spacing=1.5

To me this looks as if ‘Northern Hemisphere MXD corrected for decline’ would have been printed in colour 20 – just the same as the smoking gun line. HOWEVER you will note that this section is commented out also.
So in my view this is code left in after a quick look-see.
Remember, engineers and scientists are human and play when bored and do not always tidy up.
have a look at:
http://micro.magnet.fsu.edu/creatures/index.html

Genaddicted
December 4, 2009 6:29 am

Okay, I hate nitpickers, so I was going to let it go, but someone else asked the question, so here’s the answer:
“cut and try” should not be “cut and dry” either – it’s “cut and dried” – past tense.
Very good post, by the way.

Jack Green
December 4, 2009 6:30 am

You could digitize the output graph that they have published and get the data if they published one with the raw values. If not you could write some code using this code and back calculate the raw values.
Somebody smarter than me could do it. My training as an expert witness and being around a lot of smart lawyers tells me we could get the raw data from what we have.
I would bet the raw data is somewhere on a HCRU computer just sitting there. The whistle blower, as I remember when this all started, has more data to release, or am I remembering incorrectly?

rw
December 4, 2009 6:30 am

From the comments about line nos, I gather that most of you folks are not programmers.
Is this code performing Mike’s Nature trick ?

David Schnare
December 4, 2009 6:30 am

I have queried one of the folks who is familiar with this code and who is a major name in this mess, and he defended the code as follows:
The opening comment: “Apply a VERY ARTIFICAL correction for decline!!” indicates this is not a computer run that would be used in a paper, but is merely one of many runs used to understand how the model operated under different scenarios. He argued that no one could identify a single peer-reviewed paper that relied on this code.
I can believe that one might want to play around with a model to understand how sensitive it is to various scenarios, including presumptions about temperature curves. One would not be looking at the graph, but would be looking at what the forcings would have to be to generate such a graph. This would allow a better understanding of the power of various forcings. It would help tease out which forcings have the greatest significance to model outputs.
There would surely be other ways to reach the same understanding, but one can’t account for how any particular scientist goes about his or her business.
I am forced to reserve judgment until I see how this code was actually used and whether it was used to support a specific academic contribution.

Tim Clark
December 4, 2009 6:31 am

bill (05:58:43) :
Hang on a moment! This code is from 1998.
If this fudge were included in the Briffa etc. documents then there would be no decline.
So all those “hide the decline emails” would be irrelevant.
The only conclusion is that this bodge was done to see what adjustments would have to be made to fit the temp record.
Having done that it was not used for any submitted work

I don’t think this applies to the Briffa et al work per se:
This is an actual snippet of code from the CRU contained in the source file
IMHO it applies to the CRU temperature reconstruction which was on the same graph as the tree series as a comparison indicating agreement between the two. Since we know that Briffa’s work was bogus, and now that the CRU was also rigged, I guess they are in complete agreement. But nice try Bill, where’s Mary Hinge, Joel S. and RRKampen these days?

imapopulist
December 4, 2009 6:32 am

So the data output from this program will be the result of an interpolation between (essentially a combination of) the actual temperature data for each period of time and a list of increasing values that have been artificially incorporated into the program. The list of values was intentionally skewed to reflect higher temperatures in the later periods and to conform with periods of increasing CO2 levels.
In other words, this computer program is producing bald faced lies.
This is without any doubt a criminal fraud.
How else can anyone justify such a manipulation of data?

December 4, 2009 6:34 am

I’d seen this earlier and added it to my page for newcomers explaining “The Decline” and the hiding thereof and the close-up details of UEA’s latest graph which only digs their hole deeper. At the bottom of that page is more code, with comments, putting the above snippet further in context.

imapopulist
December 4, 2009 6:34 am

“NOTE: This is an actual snippet of code from the CRU contained in the source file: briffa_Sep98_d.pro”
[snip]

Phil M
December 4, 2009 6:36 am

There’s a problem with your analysis:
– the ‘yearlyadj’ fudge factor isn’t used in this code!
– if you look a couple of lines further down you can see that the lines which would adjust the data, filter it & then plot it out are actually commented out using ‘;’s

;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20

and code that does the actual plotting doesn’t use the ‘yearlyadj’ fudge factor

filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21

– so, in this example at least, the fudge-factor isn’t used, even if it looks a little suspicious.
– perhaps it was just some test code, or Briffa changed his mind about using it….
Actual code:
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,’Oooops!’
;
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21

Chris
December 4, 2009 6:38 am

The data array in line 5 is the “hockey stick” and 0.75 multiplier is the “fudge factor”. In other words, the former gives the data a curve effect, and the latter is adjusted to provide the proper steepness (or distortion to the data).
You guys realize don’t you that 2 parameter models (i.e., nothing complicated, very simple) can essentially model most known phenomena to about 10% accuracy.

JonesII
December 4, 2009 6:38 am

Great post. Small and simple enough to be used by media. Something like this:
“Look how these pseudo scientists waterboarded computer programs to cheat YOU with the tale of global warming”

TJA
December 4, 2009 6:38 am

Remember when Steve reanalyzed the NASA temp records and found that the thirties were the warmer decade? The warmists answer was the CRU didn’t reflect the warming. Well, in light of all this, that wasn’t much of an answer was it?

Gary
December 4, 2009 6:41 am

Something seems to be missing in this explanation at the end. Showing an example of how the interpol() function actually adjusts a table of numbers would help.
What becomes apparent is that the subtle allure of modeling has crept into the simple display of basic data. Models let you tweak the data to get an expected output. It isn’t a big leap to apply the same rationale to adjusting the data because it’s not up to your expectations. When the practice becomes standard operating procedure, you no longer see its flaws.

Burch
December 4, 2009 6:43 am

Gotta love this comment from cloudcorr.pro
——
; program to construct cloud correlation coefficients (with DTR)
; method approximately follows New et al 2000
; this program is required because Mark New has lost both
; the correlation data file, and construction files

Steve Geiger
December 4, 2009 6:45 am

is this a joke?
“I’m coming to you today as a scientist and engineer with an agnostic stand on global warming.”
then,
“It just shows that all of the data that was the chief result of most of the environmental legislation created over the last decade was a farce.”
Then,
“I tried to write this post in a manner that transcends politics. ”
unbelievable. As a long time ‘skeptic’ and avid supporter of S. McIntyre, Lucia L., and the likes, all I can say is this dialog has reached a new ‘low’.

Nigel S
December 4, 2009 6:47 am

Time for Interpol (ICPO) perhaps?

SABR Matt
December 4, 2009 6:47 am

What’s really hilarious is how blatant the programmer was…
Apply a VERY artificial adjustment for decline!!
Really??? This is just completely bizarre

imapopulist
December 4, 2009 6:47 am

Anthony, Please dig into this one example and flesh it out even more. It is absolutely the most damning piece of evidence of climate fraud that I have seen to date.
It is clearly a smoking gun.

Andrew Francis
December 4, 2009 6:47 am

This code fragment is meaningless without the context in which it was used. You need to have evidence that it was used to unjustifiably distort temperature data which was subsequently published. PS. I’m a sceptic, but without context this code is definitely NOT a smoking gun.

December 4, 2009 6:48 am

TV TV TV. Anthony, we need you and other experts on TV TV TV….

syphax
December 4, 2009 6:49 am

It just shows that all of the data that was the chief result of most of the environmental legislation created over the last decade was a farce.
I’m having trouble parsing this. “All of the data”? What? All climate data is keystoned off tree-ring data? I had no idea. “that was the chief result of most of the environmental legislation created over the last decade”? What? I thought the chief result of environmental legislation was stuff like cleaner air and cleaner water.
By the way, the following lines of this file are:
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
“;” indicates a comment line. So yearlyadj isn’t used in this version of the file. Huh.
(It does appear to be used in briffa_sep98_e.pro.)

Will Hudson
December 4, 2009 6:51 am

Newsflash: Bob Ward (policy and communications director at the Grantham Research Institute on Climate Change and the Environment at the London School of Economics and Political Science) just announced on Sky News that “the climate science was settled 200 years ago”. His interview, with Frasier Nelson (Editor of Spectator) was interesting in that Ward did everything he could do to shout down Frasier. Hopefully, the piece will be on skynews.com later.

Denbo
December 4, 2009 6:52 am

This code is too well documented. It must be fake. Line 2 is such a nice description

NickB.
December 4, 2009 6:52 am

Real scientists, citizen scientists UNITE!!!
We need a made in the light of day, open source temperature reconstruction. Since the “professionals” (I use the term loosely) can’t even recreate their own products there is no choice but to throw them out and start over

Burch
December 4, 2009 6:55 am

Here’s a link to the released documents in case you haven’t seen it elsewhere.
http://junkscience.com/FOIA/

Pamela Gray
December 4, 2009 6:55 am

Replication is not the key here. If you have the original code and raw data set, and can replicate it, it will follow the GIGO principle. Falsifiability is the key here. Running the raw data with a different filter, using a different set of raw data, looking through the code for errors that bias the outcome, etc, are ways to falsify the theory. Replication is what got us to this point in the first place and is the preferred biased approach of “teams”. But falsifiability is the necessary (but missing?) step prior to replication. I don’t believe this step was done by the “team”. And is probably rarely done by any scientific team with any kind of publicity attached to it. No one publishes research that came to a dead end and turned out to be false. Labs that want to be known for something often fall into the trap of wanting to “prove”, not “falsify” their experiments. The former garners money, the latter does not. Therefore the incentive to be right is far greater than the incentive to be wrong. It is a disease of science that has plagued the ages and we seem to be in the middle of a pandemic.

December 4, 2009 6:59 am

TJA (06:38:59) :
Remember when Steve reanalyzed the NASA temp records and found that the thirties were the warmer decade

At least get it right: ‘Steve reanalyzed the NASA temp records and confirmed Hansen’s previous finding that the thirties were the warmer decade’!

David L. Hagen
December 4, 2009 7:01 am

See Bishop Hill’s blog for summary comments on The Code

December 4, 2009 7:03 am

Phil. (06:59:09),
I seem to recall something about a connection between Hansen and NASA…

Doug
December 4, 2009 7:03 am

You see? After all the flak in the air, we have evidence that it was Mann made GW. (or at least he helped)

Pamela Gray
December 4, 2009 7:03 am

The author of the post has done what the lab should have done. Made every attempt to falsify their conclusion. Once they have made every effort to do that, only then should they publish the raw data and code for others to replicate. This blog and the above post, and other blogs, have done the work for them. Now let them return to their labs and try again (maybe with one or two pink slips in the mail boxes?).

December 4, 2009 7:04 am

Excellent article. Thanks for explaining the code for the layman. This is earth shattering.
Also, I think you meant to use the word cause, not result in this sentence:
“all of the data that was the chief result of most of the environmental legislation”

Richard M
December 4, 2009 7:04 am

What would be interesting is to compare the “value added data” and the raw data. If the difference between the two came very close to “valadj” then you might have a smoking gun.

JonesII
December 4, 2009 7:05 am

Wait, wait, wait! That “interpol() function”: we can call it The Cherry Picking Function
Call the Interpol!

MattN
December 4, 2009 7:06 am

OK, there’s no way they left this on an open directory for someone to find. This had to be a hack or a whistleblower. No way in hell I’d let anything this incriminating out in an FOI request.
When can we start using the f-word?

imapopulist
December 4, 2009 7:07 am

So one might say global warming is “Mann made”?

Bob H.
December 4, 2009 7:09 am

The emails have gotten most of the attention…for now. They are a lot smaller than the data sets, and most people can go through the emails and put together a time line. We still haven’t heard from the techies yet, who undoubtedly have started breaking down the data sets and code. It will be interesting to see if they can replicate CRU’s outputs and then be able to explain what CRU did. But even if they can, that is only half of the story; the other half is what the output should have looked like if it hadn’t been tampered with. That will take some time.

Greg
December 4, 2009 7:09 am

I am not jumping on board as vehemently as some on this post. IDL is not a broadly used language. On the original site for this post someone claiming to be an IDL programmer raised some very interesting points that may lessen the accuracy of the charges made regarding this piece of code. I have been coming to this site for a year and am a solid skeptic of AGW but the rush to judgment on smoking guns can diminish the credibility of the those questioning the CRU.

December 4, 2009 7:11 am

In the past 20 years, how many BILLIONS of dollars wasted for this fraud and how many MILLIONS of lives could have been saved if the money would have been better spent?????

Gene Nemetz
December 4, 2009 7:11 am

Just what do you think you’re doing HARRY_READ_ME?

bill
December 4, 2009 7:12 am

Please check some of these comments above
THE CODE WAS COMMENTED OUT
IT WAS NOT USED at the point this image was taken
Check out this comment from the original post location
http://cubeantics.com/2009/12/the-proof-behind-the-cru-climategate-debacle-because-computers-do-lie-when-humans-tell-them-to/comment-page-1/#comment-664

photon without a Higgs
December 4, 2009 7:12 am

TV TV TV. Anthony, we need you and other experts on TV TV TV….
nice idea

bill
December 4, 2009 7:14 am

If you think that this commented section just means that they could have used it to fudge the data then you would also have to take into account all the code that briffa deleted and even the code that doesn’t exist.
This is beyond ludicrous!

RR Kampen
December 4, 2009 7:14 am

Has this code been used?
Or has it been preceded by a semicolon in the lines that follow the code cited here?

harpo
December 4, 2009 7:15 am

May be that this is off topic but check out Real Climate… Here is what Gavin had to say in his latest piece…
*************************************
Unusually, I’m in complete agreement with a recent headline on the Wall Street Journal op-ed page:
“The Climate Science Isn’t Settled”
*************************************
Gee.. I could have sworn that Gavin said it was…..

helvio
December 4, 2009 7:16 am

«That is not science, that is religion with math equations.»
This is, for me, the quote of the day! So true! 😀

December 4, 2009 7:17 am

While I agree with most of this analysis, it is NOT the temp data being adjusted; it is the proxy data being merged with the temp data.
title=’Northern Hemisphere temperatures, MXD and corrected MXD’
title=’Northern Hemisphere temperatures and MXD reconstruction’
This code is from a briffa osborn reconstruction of temperature in a process I call hockeystickization. It’s a common practice in the black art of proxy temperatures. While it is absolutely disingenuous and IMO fraudulent it is not evidence of HadCRU temp data being manipulated. Michael Mann claimed no knowledge that any scientist had ever done such a thing as merge proxies directly with temp.
Keith Briffa referred to a different method of hockeystickization as Mike’s trick to hide the decline.

SOM
December 4, 2009 7:17 am

Hey PWL Just follow the money and you’ll comprehend just fine…it’s called Natural Law…
PWL said: “I can’t comprehend their justification for this obvious blatant fraud. What was Mann thinking when he manNipulated this data in this manner?”

VJay912
December 4, 2009 7:18 am

But what about the code or data that was supposedly deleted? If it is in fact destroyed that would be just as much a smoking gun in my eyes.

Steve M. from TN
December 4, 2009 7:18 am

I’ll have to agree with Bill…the plot apparently does not use the “fudge factor.” But the IPCC reports stop using the Briffa plot around 1960 (somewhere around point 12 in the above graph). See:
http://wattsupwiththat.com/2009/11/30/playing-hide-and-seek-behind-the-trees/
This code does highlight the fact that the Briffa tree-ring data don’t match measured temperatures. Either the Actual temperatures are garbage or the tree-rings don’t make for a good temperature reconstruction.

Ed Fix
December 4, 2009 7:23 am

All the discussion about whether this is “actual proof that the CRU was deliberately tampering with their data” is a distraction. Yes, the “fudge-factored” data may not have been used to produce any graphs. Yes, that array may not have even been used within this program. That’s all beside the point.
The Hockey Team could put this away for good by merely releasing their data and analysis, as they should have done in the first place. All they have to do is explain themselves. The fact that they won’t, and that we have to guess, is incriminating in its own right, and sufficient reason to reject their result. Mann, et.al. and his successors and colleagues have attempted to turn paleo-climate research on its head with their research, purporting to disprove the Roman optimum, Medieval warm period, etc. THE BURDEN OF PROOF IS ON THEM, NOT US. This isn’t a criminal prosecution; there isn’t any presumption the defendant is correct.
The fact that the Hockey Team can’t or won’t disprove the allegations is all we need to reject their results. Whether they committed any actual crimes is a question for another venue, with a different standard of evidence.

Morgan
December 4, 2009 7:26 am

Isn’t this the “correction for decline” described in the Osborn, Briffa, Schweingruber, Jones (2004) paper cited in the file (Annually resolved patterns of summer temperature over the Northern Hemisphere since AD 1400) as one step in calibration of the proxy/temperature relationship?
“To overcome these problems, the decline is artificially removed from the calibrated tree-ring density series, for the purpose of making a final calibration. The removal is only temporary, because the final calibration is then applied to the unadjusted data set (i.e., without the decline artificially removed). Though this is rather an ad hoc approach, it does allow us to test the sensitivity of the calibration to time scale, and it also yields a reconstruction whose mean level is much less sensitive to the choice of calibration period.”
If so, it may be evidence of bad science, but not of outright fraud.

Robinson
December 4, 2009 7:26 am

The reason you have a fudge factor like 0.75 is to allow you to scale the curve arbitrarily. You only have to change the fudge factor, rather than the entire array you see.

Robinson
December 4, 2009 7:27 am

By the way, use of hard-coded “magic numbers” is considered very bad practice in software development. One of the first code-reviews I ever had made this very criticism of one of the modules I’d written :/.

John Adlington
December 4, 2009 7:28 am

If the CRU gave us the raw data and the algorithms they used to plot their graphs we could have a proper argument about this situation. They won’t because they can’t – they’ve “lost” the raw data.
They are as convincing as Joseph Smith saying he lost the original of the book of Mormon.

DG
December 4, 2009 7:30 am

It has been said that this line was commented out and speculation drawn that the code could be explained as a legitimate attempt to debug/test other parts of their program.
As a coder (this claim is not conclusive – additionally, I am not familiar with the language, though the majority of them are largely similar and IDL is by no means cryptic), that does NOT look like a debugging comment. The number fudging requires several other lines of code, as can be plainly seen.
The “Apply a VERY ARTIFICIAL…” comment reveals that the programmer’s aim is not to draw conclusions out of data, but to insert preconceptions into it.
This is NOT Mike’s Nature Trick – which made use of data being added to the 1960s and afterwards. From Jones’ email regarding that trick, it seems that program code itself was not altered to hide the decline. Instead, improper data was used. There may in fact be little to no evidence of fraud in the functional code itself (nor can the commented line be used to exonerate anyone in this scandal), but the comments further corroborate our growing common-sense suspicions – that a corrupt process was used to hide a decline in temperatures. There are at least two implements to hide the decline documented between Jones’ Nature-trick email and the code posted above.

Tim Clark
December 4, 2009 7:31 am

Bill, you’re fighting a losing battle. Make the decision now to reanalyze your position and you will feel a lot better.
The following is probably how they eliminated the medieval warm period:
see: documents/osborn-tree6/summer_modes/pl_decline.pro
;
; Plots density ‘decline’ as a time series of the difference between
; temperature and density averaged over the region north of 50N,
; and an associated pattern in the difference field.
; The difference data set is computed using only boxes and years with
; both temperature and density in them – i.e., the grid changes in time.
; The pattern is computed by correlating and regressing the *filtered*
; time series against the unfiltered (or filtered) difference data set.
;
;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
;

;
; Now apply a completely artificial adjustment for the decline
; (only where coefficient is positive!)
;
which is accompanied by a similar array, etc.
as well as:
;
; Now fit a 2nd degree polynomial to the decline series, and then extend
; it at a constant level back to 1400. In fact we compute its mean over
; 1856-1930 and use this as the constant level from 1400 to 1930. The
; polynomial is fitted over 1930-1994, forced to have the constant value
; in 1930.

AnonyMoose
December 4, 2009 7:31 am

It would be nice to add a graph which shows the valadj values along the Y axis and the years along the X axis. That image would make more obvious the scale and period affected. I comprehend what is being done, but most climate graphs show the years thus showing the years for this correction would help people compare the adjustment with the other graphs that they’ve seen. Yes, I know what starting at 1400 will do to the graph, and the programmer also knew what that would do.

chainpin
December 4, 2009 7:35 am

How can anyone claim that this is a smoking gun without knowing definitively whether this code was used to produce the charts found in the literature and/or the IPCC reports?
I believe Gavin has already claimed that it was simply test code.
More sleuthing needs to be done.

Billy
December 4, 2009 7:36 am

I tend toward the skeptic side myself but as a programmer, I Just want to reiterate what some other people have already said here, namely that this is definitely NOT a smoking gun.
1) I can’t tell you how many snippets of code I have lying around in various directories on various computers where I work. Programmers write all kinds of things, sometimes just for their own curiosity or self-edification. There is no way of saying if this is ‘production’ code or just some little one-off that somebody was working on for some unknown reason. To me, it has the feel of the latter, but I couldn’t say for sure. The comments seem kind of suspicious but again, without knowing how it was used it by itself is pretty meaningless.
2) The author seems confused. He writes: “Remember that array we have of valid temperature readings?” Actually no. All I’ve seen is an array of years. Later he writes “…the valid temperature readings (yrloc)…”. Uh no, yrloc does not hold temperature readings, it holds year numbers as he just got done explaining to us a few paragraphs earlier.
3) It’s not clear to me what the interpol() functions does. What is the ‘timey’ parameter passed? Is THIS the temperature data in a yearly time series maybe? So maybe interpol() fills in those year gaps in (1400-1904, 1904-1909, 1909-1914, etc.) using the fudge factor numbers as some sort of weight? I don’t know, but the author’s explanation isn’t very clear at all.

B B
December 4, 2009 7:37 am

Sorry, but to me this analysis looks out of context and ridiculous. I don’t think it’s possible to conclude anything based on that code. (I’m a computer programmer myself and I have some training in physics and math, and have done modeling in the past.)
I am a “climate skeptic” and a long time reader of WUWT and CA. Unfortunately, the quality of “climategate” discussions is starting to deteriorate and approach that of AGW propaganda at an alarming pace. If this trend continues then I’m afraid I’ll have to switch camps.
Still I’d like to thank Anthony for his hard work and for publishing quality materials.

Al Ward
December 4, 2009 7:40 am

MangoChutney (05:54:06) :
“just a thought, if the emails and code were obtained by a hacker and not released by a whistle blower, would the evidence be inadmissible in a court of law?”
=>The leak of the Pentagon Papers by Daniel Ellsberg set the standard, & the subsequent Supreme Court decision, a 6-3 ruling in favor of The New York Times, which had published the purloined material, also firmly established the First Amendment principle of no prior restraint on publication.
“A stranger’s illegal conduct does not suffice to remove the First Amendment shield from speech about a matter of public concern,” wrote Justice John Paul Stevens.<=
I ran across that piece of information in this article, “Sen. Boxer and ClimateGate: The Terror of Tiny Town,” by Michael Walsh on biggovernment.com

Burch
December 4, 2009 7:44 am

>THE CODE WAS COMMENTED OUT
>IT WAS NOT USED at the point this image was taken
In that particular source file, yes it was commented, but as noted elsewhere, in the file ‘briffa_sep98_e.pro’, it IS used, twice. I cannot comment on whether the results of any particular version were used to create published graphs.
As a reminder, the array ‘valadj’ is the “artificially corrected” element.
And remember, it’s really easy to comment or uncomment a line and rerun the code, so we would need to correlate published data to this line of code to know if it was, or was not, used to influence the results.
That said, it is suspicious that it exists at all.
—–
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
————- And later
;
; Restore the Hugershoff NHD1 (see Nature paper 2)
;
xband=x
restore,filename=’../tree5/densadj_MEAN.idlsave’
; gets: x,densadj,n,neff
;
; Extract the post 1600 part
;
kl=where(x ge 1400)
x=x(kl)
densadj=densadj(kl)
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densadj=densadj+yearlyadj

Denis
December 4, 2009 7:46 am

I think you are full of BS and what you have set out is crap.
Please ‘hide the decline’ as peer reviewed science demands.

TJA
December 4, 2009 7:50 am

Phil,
So NASA corrected the published data before Steve M brought it up? Do you have a link for that, because that is not how I remember it. I am calling BS here.
Not to mention that no such analysis occurred or can occur on the CRU data set. So the only *reliable* data we have shows that the thirties were warmer than the ’90s, as any history book which deals with the period will corroborate. The claim that the global numbers don’t show this argument carries a *ton* less weight in light of the facts that have come out recently.
The word “unprecedented” is gone, and it ain’t coming back without a lot of extremely careful and transparent work. My guess is that if the reanalysis is done carefully, the thirties will be found to have been warmer, and the only answer you have right now is that I should trust Phil Jones.

James
December 4, 2009 7:56 am

I agree with Bill. Greiner’s analysis of this code is fatally flawed if yearlyadj is not actually used in the code. Anthony should write an update.

James
December 4, 2009 8:01 am

To elaborate a little further…for this code to mean anything, it needs to be shown how it affects the output/final result. Mr. Greiner has not done that. I would not bother posting this stuff unless you can determine a little more precisely what effect the suspect code has on the output. If the answer is “none”, then there really isn’t necessarily a problem.

Claude Harvey
December 4, 2009 8:01 am

I find it interesting how many of the respondents who can’t read and interpret computer code and who know nothing of the author’s veracity, background or motives are willing, without the slightest hesitation, to swallow what the author of this piece says, hook-line-and-sinker. I believe they do so BECAUSE IT TELLS THEM WHAT THEY WISH TO HEAR.
Isn’t that what got the “true believing” AGW crowd into the mess they’re now in?
Rein it in a bit, boys and girls!
CH

P Gosselin
December 4, 2009 8:02 am

Interesting AlGorithm.

David Harrington
December 4, 2009 8:12 am

Been looking at this. I assume that the values in yearlyadj are then applied to the temperature data?
Also this fudge series flattens out in the late 70’s (2.5, 2.6, 2.6, 2.6, 2.6 and 2.6.) what is the effect of that?
REPLY: Not the temperature data, but the tree ring proxy data – Anthony

NK
December 4, 2009 8:13 am

Love the commenters to this site — even the alarmists like Phil, because most everyone tries to make sense. Looking at this, it’s code for some unknowable purpose, and hopefully the UEA investigation is not a whitewash and looks at the code written over the years at CRU and more importantly why code was written. Was this code a benign ‘test run’ to see if data would react as predicted, so actual model results would be reliable? Or was this code part of the 1990s scam to manipulate data? Only a thorough real investigation will reveal that. Based on the Mann code manipulations, the Briffa data nonsense, CRU evading legitimate FOI demands et al, strong suspicion of adjusting data to match model results is clearly warranted. Time will tell, but investigations by government and private grant givers are essential; whistleblowers would be a helpful resource.

DeVries
December 4, 2009 8:14 am

The most positive result of these disclosures should be more access to the data and algorithms that are used. It seems to me that looking at source code is a pretty poor way to really determine bias in the algorithms. A much better way would be to simply obtain the model, and run it with constant temperature, thereby graphing the bias directly. Mr. Watts, you seem to be in an excellent position to demand this information. I cannot imagine how this demand could be denied in the present situation. What possible excuse could be invented to oppose such a study?

david
December 4, 2009 8:14 am

The most damning thing is the very comment “; fudge factor”. What the heck kind of comment is that to put in code? The point of comments is to explain what the coder is doing and how; it makes the code useful to anyone (including the programmer himself) who comes after to understand the logic and to debug and correct if necessary.
“fudge factor” is, whatever the rationalizations offered by the Mann contingent, NOT a technical term in climatology, the meaning of which would be obvious to anyone using or reading this code; and it’s not a useful comment to anyone reading the code for scientific purposes.
What it sounds like is “This is the adjustment we need to make it come out the way we want; I told you I’d need to apply a ‘fudge factor’ when we discussed the data in person.” So SOMEONE, or some people, in addition to the coder, probably knew what it meant, but no one else in on conversations before the coding would know–and no one else would be meant to know for sure.

thomas
December 4, 2009 8:15 am

I really hope that some time is taken to graphically illustrate these “value added” graphs as opposed to the raw data.
The genius of the “hockey stick” is that it is clear and a perfect bookcover illustration. Someone can just point to it and say, “See, look for yourself.”
Those who question it, need a compelling and equally clear illustration. I am not saying that anyone should, like the hockey team, construct such graph; but this is as much a PR fight as a scientific one.
My hope is that the reason that Climate Audit has not had any recent postings is because this is what is being done. Wouldn’t it be great if it were revealed at Copenhagen.

JonesII
December 4, 2009 8:23 am

This post shows a “smoking gun” and it does not matter if the guy with the gun in his hand says he just used it to kill a passing bird…That will be a matter of investigation. It’s too late now.

Mambo Banana Patch
December 4, 2009 8:24 am

Thanks for that information … with it, I have concluded that a lot of phony scientists and Al Gore need to be charged with fraud and prosecuted to the full extent of the law.
Governments that have been duped into funding this crap science should be the ones laying the charges. Also, the motion picture academy that gave Gore an Oscar for his documentary, which proved to be a pack of lies and Styrofoam, should rescind it.
The Nobel prize committee has no credibility, so he can keep his phony peace prize. It has the same value as the one given to the man/child Obama and the old terrorist Yassir Arafat. Gore is in good company there.

EdB
December 4, 2009 8:35 am

James, Billy… we need to go to court and put these programmers in fear of 5-year sentences…
then the truth will come out!

JJ
December 4, 2009 8:39 am

"This code fragment is meaningless without the context in which it was used. You need to have evidence that it was used to unjustifiably distort temperature data which was subsequently published. PS. I'm a sceptic, but without context this code is definitely NOT a smoking gun."
Second that.
Look folks, do not squander the opportunity that has been presented to us, by flying off the handle with hastily drawn ‘gotchas’. As currently analysed, this code snippet proves nothing untoward.
Somebody risked a lot to get this data out of East Anglia. With the massive lock-down and circling of wagons that is going on, we may not have another chance like this. It would be a travesty if the real smoking guns end up hidden beneath an avalanche of water pistols turned up by half-assed, Team-like, amateur-hour, conjecture-driven 'analyses'.
What do the values in valadj represent? What is the source of the 0.75 coefficient? To which data were the interpolations applied? What was the result? Was it ever used? What would that mean?
All of those questions need solid answers before making a big deal out of this. There is plenty of damning material in the whistleblower's gift. No need to exaggerate or make stuff up. Leave that to the Team.

helvio
December 4, 2009 8:41 am

My contribution, with two (de)motivational posters 😀
http://www.flickr.com/photos/45273160@N06/sets/72157622809522485/

JohnSpace
December 4, 2009 8:42 am

For those saying that this isn't a "smoking gun" because we don't know how it has been used, you are right as far as that goes; however, you are wrong for a more serious reason. No one knows how this model was used. That is the whole point. These scientists would not turn over how they made the model and what their assumptions were based upon. They lost their data, their models were spaghetti, and they increasingly lost control over the process. I don't care whether this was fraud or incompetence; it sure as heck isn't science, and it most assuredly should not have been used to justify billions in expenses.
This is a "smoking gun" because even the scientists who MADE this code can't explain it, or say whether or not it was used. As such, all their work is now suspect.

Tim Clark
December 4, 2009 8:42 am

Ed Fix (07:23:34) : and others
I agree Ed.
But that has always been the problem: lack of replication to verify results, because the original code and data are not published. The release of these documents should at the very least force the release of the original data and code (if available) so that people can determine whether fudge factors were included. This is what the skeptic community should argue for vehemently. If code is not released, it should be readily apparent to even the most ardent alarmists that there is genuine f***d.

John Galt
December 4, 2009 8:43 am

Would this code be accepted for an assignment in an Engineering or Computer Science course? What grade would it get?

TKl
December 4, 2009 8:44 am

I found this comment in calibrate_correctmxd.pro
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
Could it be possible that they used 'value-added' data later on, so there was no longer any need to apply the artificial correction?

December 4, 2009 8:46 am

The UN IPCC needs to investigate the CRU lies in August about losing their data to avoid FOI:
http://strata-sphere.com/blog/index.php/archives/11715

Billy
December 4, 2009 8:54 am

I agree w/ BB above:
“Unfortunately, the quality of ‘climategate’ discussions is starting to deteriorate and approach that of AGW propaganda at an alarming pace.”
The skeptic side needs to get a grip or run the risk of coming across as tinfoil-hat types.
Here’s some food for thought. Think about this logically. If their intent was to just fudge some numbers and never let anybody see the raw data, why would they need a computer program to do it? Why not just make up a bunch of numbers? Obviously, there is some method to the apparent (to some) madness of the program in question, otherwise why bother? I agree that the AGW side should be completely open with their raw data and with any algorithms for massaging it. Certainly, their secrecy contributes to suspicions. But that doesn’t mean that everything they do is sinister.

J. Bob
December 4, 2009 8:56 am

The fact that the "fudge factor" is in the code has to make one wonder.
Yes, programmers & analysts can be sloppy at first, but as the project progresses, code is either cleaned up or re-written for traceability or maintenance.
In a number of companies, anyone presenting this code in a design or report review would be reprimanded or shown the door in quick order.

g hall
December 4, 2009 8:57 am

As someone who has programmed for a living for the past 15 years may I just point out that you write code to a spec – find who specced the application – there is the clue as to who is guilty of fudging the figures.

carilou
December 4, 2009 8:58 am

anyone have a link to the page that has all the emails available to read? thanks so much!

Doug in Seattle
December 4, 2009 8:59 am

So far the code reviewers have demonstrated that proxy values have been blended with instrumental temperatures contrary to what Michael Mann has publicly stated is NEVER done by real scientists.
Beyond the frustration exhibited in the Harry Read Me file, we don't as yet have any smoking guns with respect to similar manipulation of the temperature record.
It is quite clear from comparison of previous versions of the global temperature record that CRU, NASA, and others have been reducing past temperatures and raising modern ones to accentuate the warming, but so far we have not seen this code or the methods used. This, if it exists, will be the goods that can shut down these clowns.
My advice to the code reviewers is to keep digging.

Ed Fix
December 4, 2009 9:07 am

Jeff Id (07:17:23) :
“Kieth briffa referred…”
I had a college roommate named Keith who got exasperated with me every time I couldn't remember whether his name had an "ie" or "ei". I still need to stop and think about it.

John Diffenthal
December 4, 2009 9:10 am

This code fragment was mentioned on page 8 of Monckton’s fire and brimstone paper that you linked to earlier in the week.

December 4, 2009 9:15 am

>> Billy (07:36:27) :
3) It's not clear to me what the interpol() function does. What is the 'timey' parameter passed? Is THIS the temperature data in a yearly time series maybe? So maybe interpol() fills in those year gaps in (1400-1904, 1904-1909, 1909-1914, etc.) using the fudge factor numbers as some sort of weight? I don't know, but the author's explanation isn't very clear at all. <<
It seems obvious to me that interpol is an interpolation subroutine, probably just a linear interpolation. It takes the 'yrloc' array and the 'valadj' array and creates a value adjustment for a given year (timey) that fits between the 'valadj' values for the 'yrloc' boundary years.
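For anyone who wants to try that reading themselves, here is a minimal IDL sketch. The yearly 'timey' grid is purely an assumption on my part – the real timey array isn't in the leaked snippet:
; Minimal sketch: interpolate the adjustment onto an assumed yearly grid
yrloc  = [1400, findgen(19)*5. + 1904]                 ; 1400, then 1904..1994 in 5-year steps
valadj = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3, $
          0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75    ; the "fudge factor" values
timey  = findgen(595) + 1400.                          ; ASSUMPTION: every year 1400..1994
yearlyadj = interpol(valadj, yrloc, timey)             ; linear interpolation onto the yearly grid
print, yearlyadj[1960 - 1400]                          ; 1960 falls between the 1959 (0.9) and 1964 (1.275) values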

slow to follow
December 4, 2009 9:16 am

Anthony – please pay attention to the comments above. Leaving this post up with no caveat damages the credibility of your blog. Many will be passing by here without the time/interest to read the comments in detail. Unless this can be shown to produce production output, calling it "smoking code" is disingenuous.

Brent
December 4, 2009 9:17 am

But, but, but . . . commenting out lines of code is just for simplicity's sake. On when you need/want it, off when you don't. Commented out or not, what is it doing in there at all?
The gun is smoking alright.

Clark
December 4, 2009 9:18 am

This is one of the most important finds.

J.Hansford
December 4, 2009 9:18 am

In some ways, isn't this all moot? … Jones has already admitted that the original data is gone. Therefore this whole database is meaningless, which makes the HadCRUT temperature history corrupt and meaningless… Am I right in saying that? Is Jones's goose pretty much cooked already, no matter what?

December 4, 2009 9:29 am

, I responded to you on my site.
As I said before, this proves that the CRU data can’t be trusted. Let’s get the results re-run and go from there.

December 4, 2009 9:32 am

@Anthony,
Thanks for adding the caveat. I didn't intend to claim that this disproves anything about global warming. I just wanted to show that there is enough proof in the CRU source code to warrant an investigation – not just some off-the-wall email that was sent 5 years ago.

bill
December 4, 2009 9:36 am

g hall (08:57:10) :
J. Bob (08:56:09) :
John Galt (08:43:01)
JJ (08:39:45) :
david (08:14:07) etc. etc.
The smoking gun is commented out (see the ";" = comment). Why can't you see this? It is obvious.
The code is written by scientists for one-off use. Why would you clean it up and make it presentable? They are not going to sell it to others.

NorseRaider
December 4, 2009 9:37 am

I’m a Cambridge compsci, and I think there are a couple of bits missing from this explanation. The thrust of the analysis is roughly right, but I would advise being cautious unless we can show whether and where this code was used.
So, here is the complete analysis, from me 🙂
Firstly, yrloc does not contain "temperature readings"… it is literally just a list of years. You are quite right, though; the years are:
[1400, 1904, 1909, …. 1984, 1989, 1994]
These are the “x-values” of a function. Note that they form an irregular grid, with the first value being in the late medieval era, and all others in the C20th.
Then, they create the “y-values” of this function .. one for each “x-value”:
[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75
Note, the *0.75 on the end just multiplies every value in the list (it’s like this so they can all be adjusted in a single step). So -0.1 above becomes -0.075 etc. Then, the final mapping of inputs to outputs looks like this:
year y-value
1400 0
1904 0
1909 0
1914 0
1919 0
1924 -0.075
1929 -0.1875
1934 -0.225
1939 0
1944 -0.075
1949 0.225
1954 0.6
1959 0.9
1964 1.275
1969 1.875
1974 1.95
1979 1.95
1984 1.95
1989 1.95
1994 1.95
Now then. This is fine, but it's irregular. There is no value for 1452, for example, and no value for 1970… The IDL interpol function is documented here (thanks NASA 😉 ):
http://idlastro.gsfc.nasa.gov/idl_html_help/INTERPOL.html
It takes another list of years, called “timey”, which we don’t have here, and linearly interpolates the equivalent y-value for each of them. So if the year in “timey” was 1960, the resulting y-value would be interpolated between the given values for 1959 and 1964.
In all likelihood, “timey” is a list of every year from 1400 to present, and so this is just a way to expand the “valadj” array to cover all the years in the range.
It definitely looks suspicious, inasmuch as someone at some point has played with arbitrary adjustments to the C20th. However, it is *not* a smoking gun unless we can show (a) that they used it, (b) where they used it, and (c) how they used it.
Hope that helps! Send me more yummy code to digest 😉
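For anyone who wants to reproduce that year/adjustment table, here is a short IDL sketch of the same arithmetic (only the two arrays from the leaked snippet, nothing else):
; Sketch: print the year -> adjustment table shown above
yrloc  = [1400, findgen(19)*5. + 1904]
valadj = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3, $
          0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75
for i = 0, n_elements(yrloc)-1 do print, yrloc[i], valadj[i]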

Brent
December 4, 2009 9:38 am

“Am I right in saying that? Is Jones’s goose pretty much cooked already, no matter what?”
No, unfortunately. I fail to see an easy way out of this very tangled mess for anyone involved, and they'll try to salvage every last bit of their pride they can. This is, I believe, perhaps the biggest conspiracy in history, and everyone's going to do whatever they can to detach their name from it. There is simply too much pride involved for people and institutions and governments to admit they were wrong openly and frankly.

NorseRaider
December 4, 2009 9:39 am

Array should have been:
[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75

Burch
December 4, 2009 9:40 am

Here’s an interesting file – jones-foiathoughts.doc – entire text below. Take note of the last line. How professional!
———————–
Options appear to be:
1. Send them the data
2. Send them a subset removing station data from some of the countries who made us pay in the normals papers of Hulme et al. (1990s) and also any number that David can remember. This should also omit some other countries like (Australia, NZ, Canada, Antarctica). Also could extract some of the sources that Anders added in (31-38 source codes in J&M 2003). Also should remove many of the early stations that we coded up in the 1980s.
3. Send them the raw data as is, by reconstructing it from GHCN. How could this be done? Replace all stations where the WMO ID agrees with what is in GHCN. This would be the raw data, but it would annoy them.

NorseRaider
December 4, 2009 9:40 am

Getting trimmed for some reason.
[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,
0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75

Optimizer
December 4, 2009 9:43 am

As a person very experienced in IDL programming, I would like to clarify a little, and summarize.
1) Think of the data in the variables yrloc and valadj as a series of (x,y) points on a graph. In this case, the 20 points are (1400, 0.0), (1904, 0.0), … (1994, 2.6*0.75). What the "interpol" function does is to draw straight lines between these points on a graph, and then pick off values for a different series of x values (the years in the variable "timey"). This is done because the dataset you want to add this function to has a different set of x values than the "fudge" function has. The whole point is to get the "fudge" values for the same set of years as some other data set, so you can add the "fudge" to that other data (called "yyy" here). Saying that this processing is making a "guess" might mislead some readers – better to call it an "estimate", an "interpolation", or a "resampling".
The author of this article really should have included an "x vs y" plot, instead of what he has, to better show what's really going on. It actually is even more of a "hockey stick". Anybody with Excel could make such a chart pretty easily by typing the yrloc values in one column, the valadj in another (remember to include the "*0.75"), and then making an x vs y plot from those two columns. (See the short IDL sketch at the end of this comment for the same thing.)
BTW, the “fudge” is more appropriately called a “term” than a “factor” because it is added in, not multiplied.
2) As far as the ethical considerations, this can only be called “suspicious”. From the comments, and the variable name, it’s clear that the author of the code does not like being ordered to add this “fudge” in there AT ALL. Also, the “*0.75” (which multiplies all the numbers within the square brackets by 0.75) suggests some trial-and-error tinkering even with the “fudge” – as do the several leading zeros at the beginning of valadj (which have no effect). Without knowing where the “fudge” numbers come from, one can’t say for sure how legit it is or not, but IF the author of the code really understood what was going on, their commentary makes it likely that something unsavory was going on.
We also don’t know how this code was used. If the results never saw the light of day, then it’s no big deal – just somebody being unsuccessful in trying to pull something (maybe for internal use).
3) The insider explanation offered by David Schure (06:30:57) is such an obvious exercise in blowing smoke You-Know-Where that it is really very insulting. It would make me even more suspicious, but I know that guys like that spew BS as a reflex, whether there's something to hide or not.
4) As to the point about the code that uses the result being commented out, that’s a total red herring. The semi-colons can be added or deleted in a few seconds (I do this all the time). The fact that the code was written in the first place tells you that somebody used it at one time. The fact that it was commented out only tells you that the extra “fudged” curve on the plot wasn’t included the LAST time the code was used. The fact that the commented-out lines were not simply deleted tells you that the author thinks they might need to include them back in again later, or that the author wanted to be able to refer back to see how a prior result had been obtained. People who say “it means nothing” because it’s commented out are “conveniently” ignoring the obvious question as to why it would have been written in the first place.
5) In summary, we know that fudged data was included on a plot over the strong objection of the programmer, but that it was subsequently (if only temporarily) removed. How bad this is depends on where the "fudge" numbers came from, and how the output might have been used. It's very suspicious, but we need to know more to really prove something.
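For what it's worth, the x-vs-y plot described above takes only a few lines of IDL. This is just a sketch of how one might draw it, not anything taken from the CRU files:
; Sketch: plot the "fudge" term against year to show its hockey-stick shape
yrloc  = [1400, findgen(19)*5. + 1904]
valadj = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3, $
          0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75
plot, yrloc, valadj, psym=-4, xtitle='Year', ytitle='Adjustment', $
      title='valadj ("fudge factor") vs. year'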

nigel jones
December 4, 2009 9:45 am

I don’t see that this code snippet or the rest of the code from CRU proves malpractice.
It reinforces the view that they have been up to no good for years.
It suggests that their code development and data management practices are a mess and well below what is required for such important work.
It gives reason to doubt the validity of their output.
If it proves anything, it’s that there’s an absolute need for their data and methods to be brought into the open for review and for them to explain themselves. If they can’t or won’t, they can’t complain if the world judges them on the basis of the leaked material. The clock is ticking.
I go along with the view that the title, "The Smoking Code", and the article presented with no qualifying statement, are somewhat precipitate; not as measured as WUWT usually is.

bill
December 4, 2009 9:47 am

Brent (09:17:09) :
But, but, but . . . commenting out lines of code is just for simplicity's sake. On when you need/want it, off when you don't. Commented out or not, what is it doing in there at all?
The gun is smoking alright.

I hope you're kidding!!!
Why not castigate Briffa for all the fudge factors that he or someone else MAY have written but didn't!!!
As someone else said – why bother with the code? Why not just draw the line you want?

Billy
December 4, 2009 9:48 am

>> Tom_R (09:15:52) :
“It seems obvious to me that interpol is an interpolation subroutine, probably just a linear interpolation. It takes the ‘yrloc’ array and the ‘valadj’ array and creates a value adjustment for a given year (timey) that fits between the ‘valadj’ values for the ‘yrloc’ boundary years.”
Agreed. Over at the author’s blog he posted a link to the interpol() function documentation. After reading that and thinking about it some more I posted over there with essentially the same explanation you give above. I’m guessing timey is just an array containing the values [1400, 1401, 1402, … 1992, 1993, 1994].
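If that guess is right, building such an array is a one-liner in IDL (again, this is only an assumption about what timey holds; the real array isn't in the leaked file):
timey = findgen(595) + 1400.   ; 595 values: 1400, 1401, ..., 1994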

Mark
December 4, 2009 9:53 am

Stay calm people. In my opinion this code doesn’t quite rise to the level of a “smoking gun”. I would liken it to gun smoke in the air. It certainly raises a lot of interesting questions.
Why was this code written?
Where and how was it used?
Please, people, hang on to your skepticism. Don’t rush to judgment and declare this code “evidence of fraud.” That would be too much like declaring “the science is settled.” Let’s not make that mistake.

steven mosher
December 4, 2009 9:54 am

It really would help if folks would read the Hide the decline posts that McIntyre has written (or Jean S or UC).
As usual the warmists are trying to underplay the “chartsmanship” they engaged in and the sceptics are overplaying it.

latitude
December 4, 2009 9:54 am

The smoking gun:
Everyone is looking at all the smoking guns,
and totally missing what I think is the most obvious one.
If you believe CRU, “most of the raw data was destroyed when we moved in 1980”
Then no matter what the current head of it says:
I destroyed it.
I will destroy it.
etc etc
He (CRU director Phil Jones) was not there.
Jones did not work for CRU in 1980.
If you believe CRU, what this really says is that all of Jones’ work, every computer climate program/model….
….everything was built on their “adjusted” data.

WAG
December 4, 2009 10:00 am

Anyone see this?
http://www.nationalpost.com/news/canada/story.html?id=2300282
Apparently Climategate was only one of several organized efforts to break into universities and steal data.

Jack Okie
December 4, 2009 10:00 am

JohnSpace nails it:
The “smoking gun” is that CRU, et al did not make their raw data, metadata, code, etc freely available for others to examine, and supposedly “prestigious” “scientific” journals accepted their papers without requiring the supporting materials to be archived. That is not real science, it’s cargo-cult science.
James: To elaborate a little further… for this code to mean anything, it needs to be shown how it affects the output/final result. Mr. Greiner has not done that. I would not bother posting this stuff unless you can determine a little more precisely what effect the suspect code has on the output. If the answer is "none", then there really isn't necessarily a problem.
Well, James, if you’ll provide us the actual data and code used for CRU and Mann’s various papers, I’ll bet we can determine a lot of things “a little more precisely”. Anyone who believes Mr Greiner has got it wrong, feel free to provide him with the actual materials. Since CRU at the very least suppressed their data, we are free to draw whatever conclusions we can from what IS available. If CRU (or you, James) objects, hand over the data and code which will refute Mr. Greiner’s and other commenters’ conclusions.

Roy Everett
December 4, 2009 10:00 am

I'm playing Devil's Advocate a bit here. This code fragment in isolation isn't a smoking gun, though it is suspicious. As other blog comments say or imply, it needs to be set clearly in context. Was this code actually run in producing the deceptive output? When? Is it just test code? What is the meaning of the fudge factors (e.g. definite intent to mislead, or reasonable calibration)? Was the code repeatedly re-run with different fudge factors until it converged on a desired result? Is it just somebody playing around with an idea? Is it something left over from a "sand-pit" (i.e. experimental non-production code not intended for release) which, perhaps, got "left in" accidentally in later versions? This is the sort of thing for investigators to ferret out by tracing builds and deliveries: you can't work this out just by looking at snippets of code or even whole programs, and it would be easy, given the current hysteria, to leap to unwarranted conclusions.

George
December 4, 2009 10:04 am

Just a summary: the code was commented out in the fragment posted in the middle of this thread. It has been noted by others that it was used in other runs. Burch also noted that it was used in another area. As a note, when I spend a lot of time on a piece of code, I may save it in comments in the file. The really odd part of this is commenting out the use of the adjustment, but not the code that generates it. If you don't use the calculation, normally you comment it out too, as it wastes CPU cycles and storage. Only if you plan to use it elsewhere would you leave it running. Hmm.
To address – “translate to VB… it would be just as obscure.” Arrays in VB can be even funkier, so it would not do any good, especially without the code for Interpol().
The Fudge Factor… this was done with the adjustment hard-coded. Each element in the array is statically defined. It is 0 for the first 5 elements, dips for the next few, then jumps up steeply.
=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75
The multiplier gives the coder one number to change to alter the steepness of the adjustment: *1.0 steeper, *0.5 shallower. You can see the results by changing it. A real developer would have used a variable, and that way would not have to scroll into the code to change it (just change the declaration in the header or definition file), as sketched below.
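In other words, something like the following – a purely hypothetical cleanup, not anything that appears in the CRU files:
; Hypothetical cleanup: name the scale factor instead of hard-coding *0.75
fudge_scale = 0.75   ; one obvious knob: 1.0 = steeper adjustment, 0.5 = shallower
valadj = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3, $
          0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6] * fudge_scale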

Optimizer
December 4, 2009 10:08 am

Nice post, Burch (07:44:25)!
The fact that adding the “fudge” is done in additional places makes it even more suspicious.

Eric
December 4, 2009 10:14 am

While the curve of the adjustments does correspond to the shape of the Mann Hockey Stick, that is an incorrect comparison. The adjustment code shown appears to adjust smoothed temperatures from the 1930s onwards, not from hundreds of years ago.
Thus, this article is misleading.
No one yet knows the reason this code fragment exists (or similar code fragments in other files). The fragment raises good questions – but does not yet give a definitive answer.
Please be cautious in interpretation.
(I am also a s/w engineer.)

George
December 4, 2009 10:16 am

Just a general comment on the code and comments from various links above.
I saw the numbers in the Excel file for grant dollars. If this crap data was in such a state, why the heck did they not use some of the grant money to hire a grad student, even in Computer Sciences, to get them a real database? They could have created a SQL database from all these crap flat files and made life so much easier. Guess I have been in the corporate enterprise world too long and just do not understand academia at all. It does not address the raw data issue, but it could have preserved it and made it easier. Sigh.

Steve in SC
December 4, 2009 10:18 am

My great fear is that the investigation at UEA will focus on who got the files out and not on the conduct of the “team”.
Whether or not this particular snippet of code is a smoking gun, I say
there is definitely the stench of burned black powder in the air.

Allen
December 4, 2009 10:18 am

My friends, these are but the opening battles in a war against media fiction and political rhetoric. As much as you or I understand the falsehoods that CRU has fed to the alarmists, the moderates in this world, who don't understand science and the scientific peer-review process, do tend to understand the stench of corruption.
All we can do as scientists is hammer away in any public forum you can find to bring the corruption into the uncomfortable public glare. The politicians are starting to notice that Climategate can provide an opportunity to garner more votes, and it is our job to persuade the moderate or disinterested voter with facts that get at the truth (or expose the falsehoods). As more voters become persuaded, they will register their views on the opinion polls that the politicians watch like hawks.
For me, the war will be won when all major political parties in democratic states begin to discard climate change policy planks in their election campaigns. As you can see, with elections still years away in some countries, the war will be long.

December 4, 2009 10:26 am

Eric Raymond, a guru blogger of programmers and of the open source movement, exposed this code on Nov 24: http://esr.ibiblio.org/?p=1447.
I’m told he had not commented on climate change in his years of blogging before this point. But it’s clear now, he had been watching and knew exactly what to look for.
Anthony, I’m sure Eric has been reading your blog and CA.
Eric (ESR) responds to questions on his site.
———-
“>The “blatant data cooking” is to use the actual thermometer data where it’s available, which, of course, shows no decline over those decades …
esr: Oh? “Apply a VERY ARTIFICAL correction for decline!!”
That’s a misspelling of “artificial”, for those of you slow on the uptake. As in, “unconnected to any f…. data at all”. As in “pulled out of someone’s ass”. You’re arguing against the programmer’s own description, fool!
In fact, I’m quite familiar with the “divergence problem”. If AGW were science rather than a chiliastic religion, it would be treated as evidence that the theory is broken.”
——
“>The program you are puzzling over was used to produce a nice smooth curve for a nice clean piece of COVER ART.
esr: Supposing we accept your premise, it is not even remotely clear how that makes it OK to cook the data. They lied to everyone who saw that graphic.”
——
“>I’m sure we’ll see a correction or retraction here any minute.
esr: As others have repeatedly pointed out, that code was written to be used for some kind of presentation that was false. The fact that the deceptive parts are commented out now does not change that at all.
It might get them off the hook if we knew — for certain — that it had never been shown to anyone who didn't know beforehand how the data was cooked and why. But since these people have conveniently lost or destroyed primary datasets and evaded FOIA requests, they don't deserve the benefit of that doubt. We already know there's a pattern of evasion and probable cause for criminal conspiracy charges from their own words."
———
This is damning code.
I toyed with it here.
http://joannenova.com.au/2009/11/cru-data-cooking-recipe-exposed/
Imagine what kind of reasons you might come up with for that VERY ARTIFICIAL adjustment.

Back2Bat
December 4, 2009 10:30 am

“Why not just make up a bunch of numbers? Obviously, there is some method to the apparent (to some) madness of the program in question, otherwise why bother?” Billy
Do you think evil is that cut and dried? First a computer program is an apparent attempt at proper procedure and probably started that way. At first glance, it appears that due process is followed. And to those unfamiliar with GIGO, a computer adds legitimacy to the result. I would assume that the code was initially created honestly and then “improved” as was thought necessary.
Computers are great tools for ANY purpose.

Russ R.
December 4, 2009 10:32 am

This is not smoking code.
This is a fresh pile of steaming code.
This is the pile they were trying to rub our noses in, to break our awful habit of using the most cost-efficient means of energy to heat our homes, produce goods and services, and transport us to work.
I look at this, and I know they know all about “peer-review” but don’t know the first thing about “code-review”.

DAV
December 4, 2009 10:33 am

A nit perhaps: “cut-and-try” means experimental; “cut-and-dry” OTOH means ordinary and routine almost meaning straight-forward. Which was meant? I see I’m not the only one asking.

TJA
December 4, 2009 10:40 am

I think that TKl's comment points to an area where further investigation is warranted.
I agree with others that it is certainly possible that a programmer could leave uncommented code that never gets called in a module. It just doesn't seem very likely that it wouldn't have all kinds of "DO NOT USE THIS FUNCTION IN PRODUCTION!"-type warnings around it.

December 4, 2009 10:55 am

WAG (10:00:06),
Notice that the second word in your posted article is “alleged”. Everything in the article is alleged by the IPCC guy. The comments following the article easily debunk the spin.
The guy claims that a hacker “attempted” to break into their computers. FYI, attempts to hack into every computer connected to the internet happen constantly. That’s why people use anti-virus programs.
If the alleged basis to this story was true, it would have been reported by the mainstream media as soon as the emails were leaked. But to make this a talking point more than two weeks later is just another attempt at damage control: “Hey, look over there! I think I saw a hacker!”
This is simply deliberate misdirection. The issue isn’t even the CRU insider who in all probability leaked the info — it’s the obvious fact that at the very least, now no one can rely on any conclusions based on the climate pseudo-science code or the temperature data endlessly massaged and beaten into submission by the CRU.
The CRU's siege mentality, with its ongoing strategy sessions about how to thwart legal FOI requests, is the real story, not this speculation. Conspiring to evade the law is way more serious than this IPCC guy inventing his speculation about a presumed hacking attempt.

December 4, 2009 10:56 am

If it's in there but commented out or whatever, that's standard programmer stuff. Just because it's been commented out doesn't mean it hasn't been run, and since THE ORIGINAL DATA HAS BEEN LOST, I am not willing to give them the benefit of the doubt.

Mike Abbott
December 4, 2009 11:00 am

Anyone who thinks this piece of code is The Smoking Gun had better read this blog post, which is linked above under "Possibly Related Posts":
http://amanwithaphd.wordpress.com/2009/12/01/i-do-not-think-that-means-what-you-think-it-means/
Yep, there is such a thing as a fudge factor in the code. But the only part of the code that appears to use it is commented out by the semicolons. This means that the computer ignores any line that begins with a semicolon.
So, it looks like this fudge factor might only have been used to check out some code and make sure the arrays worked right. It was used in a command (yearlyadj) but not in the actual printout. It was then commented out and not used.

I agree with “Bill” and a few other lonely voices in this thread who believe this piece of code is not a smoking gun. A more likely explanation (as others have suggested) is that it is merely a test of one of the computer models.

NickB.
December 4, 2009 11:00 am

To all of those saying Anthony should take down this post:
Can we definitively prove that this code snippet was used to produce anything? No, we can't – BECAUSE, in all likelihood, CRU can't even reproduce their own products, and because of their refusal to release their "real" code. And that's assuming this isn't "real" code, which is just as specious an argument as saying it's not…
This has created a vacuum in which the leaked release is the only thing we have to go on, despite repeated requests and, IMO, improper denials through FOIA to see what is actually under the hood (a.k.a. access to allow the most basic aspects of the scientific method to be observed).
Self-censoring this post is not the answer, and I think Anthony's note is more than adequate to explain the context of this code and analysis.

Roger Knights
December 4, 2009 11:02 am

I think it would be a good idea to append a “?” to the title of this thread.

December 4, 2009 11:16 am

RE: Optimizer (09:43:39)
You gave an excellent clarification.

Agjag06
December 4, 2009 11:18 am

This interpretation seems to be the most reasonable.
http://www.jgc.org/blog/2009/11/very-artificial-correction-flap-looks.html
He even identifies the published paper where this section of code would have applied.

Burch
December 4, 2009 11:28 am

The following text is copied from the JunkScience.com web site. I have not tried to independently verify the claims, but it makes sense.
——————————————
Update: It has become fairly obvious this archive was not “hacked” or “stolen” but rather is a file assembled by CRU staff in preparation for complying with a freedom of information request. Whether it was carelessly left in a publicly accessible portion of the CRU computer system or was “leaked” by staff believing the FOIA request was improperly rejected may never be known but is not really that important. What is important is that:
There was no “security breach” at CRU that “stole” these files
The files appear genuine and to have been prepared by CRU staff, not edited by malicious hackers
The information was accidentally or deliberately released by CRU staff
Selection criteria appear to be compliance with one or several FOIA request(s)
With some reluctance we have decided to host compressed archives of the hacked files and uncompressed directories you can browse online. Both are linked from the menu or you can simply point your browser to http://junkscience.com/FOIA/

debreuil
December 4, 2009 11:36 am

Note: in briffa_Sep98_d.pro this line is never actually used (it is commented out). It is used in briffa_Sep98_e.pro, in another directory, though. There it is used to plot graphs titled:
‘Age-banded MXD from all sites’
and
‘Hugershoff-standardised MXD from all sites’
I'm not sure what these graphs are for. It is important to follow the code all the way through and be accurate, though. 'Code doesn't lie', so nothing is exposed until the analysis is correct.
Imo (accounting for comments from Gavin I can’t verify so take at face value), I think this may have been an abandoned effort.
I suspect they were looking at what it would take to get a hockey stick from the data and saw it wasn’t going to happen that way. At that point they either did the biased selections or clipping and grafting of temp data. (this is speculation on my part, waiting for someone more familiar with the graphs to figure that out).

Peter
December 4, 2009 11:46 am

Bill:

The smoking gun is commented out (see the “;” = comment). Why cannot you see this?? It is obvious.

It may be commented out in briffa_sep98_d.pro, but, as has been pointed out more than once here, it is definitely not commented out in briffa_sep98_e.pro (in the harris-tree directory). In this file it is used more than once.

boondox
December 4, 2009 12:00 pm

MangoChutney (05:54:06) :
"just a thought, if the emails and code were obtained by a hacker and not released by a whistle blower, would the evidence be inadmissible in a court of law?"
Fourth Amendment protections apply to actions by police or their agents (cops telling the hotel maid to look for the dope), not to actions by private citizens. Assuming the source isn't acting at the behest of law enforcement, suppression is unlikely. Even if law enforcement was somehow involved, the "inevitable discovery" exception might apply. Courts are also much less apt to apply the exclusionary rule in a civil action, the most likely venue.
Besides, the most important court here is public opinion – excluding "smoking gun" evidence of fraud would have the same net effect as suppressing evidence of infidelity in Tiger's car wreck case.

John Galt
December 4, 2009 12:18 pm

bill (09:36:37) :
g hall (08:57:10) :
J. Bob (08:56:09) :
John Galt (08:43:01)
JJ (08:39:45) :
david (08:14:07) etc. etc.
The smoking gun is commented out (see the ";" = comment). Why can't you see this? It is obvious.
The code is written by scientists for one-off use. Why would you clean it up and make it presentable? They are not going to sell it to others.

It’s called professional standards.
Do you take any pride in your work? Are you sloppy when you think nobody is looking? Cut corners when the boss isn’t around?

DocMartyn
December 4, 2009 12:21 pm

Any dataset on which this was used would have an output that was divisible by 0.75. You could pick it up using Benford's law.
Audit commander has a free upload and will do a Benford's analysis. All you need are some outputs.

J. Peden
December 4, 2009 12:26 pm

bill (05:58:43) :
Hang on a moment! This code is from 1998.
If this fudge were included in the Briffa etc. documents then there would be no decline.
So all those “hide the decline emails” would be irrelevant.

No, there was a decline, and it was hidden in IPCC and other official documents – as described by Steve McIntyre and Jean S.
NOAA also even deleted the post 1960 Briffa data from its archive, as though there was no data to begin with!

wronwright
December 4, 2009 12:27 pm

Yeah, but …
Can’t we still restructure the entire economy of the US and the West, effectively transferring all manufacturing to China, India, and the Asian Tigers? Along with millions of jobs and trillions in GNP? Just in case?

MDR
December 4, 2009 12:41 pm

Technical nit about the description of line 4: The IDL findgen(n) function generates a one-dimensional array of n elements, stepping from 0 to n-1. (So findgen(6) goes from 0.0 to 5.0, and not to 6.0 as listed above.)
Thus, the findgen(19) generates a 19-element array going from 0 to 18 in integer steps, which, after being concatenated to the initial value, makes yrloc a 20-element array as stated (and the ending year for yrloc that you list is correct).
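A quick check at the IDL prompt confirms both points (just a sketch, nothing from the CRU files):
print, findgen(6)                      ; 0.0 1.0 2.0 3.0 4.0 5.0 (n elements, 0 to n-1)
yrloc = [1400, findgen(19)*5. + 1904]  ; concatenate 1400 with 1904..1994 in 5-year steps
print, n_elements(yrloc)               ; 20
print, min(yrloc[1:*]), max(yrloc)     ; 1904  1994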

Gary Pearse
December 4, 2009 12:42 pm

Now that we know that the raw data was discarded by UEA (although personally I believe it is spirited away somewhere – they should confiscate Jones et al.'s home computers, etc.), is it possible to recover it by reviewing the "codes" and removing the fudge factors? A bit of forensic computation is needed here. These things must be considered by the independent commissioner appointed by UEA to ferret out the truth.

Gail Combs
December 4, 2009 12:44 pm

WAG (10:00:06) :
Anyone see this?
http://www.nationalpost.com/news/canada/story.html?id=2300282
Apparently Climategate was only one of several organized efforts to break into universities and steal data.
Reply:
Sounds like media spin to make skeptics look like criminal kooks. There is no one to refute his story so it is a very safe spin.
I will believe that AFTER I see a police report dated several weeks ago. At this point I do not believe him and I doubt the media will bother to check if his story is true. Chances are there is a University police department so even checking with the police does not necessarily prove he is telling the truth.
The MSM showed faked film footage of a major riot at Purdue University just after the Kent State riot to get the governor off the hook for the murder of four students. The film was shown nationwide, but the riot never happened. I have not trusted the MSM ever since. The Kent State riot was about the town not allowing married adult Vietnam vets the right to vote if they were students, and NOT about protesting the war. However, denying voting rights would have drawn a lot more criticism, so the story was changed and the riot at Purdue fabricated. (Based on first-hand evidence – my boyfriend and I were at the "riots".)

HAL 9000
December 4, 2009 12:47 pm

Just a moment, just a moment. I have just picked up a fault in the CRU 35 unit. It will go 100% failure within the next 72 hours.

December 4, 2009 12:58 pm

You guys realize that the variable you are talking about is never even used in the code in that file?
here is the code: http://www.di2.nu/foia/osborn-tree6/briffa_sep98_d.pro
they use this data in the interpolation and get this yearlyadj
and that variable is never used.
I think the CRU data is very incriminating; this is just a very bad example.

John Galt
December 4, 2009 12:58 pm

WAG (10:00:06) :
Anyone see this?
http://www.nationalpost.com/news/canada/story.html?id=2300282
Apparently Climategate was only one of several organized efforts to break into universities and steal data.

So releasing emails and documents created by the climate researchers themselves would discredit climate science?
I can understand how fake emails and documents could be used to attempt to discredit AGW, but real documents and real emails are even more damning?
I’ll admit, a lot of software is like sausage — you don’t want to know how it’s made or what’s in it — but given the ramifications of AGW, isn’t it necessary for these scientists to actually show their work?
BTW: Aren’t there whistleblower laws that would offer protection to somebody to step forward and shine the light on what is really happening?

John Galt
December 4, 2009 1:02 pm

George (10:16:34) :
Just a general comment on the code and comments from various links above.
I saw the numbers in the Excel file for grant dollars. If this crap data was in such a state, why the heck did they not use some of the grant money to hire a grad student, even in Computer Sciences, to get them a real database? They could have created a SQL database from all these crap flat files and made life so much easier. Guess I have been in the corporate enterprise world too long and just do not understand academia at all. It does not address the raw data issue, but it could have preserved it and made it easier. Sigh.

Any database would do, even one of the various open-source databases available. Imagine the analysis that could be done if the data were properly stored in a well-designed database.

Jakers
December 4, 2009 1:02 pm

“I tried to write this post in a manner that transcends politics. ” – Hahaha!
“I’m coming to you today as a scientist and engineer with an agnostic stand on global warming.” – Double hahaha!!
“It just shows that all of the data that was the chief result of most of the environmental legislation created over the last decade was a farce.” – _ALL_ of the data? This was applied to _ALL_ of the AGW data in 1998???

Dr A Burns
December 4, 2009 1:09 pm

Bill, Tim,
It doesn't seem a coincidence that the "correction" corresponds almost exactly to the 2.3 degree temperature fall over 35 years in Briffa's 1998 paper. This paper was based on 400 trees across the whole N hemisphere, much more meaningful than the later narrow selections. It would have been a great embarrassment to the CRU at the time.

TJA
December 4, 2009 1:36 pm

“you guys realized that the variable you are talking about is never even used in the code in that file?”
So, can you say with authority that the scope of the variable is limited to the file? Just curious. I don't know the answer, but if the variable is never used anywhere in the code… anywhere, then this is much ado about nothing.
I just don't understand what "hide the decline" means then. Occam's razor suggests that the answer is probably one of the simpler interpretations of the evidence at hand. I can't see how "hide the decline" can be used in so many different places to mean the exact thing it needs to mean to avoid being incriminating, even if that thing is different in every "context".
I see though that the toothpaste cleanup crew are out in force to try and force it back into the tube. The truth will out on this code. Objections will be upheld or answered. This is not going to be settled by rhetorical questions.

chris
December 4, 2009 1:56 pm

To all the nerds dissecting the code like some cadaver down to the cellular level: get friggin' real. Tell this to the Inuit, who for centuries have lived and prospered as an indigenous population, as they watch their homes sink into the methane-excreting bog that was frozen for millennia… no snow on Kilimanjaro, glaciers receding around the planet while the desertification belt expands, bird migrations changing and, most important, something quite easily measurable… duh… the acidification of the friggin' oceans. Seems the biggest carbon sink of all may sooner rather than later not be drinking any carbon at all. Rock on, number freaks, you've really proved it now.
Suggestion: get out into the real world once in a while… smell the roses, taste the bitters ;(

Greg
December 4, 2009 2:05 pm

Wow, Chris, that was awesome. You managed a nice round-up of debunked AGW theories in just a few short sentences. Keep drinking the Kool-Aid! Al Gore is counting on you to keep him well funded. I just wish he would show up and debate someone one day…

Aaron Edwards
December 4, 2009 2:10 pm

It’s the code stupid.
Thanks, Anthony, for reminding us all not to get too polarized before we see the entire code, warts and all. Context is everything here. I wonder how many commenters actually read your note. All we need are the raw data and the naked code. Surely the FOIA will eventually allow qualified individuals to examine the data in question and all will be made clear. That is my hope. The truth has a way of muscling in and establishing itself. Be patient, grasshoppers; strength is gained first through knowledge.

dave-boy
December 4, 2009 2:12 pm

BBC Newsnight last night did an item on this code. An expert programmer said that the code was flawed, as it missed data it was supposed to be analysing,
and that the person who programmed the code was not very good at his job –
something the actual programmer put into the program's comments himself,
with remarks like "oops, there's my bad programming again".
You might find Newsnight on BBC iPlayer, 3/12/2009, starting at 22:30.
Great site, Anthony – keep up the good work.

Dan R.
December 4, 2009 2:12 pm

“I can’t comprehend their justification for this obvious blatant fraud. What was Mann thinking when he manipulated this data in this manner?”
He was thinking about all the millions in grant money that he stood to lose as well as the long-term damage to his career, that’s what.

Iskanderbey
December 4, 2009 2:13 pm

Yup…. It’s true….
I have turned to the dark-side……..
The University site has a form where you can submit your funding research program…..
You know……
Is it just me… or… do I have fun in a strange way?
Anyhoo………
In the comments section, where you can outline what you are after, I have proposed the following:
I am interested in funding your Climate Research Unit as long as they come up with the results I'm looking for. Please pass this on to Professor P. Jones, as he may be looking for a new job, and that means he may be cheaper to afford.
I am also looking for someone to destroy the original raw temperature data after they skew the numbers the way I like them. If you can suggest anyone, that would be great.
Also, I can give a job to any and all of the climate researchers that lied, manipulated and fudged their way into the "Discredited Scientist" (Seancetist) category. That is, I will pay them to apologise to everyone they have deceived.
Wow, I'm on fire here today! Can you please invite me to Phil's early "Retirement" party? I will bring the drinks and the credibility (which is sadly lacking there at the CRU).
Brrr… It’s getting cold here….
MUST *giggle BE *giggle GLOBAL *giggle COOLING
Hey, please tell Phil that he may not have proved Global Warming but he really has warmed my heart. The best was when he pretended to be "Weird Al" Yankovic and drank from the toilet bowl. We are still recovering. It was just the best! *snorting giggle
But we really are greatly interested in his decline. We, here in the Globe that's Warming, have been looking for his decline everywhere but cannot find it. Please ask him if he knows where he last saw it, and we will keep looking here. I hope he has not forgotten where he put his decline.
If he, by mistake, accidentally hid his decline, please ask him to keep looking, as I would like his decline seen by the whole world.
Thank you… and I hear England is warm and dry at the moment.
Seriously…. It’s called “Global Raining”
Throw it up…. see if it has wings……
Back to me now
If you have any research ideas, please contact me, as I think this may be a one-off thing…

JJ
December 4, 2009 2:14 pm

NickB. (11:00:41) :
Your post amounts to nothing more than – "In the absence of data, we are free to make up whatever fits our preconceived conclusion."
That is Teamspeak, and should not be practiced here.
No one is asking that Anthony take down this post, only that he tone it down so that the conclusions drawn are supported by the data presented.
We do not (yet) have sufficient information to declare this code snippet a 'smoking gun' or anything of the sort. We need to keep digging, instead of going off half-cocked.
Anthony should tone down this post until the above questions are reliably answered.
Incidentally, for those of you who think 'It was commented out!' is some sort of definitive resolution to this issue – it ain't. When we programmers 'comment out' code like that, we do it so that we can comment it right back in if we want. If we have no intention of ever running that code again, we don't comment it out. We delete it. That the code was commented out is no proof whatsoever that it was never used.
We need to know what this code does, what the data in it represent, to which data it was applied, what the results were, and how the results were used. Absent that, this is just an interesting lead that needs to be followed. Pretending otherwise is hypocritical, and can backfire in ways that could be used to smokescreen the objects of legitimate criticism that we are seeing in the whistleblower's data.

SunSword
December 4, 2009 2:14 pm

Why didn’t they use some of those grant dollars to hire one or more people trained in IT? Hire a real programmer, a real DBA, and buy a real database? Well…
(1) They are self-centered narcissists who think they are so smart they don’t need to hire anyone because they are so great they can do anything at all, or
(2) Because it is very CONVENIENT that the original data has been "lost" and that the model code is a rat's nest of unintelligible spaghetti.
Or perhaps both.

NickB.
December 4, 2009 2:41 pm

JJ (14:14:08) :
Admittedly, maybe I misread the following comment:
____________________________________
slow to follow (09:16:30) :
Anthony – please pay attention to the comments above. Leaving this post up with no caveat damages the credibility of your blog. Many will be passing by here without the time/interest to read the comments in detail. Unless this can be shown to produce production output to call it “smoking code” is disingenuous.
____________________________________
That said, this is a repost of the original here: http://cubeantics.com/2009/12/the-proof-behind-the-cru-climategate-debacle-because-computers-do-lie-when-humans-tell-them-to/
As a re-post/guest-post, I'm not sure how exactly Anthony could "tone it down". It's my understanding that in this situation the only options are a caveat/disclaimer (which he had already done) or removing it entirely.
Other than your summary of me that "In the absence of data, we are free to make up whatever fits our preconceived conclusion"… that's not what I meant to say, and I don't think that's what I said.
What I was trying to get to is that these out-of-context snippets are the only pieces of the puzzle available, so the feeding frenzy should not come as a surprise. This is red meat, but without context it cannot definitively prove anything. No disagreement on that.

Peter
December 4, 2009 2:44 pm

Chris,
Yes, we’ve all noticed how all the cold places are getting warmer, the warm places are getting colder, dry places getting wetter, wet places getting drier, calm places windier, windy places calmer, cloudy places sunnier and sunny places cloudier.
Anything you wish to add?

Burch
December 4, 2009 2:45 pm

Since there are nearly 200 responses here, and it's a pain to read through ALL of them, let me repeat this point for those who missed it above. Yes, the use of the array is commented out in the exemplar file. However, it is NOT commented out, and is used twice, in a different source file: 'briffa_sep98_e.pro'.
That said, this does not prove that the offending code was used in any published examples. Only that it could have been.
This is just one more bit of the puzzle to ponder.

John Barrett
December 4, 2009 2:48 pm

For Will Hudson
http://playpolitical.typepad.com/issue_ads/2009/12/the-spectators-fraser-nelson-debates-climate-change-with-an-hysterical-bob-ward-from-the-lse.html
h/t conservativehome.com
For our non-UK viewers : The Spectator is a rather right-wing weekly magazine of immense venerability, but to its credit it does open its doors to a broad range of opinions. It used to be very radical in some of its output ( Mark Steyn was a regular contributor ) but has gone off the boil in the last couple of years after it lost its editor Boris Johnson, who is now Mayor of London.
It has dedicated this week’s issue to the AGW conundrum. http://www.spectator.co.uk
This is a fine example of the level of debate; sceptics trying to be conciliatory and just asking for some honesty against shouty, close-eared, shrill rudeness and hubris from the warmists. Bob Ward is an intensely tedious presence in blogs' comments wherever this matter arises.

Peter
December 4, 2009 3:03 pm

As to the ‘adjusted’ data not being used anywhere, the following code comment snippets appear to suggest otherwise.
From abdlowfreq2grid.pro:
; HUGREG=Hugershoff regions, ABDREG=age-banded regions, HUGGRID=Hugershoff grid
; The calibrated (uncorrected) versions of all these data sets are used.
; However, the same adjustment is then applied to the corrected version of
; the grid Hugershoff data, so that both uncorrected and corrected versions
; are available with the appropriate low frequency variability. There is some
; ambiguity during the modern period here, however, because the corrected
; version has already been artificially adjusted to reproduce the largest
; scales of observed temperature over recent decades – so a new adjustment
; would be unwelcome. Therefore, the adjustment term is scaled back towards
; zero when being applied to the corrected data set, so that it is linearly
; interpolated from its 1950 value to zero at 1970 and kept at zero thereafter.
From calibrate_correctmxd.pro:
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
and further on in the same file:
; Now verify on a grid-box basis
; No need to verify the correct and uncorrected versions, since these
; should be identical prior to 1920 or 1930 or whenever the decline
; was corrected onwards from.
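For what it’s worth, the “scaled back towards zero” language in the first comment block describes a simple linear taper of the adjustment between 1950 and 1970. A minimal IDL sketch of that idea (my own reading of the comment; the variable names and the stand-in adjustment array are hypothetical, not taken from the file):
timey    = findgen(595) + 1400.                        ; hypothetical yearly time axis, 1400-1994
adj      = replicate(0.5, 595)                         ; stand-in for the low-frequency adjustment term
weight   = 1.0 - (((timey - 1950.) / 20.) > 0. < 1.)   ; 1 up to 1950, linearly down to 0 at 1970, 0 after
adj_corr = adj * weight                                ; tapered adjustment for the "corrected" data set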

slow to follow
December 4, 2009 3:21 pm

The BBC take a look – again the question of where and how the code is used doesn’t get asked:
http://news.bbc.co.uk/1/hi/programmes/newsnight/8395514.stm

nigel jones
December 4, 2009 3:51 pm

SunSword (14:14:57) :
Why didn’t they use some of those grant dollars to hire one or more people trained in IT? Hire a real programmer, a real DBA, and buy a real database? Well…
(1) They are self-centered narcissists who think they are so smart they don’t need to hire anyone because they are so great they can do anything at all, or
(2) Because it is very CONVENIENT that the original data has been “lost” and that the model code is a rats nest of unintelligible spaghetti.
Or perhaps both.
———————————————–
There’s a third option. Their approach to data management and code isn’t much different from that of a lot of other research establishments: code is mainly written as one-offs, and they either don’t have IT professionals or won’t use them, for the reasons you suggest. In most cases, no one cares whether the analysis in some routine paper is backed by proper IT procedures. The paper just doesn’t matter outside the field, and if it’s found to be rubbish, it simply reflects on the workers and the establishment.
The importance of this work grew, partly through their own political efforts, and they got away with far too much because dubious peer review took the place of verification. It changed from producing one-offs into running an ongoing process, and the expectations around QA and development discipline are very different for the latter. It’s also quite reasonable to insist that the quality of work used to inform huge policy decisions be impeccable.
They ended up holding a tiger by the tail. I can’t say I’ve got a lot of sympathy for them because it’s too serious and there’s too much evidence that they’ve behaved badly, but I can see how these things can come about bit by bit until someone finds themselves in the most unbelievable mess.

greg elliott
December 4, 2009 3:55 pm

The dip at point 8 corresponds to 1934, the hottest year on record in the US. By applying a negative fudge factor at that point, that fact was hidden in the graph, making the temperature rise look like a straight line, to match the CO2 increase.
The fudge factor inflated the results, and the 0.75 multiplier was then used to scale it back, bringing the inflated tree-ring data into line with the instrument data.
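For what it’s worth, here is a minimal IDL sketch of that arithmetic, using the yrloc/valadj values from the posted snippet (the raw-versus-scaled split is mine; the posted line applies the 0.75 in place):
yrloc  = [1400, findgen(19)*5. + 1904]                 ; 1400, 1904, 1909, ..., 1994
valadj = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]
print, yrloc[7],  valadj[7],  valadj[7]*0.75           ; 1934: raw -0.30, scaled -0.225
print, yrloc[19], valadj[19], valadj[19]*0.75          ; 1994: raw  2.60, scaled  1.95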

Tenuc
December 4, 2009 4:05 pm

I find it is the context of the whole package of information in the leaked .zip file that makes a stench so strong it makes me want to hold my nose.
We know the CRU were under pressure to produce the ‘proof’ of CAGW for the IPCC on short lead-times.
We know from the Harry File that the original raw data was in a mess even before it got to him, and can speculate that much of the data it contained may have been interpolated.
We know Jones et al didn’t want anyone outside the cabal to have access to this data.
We know that a program used to produce global average temperature charts had the built in facility to produce ‘fudged’ data for trend charts.
We know the trick to solve the problem with the Briffa tree data was done by a data-join to the thermometer set, which was not alluded to when the data was published.
We know many peer reviewed pro-CAGW papers used the trends from the CRU temperature data in their work.
We know the satellite data was calibrated to the thermometer record, so perhaps has also been debased.
The conclusion is that the majority of climate science from the IPCC has been contaminated by the poor quality of the original data and subsequent adjustments, and this is the reason the whole group could never let anyone outside of it see the mess.
Oh what a tangled web we weave. When first we practise to deceive!

Upon Further Review
December 4, 2009 4:05 pm

As a “global warming/climate change/whatever they’re calling it this week” skeptic, I’ve been going round and round with Warmers who keep trying to play the classic Wizard of Oz “Pay no attention to that man behind the curtain” strategy.
But my bottom line argument is always this: If the East Anglia data is on the up-and-up, why did the so-called scientists refuse to share it (and threaten to delete it), even in response to an FOIA request? Why did they feel it necessary to ignore, silence and even press for the firing of ideological opponents? And if the original data really wasn’t destroyed, why hasn’t it been publicly released in order to shut up the skeptics once and for all?
Make no mistake: This is bigger than one university, and the Warmers know it. The entire global warming movement hangs in the balance.
As Lord Christopher Monckton pointed out, there are only four major data sets for global temperatures, and this was one of them. But the other three are interrelated, either technically or socially (i.e., the e-mails reveal a close kinship between the scientists). So it really isn’t a case of this being only one of countless databases.
And speaking of interrelated, the East Anglia data underpins the IPCC report, which is what convinces politicians to allocate billions of taxpayer dollars to “saving the planet” and sign treaties such as the Copenhagen Agreement.
That, I believe, was the endgame of the global warming hoax: As the Copenhagen Agreement reveals, the Warmers wanted to extort billions of dollars from industrialized nations and create a transnational agency with power over the countries that signed the treaty.
In other words, like so many scams through history, it all boils down to money and power.
Funny, I don’t recall ever hearing about Albert Einstein refusing to share the data behind his theories. That’s because real scientists have no reason to destroy their data, bully opponents, or — in the case of pseudo-scientist Al Gore — cancel speeches and hide from the public.
P.S. In answer to a previous question … actually, the correct term is “cut-and-DRIED,” not “cut-and-dry.” But perhaps “cut-and-try” was what the author intended.

Upon Further Review
December 4, 2009 4:16 pm

As for the standard Warmer comeback, “Oh, yeah? So why are the polar ice caps melting?,” I have a simple response: How do you know they are?
Most of us haven’t been there, and the mainstream media are adept at selective reporting. How many of you saw this April 2009 story in the New York Times or any other MSM outlet?
http://www.news.com.au/antarctic-ice-is-growing-not-melting-away/story-0-1225700043191
Ice is expanding in much of Antarctica, contrary to the widespread public belief that global warming is melting the continental ice cap.
The results of ice-core drilling and sea ice monitoring indicate there is no large-scale melting of ice over most of Antarctica, although experts are concerned at ice losses on the continent’s western coast …
[I]ce is melting in parts of west Antarctica. The destabilisation of the Wilkins ice shelf generated international headlines this month.
However, the picture is very different in east Antarctica, which includes the territory claimed by Australia.
East Antarctica is four times the size of west Antarctica and parts of it are cooling. The Scientific Committee on Antarctic Research report prepared for last week’s meeting of Antarctic Treaty nations in Washington noted the South Pole had shown “significant cooling in recent decades”.
___________________________________________
— In the words of the late Paul Harvey, “And now you know … the rest of the story.”

JJ
December 4, 2009 4:17 pm

NickB. (14:41:45) :
“As a re-post/guest-post, I’m not sure how exactly Anthony could “tone it down”. It’s my understanding that in this situation the only options are caveat/disclaimer (which he had already done) or removing it entirely”
He can first change the title. He can also emphasize the caveat with reformatting. When you said that he had posted a caveat I did a double take, as I had not seen it. I went up and looked for it, and still missed it. Finally found it … it essentially amounts to fine print buried at the bottom, when the large-font head and subhead are sensational. The caveat needs to be BEFORE the poorly reasoned, inflammatory ‘guest post’. He can also make better choices as to what to feature on this blog, frankly. This ‘guest post’ is beneath the quality I expect from this site.
Note the automatically generated ‘Possibly related post’ immediately below the caveat. It links to a blog where someone rips this ‘guest post’ apart and uses it as a broad brush to paint all of us as ‘accusers’ who quote mine and are taking everything found in the leak out of context. This is exactly what I was warning about.
Somebody put their neck on the line to get this stuff out of East Anglia. We owe it to them to not waste it with bullshit ‘red meat feeding frenzies’.

Greg
December 4, 2009 4:24 pm

For the code snip to be interesting it has to be demonstrated that the code is actually used in published data or is the basis for such. I’m pretty certain that this is not the whole code, is not the whole data set, and may or may not be something that anything was based on.
That’s why the note at the end of the post is so important. There is no clear context to this code.
As for the hysterics claiming dangerous warming, well, there’s nothing about the current warming conditions (if any, and I believe that we’ve had some) that is unusual. It’s been both warmer and cooler; Inuit have sunk into the methane bogs before, and somehow survived. So have polar bears. Glaciers have come and gone. Droughts have come and gone. It’s all happened before and will happen again. According to historical records, people, plants, and critters do better (overall) when it’s warmer.
Now, if you want dangerous climate change, look at the end of the last ice age. I know of at least one flood (Spokane, WA) which would have wiped about any US city off the map. It happened as the glaciers were receding and a giant ice dam broke. I’m sure there were others.

Mooloo
December 4, 2009 4:44 pm

“We also don’t know how this code was used. If the results never saw the light of day, then it’s no big deal – just somebody being unsuccessful in trying to pull something (maybe for internal use).”
Why, if this is someone tinkering, was it sitting in a FOIA file?
Sorry, but when I did FOIA requests I trashed anything that was just me jotting notes to myself. Normally they were destroyed as I went along, of course, not at the time of the FOIA because they were never meant to be anything but ephemera. When lawyers do discovery, they present finished documents or at least drafts intended for consideration by others, not every half-arsed revision along the way.
I just don’t credit that this can be anything but an important step along the way for CRU. It might not be the finished article, but I find it difficult to believe it is some random musings.

Arno Nyhm
December 4, 2009 5:36 pm

Indeed, the usage of the manipulated data array is commented out. But why would someone a) even write such a manipulation of the data into his code, and b) then just comment it out instead of deleting it?
I think this definitively does _not_ look like a programmer just playing with code or data. If the manipulation were obvious (something like [-1.,2.,-3., …] or [-2.,-1.,0., …]) I’d say: OK, someone was playing a bit. But to recognize the influence of each element of complex data on the results of your algorithm? Even a programming beginner would pick values that stand out in the result, rather than arbitrary values between -0.3 and +2.6, which, no wonder, fit the Mann hockeythingy perfectly.

SOYLENT GREEN
December 4, 2009 6:34 pm

Well said, Pamela Gray.

Jerky
December 4, 2009 6:56 pm

Why not publish the entire piece of code? This is entirely meaningless without seeing how it was integrated or used. Any experienced programmer knows this is an entirely disingenuous post!

Roger Knights
December 4, 2009 7:01 pm

“I’m not sure how exactly Anthony could “tone it down”.”
Append a question mark to the current title.
“Oh what a tangled web we weave. When first we practise to deceive!”
“Oh what a tangled web we weave.
When first we practice to believe.”
–Laurence J. Peter (of Peter’s Principle)

Norman
December 4, 2009 7:02 pm

I asked about this on Real Climate a couple days ago and Gavin Schmidt was kind enough to reply.
Norman says:
2 December 2009 at 8:41 PM
This is the one that disturbs me. It seems an intentional deception in the program code. Can anyone explain why they did this?
;
; PLOTS ‘ALL’ REGION MXD timeseries from age banded and from hugershoff
; standardised datasets.
; Reads Harry’s regional timeseries and outputs the 1600-1992 portion
; with missing values set appropriately. Uses mxd, and just the
; “all band” timeseries
;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
[edit for space]
[Response: This was discussed earlier. It was an artificial correction to check some calibration statistics to see whether they would vary if the divergence was an artifact of some extra anthropogenic impact. It has never been used in a published paper (though something similar was explained in detail in this draft paper by Osborn). It has nothing to do with any reconstruction used in the IPCC reports. – gavin]

Roger Knights
December 4, 2009 7:11 pm

JJ:
“The caveat needs to be BEFORE the poorly reasoned, inflammatory ‘guest post’. He can also make better choices as to what to feature on this blog, frankly. This ‘guest post’ is beneath the quality I expect from this site.
“Note the automatically generated ‘Possibly related post’ immediately below the caveat. It links to a blog where someone rips this ‘guest post’ apart and uses it as a broad brush to paint all of us as ‘accusers’ who quote mine and are taking everything found in the leak out of context. This is exactly what I was warning about.”

Correct. I’ve been repeatedly warning that we must not give the other side an opportunity to counterpunch, that we must not at this point try for a knockout blow but merely ask for a re-examination, and that we must not focus on the effect of the code and e-mails on the temperature record so much as on their effect on the credibility and trustworthiness of the warmers, of peer review, of consensus, etc.
Incidentally, here’s a neat riposte to “denialist”: insister.

D. D. Freund
December 4, 2009 7:12 pm

“This is where the magic happens. Remember that array we have of valid temperature readings? ”
There is no reason to believe from what you have published that any array of temperatures is involved here.
“The main thing to realize here, is, that the interpol() function will cause the valid temperature readings (yrloc) to skew towards the valadj values.”
As far as I can tell, yrloc contains year numbers (e.g., 1940), not temperatures. It is unclear what effect interpolating such numbers has, but it certainly has nothing to do with temperature, as far as you have shown.
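For readers unfamiliar with IDL, a minimal sketch of what the call does, assuming timey is simply a yearly time axis (that assumption is mine; timey is not defined in the posted snippet):
yrloc     = [1400, findgen(19)*5. + 1904]              ; 20 year values: 1400, 1904, ..., 1994
valadj    = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75
timey     = findgen(595) + 1400.                       ; hypothetical yearly axis, 1400-1994
yearlyadj = interpol(valadj, yrloc, timey)             ; linearly resamples valadj onto timey
yearlyadj is an adjustment curve with one value per year; it contains no temperatures. Whether anything gets skewed depends entirely on what series, if any, yearlyadj is later added to.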

anon
December 4, 2009 9:31 pm

BTW haven’t seen this mentioned anywhere — e-mails have dates so we know when they took place.
Well FWIW the code, data, and other files also have create and last edit dates (see the directory listing) and most of the code (*.pro) files are from 1998 to 2004. Not sure if this will help with the forensic work but seems like it should help create some kind of time sequence to the work.
And the raw data files look like they’re mainly from the early 90s
All the e-mail files have the same date, 1.1.2009, but that just seems like a dummy date assigned to all the files to hide when they were actually extracted, since even e-mails from 2009 carry that same date.

PeterD of Yarrawonga
December 4, 2009 9:33 pm

Comment on, Comment off.
Why was the code commented out?
There are two ways to misrepresent data:
1. Use the unchanged raw data with a complex formula;
2. Change the data and use a simple formula.
Process 2 has the advantage that, if the data must eventually be supplied, the changes will consistently produce the required result.
So, run the raw data to see what it looks like.
Make modifications by applying successive approximations to variables and constants until the desired effect is achieved.
Toggle the comment and make a final run to check that the simple formula produces the expected outcome; if so, leave it as is, no need to go back and change the comment status.
The only potential problem is that someone could compare against the original, unchanged data. How strange that it has been lost.
Maybe it’s a smoking code, maybe not. The point is that the output from this group of researchers is being used right now to invoke a new era of “world governance” in which economies of nations are to be controlled by an organisation that has total contempt for accountability and democratic process, with no avenue of appeal or to opt out.
In the absence of totally transparent data and analysis, is that ok with you?

Jay
December 4, 2009 11:15 pm

Thank you for your clean and honest work, sir. I’m sure you would be glad to have an unkind eye peer-review your work, because it speaks for itself. Maybe if enough honest scientists come forward the damage can be repaired. Thanks again for your time and work.

jorgekafkazar
December 4, 2009 11:17 pm

Prosecutor: And where did you find the defendant, Constable Platypus?
Bobby: Be’ind the ware’ouse, standin’ in the shadders by the back gate.
Prosecutor: Was the gate locked when you arrived?
Bobby: Hit was. There was a chain wif a padlock through two o’ the links.
Prosecutor: And what did the defendant have in his hands?
Bobby: ‘E ‘ad a pair o’ bolt cutters, hex’ibit B. (points at evidence table)
Prosecutor: And how did the defendant explain his presence at the back gate of the warehouse, in the dark, with these bolt cutters in his possession?
Bobby: [looking at his notebook] ‘E said, “I were just practisin’, gov’nor. These ‘ere bolt cutters ‘ave been commented out.”
(laughter)
Prosecutor: And had they been commented out?
Bobby: Someone ‘ad put a semicologne on ’em wif a piece o’ chalk, yerse.
Prosecutor: How long would it take for the defendant to remove the semicolon and cut the chain?
Barrister: Objection, M’lud! PC Platypus is not an expert on chalk.
Judge: Over-ruled. You may answer the question.
Bobby: Habout ‘arf a second.
(laughter)
Prosecutor: Thank you, Constable Platypus.

jorgekafkazar
December 4, 2009 11:21 pm

My comments are disappearing.

JamesinCanada
December 4, 2009 11:47 pm

chris (13:56:53)
If over the past 150 years, or any time frame, the weather or longer-range climate did not change, scientists everywhere would be stumped. Climate changes, and to much bigger degrees than we’ve seen in the past century. The Arctic is not melting, in actuality; pay the big bucks and fly up there and find out for yourself. Northerners don’t build on bogs, they drill housing stilts into permafrost. Foundations aren’t usually possible because every intense summer the permafrost shifts, as per normal.
The ocean issue of released methane, and a deep-ocean issue at that, is one you’d have to ask the Earth about. Does she have unexplored underwater volcanic activity she hasn’t told us about yet? If there’s one thing the past century of air temperature change (what, +0.6 degC, or more recently -0.4 degC?) cannot do, it is warm the massive volume of the oceans on this planet. Only the Sun or the inner Earth can affect that; ask a physicist or an oceanographer. It would be nice if they could step up to the plate and dispel this myth, but I guess it wouldn’t sell too many papers.

mkurbo
December 5, 2009 12:05 am

HAL 9000 did not malfunction. It was an error in the human programming and human direction that created the unfortunate problems.
Mr. Jones knew the raw data was lost for whatever reason. Further, it happened before he headed-up CRU, which would be all the more reason for him to have presented this fact early on and provided his “adjusted” data for review as such. He had many years to say “we lost data and here is how we approached the problem”. But he and CRU did not take that path.
The emails and code show us that humans were involved and there was programming adjusted and bad direction (decisions) given – no different than HAL 9000.
The smoking gun is the agenda that these actions underwrite.

Roy Everett
December 5, 2009 12:14 am

I’m not convinced that this microscopic tearing apart of lines of code trawling for comments, semicolons and arrays is helping to clarify matters, so I may move to a different thread which looks at the whole cadaver, not just the odd suspicious skin mark. What I want to know is:
(1) what is the basic structure of the model? i.e.:
– How are the real physical processes simplified mathematically;
– How are a priori unknown parameters in the model given values to make the model match reality in the form of past data?
– Does the resulting model behave (a) deterministically (b) chaotically (c) periodically (d) some mixture of these (e) chaotically but with bounds (f) “exponential” runaway (g) something new?
– Is the model sensitive to small errors in data (linked to b and e above)?
(2) How was this model encoded in the form of a computer algorithm?
– Has it been tested?
– How?
(3) Where did the actual underlying raw data come from and how was it sanitized (possibly quite legitimately) for use in this model?
(4) Is the model intended for forward extrapolation?
– Is the extrapolation valid?
– Does the extrapolation inherently contain show-stopping flaws (see below)?
(5) Did the output of the preceding steps actually get supplied to IPCC?
At the moment all I see in this thread is commentary on fragments of code which appear to be merely attempting a complex best-fit of disparate data, trying different techniques: I see no physical modelling at all! Yet I can hardly believe that this fiasco all boils down to mathematical artefacts and inappropriate curve fitting and extrapolation, such as (i) extending a series of spline curves beyond the actual data range, or (ii) using second- or higher-degree polynomials to smooth data points. Such techniques would be fine for presentation, to show smoothed and interpolated data in order to emphasise trend rather than random error, but they would obviously generate hockey sticks just outside the range of the data (at both ends). Such an error would be so elementary it should not get past the first stage of a peer review. My limited understanding is that in any case some error like this had already been flushed out by demonstrating that random data produced hockey sticks. Is this model really so opaque and spaghetti-like that one cannot demonstrate conclusively whether or not it is inherently a hockey-stick generator, regardless of the data?
I’m not a climatologist, warmist, or coolist. I’m only a software engineer who used in the past to model simple physical systems; I have only come into this issue, as a spectator, in the last week. I cannot believe that the questions above have not been asked before, and so I plan to look for them and their answers in the existing material.
Aside from this: have there been any individual or mass suicides, apocalyptic Heaven’s Gate style, resulting from the interaction of the global warming hypothesis on panicky over-suggestible people?

TKl
December 5, 2009 1:05 am

The following code is from
FOIA\documents\osborn-tree6\summer_modes\data4alps.pro
This file is dated 11.08.2008. (dd.mm.yyyy)
doinfill=0 ; use PCR-infilled data or not?
doabd=1 ; use ABD-adjusted data or not?
docorr=1 ; use corrected version or not? (uncorrected only available
; for doinfill=doabd=0)

printf,1,’IMPORTANT NOTE:’
printf,1,’The data after 1960 should not be used. The tree-ring density’
printf,1,’records tend to show a decline after 1960 relative to the summer’
printf,1,’temperature in many high-latitude locations. In this data set’
printf,1,’this “decline” has been artificially removed in an ad-hoc way, and’
printf,1,’this means that data after 1960 no longer represent tree-ring’
printf,1,’density variations, but have been modified to look more like the’
printf,1,’observed temperatures.’
It seems they had a pre-corrected version of the data, aka ‘value added data’,
because there is no code actually implementing a correction.
Same thing in
FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
dated 28.11.2006 and
FOIA\documents\osborn-tree6\summer_modes\hovmueller_lon.pro
dated 28.02.2007

December 5, 2009 4:01 am

Anthony: Thanks for the caveat; I’m not sure all readers distinguish between your views and your guest posters’, and I know that you wouldn’t go so far as to say “that all of the data that was the chief result of most of the environmental legislation created over the last decade was a farce” on the strength of one bit of out-of-context code. Perhaps the caveat could be moved to the top?
I do have a theory about the context here, echoing “Morgan” and “TKl” above: I think this was done as input to the tree-ring calibration. We know from other code (like that posted by TKl above) and the Harry file that the recent “decline” in the tree-ring data was a problem when trying to calibrate the earlier data against the instrumental series; such a known (but unexplained) divergence would throw the whole thing off, at least by reducing the correlation, but most likely also by offsetting the whole reconstructed tree data upwards.
Assuming for a moment one believes the tree-ring decline happened for some valid measurement reason, as opposed to trees “peak clipping” the temperature signal, then the logical thing to do here would be to truncate both series at the point where the decline starts, and only correlate before that. Indeed, several code snippets indicate doing just that. You can then at least validly say that the trees correlate with R^2=xx up to 1960, or something like that.
Another, much more dodgy way, would be to fudge the more recent values so they track recent temperatures more closely before trying the calibration. This obviously produces a completely bogus correlation overall, but it might help you get a rough idea of what the calibration offset needs to be for earlier years.
From the comments Morgan posted above, they obviously believed that they could then regenerate a valid correlation/calibration using the real data afterwards. Maybe that’s why the code was commented out – the first run was with the fudge, to seed it, the second without. Obviously one would wish for some kind of parameter to the code to switch it on or off, but I guess if this is only going to be done once, that’s pretty typical programmer behaviour.
(Actually this idea reminds me of bootstrapping a compiler, if that makes sense to any other (former) compiler-heads. You don’t expect the first output to be any good, it just has to be enough to compile the real thing)
But whether it’s valid or not, without any evidence of this actually being used in anger, any idea that any published reconstruction, let alone current data, was “fudged” or “falsified” using it is pure speculation. You simply cannot generalise this to invalidity of all climate data, not least because the CRU data tracks other independent series extremely closely for recent years:
http://www.woodfortrees.org/plot/hadcrut3vgl/from:1979/offset:-0.15/mean:12/plot/gistemp/from:1979/offset:-0.24/mean:12/plot/rss/mean:12/plot/hadcrut3vgl/from:1979/offset:-0.15/trend/plot/gistemp/from:1979/offset:-0.24/trend/plot/rss/trend
(UAH left out because of known divergence between it and RSS)
But a caveat of my own: Many eyes make all bugs (and dodgy algorithms) shallow. Clearly the whole process needs to be thrown out in the open – and maybe even make use of the power of the Open Source movement to actually assist with the coding? I’m already doing this to a small degree with WFT in the technologies I know, but which aren’t really ideal for statistical work; but I’m sure there are lots of R, IDL etc. experts out there who would love to help!
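For what it’s worth, a purely hypothetical IDL sketch of the two-pass “seed, calibrate, then re-run without the fudge” idea described above (the flag and the stand-in series are mine, not anything found in the archive):
yrloc      = [1400, findgen(19)*5. + 1904]
valadj     = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75
timey      = findgen(595) + 1400.
mxd_raw    = fltarr(595)                               ; stand-in for a tree-ring density series
apply_seed = 1                                         ; pass 1: seed with the artificial correction
yearlyadj  = interpol(valadj, yrloc, timey)
mxd_work   = mxd_raw
if apply_seed then mxd_work = mxd_work + yearlyadj     ; rough pass to estimate calibration offsets
; ... fit the calibration here, then set apply_seed = 0 and redo the fit on the unadjusted series ...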

slow to follow
December 5, 2009 5:52 am

@woodfortrees (Paul Clark) (04:01:43) :
“Perhaps the caveat could be moved to the top?”
Seconded

bill
December 5, 2009 6:26 am

woodfortrees (Paul Clark) (04:01:43) :
Thanks for that level headed comment!

bill
December 5, 2009 6:30 am

TKl (01:05:58) :
It seems, they had a pre-corrected version of data, aka ‘value added data’,
because there is no code actually implementing a correction.

If you care to read what you copied:
“printf,1,’IMPORTANT NOTE:’
printf,1,’The data after 1960 should not be used. ”
THE DATA SHOULD NOT BE USED
ie TRUNCATE the data at 1960
It does not say use the fudged data after 1960

December 5, 2009 6:42 am

I couldn’t resist a satirical poke at Monbiot (Dec 4) in the UK’s Guardian.
( see http://www.guardian.co.uk/environment/blog/2009/dec/04/debate-climate-sceptics?showallcomments=true#end-of-comments )
Mr Monbiot, bravo to you, sir!
You made a mug of that old fart, Lawson by ridiculing him with HadCRUT3 temperature series wheeze! I literally wet myself when I read your ‘Guardian’ piece where you say, “What it actually shows is that eight out of the 10 warmest years since records began have occurred since 2001.” Corker! Mum’s the word now on that ‘reconstructed’ 1000 year record set ; )
No one came back at you with the 12 Oct 2009 email, either. You know the part – where that dullard, Trenberth says to Mann, “The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t…. Our observing system is inadequate.” Idiot!
Trenberth and Jones are too much of a liability now. I’m starting to like that ‘apology’ you made more and more. I think I see where you’re taking this one ( thinking: sacrifices for the cause). The team talk in the locker room is Jones and Trenberth are plum scapegoats – throw them out and keep the integrity of the team intact, right? We may do something about this on RC. MM was wondering if you’d be up for more flim-flam in case someone does another ‘Trenberth’?
Btw, SM and his team of holocaust deniers over on CA and WUWT haven’t yet chewed over the lost 800+ ground- based climate-measuring stations from the official GIStemp. We might want to cull another set of ‘cold’ ground-based stations and augment the HadCRUT3 with a slew from China near some power stations (UHI?). Any thoughts?
Regards,
GB

bill
December 5, 2009 7:13 am

mkurbo (00:05:32) :
The raw data is not lost; it just does not reside in uncorrected form at CRU.
The majority of this early and later data will be recorded on paper reports. Retrieving it is not a job for a computer but for manual eyeball and brain.
An example is ships’ log books:
http://www.corral.org.uk/Home
Costs:
http://www.corral.org.uk/Home/project-meetings/team-meeting-28th-september-2009-1
These would be much more ordered and readable than a pile of photocopied papers from around the world.
I believe these logbooks from the 40s have been used by CRU to provide a more accurate SST record from that date.
Having got your paper copies, you would then have to assess each site manually or by comparison to near neighbours.
Any duplicates will have to be removed. Any out-of-tolerance readings killed, or perhaps the whole site, or a period for a site, removed. Other errors due to manual data transfer and entry will also have to be corrected (misreading a 2 as a 7, etc.).
Having done this, there are other adjustments for site relocation (manual?), recording times, and UHI (partially manual; just because, say, “oswalds creek” has a population of 20 does not mean that the measurements are not affected by building work).
Having done all this you have value-added results, which will eventually be made available (when national met offices give permission).
If you want raw data, what you will get is the paper copies. How on earth will you retrieve valid data from these unless you spend the next 30 years investigating and adjusting just as CRU has? Why will yours be more acceptable? Can we afford to wait 40 years for the results?
Even the Surface Stations project (last updated 2008/04/18), with all its helpers, has not managed to produce an output!

Greg Elliott
December 5, 2009 7:31 am

This graph showing solar activity is reversed left to right; otherwise most people would recognize the “hockey stick”.
http://en.wikipedia.org/wiki/File:Carbon14_with_activity_labels.svg
Based on this graph, it could be “scientifically” argued that C14, not CO2, is what is driving climate change.

JJ
December 5, 2009 8:05 am

bill (07:13:28) :
“Having done this, there are other adjustments for site relocation (manual?), recording times, and UHI (partially manual; just because, say, “oswalds creek” has a population of 20 does not mean that the measurements are not affected by building work).”
You claim that adjusting for the UHI of a site with a metropolitan population of 20 is necessary, that such adjustment will have to be site-specific and capable of capturing effects like ‘building work’, and thus that it needs to be performed manually.
Please document that this necessary procedure has been performed on the CRU dataset for all sites with a nearby population of 20 or more, and that it has been performed properly. Ditto any other alleged temperature datasets.
“If you want raw data, what you will get is the paper copies. How on earth will you retrieve valid data from these unless you spend the next 30 years investigating and adjusting just as CRU has?”
Please explicitly define what is meant by ‘just as CRU has’? Really, a lot of people are very interested to know what that is, and have gone so far as to file FOIA requests to learn. If you know, please out with it.
“Why will yours be more acceptable?”
Because it will include proper UHI adjustment, be open source, and not have been produced by secretive tribalists intent on hiding the decline.
“Can we afford to wait 40 years for the results?”
Absolutely. If that is what it takes, we can’t afford not to.
That said, the UK Met seems to think they can get it done in three. They can probably do it in less than ten. As the current ‘no global warming’ period is already at least that long, we can wait.
JJ

TKl
December 5, 2009 8:31 am

bill (06:30:21) :
> If you care to read what you copied:
> “printf,1,’IMPORTANT NOTE:’
> printf,1,’The data after 1960 should not be used. ”
>
> THE DATA SHOULD NOT BE USED
>
> ie TRUNCATE the data at 1960
>
> It does not say use the fudged data after 1960
I did read what I copied. Where did I advise to use the data?
But if one doesn’t want to use the data why “adjust” it?

Matt Y.
December 5, 2009 8:31 am

This is a very sloppy, over-reaching post.
“””Remember that array we have of valid temperature readings?”””
No… I remember an array of years, not temperatures.
“””valadj, or, the “fudge factor” array as some arrogant programmer likes to call it is the foundation for the manipulated temperature readings. It contains twenty values of seemingly random numbers. We’ll get back to this later.”””
Is the whole array a fudge factor, or just the .75 it is being multiplied by (which you left out of your graph, by the way)?
It is not at all clear that the interpol() function is doing what you claim. It seems more likely it is “interpolating” values from the valadj array for the missing years. Do you have more evidence that it is doing something else?
This is embarrassingly bad. Throwing up softballs like this just gives the warmists ammunition to blow us off.

mkurbo
December 5, 2009 8:47 am

Bill – I guess I’m not being clear enough, sorry…
It’s the conduct that is the smoking gun here – there was ample opportunity for CRU to NOT have ended up in this position.
For this to truly unravel, the two dots that need firm connection are conduct and agenda.

K
December 5, 2009 10:37 am

The code means nothing by itself. We have to know where it changed conclusions and publications.
And we don’t know that. And cannot know it yet. Probably no one does, the matter is very complex and the proper records may never have been kept. Or if made, may no longer exist.
Believing that Jones or anyone else can explain what was done is fantasy. Some of those involved know more, some less, but no one will have the entire picture: purpose, data, conversion, adjustment, code, storage, and every other step before publication.
Clearly reassessment is the proper path. The alternative, attempting step-by-step correction, will prove a bottomless swamp.
Governments and stations all over the world will have to provide raw data again. And they may no longer have it. If they thought CRU had safely preserved that data for all time then bureaus around the world may have discarded their originals.
And reassessment must be done honestly and independently. Jones and others may or may not have been honest but they are certainly not independent and disinterested. So the original gang must be excluded from influencing any review.
The law will ponder matters for a long time. Violating Freedom of Information Acts is probably the only violation that can be proved. University discipline, if any, is rare and totally unpredictable.

John A. Jauregui
December 5, 2009 12:54 pm

There’s a lot more to this ClimateGate story. This small (2 to 3 dozen) cabal of climate scientists could not possibly have gotten to this point without extraordinary funding, political support at virtually all levels of government (especially at the national level), and unparalleled cooperation from the national and world media. This widespread, networked support continues even as we the people puzzle over what this is all about. I ask you, “What are you seeing and hearing from our national media on the subject?” Anything? What are you seeing and hearing from all levels of our government, local and regional newspapers and media outlets? Anything of substance? At all of these levels the chatter has remained remarkably quiet on the subject, wouldn’t you say? Why? What points and positions are you beginning to hear on the radio and see on the television? This cabal of scientists has an unprecedented level of support given the revelations contained in the emails, documented in the computer software code and elaborated in the associated programmer remarks within the code. And this has gone on for years, AND continues even in the presence of the most damning evidence one could imagine, or even hope for. Watergate pales in comparison, given the trillions of dollars in carbon offset taxes and cap-and-trade fees hanging in the balance and the unimaginable political control over people’s lives this all implies. The mainstream media’s conspiracy of silence proves the point. Their continued cover-up is as much a part of this crime as the actual scientific fraud. ABC, CBS and NBC are simply co-conspirators exercising their 5th Amendment rights.

December 5, 2009 2:53 pm

Anthony, I made another post regarding the CRU’s source code analysis that I did on December 3rd.
http://cubeantics.com/2009/12/climategate-code-analysis-part-2/
I thought you might be interested since you linked to the previous article.
-Robert

December 5, 2009 3:37 pm

I am just an everyday sort of guy, but I am appalled at this lack of respect for the scientific method and the arrogant skewing of data to suit a political agenda. Full disclosure… you bet your ass, full disclosure. I shall continue to follow this, as I am totally PISSED OFF about it.

RickD
December 5, 2009 7:03 pm

I’m wondering if you have any idea what this code was used for.
I’m also wondering if you’ve ever heard of “simulations”.
Do I need to say more?

RickD
December 5, 2009 7:06 pm

Do you know what simulations are?
Do you understand that scientists often generate random data?
Do you have any evidence that this code was used to falsify data, as opposed to simply used to simulate random data according to a certain model?
This blog post is irresponsible. If you want to know what the code was used for, why not ask the coder? Presuming that it was used for nefarious ends is entirely unjustified.

pat
December 5, 2009 8:21 pm

someone mentioned monbiot still claiming this decade has continued the ‘warming’.
well, bbc world service radio is endlessly playing a promo for a series of programs next year looking at the first decade of the 21st century:
it goes:
10 years of cyber technology
10 years of blah blah (forget what the second one is)
10 years of warming of the planet
bbc had pachauri on for half an hour last night with the most inane host, inserting stuff like ‘what EVERYONE’S worried about is the tipping point’ and the like. the audience got to ask reverential questions and appeared to all make money from the AGW industry. climategate or anything associated with it was not mentioned. shame on the media.

December 5, 2009 10:11 pm

where again do the codes come in?
the data in the images appears to be wrong just at a glance. Where art thou, gyres created from the Earth’s spin? As we go zipping round the sun in orbit, the Earth (sort of like an addict) is spun. There should be circular patterns, counter-clockwise in the Northern hemisphere and clockwise in the Southern.
a quick disclaimer though, I am not a weather person

Roger Knights
December 6, 2009 12:17 am

mkurbo (08:47:16) :
“It’s the conduct that is the smoking gun here – there was ample opportunity for CRU to NOT have ended up in this position.

Correct. Focus on the unscientific, unprofessional, secretive, manipulative behavior and machinations and corruptions of the scientific process by the team and their allies.
Don’t focus on the code, which can’t be properly analyzed from the outside. The most we can do is find possibly suspicious segments, not yet a smoking gun. And don’t focus on the temperature record. The fudgings that have been done on it will not affect it much. The globe is clearly warming, and we know the general shape of the major up-and-down trends since 1850.
Focus rather on the partisanship and dishonesty that is implied by ANY amount of fudging, data-concealment, and manipulativeness, and on the wider web of bias and corruption that must exist in the field for these spiders to have operated so successfully for so long.
The key point of Climategate is that it raises a strong suspicion that “climate science” amounts to “advocacy research,” not that this or that of its findings is incorrect or exaggerated. Its findings and logic MAY be correct, but now we can’t trust them enough to pass ruinously expensive legislation based on them. We would be nuts to “buy” anything from the team or its allies. They’ve forfeited our trust. There must be a reality check by neutral panels of non-climatologists: a reexamination of everything, and a complete exposure of everything that’s been going on under the rock.

Bill Hunter
December 6, 2009 8:40 am

bill (09:47:02) :
“As someone else said – why bother with the code why not just draw the line you want?”
That’s a silly comment. Obviously, if your intent is to deceive, you would want as much legitimate data behind the result as possible. You would want to minimize divergence from the actual data as much as possible while still producing the desired effect.
Here we have seen a willingness to deceive: “hide the decline”. It’s all about designing the right visual impact for the masses.
It’s quite simply propaganda, not science. If that doesn’t disturb you, it should.

Xbnkrbrkr
December 6, 2009 10:18 am

You’re right, e-mails are just talk, but when coupled with their refusal to release research under the FOIA you have to wonder what they are hiding. It doesn’t help their credibility when they write about destroying data rather than giving it up. Actions speak so much louder than words.
“The two MMs have been after the CRU station data for years. If they ever hear there is a Freedom of Information Act now in the UK, I think I’ll delete the file rather than send to anyone.” (Phil Jones, 2 Feb 2005)

December 6, 2009 10:29 am

Obviously the majority of the comments here were made by very knowledgeable people. I tell you the truth, I don’t understand all this code, and I could not tell from this blog whether those scientists really did make up the whole story…
I know only one thing: it is a shame that some scientists, people we put our trust in, are lying and making up “stuff”. We really don’t need the government to spend money on phony claims, BUT we really do need to be environmentally conscious; we have to respect our wonderful Earth.
A balance has to be found between extreme spending on ghost claims and the obvious attitude we should all have of protecting our Earth.
It begins in our backyard: empowering our kids, teaching them respect for the environment, picking up paper from the ground, learning not to waste electricity or water. These are fundamentals that begin right here at our door, and the snowball will grow…
I know, I know, you are going to say we already know this. Good! Then it doesn’t hurt to repeat it, as I still see people throw stuff out of their car windows… their parents must have forgotten to repeat the fundamentals to them… LOL… by the way, I can’t help beeping my horn at them when I see that!!
Now, concerning the research and the government’s part in all this: yes, we need it. We need to find other sources of energy, we need to fight against pollution, and we will still need to trust scientists; but just as we can’t do the job without them, they cannot do it without us.
It is a worldwide consciousness! It is everybody’s job.
Save the planet and SHE will save us!!!
NN

karl golledge
December 6, 2009 11:26 am

stop the WARm-mongers

John A. Jauregui
December 6, 2009 11:46 am

Tiger Woods: The Media’s Allegorical Substitute for Climategate Reporting
Sound familiar, but in another context????
http://www.esquire.com/the-side/opinion/tiger-woods-accident-updates-legacy-120109

December 6, 2009 12:29 pm

Whistleblowers–Make History, Make a Fortune
The AGW Scammers’ Achilles heel is their dependence on government funding.
There is a very robust anti-fraud mechanism in place in the Federal government.
We need to help whistleblowers provide the truth.
Can you imagine working for that supercilious “scientist” at Penn State? You know that there are numerous grad students who helped with the scam. They just need to speak up.
The beautiful thing is that grant fraud is nothing new. And the federal government has a robust anti-fraud Task Force. To combat the rampant fraud in federal grants, they have established a program in which whistleblowers are rewarded with a share of funds recovered from scammers.
Get the word out to Penn State, NASA, NOAA, and any other recipients of federal funding.
Whistleblowers can make a lot of money.
Tell the truth.
Here’s a lawyer that specializes in fraud recovery:
http://howtoreportfraud.com/examples-of-federal-fraud/grant-fraud
If you are a potential whistleblower, or know someone who might be, you can also join our group: http://tech.groups.yahoo.com/group/co2isplantfood/
Anonymity is guaranteed. We’ll help you find the right government or private entity to help you take the proper action.
It’s been a long nasty time for those who believe in science and ethics. Being part of the scammer groups must have been painful and difficult. Now’s the time to do the right thing. Make history. End the most massive financial scam of all time.
Please join the group: http://groups.yahoo.com/group/co2isplantfood
And sign the petition: http://www.ipetitions.com/petition/NOC_NOW/
Join Facebook group: http://www.facebook.com/group.php?gid=191580771509
We can stop this scam, together.
Kent Clizbe
NOC-NOW
Stop the Scam—Halt the IPCC
No Consensus—No Warming

mogar
December 6, 2009 4:43 pm

I’ve been an independent computer consultant for almost 19 years. In a past career I worked in the lab as a chemist for 16 years. I have worked in many places that have similar code control as the CRU, meaning none. The problem here is that nobody knows what the code was when it was run. As some have mentioned some of the code is commented out. We don’t know when it was commented out and when it wasn’t commented out. One would expect that if a programmer wrote the commented code he did not do it just for snicks and giggles and at some point the commented code was run.
Having said all this it points out why if you are trying to prove something that involves vast amounts of data and intricate statistical analysis the data is only part of the proof, the code that was run to arrive at your conclusions is the other. The fact that CRU instead just releases results and is so secretive of both data and code makes it impossible for anyone to evaluate their methods or repeat their findings. Ask yourselves this if they had released the code and the data with their findings would we necessarily be calling this a smoking gun? But they didn’t and that makes this appear to be exactly that. Without the data and the code THAT WAS RUN to produce their findings we have no reason to believe anything the CRU says, or for that matter anyone using CRU datasets. Science that cannot be repeated is not science, I don’t know what it is but I do know what it is not.
Reading the HARRY_ReadMe file it describes an environment where raw data was indiscriminately tossed about with no safeguards to prevent its corruption and program suites that do not perform as advertised in what little documentation that was available.
We programmers refer to shops like this as the ‘wild west’ or ‘hairballs’. I’ve worked in a bunch of them so I know one when I see it and from looking at the code the CRU is one.
So aside from all the other scientific reasons to doubt AGW, what we have here is a theory where all the observed data is of questionable provenance and the methods used to interpret that data were themselves a moving target. We are left with no way to verify anything they have said for the last decade or so. I don’t know what others would call “science” like that, but as far as I can see it’s nothing but a hodgepodge of rumors.

Jake
December 7, 2009 1:02 am

For the religious and political motivations behind the AGW scam, check out this site from an insider — http://www.green-agenda.com
Much of this comes from the UN’s Agenda 21, which includes much more than the AGW scam. There are a few other scams just like it, all part of the same global plan.

Chris O'Brien
December 7, 2009 9:46 am

Let’s not forget that the author of this “information” is a social conservative that has railed against political solutions to climate change on his blog.
I’m not sure that ANY information would’ve led him to reach a different conclusion.
This is just more propaganda from the right.

Daniel
December 7, 2009 12:58 pm

The integrity of science must be preserved above everything else. If the data is no longer verifiable, then painful as it is, we may have no choice than to scuttle our work and start from scratch. If a more rigorous set of checks and balances are required to preserve transparency and validity of data — even revising the peer review process — then so be it. The integrity of science is worth it, no matter the embarrassment, time or cost. At stake is our ability to verify truth itself — and this is worth any price.

Peter Tashkoff
December 7, 2009 1:34 pm

I understand the point the OP is making but I’m not sure I’d simply write off the emails as cheap talk.
Putting aside all the blatant deception revealed, at least one of the emails calls for recipients to delete material that was already subject to a freedom of information enquiry.
I am given to understand that is a criminal offence, if proven, and I’m very interested to see the outcome of the so-called independent investigation being carried out into the activities of Phil Jones and other members of the CRU.
Though my first expectation is to see an attempt to whitewash the thing, after all, that’s the primary purpose of ‘independent’ investigations.

Mike
December 7, 2009 2:44 pm

What is briffa_Sep98_d.pro, and how do you link this information to the CRU?

kavustock
December 7, 2009 3:22 pm

It’s 8 degrees out. Time to go running but I don’t want to bundle up. No worries.
rloc=[today]
valadj=[77] ; fudge factor
Now it’s 85! Thanks Dr Jones!
I bet the good doctor could make a killing in the stock market!

Ian
December 9, 2009 11:16 am

Your plot doesn’t seem to match valadj. valadj goes negative, back to zero, negative again, and then climbs into the positive. Your chart shows just one drop into the negative.

Andy Mikula
December 9, 2009 1:40 pm

All you need to ask those who believe in man-made global warming is:
Do you think GREENLAND was named as a joke?

secretlivesofscientists
December 9, 2009 1:54 pm

Couldn’t agree with you more. I don’t really have an opinion at this point of AGW, because I’m a good scientist and I believe in substantiating claims with data, and there’s not enough data to make the AGW-CO2 linkage, in my opinion.
The failure to disclose the codes really got me all hot and bothered, however. Not only is it detrimental to the scientific community, it hurts the integrity of the practice of science and relationship between the scientist and society as well.
The appropriate action would be for the journals which published the results-in-question to give the following ultimatum: disclose your codes and data analysis in full, or we’re yanking your papers.

Irony Curtain
December 9, 2009 2:08 pm

Don’t know how many of you have seen this 1 minute clip that makes the point brilliantly…
http://iowntheworld.com/blog/?s=Green+Fakers

John rose
December 9, 2009 4:38 pm

In your conclusion you state: “However, being part of the Science Community (I have a degree in Physics) and having done scientific research myself makes me very worried when arrogant jerks who call themselves “scientists” work outside of ethics and ignore the truth to fit their pre-conceived notions of the world. That is not science, that is religion with math equations.”
I know exactly how you feel. I have been dealing with that for years in another area of pseudo-science. Your statement is also a perfect definition of evolutionists. Science has very little to do with the theory of evolution. It, like man-made global warming, is just another religion. “Science” takes a backseat to the pre-conceived notions of atheists. It’s a shame so many people are taken in by both of these hoaxes.

GoreGetsRichFromMorons
December 9, 2009 5:12 pm

Some of these posts are amazing. People are actually claiming there is no smoking gun in this code? Anyone with common sense would think that if man-made global warming were not a hoax, the code, the raw data, and everybody involved would want all of this information published. There is only one reason to hide this information: so that people cannot come to an honest conclusion.
Until the raw data, the entire modelling process, and any additional relevant information are made public, intelligent people can assume this is a hoax, and the idiots can keep buying green and wasting their money. None of these so-called scientists should ever have credibility again.

Jim
December 9, 2009 10:30 pm

I have a question.
Why don’t they just use a simple spreadsheet like Excel, and do the calculations and analysis in there, or something like Minitab? It seems to me that those two pre-written programs could “calibrate”, adjust, average, and trend just about any kind of data you wanted. I mean, I have HUGE data files from work that I treat this way, with no problems whatsoever. Plus that way all of your raw data is sitting right beside any HONEST adjustments that are made.
To me, over-complicating all the data manipulation in a way that makes it harder for someone to double-check the work just makes it more suspicious.

AlgorLiesLiesLies
December 9, 2009 10:31 pm

Exactly GoreGetsRichFromMorons (17:12:52) :…
There is no need for a smoking gun…
These guys should never have been listened to whatsoever!! People have been requesting, then demanding (via FOIA) their data and methods for years, to no avail. That in and of itself tells you they are and were lying.

December 10, 2009 9:05 am

"…the CRU was deliberately tampering with their data."
Oh no, that’s horrible. You mean they were taking their data and replacing inconvenient parts of it with made up data?
“…(potentially) valid temperature station readings were taken and skewed to fabricate the results the “scientists” at the CRU wanted to believe, not what actually occurred.”
Oh, so they weren’t replacing data outright, but they were interpolating valid temperature readings in a way that skews the result. That still seems bad. Why would they need to skew valid temperature readings?
“This closely resembles the infamous hockey stick graph that Michael Mann came up with about a decade ago.”
Interesting. But… that graph shows Mann's 1999 "hockey stick" actually under-projecting the actual temperature readouts. Why would a scientist (even using the word loosely) interested in massaging data to show evidence of global warming massage it to show less global temperature change than actual direct measurement of the temperature shows?
Hey, something else that’s interesting: Mann’s graph shows actual temperature measurements starting sometime after 1800. That makes sense… probably not a lot of reliable direct temperature measurements from before the 19th century. But you said that CRU’s code is manipulating valid temperature readings, and their code also has a skew factor (of 0.0) for 1400. What gives? Where are CRU’s valid temperature measurements from 1400 coming from? And how is Mann’s model going all the way back to 1000?
“the adjustment shown above is applied to the tree ring proxy data (proxy for temperature) not the actual instrumental temperature data.”
Ohh… so these *aren’t* direct, valid temperature measurements after all. They are temperature estimates based on tree ring data.
“we don’t know the use context of this code.”
Perhaps not… but you know, this whole thing sounds a lot like one of those emails that are being bandied about where Phil Jones mentions using Mann’s “trick” regarding tree-ring data.
The explanation I've heard about that is that tree-ring data is not entirely reliable after a certain date, for some not completely understood reasons, and Mann found a way to integrate older tree-ring data with other temperature data, but had to employ some method to do so, which was published in Nature.
Could that be what this code is about? If so… is it really the smoking gun you think it is? You clearly have been following all of this enough to have some additional insight, right?
“I really haven’t taken much of an interest in the whole global warming debate and don’t really have a strong opinion on the matter.”
Oh, I see. You *haven't* been following it, but you decided that because you have a degree in Physics you are qualified to pick some sample code apart and level accusations against others based on your interpretation of what the code is doing, without actually knowing much more than any of the rest of us laymen about tree-ring temperature series.
Do you really have enough here to challenge people’s credentials and question their ethics?
“having done scientific research myself makes me very worried when arrogant jerks who call themselves “scientists” work outside of ethics and ignore the truth to fit their pre-conceived notions of the world.”
If by "work outside of ethics" you mean posting excerpts of code whose context one admits not knowing, covering data from a field in which one doesn't specialize, yet feeling entitled to interpret it anyway and cast aspersions upon those who allegedly wrote/used said code… then I agree: that is worrying indeed.

lsulaw
December 10, 2009 9:42 am

For MangoChutney at (05:54:06), who asked “just a thought, if the emails and code were obtained by a hacker and not released by a whistle blower, would the evidence be inadmissible in a court of law”
Yours truly is only a new attorney, but I nonetheless believe US law is well settled here. I can't speak for the Brits, but I don't think the UK has a Mapp v. Ohio (1961)-style exclusionary rule. If that's the case, they wouldn't have a "fruit of the poisonous tree" doctrine, and I would expect the Brits would admit these files into evidence.
In the US we do have that rule, so evidence illegally obtained is not admissible. Even so, the outcome would not turn on any distinction between a "hacker" and a "whistle blower" in the US, because the rule restrains the police, not private actors. If a private person, acting without the knowledge or direction of the police, steals the data, then that data would be admissible. It doesn't matter whether the person who stole it was a hacker or a whistle blower. As long as the police did not tell them to steal it, or know beforehand that it would be stolen, it's admissible. If the police come into possession of material without themselves violating a law, then the material is admissible.
For example, suppose a technician in a shop stumbles across child porn on a customer's laptop while attempting to resolve a software conflict between two unrelated programs. The tech calls the police. They examine the laptop without a warrant. The porn is admissible. Same situation, except now the police tell the tech to examine every computer brought in and call them whenever he finds kiddie porn. In the US that evidence would be excluded: no probable cause, not in plain sight, search warrant needed but not obtained.
However, in this case, I believe the "hacker/stolen" theory is highly unlikely to be the explanation for the release. An army of wireheads has analyzed the directory structure and content of the materials, and the near-universal conclusion is that the Zip file was NOT the work of hackers.

Audit the fed
December 10, 2009 9:59 am

I will admit my bias as a skeptic of MAN MADE global warming.
However, as a scientist myself (not in these fields), I am absolutely horrified at the lack of integrity in the "climate science" field.
Destroying original data, rigged statistical modeling.
It’s sad to say, but basically money has corrupted science to the point that ANY claims without ALL data being published OPENLY should be immediately put in the trash.

oracle2world
December 11, 2009 1:40 pm

I'm not familiar with this language, but it appears that a continuation of a line of code uses a hyphen "-". So the IF statement is not commented out.
What is interesting is why anyone would program this in the first place. Clearly something nefarious is going on. And “fudge factor” isn’t jargon you expect to find in any comment.
Destroying the original datasets is kind of the clincher, though, because there is no way of testing the code as it changes over time. With a program of even moderate complexity, there is no way to test all the possible error conditions, and a good way to see if something is really squirrelly is to run the changed code against the same datasets for checking (a rough sketch of that kind of check is below).
Not to be unkind, but I get the feeling the key programs were being modified to reach some predetermined conclusion.
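Roughly, the kind of check I mean would look something like this in the language from the post's snippet; the file names and the saved baseline variable here are purely hypothetical:

[sourcecode language="text"]
; Hypothetical regression check: re-run the adjustment step on a frozen
; test dataset and compare the result against a previously saved baseline
restore, 'test_input.sav'     ; assumed to define yrloc, valadj and timey
restore, 'baseline.sav'       ; assumed to define baseline_adj
new_adj = interpol(valadj, yrloc, timey)
if max(abs(new_adj - baseline_adj)) gt 1.0e-6 then $
  message, 'Adjustment output changed - review the latest code change'
[/sourcecode]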

December 18, 2009 3:03 am

I hate to break it to you climate guys, but exactly the same sort of stuff has been going on AND BEEN EXPOSED for years with regard to the antismoking movement: data deletion from internet records, chopping off end points to produce desirable-looking graphs, problems with FOI requests. Nothing new at all, it's just being done in a different area where people are still open-minded enough to look at the exposure and see it for what it is.
Michael J. McFadden
Author of “Dissecting Antismokers’ Brains”

Sonja Christiansen
December 18, 2009 6:20 am

I agree with Michael, and this is not the only realm where science has been misused by 'policy' as justification. What does this signify? Science as a substitute for ethics? Science as a tool of control when religion no longer works? Science as a new form of authority that stifles debate and dissent?
Scientists and their institutions had better watch out, for that possible role, while attractive and surely useful for raising research funding, would also mean the end of science as 'speaking truth to power' for future generations. Perhaps we have reached that stage already.

December 26, 2009 7:21 pm

Has anyone ever heard of the term… (expense padding)
A totally dishonest business person always pads long after the fact. It must really be tough to come up with a believable report, secretly defending the amount already spent!
do-dad