Climategate: hide the decline – codified

WUWT blogging ally Ecotretas writes in to say that he has compiled a compendium of code segments whose programmer comments suggest places where data may have been corrected, modified, adjusted, or busted. Some of the HARRY_READ_ME comments are quite revealing. For those who don’t understand computer programming, don’t fret: the comments by the programmer tell the story quite well even if the code itself makes no sense to you.

To say that the CRU code might be “buggy” would be… well, I’ll just let CRU’s programmer tell you in his own words. (Two short illustrative sketches, written for this post and not taken from the CRU files, follow the excerpts.)


  • FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro

    FOIA\documents\osborn-tree6\mann\oldprog\maps15.pro

    FOIA\documents\osborn-tree6\mann\oldprog\maps24.pro

    ; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions

    ; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually

    ; plot past 1960 because these will be artificially adjusted to look closer to

    ; the real temperatures.

  • FOIA\documents\harris-tree\recon_esper.pro

    ; Computes regressions on full, high and low pass Esper et al. (2002) series,

    ; anomalies against full NH temperatures and other series.

    ; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N

    ;

    ; Specify period over which to compute the regressions (stop in 1960 to avoid

    ; the decline

  • FOIA\documents\harris-tree\calibrate_nhrecon.pro

    ;

    ; Specify period over which to compute the regressions (stop in 1960 to avoid

    ; the decline that affects tree-ring density records)

    ;

  • FOIA\documents\harris-tree\recon1.pro

    FOIA\documents\harris-tree\recon2.proFOIA\documents\harris-tree\recon_jones.pro

    ;

    ; Specify period over which to compute the regressions (stop in 1940 to avoid

    ; the decline

    ;

  • FOIA\documents\HARRY_READ_ME.txt

    17. Inserted debug statements into anomdtb.f90, discovered that

    a sum-of-squared variable is becoming very, very negative! Key

    output from the debug statements:

    (..)

    forrtl: error (75): floating point exception

    IOT trap (core dumped)

    ..so the data value is unbfeasibly large, but why does the

    sum-of-squares parameter OpTotSq go negative?!!

  • FOIA\documents\HARRY_READ_ME.txt

    22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software

    suites - let's have a go at producing CRU TS 3.0! since failing to do that will be the

    definitive failure of the entire project..

  • FOIA\documents\HARRY_READ_ME.txt

    getting seriously fed up with the state of the Australian data. so many new stations have been

    introduced, so many false references.. so many changes that aren't documented. Every time a

    cloud forms I'm presented with a bewildering selection of similar-sounding sites, some with

    references, some with WMO codes, and some with both. And if I look up the station metadata with

    one of the local references, chances are the WMO code will be wrong (another station will have

    it) and the lat/lon will be wrong too.

  • FOIA\documents\HARRY_READ_ME.txt

    I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as

    Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO

    and one with, usually overlapping and with the same station name and very similar coordinates. I

    know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh!

    There truly is no end in sight.

  • FOIA\documents\HARRY_READ_ME.txt

    28. With huge reluctance, I have dived into 'anomdtb' - and already I have

    that familiar Twilight Zone sensation.

  • FOIA\documents\HARRY_READ_ME.txt

    Wrote 'makedtr.for' to tackle the thorny problem of the tmin and tmax databases not

    being kept in step. Sounds familiar, if worrying. am I the first person to attempt

    to get the CRU databases in working order?!!

  • FOIA\documents\HARRY_READ_ME.txt

    Well, dtr2cld is not the world's most complicated program. Wheras cloudreg is, and I

    immediately found a mistake! Scanning forward to 1951 was done with a loop that, for

    completely unfathomable reasons, didn't include months! So we read 50 grids instead

    of 600!!! That may have had something to do with it. I also noticed, as I was correcting

    THAT, that I reopened the DTR and CLD data files when I should have been opening the

    bloody station files!!

  • FOIA\documents\HARRY_READ_ME.txt

    Back to the gridding. I am seriously worried that our flagship gridded data product is produced by

    Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station

    counts totally meaningless. It also means that we cannot say exactly how the gridded data is arrived

    at from a statistical perspective - since we're using an off-the-shelf product that isn't documented

    sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?

    Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding

    procedure? Of course, it's too late for me to fix it too. Meh.

  • FOIA\documents\HARRY_READ_ME.txt

    Here, the expected 1990-2003 period is MISSING - so the correlations aren't so hot! Yet

    the WMO codes and station names /locations are identical (or close). What the hell is

    supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)

  • FOIA\documents\HARRY_READ_ME.txt

    Well, it's been a real day of revelations, never mind the week. This morning I

    discovered that proper angular weighted interpolation was coded into the IDL

    routine, but that its use was discouraged because it was slow! Aaarrrgghh.

    There is even an option to tri-grid at 0.1 degree resolution and then 'rebin'

    to 720x360 - also deprecated! And now, just before midnight (so it counts!),

    having gone back to the tmin/tmax work, I've found that most if not all of the

    Australian bulletin stations have been unceremoniously dumped into the files

    without the briefest check for existing stations.

  • FOIA\documents\HARRY_READ_ME.txt

    As we can see, even I'm cocking it up! Though recoverably. DTR, TMN and TMX need to be written as (i7.7).

  • FOIA\documents\HARRY_READ_ME.txt

    OH FUCK THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm

    hitting yet another problem that's based on the hopeless state of our databases. There is no uniform

    data integrity, it's just a catalogue of issues that continues to grow as they're found.

  • FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro

    printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'

    printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'

    printf,1,'Reconstruction is based on tree-ring density records.'

    printf,1

    printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'

    printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'

    printf,1,'will be much closer to observed temperatures then they should be,'

    printf,1,'which will incorrectly imply the reconstruction is more skilful'

    printf,1,'than it actually is. See Osborn et al. (2004).'

  • FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro

    printf,1,'IMPORTANT NOTE:'

    printf,1,'The data after 1960 should not be used. The tree-ring density'

    printf,1,'records tend to show a decline after 1960 relative to the summer'

    printf,1,'temperature in many high-latitude locations. In this data set'

    printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'

    printf,1,'this means that data after 1960 no longer represent tree-ring'

    printf,1,'density variations, but have been modified to look more like the'

    printf,1,'observed temperatures.'

  • FOIA\documents\osborn-tree6\combined_wavelet_col.pro

    ;

    ; Remove missing data from start & end (end in 1960 due to decline)

    ;

    kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)

    sst=prednh(kl)

  • FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro

    ; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the

    ; EOFs of the MXD data set. This is PCR, although PCs are used as predictors

    ; but not as predictands. This PCR-infilling must be done for a number of

    ; periods, with different EOFs for each period (due to different spatial

    ; coverage). *BUT* don't do special PCR for the modern period (post-1976),

    ; since they won't be used due to the decline/correction problem.

    ; Certain boxes that appear to reconstruct well are "manually" removed because

    ; they are isolated and away from any trees.

  • FOIA\documents\osborn-tree6\briffa_sep98_d.pro

    ;mknormal,yyy,timey,refperiod=[1881,1940]

    ;

    ; Apply a VERY ARTIFICAL correction for decline!!

    ;

    yrloc=[1400,findgen(19)*5.+1904]

    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$

    2.6,2.6,2.6]*0.75 ; fudge factor

    (...)

    ;

    ; APPLY ARTIFICIAL CORRECTION

    ;

    yearlyadj=interpol(valadj,yrloc,x)

    densall=densall+yearlyadj

  • FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro

    ;

    ; Plots density 'decline' as a time series of the difference between

    ; temperature and density averaged over the region north of 50N,

    ; and an associated pattern in the difference field.

    ; The difference data set is computed using only boxes and years with

    ; both temperature and density in them - i.e., the grid changes in time.

    ; The pattern is computed by correlating and regressing the *filtered*

    ; time series against the unfiltered (or filtered) difference data set.

    ;

    ;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE

    ; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***

  • FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro

    ;

    ; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions

    ; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually

    ; plot past 1960 because these will be artificially adjusted to look closer to

    ; the real temperatures.

    ;

  • FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro

    ; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered

    ; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which

    ; gives a zero mean over 1881-1960) after extending the calibration to boxes

    ; without temperature data (pl_calibmxd1.pro). We have identified and

    ; artificially removed (i.e. corrected) the decline in this calibrated

    ; data set. We now recalibrate this corrected calibrated dataset against

    ; the unfiltered 1911-1990 temperature data, and apply the same calibration

    ; to the corrected and uncorrected calibrated MXD data.

  • FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro

    ; No need to verify the correct and uncorrected versions, since these

    ; should be identical prior to 1920 or 1930 or whenever the decline

    ; was corrected onwards from.

  • FOIA\documents\osborn-tree5\densplus188119602netcdf.pro

    ; we know the file starts at yr 440, but we want nothing till 1400, so we

    ; can skill lines (1400-440)/10 + 1 header line

    ; we now want all lines (10 yr per line) from 1400 to 1980, which is

    ; (1980-1400)/10 + 1 lines

    (...)

    ; we know the file starts at yr 1070, but we want nothing till 1400, so we

    ; can skill lines (1400-1070)/10 + 1 header line

    ; we now want all lines (10 yr per line) from 1400 to 1991, which is

    ; (1990-1400)/10 + 1 lines (since 1991 is on line beginning 1990)
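
To make the much-quoted briffa_sep98_d.pro excerpt concrete, here is a minimal standalone IDL sketch. It is not from the CRU files: the flat input series, the year indices and the print statements are made up purely for illustration. It simply reuses the quoted yrloc/valadj numbers and applies them, through the same interpol call, to a series that is deliberately flat, so whatever shape comes out is the “correction” itself.

    ; Hypothetical standalone sketch - NOT from the CRU files. It reuses the
    ; yrloc/valadj values quoted above and applies them to a flat series, so
    ; the shape of the "correction" is visible on its own.
    yrloc  = [1400., findgen(19)*5. + 1904.]             ; 1400, 1904, 1909, ..., 1994
    valadj = [0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1, 0.3, 0.8, 1.2, $
              1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6] * 0.75  ; the quoted "fudge factor"
    x = findgen(595) + 1400.                             ; years 1400 to 1994
    densall = fltarr(595)                                ; deliberately flat (all-zero) series
    yearlyadj = interpol(valadj, yrloc, x)               ; linear interpolation, as quoted
    densall = densall + yearlyadj                        ; the quoted correction step
    print, 'adjusted values in 1900, 1940, 1960, 1980, 1994:'
    print, densall[[500, 540, 560, 580, 594]]            ; approx. 0, -0.02, 0.98, 1.95, 1.95

Fed a perfectly flat series, the “corrected” data comes out with a rise approaching 1.95 units by the late 1970s; that is simply what the quoted offsets add when interpolated year by year.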
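
Harry’s gridding worry is also easy to make concrete. The sketch below is a hypothetical example with made-up station points, not the CRU gridding routine: IDL’s off-the-shelf TRIANGULATE/TRIGRID pair builds a Delaunay triangulation of the stations and interpolates linearly across each triangle, which is the kind of undocumented “off-the-shelf product” the HARRY_READ_ME entry above is complaining about.

    ; Hypothetical sketch - NOT the CRU gridding code. Twenty made-up "stations"
    ; are gridded by linear interpolation over a Delaunay triangulation.
    seed = 42L                                 ; fixed seed so the example is repeatable
    lon  = randomu(seed, 20) * 360. - 180.     ; fake station longitudes
    lat  = randomu(seed, 20) * 180. - 90.      ; fake station latitudes
    temp = randomu(seed, 20) * 30. - 10.       ; fake station "temperatures"
    triangulate, lon, lat, tr                  ; Delaunay triangulation of the stations
    ; linear interpolation onto a 0.5-degree grid over the stations' bounding box
    grid = trigrid(lon, lat, temp, tr, [0.5, 0.5], $
                   [min(lon), min(lat), max(lon), max(lat)])
    help, grid                                 ; a regular 2-D grid at 0.5-degree spacing

However many stations sit inside a given triangle, each grid cell is set by just the three stations at its triangle’s corners, which is Harry’s point about the station counts becoming meaningless.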




445 Comments
Shona
December 7, 2009 4:47 am

JPNJ (18:27:48) :
Novice question. Why does everyone assume the code notes are the original and not added later by the hacker/Whistleblower to really grab attention to certain parts for people like you?

Because had that been the case UEA would have been screaming that fact from the rooftops. They have maintained a deafening silence on the Harry file. Something which tells me it is a bona fide piece of code.

John McLachlan
December 7, 2009 6:45 am

The computer program operates upon temperature or proxy temperature data, as I understand this. Suppose one were to input data consistent with the temperature being constant for the last 10,000 years. Would this program produce an output which indicated a constant temperature, a rise in temperature or a decline in temperature, over the last 10,000 years? If the code is released, then it may be possible to assess whether it has built-in biases and if these are significant.
Also, suppose that instead of rejecting tree rings after 1960 as a valid proxy for temperature, the correlation between tree growth and measured temperature post 1960 was used to calibrate the previous tree-rings, what effect would this have upon inferred historical temperatures? Surely, if tree rings are no longer valid as proxies for temperature, then their claimed validity as proxies for previous times is not necessarily true. Likewise any other proposed proxy.

KC
December 7, 2009 2:04 pm

Why all the complexity? I’m not a scientist, but I’d invest most of my time in calibrating the proxy data to the instrument data and then I’d just plot the proxy data. Any other thought process mixed into the method just skews the results in the direction of their imagination.

Simeon Higgs
December 7, 2009 7:28 pm

Someone needs to replace the input data with a constant number and see what happens

Steven
December 8, 2009 12:56 am

This is all SO FRAUDULENT.
All of these ‘warmers’ who keep diverting the topic to the ‘heinous act of stealing emails’ need to go outside and jump in front of a truck.
THE REAL ISSUE: Why did all of the Climate-Scientists react like they were guilty if they weren’t? Caught with their hands in the cookie jar.
THE REAL ISSUE: If the data is so accurate then shouldn’t it stand the test of scrutiny in an open investigation? We all know now that these scientists would not provide this data to other scientists who wanted to check the validity of the ‘Consensus’ because they had already made their minds up that such people were ‘deniers’. Maybe they can find the data and we can now get this scrutiny properly started for the first time in 30-odd years.
THE REAL ISSUE: Why did all of the Climate-Scientists firstly admit that the emails were theirs and then do a backflip…. I know, how about legal advice was given to them. That’s cool, but once again, WHY DID THEY ADMIT FIRSTLY THAT THESE EMAILS WERE THEIRS? If they were fabricated then the scientists would have categorically denied them.
THE REAL ISSUE: How does a climate institute responsible for the largest chunk of climate modelling being used by almost every political body on Earth suddenly lose all of the modelling data for climate models that are being used to structure the biggest ever re-organization of world economics imaginable? Just this fact alone should mean that all of the Copenhageners should CANCEL THEIR VISITS, and start asking questions about why on Earth they were relying on such an institute in the first place to give professional input if they can’t even store and manage their data.
There are many REAL issues out there, but why do the ‘warmers’ ignore logical proof of manipulation to deceive in order to protect their egos only?
You people should learn to admit when you are wrong, and if you are involved in the coverup then there are people such as me out there that will now not stop until you are all brought to justice and the extent of this world government conspiracy is known to the public.
ENOUGH IS ENOUGH. Copenhagen better come and go, because if any of these stuffed shirts sign anything there will be hell to pay.

random bytes
December 8, 2009 2:41 am

Folks, I am working for a company that has maintained accurate, independent temperature records for at least the last 80 years across multiple sites worldwide, and a single trend is clear: it is downward – the temperatures are dropping, so much so that they are looking at planting nearer the equator in both hemispheres.
One lot of monitoring gear ran nice and quietly for twenty years with no modifications and religiously reporting a decline in temperature, quietly tucked away in a corner of a national park – free from any man-made sources of heat.
This data is not unique – any company that derives its income from planting would also have good records. Why can’t NASA and the UK idiots at least release their data for public scrutiny? Because it has been thoroughly doctored in order to support their beliefs. This is just the most blatant disregard of their own scientific profession – it seems proof is simply not required to support their political and monetary goals, but the newspapers just don’t seem to care.
Thank god for the US Senate and the few people around the world who just won’t take the media crap!!

chris
December 9, 2009 3:26 pm

I never write parts of programs or comments when I don’t use them. That’s too much work. Fraud – plain and simple.

joeedh
December 12, 2009 4:09 pm

These comments don’t look too bad to me. I work in software development myself, and this all looks innocent enough. Furthermore, if it is really possible to make a case that tree ring behavior changed past 1960, I don’t see the harm in swapping out data with ground stations that — after all — are supposed to be even more accurate.
Crap like this shows up all the time in real coding work, and especially in more math-heavy code produced by universities and engineering companies. Honestly I don’t understand all the “developers” who keep saying how evil this is; this sort of sloppiness happens all the time. Sloppy code doesn’t equal inaccurate code; a sloppily-written program may run just as well as one coded better.
It’s not that sloppy programming always produces incorrect results (though it can); it simply produces programs that are hard and expensive to maintain.

joeedh
December 12, 2009 4:11 pm

I should point out that certain industries do have much stricter standards, mostly related to ensuring the behavior of software is always completely predictable and reliable; e.g., the software that runs trains, medical equipment, etc. IIRC this is mandated by law, and as far as I know hasn’t proliferated past those few industries as yet (mostly due to lack of cheap analysis tools I think).

December 15, 2009 7:13 am

This is horrific.
The scientist(s) have told the programmer the results they want. He has worked all weekend to adjust the data to produce the “correct” results and hide the cooling after certain dates.
This is not just the smoking gun: this is the bullet in motion being scored by the barrel of the gun.

nehp
December 17, 2009 2:12 pm

——————-
CITE: SidViscous (09:33:44) :
“Somebody should Slashdot this, it is exactly the sort of thing the programming geeks there will understand and appreciate. They may also be able to provide valuable insight.”
I’ve been watching Slashdot for this story to come up, obviously this is just the sort of thing that SHOULD show up there.
It won’t. Slashdot is filled with pro AGW types, and controlled by them. I wouldn’t be surprised if it never shows up there at all.
—————————-
hahaha
“it is exactly the sort of thing the programming geeks there will understand and appreciate” and “Slashdot is filled with pro AGW types”
do you get it? oh please this is so ridiculous!

joeedh
December 17, 2009 4:35 pm

If you read the code and have any practical programming expertise at all, and if you read all the emails, these documents themselves have little incriminating evidence. The issue isn’t really whether or not they altered the proxy data, it’s whether or not the historical temperature data they used was accurate. Right now there isn’t any damning (in a legal sense) evidence in that regard.
There are unresolved issues in AGW (e.g. is the water vapor feedback really as huge as the models suggest, what about the data the CRU lost, etc) but I believe the false furor these documents have spawned has really hurt the credibility of global warming skeptics. If you have any sort of expertise at all, it’s obvious the documents themselves aren’t nearly so damning as they appear on the surface.
In the end, I hope investigation is done into the real inconsistencies, but I worry these documents have damaged the chances of that happening (if I were prone to conspiracy theories I’d say that was why the CRU wasn’t more active in denying the validity of the documents, to lure skeptics into shooting themselves in the foot, but I doubt the decision was that simple).
The reality is that global warming is happening; the real question is how fast. I don’t have a stance on this myself (I’m not a scientist, nor do I have access to the right data), but I’m hoping we have enough time that we don’t have to sacrifice world economic growth to fix this problem.
Interestingly enough, water vapor levels in the atmosphere are the real driving force here; it’s widely believed that the relatively small amount of warming from CO2 causes ocean water to evaporate faster, producing a feedback where the warming temperatures cause more and more water vapor to enter the atmosphere (water vapor is a huge greenhouse gas). The alleged validity of this theory is the defining factor on whether global warming is truly catastrophic or not.

Rich Ahlgrim
December 22, 2009 1:42 pm

If I wrote code like this I would be facing jail time (I am a financial programmer).
This is criminal.

harpo
December 23, 2009 5:21 pm

Joeedh
Feel free to disagree with any or all of my points.
1. Tree rings were used to reconstruct past temperatures
2. In order to do that one must calibrate against known temperatures
3. Briffa produced a reconstruction that showed that there were serious calibration issues.
4. Jones & Mann used a “trick” to “hide” this fact
To the extent that any money was obtained by Jones and/or Mann based on the “trick” (given that they knowingly hid facts that were at least pertinent, even material, to the issue at hand) then that is *obtaining financial advantage by deception*
I’m pretty sure that’s a crime in most countries around the world. Certainly is in the UK (although it may be called something slightly different).

Doug
December 26, 2009 12:01 pm

joeedh, with all due respect you state that “I don’t have a stance on this myself (I’m not a scientist, nor do I have access to the right data)”, then you confidently propound theories regarding natural feedback loops, the existence of a trend in global warming, and whether this file is damning or not. Allow me to assume that you are also not a programmer.
As someone with a PhD in Chemical Engineering I at least have a solid grounding in scientific programming (10+ years) and the scientific method (15+ years). I do not presume to propound theories on the science involved (I only maintain a healthy scepticism regarding *any* scientific theory, mine included), but I can judge that this project is typical of a university-grade project and simultaneously damages the claim that these scientists have access to the extraordinary evidence required to support their extraordinary claims.
Again, with all due respect, who on earth are you to judge?

December 26, 2009 1:03 pm

This is the Tim Mitchell who wrote the code that Harry is dealing with:
“…………….Although I have yet to see any evidence that climate change is a sign of Christ’s imminent return, human pollution is clearly another of the birth pangs of creation, as it eagerly awaits being delivered from the bondage of corruption (Romans. 19-22).
Tim Mitchell works at the Climactic Research Unit, UEA, Norwich, and is a
member of South Park Evangelical Church.”
from this article:
http://www.e-n.org.uk/1129-Climate-change-and-the-christian.htm
google the following sentence for his bio, link
“In 1997 I moved to Norwich to carry out the research for a PhD at the
Climatic Research Unit (CRU) of the University of East Anglia. ..”
Tim Mitchell bio: (a little bit changed to CAPS by me)
In 1997 I moved to Norwich to carry out the research for a PhD at the
Climatic Research Unit (CRU) of the University of East Anglia. My subject
was the development of climate scenarios for SUBSEQUENT USE BY RESEARCHERS investigating the impacts of climate change. I was supervised by Mike Hulme and by John Mitchell (Hadley Centre of the UK Meteorological Office). The PhD was awarded in April 2001.
http://www.cru.uea.ac.uk/~timm/personal/index.html – Cached
These guys found him FIRST. Weeks ago! (about 2/3 down the page)
http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625&page=13
To quote Harry: (Dr Ian ‘Harry’ Harris – Harry_Read_Me.txt)
"So what the hell did Tim do?!! As I keep asking."
Whilst many people of faith are excellent, dedicated professional scientists, I have a few doubts that an evangelical eco-Christian (my label), who is obviously passionate and committed to the above, clearly believes in human pollution/corruption, and is apparently anticipating Christ’s return to earth due to impending climate armageddon, is as open as they may think they are (or should be) to both sides of the debate (i.e. the theory could be wrong!).
http://www.e-n.org.uk/2625-Day-after-tomorrow.htm
“The librarian chooses to rescue an old Bible, not because he believes in
God, but because its printing was ‘the dawn of the age of reason’. In this
film we see how far we have fallen. Lost, we retreat into a virtual world
where disaster becomes entertainment and the unreal seems more real than reality itself. ‘For whom tolls the bell? It tolls for thee.’
Dr. Tim Mitchell, climate scientist”
Where is Tim, Why can’t ‘Harry’ just ask him?
Dr Tim obviously left CRU around 2004, as his published research papers dry up.
2004: Dr. Tim Mitchell, formerly a scientist, now a student at LTS
London Theological Seminary
Evangelical Protestant college for the training of preachers and pastors.
Provides degrees up to Masters level. Includes course details and resources.
http://www.ltslondon.org
This guy has an opinion on the code:
http://di2.nu/200912/01.htm
Don’t forget the church he worshiped at (the irony):
South Park (Al Gore – ManBearPig – South park episode)
need to save my $ for those CO2 taxes

joeedh
December 26, 2009 4:37 pm

harpo:
The tree ring data did turn out to be faulty, and modern models are robust without them. It’s not like they were the only ones used; there are ice cores, coral reef samples, etc.
The ring data fit the temperatures before 1960. Correcting them to match the observed temperatures after 1960 sounds a bit off, but I’m not a statistician so I’m not totally sure. You could argue that humans had changed the world so much that the tree ring inconsistency was caused by us (this view would fit the relevant scientist’s worldview quite well, though I disagree with it), but ultimately tree rings were judged to be too unreliable to rely on in the latest models.
The “spike” in the hockey graph comes from temperature records, it’s not affected by the proxy data (which is inherently less accurate and more error-prone). My original point is that the “spike” is the important part, and if you assume the temperature records are correct, they do show a significant global temperature increase, that does seem to correspond to the models.
That’s why I said the real question was the temperature record accuracy, not the proxy data. Since so many sites and blogs fell into the fallacy of concentrating on the proxy data, I worry that skeptics have shot themselves in the foot, resulting in a significantly weakened position in the long run, and no check on the climate change activists.
My position has morphed to “well, it is happening, the question is how fast; drastic measures will do more harm than good, especially in developing countries that might not be able to bear the economic strain.” But it’s hard to tell, there are so many shadings, and both sides are biased to hell and occasionally lie their asses off.
I suspect the truth is somewhere in between, but finding it seems impossible without access to the raw data of temperature stations across the world (and a statistician skilled enough to normalize the data and analyze it).

joeedh
December 26, 2009 4:51 pm

Doug:
I’m taking a little more active stance since I’ve done more research (which seems to show that both sides are full of crap).
I’m a software developer (admittedly about half self-taught) in computer graphics. My experience with university CG code hasn’t been that much better than what I saw in the files, which is why I was so confused about people blowing up at it. I’ve also seen much *worse* code in my work; it happens all the time in the commercial world.
As for the water vapor feedback, I was simply reporting it (I’m not fully convinced of its validity myself, and I thought I implied as much). Just look at the NASA news release on it from last year (I’ve not really deciphered the paper it was based on, it *seemed* weak but I’m not qualified enough to say).
Anyway, I stand on my original point. I don’t want the skeptic movement to die (I think they ask some very interesting questions, and serve as a check on the climate change activists, and since I think both sides are full of crap I don’t want either one to win outright). But I’m afraid this will come back and bite them in the long term.

Kurt
January 7, 2010 10:02 am

The sum of the squares going negative is very revealing.
The sum of the squares is a basic calculation in statistics, and is one of the steps taken in doing a regression or correlation. It is a measure of the amount of error; lack of correlation is another way of saying it.
If the sum of the squares is very large it means there is a lot of error and almost no correlation at all. If the sum of the squares gets really large, and they are using integers in the calculation, then the sum of the squares will go negative. Whatever word size they are using in the program – 16-bit, 32-bit, etc. – when it is maxed out, it will be a negative number.
This means the errors that they are calculating are so large, they max out the integers being used to store the sum of the squares.
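
Here is a tiny illustration of the mechanism Kurt describes (hypothetical, and in IDL rather than the Fortran of anomdtb.f90; whether OpTotSq really was stored in an integer is Kurt’s surmise, not something the quoted extract shows). IDL’s default integer type happens to be 16-bit signed, which makes the wrap-around easy to see: keep adding squared errors to a fixed-width signed accumulator and, once the total passes the largest value the type can hold, it silently wraps negative.

    ; Hypothetical illustration of a sum of squares wrapping negative - not the CRU code.
    ; IDL's default integer is 16-bit signed (maximum 32767), so a few large squared
    ; "errors" are enough to overflow the accumulator.
    optotsq = 0                                       ; 16-bit signed accumulator
    err = 100                                         ; a large residual, also 16-bit
    optotsq = optotsq + err*err  &  print, optotsq    ;  10000
    optotsq = optotsq + err*err  &  print, optotsq    ;  20000
    optotsq = optotsq + err*err  &  print, optotsq    ;  30000
    optotsq = optotsq + err*err  &  print, optotsq    ; -25536  (wrapped past 32767)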

January 18, 2010 8:41 am

If it wasn’t a hacker, but a whistleblower, my money is on whoever was writing these notes.
Think about it – the stress, the ‘Twilight Zone Syndrome’ mentioned in trying to deal with it – SWEARING in the rem notes!
If it was me trying to deal with this, and I had Phil Jones and Michael Mann up my rear, I could totally see myself, jacked up on too much coffee, no sleep, and a head full of total bullshit just deciding one night to hit the reset button on the whole charade.
It’s funny how they talk of ONE email taken out of context, when clearly “Hide The Decline” was literally their mantra, spoken and written hundreds of times daily.
Let this trigger the end of the UN, Socialism as a whole, the arrogant, shrill nonsense of the ‘Green’ movement, and Taxation-as-Industry in our time.
Let’s quit whining, and get back to work – there is an entire galaxy to explore.
