Climategate: hide the decline – codified

WUWT blogging ally Ecotretas writes in to say that he has made a compendium of programming code segments whose comments by the programmer suggest places where data may be corrected, modified, adjusted, or busted. Some of the HARRY_READ_ME comments are quite revealing. For those who don’t understand computer programming, don’t fret: the comments by the programmer tell the story quite well even if the code itself makes no sense to you.

http://codyssey.files.wordpress.com/2009/02/software_bug.jpg

To say that the CRU code might be “buggy” would be…well I’ll just let CRU’s programmer tell you in his own words.

  • FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
    FOIA\documents\osborn-tree6\mann\oldprog\maps15.pro
    FOIA\documents\osborn-tree6\mann\oldprog\maps24.pro

    ; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions

    ; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually

    ; plot past 1960 because these will be artificially adjusted to look closer to

    ; the real temperatures.

  • FOIA\documents\harris-tree\recon_esper.pro

    ; Computes regressions on full, high and low pass Esper et al. (2002) series,

    ; anomalies against full NH temperatures and other series.

    ; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N

    ;

    ; Specify period over which to compute the regressions (stop in 1960 to avoid

    ; the decline

  • FOIA\documents\harris-tree\calibrate_nhrecon.pro

    ;

    ; Specify period over which to compute the regressions (stop in 1960 to avoid

    ; the decline that affects tree-ring density records)

    ;

  • FOIA\documents\harris-tree\recon1.pro

    FOIA\documents\harris-tree\recon2.pro
    FOIA\documents\harris-tree\recon_jones.pro

    ;

    ; Specify period over which to compute the regressions (stop in 1940 to avoid

    ; the decline

    ;

  • FOIA\documents\HARRY_READ_ME.txt

    17. Inserted debug statements into anomdtb.f90, discovered that

    a sum-of-squared variable is becoming very, very negative! Key

    output from the debug statements:

    (..)

    forrtl: error (75): floating point exception

    IOT trap (core dumped)

    ..so the data value is unbfeasibly large, but why does the

    sum-of-squares parameter OpTotSq go negative?!!
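For the non-programmers: one common way a sum of squares can "go negative" is signed-integer overflow — a rogue, unfeasibly large data value gets squared and the result wraps around the 32-bit limit. A minimal sketch of that failure mode in Python (the values here are made up for illustration, not CRU's actual data):

```python
import ctypes

def i32(x):
    """Truncate to a signed 32-bit integer, as Fortran INTEGER*4 arithmetic would."""
    return ctypes.c_int32(x).value

values = [100, 200, 46341]  # 46341**2 exceeds 2**31 - 1

op_tot_sq = 0
for v in values:
    op_tot_sq = i32(op_tot_sq + i32(v * v))  # wraps modulo 2**32 instead of raising

print(op_tot_sq)  # a negative total, despite summing squares
```

One bad station value is enough: every term added is non-negative, yet the accumulator comes out "very, very negative", exactly as Harry describes.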

  • FOIA\documents\HARRY_READ_ME.txt

    22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software

    suites - let's have a go at producing CRU TS 3.0! since failing to do that will be the

    definitive failure of the entire project..

  • FOIA\documents\HARRY_READ_ME.txt

    getting seriously fed up with the state of the Australian data. so many new stations have been

    introduced, so many false references.. so many changes that aren't documented. Every time a

    cloud forms I'm presented with a bewildering selection of similar-sounding sites, some with

    references, some with WMO codes, and some with both. And if I look up the station metadata with

    one of the local references, chances are the WMO code will be wrong (another station will have

    it) and the lat/lon will be wrong too.

  • FOIA\documents\HARRY_READ_ME.txt

    I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as

    Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO

    and one with, usually overlapping and with the same station name and very similar coordinates. I

    know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh!

    There truly is no end in sight.

  • FOIA\documents\HARRY_READ_ME.txt

    28. With huge reluctance, I have dived into 'anomdtb' - and already I have

    that familiar Twilight Zone sensation.

  • FOIA\documents\HARRY_READ_ME.txt

    Wrote 'makedtr.for' to tackle the thorny problem of the tmin and tmax databases not

    being kept in step. Sounds familiar, if worrying. am I the first person to attempt

    to get the CRU databases in working order?!!

  • FOIA\documents\HARRY_READ_ME.txt

    Well, dtr2cld is not the world's most complicated program. Wheras cloudreg is, and I

    immediately found a mistake! Scanning forward to 1951 was done with a loop that, for

    completely unfathomable reasons, didn't include months! So we read 50 grids instead

    of 600!!! That may have had something to do with it. I also noticed, as I was correcting

    THAT, that I reopened the DTR and CLD data files when I should have been opening the

    bloody station files!!

  • FOIA\documents\HARRY_READ_ME.txt

    Back to the gridding. I am seriously worried that our flagship gridded data product is produced by

    Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station

    counts totally meaningless. It also means that we cannot say exactly how the gridded data is arrived

    at from a statistical perspective - since we're using an off-the-shelf product that isn't documented

    sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?

    Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding

    procedure? Of course, it's too late for me to fix it too. Meh.
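Harry's worry about linear triangulation making station counts "totally meaningless" is easy to see in a toy 1-D analogue (this is an illustration of the general point, not CRU's actual gridding code): linear interpolation uses only the two stations bracketing a grid point, whereas a distance-weighted scheme — in the spirit of the angular-distance weighting he found deprecated — lets every station contribute.

```python
# Hypothetical stations as (position, temperature) pairs; values are made up.
stations = [(0.0, 5.0), (10.0, 15.0), (20.0, 100.0)]
x = 4.0  # grid point to estimate

# Linear interpolation between the bracketing pair only:
(x0, t0), (x1, t1) = stations[0], stations[1]
linear = t0 + (t1 - t0) * (x - x0) / (x1 - x0)  # the station at 20.0 is ignored

# Inverse-distance weighting: every station contributes, so adding or
# removing stations actually changes the estimate.
weights = [1.0 / abs(xs - x) for xs, _ in stations]
weighted = sum(w * t for w, (_, t) in zip(weights, stations)) / sum(weights)

print(linear, round(weighted, 3))
```

With the linear estimate, the third station might as well not exist — which is precisely why a "station count" attached to a triangulated grid cell says nothing about how the value was arrived at.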

  • FOIA\documents\HARRY_READ_ME.txt

    Here, the expected 1990-2003 period is MISSING - so the correlations aren't so hot! Yet

    the WMO codes and station names /locations are identical (or close). What the hell is

    supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)

  • FOIA\documents\HARRY_READ_ME.txt

    Well, it's been a real day of revelations, never mind the week. This morning I

    discovered that proper angular weighted interpolation was coded into the IDL

    routine, but that its use was discouraged because it was slow! Aaarrrgghh.

    There is even an option to tri-grid at 0.1 degree resolution and then 'rebin'

    to 720x360 - also deprecated! And now, just before midnight (so it counts!),

    having gone back to the tmin/tmax work, I've found that most if not all of the

    Australian bulletin stations have been unceremoniously dumped into the files

    without the briefest check for existing stations.

  • FOIA\documents\HARRY_READ_ME.txt

    As we can see, even I'm cocking it up! Though recoverably. DTR, TMN and TMX need to be written as (i7.7).
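The (i7.7) Harry mentions is a Fortran edit descriptor: write the integer in a 7-character field, zero-padded to at least 7 digits. Python's format mini-language produces the same rendering (the station ID below is made up):

```python
# Sketch of Fortran's (i7.7) output format: 7-wide, zero-padded integers.
station_id = 94120  # hypothetical station identifier
formatted = f"{station_id:07d}"
print(formatted)  # 0094120
```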
  • FOIA\documents\HARRY_READ_ME.txt

    OH FUCK THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm

    hitting yet another problem that's based on the hopeless state of our databases. There is no uniform

    data integrity, it's just a catalogue of issues that continues to grow as they're found.

  • FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro

    printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'

    printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'

    printf,1,'Reconstruction is based on tree-ring density records.'

    printf,1

    printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'

    printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'

    printf,1,'will be much closer to observed temperatures then they should be,'

    printf,1,'which will incorrectly imply the reconstruction is more skilful'

    printf,1,'than it actually is. See Osborn et al. (2004).'

  • FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro

    printf,1,'IMPORTANT NOTE:'

    printf,1,'The data after 1960 should not be used. The tree-ring density'

    printf,1,'records tend to show a decline after 1960 relative to the summer'

    printf,1,'temperature in many high-latitude locations. In this data set'

    printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'

    printf,1,'this means that data after 1960 no longer represent tree-ring'

    printf,1,'density variations, but have been modified to look more like the'

    printf,1,'observed temperatures.'

  • FOIA\documents\osborn-tree6\combined_wavelet_col.pro

    ;

    ; Remove missing data from start & end (end in 1960 due to decline)

    ;

    kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)

    sst=prednh(kl)

  • FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro

    ; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the

    ; EOFs of the MXD data set. This is PCR, although PCs are used as predictors

    ; but not as predictands. This PCR-infilling must be done for a number of

    ; periods, with different EOFs for each period (due to different spatial

    ; coverage). *BUT* don’t do special PCR for the modern period (post-1976),

    ; since they won’t be used due to the decline/correction problem.

    ; Certain boxes that appear to reconstruct well are “manually” removed because

    ; they are isolated and away from any trees.

  • FOIA\documents\osborn-tree6\briffa_sep98_d.pro

    ;mknormal,yyy,timey,refperiod=[1881,1940]

    ;

    ; Apply a VERY ARTIFICAL correction for decline!!

    ;

    yrloc=[1400,findgen(19)*5.+1904]

    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$

    2.6,2.6,2.6]*0.75 ; fudge factor

    (...)

    ;

    ; APPLY ARTIFICIAL CORRECTION

    ;

    yearlyadj=interpol(valadj,yrloc,x)

    densall=densall+yearlyadj
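For readers who don't speak IDL: the fragment above pins the hand-chosen offsets in valadj to the years in yrloc, linearly interpolates them to every year (IDL's interpol does linear interpolation; extrapolation outside the range doesn't arise here), and simply adds the result to the density series. A pure-Python sketch of that interpolate-and-add step — the valadj and yrloc values are copied from the quoted fragment, while the year range is illustrative:

```python
from bisect import bisect_right

# Values copied from the quoted briffa_sep98_d.pro fragment.
yrloc = [1400] + [1904 + 5 * i for i in range(19)]
valadj = [v * 0.75 for v in
          [0, 0, 0, 0, 0, -0.1, -0.25, -0.3, 0, -0.1,
           0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]]  # "fudge factor"

def interpol(xs, ys, x):
    """Linear interpolation, matching IDL's interpol for in-range x."""
    i = max(0, min(bisect_right(xs, x) - 1, len(xs) - 2))
    frac = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + frac * (ys[i + 1] - ys[i])

years = range(1400, 1995)  # hypothetical reconstruction span
yearlyadj = [interpol(yrloc, valadj, yr) for yr in years]
# densall = densall + yearlyadj then shifts the density series by these offsets.
print(round(interpol(yrloc, valadj, 1960), 3))
```

The offsets are zero until the early 20th century, dip slightly negative, then ramp steeply upward — by 1960 the added "correction" is already about +0.975 density units, and it flattens out near +1.95 for the most recent decades.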

  • FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro

    ;

    ; Plots density ‘decline’ as a time series of the difference between

    ; temperature and density averaged over the region north of 50N,

    ; and an associated pattern in the difference field.

    ; The difference data set is computed using only boxes and years with

    ; both temperature and density in them – i.e., the grid changes in time.

    ; The pattern is computed by correlating and regressing the *filtered*

    ; time series against the unfiltered (or filtered) difference data set.

    ;

    ;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE

    ; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***

  • FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro

    ;

    ; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions

    ; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually

    ; plot past 1960 because these will be artificially adjusted to look closer to

    ; the real temperatures.

    ;

  • FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro

    ; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered

    ; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which

    ; gives a zero mean over 1881-1960) after extending the calibration to boxes

    ; without temperature data (pl_calibmxd1.pro). We have identified and

    ; artificially removed (i.e. corrected) the decline in this calibrated

    ; data set. We now recalibrate this corrected calibrated dataset against

    ; the unfiltered 1911-1990 temperature data, and apply the same calibration

    ; to the corrected and uncorrected calibrated MXD data.

  • FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro

    ; No need to verify the correct and uncorrected versions, since these

    ; should be identical prior to 1920 or 1930 or whenever the decline

    ; was corrected onwards from.

  • FOIA\documents\osborn-tree5\densplus188119602netcdf.pro

    ; we know the file starts at yr 440, but we want nothing till 1400, so we

    ; can skill lines (1400-440)/10 + 1 header line

    ; we now want all lines (10 yr per line) from 1400 to 1980, which is

    ; (1980-1400)/10 + 1 lines

    (...)

    ; we know the file starts at yr 1070, but we want nothing till 1400, so we

    ; can skill lines (1400-1070)/10 + 1 header line

    ; we now want all lines (10 yr per line) from 1400 to 1991, which is

    ; (1990-1400)/10 + 1 lines (since 1991 is on line beginning 1990)
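The line bookkeeping in that last excerpt ("skill" is presumably a typo for "skip") is easy to check: with 10 years per data line plus one header line, the counts work out as follows.

```python
# Verifying the comment arithmetic from densplus188119602netcdf.pro:
# 10 years per line, plus 1 header line to skip at the top of each file.
lines_to_skip_440 = (1400 - 440) // 10 + 1    # file starting at yr 440
lines_to_read_1980 = (1980 - 1400) // 10 + 1  # lines covering 1400-1980
lines_to_skip_1070 = (1400 - 1070) // 10 + 1  # file starting at yr 1070
lines_to_read_1991 = (1990 - 1400) // 10 + 1  # 1991 sits on the line beginning 1990
print(lines_to_skip_440, lines_to_read_1980, lines_to_skip_1070, lines_to_read_1991)
```

So the arithmetic in the comments is internally consistent, whatever one makes of the surrounding code.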





PR Guy
November 25, 2009 10:51 am

Over at Connolley’s blog I posted:
“The Emails show that Jones and Mann can’t be trusted. HARRY_READ_ME shows that the code is incompetent and the code itself shows manual adjustments that have no scientific basis. This is sufficient evidence to call for a third party review of the entire CRU methodology. ”
To which I got two replies:
——
PR Guy – what papers was the HARRY_READ_ME code used on? Have you any evidence it was used at all?
Posted by: Chris S. | November 25, 2009 12:07 PM
—–
PR Guy,do you even know which dataset/product the HARRY_READ_ME code is dealing with?
Posted by: Adam | November 25, 2009 12:46 PM
—-
To which I responded:
“Chris S and Adam, these are very reasonable questions. Perhaps you should submit a FOIA to find out. I’m sure we all agree that answers to these sorts of questions are vital and should not be obstructed.”
This last comment was deleted by William (or maybe the moderator, if there is a moderator). The Team never lets points get scored against them on their court.

Frank Lansner
November 25, 2009 10:51 am

CLIMATE GATE IN NEW ZEALAND!
J Sallinger's climate faction caught in temperature swindle:
http://nzclimatescience.net/index.php?option=com_content&task=view&id=550&Itemid=1
Here NZ temperature graph before and after “adjusments”:
http://www.klimadebat.dk/forum/vedhaeftninger/newzealand.jpg
Sallinger has changed the NZ temperature trend for the 20th century from 0.06 K to 0.92 K!
The team behind these findings will now move on to other countries.
WAY TO GO!

hitfan
November 25, 2009 10:52 am

I’ve done some programming myself. I used to put in swear words in the code all the time.
In fact, I would even use the F word and variations thereof for variable and object names, LOL.

John Peter
November 25, 2009 10:55 am

Hi Folks
What do you think about this one? It looks as if the “data adjustment contamination” has infected New Zealand as well. Look at
http://www.climatescience.org.nz/
and click on Link at: CLIMATEGATE IN NEW ZEALAND? – TEMPERATURE RECORDS MANIPULATED Science
It is incredible to read how New Zealand’s National Institute of Water & Atmospheric Research (NIWA) seems to have managed to make a “hockey stick” out of their raw data which apparently shows that there has been no warming of any consequence since 1850. One wonders who else has been involved in this game.

t-bird
November 25, 2009 10:56 am

I guess this proves their point that Global Warming truly is man-made. Totally made up, in fact, by a few.

jcl
November 25, 2009 10:58 am

I mean seriously, [snip]????? “yearlyadj”? Temp proxy declines so just add a ramp to the values??????
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
(…)
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj

Dishman
November 25, 2009 11:00 am

I had a really scary thought…
What if the problem isn’t with the tree rings, but with the temperature series?
What if the temperature series is actually off by 2.6C (high) since 1940?
Does that mean we’re actually 2C or more below the 1940s temps?
Are we screwed?

RConnelly
November 25, 2009 11:01 am

The major problem with climate science (along with much of academia) is that they don't use Software Engineers following good software-development practices to develop the programs, models, etc. Mostly it's just PhDs hacking stuff together. They are very smart… but software engineering is not their expertise.
Ask them where the Requirements documents, Design documents, Review documents, Test plans, Test results, Configuration Control plans, etc. are.
Now they are asking for $100 billion to be spent, based to some extent upon software that has not passed any formal testing…

November 25, 2009 11:06 am

JCS
“Steve Gavin at RealClimate has refused more than 6 times to post the following message, are you willing to present this very important point for me?”
LOL. JCS, welcome to the club of thousands who Gavin has moderated out because he finds their comments inconvenient. Gavin is a part of the AGW cabal. He is on the distribution list for many of the e-mails from CRU gate. Some of those comments make it clear that Jones and others consider Gavin as the guy that runs interference for them. I have written a small piece about how debates are orchestrated at RC here:
http://reallyrealclimate.blogspot.com/

NikFromNYC
November 25, 2009 11:08 am

HARRY_READ_ME is a great work of stream-of-consciousness literature and perhaps Harry was aware of it so when they asked him to delete it after he got all excited about it being included in a Freedom of Information Act package that then was denied release…he said to himself:
“No, so holp me Petault, it is not a miseffectual whyancinthinous riot of blots and blurs and bars and balls and hoops and wriggles and juxtaposed jottings linked by spurts of speed: it only looks as like is as damn it; and, sure, we ought really to rest thankful that at this deleteful hour of dungflies dawning we have even a written on with dried ink scrap of paper at all to show for ourselves, tare it or leaf it, (and we are lufted to ourselves as the soulfisher when he led the cat out of the bout) after all that we lost and plundered of it even to the hidmost coignings of the earth and all it has gone through and by all means, after a good ground kiss to Terracussa and for wars luck our lefftoff’s flung over our home homeplate, cling to it as with drowning hands, hoping against all hope all the while that, by the light of philosophy, (and may she never folsage us!) things will begain to clear up a bit one way or another within the next quarrel of an hour and be hanged to them as ten to one they will too, please the pigs, as they ought to categorically, as, strickly between ourselves, there is a limit to all things so this will never do.” – James Joyce (“Finnegans Wake” 1939)

PhilW
November 25, 2009 11:11 am

Run these numbers, see if that makes a hockey stick…………
http://spreadsheets.google.com/ccc?key=0Ah4XLQCleuUYdFIxMnhMNnlXb2JQcDZUendjUXpWWUE&hl=en

November 25, 2009 11:13 am

Are there perhaps some people around still denying this fairly decent evidence of poor science? Is there some term we could use for them perhaps?

Glenn
November 25, 2009 11:16 am

;mknormal,yyy,timey,refperiod=[1881,1940]
;
“; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
(…)
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj”
************************************
Hmm. Are there edited versions out there? My file:
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,’Oooops!’
;
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
;
oplot,!x.crange,[0.,0.],linestyle=1
;
plot,[0,1],/nodata,xstyle=4,ystyle=4
;legend,['Northern Hemisphere April-September instrumental temperature',$
; 'Northern Hemisphere MXD',$
; 'Northern Hemisphere MXD corrected for decline'],$
; colors=[22,21,20],thick=[3,3,3],margin=0.6,spacing=1.5
legend,['Northern Hemisphere April-September instrumental temperature',$
'Northern Hemisphere MXD'],$
colors=[22,21],thick=[3,3],margin=0.6,spacing=1.5
;
end

Gary Hladik
November 25, 2009 11:16 am

Reed Coray (09:35:23) : “http://i47.tinypic.com/mrxszt.png
Great picture, but I have a question. Which one is Judas?”
Easy. They all are. 🙂

Bill Hunter
November 25, 2009 11:16 am

If I produced work like that, I would want to delete it too rather than hand it to an auditor.
Short of a detailed explanation, and full production of a model establishing this as nothing more than an interim piece of work, Jones should be fired for this alone.
He should probably also be held to account for the fraud that this is. For somebody in Jones’s position, this is beyond incompetence if he can’t bring something forward to mitigate it!
This would also appear to establish Jones as a liar in suggesting that the “hide the decline” behavior only dealt with modifying a graphic.

t-bird
November 25, 2009 11:16 am

Very disappointing. slashdot.org has only posted the original Nov 20th story about the fact that files had been hacked (or leaked) from the Hadley CRU. No follow-up about the content or the code that is being found.
Global economic nightfall, perhaps aided and abetted by programmers, and not a peep out of slashdot?

Robinson
November 25, 2009 11:17 am

I’ve done some programming myself. I used to put in swear words in the code all the time.
In fact, I would even use the F word and variations thereof for variable and object names, LOL.

That isn’t considered very professional in my business, to be honest.

rbateman
November 25, 2009 11:22 am

Excuse me, Mr. Monbiot. Nobody has given me a copper-coated zinc penny.
I want my weather back. Get it?
Dishman (11:00:55) :
Are we scrooged?
Probably.
It depends on whether the GHCN data that Jones/Karl have online now is not the same mangled mess HARRY was tasked with, while supposing that the MasterDB will somehow miraculously appear in a pristine state.
Now, just close your eyes, click your heels 3 times, and say “there’s no place like home, there’s no place like home”.

Ron de Haan
November 25, 2009 11:23 am

If we don’t fix the science and clean out the “climate caves”, we will have more of this bogus science in the future:
http://heliogenic.blogspot.com/2009/11/can-it-get-any-more-hysterical.html

rbateman
November 25, 2009 11:26 am

http://www.solarcycle24.com/stereobehind.htm
Zzzzzz…….snore….zzzzz…..snuck, snore….zzzzzz

Eric
November 25, 2009 11:26 am

documents\cru-code\linux\cruts
This code is used to convert new data into the new CRU 2.0 data format (.cts files).
There is another version of this code in cru-code\alpha which is, per comment in the readme file, intended for running on the “Alphas”
Data can come in from text files or Excel spreadsheet files (actually, Excel spreadsheets written out to text files from Excel). These programs are designed to read multiple climate data file formats, including:
GHCNv2
CLIMAT (Phil Jones format)
MCDW
CLIMAT (original)
CLIMAT (AOPC-Offenbach)
Jian’s Chinese data from Excel (appears to be text output from Excel)
CRU time-series file format – with the comment “(but not quite right)”
Data files for running these code files are not available in this archive.
Software engineering comment – this collection of programs, made up of very large source code files, implements a crude database management system. Most of the source code is uncommented and undocumented. From a s/w engineering perspective, it would have seemed wise to use an existing DBMS that had been extensively tested and verified. Instead, the chosen approach results in extremely large amounts of custom code being written. There is no evidence of software quality assurance (SQA) procedures being applied, such as a test plan, test scenarios, unit testing, or test-driven development. It would most likely have been quicker and more reliable to use an existing, proven DBMS.
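As a small illustration of the alternative suggested above, here is a hedged SQLite sketch of off-the-shelf storage for station readings. The table layout, column names, and station ID are hypothetical, not CRU's; the point is that constraints and queries come for free instead of being reimplemented in custom Fortran:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # would be file-backed in real use
conn.execute("""CREATE TABLE readings (
    station_id TEXT    NOT NULL,
    year       INTEGER NOT NULL,
    month      INTEGER NOT NULL CHECK (month BETWEEN 1 AND 12),
    temp_c     REAL,
    PRIMARY KEY (station_id, year, month))""")
# The PRIMARY KEY alone rejects duplicate records -- one of the integrity
# checks the hand-rolled flat-file code has to implement itself.
conn.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)",
                 [("ST001", 1961, m, 10.0 + m) for m in range(1, 13)])
(n,) = conn.execute("SELECT COUNT(*) FROM readings").fetchone()
```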
The goal of the software is to eventually calculate the anomalies of the temperature series from the 1961-1990 mean.
Because station reporting data is often missing, the code works to find nearby stations and then substitute those values directly or through a weighting procedure. In effect, the code is estimating a value for missing data. Station data will be used as long as at least 75% of the reporting periods are present (or stated the other way, up to 25% of the data can be missing and missing data will be estimated).
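The 75%-present rule described above can be sketched as follows. This is an illustration of the described logic, not the CRU code itself; the function name and NaN convention for missing values are my assumptions:

```python
import numpy as np

def station_anomalies(years, temps, base=(1961, 1990), min_frac=0.75):
    """Anomalies against the base-period mean, keeping the station only if
    at least min_frac of the base-period values are present. Missing
    values are represented as NaN."""
    years = np.asarray(years)
    temps = np.asarray(temps, dtype=float)
    in_base = (years >= base[0]) & (years <= base[1])
    expected = base[1] - base[0] + 1
    present = np.isfinite(temps[in_base]).sum()
    if present < min_frac * expected:
        return None  # too much missing data; a value would be estimated instead
    return temps - np.nanmean(temps[in_base])
```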
The linux\_READ_ME.txt file contains an extensive description. Of interest, stations
within 8 km of each other are considered “duplicates” and the data between the stations is “merged”. I have a question about this which may not really matter – but there is no attempt to determine whether the nearby stations are correlated with one another. It is possible, for example, that one station is near a body of water (and less volatile) and another is on the roof of a fire station (see surfacestations.org). Or the stations could be at different elevations. In my town, the official weather reporting station moved 4 times over the past century – from downtown in a river valley, eventually up to a plateau next to a windy airport. These locations would today fall within the 8 km bounding area. My concern is that this could skew results in an unpredictable way. Then again, it could be that situations like the one I describe are rare and would have negligible impact on the calculations.
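A minimal sketch of the kind of proximity test behind such a duplicate rule, using the standard haversine formula (the function names and threshold parameter are mine; the 8 km figure is from the _READ_ME description above):

```python
import math

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in km between two points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2.0 * r * math.asin(math.sqrt(a))

def looks_like_duplicate(sta_a, sta_b, threshold_km=8.0):
    """True if two (lat, lon) stations fall within the merge threshold.
    Note: distance alone says nothing about correlation, elevation, or
    siting -- exactly the caveat raised above."""
    return km_between(*sta_a, *sta_b) <= threshold_km
```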

hitfan
November 25, 2009 11:28 am

Robinson: I may have exaggerated a bit. I only did it for one particular employer, and it was because I was working 14+ hours a day and being placed on 24/7 pager duty with no extra compensation (imagine getting called during your grandmother’s funeral or while in church on Christmas Eve – yes, it happened to me). I survived 20 rounds of layoffs at that company during the dotcom crash.
So in order to relieve a bit of frustration, I had a bit of fun with my source code.
And one of these programs was literally a SPAM DIALER, a robo-caller that would annoy people with telemarketing promotions. It was quite efficient: it could call about 20,000 people a day (with 2 T1s).
Yes, I confess that I programmed a spam dialer during the dotcom crash in order to pay the bills. And there were swear words in the source code! 🙂 (and I don’t feel the least bad about it).
The company is now dead and bankrupt and I danced a little jig when I found out about it a few years later!

Editor
November 25, 2009 11:28 am

ManBearPig
November 25, 2009 11:28 am

Hysteria (10:19:26) :

My opinion – for what its worth – I think we are too late.

It is, however, going to make interesting reading in the history books, on a number of different levels, perhaps even on a par with Piltdown Man(n).

The “Piltdown Man” is a famous paleontological hoax concerning …
The Piltdown hoax is perhaps the most famous paleontological hoax in history. It has been prominent for two reasons: … and the length of time (more than 40 years) that elapsed from its discovery to its full exposure as a forgery.


DennisA
November 25, 2009 11:32 am

In spite of all this, a BBC programme announcement for this evening, re Copenhagen, is: “Can President Obama save the Planet?” I despair.
