Climategate: hide the decline – codified

WUWT blogging ally Ecotretas writes in to say that he has made a compendium of programming code segments that show comments by the programmer that suggest places where data may be corrected, modified, adjusted, or busted. Some of the HARRY_READ_ME comments are quite revealing. For those who don’t understand computer programming, don’t fret: the comments by the programmer tell the story quite well even if the code itself makes no sense to you.

http://codyssey.files.wordpress.com/2009/02/software_bug.jpg?w=720

To say that the CRU code might be “buggy” would be…well I’ll just let CRU’s programmer tell you in his own words.

  • FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
    FOIA\documents\osborn-tree6\mann\oldprog\maps15.pro
    FOIA\documents\osborn-tree6\mann\oldprog\maps24.pro
    ; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
    ; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
    ; plot past 1960 because these will be artificially adjusted to look closer to
    ; the real temperatures.
  • FOIA\documents\harris-tree\recon_esper.pro
    ; Computes regressions on full, high and low pass Esper et al. (2002) series,
    ; anomalies against full NH temperatures and other series.
    ; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
    ;
    ; Specify period over which to compute the regressions (stop in 1960 to avoid
    ; the decline
  • FOIA\documents\harris-tree\calibrate_nhrecon.pro
    ;
    ; Specify period over which to compute the regressions (stop in 1960 to avoid
    ; the decline that affects tree-ring density records)
    ;
  • FOIA\documents\harris-tree\recon1.pro
    FOIA\documents\harris-tree\recon2.proFOIA\documents\harris-tree\recon_jones.pro
    ;
    ; Specify period over which to compute the regressions (stop in 1940 to avoid
    ; the decline
    ;
  • FOIA\documents\HARRY_READ_ME.txt
    17. Inserted debug statements into anomdtb.f90, discovered that
    a sum-of-squared variable is becoming very, very negative! Key
    output from the debug statements:
    (..)
    forrtl: error (75): floating point exception
    IOT trap (core dumped)
    ..so the data value is unbfeasibly large, but why does the
    sum-of-squares parameter OpTotSq go negative?!!
  • FOIA\documents\HARRY_READ_ME.txt
    22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software
    suites - let's have a go at producing CRU TS 3.0! since failing to do that will be the
    definitive failure of the entire project..
  • FOIA\documents\HARRY_READ_ME.txt
    getting seriously fed up with the state of the Australian data. so many new stations have been
    introduced, so many false references.. so many changes that aren't documented.
    Every time a
    cloud forms I'm presented with a bewildering selection of similar-sounding sites, some with
    references, some with WMO codes, and some with both. And if I look up the station metadata with
    one of the local references, chances are the WMO code will be wrong (another station will have
    it) and the lat/lon will be wrong too.
  • FOIA\documents\HARRY_READ_ME.txt
    I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as
    Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO
    and one with, usually overlapping and with the same station name and very similar coordinates. I
    know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh!
    There truly is no end in sight.
  • FOIA\documents\HARRY_READ_ME.txt
    28. With huge reluctance, I have dived into 'anomdtb' - and already I have
    that familiar Twilight Zone sensation.
  • FOIA\documents\HARRY_READ_ME.txt
    Wrote 'makedtr.for' to tackle the thorny problem of the tmin and tmax databases not
    being kept in step. Sounds familiar, if worrying. am I the first person to attempt
    to get the CRU databases in working order?!!
  • FOIA\documents\HARRY_READ_ME.txt
    Well, dtr2cld is not the world's most complicated program. Wheras cloudreg is, and I
    immediately found a mistake!
    Scanning forward to 1951 was done with a loop that, for
    completely unfathomable reasons, didn't include months! So we read 50 grids instead
    of 600!!!
    That may have had something to do with it. I also noticed, as I was correcting
    THAT, that I reopened the DTR and CLD data files when I should have been opening the
    bloody station files!!
  • FOIA\documents\HARRY_READ_ME.txt
    Back to the gridding. I am seriously worried that our flagship gridded data product is produced by
    Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station
    counts totally meaningless
    . It also means that we cannot say exactly how the gridded data is arrived
    at from a statistical perspective - since we're using an off-the-shelf product that isn't documented
    sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?
    Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding
    procedure? Of course, it's too late for me to fix it too. Meh.
  • FOIA\documents\HARRY_READ_ME.txt
    Here, the expected 1990-2003 period is MISSING - so the correlations aren't so hot! Yet
    the WMO codes and station names /locations are identical (or close). What the hell is
    supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)
  • FOIA\documents\HARRY_READ_ME.txt
    Well, it's been a real day of revelations, never mind the week. This morning I
    discovered that proper angular weighted interpolation was coded into the IDL
    routine, but that its use was discouraged because it was slow! Aaarrrgghh.
    There is even an option to tri-grid at 0.1 degree resolution and then 'rebin'
    to 720x360 - also deprecated! And now, just before midnight (so it counts!),
    having gone back to the tmin/tmax work, I've found that most if not all of the
    Australian bulletin stations have been unceremoniously dumped into the files
    without the briefest check for existing stations.
  • FOIA\documents\HARRY_READ_ME.txt
    As we can see, even I'm cocking it up! Though recoverably. DTR, TMN and TMX need to be written as (i7.7).
  • FOIA\documents\HARRY_READ_ME.txt
    OH FUCK THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm
    hitting yet another problem that's based on the hopeless state of our databases. There is no uniform
    data integrity, it's just a catalogue of issues that continues to grow as they're found.
  • FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro
    printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'
    printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'
    printf,1,'Reconstruction is based on tree-ring density records.'
    printf,1
    printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'
    printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'
    printf,1,'will be much closer to observed temperatures then they should be,'
    printf,1,'which will incorrectly imply the reconstruction is more skilful'
    printf,1,'than it actually is. See Osborn et al. (2004).'
  • FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
    printf,1,'IMPORTANT NOTE:'
    printf,1,'The data after 1960 should not be used. The tree-ring density'
    printf,1,'records tend to show a decline after 1960 relative to the summer'
    printf,1,'temperature in many high-latitude locations. In this data set'
    printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
    printf,1,'this means that data after 1960 no longer represent tree-ring'
    printf,1,'density variations, but have been modified to look more like the'
    printf,1,'observed temperatures.'
  • FOIA\documents\osborn-tree6\combined_wavelet_col.pro
    ;
    ; Remove missing data from start & end (end in 1960 due to decline)
    ;
    kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
    sst=prednh(kl)
  • FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro
    ; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
    ; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
    ; but not as predictands. This PCR-infilling must be done for a number of
    ; periods, with different EOFs for each period (due to different spatial
    ; coverage). *BUT* don't do special PCR for the modern period (post-1976),
    ; since they won't be used due to the decline/correction problem.
    ; Certain boxes that appear to reconstruct well are "manually" removed because
    ; they are isolated and away from any trees.
  • FOIA\documents\osborn-tree6\briffa_sep98_d.pro
    ;mknormal,yyy,timey,refperiod=[1881,1940]
    ;
    ; Apply a VERY ARTIFICAL correction for decline!!
    ;
    yrloc=[1400,findgen(19)*5.+1904]
    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
    2.6,2.6,2.6]*0.75 ; fudge factor
    (...)
    ;
    ; APPLY ARTIFICIAL CORRECTION
    ;
    yearlyadj=interpol(valadj,yrloc,x)
    densall=densall+yearlyadj
  • FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro
    ;
    ; Plots density 'decline' as a time series of the difference between
    ; temperature and density averaged over the region north of 50N,
    ; and an associated pattern in the difference field.
    ; The difference data set is computed using only boxes and years with
    ; both temperature and density in them - i.e., the grid changes in time.
    ; The pattern is computed by correlating and regressing the *filtered*
    ; time series against the unfiltered (or filtered) difference data set.
    ;
    ;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
    ; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
  • FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
    ;
    ; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
    ; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
    ; plot past 1960 because these will be artificially adjusted to look closer to
    ; the real temperatures.
    ;
  • FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro
    ; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
    ; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
    ; gives a zero mean over 1881-1960) after extending the calibration to boxes
    ; without temperature data (pl_calibmxd1.pro). We have identified and
    ; artificially removed (i.e. corrected) the decline in this calibrated
    ; data set. We now recalibrate this corrected calibrated dataset against
    ; the unfiltered 1911-1990 temperature data, and apply the same calibration
    ; to the corrected and uncorrected calibrated MXD data.
  • FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro
    ; No need to verify the correct and uncorrected versions, since these
    ; should be identical prior to 1920 or 1930 or whenever the decline
    ; was corrected onwards from.
  • FOIA\documents\osborn-tree5\densplus188119602netcdf.pro
    ; we know the file starts at yr 440, but we want nothing till 1400, so we
    ; can skill lines (1400-440)/10 + 1 header line
    ; we now want all lines (10 yr per line) from 1400 to 1980, which is
    ; (1980-1400)/10 + 1 lines
    (...)
    ; we know the file starts at yr 1070, but we want nothing till 1400, so we
    ; can skill lines (1400-1070)/10 + 1 header line
    ; we now want all lines (10 yr per line) from 1400 to 1991, which is
    ; (1990-1400)/10 + 1 lines (since 1991 is on line beginning 1990)
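
A few of the excerpts above ("stop in 1960 to avoid the decline") describe restricting a calibration regression to the pre-decline years. Here is a minimal sketch of that idea in Python, with entirely synthetic data and hypothetical variable names; nothing below is CRU code or data:

```python
import numpy as np

# Synthetic stand-ins: a temperature series, and a tree-ring density series
# that tracks it until a post-1960 "decline" is imposed.
rng = np.random.default_rng(0)
years = np.arange(1881, 1995)
temps = rng.normal(loc=0.0, scale=1.0, size=years.size)
density = temps + rng.normal(scale=0.3, size=years.size)
density[years > 1960] -= 0.05 * (years[years > 1960] - 1960)  # "the decline"

# "Specify period over which to compute the regressions (stop in 1960)":
mask = years <= 1960
slope, intercept = np.polyfit(density[mask], temps[mask], 1)
```

The whole content of the comment is the mask: the regression is never shown the diverging post-1960 tail.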
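
The HARRY_READ_ME item 17 puzzle ("why does the sum-of-squares parameter OpTotSq go negative?!!") has a classic suspect: an "unbfeasibly large" value squared inside a fixed-width signed accumulator wraps around to a negative number. This miniature only illustrates that mechanism; the actual cause inside anomdtb.f90 is not shown in the excerpt:

```python
import numpy as np

# Hypothetical illustration, not the anomdtb.f90 code: squaring an
# "unbfeasibly large" value in 32-bit signed arithmetic wraps negative.
vals = np.array([300, 50000], dtype=np.int32)
squares = vals * vals                    # elementwise, stays int32
total = squares.sum(dtype=np.int32)      # force a 32-bit accumulator
print(squares, total)                    # 50000**2 has wrapped negative
```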
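
Harry's cloudreg find ("50 grids instead of 600") is the archetypal year-loop-without-a-month-loop bug: monthly grids need 50 years times 12 months of reads. A hypothetical miniature, not the actual Fortran/IDL:

```python
def count_grids_read(n_years, months_per_year=None):
    """Stub counter standing in for the real grid-file reads."""
    reads = 0
    for _year in range(n_years):
        if months_per_year is None:
            reads += 1                      # buggy: one grid per YEAR
        else:
            for _month in range(months_per_year):
                reads += 1                  # fixed: one grid per MONTH
    return reads

print(count_grids_read(50), count_grids_read(50, 12))  # 50 600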
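
On the gridding worry: "Delaunay triangulation - apparently linear" means each grid point's value is a linear (barycentric) blend of only the three stations forming the enclosing triangle, however many stations exist elsewhere, which is why Harry says the station counts become meaningless. A single-triangle sketch with made-up numbers:

```python
import numpy as np

# Synthetic example: three "stations" and their temperatures.
tri = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # station positions
vals = np.array([14.0, 16.0, 12.0])                     # station temps

def interp_in_triangle(p, tri, vals):
    """Linear interpolation at point p: solve for barycentric weights w
    with w @ tri == p and w.sum() == 1, then blend the station values."""
    A = np.vstack([tri.T, np.ones(3)])
    w = np.linalg.solve(A, np.append(p, 1.0))
    return float(w @ vals)

print(interp_in_triangle(np.array([2.0, 2.0]), tri, vals))
```

An off-the-shelf routine (e.g. SciPy's LinearNDInterpolator) does exactly this over every triangle of a Delaunay triangulation of the stations.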
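
The briffa_sep98_d.pro excerpt is concrete enough to re-run. Re-expressed in NumPy (np.interp standing in for IDL's interpol, and x assumed to be the yearly time axis), the "VERY ARTIFICAL correction" interpolates the hard-coded valadj offsets to yearly resolution and adds them straight onto the density series, reaching 2.6 * 0.75 = 1.95 units by the late 1970s:

```python
import numpy as np

# The adjustment arrays exactly as in the excerpt:
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

x = np.arange(1400, 1995)                # assumed yearly time axis
yearlyadj = np.interp(x, yrloc, valadj)  # interpol(valadj, yrloc, x)
# densall = densall + yearlyadj          # added straight onto the series
print(yearlyadj[x >= 1960].min())        # post-1960 offsets are all positive
```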
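
The densplus188119602netcdf.pro comments are just line-count arithmetic for a 10-years-per-line text file with one header line. As a small Python helper (hypothetical names; the original is IDL):

```python
def lines_to_skip(file_start_year, want_from_year,
                  years_per_line=10, header_lines=1):
    # "(1400-440)/10 + 1 header line"
    return (want_from_year - file_start_year) // years_per_line + header_lines

def lines_to_read(first_year, last_year, years_per_line=10):
    # "(1980-1400)/10 + 1 lines"
    return (last_year - first_year) // years_per_line + 1

print(lines_to_skip(440, 1400), lines_to_read(1400, 1980))   # 97 59
print(lines_to_skip(1070, 1400), lines_to_read(1400, 1990))  # 34 60
```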

446 thoughts on “Climategate: hide the decline – codified”

  1. It would be fun to have these guys under oath in a deposition with a lawyer who really understands code and an expert to consult. Bottom line — the code will be shown to be crap and the results were predetermined.

  2. Expose the code and bust the Anti-Trust Climate Team

    I suggest that we offer Phil Jones the opportunity to run this code and produce his temperature “product” while the process is being videotaped and broadcast live publicly. He’ll be allowed a time period of perhaps 24 hours to replicate his “robust” temperature results without “fudging” them.

    I would suggest the results would be “busted” rather than “robust”.
    Shiny
    Edward

  3. It just gets better and better (worse for them, that is). The question remains: is this enough to overcome the momentum acquired by the AGW side since 2005? Senator Inhofe’s commitment to the skeptic’s cause and his ranking minority seat in the Environment and Public Works Committee looms large. But why won’t the mainstream media report on the matter?

  4. Perhaps a simple question. Science is all about repeatability, and it is quite clear from these programmers’ comments that the CRU was quite obviously “cooking the books”. So, it stands to reason: if we want a reasonably true temperature record for the globe, from now back to whenever, what would it take to do that? For the current temperature records, pains would have to be taken either to use only station data that are free of UHI effect, or (somehow) to figure out what the heat signature of each station is in relation to its rural surroundings, and use that as an offset of some kind. For temperature records of the past, why are trees used, when they so obviously carry a mixed signal (more things affect tree growth than simply temperature), instead of human writings? Humans have been recording and reporting on temperature and climate for thousands of years.

    What would it take to start over, and make a completely independent temperature record?

  5. Somebody should Slashdot this, it is exactly the sort of thing the programming geeks there will understand and appreciate. They may also be able to provide valuable insight.

  6. Simplified code for CRU to use (from an average programmer with no climatology expertise):

    For intYear = 1400 to 2009;
    floatGlobalMeanTemperature = floatGlobalMeanTemperature + WHATEVER_THE_HELL_YOU_WANT_IT_TO_BE;
    intYear++
    next

    Print “Holy Crap! We’re all going to die!”

  7. “a sum-of-squared variable is becoming very, very negative!”

    This is an interesting “trick.” Has this issue been figured out yet? Buggy code is hard to unravel even when you are the one that wrote it, but if you don’t have it all and it is someone else’s code (maybe several others) the reason for this may not be discoverable. Did Harry ever figure it out?

    “Delaunay triangulation” This seems to mean the code located points outside the region they should have been in, and, it was someone else’s poorly documented code (“off-the-shelf”) so Harry couldn’t figure out what it did or how.

  8. hmmm (09:14:27) :

    It’s almost as if whoever wrote these notes wanted us to find them.

    Positively Freudian. I know enough about programming, though I’m no expert;
    just from my own limited experience, the commands and data input are,
    to quote, “crap, crap”.

  9. Apart from Harry, who was doing his best, these guys should be in jail. They knew exactly what the code was (or wasn’t) doing.

    Harry spilled the beans.

  10. Can there be any doubt that this trail of crumbs was laid deliberately? I think the identity of the whistleblower may be coming into focus.

    However, it is curious that the coder makes no reference to bringing the problems to the attention of Jones et al.

  11. The pot of gold in Climategate. While we knew via Hansen’s scripts and Fortran77 that something beyond incompetence, probably fraud, was afoot, Mann never released anything, but McIntyre’s reconstructions are a strong indication of fraud.

    This project file of commentary is the sort of file, only much larger, that I’ve often used when maintaining old code I’d not developed originally. Usually it’s to establish hierarchies and patterns of attack. This file includes comments better placed in the files themselves, but it gives every indication, to me, of authenticity.

    The weight of the evidence following Climategate is inescapable.

  12. Totally amazing. I’m no programming expert; but I do routinely construct optical system models for ray trace simulations, and even they require detailed comments on every element that is in the system, or else even months later I cannot keep track of why the hell I included some element. Now this is orders of magnitude more straightforward than actually writing the code; but I certainly understand the concept that the comments had better explain in plain English (and evidently also some Australian terminology) so anyone can tell what it is supposed to do.

    Well, I have always believed that the failure to observe Nyquist in sampling strategies was at the heart of this climate data inadequacy, and reading about all their ad hoc inclusions and exclusions of data sites clearly shows what total BS they are cooking up.

    And if I were paying the bills for that stuff, I would really like my money back, and also to see some jail (gaol) time included in the correction process for the varmints purveying this rubbish as science data gathering.

  13. In-bloody-credible! I’ve had to rework crappy code on occasion and discovered that you “can’t get there from here”. I sympathize with Harry. I can’t sympathize with the people who have foisted this garbage on us as “science”.

  14. And yet the lap-dog press will continue to embargo the controversy, Obama will “get one more step closer to climate change legislation”, and the scientists (sic) will offer lame but “scientific” excuses.

    Maybe the way to nail the bastards will be the old-fashioned way: mail fraud and IRS transgressions. I would imagine there have been snail-mail crimes of some sort, and I’m positive such large amounts of money led to some malfeasance on Mann’s and others’ part. This type of personality would never be clean in their personal lives while perpetrating such monumental fraud in academia.

  15. I spent a decade or so as a professional programmer. Let’s just say that the excerpts revealed above would not only have gotten me fired, they would have gotten me blackballed and unable to work in the industry forever. And rightfully so.

  16. Was the data that the code uses also released?

    With the data, one could remove the fudge factors from the code and see what really comes out.

    The tone of the comments does raise the question as to whether their author helped release the information.

  17. Having worked as a research assistant at Columbia University, the only thing that surprises me is the childish language and the tone of all the emails. Scholarship is corrupt to the core, and that’s why I decided never to get a PhD.

    The number one corruption is that no thesis is approved for research unless the research passes a political litmus test. Translation: Unless you already have a conclusion in mind that does not seriously detract from that of your colleagues, you have no hope of being published.

    Now, I am published (only one paper), but it is in the field of philosophical history, it did not require any grant money, and it passed the political litmus test among Orientalists detracting from those whose background is Occidental. Translation: the paper is perceived as “progressive”, so it is good to go.

    Of course it’s a bad idea to base policy upon modern scholarship in highly controversial areas. Unless proposed research is designed to meet, substantiate, or reflect progressive ideals, you haven’t got a chance.

    Mainstream scholarship in recent history has got some very obvious things dead wrong because of progressive politics. Just as we heard all the lies, from imminent ozone disaster to racial integration increasing property values (believe what you will about the social merit of integration; the economics is quite obvious), they are wrong again about yet another one.

    However, their arrogance is astonishing in the email and programming notes. It should be no surprise they fudge evidence and conclusions–that’s 95% of scholarship, where position is more important than the truth. The surprise is that they all seem to be knowingly complicit in deception and not care. Everyone whom I worked with believed their crap!

  18. Context:

    “ Around 1996, I became aware of how corrupt and ideologically driven current climate research can be. A major researcher working in the area of climate change confided in me that the factual record needed to be altered so that people would become alarmed over global warming. He said, “We have to get rid of the Medieval Warm Period.”
    Dr. David Deming (University of Oklahoma)

19. Oh, good lord!… and this mob was given a grant of 13 million pounds of British taxpayers’ money to do climate science?

…I think the only thing “done” here was the British taxpayer. They were well and truly “done” over. :-(

  20. Funny in that of all the text emails in the Zip file the night this got released, this is the file I read first. And kept reading. The comments looked to be pretty damning, but I wasn’t sure which “official” curve this program contributed to. If this was just a program for some obscure and not relied upon graph, it meant little. If this was something actively promoted by HadCRU then it’s a big deal.

  21. “Somebody should Slashdot this, it is exactly the sort of thing the programming geeks there will understand and appreciate. They may also be able to provide valuable insight.”

    I’ve been watching Slashdot for this story to come up, obviously this is just the sort of thing that SHOULD show up there.

It won’t. Slashdot is filled with pro-AGW types, and controlled by them. I wouldn’t be surprised if it never shows up there at all. Even the “science” section is glaringly silent on the issue.

    Yes they are the type that would understand this, but for many it would mean exposing their preconceptions to themselves, and that won’t happen.

    There is a reason why people will defend the crappy deal they got from a used car salesman.

22. Another paragraph from the Harry_Read_Me that inspires confidence:

    “You can’t imagine what this has cost me – to actually allow the operator to assign false
    WMO codes!! But what else is there in such situations? Especially when dealing with a ‘Master’
    database of dubious provenance (which, er, they all are and always will be).

    False codes will be obtained by multiplying the legitimate code (5 digits) by 100, then adding
    1 at a time until a number is found with no matches in the database. THIS IS NOT PERFECT but as
    there is no central repository for WMO codes – especially made-up ones – we’ll have to chance
    duplicating one that’s present in one of the other databases. In any case, anyone comparing WMO
    codes between databases – something I’ve studiously avoided doing except for tmin/tmax where I
    had to – will be treating the false codes with suspicion anyway. Hopefully.”

    This is like the train wreck that you don’t want to watch, but you can’t look away from.
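For the non-programmers: the false-code heuristic Harry describes above (multiply the legitimate 5-digit code by 100, then add 1 until the number matches nothing) is simple enough to sketch in a few lines of Python. The station codes here are invented for illustration; this is one reading of his description, not CRU’s actual code.

```python
def make_false_wmo_code(real_code, existing_codes):
    """Sketch of the HARRY_READ_ME heuristic: multiply the legitimate
    5-digit WMO code by 100, then add 1 at a time until a number is
    found with no match in the database."""
    candidate = real_code * 100
    while candidate in existing_codes:
        candidate += 1
    return candidate

# Hypothetical database: two false codes already minted for station 12345.
existing = {1234500, 1234501}
print(make_false_wmo_code(12345, existing))  # prints 1234502
```

As Harry notes, nothing stops that result from colliding with a code already used in some *other* database; there is no central registry to check against.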

  23. What would it take to start over, and make a completely independent temperature record?

    Judging by the state of the databases with a seemingly unmanaged history and no “method” in use to ensure integrity, it would be literally impossible to make sense of it all in a second pass. I now understand why they resist the release of code and data so strongly. It isn’t that they can’t, it’s that they’re just plain embarrassed by the hideous mess that would be exposed. I said “would”, I mean “has been”.

24. I know why Warmists deny that global warming is not man-made. They are living in a Hollywood dream world and they can’t wake up.

25. My question is how the HARRY_READ_ME file fits into the greater scope of CRU’s data analysis. It’s not clear to me whether this was some side project for the programmer to analyze a problematic data subset or if this represents his frustrations with the main CRU data set. There is certainly a story here; I just don’t know if there are more chapters in this novel.

  26. If this isn’t enough to get MainStreamMedia to actually do their work – then I’ve lost any hope for humanity!

27. You know, I have asked at realclimate.org how they did software quality control. The post was censored, of course.

My guess is that they would not be able to verify their data/models/implementation, so the operational verification would be something along the lines of:

“This output looks funny; this can’t be right. Let’s look at the software and fix it.”

Even without malice, in complicated software this will lead to software that confirms what the scientist expects, because then it won’t look ‘funny’.

    I wrote something along those lines at rc, and apparently it hit very close to the mark, because it was censored.

28. “22. Right, time to stop pussyfooting around the niceties of Tim’s labyrinthine software
    suites – let’s have a go at producing CRU TS 3.0! since failing to do that will be the
    definitive failure of the entire project..”

    That will be Tim Osborne at a guess and Harry seems to think this manipulation of the data is vital for the ‘project’. The smoking gun is in the data, which is why they won’t give it up without a fight.

  29. Limbaugh tearing into Obama on the AGW Hoax !!!!

    RUSH: Now we go over to the Universe of Lies. This afternoon, President Obama press conference with the Indian prime minister. Here’s a portion of Obama’s opening remarks.

    OBAMA: We’ve made progress in confronting climate change. I commended the prime minister for India’s leadership in areas like green buildings and energy efficiency, and we agreed to a series of important new efforts: A clean energy initiative that will create jobs and improve people’s access to cleaner, more affordable energy; a green partnership to reduce poverty through sustainable and equitable development and an historic effort to phase out subsidies for fossil fuels. With just two weeks until the beginning of Copenhagen, it’s also essential that all countries do what is necessary to reach a strong operational agreement that will confront the threat of climate change while serving as a stepping-stone to a legally binding treaty.

    RUSH: The president of the United States has just said in an internationally televised press conference that he is going to continue seeking a resolution to a problem that doesn’t exist. There is no man-made global warming. But it doesn’t matter because he exists in the Universe of Lies. And man-made global warming is only a means to an end to him. It is simply a way to once again chip away at the size of this country and the wealth of this country and to make sure that he is able to enact legislation that will allow him to raise everyone’s taxes so that he can begin even more redistribution of wealth. It’s just a mechanism like all liberal ideas are. They are dressed up in flowery, compassionate language to hide the deceit — the insidiousness — of their real intentions. This is mind-boggling, mind-boggling. It’s a hoax! It has been totally made up! It’s been known since Thursday. This is Tuesday.

    The president of the United States, in an internationally televised press conference, says, “We gotta move further and we gotta get closer to Copenhagen with a working agreement. We gotta confront climate change.” There isn’t any! The whole concept of “climate change” is a fraud because climate “changes” constantly, and we’re not responsible for it. We don’t have that kind of power. We can’t do it! If somebody says, “Make it warmer tomorrow,” you can’t. If somebody says, “Make it colder tomorrow,” you can’t. If somebody says, “Make it rain tomorrow,” you can’t — and yet to listen to these people, we’re doing all of that and we’re doing it so fast that we are going to destroy our climate. It’s a hoax! It’s a fraud! There is no climate change. There is no global warming. There never has been any man-made global warming. How else can it be said?……..”.

    http://www.rushlimbaugh.com/home/daily/site_112409/content/01125109.guest.html

  30. If a physicist were to submit a paper without showing the math, that paper would (I assume) be rightly ridiculed and sent back with a “show your work” rebuke. It doesn’t seem right that one can hide one’s work in software, and then casually dismiss the absence of documented code upon submitting a paper as these yahoos have done. And yet, that seems exactly the way mainstream climatology works. Do any other sciences permit one to hide calculations in a program and then not publish said program with the paper?

  31. This is just astonishing, and it’s evolving into a serious scandal.

    I don’t think most of the public even realize that the computer code upon which these “models” are based isn’t revealed to other scientists for checks/verification/improvement. What’s UEA CRU’s basis for the validity of its predictions? “Trust us, we’re a really nice bunch of English climatologists”???

32. Oops, forgot ‘)’ was a valid URL-char. http://www.jargon.net/jargonfile/c/CLM.html

    Trying to look at some of this charitably: It does seem to me that the “fudging” of the decline was done to avoid the calibration of tree proxy to real temperature being thrown off by the known (but unexplained) divergence post 1960, rather than as a fudge to the dataset itself (although the results of that calibration set the offset of the result, of course). If there is a known problem with part of one dataset, it’s reasonable enough to ignore it when you’re trying to calibrate the good parts with something else. The real question (which I don’t know the answer to) is how good is the correlation in the bits that you do believe are comparable (i.e. 1850-1960)?

    But even I, stretching my charitable nature to its limit, am pretty worried by the fact that he doesn’t believe in the core gridding algorithm, and the poor quality of the data seems to be all-pervasive, not just noise from a few bad stations. You can’t blame “Harry” for this – quite the opposite – but it does seem quite incredible that something of such huge importance should have been given so little resource and basic software project management for so long.

  33. It reminds me of Jones public statement:

    “So apart from the removal of a few troublesome editors, blocking some FOI requests, deleting emails, blocking a few contrarians from publication, peer reviewing each others papers, cherry picking trees and modifying code to hide the decline: please tell me – exactly what is wrong with our science?”

  34. Michael Alexis wrote:

    Simplified code for CRU to use (from an average programmer with no climatology expertise):

    For intYear = 1400 to 2009;
floatGlobalMeanTemperature = floatGlobalMeanTemperature + WHATEVER_THE_HELL_YOU_WANT_IT_TO_BE;
    intYear++
    next

    Print “Holy Crap! We’re all going to die!”

    Fell off my chair laughing!!!!!! No more, PLEASE, it HURTS!!!!!!!!!!

35. Do we know what CRU “software” we are talking about here? The first comments seem to be for ten-year-old proxy reconstructions, and some of the later ones for the temperature “product(s)”.

    Is there any relationship to, or does this provide any insight into GISS software? It seems that Phil Jones made some winking statement about the CRU temperature anomalies being remarkably similar to GISS, “as they should be” or something similar.

The programming is clearly bad, and any output from the software described here would have to be questionable. However, it is very difficult to assess risk and estimate error without knowing which software has problems and what it is currently used for (or which AGW dogmas it supports).

36. Looking at the various values for valadj and the way they are applied, it appears that they always lower the values (whatever they are) for the period 1930 to 1955 and increase them for the period 1955 to 1999. I wonder why.
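Whatever the actual valadj numbers are, the effect of applying a fixed per-period adjustment of that shape is easy to demonstrate. The values below are invented purely for illustration (they are NOT the real array); the point is that a perfectly flat input series comes out carrying the shape of the adjustment:

```python
# Invented adjustment values with the shape described above:
# negative through roughly 1930-1955, increasingly positive after 1955.
years = list(range(1930, 2000, 5))                 # 14 five-year steps
adjustment = [-0.1, -0.2, -0.3, -0.25, -0.15, 0.0, 0.1,
              0.25, 0.4, 0.6, 0.9, 1.3, 1.7, 2.0]
flat_series = [0.0] * len(years)                   # a flat "temperature" input
adjusted = [t + a for t, a in zip(flat_series, adjustment)]

# A flat input emerges shaped exactly like the adjustment array itself.
print(adjusted == adjustment)  # prints True
```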

  37. Two interesting coder notes in HARRY_READ_ME.txt are:

    “This still meant an awful lot of encounters with naughty Master stations, when really I suspect nobody else gives a hoot about. So with a somewhat cynical shrug, I added the nuclear option – to match every WMO possible, and turn the rest into new stations (er, CLIMAT excepted). In other words, what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad, but I really don’t think people care enough to fix ‘em, and it’s the main reason the project is nearly a year late.”

    “You can’t imagine what this has cost me – to actually allow the operator to assign false WMO codes!! But what else is there in such situations? Especially when dealing with a ‘Master’ database of dubious provenance (which, er, they all are and always will be).”

38. This is not some random crack from the outside. This is almost certainly an inside leak. 61 MB is nothing. The probability that any 61 MB of data pulled off a file or email server would contain this much salient and inculpatory information is virtually nil.

    This data was selected by someone who knew what they were doing, what to look for and where to find it.

  39. Chris (09:36:01) :

My question is how the HARRY_READ_ME file fits into the greater scope of CRU’s data analysis. It’s not clear to me whether this was some side project for the programmer to analyze a problematic data subset or if this represents his frustrations with the main CRU data set. There is certainly a story here; I just don’t know if there are more chapters in this novel.
    —————————————————–

    Well that’s a simple problem to solve then….. Just send CRU a FOI request for the data and meta data and check it out;-)

  40. O.T. but:

    Did you know that the BBC held a secret meeting that decided NOT to give balanced time to GW and anti-GW viewpoints. Apparently, the communique said:

    “The BBC has held a high-level seminar with some of the best scientific experts, and has come to the view that the weight of evidence no longer justifies equal space being given to the opponents of the consensus.”

    http://burningourmoney.blogspot.com/2007/06/bbc-bias.html

    .

  41. James Hastings-Trew (09:16:45) :

If you have looked at the massive differences between the timespans quoted in HARRY_READ_ME.txt for stations like Elko, NV / Charleston, SC / Red Bluff, CA (and dozens of other sites) and what exists today, it’s going to take years to recover, provided that the original submitted paper forms have NOT been destroyed.
It’s going to have to be done. Too much damage has occurred, and the digital databases are not to be trusted.
    Do you (or anyone else for that matter) personally know where the original paper forms submitted are kept?
    And where to obtain a legitimate copy of Tom Karl’s 1990 USHCN database?

  42. [snip]

    “These huckstering snake-oil salesmen and “global warming” profiteers — for that is what they are — have written to each other encouraging the destruction of data that had been lawfully requested under the Freedom of Information Act in the UK by scientists who wanted to check whether their global temperature record had been properly compiled. And that procurement of data destruction, as they are about to find out to their cost, is a criminal offense. They are not merely bad scientists — they are crooks. And crooks who have perpetrated their crimes at the expense of British and U.S. taxpayers”. (Lord Monckton)

  43. Correct me if I’m wrong, but this isn’t actually `model’ code. It’s code for producing the temperature record. The model code must look even worse given the nature of its inherent complexity.

44. All I can say is, WOW. The way in which this programmer commented his work makes me think that he was somewhat frustrated with the fraud that he was being asked to commit. Really, we are now past the point of scientific debate. The skeptics need some good lawyers in their camp to help resolve some of these issues. Without taking these people to court, they are simply not going to come clean about what they have been doing.

  45. Gee, I wonder what the GISS data sets are like? Since they won’t release them, despite an FOI request, it makes you wonder.

  46. I used to sincerely believe in the global warming theory. But one thing that’s troubled me in the past few years were the shrill responses to skepticism regarding it.

    Now, I’m not a scientist by profession–but shouldn’t there be room for debate in discussing the following counter arguments:

    1. Correlation does not prove causation
    2. A true independent, critical peer review

From a brief, cursory reading of these emails, it sounds to me like the head of research wants to make the collected data fit the hypothesis.

    It’s human nature to be biased and partisan. I wonder if it could be possible to collect the raw data and have it examined by qualified people with no ideological axe to grind–and let the chips fall where they may?

    What the global warming proponents want to do is redesign our entire economic system and create layers and layers of bureaucracy in order to regulate almost every aspect of our lives.

    I believe in being good custodians of our planet and the environment of course. It is a good thing to eliminate pollutants that are proven to be harmful. As far as CO2 goes, nothing has been proven yet.

  47. I’ve done programming in a scientific research environment using large codes to do computational physics. All programs had to be well organized, debugged and verified by more than one researcher/grad student . They were also highly commented and even have descriptive documents on the side. Those procedures were absolutely necessary to keep everything in line and to eliminate mistakes, which are easy to make when the machinery is so complex.

If the Harry file is any indication, then I’d say the results from these programs can’t be considered professional-level science at all.

  48. Been looking into the code and data held in the \documents\cru-code\f77\mnew directory. Here is what I have found:

The data sets are master.dat.com and master.src.com; master.src.com is the important file. Don’t open these without changing the extension to .txt; otherwise Windows interprets them as executables, and you won’t be able to view them properly anyway. I could send copies capable of being opened in Windows. These contain monthly weather station data with one row per year. I don’t know the exact nature of these files, but some of the data does relate to sunlight duration. A site in Finland suggests master.src.com is temperature-related, but there’s a lot of speculation flying around the Internet regarding the leaked files at the moment, so I can’t be certain.

There are 3526 stations in all and 2578488 monthly observations. -9999 in a field means the observation for that month is absent. There are 269172 (10%) missing observations in master.dat.com and 14226 completely missing years. The programs are designed to completely ignore any years with no observations. In total there are 200649 rows (years) of observations, which should equate to 2407788 months; however, due to some years having up to 11 missing months, there are 2309316 monthly observations used. Now what’s interesting is how these missing months are processed. Programs such as split2.f, where a year contains one or more missing months, actually invent the figures using the following heuristic:

    If a month is missing try to infill using duplicate. if duplicates both have data, then takes a weighted average, with weights defined according to inverse of length of record (1/N)

    That’s from the comment at the start of split2.f

What this really means is that more than 4% of the data is being completely fabricated by at least some of the Fortran data-processing programs. If this were done in other disciplines it would be extremely questionable.
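Read literally, the split2.f header comment describes something like the following. This is a Python sketch of the stated rule with hypothetical values; the real routine is Fortran and works per station.

```python
MISSING = -9999

def infill_month(value, dup_a, dup_b, len_a, len_b):
    """Fill a missing monthly value from duplicate station records.
    If both duplicates have data, weight each by the inverse of its
    record length (1/N), per the split2.f header comment."""
    if value != MISSING:
        return value                      # nothing to do
    have_a = dup_a != MISSING
    have_b = dup_b != MISSING
    if have_a and have_b:
        wa, wb = 1.0 / len_a, 1.0 / len_b
        return (wa * dup_a + wb * dup_b) / (wa + wb)
    if have_a:
        return dup_a
    if have_b:
        return dup_b
    return MISSING                        # no duplicate to infill from

# A missing month, with duplicates reading 10.0 (50-year record)
# and 14.0 (25-year record):
print(infill_month(MISSING, 10.0, 14.0, 50, 25))
```

Note the inverse-length weighting: the shorter duplicate record (N=25, weight 1/25) pulls the infilled value toward itself harder than the longer one (N=50, weight 1/50), so the result lands above the simple mean of 12.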

I also noticed quite a few programs, especially in the documents\cru-code\idl\pro directory, that are designed to process data deemed anomalous, though this isn’t necessarily suspicious.

    This is the header comment from documents\cru-code\idl\pro\quick_interp_tdm2.pro

    ; runs Idl trigrid interpolation of anomaly files plus synthetic anomaly grids
    ; first reads in anomaly file data
    ; the adds dummy gridpoints that are
    ; further than distance (dist)
    ; from any of the observed data
    ; TDM: the dummy grid points default to zero, but if the synth_prefix files are present in call,
    ; the synthetic data from these grids are read in and used instead

What is ‘synthetic data’ and why might it be applied to dummy gridpoints away from genuine observation points? This could be a recognised statistical procedure, or data massaging, or creating more observations out of thin air to skew certainty levels; I just can’t tell, and I don’t have time to look at anything else in depth right now. Like it says, the e-mails may be open to interpretation, but it’s the code and what it does to the raw data which really matters. The comment in the Mann code described in the link below is a work-around to a recognised issue with dendrochronology data: during the 1960s the correlation coefficient between tree growth rate and temperature altered.

    The recent ERBE results are really significant, the discrepancy between IPCC modelled values and the real world figures is quite something.

    http://wattsupwiththat.com/2009/11/22/cru-emails-may-be-open-to-interpretation-but-commented-code-by-the-programmer-tells-the-real-story/

  49. Sorry, but Slashdot will not be of much help. They have become so liberal over at Slashdot that they have drunk the global warming koolaid by the gallon.

    REPLY: They did carry the initial hacking story, I don’t see why they would not carry this.

  50. In electrical engineering, any unique code used is included in the methods portion.

    Because engineers expect to be able to investigate each others’ claims, instead of letting them be hidden in software black boxes.

    Of course, that is engineering, not “science”.

  51. “OH *UCK THIS. It’s Sunday evening, I’ve worked all weekend, and just when I thought it was done I’m
    hitting yet another problem that’s based on the hopeless state of our databases. There is no uniform
    data integrity, it’s just a catalogue of issues that continues to grow as they’re found.”

So this tells us what the quality of the data is.

We have already shown how Mann uses the guesses from one tree’s rings to overrule a set of global temperature means.
This is more like voodoo and palm reading than science.

    I am so not surprised at his use of “weights” to strengthen samples that fit the dogma.

  52. Back to the gridding. I am seriously worried that our flagship gridded data product is produced by Delaunay triangulation – apparently linear as well. As far as I can see, this renders the station…

    Well, this brings me back to a question I asked here a long time ago: how is average global temperature calculated? Somebody replied, “It’s called gridding.”

This happens to be something I know a bit about through my GIS work. Delaunay triangulation (TIN production) would be the last thing I would expect them to use for this sort of a model. Anyway, one must peek into the code to see how the sausage is made, no?

    Think about all those attractive colorful thematic maps of the globe, all gridded…
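For readers who haven’t met TIN gridding: after the stations are joined into Delaunay triangles, each grid point inside a triangle gets a value linearly interpolated from the three vertices via barycentric weights. A minimal sketch with made-up station coordinates:

```python
def barycentric_interp(p, tri, values):
    """Linearly interpolate the value at point p inside a triangle.
    tri: three (x, y) vertices; values: the value at each vertex.
    This per-triangle step is the core of linear TIN gridding."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    px, py = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / det
    w2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / det
    w3 = 1.0 - w1 - w2
    return w1 * values[0] + w2 * values[1] + w3 * values[2]

# Three hypothetical stations and a grid point at their centroid:
tri = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]
print(barycentric_interp((1.0, 1.0), tri, [0.0, 3.0, 6.0]))  # prints 3.0
```

The obvious weakness for a “flagship gridded data product” is that each grid value depends only on the three surrounding stations, piecewise-linearly, with no smoothing across triangle edges; presumably part of what worries Harry.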

53. It’s actually quite entertaining to read the programmer’s notes while playing M4GW’s “Hide the Decline” (hide the decline) in the background.

    ….. it stops you from crying.

  54. Will there be a Congressional Investigation?
    Comment by JamesD

    Wednesday, 25 Nov 09 @ 12:24 PM

    There will be no investigations with a Democrat congress. No committee chairman will allow it. I hope some of the heavyweight scientists will come to the defense of scientific integrity but I’m not holding my breath. For a long time this naked emperor has been parading down the street. The press and, particularly, the educated public all agree, he’s wearing a fine suit of clothes.

    http://carboneconomy.economist.com/

    http://carboneconomy.economist.com/content/programme

55. Gavin Schmidt at RealClimate has refused more than 6 times to post the following message; are you willing to present this very important point for me?

    JCS says:
    Your comment is awaiting moderation.
    25 November 2009 at 12:58 PM

    Gavin,

    I have repeatedly tried to post comments on your website asking you to respond to the following statement:

If the first rule of science is to question everything, and another fundamental rule is that no hypothesis can be proven true, then regardless of the precautionary principle, why are those rules being discarded, and why is any SCIENTIST (SKEPTIC) vilified or censored for practicing what can only be considered good science?

I am a climate scientist with a degree in applied science (wildlife biology) and a master’s in climate change and sustainability, and I am not convinced by anything I have read, seen, studied or experimented with that there is a definitive correlation between CO2/greenhouse gases and climate variability.

Also, I have read much of this information from this hack/screw-up/whatever, and I clearly see what I would consider malpractice and unethical collusion. Particularly in the cases where advice is given on ways to avoid taxes, data has been significantly fudged, code is manipulated and the peer review process stacked (more so than usual).

I believe in Climate Variability. I also believe that humans as a whole cause irreparable environmental harm to the planet; however, I am skeptical of the hypothesis that is Anthropogenic Climate Change, and I believe that more research and a more open and debatable approach need to be undertaken to achieve real results in understanding this. Why is this wrong, and why are so many other skeptics with the same opinion vilified and persecuted? Why have you censored more than 6 previous posts I attempted to put up on this topic?

    Can you not see how this topic risks the credibility of science as a whole!!!!!!

  56. Eric (09:40:29) :
In fairness, that is too general a statement. It is important to be precise and specific; otherwise folks at RealClimate who actually really know their stuff will simply rip you to shreds. Certain critical pieces of station data have been requested. Certain pieces of code have been requested. More generally, authors of climate science research papers have been asked to post their raw data and their code in a way that will allow a complete replication of their results by interested third parties. It is the institutional and individual refusal to do these simple things that has caused the questioning of the motives of climate scientists in general.

  57. I don’t have to be a programmer to understand this:

printf,1,’The tree-ring density’
printf,1,’records tend to show a decline after 1960 relative to the summer’
printf,1,’temperature in many high-latitude locations. In this data set’
printf,1,’this “decline” has been artificially removed in an ad-hoc way, and’
printf,1,’this means that data after 1960 no longer represent tree-ring’
printf,1,’density variations, but have been modified to look more like the’
printf,1,’observed temperatures.’

They have committed fraud, plain and simple.

Meanwhile, I note that the Washington Post and WSJ have picked up the story, reporting not “it’s taken out of context,” but rather “it’s starting to look like they manufactured the data.” There is yet hope that the MSM will take it up and run with it.

For, even though the MSM is firmly in the AGW camp, they will not be able to resist a juicy exposé. Juicy makes ratings and sells advertising. They will ignore that they have been made fools of.

58. How do we know that the HARRY_READ_ME.txt file and the data he’s working on represent the official temperature output of CRU?

Maybe he was put to work on a special dataset that had been corrupted somehow?

Surely there is more than only one guy doing this stuff at CRU??


  59. hmmm (09:14:27) :

    It’s almost as if whoever wrote these notes wanted us to find them.

    Completely missed the point; these were entered/typed up AS the code was being written/debugged/maintained/retrofitted.

    It is almost obvious you have never ‘coded’.
    .
    .

60. Here is the output of some code from a Briffa-related file. If you run a flat temperature graph through it, it gives a Hockey Stick. If you run an inconveniently divergent tree-ring graph through it, it acts as a trick to hide the decline.

  61. In reading through this … stuff … I found my mind seemed to be stuck in a goto loop, continuously replaying that YouTube video — “Hide the decline … hide the decline”.

    And then I come to this gem —

    What the hell is supposed to happen here? Oh yeah – there is no ‘supposed’, I can make it up. So I have :-)

    This is what happens when you approach a problem with a preconceived belief system in place describing what’s happening. Data doesn’t conform to your belief? No problem – fix it so the problem goes away.

  62. hmmm (09:14:27) :

    > It’s almost as if whoever wrote these notes wanted us to find them.

I disagree, though he is writing for future readers. Faced with the same morass, I likely would have kept something similar to Poor Harry’s Diary. It would be useful to my manager to help chase after the data providers and to document what I had been doing for the next performance review. It would be useful to me as something to refer to when faced with those “I thought I fixed that already” moments (there’s at least one of those comments in his diary). And it would be useful to cram down the original authors’ throats someday.

For a system this complicated, it’s difficult to keep some of the system issues straight in the comments, which is another reason it’s worthwhile to have a separate document. (Ideally that would be a design specification, but none came with the code. When porting software like this, I generally start with the first things that need to run and end with the things that pull it all together. In the future I’m going to add a pass to scan through everything and try to get a sense of what it does. That was my starting point decades ago, but with experience and skill, programmers develop a “disdain” for the whole and can quickly find their way to the core problems. Fortunately, I’ve never had to work on something as much of a mess as this.)

  63. Hilarious:

    ‘discovered that a sum-of-squared variable is becoming very, very negative!’

    Don’t they know ‘ i ‘ ??? imaginary as some theories?

    For Europe the subject is less that hilarious!
    Schellnhuber is the adviser of Kanzlerin Merkel and of the President of the European Commission Barroso!
    A position with extreme power.
    Now look at his writings, with Mann, Schneider,Rahmstorf and others, very recent:

    http://www.copenhagendiagnosis.com/

    and on the reports in the media:

    http://www.n-tv.de/politik/dossier/Noch-gibt-es-Hoffnung-article79835.html

    http://www.spiegel.de/wissenschaft/natur/0,1518,663045,00.html

  64. Anthony–

I can’t express the gratitude we all owe you for pulling these programmer comments out of the code; so revealing. A question about the ‘FOI’ headers on the file paths: where did they come from? What do they signify?

65. I did some numerical coding for my MSc-equivalent thesis, and in theory it is possible to obtain a negative number by adding positive ones if the total is large enough to overflow and the most significant bit is the sign bit, with a value of 1 for negative numbers (as in two’s-complement integers). Under those conditions, an untreated overflow will produce negative numbers.

If the code does not handle that, the results are no good. The solution is to use data types with more bytes, and some study of the numerical stability of the algorithms employed is a must. However, it’s amazing that such a newbie error could be made at one of the most prestigious research institutions, so another explanation could be possible. We did not find this problem in our work because we used algorithms that we knew were stable with the sets of data employed. Normalization of the data also helped :-)

    However, it amazes me that a organization with so much at stake in numbers did not get some basic reference texts* and had to resort to search for algorithms to calculate distances across Great Circles in Wikipedia!

    * Such as this, edited in 1992: http://www.amazon.com/Numerical-Recipes-FORTRAN-Scientific-Computing/dp/052143064X
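
    A minimal sketch of the failure mode Harry hit (in Python/NumPy purely for illustration; the CRU code is Fortran, and these data values are invented): each square fits in a signed 32-bit integer, but accumulating them in 32 bits wraps past 2**31 - 1 and comes out very negative.

```python
import numpy as np

# Hypothetical data: each square fits in a signed 32-bit integer,
# but their sum exceeds 2**31 - 1 = 2,147,483,647.
values = np.array([40_000, 40_000], dtype=np.int64)
squares = values ** 2                              # 1,600,000,000 each

good_sum = int(squares.sum())                      # 64-bit accumulator: correct
with np.errstate(over="ignore"):
    bad_sum = int(np.sum(squares, dtype=np.int32))  # 32-bit accumulator: wraps

print(good_sum)   # 3200000000
print(bad_sum)    # -1094967296  ("very, very negative")
```

    The same sum done with a wider accumulator is fine, which is why “use a bigger data type” is the standard fix.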

  66. Maybe it’s time to GPL the codes… Make them a true public venture where everyone can see what’s being done.

    No rob, it’s not hot stuff. No one cares. Outside this narrow little circle of people who care about truth, nobody gives a f—

    AGW plays to some sort of primitive pre-rational impulse or mythic structures that some people have … and that’s it. There’s no science here. Just stories.

    Expect green taxes. No nuclear power stations. Shutting down the coal fired stations we do have.

    “My three main goals would be to reduce human population to
    about 100 million worldwide, destroy the industrial infrastructure
    and see wilderness, with its full complement of species,
    returning throughout the world.”
    -Dave Foreman,
    co-founder of Earth First!

    More here: The Green Agenda

    The warmists are essentially the forces of the counter-enlightenment – quite literally the people who want to put the lights out. Do you think they’ll be swayed by truth?

  68. I keep seeing this line in the code;

    ; plot past 1960 because these will be artificially adjusted to look closer to
    ; the real temperatures.

    but what does it actually mean?
    Observed Temperatures?
    Homogenized Temperatures?
    Fudge Factor Temperatures?

    Not really sure what to make of this statement as yet…

  69. Having been a developer for 25 years (I’ve written a fair amount of Fortran code), I can sympathize with the programmer. He’s been asked to produce consistent output that appears plausible from a train wreck of ratty source data collections. The collections don’t agree where they overlap, so he’s been told which ones to use over which time periods on what appears (to him) to be a completely arbitrary basis; to make it all fit together he has to “fudge” the numbers and write code to handle each set differently. It’s as if he’s been asked to generate the data for a bank’s corporate tax return using checkbook register copies obtained from customers and the bank’s stock price for the last year, along with general ledger entries from three different internal accounting systems. The results of his efforts are then to be used at the annual shareholders’ meeting in two weeks’ time, and if they’re wrong, the CEO will make him the fall guy.

    Programmers are normally a meticulous bunch, and this is a programmer complaining bitterly about the crap he’s been handed. Further, he’s upset that he’s been asked to produce plausible results in short order by people who as scientists are supposed to care about accuracy as much as he does, but clearly don’t care in this situation so long as the results come out looking like what they want to see. His notes clearly indicate that he wants everyone to know what he was facing and what sort of shenanigans had to be done to make it all look good.

    What the public is finally getting to see is just how inconsistent and error-prone all the source data is and just how much manipulation is going on behind the scenes. Everyone has been led to believe that the measurement data sets are pristine, accurate and from data sources shown to be reliable. It’s assumed they are good enough to be used to make decisions that affect millions of people and cost billions of dollars. What the public is now seeing is a real shock, and people are coming to a common-sense conclusion: the data is crap, and the results can’t be trusted at all.

    I feel for the programmer….I would not want to be in his shoes.

  70. The ‘science’ behind the AAM narrative can now be seen clearly for what it is: a cloak to conceal the true intentions of the political classes.
    I suppose the false cloak of science was only meant to cover the political aims until the point had been reached when the political narrative no longer needed a cloak to hide behind.
    The BBC.
    The University of East Anglia.
    The Climatic Research Unit.
    The UK Meteorological Office (Met Office).
    The Hadley climate research centre.
    The US link being the Goddard Institute for Space Studies (GISS).
    All of these institutions are involved in the scandal to at least some extent, and by some strange coincidence all of them are fanatical believers in and supporters of the AAM narrative. They report the science as settled and beyond doubt, they indulge in attacking sceptical scientists and flout the rules when it suits them, and they feel they are above reproach and audit, perhaps because they have powerful friends?
    Of all the global-reach media platforms the BBC has been perhaps the most fanatical in peddling the anti-CO2, anti-capitalist, anti-industrial, anti-free-market stories of the ‘eco industry’. It has used its nearly unlimited resources to provide a tsunami of supposed evidence, however thin and patchy, using state-of-the-art visual manipulation and the propaganda arts to full effect. The link to all of them is money and/or political affiliation; they are a closed loop of interconnected people with a common agenda, showing the arrogance of those who know they have friends in high places, powerful and influential allies.
    In attacking the fake cloak of science we are in fact not directly attacking the puppet masters driving the entire narrative, like a bull attacking the matador’s cloak. The political forces need this sea change in our civilisation, and they feel that they cannot be open and honest with us about the reasons for the required changes, so they cover it in lies. Whatever the real reasons for the massive reordering of our entire civilisation are, I suspect that AAM is not one of them.
    To get at the truth we must expose the entire chain and the links in that chain. The political classes are moving fast now, faster perhaps than they expected to, because the foundation of their scam, i.e. a warming planet, is not happening. Without the cloak of fake science to cover it, the real movers are exposed for the world to see; if the political classes refuse to alter their policies when the science is exposed as fraudulent, we will know who is behind the cloak.

  71. John F. Hultquist (09:21:16) :

    ‘“a sum-of-squared variable is becoming very, very negative!”

    This is an interesting “trick.”…. ‘

    I had this happen using a common spreadsheet – which shall remain unnamed here – several years ago. I no longer use any spreadsheet for any serious numbers. Under and overflow conditions can occasionally break things. Commercial programs are particularly poor at documenting such things, and worse at fixing them.

    On another note, the QC on the climate history data “Harry” is trying to work with is terrible. Whoever was responsible for supervising the primary data entry should have been running consistency checks, possibly every day, before forwarding it to Hadley or wherever. This is particularly important if the data is keyed from written records. Typos, misreadings etc. can creep in, and often, due to the character of the data recording methods, no useful filters can be employed to catch subtle errors during the entry process (e.g. instead of a proper data entry system the data was keyed into a spreadsheet line by line) – and Harry obviously is trying to deal with such ugly data.

  72. OMG! The sum of squares parameter going negative? This isn’t possible! Even if all computed values from the model deviate negatively from the actual data, the squares of these numbers are positive values and thus the sum of squares is always positive.

    I had a similar problem fitting titration data back in graduate school where the fitting statistics weren’t working out. It turned out I had coded in an incorrect equation for a derivative, neglecting to multiply by ln(2) in one line out of a thousand lines of code. It was plainly obvious something was wrong with my model from output plots, but wasn’t obvious in the code.

    I feel for the programmer, but it’s his job to straighten out glaring problems, particularly when the output is a mathematical impossibility.

  73. jamespapsdorf (09:40:10) :

    The problem is that quoting Limbaugh, Beck et al. will get us nowhere – it is far too easy to dismiss them as politically motivated, and they have no credibility with around 50% of the population (in the USA) and of course a much lower number globally (remember – this is a global issue).

    No – I think the solution to this is an independent review, as called for by Lawson in the UK and, I understand, by one of the Senators in the US.

    Attack with facts, deconstruct the code issues and eventually the MSM might, just might, start to run with it.

    My opinion – for what it’s worth – I think we are too late.

  74. ralph (09:51:07) : writes:

    O.T. but:
    Did you know that the BBC held a secret meeting that decided NOT to give balanced time to GW and anti-GW viewpoints. Apparently, the communique said:
    “The BBC has held a high-level seminar with some of the best scientific experts, and has come to the view that the weight of evidence no longer justifies equal space being given to the opponents of the consensus.”

    This is only partly correct, my friend.

    For over 3 years I have been trying to elicit answers from both Mark Thompson (Director General) and Sir Michael Lyons (Trust Chairman). All I had received was sophistry and obfuscation, until I engaged the help of my MP.

    Recently it came to light that a report had been commissioned in June 2007 jointly by the Trust and BBC Board of Management entitled “From Seesaw to Wagon Wheel: Safeguarding Impartiality in the 21st Century”. It concluded: ‘There may now be a broad scientific consensus that climate change is definitely happening and that it is at least predominantly man-made… the weight of evidence no longer justifies equal space being given to the opponents of the consensus’.

    (SO THEY HAVEN’T EVEN TRIED TO MAKE A SECRET OF THIS…JUST SHOWS THEIR ARROGANCE!)

    Despite this damning evidence from their own report, they steadfastly cling to the belief that their impartiality is intact as required by the BBC Charter. Such is their state of denial that Sir Michael Lyons has even tried to deliberately mislead my MP despite evidence I have to the contrary.

    In light of this I have posed the question, through my MP: “On whose authority did the BBC cease to be an impartial Public Service Broadcaster, as required by its Charter, and become the judge, jury and sponsor of such dangerously specious political dogma so eloquently described as ‘…the consensus…’?”

    Answer comes there none! I believe it is time for the BBC to be subjected to an enquiry on this matter.

  75. JCS (10:01:28) :

    Steve Gavin at RealClimate has refused more than 6 times to post the following message, are you willing to present this very important point for me?

    JCS says:
    Your comment is awaiting moderation.
    25 November 2009 at 12:58 PM

    Gavin,

    I have repeatedly tried to post comments on your website asking you to respond to the following statement:

    If the first rule of science is to question everything, and another fundamental rule is that no hypothesis can be proven true, regardless of imposing the precautionary principle, why is the first rule and another fundamental rule being discarded, and
    XXXXXXX

    Real climate says all comments are shut off for 2 days. The adverse comments outnumber the puff comments 10:1. It looks like the outrage is being posted over there.

    CEI has sued Schmidt, Gavin for working on the blog instead of doing NASA work. There are years of FOIA requests in the queue at NASA GISS waiting to be released.
    Just to be graphic, Gavin Schmidt is pimpin’ global warming when he is not doing the work he is paid to do.

    Human Resource managers have a problem when people are on the job and doing work for themselves. Earlier a mod named “eric” was doing all the comments and then they shut down. Yesterday Gavin posted a request for someone to volunteer to help.

  76. The blog post should distinguish between the various bits of code.

    The stuff in folders like osborn-tree6\mann and files like harris-tree\recon1.pro is most likely programs used to generate things for peer-reviewed articles.

    This is quite different and distinct from the code used to produce the HADCRUT3 temperature series. I am not positive, but it does appear that the HARRY_READ_ME.txt file is about the “CRU Code” as most of us would interpret it — the code used to produce the HADCRUT temperature record.

  77. I am a bit worried, though. Nothing is showing in the MSM. The folks at RC seem to be moving on as if nothing happened. It’s almost as if they have all agreed to never speak of it again. The only ones discussing this are us and the likes of Glenn Beck, Limbaugh. This reminds me of “1984”, a surreal situation where everybody knows the truth but everybody pretends that they don’t and keep on shouting that the world is going to burn. I cannot believe that such an opportunity to kill the AGW theory is just going away as if it never happened.
    What is going on? Is the world mad, or are we?

  78. Re. the “sum-of-squared variable is becoming very, very negative”.

    I’d imagine that was due to an overflow.

    For the non-programmer folks: there are various data types that can be used to represent numbers (signed integer, unsigned integer, float, double etc.), but they aren’t capable of representing arbitrarily large numbers. For a signed integer, if you exceed the limit you generally wrap around to a negative number of the same magnitude.

    i.e. MAX_NUMBER + 1 -> -(MAX_NUMBER + 1)

    The fix is simply to use a type that allows larger numbers.
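
    That wrap-around rule can be shown in a couple of lines (a NumPy sketch, not the CRU code; Fortran’s default integers wrap the same way):

```python
import numpy as np

MAX_NUMBER = np.int32(2**31 - 1)        # largest signed 32-bit value
with np.errstate(over="ignore"):        # silence NumPy's overflow warning
    wrapped = MAX_NUMBER + np.int32(1)

print(int(wrapped))   # -2147483648, i.e. -(MAX_NUMBER + 1)
```

    Re-running the same addition with a 64-bit type gives the expected 2147483648, which is the whole fix.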

  79. Obvious explanation (09:22:20) :

    “But gavin says we’re taking this out of context…..”

    I’m amused by the use of the king of spin to explain away the “inconvenient truths”. It’s sort of like asking the fox if he knows what caused the commotion in the henhouse. “Nothing here to see, move on.”

  80. Just watched ‘The Cloud Mystery’ on YouTube about Svensmark’s work on cosmic rays and their creation of the aerosols on which clouds form. Interesting that his experiment was conducted in Copenhagen. You don’t think December’s meeting is a distraction ploy while a certain group smashes his lab up??

  81. >>” Bernie (10:01:40) :
    In fairness that is too general a statement. It is important to be precise and specific, otherwise folks at RealClimate who actually really know their stuff will simply rip you to shreds. Certain critical pieces of station data have been requested. Certain pieces of code have been requested. More generally, authors of climate science research papers have been asked to post their raw data and their code in a way that will allow a complete replication of their results by interested third parties. It is the Institutional and individual refusal to do these simple things that has caused the questioning of the motives of climate scientists in general.”

    Has UEA/CRU released any of their modeling code to the public? The impression gathered from reading the “liberated” e-mails and programming code is that they were doing everything possible to block open peer review of their work, and were instead trying to keep as much of their data (which maybe should be termed “data” given what we’re learning about its quality) and “secret sauce” code concealed from other scientists, let alone non-professional lay scholars and the public.

    It’s a little shocking — why on earth isn’t every single piece of modeling code originating in university and public labs released to the public and subjected to open review by computer scientists, code writers, mathematicians and other climatologists? To increasingly learn that it’s not is a surprise even to me. How widespread is this kind of concealment?

    HSBC, Citibank and the CIA can keep their internal climate code secret. But there’s no justification for universities and academics to conceal theirs. It’s a subversion of the scientific process. And it demands that the question of motive be answered.

  82. The Pennsylvania State University has a prepared statement you may request by calling the Office of Public Information at 814-865-7517. I have it if you can’t get through.

  83. After reading all of this, I would readily bet that Harry is the whistleblower. His words are those of someone becoming really angry, and less and less confident in the scientific integrity of the team he was working for. I cannot put myself in his shoes, but it is not difficult to imagine that, some day, possibly discovering the FOIA stuff, he decided that it was too much.

  84. “Mike (09:38:32) :

    One should just run this code on some totally random data series and plot the output.”

    Given the sustained bias of the “synthetic” corrections, I wonder if it would actually be possible to output anything BUT warming. Would it be possible to put in made up declining data and see what happens? I suspect it would come out as a hockey stick anyway.

    I sincerely hope we will find out, and interview the person who wrote the HARRY_READ_ME file. His insight into what was going on is invaluable.

  85. Now I am convinced it was an inside job. Seeing the exasperation in the comments paints for me a very convincing picture of who the leaker was. Some code monkey, who was probably also doing double duty as an IT tech. He (or she) finally got fed up with the constant demands of the heads to do things that fly in the face of both ethical scientific procedure, and worse still, best practice computer programming.

    I guess they worked him one Sunday too many, or gave him black marks on his review because he couldn’t get the computer to say what they wanted. The programmer went off the deep end and decided to start compiling a file of some of the more incriminating skeletons in the CRU’s closet.

  86. I have worked as a professional programmer for more than 20 years, and I think that the language in these comments is strange, to say the least. I mean – I have often been swearing over poorly documented spaghetti code – (almost) as bad as this one – but I have NEVER put the swearing into writing. In my opinion, this stinks. It seems that the author of these comments WANTED the world to see them. He is certainly writing to another audience than his fellow programmers. So there are two possibilities: 1) Either, the programmer (Harry?) is the whistleblower, or 2) This is a trap.

  87. Hultquist, Bosseler:

    I’m not a programmer so I’ll probably say this wrong. I came across a commenter the other day (wish I could remember where) who seemed to have an explanation for the sum-of-squared variable going negative. He said it was a common error for inexperienced programmers to make: repeatedly incrementing a variable until the sign bit gets flipped. Kind of an overflow problem. He took it as an indicator of the quality of the coding….

  88. Henry chance:

    Real climate says all comments are shut off for 2 days. The adverse comments outnumber the puff comments 10:1. It looks like the outrage is being posted over there.

    CEI has sued Schmidt, Gavin for working on the blog instead of doing NASA work. There are years of FOIA requests in the queue at NASA GISS waiting to be released.

    Just to be graphic, Gavin Schmidt is pimpin’ global warming when he is not doing the work he is paid to do.

    Human Resource managers have a problem when people are on the job and doing work for themselves. Earlier a mod named “eric” was doing all the comments and then they shut down. Yesterday Gavin posted a request for someone to volunteer to help.

    I notice that Joel Shore has also stopped posting as of 11/19.

    I’ve warned Joel before that every post has a time/date stamp.

  89. >>” Paul (10:12:48) :

    I keep seeing this line in the code;

    ; plot past 1960 because these will be artificially adjusted to look closer to
    ; the real temperatures.

    but what does it actually mean?
    Observed Temperatures?
    Homogenized Temperatures?
    Fudge Factor Temperatures?

    Not really sure what to make of this statement as yet…”

    I’ve presumed it means the following: ‘Given the underlying dataset, our model predicts that the years from 1960 onwards should have been warmer…. warmer, that is, than they actually were. If we therefore publish the “unpolished” results from this model, people will be able to compare the predicted temps to the actual temps…. and it’ll become clear that our model makes poor predictions. Then they’d conclude that its predictions about the future obviously cannot be relied upon….’ And we wouldn’t want anybody doubting the model, would we?!

    That’s a personal supposition; informed enlightenment requested.

  90. “Do any other sciences permit one to hide calculations in a program and then not publish said program with the paper?”

    Yes and no. Journals have differing standards; sometimes they’re enforced, sometimes they’re not; sometimes they would go to the length you describe, sometimes not. This particular crew at Hadley has taken heat because the policy implications of their work imply a heavy right to know on the part of the public, especially for people who wanted to and were capable of reviewing the methods. At that point the journals’ policies on the matter were examined and the journals were prevailed upon to actually enforce them, which they often didn’t, which led to numerous and multiplying efforts to get the information either through enforcement of those policies or subsequently through FOIA requests, which has eventually led to… this.

    But if we’re talking about a study in biology on the contents of the feces of some frog in the far corners of the jungle, it’s likely the journal it’s submitted to wouldn’t enforce their policy, and it’s likely no one would care. People cared here, and these guys having been lifetime academics and thus never actually having to earn a living via the quality of their work, thought they could simply blow off their detractors.

  91. Hysteria, I’m not so sure. O’Reilly has, I believe, said that Global Warming is real and something needs to be done about it. So did McCain during his bid for the Presidency. That indicates a fraction of the conservative base had been convinced this was a real issue that needed to be addressed.

    This revelation of subterfuge and skulduggery has certainly made a bunch of conservatives re-think this position, and I’m sure libertarians and a handful of liberals as well. Beat this drum loud enough, often enough, and the support base for global warming hysteria will once again return to little more than tree-hugging alarmists. But the time to act is NOW, before any more talk of ‘cap and trade’ or Copenhagen concessions make their way through Congress.

  92. A bit OT, but maybe not too far. A question for the legal beagles out there: When a close circle of researchers conspire to block another researcher’s publication, would that not be tortious interference under the law?


  93. Hysteria (10:19:26) :


    My opinion – for what its worth – I think we are too late.

    It is, however, going to make interesting reading in the history books, on a number of different levels, perhaps even on a par with Piltdown Man(n).

    The “Piltdown Man” is a famous paleontological hoax concerning …

    The Piltdown hoax is perhaps the most famous paleontological hoax in history. It has been prominent for two reasons: … and the length of time (more than 40 years) that elapsed from its discovery to its full exposure as a forgery.

    http://en.wikipedia.org/wiki/Piltdown_Man

  94. The worst possible scenario now is that the Sun will continue down its declining path, CERN’s CLOUD experiment will pan out to support Svensmark, and the global climate will enter territory we don’t really understand that signals real trouble. And all because some overzealous hypothesis funding gravy-trained the world’s climate databases into a spaghetti-coded event horizon.
    This is a perfect example of Murphy’s Law striking mankind due to pure greed.

  95. New internet meme: “Harry_Read_ME”

    Examples:
    HRM: You may have the chart upside down.
    It was a HARRY_READ_ME job.
    After an industrial accident: Their control code was still waiting for Harry to read it.
    If only Harry were here to read this.
    Etc.

  96. Over at Connolley’s blog I posted:

    “The Emails show that Jones and Mann can’t be trusted. HARRY_READ_ME shows that the code is incompetent and the code itself shows manual adjustments that have no scientific basis. This is sufficient evidence to call for a third party review of the entire CRU methodology. ”

    To which I got two replies:
    ——
    PR Guy – what papers was the HARRY_READ_ME code used on? Have you any evidence it was used at all?

    Posted by: Chris S. | November 25, 2009 12:07 PM
    —–
    PR Guy,do you even know which dataset/product the HARRY_READ_ME code is dealing with?

    Posted by: Adam | November 25, 2009 12:46 PM
    —-

    To which I responded:
    “Chris S and Adam, these are very reasonable questions. Perhaps you should submit a FOIA to find out. I’m sure we all agree that answers to these sorts of questions are vital and should not be obstructed.”

    This last comment was deleted by William (or maybe the moderator, if there is a moderator). The Team never lets points get scored against them on their court.

  97. I’ve done some programming myself. I used to put in swear words in the code all the time.

    In fact, I would even use the F word and variations thereof for variable and object names, LOL.

  98. I keep coming back to the 1960 date. Newer data did not bear out the global warming on which its advocates had staked their careers. Therefore they played with it and made it up to get the results they wanted. What they seem to miss in this scientific fraud is that the newer data might be more reliable, and that instead of it being wrong, maybe their old data and their entire approach or methodology was the problem. Can anyone seriously deny that they need to simply start from scratch? Put the real data out for everyone to access and analyze.

    BTW, I manage software development projects and work with programmers. I have NEVER seen anyone document code or write read-me files like those seen in these code directories. This is so incredibly unprofessional and raises big red flags about what was transpiring. Didn’t anyone ever look at the code?!

    And as this was funded by US and UK tax dollars, who had oversight and audit responsibilities? A few heads need to roll…

  99. Hi Folks
    What do you think about this one? It looks as if the “data adjustment contamination” has infected New Zealand as well. Look at

    http://www.climatescience.org.nz/

    and click on the link: CLIMATEGATE IN NEW ZEALAND? – TEMPERATURE RECORDS MANIPULATED
    It is incredible to read how New Zealand’s National Institute of Water & Atmospheric Research (NIWA) seems to have managed to make a “hockey stick” out of raw data which apparently shows that there has been no warming of any consequence since 1850. One wonders who else has been involved in this game.

  100. I mean seriously, [snip]????? “yearlyadj”? Temp proxy declines so just add a ramp to the values??????

    ;mknormal,yyy,timey,refperiod=[1881,1940]
    ;
    ; Apply a VERY ARTIFICAL correction for decline!!
    ;
    yrloc=[1400,findgen(19)*5.+1904]
    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
    2.6,2.6,2.6]*0.75 ; fudge factor
    (…)
    ;
    ; APPLY ARTIFICIAL CORRECTION
    ;
    yearlyadj=interpol(valadj,yrloc,x)
    densall=densall+yearlyadj
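
    To see what that ramp actually does, here is the same calculation redone as a sketch in Python/NumPy (purely illustrative; the original is IDL, whose interpol defaults to the linear interpolation that numpy.interp performs):

```python
import numpy as np

# yrloc / valadj transcribed from the IDL snippet quoted above.
yrloc = np.concatenate(([1400], 1904 + 5 * np.arange(19)))
valadj = 0.75 * np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                          0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6])

years = np.arange(1400, 1995)
yearlyadj = np.interp(years, yrloc, valadj)   # linear, like IDL interpol

# Zero for five centuries, a slight dip mid-century, then a ramp that
# adds up to +1.95 to the density series by 1989-1994.
print(round(float(yearlyadj[years == 1400][0]), 2))   # 0.0
print(round(float(yearlyadj[years == 1994][0]), 2))   # 1.95
```

    The shape of the ramp, not any physical argument, is what produces the post-1960 “correction”.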

  101. I had a really scary thought…

    What if the problem isn’t with the tree rings, but with the temperature series?

    What if the temperature series is actually off by 2.6C (high) since 1940?

    Does that mean we’re actually 2C or more below the 1940s temps?

    Are we screwed?

  102. The major problem with climate science (along with much of academia) is that they don’t use Software Engineers following good software development practices to develop the programs, models, etc. Mostly it’s just PhDs hacking stuff together. They are very smart… but software engineering is not their expertise.

    Ask them where the Requirements documents, Design documents, Review documents, Test plans, Test results, Configuration Control plans etc. are.

    Now they are asking for $100 billions to be spent based, to some extent, upon software that has not passed any formal testing….

  103. JCS
    “Steve Gavin at RealClimate has refused more than 6 times to post the following message, are you willing to present this very important point for me?”

    LOL. JCS, welcome to the club of thousands who Gavin has moderated out because he finds their comments inconvenient. Gavin is a part of the AGW cabal. He is on the distribution list for many of the e-mails from CRU gate. Some of those comments make it clear that Jones and others consider Gavin as the guy that runs interference for them. I have written a small piece about how debates are orchestrated at RC here:

    http://reallyrealclimate.blogspot.com/

  104. HARRY_READ_ME is a great work of stream-of-consciousness literature, and perhaps Harry was aware of it: when they asked him to delete it, after he got all excited about it being included in a Freedom of Information Act package that was then denied release, he said to himself:

    “No, so holp me Petault, it is not a miseffectual whyancinthinous riot of blots and blurs and bars and balls and hoops and wriggles and juxtaposed jottings linked by spurts of speed: it only looks as like is as damn it; and, sure, we ought really to rest thankful that at this deleteful hour of dungflies dawning we have even a written on with dried ink scrap of paper at all to show for ourselves, tare it or leaf it, (and we are lufted to ourselves as the soulfisher when he led the cat out of the bout) after all that we lost and plundered of it even to the hidmost coignings of the earth and all it has gone through and by all means, after a good ground kiss to Terracussa and for wars luck our lefftoff’s flung over our home homeplate, cling to it as with drowning hands, hoping against all hope all the while that, by the light of philosophy, (and may she never folsage us!) things will begain to clear up a bit one way or another within the next quarrel of an hour and be hanged to them as ten to one they will too, please the pigs, as they ought to categorically, as, strickly between ourselves, there is a limit to all things so this will never do.” – James Joyce (“Finnegans Wake” 1939)

  105. Are there perhaps some people around still denying this fairly decent evidence of poor science? Is there some term we could use for them perhaps?

  106. ;mknormal,yyy,timey,refperiod=[1881,1940]
    ;
    “; Apply a VERY ARTIFICAL correction for decline!!
    ;
    yrloc=[1400,findgen(19)*5.+1904]
    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
    2.6,2.6,2.6]*0.75 ; fudge factor
    (…)
    ;
    ; APPLY ARTIFICIAL CORRECTION
    ;
    yearlyadj=interpol(valadj,yrloc,x)
    densall=densall+yearlyadj”
    ************************************

    Hmm. Are there edited versions out there? My file:

    ; Apply a VERY ARTIFICAL correction for decline!!
    ;
    yrloc=[1400,findgen(19)*5.+1904]
    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
    2.6,2.6,2.6]*0.75 ; fudge factor
    if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
    ;
    yearlyadj=interpol(valadj,yrloc,timey)
    ;
    ;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
    ;oplot,timey,tslow,thick=5,color=20
    ;
    filter_cru,5.,/nan,tsin=yyy,tslow=tslow
    oplot,timey,tslow,thick=5,color=21
    ;
    oplot,!x.crange,[0.,0.],linestyle=1
    ;
    plot,[0,1],/nodata,xstyle=4,ystyle=4
    ;legend,['Northern Hemisphere April-September instrumental temperature',$
    ; 'Northern Hemisphere MXD',$
    ; 'Northern Hemisphere MXD corrected for decline'],$
    ; colors=[22,21,20],thick=[3,3,3],margin=0.6,spacing=1.5
    legend,['Northern Hemisphere April-September instrumental temperature',$
    'Northern Hemisphere MXD'],$
    colors=[22,21],thick=[3,3],margin=0.6,spacing=1.5
    ;
    end
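    For readers without IDL, here is a minimal Python sketch (using numpy; the function name is mine, not CRU's) of what the adjustment step quoted above does: hand-picked offsets at fixed years, linearly interpolated onto every year and then added to the density series.

```python
# A hedged Python re-expression (not the original IDL) of the quoted
# "fudge factor" block: interpolate hand-entered offsets onto each year.
import numpy as np

# Knot points copied from the IDL: year 1400, then 1904..1994 in 5-year steps.
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))
# The hand-entered offsets, scaled by the 0.75 "fudge factor".
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

def yearly_adjustment(years):
    """Equivalent of IDL's interpol(valadj, yrloc, timey): piecewise-linear
    interpolation of the offsets onto the requested years."""
    return np.interp(years, yrloc, valadj)
```

    Evaluating it shows the adjustment is zero for five centuries, dips slightly negative around the 1920s–40s, then climbs to +1.95 by the mid-1970s – which is what "correcting" the decline amounts to numerically.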

  107. Reed Coray (09:35:23) : “http://i47.tinypic.com/mrxszt.png

    Great picture, but I have a question. Which one is Judas?”

    Easy. They all are. :-)

  108. If I produced work like that I would want to delete it also rather than give it to an auditor.

    Short of a detailed explanation and the full production of a model establishing this as nothing more than an interim piece of work, Jones should be fired for this alone.

    He should probably also be held to account for the fraud that this is. It is beyond incompetence for somebody in Jones’s position if he can’t bring anything forward to mitigate it!

    This would also appear to establish Jones as a liar in suggesting the “to hide the decline” behavior only dealt with modifying a graphic.

  109. Very disappointing. slashdot.org has only posted the original Nov 20th story about the fact that files had been hacked (or leaked) from the Hadley CRU. No follow-up about the content or the code that is being found.

    Global economic nightfall, perhaps aided and abetted by programmers, and not a peep out of slashdot?

  110. I’ve done some programming myself. I used to put in swear words in the code all the time.

    In fact, I would even use the F word and variations thereof for variable and object names, LOL.

    That isn’t considered very professional in my business, to be honest.

  111. Excuse me, Mr. Monbiot. Nobody has given me a copper-coated zinc penny.
    I want my weather back. Get it?

    Dishman (11:00:55) :

    Are we scrooged?
    Probably.
    Depends on whether you can trust that the GHCN data Jones/Karl have online now is not the same mangled mess HARRY was tasked with, while supposing that the MasterDB will somehow miraculously appear in a pristine state.
    Now, just close your eyes, click your heels 3 times, and say “there’s no place like home, there’s no place like home”.

  112. documents\cru-code\linux\cruts
    This code is used to convert new data into the new CRU 2.0 data format (.cts files).
    There is another version of this code in cru-code\alpha which is, per a comment in the readme file, intended for running on the “Alphas”.

    Data can come in from text files or Excel spreadsheet files (or, more precisely, Excel spreadsheets written out to text files from Excel). These programs are designed to read multiple climate data file formats, including:
    GHCNv2
    CLIMAT (Phil Jones format)
    MCDW
    CLIMAT (original)
    CLIMAT (AOPC-Offenbach)
    Jian’s Chinese data from Excel (appears to be text output from Excel)
    CRU time-series file format – with the comment “(but not quite right)”

    Data files for running these code files are not available in this archive.

    Software engineering comment – this collection of programs – very large source code files – implements a crude database management system. Most of the source code is uncommented and undocumented. From a s/w engineering perspective, it would most likely have been wiser, quicker, and more reliable to use an existing DBMS that had been extensively tested and verified; instead, the approach chosen results in an extremely large amount of custom code. There is no evidence of software quality assurance (SQA) procedures being applied – no test plan, test scenarios, unit testing, test-driven development, and so forth.

    The goal of the software is to eventually calculate the anomalies of the temperature series from the 1961-1990 mean.

    Because station reporting data is often missing, the code works to find nearby stations and then substitute those values directly or through a weighting procedure. In effect, the code is estimating a value for missing data. Station data will be used as long as at least 75% of the reporting periods are present (or stated the other way, up to 25% of the data can be missing and missing data will be estimated).

    The linux\_READ_ME.txt file contains an extensive description. Of interest, stations within 8km of each other are considered “duplicates” and the data between the stations is “merged”. I have a question about this which may not really matter – but there is no attempt to determine whether the nearby stations are correlated with one another. It is possible, for example, that one station is near a body of water (and less volatile) and another is on the roof of a fire station (see surfacestations.org). Or the stations could be at different elevations. In my town, the official weather reporting station moved 4 times over the past century – from downtown in a river valley to, eventually, up on a plateau next to a windy airport. These locations would today fall within the 8km bounding area. My concern is that this could skew results in an unpredictable way. Then again, it could be that situations like the one I describe are rare and would have negligible impact on the calculations.

  113. Robinson: I may have exaggerated a bit. I only did it for one particular employer, and it was because I was working 14+ hours a day and being placed on 24/7 pager duty with no extra compensation (imagine getting called during your grandmother’s funeral, or while in church on Christmas Eve – yes, it happened to me). I survived 20 rounds of layoffs at that company during the dotcom crash.

    So in order to relieve a bit of frustration, I had a bit of fun with my source code.

    And one of these programs was literally a SPAM DIALER, which was a robo caller that would annoy people with telemarketing promotions. It was quite efficient, it was able to call about 20,000 people a day (with 2 T1s).

    Yes, I confess that I programmed a spam dialer during the dotcom crash in order to pay the bills. And there were swear words in the source code! :) (and I don’t feel the least bad about it).

    The company is now dead and bankrupt and I danced a little jig when I found out about it a few years later!


  114. Hysteria (10:19:26) :


    My opinion – for what its worth – I think we are too late.

    It is, however, going to make interesting reading in the history books, on a number of different levels, perhaps even on a par with Piltdown Man(n).

    The “Piltdown Man” is a famous paleontological hoax concerning …

    The Piltdown hoax is perhaps the most famous paleontological hoax in history. It has been prominent for two reasons: … and the length of time (more than 40 years) that elapsed from its discovery to its full exposure as a forgery.


  115. In spite of all this, a BBC program announcement for this evening, ref Copenhagen, is: “Can President Obama save the Planet?” I despair.

  116. “a sum-of-squared variable is becoming very, very negative!”

    …nice… the only way to get a large negative from a sum of squares in exact arithmetic is if the numbers are “imaginary” [no pun intended – i.e. i = sqrt(-1)]

    …but an ironic comment nonetheless, given the second meaning of “imaginary” numbers (i.e. they made them up)
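    A less exotic explanation than imaginary numbers – offered here as a hypothesis, since we can’t run the original Fortran – is fixed-width integer overflow: a running sum of squares held in a 32-bit accumulator silently wraps past its maximum and comes out negative. A Python simulation of that wrap-around:

```python
# Hypothesis (not taken from the CRU code): a sum of squares goes
# "very, very negative" when the accumulator is a signed 32-bit integer
# that silently wraps, as a Fortran INTEGER*4 would.
def int32(x):
    """Wrap x to signed 32-bit two's complement."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

def sum_of_squares_32bit(values):
    """Accumulate squares the way a 32-bit integer variable would."""
    total = 0
    for v in values:
        total = int32(total + int32(v * v))
    return total

# 50000**2 = 2,500,000,000 already exceeds 2**31 - 1 = 2,147,483,647,
# so a single modest-looking value is enough to drive the "sum" negative.
```
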

    The real meat of this whole deal is likely to be found in the code and in how the data has been manipulated. Not that this is necessarily illegal, but it will clearly show that the science is not nearly as settled as it is reported to be (especially for making trillion-dollar decisions based on POS code). It would also appear, based on all the information I have seen so far, that the magnitude of warming over the recent historical record (roughly 130 years) may be less (possibly significantly less) than has been represented.

    The significance of that last statement cannot be over-emphasized. This is what needs to be determined (or re-determined) ASAP. I wish I had more time, as I would delve into it.

    The original data isn’t necessarily needed to do this. If a synthetic dataset of reasonable similarity were created and then run through the code, you could compare the output to the input and get a sense of how the data has been distorted. With several synthetic datasets built on different assumptions, you could test the sensitivities to different aspects of the input. With that information in hand, you could probably create reasonable scalings to estimate what the original data looked like and how it has been distorted and presented to the public.

    This approach is somewhat similar to what Steve McIntyre did with the hockey stick debunking, where he eventually showed that even inputting a random number sequence resulted in a “hockey stick”. If someone with the time and skills took the same approach with this code, it might kill not just a “hockey stick” but AGW in total – in other words, it is possible that no matter what raw data you put in, the global temp trend always comes out increasing. Don’t dismiss this possibility! If this hypothesis were borne out, AGW would be categorically dead. Given the implications, it certainly seems worth the time and effort.

    The above scenario is interesting to contemplate, but the most likely scenario is that the actual warming is less than represented. Keep in mind that if you just do a logarithmic curve fit to the data as currently presented (temp and CO2 have a logarithmic relationship theoretically), the sensitivity in terms of degrees per CO2 doubling is already substantially below the IPCC numbers (see my post on “spencer on finding a new climate sensitivity marker”, 10-5-09). So if the actual temp trend coming out of CRU is flatter than currently represented, the sensitivity is smaller still. Of course, that means that going forward there is no way to represent CO2 as a significant problem, and AGW as a “problem” is dead.

  117. documents\cru-code\f77\mnew\sh2sp_m.for

    This program, sunh2sunp, converts the “sun hours monthly time series to sun percent (n/N)”. I do not have access to the cited reference used for calculation so have not yet determined if the code is implemented correctly.

    However, in the odd situation where the calculation exceeds 100%, the code, surprisingly, checks for this but then leaves the incorrect value in place:

    c decide what to do when % > 100
    if(sunp(im).gt.100)sunp(im)=sunp(im)

    For non programmers this says, in simplified form
    if x > 100, then let x = x

    Normally, an incorrect value is either flagged as an error, or – if the overshoot could be due to round-off error, as perhaps here – clamped. In that case we might expect something like:
    if x > 100, then let x = 100
    which would force the value of x to never exceed 100.

    The purpose of this program and how it fits into any analysis is not yet understood. The program appears to date back to 1997 (and probably went out of usage by 2003) and it may no longer be in use. It is entirely possible that the above error condition never occurred – and consequently, this defect in the software would have no impact on the results.

    In a separate code file (sp2cld_m.for), the above test is implemented correctly:
    IF(CLD(im).GT.80.0) CLD(im)=80.0

    The file exhibits poor Fortran coding standards such as:
    ratio=(REAL(sunp(im))/1000)
    IF(RATIO.GE.0.95) CLD(im)=0
    Note the lower-case ‘ratio’ and upper-case ‘RATIO’ spellings of the same variable (Fortran is case-insensitive) – legal, but sloppy.

    The variables XLAT and RATIO are not declared; similarly iy, iy1, iy2. Fortran permits this practice (implicit typing) and automatically assigns a type based on the first letter of the variable name: A through H and O through Z are typed ‘real’, I through N ‘integer’. Use of this feature is discouraged because the compiler is then unable to flag typographical errors – instead of warning about an undeclared variable, it silently defines a new one, which can result in erroneous program operation. Note – this is a software engineering issue and is not the source of any identified execution error in this program; it is simply poor programming practice.

    Note – the issues I cite do not mean the program executed incorrectly. They are indications of poor programming practice. And I believe we the people deserve the utmost care and professionalism in a matter as important as this.
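    For non-Fortran readers, the two sensible alternatives the comment contrasts – clamping versus flagging – look like this in a minimal Python sketch (the function names are illustrative, not from the CRU sources):

```python
# Illustrative sketch of the two sensible alternatives to the no-op
# "if x > 100 then x = x" above (function names are mine, not CRU's).
def clamp_percent(x, limit=100.0):
    """Cap a percentage at its physical maximum, as sp2cld_m.for does
    for cloud cover with IF(CLD(im).GT.80.0) CLD(im)=80.0."""
    return min(x, limit)

def flag_percent(x, limit=100.0):
    """Or treat values past the limit as errors to be reported upstream."""
    if x > limit:
        raise ValueError(f"sun percentage {x} exceeds {limit}%")
    return x
```

    Either choice would be defensible; assigning the out-of-range value to itself is not.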

  118. I have to say, speaking as an OBI/Hyperion system consultant, the biggest issues we have on virtually every project are:
    1) Serious data issues
    2) People don’t understand the data in the first place.

    I feel for Harry.

  119. Forgive me for jumping in here, as it’s a bit OT.

    Richard A. said:
    ‘But if we’re talking about a study in biology on the contents of the feces of some frog in the far corners of the jungle, it’s likely the journal it’s submitted to wouldn’t enforce their policy, and it’s likely no one would care.’

    Not so! To the contrary. Without documenting how many frogs you were studying, and where (not forgetting a control group, heh!) and at what time (dd/mm/yy), you’d get your paper sent back.
    All this comes under ‘Material and Methods’.
    Next you have ‘Results’.
    That’s where all your numbers go, and the stats.
    The point is, especially in biology, that anybody must be able to go where you went, do exactly as you’ve done, and come up with the same results (given a dead frog here and there …)
    Then you can talk about what you’ve done and what your results mean.

    That is why I, a retired zoologist, find these revelations so utterly distressing. If you don’t provide the data on which you’ve built your hypothesis, how can it ever be replicated? How can it be confirmed or refuted?

    Science is about replication of what you discovered – it’s not about secret knowledge which only the select are allowed to share.

    I am dismayed at the huge disservice these people have done to science.

  120. M.A.DeLuca (09:40:16) wrote :
    “If a physicist were to submit a paper without showing the math, that paper would (I assume) be rightly ridiculed and sent back with a “show your work” rebuke. It doesn’t seem right that one can hide one’s work in software, and then casually dismiss the absence of documented code upon submitting a paper as these yahoos have done. And yet, that seems exactly the way mainstream climatology works. Do any other sciences permit one to hide calculations in a program and then not publish said program with the paper?”

    M.A., I am a medical research scientist with five peer reviewed abstracts published in leading journals. Last year one of the papers I co-authored was selected for oral presentation (a high honor).

    Before we can even begin a study, we face a panel of experts called an IRB (Institutional Review Board). They review our hypothesis, proposed methodology, demographics, and inclusion criteria we intend to use in the study. This is our first peer review. If we don’t pass this review, the study is dead.

    While we conduct the study, we must be absolutely careful to follow the study protocol approved by the IRB. If we discover anything that needs to be changed in the protocol, we must stop the study and go back to the IRB to request a protocol change approval. We can’t simply say “we’ll make adjustments here and there to fix the problem.” The IRB holds another review. If we can’t get their approval, the study is dead.

    When the study is completed, we then submit it to the journal for publication. We are required to disclose all data and methods sufficient to reproduce our results. Typically, we create a resource package containing the database (in XLS format to facilitate import into any database), queries, and formulas. If formulas are calculated using computer programs, we supply the source code. There is no concept of hiding behind intellectual property in the disclosure – if we can’t provide the means to reproduce the study using our methodologies, the study is summarily rejected. By journal requirements, we must make the same package available to any doctor or center requesting it. We can charge a reasonable processing fee to defray the costs of providing the package.

    We do not choose who the reviewers will be. It is during this official peer review process that we must respond to any and all questions from the reviewers. Sometimes we are asked to include additional information in the abstract or fix up citations and other presentation issues. Assuming we pass the peer review, the abstract is published. All of the above applies even to retrospective studies (most climate studies are retrospective).

    From what I’m seeing, it appears that the climate journals have little to no independence in peer review and bow quite low to peer pressure. When people say “peer reviewed” in the context of climate, I laugh and remind them it’s a good ol’ boy network. Bring a case of Mann’s favorite beer and it’ll get published.

  121. This all may be a distraction, to keep our eye off the Copenhagen ball.

    Make no mistake, they have not given up. Quite the contrary.
    “Obama says ‘step closer’ to climate deal”

    http://news.ninemsn.com.au/article.aspx?id=975599

    They have given us the sacrificial goat, which has served its purpose. And while we have a feeding frenzy, the real beast walks by unhindered, and barely noticed.

  122. In less than 8 hours, google hits on “Climategate” have gone from 160,000 to 24,200,000. Hockey stick anyone? Isn’t 24,200,000 about the same as WUWT hits? Coincidence or WATT?

  123. The bug shown traipsing across the code (in the picture) looks a lot like the assassin beetles that buzz into our house every summer. If so, this picture is apt.

  124. Wow, instead of cutting out the lines before 1400, they should have used a low-pass filter. They know how to use a high-pass filter already, so why not a low-pass filter?

    How hard is it really to read a thermometer? From what they say, the temperature read on any thermometer is not the real temperature… that’s news to me! I better get a copy of their code because I have many thermometers, RTD and thermocouples in my lab.

    Damn it, for all my life I thought water froze at 0 Celsius and boiled at 100 Celsius at 1 atmosphere… got to go back to school to learn the new rules provided by this bunch of people.

  125. Proof again (if any was needed) that HADCRUT is untrustworthy for climatology! We should rely only on satellite temperatures from 1979 onward, since their provenance and processing are better documented, better maintained, and independently validated (UAH vs RSS).

  126. All this blog communicating/ranting is fine, but I’m here to tell ya that the Copenhagen ’support change’ bits are rolling on the radios. Sirius Left channel 146 is running them big time. Bill Press, Alex Bennett, Thom Hartmann, Lynn Samuels, Mark Thompson, Mike Malloy and others… google ‘em up, call ‘em up, get email addresses from the Sirius Left site and give ‘em a hard time. I do, every day!

    They hang up on me, they know me too well, but you smart people can get through to a real load of people at a perfect time. They are a lot of fun to mess with.

  127. this is so bad.

    the global financial ramifications of this fraud are incomprehensibly huge. even when one considers damages incurred to date, let alone the future damages of pending legislation, there are at least hundreds of billions of dollars that have been bilked from taxpayers worldwide to fund this insidious mess. it must be the largest single fraud ever perpetrated.

    anybody an expert on class action lawsuits to recover damages and put the fraud that is global climate change on trial?

    at least in that case the world may be able to subpoena these groups to get at the truth once and for all…

    makes one sick to think what a handful of politicians and “scientists” have been able to do to all of us and nearly every industry.

    as a builder, the entire LEED certification process is nearly fully predicated on data and conclusions put forth by these groups, which has resulted in enormous additional costs on nearly every public construction project. what a sham.

    i’m all for sustainability, but fraud is fraud.

    -patternbuilder

  128. When I was a student, we had to turn in our source code for our projects.

    We couldn’t just turn in some output and say “see, I got the right answer”. The source code and the output was evaluated. The code had to be properly commented.

    It’s obvious these programmers never expected any outsiders to view the source code. Is the science as sloppy as the code? By all appearances, yes.

  129. Seems to me that “calibration” of tree ring data is a bit of a joke.

    For the modern periods, where we have the highest-quality records available, we see the “recalibration” where the numbers get “fudged” (the scientific term used in the code is “fudge factor”) in order to make a “calibration” work that leaves the pre-1900 data reflecting cooler temperatures than if the data had been calibrated to post-1900 data.

    Not only do we have the “hide the decline at the end” (which can be argued as a plausible treatment if young tree rings are somehow not “ripe”), but the pre-1900 manipulation is just the sort of thing you need to do in order to straighten the handle of the hockey stick, isn’t it?

  130. New Zealand Icebergs —
    More than 100 icebergs that were first spotted off the coast of Macquarie Island, an Australian territory around 900 miles southeast of Tasmania, are now thought to be only 200 miles away from New Zealand’s south coast.

    This is only the second time in 78 years that large Antarctic icebergs have been sighted so far north.

    “While the size of the icebergs has attracted a lot of attention, it is not unusual for icebergs to be found in these waters,” a spokesperson for Maritime New Zealand told CNN, who continued to say that alerts for smaller icebergs are not uncommon.

    But a half-kilometer wide iceberg visible from New Zealand’s coast would represent a very rare occurrence.

    “An iceberg that size this far north is pretty significant,” Philip Duncan, Head Weather Analyst of the New Zealand-based Weather Watch Center told CNN.

    It is thought that the current flotilla of icebergs came off the Ross Ice shelf between 2000 and 2002, the same period that produced the 2006 icebergs.

    The question now is what caused the huge fresh water icebergs to break off from an Antarctic Ice shelf and what has allowed them to travel so far north.

    “A lot of people are saying it was due to a very cold snap a few years ago in Antarctica that caused more ice than usual and the outer regions of that ice snap off each summer,” said Duncan.

    SOOOO – despite all the global warming claims, Antarctica was suffering very cold snaps in 2000 to 2002, enough so that more ice and snow was deposited, leading to large icebergs breaking off. In other words, all the warming nuts who point to icebergs and scream “The ice is melting, we’re all going to die” failed to check the weather, which was COLDER, had more ICE, and naturally led to more bergs breaking off the Ross Ice Shelf.

  131. Ironically, the University of East Anglia has a Computer Science department:

    Welcome to the School of Computing Sciences, CMP. With about 35 staff and 500 students we conduct research and teach in the fields of Computer Science, Business and Electronics. We have a consultancy company, SYSCO, and many of our staff have close links with industry.

    Computing science graduates are well paid and our graduates are among the best paid from UEA and can be found in almost every employment sector all over the world.

    The fact is that they have the expertise on campus to engineer a decent piece of software. I’m not saying it would necessarily work that way, of course. Most of the good practice in design and implementation I have learnt since leaving university, not while I was an undergraduate. In industry you literally won’t have a paycheck if things don’t work.

    The problem is that nobody outside of a small circle of users was asked to audit the software, or, I suspect, to contribute to its design or development. It’s quite stunning that its output is being used as “evidence” (cast iron!) forming the basis of trillion dollar government programmes.

    But anyway, we don’t know that the program is broken. It probably produces the desired output ;).

  132. The more I read about this, the more disgusted I become. While in college I studied under Chris McKay somewhat (Dr. Terraforming), and the “tricks” that keep showing up here are items that he told me to not do. He always told me to go from “base principles instead of pinning.” What it seems like to me, is that the observed data doesn’t fit the model, so the data is artificially pinned to meet the expectations of the model. Disgusting.

  133. ; artificially removed (i.e. corrected) the decline in this calibrated

    ; we know the file starts at yr 440, but we want nothing till 1400, so we
    ; can skill lines (1400-440)/10 + 1 header line
    ; we now want all lines (10 yr per line) from 1400 to 1980, which is
    ; (1980-1400)/10 + 1 lines
    (…)
    ; we know the file starts at yr 1070, but we want nothing till 1400, so we
    ; can skill lines (1400-1070)/10 + 1 header line

    Love the euphemisms. “Corrected…” “We can skill lines…?”

  134. While it’s not over by a long shot, this is a taste of vindication for all of those who refused to worship at the altar of Gaia!!! And cheers to the people here and elsewhere who do the legwork for so many of us. Keep fighting the good fight!!!!

  135. “Pieter F (09:16:40) :
    … why won’t the mainstream media report on the matter?”

    …Because they’re all waiting for each other to be the first to break the story properly, which will take a huge investment and a massive gamble with their credibility to pull off.

    When the Telegraph exposed our MPs’ expenses they were assigning up to 60 journalists to cover it, and check all the facts before each publication. That’s why all the main papers are still only putting the CRU fraud in sidebars and in non-staffers’ opinion pieces, and not in their headlines.

  136. RE Hank Hancock and Viv Evans

    Hank is a perfect example of what it’s like when you’re actually held accountable for your work. Note Hank’s field: Medical Research. I agree Viv, these are the standards that should be met. But quite obviously they aren’t always met, or none of us would be here, right now, reading this stuff.

  137. Wow.

    *** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
    ; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***

    Just…wow….

  138. Folks, I do forecasting for an XX billion $$ corp. The program is written in FORTRAN and has over a million lines of code. It was written by master programmers and math geeks. The program is documented in 10 volumes, and the code contains extremely well-defined notes for variables and process outcomes. This was built by a private corporation to manage X billions in yearly expenditures.

    To see and read anything that looks like this mess is quite disturbing. The company would go broke with this type of programming.

    BTW – Anthony, this is the best site on climate I have ever seen (for 2 years now), and to all you other science geeks, hats off for your contributions. From the girl in the northwest to the guys who can’t figure out what the sun really does to our climate, I really enjoy and learn from your inputs and knowledge. I am anal about being analytical, and one of these days, one of these days, Alice – bang – to the moon. I’ll contribute to this most excellent adventure.
    Timothy

  139. Swiss Bob (10:38:34) :

    For those of you who don’t know, the CLOUD experiment at CERN is meant to test the connection between cosmic rays (mostly from galactic sources such as supernovae) and cloud nucleation, and hence cloud formation. The rate of cosmic rays entering the atmosphere is modulated by the sun’s magnetic field, carried outward by the solar wind. During strong sunspot activity the solar magnetic field strengthens, shielding the Earth and limiting the influx of cosmic rays. When the sun is quiet (as it is now) that shielding weakens, allowing in more cosmic rays. To summarize the theory:

    Quiet Sun -> more cosmic rays -> more clouds
    Active Sun -> fewer cosmic rays -> fewer clouds

  140. Got to nail these guys, otherwise, to use an apt quote “For if we fail, the whole world will sink into a new Dark Age, made more sinister, and perhaps more prolonged, by the light of a perverted science.”

  141. Question for the warmists: if you saw data handled like this in the building of an aeroplane, would you fly in that aeroplane?

  142. *************************************
    Robinson (11:58:42) :
    The fact is that they have the expertise on campus to engineer a decent piece of software. I’m not saying it would necessarily work that way, of course. Most of the good practice in design and implementation I have learnt since leaving university, not while I was an undergraduate. In industry you literally won’t have a paycheck if things don’t work.
    **************************************
    Judging from the comments, it probably isn’t the programmer that’s bad, it’s the data. It appears he was being pushed to achieve a certain outcome. If he was disgusted and frustrated enough, he just might do something a little rash like push the code and other info into the wild.

  143. Connected to the station count problem, this was one of my jaw-dropping moments so far in HARRY. It confirms the problem I diagnosed in GHCN concerning station “sphere of influence” effects.

    “Worked out an algorithm from scratch. It seems to give better answers than the others, so we’ll go
    with that. Also decided that the approach I was taking (pick a gridline of latitude and reverse-
    engineer the GCD algorithm so the unknown is the second lon) was overcomplicated, when we don’t
    need to know where it hits, just that it does. Since for any cell the nearest point to the station
    will be a vertex, we can test candidate cells for the distance from the appropriate vertex to the
    station. Program is stncounts.for, but is causing immense problems.

    The problem is, really, the huge numbers of cells potentially involved in one station, particularly
    at high latitudes. Working out the possible bounding box when you’re within cdd of a pole (ie, for
    tmean with a cdd of 1200, the N-S extent is over 20 cells (10 degs) in each direction. Maybe not a
    serious problem for the current datasets but an example of the complexity. Also, deciding on the
    potential bounding box is nontrivial, because of cell ‘width’ changes at high latitudes (at 61 degs
    North, the half-degree cells are only 27km wide! With a precip cdd of 450 km this means the
    bounding box is dozens of cells wide – and will be wider at the Northern edge!

    Clearly a large number of cells are being marked as covered by each station. So in densely-stationed
    areas there will be considerable smoothing, and in sparsely-stationed (or empty) areas, there will be
    possibly untypical data. I might suggest two station counts – one of actual stations contributing from
    within the cell, one for stations contributing from within the cdd. The former being a subset of the
    latter, so the latter could be used as the previous release was used.

    Well, got stncounts.for working, finally. And, out of malicious interest, I dumped the first station’s
    coverage to a text file and counted up how many cells it ‘influenced’. The station was at 10.6E, 61.0N.
    The total number of cells covered was a staggering 476! Or, if you prefer, 475 indirect and one direct.”
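    Harry's vertex test can be sketched in a few lines. This is an illustrative reconstruction, not his stncounts.for: the haversine formula stands in for whatever GCD routine he used, and brute-forcing the whole globe sidesteps the bounding-box problem he describes (at the cost of speed).

```python
import math

R_EARTH = 6371.0  # mean Earth radius, km

def gcd_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula, in km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R_EARTH * math.asin(math.sqrt(a))

def cells_influenced(stn_lat, stn_lon, cdd_km, cell_deg=0.5):
    """Count grid cells whose nearest vertex lies within cdd_km of the
    station -- the vertex approximation Harry states (the true nearest
    point of a cell can lie on an edge, but this is his test).
    Brute force over the whole globe; a real version would bound the
    search, which is exactly the nontrivial part Harry describes."""
    count = 0
    for i in range(int(180 / cell_deg)):
        lat0 = -90 + i * cell_deg
        for j in range(int(360 / cell_deg)):
            lon0 = -180 + j * cell_deg
            verts = [(lat0, lon0), (lat0, lon0 + cell_deg),
                     (lat0 + cell_deg, lon0), (lat0 + cell_deg, lon0 + cell_deg)]
            if min(gcd_km(stn_lat, stn_lon, la, lo) for la, lo in verts) <= cdd_km:
                count += 1
    return count
```

    With a precip-style cdd of 450 km, a station at 10.6E, 61.0N comes out influencing several hundred half-degree cells, which is the order of magnitude Harry reports; the exact figure depends on details of his distance routine that the notes don't give.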

  144. Ray (12:20:28) :

    Only in their world that tree-thermometers are better than real thermometers. Amazing

    Yep. The tree ring data from one tree has the power to overrule satellite and human-gathered readings. You betcha.

  145. As a software engineer I ask: where are the test cases that prove the proper functioning of all this code? Surely there could be test data sets that could be fed in to check that the output is as expected, before applying the code to real data? Tests should cover handling of missing stations, duplicated stations, and wildly varying Tmin/Tmax values that should generate alerts.

    If this is not done, then a simple programming bug introduced when modifying code will remain hidden and be very difficult to discover. Without test cases, only code reviews or blind luck would catch it.
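    A regression test of the kind the commenter asks for can be tiny. The gridding routine and fixture below are hypothetical stand-ins, not CRU code; the point is that a synthetic data set with a known answer turns silent bugs into loud ones.

```python
def grid_mean(stations):
    """Average anomalies per grid cell, skipping missing values (None)
    and collapsing duplicate station IDs. Hypothetical stand-in for a
    CRU-style gridding step."""
    by_cell = {}
    seen_ids = set()
    for stn_id, cell, value in stations:
        if stn_id in seen_ids:        # duplicated station: keep first only
            continue
        seen_ids.add(stn_id)
        if value is None:             # missing value: skip, don't zero-fill
            continue
        by_cell.setdefault(cell, []).append(value)
    return {cell: sum(v) / len(v) for cell, v in by_cell.items()}

# Synthetic fixture with a known answer: one duplicate, one missing value.
fixture = [
    ("A1", (0, 0), 1.0),
    ("A1", (0, 0), 99.0),   # duplicate ID must be ignored
    ("B2", (0, 0), 3.0),
    ("C3", (1, 0), None),   # missing value must not become 0.0
]
assert grid_mean(fixture) == {(0, 0): 2.0}
```

    If a later edit quietly starts zero-filling missing values or double-counting duplicates, this assertion fails immediately instead of corrupting real output.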

  146. Jonny B. Good (12:22:06) :

    Purchase proprietary data products from independent sources: $12,000
    Pay programmer to adjust 1940’s and hide the decline: $85,000
    Pay publishers to publish the results: 1 case of beer ($18.95)
    Travel to Tahiti: $9,462
    Take Andy Revkin out to dinner: $350 (including wine and cheese)
    Senate investigation: priceless!

  147. I’m starting to think “climategate” is going to turn into a $100,000,000 (or so) grant to CRU for software engineers to clean up the mess. Jones will be the sacrificial lamb, but he’ll walk with a huge severance package, since he brought the huge grant in.

    Never let a good crisis go to waste, you know.

  148. Someone should see if MythBusters would like to take a crack at reconstructing the series, just for kicks.

    REPLY: There’s no explosions or high speed crashes involved, doubtful they’d be interested. Climate science can be excruciatingly dull compared to TV science. – A

  149. Mark (11:33:11) :

    “I’d like to know what the “decline” is. Is it temperature?”

    I’ll take a stab at your question, with an offer to have anyone else correct it. The decline is not in the temperature record, but in the proxy data (tree rings, etc.). During a “calibration period”, the temp data is matched up to proxy data for good correlation. The CRU problem is the well-known “divergence” that occurred between the proxy data and the temperature record, i.e., the temperatures went up, but the proxy data went down. To hide that decline in the proxy data, the temperature was spliced onto the end of the proxies at a convenient point (1940-1960). Voila, no more divergence.

    Which begs the question: if the proxies diverged after this period, why couldn’t they have diverged before the era of instrumental temp records? Any takers?
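    A quick way to see the divergence described above is to correlate proxy against temperature in two windows. The series below are entirely invented; only the shape of the check matters.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Made-up annual series, 1900-1999: temperature rises throughout, while
# the proxy follows it until 1960 and then declines (the "divergence").
years = list(range(1900, 2000))
temp = [0.01 * (y - 1900) for y in years]
proxy = [t if y < 1960 else temp[59] - 0.02 * (y - 1960)
         for y, t in zip(years, temp)]

early = pearson(temp[:60], proxy[:60])  # pre-1960: proxy tracks temperature
late = pearson(temp[60:], proxy[60:])   # post-1960: proxy diverges
```

    The catch the commenter identifies is that before the instrumental era there is no temperature series to correlate against, so a pre-instrumental divergence would be undetectable by this kind of test.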

  150. Of the people who work there, which names do not appear in the emails, and which names do not appear as part of a program or data filename?

  151. The Slashdot comments have an interesting link to a Finnish TV documentary in which the reporters discovered the ‘climate scientists’ had flipped a temperature chart that was showing cooling so that it showed warming instead.

    How do you say Gotcha! in Finnish?

  152. We ‘programming geeks’ try to comment our code so we can understand it when we go back to revise it, sometimes years later. Since we expect that we are the only ones who will ever read it, we tend to be very honest in our comments, especially “why” we include/exclude/fudge something.
    Fudge factors are normal when trying to account for ‘real world’ data with an imperfect model. Anyone involved in modelling knows this. You have to have some way to account for factors that are not understood. The idea is to find a way to bring the results of the model into line with the real world’s data. It is never supposed to be used to twist the data to create an artificial world, which is what this nightmare is trying to do.

  153. Ron de Haan (12:15:08) :
    Great link Ron.

    You’ve got to love Singer.

    “The Climategate disclosures over the past few days, consisting of some thousands of emails between a small group of British and US climate scientists, suggest that global warming may be man-made after all – created by a small group of zealous scientists!”

  154. Averaging day and night temperatures has been reported as enhancing perceived warming in England, as daytime alone shows less, or no, warming.
    This was highlighted this summer, which was dull and miserable. Our esteemed Met Office (see list of usual warmist suspects) put out some spin that it was warmer than usual when they included the night.

    As trees grow in the sunlight while eating CO2, perhaps the tree ring proxies reflect only daytime conditions. This would make them different from the “real” temperatures if these included the night. This might hide any decline in daytime growth of trees if spliced on after 1960.

  155. yep – I noted the past tense without surprise. Maybe in academia you could do that in code, but not as a professional in the private sector. Not for long anyway.

    Robinson (11:17:25) :

    I’ve done some programming myself. I used to put in swear words in the code all the time.

    In fact, I would even use the F word and variations thereof for variable and object names, LOL.

    That isn’t considered very professional in my business, to be honest.

  156. M.A.DeLuca (10:42:18) :

    “Hysteria, I’m not so sure. O’Reilly has, I believe, said that Global Warming is real and something needs to be done about it. So did McCain during his bid for the Presidency. That indicates a fraction of the conservative base had been convinced this was a real issue that needed to be addressed.”

    I saw a comment somewhere (here?) a couple of days ago that McCain has, in the past two months, backed off from his support of CAGW.

  157. Sorry,

    Statistics sans Frontières (10:24:11)
    juan (10:35:09)

    ‘I’d imagine that was due to an overflow.’
    This type of incident is clearly signaled with a different error message, and the programmer would know; at least I hope he knows what he is doing.
    A very, very negative number is still a number, not an overflow!
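    The commenter is right that IEEE floating-point arithmetic cannot make a sum of squares negative (it would grow toward +Inf instead). A persistently negative sum of squares is, however, the classic signature of signed 32-bit integer wraparound. The sketch below simulates Fortran INTEGER*4 behaviour in Python; it is one possible explanation, not a diagnosis of the actual anomdtb.f90 bug.

```python
import math

def wrap32(x):
    """Reduce x to a signed 32-bit integer, as Fortran INTEGER*4 would."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

def sum_of_squares_int32(values):
    """Sum of squares with 32-bit wraparound applied at every step."""
    total = 0
    for v in values:
        total = wrap32(total + wrap32(v * v))
    return total

# 50000**2 = 2.5e9 exceeds 2**31 - 1, so the square silently goes negative:
assert sum_of_squares_int32([50000]) == -1794967296
# Floats behave differently: they lose precision or blow up to infinity,
# but a float sum of squares can never turn negative.
assert math.isinf(1e200 * 1e200)
```

    So both commenters can be right: it is not a floating-point overflow, and it is still almost certainly an arithmetic-width bug rather than meaningful data.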

  158. Ray (12:20:28) :

    “Only in their world that tree-thermometers are better than real thermometers. Amazing!”


    it is actually much worse than that.

    Certain special kinds of tree-thermometers that show the desired signal (identified b/c they show the desired signal) are better than actual thermometers. In the 1960s these special tree-thermos stop showing the desired signal and real thermometers become better.

    There is no exaggeration in the above statement. No need for it. It is absolutely unbelievable to me that this crap was published.

  159. It is as if climate research is a small field that someone decided to exploit for political purposes. But instead of doing things professionally, they just made up data and fudged methods. The fact that it was fudged together on a shoestring by some researcher up all night is all the more to the cause’s benefit. If they’d decided to do things rigorously and professionally, there would be no results to speak of; they’d have to say, “come back in 10 years and see if we have anything we can reliably speak of then.”

  160. But these emails were gotten illegally and CA others are pseudo and non-scientific blogs and Mandia was right.

    Thus saith an AGW proponent in a comment on my blog.

    LOL

  161. People are already talking about the legal implications of all this.

    Joanne Nova makes a salient point when she says: “Australia is in the extraordinary position of passing legislation that is known to be based on fraudulent science”.

    Canada Free Press says ‘Greens to be held to account‘.

  162. I remain unconvinced that this story will ever see the light of day in the MSM. The Canadian and US governments are well on track to introduce a 20% decrease in CO2 by 2020, and I think Copenhagen will move us dangerously close to a global agreement in principle. The science no longer matters. The money, research and legislation already dedicated to this will pass by sheer inertia. We are too late to stop it. I just talked to my local MP here in Canada and he was so politically evasive it’s not even funny, and he is a die-hard conservative. I got the real sense that the fix is in and nothing can stop it. Too little, too late. Only one thing remains. Mass revolt! By mid January we will see people in the streets, perhaps even dying over this issue! The Copenhagen agreement is the most draconian shift of power and sovereignty I have ever read. Folks, you are on the edge of losing all your rights!

  163. I have done my fair share of HyperCard programming (loved that language). HyperTalk supports most standard programming structures such as “if-then” and “repeat”. The “if-then” structure is so flexible that it even allows “case” structured code. The code “F*** This” seems to be missing its “If…” part. What comes after the “Case if…” in order to fill in the “…then F*** This” statement?

    I can think of a few “If” fill ins.

    “Case if field (found out) then (F*** This)” comes to mind.

    And my apologies for very rusty HyperTalk. It has been FOREVER since I have used it. And to really show my age, I cut my computer teeth on a WANG whose motherboard took up the entire basement wing of the old VA hospital in Portland. Those were the days. The old WANG terminals wouldn’t let you use swear words. If you did, the programmer had put in subroutines that gave you a lecture on using foul language at work.

  164. Roger (10:33:00) :

    I have worked as a professional programmer for more than 20 years, and I think that the language in these comments is strange, to say the least. I mean – I have often been swearing over poorly documented spaghetti code – (almost) as bad as this one – but I have NEVER put the swearing into writing. In my opinion, this stinks. It seems that the author of these comments WANTED the world to see them. He is certainly writing to another audience than his fellow programmers. So there are two possibilities: 1) Either, the programmer (Harry?) is the whistleblower, or 2) This is a trap.

    I suggest you try working for a defense contractor on DOD, DHS coding projects then. You obviously need to get out of the house more often.

    This is no trap. I sympathize with Harry completely. Been there, done that. This is nothing new…

  165. I may not understand any of the arguments, but never have I been more convinced about something than I am here. This is shocking.

  166. Has anyone noticed this reasoning regarding FOIA in the file FOIA\jones-foiathoughts.doc?

    “Options appear to be:

    1. Send them the data
    2. Send them a subset removing station data from some of the countries who made us pay in the normals papers of Hulme et al. (1990s) and also any number that David can remember. This should also omit some other countries like (Australia, NZ, Canada, Antarctica). Also could extract some of the sources that Anders added in (31-38 source codes in J&M 2003). Also should remove many of the early stations that we coded up in the 1980s.
    3. Send them the raw data as is, by reconstructing it from GHCN. How could this be done? Replace all stations where the WMO ID agrees with what is in GHCN. This would be the raw data, but it would annoy them.”

    The second option seems quite revealing….

  167. Robinson (11:58:42) :

    Ironically, the University of East Anglia has a Computer Science department

    Apparently, according to the emails, this dept. has had more FOI requests than CRU!

    Go figure!

  168. I heard some time ago that there is no such thing as a global average temperature. The idea is meaningless, and about as useful as calculating the average phone number from the phone book. It looks like the folks at the CRU have produced numbers that are about as useful as the average phone number.

  169. ” 1Spectre4U (10:31:54) :

    Now I am convinced it was an inside job.”

    Imagine that you wanted to evade an FOI request, and all future ones: you would do a round of cleaning of programs, data files and emails.
    BUT you would want to make sure you didn’t throw anything important away.
    So have a dedicated recycling bin. Put everything potentially dodgy in there, and go through it to make sure that you are not throwing away anything you might need.
    Put programs in there; then, one at a time, upload them and read all the readme comments. Keep the ‘censored’ one.

    Someone could raid the trash and find out what is being thrown away.

  170. “I have worked as a professional programmer for more than 20 years, and I think that the language in these comments is strange, to say the least. I mean – I have often been swearing over poorly documented spaghetti code”

    Having written hundreds of thousands of lines of code, much of it in, believe it or not, QBasic, I can assure you, these comments in the code don’t shock or surprise me. I wrote much worse in my code. When it’s 3am, you’ve been staring at a screen for 40 hours straight and you are trying to debug subroutines someone else wrote, sometimes in a different country, trust me, you’ll write some serious stuff in the comments!

  171. RE:

    Mark (11:33:11) :
    I’d like to know what the “decline” is. Is it temperature?

    Hi Mark,

    No. It is a decline in a series of values that are supposed to match temperatures. Actually, these numbers are functions of some tree ring width or density. They use these numbers as ‘proxies’ for old temperatures, i.e., numbers that are supposedly the best they can come up with for temperatures, given that there were no thermometers then…

    These proxies do a decent job of matching actual temperatures for the earlier part of the short period (a couple of centuries) for which we have thermometer data. But somewhere in the middle of the century, there is a problem. The proxies and the temperature take very different paths, temperatures going well up and proxies going down. They don’t pick up the last 50 years of warming at all. That’s why many question their use as proxies for temperature going back 2000 years: how do we know if trees picked up warming signals then if they do not now?

    This is called the ‘divergence problem’. Climate Audit has nice posts on this. I know that Climate Audit can be a tough read for the non-scientifically inclined, but it is the best site out there for these questions.

    for example:

    http://www.climateaudit.org/?p=570

  172. Here’s a link to the BBC story on the CRU “hack”:

    http://news.bbc.co.uk/2/hi/science/nature/8370282.stm

    This was originally posted at the start of last weekend. It went from
    being a “Science and Environment” entry to “Technology” to both
    and now it’s buried again under the “Technology” header.

    So far, the text hasn’t changed from when they first posted it.

    Stop by there via the link to beef up the internal “hit” counters to
    keep even this pitiful bit of BBC coverage active.

  173. I found the problem with “APPLY ARTIFICIAL CORRECTION”
    that I couldn’t find in my downloaded file
    FOIA\documents\osborn-tree6\briffa_sep98_d.pro

    In my file it’s in \documents\harris-tree\briffa_sep98_e

    This file starts out with the comment:

    ; PLOTS ‘ALL’ REGION MXD timeseries from age banded and from hugershoff
    ; standardised datasets.
    ; Reads Harry’s regional timeseries and outputs the 1600-1992 portion
    ; with missing values set appropriately. Uses mxd, and just the
    ; “all band” timeseries
    ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********

  174. Petition the Law Makers to Stop this Scam
    Take it out of EPA’s Hand

    They say we’ll lose a little Freedom, what’s the fuss
    Freedom is Cheap there’s plenty of it
    It was only bought with others’ blood

    The Stick is broken Open your eyes
    The MWP was grafted Upside Down
    Real Temperatures can’t be found
    Stop believing, The IPCC lies

    Set my CO2 FREE, STOP insanity
    I Will Not go down that California Road to Prosperity

    The Question remains, the Question is bound
    What side will You be on, LAW MAKER
    As the Team goes Down

    Will You fight for my Liberties
    Or sell your Soul for a piece of gold?

    So, make your case CO2, that “Evil Gas”, Rules the Climate if you can
    Or declare CO2 a non pollutant
    Petition the Law Makers to Stop this Scam

    By FIRE ANT

    So, Harry read_me, have you sold Your soul for a piece of gold?

  175. Phil Jones is the director of the CRU. His own email correspondence (even before this leak) said that the CRU had lost original raw data files (even though CRU was founded, in part, to document and create a record of historical temperatures). Comments in the source code indicate they lost their entire database of cloud data prior to 1995. Comments in the HARRY_READ_ME file show that they had no source code management system in place – taking 3 years to get the code working again after the original author disappeared (died perhaps?). The source code itself, used to collect and process temperature data (I am a s/w engineer and have been staying up ’til midnight each night to do preliminary source code reviews), is awful and does not meet any semblance of modern software quality standards that inspire confidence in accuracy or maintainability.

    The internal organizational culture, per the emails, is a dysfunctional mix of paranoia and tribalism. (I have an MBA too and have had a lot of training on organizational behavior issues and dysfunctional management)

    He must be held accountable for his own and his staff’s loss of data and poor quality tools. (Would you fly in an airplane if the plane’s aeronautical design model was written like this code?)

    On the basis of his utter failure as a manager, he must be fired.

    Even George Monbiot is now calling for his removal within days.

  176. “”” Adam Sullivan (11:58:14) :

    Seems to me that “calibration” of tree ring data is a bit of a joke.

    For the modern periods where we have the highest quality records available we see the “recalibration” where the numbers get “fudged” (the scientific term used in the code is “fudge factor”) in order to make a “calibration” work that makes the pre 1900 data reflect cooler temperatures than if the data had been calibrated to post 1900 data. “””

    A tree of any significant (climatically) size, is a relatively voluminous three dimensional object.

    If you core drill such a tree to obtain a sequence of samples of tree rings, it is the sampling equivalent of sticking a drill into some arbitrarily chosen rock face at some altitude somewhere in the Rocky Mountains (well, maybe the Alps if you are European), studying the rock samples from the surface to the extent of the drill, and then claiming to know the age, or temperature, or some other historical information relating not only to those mountains but also to a very large area surrounding them.

    Anyone who has seen a sizeable tree cross-section in a museum or a park display can clearly see that, even in a single section, the tree rings are far from uniform thickness, and one would expect every other property of the material to change around the tree, depending on whether it was on the sun side or shade side of the trunk. Throw in the additional change of the section with height, and you can see that a core sample is an extremely poor sample of a large three-dimensional object.

    About the one thing we can say about that core sample, is that we are fairly confident of the age of each ring layer. Then certain compositional changes in each layer could be indicative of other parameters, such as C14 dating each ring sample, to get a picture of C14 production rates, since the real age of the sample is known.

    But as a thermometer; try putting a wooden tongue depressor in your mouth, and then asking the doctor to look at it to see if you are running a fever.

    A totally crazy idea to believe that tree rings through thick and thin can, tell the temperature under which they were laid down; uncorrupted by any other variable, such as available water, sunlight, soil nutrients and so on.

    But for a real crazy idea; pray tell me how a sum of squared values (of a real physical variable presumably) can ever go negative; let alone continue to do so.

    Has anybody ever actually read, on a real instrument with the suffix “meter” or “ometer”, an imaginary number or even a complex number? What physical processes yield observable, actually measurable values that are imaginary or complex?

    Engineers at least tend to take the position that if they can get an answer (mathematically), it must be the real answer; so they don’t worry much about existence theorems. No engineer could care less whether an apparently convergent infinite series provably converges; but mathematicians feel they have to prove that it does.

    So nyet! on your increasingly negative sum-of-squares parameter; it’s a gremlin in your spaghetti code.

  177. If this is what Dr Phil had to do to get a graph that looks like the American temp trends we’ve really got to wonder which cherries have gone into their pies.

    That’s another way of saying: if CRUtemp has such a tenuous grasp on reality, and it’s the same as the other 2 leading temp indices, then presumably they must be wrong too?

  178. BOTO (09:17:11) :

    Hi Anthony,

    last dinner at Copenhagen!
    I love it!

    Great picture, but I have a question. Which one is Judas?

    I must object I really must. This artwork does NOT include Prime Minister Kevin Rudd of my country Australia ( the one with the dodgy data – see above).
    Now Kevvy is a Friend of the Chairman at Copenhagen and -as he is wont to do – a great strutter on the international stage especially when the big fellas (like Obama) are there too. He believes unquestioningly in the IPCC and is about to wreck the country that 2 generations of my family have fought for with a Carbon Pollution Reduction Scheme law.
    This cretin really deserves to be in the picture – so please fix!
    PS He would do a beautiful JUDAS.

  179. All these nasty interpretations of the CRU programmers’ notes are really just a simple problem of linguistic translation. You folks simply do not understand “AGW speak”. Let me enlighten:

    “Artificially adjusted” means “teasing a signal out of random noise – I’ll know when I’ve found it by the smile on Dr. Jones’s face”.
    “Real” means “something the general public can measure with their own thermometers”.
    “Very, very negative” means “very, very positive” (it’s sarcasm, you dolts!)
    “Dummy stations” means “smart stations – the ones on which to place the greatest statistical weight – they weren’t slipped into the base for no reason,” (sarcasm again).
    “False references” is a slang term meaning “all data that ever came out of Australia”.
    “Oh fuck this” means “This body of work is so pure and elegant that I have developed an unnatural attraction for it.”

  180. Roger Knights (13:02:50) :

    M.A.DeLuca (10:42:18) :

    “Hysteria, I’m not so sure. O’Reilly has, I believe, said that Global Warming is real and something needs to be done about it. So did McCain during his bid for the Presidency. That indicates a fraction of the conservative base had been convinced this was a real issue that needed to be addressed.”

    I saw a comment somewhere (here?) a couple of days ago that McCain has, in the past two months, backed off from his support of CAGW.

    Your premise is wrong on two points:
    1) Conservatives do not have a list of talking points they must believe in.
    2) The term conservative means different things to different people.

    Senator McCain is a Republican, not a conservative. I don’t know many people who would call McCain a conservative, except for a handful of issues.

    O’Reilly has said it’s obvious pollution must be doing something to the planet — which shows he believes greenhouse gases are pollutants.

  181. 1) I don’t understand the technical issues referred to in the above notes and I suspect none of the people saying ‘this is proof of fraud’ understand them either.

    2) I can’t believe anyone is surprised by coders swearing and getting frustrated with the code they are working on. It doesn’t mean anything in itself, except that everyone hates their job sometimes.

    Conclusion: I am not able to draw a conclusion from this data.

    Anyone who thinks they can draw a conclusion from it is just seeing what they want to see.

  182. “Robinson (11:58:42) :

    Ironically, the University of East Anglia has a Computer Science department:
    ……………………………………………………………
    The fact is that they have the expertise on campus to engineer a decent piece of software.”

    I tried to get a computing mathematician/statistician M.Sc. student for a summer or for a project while I was part of UCL. No way. The departments were not keen on letting the little birds out of the nests to do actual problems. It was too difficult to mark their assessment if they actually helped scientists at the coal-face.
    There is a statistics unit at UCL, as in most other large universities, which will aid you with your stats. The main problem is the ‘road to Cork’ problem: too many scientists take their data to the stats people at the end of a study, rather than before they start. Slight changes in experimental design make all the difference to what statistics one can apply. I had a good experience with them; it is disconcerting when they tell you they are not interested in your data, but in what you want your data to be able to explain in a testable manner.
    We will have to go back to basics, I’m afraid, and teach Ph.D. students both ethics and statistics, right at the beginning, as it is apparent that the mentoring system has failed massively.
    Can you imagine what it must be like for the Ph.D.s and post-docs at UEA now? They have screwed up their whole lives by association.

  183. Just for the sake of fairness, please bear in mind that ‘hide the decline’ refers to the ‘divergence problem’:
    – the divergence problem is the fact that the tree-ring data doesn’t track temperature after about 1960.
    – tree-ring growth has actually declined since then in many, but not all, NH tree-ring data sets.

    – so the hockey team hides this by stopping the plots in 1960 or 1980, and also seems to mix in a bit of real temp, in order to give their proxy plots a nice up-tick at the end, as if to say ‘the plot would have continued to go up had we not stopped here…’

    – although the decline has been known about for 10 years or more (it is mentioned in the 1998 Nature article), the reasons for it are not known….

    – so I do think it is underhand to ‘hide’ this decline, even if it is ‘hidden in plain view’, so to speak….

    – one could speculate that if the proxy data doesn’t track temps reliably in the current period, then it may not track temps in historical times either…..

    Also, the programmer Harry is dealing with getting the HADCRU temp v3.0 going… not hockey sticks…
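    The stop-the-plot-and-mix-in-real-temp effect described above can be illustrated directly. Everything here is invented: a declining proxy, a rising instrumental series, and a padding step that splices instrumental values into the smoothing window so the truncated curve ends on an up-tick. It is an illustration of the described effect, not CRU's actual procedure.

```python
def smooth(series, window=5):
    """Centred moving average; the window shrinks at the ends."""
    half = window // 2
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - half): i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

years = list(range(1940, 1981))
proxy = [0.5 - 0.02 * (y - 1940) for y in years]  # invented declining proxy
instr = [0.03 * (y - 1940) for y in years]        # invented rising temps

cut = years.index(1960)
# Honest: smooth the proxy, then truncate at 1960 -- it still ends declining.
honest = smooth(proxy)[:cut]
# Padded: splice instrumental values in after 1960 before smoothing, so the
# values near the cut are pulled upward by the spliced-in real temps.
padded = smooth(proxy[:cut] + instr[cut:])[:cut]
```

    Both curves cover exactly the same years, yet the padded one finishes noticeably higher, because the smoothing window near the endpoint is averaging in post-1960 instrumental values.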

  184. Yeah, I know, I’m an a$$#0!e, but I’ll tell you what’s going to happen.

    The poor schmuck who wrote these comments is going to be blamed for the whole fiasco…UNLESS he can PROVE that he passed these complaints along to Jones. Otherwise Jones will be SHOCKED, SHOCKED, I tell you, to find out that these things were going on and he was not told.

    Whoever you are you’d better start looking for cover, because they are coming after you.

    Write it in stone.

  185. “One of the most damaging emails was sent by the head of the climatic research unit, Phil Jones. He wrote “I can’t see either of these papers being in the next IPCC report. Kevin and I will keep them out somehow – even if we have to redefine what the peer-review literature is!”

    Censorship and manipulation.

  186. Hey folks, we have another climategate. This one is down in New Zealand.

    http://nzclimatescience.net/index.php?option=com_content&task=view&id=550&Itemid=1

    “There have been strident claims that New Zealand is warming. The Inter-governmental Panel on Climate Change (IPCC), among other organisations and scientists, allege that, along with the rest of the world, we have been heating up for over 100 years. But now, a simple check of publicly-available information proves these claims wrong. In fact, New Zealand’s temperature has been remarkably stable for a century and a half. So what’s going on?” Researchers find records adjusted to represent ‘warming’ when raw data show temperatures have been stable.

    The pdf file is unbelievable. It should also be worldwide news.

    I wish we had “publicly available” information here in the USA. :-(

  187. The corollary to my 14:13:55 is that you will know they have his cojones in a vise when he starts waffling about how it wasn’t really that bad and people are “taking it out of context”.

    Can’t blame him.

  188. Probably been said elsewhere but:

    “Here, the expected 1990-2003 period is MISSING – so the correlations aren’t so hot! Yet
    the WMO codes and station names /locations are identical (or close). What the hell is
    supposed to happen here? Oh yeah – there is no ‘supposed’, I can make it up. So I have :-),”

    This is dynamite

    Bring on the inquisition

  189. ; Certain boxes that appear to reconstruct well are “manually” removed because
    ; they are isolated and away from any trees

    LoL.

    :-))

  190. I am a professional mathematician. My only experience with Fortran was a GE program built to compute the path of a sounding rocket (GEMASS).
    I make no claim to be a programmer, aside from my elementary excursions into computing pi to 2000 digits using an early version of Pascal, and some early experiments with the Mandelbrot set.
    The code which has been published is laughably bad. No wonder the authors refused all requests for the publication of the code.
    The problem is not in the emails. The problem is in the code.
    There is no evidence that the code was ever reviewed, either by a competent programmer or by a peer review agency. This is not science; this is a political crusade masquerading as science.
    When one writes code, it is for the purpose of uniform treatment of a collection of data, as one does with inventory code, accounts receivable code, or any code which processes and interprets raw data.
    That is, of course, all that a computer can do: a staggering number of computations in order to make sense of otherwise unintelligible data.
    Just think of the aerodynamic computations which are now routine: one builds a teraflop machine and tests all of the variations in air flow around an aircraft fuselage, obviating the necessity for model building and wind tunnel testing. The reason: making fiddling changes in the fuselage amounts to a trivial tweak of the code, rather than the skills of a model maker. Aircraft are now designed and built by CAD/CAM, because it is cheaper!
    When one fudges the code in order to alter the raw data, this is called fraud. This is as bad as the Nobel Prize given out a century ago for curing cancer.

    The programmers were not able to compute the distances between temperature sensing sites? Absurd. This is a trivial exercise in spherical trigonometry.
    The inability to use the appropriate language to handle text versus numbers?
    This is a very expensive joke, at our expense.
    If you cannot include in your documentation the use of unique temporary file cache names, the explicit format of your data files, the precise algorithm used to process the data, and the exact parameters which are used to smooth your data, you have garbage.
    Smoothing data is a routine exercise; it is used all the time in handling my favorite kind of data, which is the collection of recorded magnitudes of astronomical objects (the search for variable stars, invisible orbiting planets of other stars, and so on). All such data must be smoothed, because measurement is inherently subject to uncertainty of measurement (thank you, Gauss). The standard normal curve of error always applies.
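    A centered moving average, one of the simplest smoothers, illustrates the point; the window width is exactly the kind of parameter that must be documented (a generic sketch, not any CRU routine):

```python
def moving_average(data, window=3):
    """Centered moving average; the window is truncated at the ends."""
    half = window // 2
    out = []
    for i in range(len(data)):
        lo = max(0, i - half)
        hi = min(len(data), i + half + 1)
        out.append(sum(data[lo:hi]) / (hi - lo))
    return out
```

    Anyone reading the smoothed output needs to know the window width and the edge handling, or the result cannot be reproduced.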
    The measurement of temperature has only been possible for three centuries or less. When Thomas Jefferson first began his recordings, that was a new thing in the United States. Min-max thermometers are even newer, and thermoelectric temperature sensors newer still.
    All attempts to infer temperatures from before the first calibration of temperature on a numerical scale are just that: inferences. And unless one is prepared to separate and defend the effects of rainfall, cloud cover, human activity, animal activity, and volcanic activity, to say nothing of the other variables involved in plant and animal growth, we are left with huge gaps in what we know, as differentiated from what we infer.
    There is literally no excuse for this ridiculous trash. All of the participants must be immediately prohibited from any further employment in any scientific endeavor or publication in any scientific journal.

  191. Johnathan Dumas:
    “These proxys do a decent job at matching actual temperatures for the earlier part of the short period (couple of centuries) for which we have thermometer data.”

    There are some issues with this. Dendroclimatology has not progressed to the point where they can walk up to a tree, examine the external factors (treeline, soil, elevation, bark) and determine whether it will have a respectable chance of being a decent proxy prior to the coring. Nor is it typical to attempt a true calibration after the fact (by sequestering some data, etc.).

    So they tend to take far more samples than they end up including in the final analysis – because lots of the trees “Don’t appear to have any temperature signal.”

    When you combine that with the divergence of the self-same trees that were previously considered decent proxies as time progresses, you are running across a completely fatal flaw. It is a very strong sign that you’re observing some correlation – but don’t have true causation.

    They’re coring 100 trees (that all meet the “best criteria” for siting etc.) and picking the 10 with the best correspondence with local temperature, then sweeping the reevaluation of the exact same trees under the rug when they fail to continue correlating (see: Ababneh), as well as obstructing others who would like to recore the identical trees.

    They’re making the fallacy of assuming they have a valid proxy because it happens to line up sometimes.

  192. I found the line of code in the program that calculates temperature:

    Tw = 58 + 6 * Log(CO2/ 280) / Log(2)

    REPLY: It appears they are assuming a baseline CO2 value of 280 ppm.
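    Assuming the line reads as quoted, it is easy to check: at 280 ppm the log term vanishes, leaving Tw = 58, and each doubling of CO2 adds exactly 6 (a quick sketch; the function name is mine):

```python
from math import log

def tw(co2_ppm):
    """The temperature formula quoted above: 280 ppm baseline, +6 per doubling."""
    return 58 + 6 * log(co2_ppm / 280) / log(2)

# tw(280) gives the 58 baseline; tw(560) gives 64, i.e. six degrees per doubling.
```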

  193. Pieter F (09:16:40) :

    “It just gets better and better (worse for them, that is). The question remains: is this enough to overcome the momentum acquired by the AGW ”

    Not at all, it was never about the science, so why should it?

  194. “Michael Mann’s articles have been published in well-respected peer-reviewed scientific journals”

    English translation . . me and my buds do the peer-review-each-other thing and since each back is well scratched, we can have our papers published and keep out the papers of non members of our mutual admiration and back scratching club.

    Or something like that . . . and we do the same thing when we write the IPCC reports

  195. “Averaging day and night temperatures has been reported as enhancing perceived warming in England as daytime alone shows less, or no, warming.” – Wasp

    This rings bells – have read something similar about day and night temperatures before, probably at CA. Was it that increased night temperatures are an urban heat island effect? Arg, anyone remember this one?

    (Met Office was announcing today that 2009 begs to be hotter again – BBC are happy to jump in as if being “one of 10 hottest years” is unusual – when you’re on top of a big hill, every step is a high one!)

  196. Does anyone seriously think that a non-partisan review board can be put together to sort all of this out? 95% of those who would be appointed to it already believe there is no need, because “the science is already done”. The only way to shake things up is to come up with a serious class action suit, see it through the courts and get a big payout. Maybe some of the people squeezed out of jobs or blocked from publishing in professional journals might have a case??? After all, they have suffered professional and personal trauma. Also get the courts to mandate a review of all the information and findings. I am not a lover of our activist court system in the US, but they can mandate that things get done when the politicians refuse to move. And we have a Supreme Court that might be sympathetic.

  197. Someone asked what this Harry thing is all about. From my reading, he is trying to duplicate the results of the original 1.0 and 2.0 versions in a new version called 3.0. It appears that along the way they really did lose some combination of the data, the instructions on how it all works, the people who originally ran it (Tim??), and the code. My take is that Harry was assigned to take what code he could find and make v3.0. To do that he had to first recreate the original results; if he couldn’t reproduce them, it would be obvious that something was wrong. YMMV.

  198. Anyone interested in trees should read up on earthworms and their reintroduction into North America by the European colonists.
    Here is a gem;

    DOI 10.1007/s10530-009-9523-3

    Tree rings detect earthworm invasions and their effects in northern Hardwood forests
    Evan R. Larson, Kurt F. Kipfmueller, Cindy M. Hale, Lee E. Frelich, Peter B. Reich

    Biological Invasions (2009) ISSN 1387-3547 (Print) 1573-1464 (Online)

  199. If it’s that bad, maybe it’s time to start thinking about shooting ourselves. Anyone here got any good reason why not?

  200. Colin W (12:32:58) :

    As a software engineer I ask: where are the test cases that prove the proper functioning of all this code? Surely there could be test data sets that are fed in to check that the output is as expected, before applying the code to real data? Test the handling of missing stations, duplicated stations, and wildly varying Tmin/Tmax values that should generate alerts.

    If this is not done, then a simple programming bug introduced when modifying the code will remain hidden and be very difficult to discover. Without test cases, only code reviews or blind luck would catch it.

    Perceptive comment. I doubt the climate ‘scientists’ involved ever found time or advice for this sort of rigor.
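    For readers wondering what such a test case might look like, here is a minimal sketch; the function, the sentinel value, and the expected answers are all hypothetical illustrations, not CRU's actual interfaces:

```python
MISSING = 9999.0  # hypothetical missing-value sentinel

def monthly_mean(daily):
    """Mean of the valid readings; None if the whole month is missing."""
    valid = [t for t in daily if t != MISSING]
    return sum(valid) / len(valid) if valid else None

# Test cases with known answers, run before the code ever touches real data:
assert monthly_mean([10.0, 12.0, 14.0]) == 12.0      # complete month
assert monthly_mean([10.0, MISSING, 14.0]) == 12.0   # one missing day ignored
assert monthly_mean([MISSING, MISSING]) is None      # wholly missing month
```

    Three lines of assertions like these would have caught a sentinel-handling bug the moment it was introduced.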

  201. I just saw this issue discussed on CNN for the first time. The spin was amazing. They repeated the old claim that 2,500 scientists support AGW and mentioned that Obama was still planning on going to Copenhagen and committing to a 17% reduction in emissions by 2020 and 80% by 2050. They also quoted polls stating that 72% of Americans believe in AGW.

    So, the game is on …

  202. I hate to say it, but some of the code cited in FOIA\documents\osborn-tree6\briffa_sep98_d.pro does not exist in my copy of that file. Specifically the line which creates the densall object is not in that file. Also, if you look at that file you will see that the yearlyadj values are *NOT* used in the code. At one point they were, but in the extant version of the code that line is commented out.

  203. Robinson (11:58:42) :

    > Ironically, the University of East Anglia has a Computer Science department:

    > The fact is that they have the expertise on campus to engineer a decent piece of software.

    Yes, but do they teach software engineering?

    I like to distinguish between science and engineering as:

    Science is the act of developing new tools (e.g. wheels, thermometers, IR sensors)

    Engineering is the act of developing new systems out of tools created by scientists. (e.g. cars, weather stations, remote sensing satellites).

    I know several very good computer scientists who could never become good software engineers, they get distracted by new things and don’t focus on the system at hand.

    The lack of polish, the inattention to detail, the likely development on smaller datasets than what Harry is using, etc., show up throughout Poor Harry’s Diary. The overflowing sum of squares in the standard-deviation calculation is likely a good example.
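    The "very, very negative" sum of squares in the READ_ME is the classic symptom of the naive one-pass variance formula, which subtracts two large, nearly equal numbers; Welford's online update avoids the cancellation. A generic sketch of both (not Harry's actual code):

```python
def variance_naive(data):
    """Textbook one-pass formula: sum(x^2)/n - mean^2. Cancellation-prone:
    for data with a large offset, the result can even come out negative."""
    n = len(data)
    s, sq = 0.0, 0.0
    for x in data:
        s += x
        sq += x * x
    return sq / n - (s / n) ** 2

def variance_welford(data):
    """Welford's online algorithm: numerically stable, never negative."""
    mean, m2 = 0.0, 0.0
    for n, x in enumerate(data, start=1):
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    return m2 / len(data)
```

    On small, well-scaled data the two agree; feed the naive version ten readings near 10^9 and the cancellation error swamps the true variance, which is exactly the failure mode Harry describes.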

  204. I sincerely hope the police investigating the hack/leak from CRU have retained a backup of the CRU server at the time the crime was reported, so that this code can be verified to be exactly what CRU have been working with.
    Otherwise we might find in the course of the investigation that CRU’s server has unfortunately suffered a catastrophic and unrecoverable failure.

    Whoops.. global warming ate my hard drive.

  205. I’ve made a good living as a computer programmer, and I do stuff like that all the time. If you saw my code, you’d see comments like “The customer insists that I do this even though I know it makes the outcome skewed in a [positive/negative] direction.”

    In one case, I was adding on to an existing program that needed to find the volume of spheres, to calculate material loss (metal). The original program had the volume of a sphere as (3/4)(PI)(R³). Obviously, the fraction had been flipped by mistake. When I brought this to the customer’s attention, I was told to leave it alone! For if I changed it, the LOSS would’ve counted LESS (because the volume would have been correct, and higher), and thus the loss reimbursement (which translated into local stimulus dollars) would be lower! So I added a comment that said: “Of course I know the correct formula for the volume of a sphere, but the customer insists on the wrong formula.”

    I have numerous examples of customers insisting on SLOWER-PERFORMING processes, because they didn’t want to raise the expectations of the end users, who might learn to expect data to come back “instantaneously.” So I would add comments like “If you ever want this process to perform better, do X, Y and Z. The customer has requested such-and-such roadblock.”

    My point: this happens ALL THE TIME, and the programmer is almost ALWAYS working “for someone else,” not for himself: “just doing as I’m told.”

  206. Proving a graph was produced by a programmer who was working with approximations, estimates and a specified goal may say something about that programmer and the process. The code can be replaced, the programmer can find another job.

    But the fact remains, humanity is polluting the world and could easily do something about that. Global warming and climate change are symptoms of pollution. It is not very progressive to carry on poisoning the air.

    Free, clean energy is available if we invest in the right things and create solutions. The oil industry is not interested in free energy. But humanity will evolve more quickly.

  207. Anthony

    Joseph in Florida’s and KlausB’s messages above concerning ‘Climategate II’ in New Zealand deserve a look and a post of their own.

    KlausB’s linked pdf – of manipulated temp records – shows a direct connection with CRU…

    “Dr Jim Salinger (who no longer works for NIWA [National Institute of Water & Atmospheric Research]) started this graph in the 1980s when he was at CRU (Climate Research Unit at the University of East Anglia, UK) and it has been updated with the most recent data.”

    REPLY: I’ll have a look – A

  208. First: the HARRY_READ_ME file is a collection of notes from the programmer (unknown) to Ian ‘Harry’ Harris.

    J.Ferg (09:55:10) :

    E.M. Smith is better qualified to answer this, but I believe GISTemp does the same thing with station temps. You note that months with 9999 are ignored, but what isn’t obvious from that is that there may be only one day of readings missing. The obvious thing to do would be to take the average of the adjacent days to salvage the month, but they don’t; they just write the month off.
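    The salvage operation described above is indeed trivial; a sketch of filling an isolated missing day from its neighbours (the sentinel value and function are illustrative, not GISTemp's or CRU's actual code):

```python
MISSING = 9999.0  # the missing-day sentinel mentioned in the comment above

def fill_single_gaps(days):
    """Replace an isolated missing day with the mean of its two neighbours.
    Runs of two or more missing days are left alone."""
    out = list(days)
    for i in range(1, len(out) - 1):
        if out[i] == MISSING and out[i - 1] != MISSING and out[i + 1] != MISSING:
            out[i] = (out[i - 1] + out[i + 1]) / 2
    return out
```

    With that in place a month missing one day need not be written off wholesale.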

    Talking of E.M. Smith, I believe he’s fixed the -ve sum-of-squares problem, though I’ve not been over there to find out.

    Hank Hancock (12:39:10) :

    I’m so happy I’d put my drink down, you would have owed me a new keyboard & monitor ;-)

    DaveE.

  209. American Thinker has an EXCELLENT article that uncovers the code that actually shows tampering with tree-ring proxy temperatures to make them match instrument temperatures: from 1930 to 1994 in a folder attributed to Michael Mann, and from 1904 to 1994 in other code titled briffa_Sep98_d.pro and briffa_Sep98_e.pro.

    From the article:

    http://www.americanthinker.com/2009/11/crus_source_code_climategate_r.html

    In fact, workarounds for the post-1960 “divergence problem,” as described by both RealClimate and Climate Audit, can be found throughout the source code. So much so that perhaps the most ubiquitous programmer’s comment (REM) I ran across warns that the particular module “Uses ‘corrected’ MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”

    Here’s the “fudge factor” (notice the brash SOB actually called it that in his REM statement):
    yrloc=[1400,findgen(19)*5.+1904]

    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75 ; fudge factor

    These two lines of code establish a twenty-element array (yrloc) comprising the year 1400 (base year, but not sure why needed here) and nineteen years between 1904 and 1994 in half-decade increments. Then the corresponding “fudge factor” (from the valadj matrix) is applied to each interval. As you can see, not only are temperatures biased to the upside later in the century (though certainly prior to 1960), but a few mid-century intervals are being biased slightly lower. That, coupled with the post-1930 restatement we encountered earlier, would imply that in addition to an embarrassing false decline experienced with their MXD after 1960 (or earlier), CRU’s “divergence problem” also includes a minor false incline after 1930.
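    The two IDL lines are easy to reproduce and inspect. IDL's findgen(19) yields 0.0 through 18.0, so yrloc runs 1400, 1904, 1909, … 1994, twenty elements in all; a Python transcription (mine, not from the leaked files):

```python
# Python equivalent of the two IDL lines quoted above
yrloc = [1400] + [1904 + 5 * i for i in range(19)]  # 1400, then 1904..1994 by 5s

raw_adj = [0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
           0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]
valadj = [v * 0.75 for v in raw_adj]  # the "fudge factor" after the 0.75 scaling
```

    After scaling, the adjustment dips slightly negative mid-century and peaks at 2.6 × 0.75 = 1.95 for every interval from 1979 onward.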

    And the former apparently wasn’t a particularly well-guarded secret, although the actual adjustment period remained buried beneath the surface.

    Plotting programs such as data4alps.pro print this reminder to the user prior to rendering the chart:
    IMPORTANT NOTE: The data after 1960 should not be used. The tree-ring density records tend to show a decline after 1960 relative to the summer temperature in many high-latitude locations. In this data set this “decline” has been artificially removed in an ad-hoc way, and this means that data after 1960 no longer represent tree-ring density variations, but have been modified to look more like the observed temperatures.
    Others, such as mxdgrid2ascii.pro, issue this warning:
    NOTE: recent decline in tree-ring density has been ARTIFICIALLY REMOVED to facilitate calibration. THEREFORE, post-1960 values will be much closer to observed temperatures then (sic) they should be which will incorrectly imply the reconstruction is more skilful than it actually is. See Osborn et al. (2004).

    Care to offer another explanation, Dr. Jones?

  210. The debate about the contents of HARRY_READ_ME.txt and the validity of the programming and modelling techniques is something only experts can argue over.

    However, the lay person needs to know only this about the programming (which they can verify for themselves from the HARRY_READ_ME.txt file): the file is a THREE-YEAR journal of a CRU programmer describing everything he tried with the data and models in an attempt to REPRODUCE existing results CRU had published. Comments in the file make it clear that “HARRY” tried FOR THREE YEARS (2006-2009) to recreate CRU’s published results AND FAILED.

    Do you all see the REAL significance of this? It is absolutely fatal to the credibility of anything CRU has produced.

    What we have here is a documented THREE-year effort by a CRU programmer who had access to all the data, all the code, and all the people who developed the code and the models, and still HE could NOT duplicate CRU’s OWN results. If he can’t, it simply means that CRU’s results cannot be reproduced even by CRU themselves, so there is no point in anyone else even trying: CRU have proven it is a waste of time, and in doing so have proven their own results are plain rubbish. That means any “peer reviewed” paper CRU produced, along with any other paper that cited it, is based on data that CRU themselves cannot verify.

    Beyond that, the absolutely sorry state of affairs in data handling and software management that HARRY_READ_ME.txt reveals, the utter and total mess of CRU data and software, is WHY CRU has not released its data and model software.

    Given that CRU is one of the most cited, if not the most cited, sources of climate data, upon which trillions of dollars of economic policy is being set, the importance of what the HARRY_READ_ME.txt file reveals becomes scary.

    A very nice layman’s summary of some of the issues in the HARRY_READ_ME.txt can be found here

    http://www.devilskitchen.me.uk/2009/11/data-horribilis-harryreadmetxt-file.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+TheDevilsKitchen+%28The+Devil%27s+Kitchen

  211. Copenhagen will go on pretending that nothing has happened because it is part of what documentary filmmaker Ann McElhinney calls “the moveable feast.”

    But I do not despair that “we are too late.” The global warming alarmists are at the stage where they are gearing up to demand more than millions for research and obedient lip service. Now they are gunning for trillions in payments and global governance to enforce the payments. There is going to be pushback from the nation states of the developed world against this, and those doing the pushing back are looking for evidence. They have reasons to protect the CRU leak story from being suppressed, as the German princes had reasons to protect Martin Luther from being burned as a heretic.

  212. John Peter (10:55:08) : “…What do you think about this one? It looks as if the “data adjustment contamination” has infected New Zealand as well….”

    Looks like they need to call in Phil Jones’s dog to take care of that nasty old ‘uncorrected’ data.

  213. “The poor schmuck who wrote these comments is going to be blamed for the whole fiasco…”

    I don’t think so. It’s pretty clear from the comments that he has been told what to do and that he doesn’t like it. He clearly states that he has been told to make up data.

    I think the threat to bring law enforcement into this is an idle one. CRU has way too much to lose by starting a legal process against the whistle-blower. If they do, the whistle-blower’s defense attorney will have access to anything and everything in the CRU archives, as though he doesn’t already have enough to justify whistle-blowing. CRU would like to have this thing crawl under a rug and go away, not have it fought out in a highly publicised legal battle.

  214. Nicholas Alexander (15:24:49)

    [snip] no one here is suggesting wilful and wholesale pollution is a good thing – far from it. I work for a major international oil company and I do not have space to tell you about the time, effort and resources we spend to minimise our impact on the environment – steps which cost $$ by the way.

    The issue is whether the science that has been shown to be “proven” is, in fact, flawed.

    turning to your last paragraph

    “Free, clean energy is available if we invest in the right things and create solutions. The oil industry is not interested in free energy. But humanity will evolve more quickly.”

    this is wrong on so many levels… “free” energy? eh? so all the staff who work in this fabulous power plant work for nothing? “invest” – erm – this implies money moving from those who have it (investors) to an enterprise in the expectation of a return…

    The oil industry is not interested in free energy – erm – no, I guess we are not. I offer no defence for the need to make a profit, pay dividends and prop up the investment funds that support millions of pensions…

    Humanity will evolve more quickly? [snip] No response to this as I have no clue what you are talking about!

  215. In case people haven’t found it yet, there is another Zip file in the documents folder entitled “mbh98-osborn”; it is 44.6 MB when extracted, and has other .tar files within it.

    Maybe some of the more knowledgeable people here would care to give that a looksy.

  216. Tilo Reber 15:52:27 ” CRU would like to have this thing crawl under a rug and go away, not to have it fought out in a highly publicised legal battle.”

    I have no doubt that is what they would like. I doubt that is what they will get.

    I work with lawyers (US admittedly) almost every day. Every one I know would be after this guy like a duck on a junebug if they have to go to court – unless he has his butt covered.

  217. Robert Wykoff (14:30:24) :

    Could you point out that line of code to me? Six degrees per doubling is crazy. I’d like to see what calculation it goes into.

  218. Another tidbit for digestion: if you look at the file hadcrut3_gmr+defra_report_200503.pdf in the disclosures, you’ll find a report to a funding agency titled “Development of the global surface temperature dataset HadCRUT3″ by Philip Brohan, John Kennedy, Simon Tett, Ian Harris and Phil Jones, dated March 2005. From the document routing header it seems this was a deliverable from two contracts, one called “Revised optimally averaged global and hemispheric land and ocean surface temperature series including HadCRUT3 data set” and the other “Report on HadCRUT3 including error estimates”, both with the same investigators as on the report itself. There’s a magic contract number, MS—RAND—CPP—PROG0407, that when fed into Google comes up with a number of other reports suggesting that this was an omnibus dataset-gathering and update project funded at CRU by DEFRA (UK Dept. for Environment, Food and Rural Affairs).

    This is the description of the reported activities: “Since the last update, which produced HadCRUT2 [2], important improvements have been made in the marine component of the dataset [3]. These include the use of additional observations, the development of comprehensive uncertainty estimates, and technical improvements that enable, for instance, the production of gridded fields at arbitrary resolution. This document is a report on work to produce a new dataset version, HadCRUT3, which will extend the advances made in the marine data to the global dataset. The work is being managed in the Hadley Centre, but part of the work to be done needs expertise from CRU, so a contract has been placed with CRU to fund them to work on the project in collaboration with Hadley Centre staff.”

    The final paragraph gives a purported status: “We are making good progress towards the production of an updated version of the global historical surface temperature dataset HadCRUT. This new version will be based on improved observational data, will have comprehensive error estimates, and will have associated local and global average time-series that are produced using fully tested methods.”

    Note again that this is submitted in March, 2005. Now we have the following from poor old Harry’s READ ME, at least notionally dated to 2006+:
    “22. Right, time to stop pussyfooting around the niceties of Tim’s labyrinthine software suites – let’s have a go at producing CRU TS 3.0! since failing to do that will be the definitive failure of the entire project..”

    This looks like ‘Harry’ (possibly Ian Harris, possibly his coder) is in fact attempting to generate the dataset referenced in the report, having to go back in part to previous work by ‘Tim’ to do so, since intermediate data had been discarded and some of the original perhaps ‘lost’. In spite of the reported ‘good progress’, he is still deep in the trenches, and apparently continued so until some time in 2009. One wonders how happy the DEFRA folks were with the actual status of the work.

  219. Googling “Climategate” on the web gave nearly 3 million hits.

    One response in Computer World, on an article about how to prevent this type of hacking, said it all for Wattsupsters.

    Submitted by Anonymous on November 25, 2009 – 16:23.

    “Excellent advice.

    Still I thank God, the spirit realm, and the Angel of Hacking who saved us from the plot to harm us economically through the man made global warming hoax.

    Thank You!

    Thank You!

    Thank You!

    A lifetime of thank you’s is not enough.”

    To which we all say, “Amen”.

  220. Anyone have any idea of the purpose of the variable called “Cheat” in

    cru-code/linux/mod/ghcnrefiter.f90 ??

    It appears to be used in an adjustment, but the writing out of the adjusted data seems to be commented out in this version (maybe it was debugging output?), and the result is stored back into the array Addit(XYear).


    do XYear = 1,NYear ! adjust addit data
      if (Addit(XYear).NE.MissVal) then
        New=Cheat+(Multi*(Addit(XYear)**Power))
        ! write (99,"(i4,3f10.2)"), XYear,Stand(XYear),Addit(XYear),New !
        Addit(XYear)=New
      end if
    end do
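    Whatever "Cheat" holds, the loop applies a three-parameter power-law transform to every non-missing year. In Python terms (my paraphrase of the Fortran; the sentinel value is a placeholder, since MissVal is defined elsewhere in the module):

```python
MISS_VAL = -999.0  # placeholder; the Fortran's MissVal is defined elsewhere

def adjust(addit, cheat, multi, power):
    """New = Cheat + Multi * Addit**Power, skipping missing values."""
    return [a if a == MISS_VAL else cheat + multi * a ** power for a in addit]
```

    So "Cheat" is simply the additive offset of the transform; the interesting question is where its value comes from.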

  221. Curious:

    ;
    ; Reads in the gridded Hugershoff MXD data, plus the regional age-banded and
    ; regional Hugershoff series and attempts to adapt the gridded Hugershoff
    ; data to have the same low-frequency variability as the ABD regions.
    ; The procedure is as follows:
    ;
    ; HUGREG=Hugershoff regions, ABDREG=age-banded regions, HUGGRID=Hugershoff grid
    ; The calibrated (uncorrected) versions of all these data sets are used.
    ; However, the same adjustment is then applied to the corrected version of
    ; the grid Hugershoff data, so that both uncorrected and corrected versions
    ; are available with the appropriate low frequency variability. There is some
    ; ambiguity during the modern period here, however, because the corrected
    ; version has already been artificially adjusted to reproduce the largest
    ; scales of observed temperature over recent decades – so a new adjustment
    ; would be unwelcome. Therefore, the adjustment term is scaled back towards
    ; zero when being applied to the corrected data set, so that it is linearly
    ; interpolated from its 1950 value to zero at 1970 and kept at zero thereafter.

    ;
    ; (1) Compute regional means of HUGGRID to (hopefully) confirm that they
    ; give a reasonably good match to HUGREG. If so, then for the remainder of
    ; this routine, HUGREG is replaced by the regional means of HUGGRID.
    ;
    ; (2) For each region, low-pass filter (30-yr) both HUGREG and ABDREG,
    ; and difference them. This is the additional low frequency information
    ; that the Hugershoff data set is missing.
    ;
    ; (3) To each grid box in HUGGRID, add on a Gaussian-distance-weighted
    ; mean of nearby regional low frequency, assuming that the low frequency
    ; information obtained from (2) applies to a point central to each region.
    ;
    ; (4) Compute regional means of the adjusted HUGGRID and confirm that they
    ; give a reasonable match to ABDREG.
    ;
    ; For some regions (CAS, TIBP) the low frequency signal is set to zero because
    ; the gridded data gives a quite different time series than either of the
    ; regional-mean series. Also, for those series limited by the availability
    ; of age-banded results, I set all values from 1400 to 50 years prior to the
    ; first non-missing value to zero, and then linearly interpolate this 50 years
    ; and any other gaps with missing values. Any missing values at the end of
    ; the series are filled in by repeating the final non-missing value.

    Source:
    osborn-tree6/mann/abdlowfreq2grid.pro
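    The "linearly interpolated from its 1950 value to zero at 1970" step described in the header amounts to a simple linear taper on the adjustment term; a sketch of my reading of that comment (not the actual IDL):

```python
def adjustment_weight(year):
    """Scale factor applied to the low-frequency adjustment as described above:
    full weight through 1950, ramped linearly to zero by 1970, zero after."""
    if year <= 1950:
        return 1.0
    if year >= 1970:
        return 0.0
    return (1970 - year) / 20.0
```

    In other words, the new correction is deliberately faded out over exactly the period where the "corrected" data had already been adjusted toward observed temperatures.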


  222. jgfox (16:32:31) :

    Googling “Climategate” on the web gave nearly 3 million hits.

    Just performed a Google search on “Hide the decline” (within double quotes) – it showed OVER 2.7 million hits!

  223. A while back somebody generated a US temperature record for the 1900s using 5 consistently rural stations from around the country. It showed clearly that the 1930s were the warmest; we dropped until the late ’70s, zigzagged slowly up until 1998, then leveled off and declined. The 1998 peak was equal only to 1953, when we were already cooling – still about 0.5 deg less than 1938! If you take the fabricated warming trend from 1978 and cut it in half, it looks a lot like this more realistic data. At the very least, the data was corrupted with UHI that was not adequately removed, but apparently this was not good enough and they had to make things more drastic and radical. It’s like an addict who slowly takes more and more: “it’s just not warm enough yet!”


  224. Nicholas Alexander (15:24:49) :

    Free, clean energy is available if we invest in the right things and create solutions.

    Using the web as a resource, please cost out what it would take to construct on-site wind and solar generation and the associated storage facilities (for when the sun does not shine or the wind does not blow) to power a 1500 sq. ft. house in the north, the south, and the southwest. Wind and solar charts are available to assist in planning for various times of the year. Bulk steel and aluminum pricing is likewise available. These costs will NOT include the engineering time (for calculations, the creation of fabrication drawings, et al.) that will be required to actually build these components.

    Putting all this together will create in effect a BOM (bill of materials); also include an estimate of the machining costs and the on-site installation costs (usual costs for tradesmen, a crane for the windgenerator etc). Again, we are not including engineering costs.

    Finish that, and next let’s scale it up for industrial use at a small factory …

    Are you capable of this sort of exercise or does it just stop at wishful and idyllic but nonetheless empty platitudes?
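    For scale, a crude first pass at the arithmetic being demanded, with every number an assumed round figure (typical values, not a real design):

```python
# Back-of-envelope solar sizing; every input below is an assumed round number.
daily_use_kwh = 30.0     # typical detached house, electric heating excluded
capacity_factor = 0.15   # solar: assumed fraction of nameplate delivered on average
autonomy_days = 3        # assumed battery storage to ride out sunless days

nameplate_kw = daily_use_kwh / (24 * capacity_factor)  # panels needed, ~8.3 kW
storage_kwh = daily_use_kwh * autonomy_days            # batteries needed, 90 kWh
```

    Even under these generous assumptions the hardware bill runs well into five figures before a single tradesman or engineer is paid, which is presumably the commenter’s point.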

  225. Twenty years as a Big 8 CPA computer systems audit manager. This garbage would have gotten the front doors locked, the SEC notified, and a forensic audit started.

  226. All you people complaining about how Briffa and that crowd are trying to hide something in their MXD analysis, read this:

    Briffa, K.R., Schweingruber, F.H., Jones, P.D., Osborn, T.J.,
    Shiyatov, S.G. and Vaganov, E.A. 1998: Reduced sensitivity of
    recent tree-growth to temperature at high northern latitudes. Nature
    391, 678–82.

    Yep, a great way to hide something is to publish a paper about it in Nature. It’s real smart, clever even: hide it in plain sight and establish an active line of research on this interesting observation.

  227. _Jim (18:23:13) :

    yup – my point exactly – it really pisses me off when people talk about “free” anything. OT, but I heard someone praising the “free UK health care system” – with no reference to the bloody tax we pay!!!!!! Sheesh – liberals!

  228. Rattus Norvegicus, RC refugee at (18:27:52) writes :

    All you people complaining about how Briffa and that crowd are trying to hide something in their MXD analysis

    Do they explain ‘artificial adjustments’ in those pubs?

    Do they explain the reason for the values of the series of terms appearing in the code?

    Are these values related in any way to physical processes in tree-ring growth?

  229. Rattus,

    It’s been mentioned that Keith Briffa appears to be at least somewhat concerned about the shenanigans. Maybe he’s a straight shooter, I don’t know. People can make up their own minds about him. Here’s a good place to start: click

  230. Rattus Norvegicus (18:27:52) :

    All you people complaining about how Briffa and that crowd are trying to hide something in their MXD analysis, read this:

    Briffa, K.R., Schweingruber, F.H., Jones, P.D., Osborn, T.J.,
    Shiyatov, S.G. and Vaganov, E.A. 1998: Reduced sensitivity of
    recent tree-growth to temperature at high northern latitudes. Nature
    391, 678–82.

    Noted.

    http://www.climateaudit.org/?p=529

  231. Nicholas Alexander.

    Lots of things sound good, but wind and solar will never save us. There is no way to keep humanity warm, fed, industrious and ALIVE in the northern parts of the Northern Hemisphere without fossil fuels of some sort… or nuclear. It just can never happen, from the logistical and engineering aspects.

    Unless, of course, we kill off 90 percent of humanity. Perhaps you will volunteer for agathusia … or perhaps the more distasteful, aschimothusia?? ☺ Step right up. No waiting. Take one for the Gipper, and you will spend eternity with virgin Goracles. It’s true. ☺

    Solar and wind (or whatever the heck you are talking about… perpetual motion perhaps ☺) may be great for back-to-the-lander acreage owners, but not for the masses in big northern cities when it is -35°C.

    Solar and wind indeed have applications, but for the masses on this planet they are nothing more than eco-weenie dreamin’.

    It is hard to hide the decline in common sense on this earth. ☺

    Except here at WUWT … where common sense prevails. Keep up the fight, Anthony. Well done.

  232. Nicholas Alexander (15:24:49) :

    “Free, clean energy is available if we invest in the right things and create solutions. The oil industry is not interested in free energy. But humanity will evolve more quickly.”

    Nick, TANSTAAFL! (There ain’t no such thing as a free lunch.) Free? Even with slaves at pedal-power generators, you must feed, clothe, warm, and house your power source. Ain’t free! Everything that produces power demands investment, development, and management.

    Reality is that hydrocarbons (coal, oil, gas) provide extremely dense power sources. Wind, wave, tide and solar are diffuse, requiring concentration. Basic physics. Access efficiency depends on density. Only nukes can compete with fossil fuels, but nukes cannot run planes, trucks, trains and cars. (NO! Batteries cannot compete with gasoline without subsidies.)

    Clean? If you weren’t an urban dweller in the 1940s and ’50s in the U.S., you don’t know what dirty is. Compare today’s US cities to China, India, or even Central Europe. Your comment reveals your energy and economic ignorance.

  233. crosspatch (17:39:28) :

    Anyone have any idea of the purpose of the variable called “Cheat” in

    cru-code/linux/mod/ghcnrefiter.f90 ??”

    Look up a few lines. It’s a variable.

  234. Glenn:

    I meant the purpose of the variable. I know it is a variable. That is why I asked:

    “Anyone have any idea of the purpose of the variable called ‘Cheat'”

    As in: has anyone looked yet at that portion of the code and decided what Cheat does?

  235. “A site in Finland suggests master.src.com is temperature related, but there’s a lot of speculation flying around the Internet regarding the leaked files at the moment, so can’t be certain.”

    Apologies if this was addressed earlier as I didn’t read every comment, but I believe that “master.dat.com” is the temp file, not master.src.com. The format looks similar to what is seen with GISS and NCDC. The temps are degrees Celsius multiplied by ten. For instance, 222 is 22.2 deg C (72.0 F). Most of the values appear to range between -250 and 320 or so, consistent with measured temp values.

    Thanks for your other comments regarding the code. Anyone know if there is a thread somewhere dedicated to the CRUt code? It would be nice to pool information without having to wade through 300+ comments.
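
    The tenths-of-a-degree convention the commenter describes is easy to sanity-check; the helper names below are mine for illustration, not from any CRU file:

```python
def decode_tenths_c(raw: int) -> float:
    """Convert a stored integer in tenths of a degree C (e.g. 222) to degrees C."""
    return raw / 10.0

def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0

# 222 -> 22.2 C, which is 72.0 F to one decimal, matching the comment;
# the quoted range -250..320 decodes to a plausible -25.0..32.0 C.
print(decode_tenths_c(222))                      # 22.2
print(round(c_to_f(decode_tenths_c(222)), 1))    # 72.0
```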

    Holy Carp! I’ve been going through some of the files, but haven’t been able to devote the time like Ecotretas. Good work; and shocking in the depravity of the people who wrote this carp and presented the results as the work of the very most Tip-Top Climate Scientists; their own work!

    I work for a mere commercial organisation that must make a profit from the products we sell. We would be unemployed tomorrow producing carp like this.

  237. See, here’s the thing about that “Cheat” variable. The comment for the subroutine says:

    !*******************************************************************************
    ! adjusts the Addit(ional) vector to match the characteristics of the corresponding
    ! Stand(ard), for ratio-based (not difference-based) data, on the
    ! assumption that both are gamma-distributed
    ! the old (x) becomes the new (y) through y=ax**b
    ! b is calc iteratively, so that the shape parameters match
    ! then a is deduced, such that the scale parameters match

    And:

    New=Cheat+(Multi*(Addit(XYear)**Power)) looks a lot like y=ax**b except there is something being added to the result of ax**b to increase its value. In the context of what we have already seen with “Fudge”, I am curious of the purpose of “Cheat” and what influence it is having.

    And I can’t help but wonder if “Harry” was a bit upset over his pay slip and decided to go back and rename certain variables to what they REALLY did :)
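
    The transform described in that header comment can be sketched numerically. What follows is a speculative reconstruction, not the CRU routine: the function names, the method-of-moments shape/scale estimates, and the bisection on b are all assumptions, and the additive `cheat` term is kept only to mirror the shape of the New=Cheat+(Multi*(Addit(XYear)**Power)) line – its purpose is exactly what is in question.

```python
import numpy as np

def gamma_moments(x):
    """Method-of-moments gamma estimates: shape = mean^2/var, scale = var/mean."""
    m, v = float(np.mean(x)), float(np.var(x))
    return m * m / v, v / m

def match_to_standard(addit, stand, b_lo=0.05, b_hi=20.0, iters=60, cheat=0.0):
    """Fit y = cheat + a * addit**b so the moment-based gamma shape and scale
    of y match those of `stand`. A guess at what the commented routine might
    do; names and the bisection method are not taken from ghcnrefiter.f90."""
    k_std, th_std = gamma_moments(stand)
    # The shape estimate of addit**b falls as b grows (raising b spreads the
    # data), so bisect on b until the shape parameters agree.
    for _ in range(iters):
        b = 0.5 * (b_lo + b_hi)
        k_b, th_b = gamma_moments(addit ** b)
        if k_b > k_std:
            b_lo = b
        else:
            b_hi = b
    # Multiplying by a scales the moment-based scale estimate linearly,
    # so a can be deduced directly once b is fixed.
    a = th_std / th_b
    return cheat + a * addit ** b
```

    If the real code follows anything like this, a nonzero Cheat would shift every adjusted value after the distribution matching, which is why its value matters.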

  238. Rattus Norvegicus (18:27:52) :

    You’re missing the point with that publication. It isn’t about “cooling”; it’s about “reduced sensitivity” (to warming?).

    I haven’t read the paper but I suspect it argues for ramping up, or explaining away the lack of, the warming signal.

    Hey, I’m not only an engineer who understands mass, length and time, but also language usage. And, if you really insist, I’ll dig the article out at the local University.

  239. crosspatch (20:12:57) :

    Glenn:

    I meant the purpose of the variable. I know it is a variable. That is why I asked:

    “Anyone have any idea of the purpose of the variable called ‘Cheat’”

    As in: has anyone looked yet at that portion of the code and decided what Cheat does?”

    Oops. You might have to explain that to me again. :)

    Well after a quick look, I’d say it modifies “addit”;
    “adjusts the Addit(ional) vector to match the characteristics of the corresponding Stand(ard), for ratio-based (not difference-based) data, on the assumption that both are gamma-distributed”

    in the subroutine “MregeForRatio”, which is then called in another routine, eventually ending in “Trustworthy”.

    Bet you’re overjoyed with my insight! Hey, whadda ya expect, I’m a VB guy.
    I got lost trying to find out what the heck the product was (look at the Trustworthy comments), thought that if I could see that I could begin to see what some of the variables represented.
    No such luck. I learned long ago that I couldn’t even follow my own code unless I named things plainly; even if it takes more characters, it’s worth it. That also eliminates some doc coding, and that’s another story in itself.

    To understand what is being processed (other than guessing at station selections based on database values of temperature and such), I’d have to have the data and run the program to see the output.

    Figure this out?

    “if Omit AND Need/Force are set, the default is to allow Omit missing values in the needed/forced period; if this is undesirable set MakeAll if MakeMore AND Need/Force are set, no ref ts will be calc unless > than N/F”

  240. Robert Wood of Canada (21:19:23) :

    Rattus Norvegicus (18:27:52) :

    You’re missing the point with that publication. It isn’t about “cooling”; it’s about “reduced sensitivity” (to warming?).

    I haven’t read the paper but I suspect it argues for ramping up, or explaining away the lack of, the warming signal.

    Hey, I’m not only an engineer who understands mass, length and time, but also language usage. And, if you really insist, I’ll dig the article out at the local University.”

    If you do, get the supplemental pages on Mann’s 1998 hockey stick paper.
    The Briffa 1998 paper is referenced.

  241. crosspatch (20:55:42) :

    And I can’t help but wonder if “Harry” was a bit upset over his pay slip and decided to go back and rename certain variables to what they REALLY did :)”

    Can’t blame you for that. “Cheat” doesn’t sound like a short form of a bigger word, like “Addit” for “Additional” vector(?). It makes no sense except in the obvious sense. Even a temp value used for local calculating purposes wouldn’t be called “Cheat”; I’d use “locval” or something. It’s Cheat on the brain.

  242. Is there any code in the leaked file that allows for fudge factors related to degrees of freedom? The degrees of freedom statistic (+ or – whatever) could allow you to write code that allows you to input the top end of the band (add warming in later years and subtract warming in earlier years) so that statistically you could still be within the margin of error in the hockey stick. Legal yes, ethical, hell no.

    Here is the article referred to by RealClimate on the issue of divergence:

    http://www.nature.com/nature/journal/v391/n6668/full/391678a0.html

    It says:
    “The cause of this increasing insensitivity of wood density to temperature changes is not known, but if it is not taken into account in dendroclimatic reconstructions, past temperatures could be overestimated.”

    I don’t get it. Why does the code increase the temperature if it is already overestimated?

  244. James Hastings-Trew (09:16:45) :
    “For the current temperature records pains would have to be made to either use only station data that is free of UHI effect, or (somehow) to figure out what the heat signature of each station is in relation to the rural surroundings, and use that as an offset of some kind.”

    Inspect long-term land surface temperature averages derived from raw temperature data (unaffected by data alteration or recalibration – this is very important) for long-term sites measured in Stevenson screens that were not influenced by resiting or surrounding development – for instance, sites located on coastal headlands or rural sites well away from any development or heat-absorbing surface (airport weather stations, mostly, will not do). These clearly show temperature oscillations or cycles in accord with natural climate variability (and possibly a very small atmospheric CO2 component consistent with measurable CO2 increases, both natural and anthropogenic), such that average temperatures at these sites today are somewhat similar to those recorded over the last eighty or ninety years, i.e. they lie within the bounds of normal variability. Most of these unaffected sites show no signs of global warming.

    Surface temperature increases that can be associated with anthropogenic warming are derived from measurements affected by the urban heat island effect, changing land uses, altered albedos, deforestation or simply bad temperature sensor placement, such as near air conditioning outlets, asphalt pavements, buildings, etc – it is these affected temperature records, by being included in the climate models, that suggest we are heading for catastrophic global warming or runaway climate change.

    Of course, it is suggested that the affected sites’ measurements be recalibrated to take these effects into account, which simply suggests that the figures are unreliable – all these recalibrations need to be independently tested. If the recalibrations were accurate then they should accord with the raw data taken from sites unaffected by any change in their surrounds. In other words, unaffected long-term sites show an oscillation; affected sites show an increase – if anthropogenic CO2 were truly driving runaway global warming then the unaffected sites should show a similar increase.

    This would be a good exercise for high school geography students. The biggest problem, though, is that raw temperature data is increasingly hard to access. NASA GISTEMP had raw figures from around the world presented numerically and as graphs, which made this an easy exercise, but they took those pages down some time ago (they could be up again).

  245. You cannot make it up. What a shambles … A sum of squares going negative big time? Have these guys ever heard of overflow?

    It seems to me that we are witnessing the undressing of the AGW emperor: code that isn’t even wrong. One might as well use random number generators.

  246. WW @21:50: Yes, I’m confused by that as well. I thought the logic went that if trees were “peak clipping” the warm temperatures now (for example, because some other factor, such as moisture, becomes limiting), then the transfer function from temperature to ring width/wood density is non-linear (like a gamma curve, gamma<1). If you then reconstruct the historical temperature still assuming it is linear, you will *underestimate* the temperature during warm periods when the trees were clipping before.
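
    The “peak clipping” argument in the comment above is easy to check with a toy model; the saturating exponent (0.7) and the temperature range below are arbitrary illustration values, not anything from the CRU files:

```python
import numpy as np

# If the tree response to temperature saturates (exponent < 1) but the
# calibration assumes a linear transfer, the warmest years come out too cold.
rng = np.random.default_rng(0)
T = rng.uniform(5.0, 25.0, 500)   # "true" growing-season temperatures
ring = T ** 0.7                    # saturating proxy response

# Ordinary least-squares linear calibration of temperature on ring density
A = np.column_stack([ring, np.ones_like(ring)])
coef, *_ = np.linalg.lstsq(A, T, rcond=None)
T_hat = A @ coef

warm = T > 20.0
bias_warm = float(np.mean(T_hat[warm] - T[warm]))
print(bias_warm)   # negative: the warmest years are reconstructed too cold
```

    Because T is a convex function of the ring measure, the fitted line falls below the true curve at both extremes, so a linear reconstruction underestimates the warm periods, exactly as the comment argues.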

  247. I absolutely love this comment:
    Mark Wagner (10:01:51) :
    printf,1,'The tree-ring density'
    printf,1,'records tend to show a decline after 1960 relative to the summer'
    printf,1,'temperature in many high-latitude locations. In this data set'
    printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
    printf,1,'this means that data after 1960 no longer represent tree-ring'
    printf,1,'density variations, but have been modified to look more like the'
    printf,1,'observed temperatures.'

    they have committed fraud. plain and simple

    Using a print statement to hide a bodge isn’t a good way. Looks to me as if they are being open with the change.

  248. Eric (11:26:56) :

    I also wondered why they didn’t use an existing database system. One that is spatially aware would make their job much simpler.

    My guess is that they aren’t programmers and don’t know any better.

    I am currently converting such a “homemade” database. The difference is that this one works, but is a real pain to maintain.

  249. Am I stupid????

    If you square a real number it is always positive…. and if you sum the squares then the sum of the squares must also be positive….

    Do we have imaginary temperature?

    Holy cow… what is with these guys?
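
    A very, very negative sum of squares is exactly the signature of a signed 32-bit integer accumulator wrapping around. Whether that is what anomdtb.f90 actually did is a guess, but the mechanism is easy to demonstrate (Python/NumPy standing in for the Fortran):

```python
import numpy as np

# Each squared value fits comfortably in a signed 32-bit integer, but the
# running sum does not: it wraps past 2**31 - 1 and comes out negative.
vals = np.array([40000, 40000], dtype=np.int32)
squares = vals * vals                          # 1_600_000_000 each: still OK
total = int(np.sum(squares, dtype=np.int32))   # 3_200_000_000 wraps around
print(total)   # -1094967296
```

    No imaginary temperatures required – just an accumulator declared too small for the data it is summing.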

    This reeks of a disgruntled employee (the very worst kind, from a data security perspective, in any organisation), someone who has been working under challenging conditions for some time. Or a pinhead who was too cocky, maybe thought s/he was worth more; or someone, as suggested, who knew what was going on (fudging).

    But it could just be a scam/ruse, as I have suggested before in another thread. The timing is just too coincidental, IMO.

    But then, when one considers NIWA did the same as CRU, well… y’know, it sounds too good to be true.

  251. “The tree-ring density records tend to show a decline after 1960 relative to the summer temperature in many high-latitude locations.”

    Could this be because the temperature measurements were higher than they should have been (due to UHI effects?). Why should tree rings suddenly start behaving differently from 1960? I would suggest that the tree ring data is as reliable as it’s always been but recorded raw temperatures have been inflated by some means.

  252. I know it is still early days but it really is important to identify the papers which relied on this code.

    Thanks, and good luck to all who are working so hard on this.

    This is reminiscent not so much of Piltdown Man as it is of Lysenko. Who said it could never happen here?

  253. physics geek: if you’re a professional programmer, I’m surprised you hadn’t heard about the awful quality of scientific code before. That’s also part of the reason they didn’t hand their software over to the people looking to challenge AGW; if someone really wanted to retest the calculations, the correct method is to take the raw results and write their own software.

    (Also, all the comments about “hiding the decline” tell us… nothing we don’t already know. They’re all related to the tree data, which is known to produce a decline in temperature after 1960 that doesn’t match any other figures. Mark Wagner: this is why that comment doesn’t show fraud. Chances are any papers making use of the figures mention the issue.)

    lichanos: you’re interested in GIS; they’re physicists and mathematicians. For some common forms of mathematical modelling, Delaunay triangulation makes sense. It constructs a relatively well-behaved triangulation over a given set of points.
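
    For anyone unfamiliar with the term: a Delaunay triangulation tiles a scattered point set with well-shaped triangles, which is why it turns up when gridding irregular station data. A minimal illustration, assuming SciPy is available (the CRU code itself is IDL and Fortran, not Python):

```python
import numpy as np
from scipy.spatial import Delaunay

# Four points in general position, all on the convex hull,
# triangulate into exactly two triangles.
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.9, 1.1]])
tri = Delaunay(points)
print(len(tri.simplices))   # 2
```

    Interpolating a station value at an arbitrary location then reduces to tri.find_simplex() to locate the containing triangle, followed by barycentric weighting of its three vertices.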

  254. Raredog: of course, any results claiming no overall warming, like the ones you mention, need close scrutiny. Is the selection of which measurement sites to ignore influenced by their temperature figures, for example by subjecting sites that show warming to more scrutiny than sites that don’t? How does the removal of urbanised sites and ones with changed land use affect coverage of different countries?

  255. makomk (04:52:35) :

    Chances are any papers making use of the figures mention the issue.

    Perhaps in vague terms. What we do know is that the subject of splicing or grafting instrumental data onto proxy data was specifically denied (if you’ll pardon the expression) by one Michael Mann.

    http://www.climateaudit.org/?p=438

    …if someone really wanted to retest the calculations, the correct method is to take the raw results and write their own software.

    Perhaps you can point us to the “raw data”, including which specific stations were used for their “calculations”. Mind you, don’t play the game equivalent to saying “it’s in the New York Times someplace”. The specific stations used.

    Are we also free to come up with our own “fudge factors” and “cheat” parameters?

  256. makomk (04:52:35) :

    “physics geek: if you’re a professional programmer, I’m surprised you hadn’t heard about the awful quality of scientific code before. That’s also part of the reason they didn’t hand their software over to the people looking to challenge AGW; if someone really wanted to retest the calculations, the correct method is to take the raw results and write their own software.”

    And when that someone “retested” using supplied data and their own coding and failed to get the same results, the squabble could very well be about the coding, not about the data or the stated methodology and logic. Code is only a translation of the stated methodology. It’s really perverse to consider that code should not be made available. That includes the excuse of “awful quality”.

    The only time code should be protected is when it’s about the code itself. Last I heard, climatology isn’t about the code. Or maybe it is.

  257. recent decline in tree-ring density has been ARTIFICIALLY’
    printf,1,’REMOVED

    Should be broadcast on C-Span as part of a Senate hearing.

  258. ‘density variations, but have been modified to look more like the
    printf,1,’observed temperatures.’

    [snip]

  259. MattN: actually, decline.pro seems fairly boring. It’s not fudging anything so much as it’s trying to measure the discrepancy between tree growth measurements and the actual temperature. Y’know, to help try and figure out why the discrepancy exists.

  260. Apply a VERY ARTIFICAL correction for decline!!

    A few people have commented that the code is worse than the emails. My God were they right!

    For the non-coders to realize how bad this really is, someone needs to photoshop an image showing what a bridge (for example) would look like when constructed to craftsmanship standards corresponding to those revealed by HARRY_READ_ME.txt. I can imagine the outcome, but I don’t know how to use Photoshop.

  262. *** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
    ; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***

    God bless the people that released ClimateGate to the public!! What is revealed here is atrocious! These are heinous violations of science! Heinous!

  263. Obvious explanation (09:22:20) :

    But gavin says we’re taking this out of context…..

    No, no, no, we’re getting it in context. We’re just not getting it in his context.

  264. Tim S. (09:23:11) :

    “; we know the file starts at yr 1070, but we want nothing till 1400, so we”

    Hide the MWP?

    No wonder whatever data is put into their program it produces a hockey stick!

  265. Charles Higley: oh dear, you do realise the flaw in that? There are many different combinations of 5 rural stations that could be used. Since the only people that would be looking at small numbers of rural stations like this are anti-AGW, any result that doesn’t produce cooling would be irrelevant and not get reported. Therefore a classic form of bias is introduced.

  266. This is making me bury my face in my hands and my hands are shaking. I can’t believe what I am seeing in this code.

    I can’t believe people pulled this kind of thing. Maybe HARRY couldn’t believe it either. Maybe he (they?) is the whistleblower.

  267. @ Gene Nemetz (11:38:37) :

    Is the fence around the weather station an attempt to, preemptively, block a FOI request?

  268. Gene Nemetz (12:03:12) :

    Obvious explanation (09:22:20) :

    But gavin says we’re taking this out of context…..
    ———-

    No, no, no, we’re getting it in context. We’re just not getting it in his context.
    ============

    Bingo!

  269. makomk (11:45:26) :

    MattN: actually, decline.pro seems fairly boring. It’s not fudging anything so much as it’s trying to measure the discrepancy between tree growth measurements and the actual temperature. Y’know, to help try and figure out why the discrepancy exists.”

    Well, maybe the temperature 500 miles away taken by Bubba in 1903.
    The “discrepancy”, if you are referring to the divergence problem or challenge, has not been resolved to date, and there is little documented identification of past divergence. There are regional differences as well as species-specific effects, not to mention other factors, known and unknown, that may affect tree growth. It’s unclear what causes “divergence” now; it’s unknown whether it can be detected in historical records even of recent age, for which we have some idea of temperature; and it’s unknown whether any temperature reconstruction based on tree ring width/density is accurate to any useful degree by today’s standards of error. And that is even assuming the reconstructions are based on accurate and complete local temperature/pressure/etc. data used to calibrate tree ring growth in order to reconstruct actual prehistoric temperatures. It’s hogwash.

  270. You guys are referencing COMMENTS in the source code, not the source code itself. As a software developer myself, it is plain to see the comments don’t indicate a fraud, they indicate frustration at the large data set and the fact there’s no data integrity indexes. This is not a smoking gun. It’s a bunch of software guys complaining about the database.

  271. And by the way, the statistical methods they are using require data sets from two different methodologies to be processed separately. That’s exactly what they’re doing when they ‘scrub’ the data past 1960. They are reconciling data gotten from atmospheric temperature readings with data gotten from tree ring readings. They are then splicing the two graphs together. It’s not a fraud, guys. It’s ridiculous to assume that.

  272. Eric (18:26:06) :

    “You guys are referencing COMMENTS in the source code, not the source code itself. As a software developer myself, it is plain to see the comments don’t indicate a fraud, they indicate frustration at the large data set and the fact there’s no data integrity indexes. This is not a smoking gun. It’s a bunch of software guys complaining about the database.”

    I’ve seen no comments in the source code that indicate frustration with the database, plainly or otherwise. Certainly not in the code excerpts posted in this thread. I can only surmise that you are confusing the Harry text with code, which it is not, and which is not even the subject of this thread. That brings into question your claim of being a software developer, and whether you are the same “Eric” that has posted previously in this thread.

  273. Eric,

    IIRC, there’s a big question as to whether the tree ring proxies are even valid for temperature up to 1960. And what the heck are people doing when they slap real world temperatures on top of questionable extrapolated temps up to 1960?

    And what exactly do you mean by “there’s no data integrity indexes”? That sounds not-quite-kosher.

  274. Eric (18:27:41) :

    “And by the way, the statistical methods they are using require data sets from two different methodologies to be processed separately. That’s exactly what they’re doing when they ’scrub’ the data past 1960. They are reconciling data gotten from atmospheric temperature readings with data gotten from tree ring readings. They are then splicing the two graphs together. It’s not a fraud, guys. It’s ridiculous to assume that.”

    I’d say you’re a troll, and that is a ridiculously lame attempt to defend the graph.

    ****Moderator****, could you check to see if this is the same Eric who posts here, such as:

    From Eric (13:05:27) :
    “There is no exaggeration in the above statement. No need for it. It is absolutely unbelievable to me that this crap was published.”

  275. makomk (12:14:58) :

    There are many different combinations of 5 rural stations that could be used.

    I see, so when you said

    “…if someone really wanted to retest the calculations, the correct method is to take the raw results and write their own software.”

    you recognize that it’s critical that everyone be talking about the same stations.

    Do you accept that CRU needs to tell everyone which specific stations they used?

  276. John M (19:32:51) :

    makomk (12:14:58) :

    There are many different combinations of 5 rural stations that could be used.

    I see, so when you said

    “…if someone really wanted to retest the calculations, the correct method is to take the raw results and write their own software.”

    you recognize that it’s critical that everyone be talking about the same stations.

    "Do you accept that CRU needs to tell everyone which specific stations they used?"

    And the specific data the CRU used from them.

  277. As a none too bright middle-aged woman who hasn’t had a science class since the 1970s I would like to thank everyone who has posted on this site. You have all helped me understand a little better what all of this means. I have read many on-line articles and comments over the last several days, but this has been the most helpful. Thanks.

  278. Eric (18:27:41) :

    They are then splicing the two graphs together.

    Oh, I never get tired of doing this.

    No researchers in this field have ever, to our knowledge, “grafted the thermometer record onto” any reconstruction. It is somewhat disappointing to find this specious claim (which we usually find originating from industry-funded climate disinformation websites) appearing in this forum.

    Michael Mann, 2004

  279. John M (20:27:02) :

    Eric (18:27:41) :

    They are then splicing the two graphs together.

    “Oh, I never get tired of doing this.

    No researchers in this field have ever, to our knowledge, “grafted the thermometer record onto” any reconstruction. It is somewhat disappointing to find this specious claim (which we usually find originating from industry-funded climate disinformation websites) appearing in this forum.

    Michael Mann, 2004″

    Splicing isn’t grafting – Michael Mann 2010

  280. I am truly afraid they will move forward on sheer momentum at Copenhagen, as someone mentioned.

    Governments fail us all.

    If the mainstream media does not cover this more than they have to this point, the masses will continue believing what they are told. A sad state of being.

    But who cares, right… that New Moon movie is out now, and that is so much more important than all this (waxing sarcastic!)

    Please inform your friends, e-mail them, do anything to get the word out faster than our failing news institutions can.

    Everyone should be made to know about this. Heads should roll.

    Cheers to whoever leaked this, hacked this and released it!

    I certainly found this amusing. Mr. Ian Harris, who was mentioned in a post above as a possible “Harry”, has his job description as “……. climate scenario development, data manipulation and visualisation, programming”. Maybe data manipulation is a standard term, but it gave me a chuckle anyhow.

    http://www.cru.uea.ac.uk/cru/people/

  282. As usual, I’m pointed to one of these sites as evidence of the ‘proof’ of a global warming fiddle… and find a string of irrelevant splodges of evidence.

    If anyone can show me which of these comments is supposed to indicate fraud, rather than correcting for the divergence problem and other problems in multi-proxy analysis, post your result here:

    http://anarchish.blogspot.com/2009/11/challenge/

    And I will donate £50 to the charity, church or political organisation of your choice.

    Wow, admittedly I tend toward believing AGW, greenhouse effect, Climate Change, whatever you call it, but I come here and find:


    “You guys are referencing COMMENTS in the source code, not the source code itself. As a software developer myself, it is plain to see the comments don’t indicate a fraud, they indicate frustration at the large data set and the fact there’s no data integrity indexes. This is not a smoking gun. It’s a bunch of software guys complaining about the database.”

    I’ve seen no comments in the source code that indicate frustration with the database, plainly or otherwise. Certainly not in the code excerpts posted in this thread. I can only surmise that you are confusing the Harry text with code, which it is not, and which is not even the subject of this thread. That brings into question your claim of being a software developer, and whether you are the same “Eric” that has posted previously in this thread.

    Wow, so you’re saying that:

    “OH FUCK THIS. It’s Sunday evening, I’ve worked all weekend, and just when I thought it was done I’m
    hitting yet another problem that’s based on the hopeless state of our databases. There is no uniform
    data integrity, it’s just a catalogue of issues that continues to grow as they’re found.”

    does not “indicate frustration with the database” and that its source text is not being discussed here. That would put you in the category of liar, along with AGW, by the standards of this thread.

    But the really great thing is that this thread is so open-minded that the one opposing opinion gets shouted down straight away, called a troll and a fraud.
    Ladies and Gentlemen, the defenders of free f**king speech!

    Now I’m not saying that I think the views of this thread are wrong; in fact it seems to be backing up the view that the underlying theory of this simulation is incomplete at best. For what it’s worth, I think Eric has a point, and I would say that even the accusations of poor programming are a bit harsh. You guys are forgetting that programming well-defined problems like accounting and graphics packages is not the same as simulating partially understood processes like large-scale climate (I haven’t done that either, but I’ve spent time trying to work through how one would simulate ecologies realistically, and it does not convert into easy classes and relationships).

  284. Jonathan May-Bowles (04:43:48) :

    If anyone can show me which of these comments is supposed to indicate fraud, rather than correcting for the divergence problem and other problems in multi-proxy analysis, post your result here:

    Can we agree on a definition of “fraud” first?

    We can start with my trusty Merriam Webster’s Collegiate Dictionary, 10th Edition.

    1 a: DECEIT, TRICKERY; specif: intentional perversion of truth in order to induce another to part with something of value or to surrender a legal right b: an act of deceiving or misrepresenting: TRICK 2 a: a person who is not what he or she pretends to be: IMPOSTOR: also : one who defrauds : CHEAT b: one that is not what it seems or is represented to be.

    Since I assume you’re not referring to specific individuals when you say fraud (as in “they are frauds”), can we assume you’re comfortable with definition 1? My choice is 1 b, especially the synonym offered.

  285. Oh well, had a clever response to

    Jonathan May-Bowles (04:43:48) :

    but it seems to have disappeared (too many HTML commands maybe).

    Turns out his URL is dead anyway.

    “Sigh”

  286. matthew (10:18:53) :

    Wow, admittedly i tend toward believing AGW, greenhouse effect, Climate Change, whatever you call it, but i come here and find:


    “You guys are referencing COMMENTS in the source code, not the source code itself. As a software developer myself, it is plain to see the comments don’t indicate a fraud, they indicate frustration at the large data set and the fact there’s no data integrity indexes. This is not a smoking gun. It’s a bunch of software guys complaining about the database.”

    I’ve seen no comments in the source code that indicate frustration with the database, plainly or otherwise. Certainly not in the code excerpts posted in this thread. I can only surmise that you are confusing the Harry text with code, which it is not, and which is not even the subject of this thread. That brings into question your claim of being a software developer, and whether you are even the same “Eric” who has posted previously in this thread.

    “Wow, so you’re saying that:

    “OH FUCK THIS. It’s Sunday evening, I’ve worked all weekend, and just when I thought it was done I’m hitting yet another problem that’s based on the hopeless state of our databases. There is no uniform data integrity, it’s just a catalogue of issues that continues to grow as they’re found.”

    does not “indicate frustration with the database” and that its source text is not being discussed here. That would put you in the category of liar along with AGW by the standards of this thread.”
    **************************************

    No, but that could easily put you in that category. Before you accuse another of lying you should at least double-check your facts.

    “Source text” is not “source code”; “Harry” is not source code, nor is “Harry” comments in source code. I didn’t claim Harry did not indicate frustration with the database, and I didn’t say Harry wasn’t being discussed here. I said Harry wasn’t the subject of the thread, and it isn’t. The source code is the subject of the thread: “Hide the decline – codified”.

    Go back to RC, where Gavin snips because “don’t put words in other peoples mouths”.

  287. John F. Hultquist (09:21:16) :
    “a sum-of-squared variable is becoming very, very negative!”

    This is an interesting “trick.” Has this issue been figured out yet?

    Yes. I did, here:

    http://chiefio.wordpress.com/2009/11/21/hadley-hack-and-cru-crud/

    Code is in the link. They square an INTEGER (that does a bit overflow into a negative number) then stuff it into a REAL (preserving the broken negative). There is an interesting exchange in comments with “Steve” (who I suspect is a ‘Team Member’) who tries to justify this broken code by pointing out that “Harry Readme” picked the one bad data roach that caused this to crash the system out of the HadCRUT stew (but left in any other roaches not swimming on the top…)

    I’ve posted the specific “fix” needed to make this one specific issue go away along with a general description of how to properly handle the data “preening” for bogus values (see the comments).

    Did Harry ever figure it out?

    No. It’s still in the code (specific lines quoted in link). “Harry Readme” (I quote since it is not clear if Harry was the writer or the intended reader of the file) did find one very large wrong data item in the input and plucked it out (took the roach off the top of the stew…) but left the rest of the code and data ‘as is’.

    This means that any OTHER large wrong datum can still be squared and cause a bit overflow of the INTs into a negative number, but if it is a small enough negative number, the “sum of the squares” might stay positive (even if totally wrong and fictional) and thus give completely BOGUS output. (i.e. he left any roaches swimming below the surface of the stew still in place…)
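    The mechanism described above can be sketched outside of Fortran. A minimal Python illustration (not the CRU code; the data values and names here are invented) of how squaring a 32-bit INTEGER can wrap negative, and how a REAL running total preserves the broken value:

```python
# Sketch only, not the CRU code: squaring a 32-bit signed integer can wrap
# negative, and a floating-point running total preserves the wrapped value.

def square_int32(x):
    """Square x with 32-bit signed wraparound, as a Fortran INTEGER would."""
    v = (x * x) & 0xFFFFFFFF                 # keep the low 32 bits
    return v - 0x100000000 if v >= 0x80000000 else v

op_tot_sq = 0.0                              # plays the role of a REAL total
for datum in (1000, 50_000):                 # 50,000**2 exceeds 2**31 - 1
    op_tot_sq += float(square_int32(datum))  # wrapped negative goes in as-is

print(op_tot_sq)                             # -> -1793967296.0
```

    Any datum above about 46,340 (the integer square root of 2**31 − 1) is enough to wrap the square negative, which is why plucking out one known-bad value does not make the code safe.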

    Douglas DC (09:21:35) :
    hmmm (09:14:27) :

    It’s almost as if whoever wrote these notes wanted us to find them.

    Positively Freudian. I know enough about programming though no expert;
    just from my own limited experience the commands and data input are,
    to quote: “Crap crap”…

    Programmers can’t even get their bosses or work mates to read their code most of the time. They become conditioned to the idea that the only guy who will ever read what they put in the comments is the next poor sot to be assigned this bucket of warm spit to swim in… and that may be none other than themselves a year from now when they’ve forgotten some hideous bit.

    As a consequence, comments in code are often very blatant and very directly honest, in a cynical kind of way. After all, it’s just you talking to a future you… Saying ” Buck up, kid. I made it once. You can too… and until then, you won’t believe this stupid human trick: {cite} “

  288. rbateman (11:22:54) : Depends upon whether you can trust that the GHCN with Jones/Karl online now is not the mangled mess HARRY was tasked with, while supposing that the MasterDB will somehow miraculously appear in a pristine state.

    A light slowly dawns…

    Is GHCN the work product of NCDC and Karl?

    GHCN is thoroughly cooked by thermometer deletions on / about 1990:

    http://chiefio.wordpress.com/2009/11/03/ghcn-the-global-analysis/

    has the ‘by latitude’ analysis while the ‘by altitude’ is still in the right sidebar as the articles are still being done… I’ll add pointers to the ‘global analysis’ in the next day or two for the ‘by altitude’ studies showing mountains systematically deleted from the record.

    I thought GHCN was a NOAA product… then again, with all the constantly changing acronyms hiding which department is a subset of what other department, keeping it straight has been a bit hard… “Get your Org Chart, Get your OOORRg Chart here! Can’t tell the players without an Org Chart!!”…

    IFF GHCN is a synonym for “NCDC Data Set” then it’s all very very clear… both HadCrut and GIStemp are modest re-digestions of GHCN… no wonder they would all “agree”…

  289. Oh, almost forgot.

    In one of the two threads, the comment is made, something along the lines of ‘waiting for Steve M and Anthony W to release their personal e-mails’

    Couldn’t waste my time responding that neither are govt. funded, and as a result don’t fall under any FOIA requirements.

    You want to take the money, you have to play by the rules.

  290. Eric (11:26:56) : … From a s/w engineering perspective, it would have seemed wise to have used an existing DBMS that had been extensively tested and verified. … It would most likely have been quicker and more reliable to use existing software tools like DBMS.

    You, sir, are a professional and competent programmer (but you knew that already and don’t need external validation..)

    FWIW, that was exactly what I thought once I figured out that 90% of it was “glue data together and report it” and only about 10% was ‘make modest perhaps bogus changes’. It is also what some other folks have said in comments where we’re looking at the code.

    This whole thing could be sucked into an RDBMS with about 1/10th the effort…

    hitfan (11:28:11) :… I was working 14+ hours a day and being placed on 24/7 pager duty with no extra compensation (imagine getting called during your grandmother’s funeral or while in Church on Christmas Eve–yes it happened to me),

    And you sir are an underappreciated professional being ground down… but you knew that, too… Having carried ‘the duty pager’ 24 x 7 for 2 years “I felt your pain”…

    Yes, I confess that I programmed a spam dialer during the dotcom crash in order to pay the bills. And there were swear words in the source code! :) (and I don’t feel the least bad about it).

    Were I the “project manager” I would have had to officially waggle a finger at you during your review (and explain that you needed a pre-editor sed script to turn “Overlord” into “system” and “Snoring” into “….ing”, and only check the “Overlord / Snoring” version into the SCCS… ;-) ) Then I’d have taken you out for a beer or two and prescribed some ‘off time’ to get your spirits more positive and remove the ‘root problem’ of extreme frustration…

    g/Snoring/s//….ing /g (It’s a Linux/Unix thing…)

    The company is now dead and bankrupt and I danced a little jig when I found out about it a few years later!

    Once or twice I was sucked into some bit of code-whoring against my will and under contract… I’ve done a jig or two myself… But when you are a contract programmer and sales already signed the contract, it’s a bit hard to tell them to stuff it. The spouse frowns on suddenly moving into a tent…

    But seriously, having a set of “personal sed scripts” can work wonders… (sed is an automatic editor). I once did a script that let me write what looked rather like a gothic novel (yes, I was very bored at that site…)

    “Verily Overlord did slay and flog”

    Call Validation; call system (yz); if (true) execute (foo) and exitprocess.

    Yeah, it’s a bored programmer trick…

    But: It was really fun when folks would look over my shoulder and ask what the heck I was doing… then I’d type: runtest and the whole translate, compile, and run would happen… and it was great fun to watch them realize that the ‘stuff’ I had written really DID run 8-)

    Oh, and since the macros were much shorter than the expansion, I could crank out more total code per keystroke and it was actually faster to write, so I could even justify it rationally ;-)

    And yes, only the translation was actually submitted as final ‘work product’…
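    The “personal sed script” trick is easy to demonstrate. A minimal sketch follows; only the “Overlord” → “system” mapping comes from the comment above, and the filename and other substitutions are invented for illustration:

```shell
# Sketch of a "personal sed script": write in whimsical macros, then
# translate back to plain identifiers before check-in. Only the
# Overlord -> system mapping comes from the comment above; the rest
# of this (filename, other substitutions) is illustrative.
cat > novel.txt <<'EOF'
Verily Overlord did slay the rogue process
EOF

sed -e 's/Verily //' \
    -e 's/Overlord/system/' \
    -e 's/did slay/killed/' novel.txt
# -> system killed the rogue process
```

    Running the translation as a pipeline step is what lets the “gothic novel” source compile: only the sed output ever reaches the compiler or the source-control system.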

  291. E.M.Smith (14:50:54) :
    rbateman (11:22:54) : Depends upon whether you can trust that the GHCN with Jones/Karl online now is not the mangled mess HARRY was tasked with, while supposing that the MasterDB will somehow miraculously appear in a pristine state.

    A light slowly dawns…

    Is GHCN the work product of NCDC and Karl?

    Starting from a google of NCDC I drilled down their web site and then the ftp links until I ended up in a VERY familiar place… the directory where GIStemp downloads the GHCN data.

    So I’ve answered my own question: GHCN is the same as NCDC data set. And I’m already investigating it and finding it bogus.

    So many names and organizational perks; so little work that is really different or of value.

  292. Jean Bosseler (13:03:02) :
    This type of incident is clearly signaled with another error message and the programmer would know; at least I hope he knows what he does.
    A very, very negative number is still a number and not an overflow!

    Um, no. I’ve got sample code up showing no error message. It’s an integer overflow, nothing more. I also show the exact lines in the program that do it… See the “Hadley Hack” link a few messages up thread.

  293. politicoassassin (14:10:51) : 1) I don’t understand the technical issues referred to in the above notes

    Anyone who thinks they can draw a conclusion from it is just seeing what they want to see.

    No. Some of us, like me, do understand the technical issues. We can very easily conclude:

    1) Their code management stinks.
    2) They have ‘goal-seeking behaviour’ in their “science”.
    3) They have no QA and the code is buggy.
    4) The results are worse than worthless; they are a deception by design.

  294. It is very sad and disappointing to find highly distinguished professionals indulging in such unethical activities. Society expects better conduct from these seekers of truth. Falsehood and manipulations are expected to be foreign to these scientists, the cream of society.

  295. Bruckner8 (15:23:17) :
    I’ve made a good living as computer programmer,and I do stuff like that all the time. If you saw my code, you’d see comments like “The customer insists that I do this even though I know this make the outcome skewed in a [positive/negative] direction.”

    Oddly, the early parchment copies of things done by very dedicated monks sometimes have comments in the margin like: “I think this is sacrilege but I must copy the document faithfully”.

    Seems that scribes of all ages have had that in common… “The customer wants it this way, despite my protests, so I’m doing it, but don’t blame me…”

    If only the patrons were so bound to honesty and fidelity …

  296. Let’s be clear here. Climate change does take place. It always has and always will. A proper look into the past clearly shows this. However, CO2 has so very little to do with climate change. Nature has to do with climate change. Ask any one of these people how much cap and trade will reduce changes in the climate and listen closely to the answers they try to give you…if any.

    Climate change is natural; we must accept that, and then we can move past that idea to see through this cap and trade nonsense for what it is: an attempt to control, and to make ridiculous profits for some.

    Create a problem that does not truly exist and then manufacture a way to make money from it. This can also be observed throughout history.

    Al Gore stands to make billions, and that doesn’t account for the money used so far to fund this shady science premise.

    Still no MSM coverage save one little CNN video and FOX’s coverage.

    The biggest global money making scheme ever and based on what?
    In the long run this will make the bailouts look like chump change, and freedom will continue to erode.

    I wish someone could stop these globalist idiots.

  297. DaveE (15:41:04) : E.M.Smith is better qualified to answer this & I believe GISTemp does the same thing with station Temps. You note that Months with 9999 are ignored but what isn’t obvious from that is that there may only be one day of readings missing. The obvious thing to do would be to take the average of the adjacent days to salvage the month but they don’t, they just write the month off.

    GIStemp gets the data from NOAA (in what it turns out is the NCDC product of GHCN) already rolled up into a single “monthly” datum. One has to swim upstream to NCDC / NOAA / GHCN to find out how they decided to put a “missing data flag” in a month (-9999 or sometimes 9999 in some data sets and some steps of GIStemp).

    So to the assertion that a single missing day might cause a missing data flag, I can not speak (yet…) Heck, NCDC could choose to simply fill in any month with a ‘missing data flag’ if it fails their “QA Tests”. (They do something along those lines already). But by the time it gets to GHCN and thus to GIStemp there is no daily detail left.

    To the issue of “data creation”: It is RAMPANT throughout the entire GISS and HadCRUt process. There are so many holes in the data by time and by space that they have no choice but to pick one:

    1) Admit they have no hope of creating a “global temperature” for any significant length of time.

    2) Make up ‘temperature values’ for the 80% or so of time and space that are missing. (The southern hemisphere is substantially empty for the first half of the temperature record and is still remarkably blank. Everything within about 20 degrees of the North Pole is fabricated. Etc.)

    The links in: http://chiefio.wordpress.com/2009/11/09/gistemp-a-human-view/

    cover it pretty well in detail, while being readable at the top level by anyone.

    Especially see the graph here:

    http://chiefio.wordpress.com/2009/02/24/so_many_thermometers_so_little_time/

    and the coverage charts here:

    http://chiefio.wordpress.com/2009/11/03/ghcn-the-global-analysis/

    Talking of E.M. Smith. I believe he’s fixed the -ve sum of squares problem, though I’ve not been over there to find out.

    Yes, I have. A couple of different ways :-)

    It’s a ‘square of integers’ (which can have overflow) problem. There is a commenter “Steve” who asserts it was just a single bad data item and that “Harry Readme” removing it is all it takes to “fix it”. Totally insufficient. There was one bad data item big enough to cause an integer overflow, but there could just as easily be others that did not cause a crash like the one that led “Harry Readme” to pluck that bad datum from the set. (i.e. there could still be bogus values not yet found).

    There are 3 levels of fix:

    1) Range check in the program (i.e. catch broken large data before it causes an overflow).

    2) The “square of INTEGERs” gets stuck into a floating point number (so an implicit “cast to float” is done). Just change the INTEGERs into FLOATS before the squaring (“cast to float” first) and you eliminate the overflow (IEEE-compliant floating point math does not overflow) though you might still have wrong, too-large data in your input. Not strictly needed if range checking is perfect, but a nice bit of robustness anyway. Belt and Suspenders, don’t you know…

    3) Write a “preening” program to check for insane data in the input file prior to running. This can be more detailed than the basic range checks in the program itself. (I.e. the program might just check temps between -90 and +60 C, while the ‘preening’ step might assume even 0 C was too warm and wrong at the South Pole while -89 C was possible, yet at the equator might accept nothing below 10 C unless at altitude, for hot countries.)

    This lets you run the ‘preening’ as a distinct step for debug, data quality report and assessment, efficiency, etc.
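    The three levels can be sketched in a few lines. This is an illustration in Python rather than Fortran; the range limits and names are examples from the description above, not the actual HadCRUT values or code:

```python
# Sketch of the three-level fix described above (illustrative, not the
# actual HadCRUT code): 1) in-program range check, 2) promote to float
# before squaring, 3) a separate "preening" pass over the input.
TMIN, TMAX = -90.0, 60.0          # broad in-program sanity range, deg C

def preen(values):
    """Level 3: a standalone pass that drops insane readings before the run."""
    return [v for v in values if TMIN <= v <= TMAX]

def sum_of_squares(values):
    """Levels 1 and 2: range-check each datum, square in floating point."""
    total = 0.0
    for v in values:
        if not TMIN <= v <= TMAX:            # level 1: catch bad data early
            raise ValueError(f"out-of-range datum: {v}")
        total += float(v) ** 2               # level 2: square as float
    return total
```

    The redundancy is deliberate: even if a bad value slips past the preening pass, it either trips the in-program range check or, at worst, squares in floating point without wrapping negative.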

  298. Jeff C. (20:21:17) : Thanks for your other comments regarding the code. Anyone know if there is a thread somewhere dedicated to the CRUt code? It would be nice to pool information without having to wade through 300+ comments.

    I’ve got a little one going. Started with the one program in the URL, but there is a link to an online archive of the file structure with all the code populated and the “guidance” is to put the link for a particular program you want to comment about in your comment and then add your observations.

    If anyone knows of a ‘bigger discussion group’ doing code review, feel free to add a pointer to it in a comment.

    http://chiefio.wordpress.com/2009/11/25/crut-fromexcel-f90-program-listing/

  299. Raredog (00:15:14) : The biggest problem though is that raw temperature data is increasingly hard to access. NASA GISS Temp had raw figures from around the world presented numerically and as graphs, which made this an easy exercise but they took those pages down some time ago (they could be up again).

    GIStemp does not produce nor take in “raw” data. It takes in GHCN (aka NCDC already adjusted and massaged data) and produces more massaged anomaly maps. You can download the GHCN dataset (aka NCDC data) from their FTP site, but they have deleted close to 90% of the cold thermometer records (for recent years only, leaving them in the older baseline periods…).

    See:

    http://chiefio.wordpress.com/2009/11/03/ghcn-the-global-analysis/

    that includes a link to the NCDC / NOAA / GHCN ftp download.

    That’s as close to “raw” as I’ve been able to find so far. They do have some other data sets on the NCDC site that may include more (there are some daily values for example) but I’ve not had time to wander through it all.

    See:

    http://lwf.ncdc.noaa.gov/oa/climate/surfaceinventories.html

    Happy Wading …

  300. Rabe (02:54:21) : Floating point numbers don’t “overflow” this way. They just stay at (+)INF in case of about 10^300 even in fortran (real*8).

    They didn’t square a float, they square an INT then stuff it into a float. The INT overflows to a negative on the squaring…

    from down in the comments of:

    http://chiefio.wordpress.com/2009/11/21/hadley-hack-and-cru-crud/


    Further, to your assertion that changing a bad data item “fixed it” and “the code now works”: It does not. It is just as broken as it ever was. From the code:

    integer, pointer, dimension (:,:,:)             :: Data,DataA,DataB,DataC
    ...
    real :: OpVal,OpTot,OpEn,OpTotSq,OpStdev,OpMean,OpDiff,
    ...
    OpTotSq=OpTotSq+(DataA(XAYear,XMonth,XAStn)**2)
    

    The “square an INT” and stuff it into a REAL running total is still there.

  301. My apologies, Glen, i meant to say you’re a liar by your own standards AND everything that happens to you is somebody else’s fault, poor baby.

    “Harry” is commentary on the same source code. It is no different to the comments directly in the code that have been discussed ad infinitum above.
    Eric was perfectly within justifiable bounds to suggest the comments might just be indications of opinions on the work, not implications about its purpose, hidden or otherwise, even if that isn’t the general opinion of the board.

  302. matthew (23:52:22) :

    “My apologies, Glen, i meant to say you’re a liar by your own standards AND everything that happens to you is somebody else’s fault, poor baby.

    “Harry” is commentary on the same source code. It is no different to the comments directly in the code that have been discussed ad infinitum above.
    Eric was perfectly within justifiable bounds to suggest the comments might just be indications of opinions on the work, not implications about its purpose, hidden or otherwise, even if that isn’t the general opinion of the board.”
    *********************

    You’re not in a position to determine what is justifiable here, after breaking etiquette rules, making rude unfounded accusations followed by intentional misrepresentation, further obfuscation and misrepresentation… you’ve got it all, warmer. Sounds like you’d fit right in with those in the email files.

    Eric made incorrect, misleading and preposterous statements. One was that Mann had “grafted” the instrumental record on his graph, defending what he tried to characterize as a valid technique.

    Eric said posters were referencing comments IN the code that “indicate frustration at the large data set”. There are no such comments in the actual code.

    Eric claimed the comments IN the code “clearly don’t indicate a fraud”, but clearly the comments in the actual code DO indicate fraud.

    This was clearly an attempt to “hide the fraud” seen in the code, by use of the strawman “they indicate frustration with the database”. A tactic not unlike that used in Eric’s attempt to legitimize Mann’s hockey stick.

    Now to you.

    You claim that the comments in Harry are “no different” than comments “directly” in the code. But general comments in a programmer’s log are not the same as specific comments in source code. It IS “different”.

    You claim Eric is justified in suggesting the comments do not indicate fraud. But they do. The comment about databases does not, but then again that comment is not a source code comment but a “Harry” comment. You tried to characterize it as “source text”.

    See the pattern between your argument and Eric’s, matthew?

  303. Climategate

    Let’s see:

    a) subverting the peer review process
    b) stacking the UN IPCC
    c) obstruction of the Freedom of Information Act
    d) breach of university and state ethics codes

    … and we haven’t even talked about the data yet.

    Climate Science – the new Ponzi scheme!

    p.s. – Is this what Science is all about? Meet the new boss (science), same as the old boss (religion). When are they issuing funny hats to scientists?
    p.p.s. – Who needs Wall Street when you have Science?

  304. Looking at the Harry Read Me files online,

    http://di2.nu/foia/HARRY_READ_ME-0.html

    it appears they incorporate other (likely huge) data files. I can imagine that they had so many chances to corrupt their data, if that were their intent. A scientist with a warming agenda could presumably:

    core a strip bark tree where an obvious, recent bulge on one side would yield wider rings.

    select (from many cores) the ones with visibly higher density in modern periods.

    “analyze” the ring widths with a prejudice to fudging the numbers.

    pay off the data entry person to accidentally show more weight in the 20th century.

    hire a data programmer to write the codes, presumably one who is not squeamish about such things, or disgruntled about his job…

    I don’t know if that characterizes this fellow or not. Did he want someone to read this stuff?

    Has this been analyzed by an expert data programmer to fully explicate what was done? (Some sort of comparison of graphs which were created here – contrasted to what would have been created without fudging?)

    And (I suppose) the naive question of the year: is there any uncorrupted data still available?

  305. OK people – you have to understand the problems tree ring researchers are faced with to understand what this program does. This is called understanding the ‘problem domain’.

    MXD = maximum latewood density

    It is a known fact that tree ring width is a function of temperature. However, it is also a known fact that as a tree ages, the density of tree rings begins to DECLINE after the tree reaches a certain age. This presents a problem if one is to use tree rings to determine temperatures from hundreds of years ago.

    A way around this problem is to take tree ring data from trees that have grown recently (ie. last hundred years) and compare it to actual, observed and measured temperatures, where the tree was alive during the years where temperatures were measured.

    You can see where the decline in tree ring density starts and compare it to actual temperature series.

    Armed with this new data, you now have a way to ‘calibrate’ your analysis of older tree rings, perhaps hundreds of years old.

    It would appear from some of the programmer’s comments that the decline in tree ring density in the samples that he has corresponds to the year 1960, 1940, etc. So he has to ‘hard code’ his calibration technique for tree rings after that date, so they reflect actual observed and measured temperatures.

    This gives you the ability to look at the tree density of rings from hundreds of years ago, and know that it represents a certain temperature, because you’ve observed the same tree ring density that corresponds to actual measured temperatures.

    It’s valid science, and it’s logical and reasonable.

    Everyone here is taking this WAY out of context. Without seeing the program in its entirety, it’s impossible to put in the proper context and determine exactly what it does.

    Again, he’s not hiding a decline in temperatures, but a decline in tree ring densities!!!
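    The calibration step described here amounts to a regression over the period where proxy and instrument overlap. A toy sketch of that general idea (all numbers invented; this is not CRU’s actual procedure, and it deliberately sidesteps the disputed post-1960 truncation):

```python
# Toy sketch of proxy calibration, not CRU's code: fit ring density against
# instrumental temperature over an overlap period, then read temperature
# off a pre-instrumental density. All data here are invented.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

density = [0.50, 0.55, 0.60, 0.65, 0.70]     # overlap-period ring densities
temp    = [10.0, 10.5, 11.0, 11.5, 12.0]     # matching measured temps, deg C

slope, intercept = fit_line(density, temp)
old_temp = slope * 0.58 + intercept          # apply fit to an old ring
print(round(old_temp, 2))                    # -> 10.8
```

    The mechanics themselves are uncontroversial; the argument upthread is over what to do when the fitted relationship visibly breaks down in the modern data.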

  306. treeringer (20:24:35) :

    Without seeing the program in its entirety, it’s impossible to put in the proper context and determine exactly what it does.

    Really?

  307. I used to work in the parallel scientific computing area and I wish I could still be there but one of the things that really bothered me was how poor the code quality was. What I saw was nothing compared to the Harry description. I don’t believe that Harry was trying to cook the books. (The hide the decline stuff is another issue.) Harry was trying to do the right thing, which was to generate an accurate model, but he was set up to fail. He had a pile of numbers and a vague description of what they represented. He was making hundreds of assumptions about the data. They aren’t all correct. So the resulting model is questionable. It might show things worse than they really are or better. We have no idea.

    Before we change the world’s economy it would be prudent to do a rather large data and code review.

  308. Speaking of tree ring proxies, the ONE thing Prof. M Mann’s bristle-cone proxied “hockey stick” study proved was that nothing has done more to GREEN (verb) the planet over the past few decades than elevated levels of atmospheric CO2 in the presence of moderate sun-driven warming.

  309. treeringer (20:24:35) :

    “It would appear from some of the programmers comments… It’s valid science, and it’s logical and reasonable.”

    So, subjective and arbitrary revisions, for which the only documentation is what one may glean from what “appears” to be in computer code comments, based on some low level computer jockey’s eyeballing of data are “logical and reasonable” “valid science”?

    Not where I come from, Bub.

  310. Nearly all the comments here are making the unwarranted assumption that if the data were 100% complete and needed no interpolations or adjustments for factors such as the well-known tree ring issue (explained above by treeringer at 20:24:35), then it would show no warming.

    Except for paranoia, there is no particular reason to believe that.

    It is equally possible that if the data were complete — and so complete that there would be no need to use indirect proxies for temperature such as tree-ring thickness (which inherently require calibration) — we would see even more clear evidence of global warming. There are numerous indications of global warming completely independent of these temperature records.

    Because this is such an important issue, the proper response is not to deny global warming, but to demand a substantial increase in expenditures to collect better data and more thoroughly and carefully analyze what we have.

    The real scandal revealed by these code comments is that analyses of extreme importance to the world are ridiculously underfunded.

  311. “he’s not hiding a decline in temperatures, but a decline in tree ring densities!!!”

    Sure treeringer, whatever. I guess there are documented formulas, widely used within the specialty, for adjusting the density numbers? Why then would the programmer use the word “artificial”?

  312. Adjusting the density number for the age of the trees, I mean. This seems like a phenomenon that would be known and measured and calculable.

  313. jm (08:49:56) :

    The real scandal revealed by these code comments is that analyses of extreme importance to the world are ridiculously underfunded.

    You’re kidding right?

    But let’s say for the sake of your argument that it was underfunded.

    Isn’t that the usual state of affairs for a “settled” science?

  314. Lies, damned lies, and statistics.

    The falsification of data and the conspiracy to commit same etc. constitutes serious criminal activity. Further, the granting of public funds for research warrants a federal investigation. I’m hoping the perpetrators, including possibly Professor Michael Mann, director of Pennsylvania State University’s Earth System Science Centre and a regular contributor to the popular climate science blog Real Climate, and their facilitators will be tracked down and prosecuted to the fullest extent the law allows. — Michael Santomauro, Publisher of “Debating The Holocaust: A New Look At Both Sides” by Thomas Dalton

  315. Harry’s a hero. He may not have intended to, but he’s done the world a huge favour – without publicly verifiable data, the climate change debate is meaningless.

    Enough crap computer modelling, let’s concentrate on getting decent data.

    And yes, some of us are verbose programmers, especially after a long frustrating weekend trying to clean up some other idiot’s mess, while they’re at the beach.

  316. After reading through Harry’s notes, I had a lot of sympathy for him. It’s not easy puzzling through a mess like this. It’s worse when you conclude your own organization’s data is unusable.

    It’s easy to conclude his predecessor, apparently somebody named Tim, was responsible, but I couldn’t help but wonder if Tim was in a similar position as Harry.

  317. “I know enough about programming though no expert,
    just from my own limited experience the commands and data input are,
    to quote: ‘Crap crap’…”

    Anyone who could write “the commands are crap” knows NOTHING about programming.

  318. I have worked as a programmer or other kind of software engineer since 1982, so I am quite familiar with what goes on in software development. On that basis I can tell you all with great confidence: you are ALL reading HARRY_READ_ME out of context. There is no justification for claiming that this exposes man-made global warming as a fraud. Likewise, there is no justification for claiming that this shows that all the data on which the theory is based is junk.

    Rather, most of the posts in this blog are a frenzied exercise in the fallacy of quoting out of context.

  319. As I said on http://www.climateaudit.org/?p=2530#comment-183123

    “No professional computer programmer can take the AGW proposition with its forecasts seriously when these things are based on the most fallible of all man’s creations – the computer program; especially when the underlying programs are coded by non-professional programmers and not independently audited.
    All this discussion on tree rings, SST, Ice Cores, Satellite data etc. is minor in comparison to that simple observation. (It’s still fascinating though).”

    Oh alright! So I am having a smug “told-you-so” moment:-)

  320. Nice work here folks, very nice. It’s tragic that we can’t get the MSM to cover this kind of news. They do a horrible injustice to our country with their blatant bias.

    Hopefully this evidence of fraud and malfeasance can still get out and stop Obama from passing the absurd Cap and Trade law.

  321. Once again many people fall for the selective-data trap. Computer programmers and scientists will often feed artificial data into their code to test the response under a known set of conditions. Deliberately highlighting small portions and comments of many different programs does not accurately represent the total. If you would like to gloat and say “I told you so,” go ahead: download the data, write the code and analyze it yourself. There is nothing here to warrant any kind of discussion.

  322. I am a theoretical physicist and I am not as surprised as most of the readers seem to be. This is taken out of context, and the only conclusion that I can draw is that we need more data.

    Grid and interpolation problems that the poor guy mentions simply cannot be avoided when one deals with real-world data – and sometimes the only way to fix the fitting functions is to insert points by hand. That adds a small uncertainty to the end result, but it’s the only way to do it.

    Also, you have to understand that you feel very frustrated when you work nights and weekends on your projects and code. I also write all kinds of comments in my personal log files. Science is difficult. If we had all the data and knew all the explanations there would be no point in doing research. There is a reason why it’s called ‘climate science’. By the time things get into textbooks, countless hours of work have been done by researchers. And even then it’s not perfect. This particular British group is not the only one in the world dealing with the global warming problem. It’s worrying because there exists a global consensus among researchers that climate change is real and human-made. Even if Jones’ group is inventing data in such a way as to make it more dramatic, it will be contradicted by the rest of the scientific community and it will affect their reputation.

    My point: based on this log alone there is little to conclude except that Harry is frustrated and that more data is needed. Luckily, there are currently several international projects in progress that will dramatically increase the amount of climatological data in the near future.
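The hand-infilling of gappy data that the physicist above mentions can be sketched in a few lines. This is a generic toy – my own linear interpolation between valid neighbours, not CRU’s actual gridding code; the `infill` routine and its data are illustrative assumptions:

```python
# Toy infilling of a station series with missing readings (None gaps).
# Missing points are filled by linear interpolation between the nearest
# valid neighbours; this keeps the downstream fitting machinery running,
# at the cost of a small added uncertainty.
def infill(series):
    """Replace None gaps with values interpolated from neighbours."""
    out = list(series)
    valid = [i for i, v in enumerate(out) if v is not None]
    for i, v in enumerate(out):
        if v is None:
            left = max((j for j in valid if j < i), default=None)
            right = min((j for j in valid if j > i), default=None)
            if left is None:        # no data before the gap: carry back
                out[i] = out[right]
            elif right is None:     # no data after the gap: carry forward
                out[i] = out[left]
            else:                   # interpolate linearly between neighbours
                frac = (i - left) / (right - left)
                out[i] = out[left] + frac * (out[right] - out[left])
    return out

print(infill([10.0, None, None, 16.0, None]))  # [10.0, 12.0, 14.0, 16.0, 16.0]
```

The point is not that interpolation is sinister – it is routine – but that every filled point carries an uncertainty the final product should account for.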

  323. http://www.sciencedaily.com/releases/1998/08/980814065506.htm

    Why discount tree data after 1960? That seems to be the crux. They say tree rings started *not* matching direct temperature readings. Maybe because CO2 was distorting tree ring growth versus the previous hundreds of years?

    And of course they used tree data because it goes back thousands of years, while direct measurement, in any reliable sense, covers probably only 200 or so, and that in limited locations.

    Sunshine into the process is a very good thing. If this model is worthless it doesn’t mean climate isn’t being changed– just means this model doesn’t actually speak to the issue. There should be no sides here — it’s a serious concept and deserves science.

  324. I would like to second MG’s point: scientists are always throwing out certain bits of data as nonsense, and sometimes (like Einstein did) they think their model is right and the “data” wrong. Data is always already theory-laden. There may or may not have been some malfeasance here, but the view that “the data” is always king is just naive.

  325. If you would like to see some REAL climate data charts look at:

    http://noglobalwarming.info/CanadianClimate.html

    I have plotted data available from Environment Canada for a bunch of cities in Canada, most from 1940 to present. There is a constant trend (no warming or cooling) from 1940 – 1980, a small rise from 1980 – 2000, and then a decline since then. Real data does not show hockey stick patterns or any correlation to CO2 levels.

  326. John, please look at where and when you have graphed temperature data. It is only in Canada, and it is only from 1948 to 2008. Sadly, I feel as though you do not understand the concept of global warming and climate change. Global warming is considered to be a global phenomenon, not just a point in the northern hemisphere. If you had even looked at your graphs, the only ones which show cooling are the summer averages of Canada and the graph for Vancouver (the farthest south). You are only looking at the summer averages, which is probably less than half of the whole dataset.

    Also, your graphs seem to support the theory of the Polar Amplification Effect (PAE), a byproduct of global warming whereby the poles tend to feel planetary trends quicker than the global response. You can see this in the Resolute Bay data (closest to the poles), which from 1948 to present shows a 1.5 degree increase.

    Your graphs also cover only a short period of the time during which humans have been adding CO2 to the atmosphere. A more appropriate period would be from the start of the industrial revolution (end of the 18th century to the start of the 19th century) to the present, since this is when we started adding CO2 to the atmosphere. Also, please look up what the hockey stick graph actually is: it covers the past 1000 years to the present, not the 70-ish years you have plotted. If you can come up with a better way to reconstruct temperatures from before accurate records, please share it with the rest of the world, since the best and the brightest apparently couldn’t. Please do ask more questions and read some papers on climate change before you make such sweeping statements.

  327. The sad thing is that, even though these emails ultimately amount to a non-story, denialists will be citing them falsely for years, just like Limbaugh’s volcano fabrication. They read the snippets and quotes and completely ignore the fact that the ellipses on either side are placed very strategically by people who want the emails to be considered scandalous.

  328. My technical writing professor said it correctly: “Question what you read, because you can manipulate statistics till your hand gets sticky.”

  329. From these notes it sounds like the programmer is struggling with (1) messy data, and (2) a number of necessary modeling assumptions (whether accurate or not, based on the literature) being applied to the data. These are universal issues facing any statistical programmer/researcher.

    Taking these out of context without looking at the whole is very misleading. I had a programmer working for me become self-righteous and convinced that I was up to no good, because they didn’t understand the assumptions and model tweaks that the rest of the research literature suggested and other analyses required.

    That said, all of this should be open to cross-validation and replications. All assumptions made transparent to peer-review.

    Imagine if we could see the ‘clean-up’ procedures that big-Pharma uses internally to support approval of a billion-dollar drug…no pressure on an analyst there to throw out an outlier!

  330. While getting my MBA my statistics instructor presented the class with a cute little pocket-sized book with the tongue-in-cheek title, “How to Lie with Statistics.” I have since lost that book, to my dismay. From what I can fathom, it appears CRU used it as their business model. Am tempted to ask if I could have/purchase one of their copies.

  331. The village idiot of the day is Robert Gibbs. Even with all of the evidence from Climategate showing that temperature records were manipulated, Robert Gibbs proclaims climate change is settled science. What a doofus. You would think that guy has been locked in a closet for years.

    Robert Gibbs, let’s see what happens if your boss pushes to pass Cap and Tax, or goes to Copenhagen and signs a treaty for us to pay other countries trillions of dollars for an issue that is not determined to be an issue after all.

    Thanks to the Wall Street Journal for being about the only mainstream press outlet investigating and giving us insight into the issue.

  332. I am a programmer. I have been an amateur meteorologist my entire lifetime. Global warming or climate change associated with man has been and will always be a complete and utter farce. The emails can be debated… so be it. The program cannot. It is exact, and it indicates a complete lack of ethics and professionalism. If this program were used in a pharma or healthcare application, patients would die harsh and cruel deaths, nearly immediately. The programmer, the company and the industrial complex that promoted the application would be sued out of existence and the executives sentenced to long stints in prison. I think that the same is due these folks that have been playing loosely with an unsuspecting ‘global’ public, to the order of fraud and racketeering on the public dole. Get ’em!

  333. It’s SO clear that AGW is one of the biggest frauds ever perpetrated in the name of ‘science’. In this blog, the professional programmers and the Alarmist Loons are given equal space. I suppose that’s fair, but not very productive. Anyone with a working brain can tell who’s who.

    No one has yet mentioned Michael Crichton’s ‘State of Fear’ (HarperCollins, 2004). Sure, it’s an adventure novel, and certainly not ‘science’, per se. Yet the meticulous research that went into it (thoroughly documented in the bibliography) makes it clear that the whole AGW enterprise is ‘agenda-driven’ by large money and political power.

    Peer-review can now be seen for the farce that it is: ‘my peers are the people who pay for the garbage I produce, and share in the government largess which is our just reward’.

    I’m sickened. A trillion dollars for this insanity?

    The most pessimistic posts are probably right: it is too late to stop the train. The ‘science’ is ‘settled’. And Science is the big loser. We may as well go back to witchcraft, astrology, and eugenics.

  334. The site referred to by ‘aceandgary’ is DEFINITELY a must-see. It uses MSM sources to ‘prove’ that ‘the science is settled’.

    What is much more disturbing is the agenda, revealed for all to see: AGW alarmists are shown to be creatures of the political Left – globalists, socialists, the same ‘share the wealth’ folks that promised us ‘Change’.

    ‘From each according to his ability, to each according to his needs’. Chilling. The fact that otherwise well-intentioned people could still believe this isn’t surprising, but it speaks to what has become of education. The Orwellian nightmare has become reality.

    ‘If you have been through grad school you will realize scientists are always after the truth.’ So naive it’s not worth comment. Maybe ‘aceandgary’ meant GRADE school.

  335. Google is up to speed now: type in ‘cli’ and the FIRST auto-suggest is ‘climate-gate’ – 10,400,000 hits!

    And Real Climate is terrified! They’re so worried, they feel the need to refute Crichton’s FICTION.

    I’m starting to love carbon dioxide; my houseplants are getting Greener. Credit goes to Al Gore’s Information Superhighway.

    Keep it up; they’re AFRAID now!

  336. Novice question: why does everyone assume the code notes are the originals and not added later by the hacker/whistleblower to really grab attention to certain parts, for people like you?
    Stupid question? Can you change those notes without affecting the program?

  337. JPNJ (18:27:48) :

    Novice question: why does everyone assume the code notes are the originals and not added later by the hacker/whistleblower to really grab attention to certain parts, for people like you?

    Because had that been the case UEA would have been screaming that fact from the rooftops. They have maintained a deafening silence on the Harry file. Something which tells me it is a bona fide piece of code.
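As for the second half of the novice question – can the notes be changed without affecting the program? Yes: comments are discarded before code runs, so editing them cannot change a program’s behavior. A small Python illustration (the principle is the same for the Fortran and IDL files in the archive; the function names here are my own):

```python
# Two versions of the same function, differing only in their comments.
def original(x):
    # stop in 1960 to avoid the decline
    return x * 2

def edited(x):
    # a completely different comment
    return x * 2

# Comments never reach the compiled form, so the bytecode is identical:
print(original.__code__.co_code == edited.__code__.co_code)  # True
```

So tampering with comments alone would leave the programs running exactly as before – which is why the comments matter only as evidence of the programmers’ intent, not of the code’s output.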

  338. The computer program operates upon temperature or proxy-temperature data, as I understand this. Suppose one input data consistent with the temperature having been constant for the last 10,000 years. Would this program produce an output indicating a constant temperature, a rise in temperature, or a decline in temperature over the last 10,000 years? If the code is released, then it may be possible to assess whether it has built-in biases and whether these are significant.
    Also, suppose that instead of rejecting tree rings after 1960 as a valid proxy for temperature, the correlation between tree growth and measured temperature post-1960 was used to calibrate the previous tree rings; what effect would this have upon inferred historical temperatures? Surely, if tree rings are no longer valid as proxies for temperature, then their claimed validity as proxies for previous times is not necessarily true. Likewise any other proposed proxy.
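The constant-input test proposed above is straightforward to run against any candidate pipeline. Here is a toy version, with a plain least-squares calibration standing in for the real reconstruction code – the routine and the data are my own illustrative assumptions, not CRU’s:

```python
# Sanity check: feed the calibration/reconstruction step data consistent
# with a constant temperature and verify that no trend is manufactured.
def calibrate_and_reconstruct(proxy, temps_overlap):
    """Fit proxy -> temperature on the overlap period, apply to full series."""
    n = len(temps_overlap)
    px = proxy[-n:]                      # proxy values in the overlap period
    mx = sum(px) / n
    my = sum(temps_overlap) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(px, temps_overlap))
    sxx = sum((x - mx) ** 2 for x in px)
    slope = sxy / sxx if sxx else 0.0
    intercept = my - slope * mx
    return [slope * x + intercept for x in proxy]

proxy = [10 + 0.3 * ((i * 7) % 5 - 2) for i in range(200)]  # wiggly, trendless proxy
temps = [15.0] * 50                                         # constant "instrumental" record
recon = calibrate_and_reconstruct(proxy, temps)
print(max(recon) - min(recon))  # 0.0: an unbiased pipeline manufactures no trend
```

A pipeline that returns anything other than a flat line from constant-temperature input has a built-in bias, which is exactly what releasing the code would let outsiders measure.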

  339. Why all the complexity? I’m not a scientist, but I’d invest most of my time in calibrating the proxy data to the instrument data and then I’d just plot the proxy data. Any other thought process mixed into the method just skews the results in the direction of their imagination.

  340. This is all SO FRAUDULENT.

    All of these ‘warmers’ who keep diverting the topic to the ‘heinous act of stealing emails’ need to go outside and jump in front of a truck.

    THE REAL ISSUE: Why did all of the Climate-Scientists react like they were guilty if they weren’t? Caught with their hands in the cookie jar.

    THE REAL ISSUE: If the data is so accurate then shouldn’t it stand the test of scrutiny in an open investigation? We all know now that these scientists would not provide this data to other scientists who wanted to check the validity of the ‘Consensus’, because they had already made their minds up that such people were ‘deniers’. Maybe they can find the data and we can now get this scrutiny properly started for the first time in 30-odd years.

    THE REAL ISSUE: Why did all of the Climate-Scientists firstly admit that the emails were theirs and then do a backflip… I know, how about: legal advice was given to them. That’s cool, but once again, WHY DID THEY ADMIT FIRSTLY THAT THESE EMAILS WERE THEIRS? If they were fabricated, then the scientists would have categorically denied them.

    THE REAL ISSUE: How does a climate institute responsible for the largest chunk of climate modelling being used by almost every political body on Earth suddenly lose all of the modelling data for climate models that are being used to structure the biggest ever re-organization of world economics imaginable? Just this fact alone should mean that all of the Copenhageners should CANCEL THEIR VISITS, and start asking questions about why on Earth they were relying on such an institute in the first place to give professional input if it can’t even store and manage its data.

    There are many REAL issues out there, but why do the ‘warmers’ ignore logical proof of manipulation to deceive, in order to protect their egos only?

    You people should learn to admit when you are wrong, and if you are involved in the coverup then there are people such as me out there that will now not stop until you are all brought to justice and the extent of this world government conspiracy is known to the public.

    ENOUGH IS ENOUGH. Copenhagen had better come and go, because if any of these stuffed shirts sign anything there will be hell to pay.

  341. Folks, I am working for a company that has maintained accurate, independent temperature records for at least the last 80 years across multiple sites worldwide, and a single trend is clear: it is downward – the temperatures are dropping, so much so that they are looking at planting nearer the equator in both hemispheres.

    One lot of monitoring gear ran nice and quietly for twenty years with no modifications, religiously reporting a decline in temperature, quietly tucked away in a corner of a national park – free from any man-made sources of heat.

    This data is not unique – any company that derives its income from planting would also have good records. Why can’t NASA and the UK idiots at least release their data for public scrutiny? Because it has been thoroughly doctored in order to support their beliefs. This is just the most blatant disregard of their own scientific profession – it seems proof is simply not required to support their political and monetary goals, but the newspapers just don’t seem to care.

    Thank god for the US Senate and the few people around the world who just won’t take the media crap!!

  342. I never write parts of programs or comments when I don’t use them. That’s too much work. Fraud – plain and simple.

  343. These comments don’t look too bad to me. I work in software development myself, and this all looks innocent enough. Furthermore, if it is really possible to make a case that tree ring behavior changed past 1960, I don’t see the harm of swapping out data with ground stations that – after all – are supposed to be even more accurate.

    Crap like this shows up all the time in real coding work, and especially in more math-heavy code produced by universities and engineering companies. Honestly I don’t understand all the “developers” who keep saying how evil this is; this sort of sloppiness happens all the time. Sloppy code doesn’t equal inaccurate code; a sloppily written program may run just as well as one coded better.

    It’s not that sloppy programming always produces incorrect results (though it can); it simply produces programs that are hard and expensive to maintain.

  344. I should point out that certain industries do have much stricter standards, mostly related to ensuring the behavior of software is always completely predictable and reliable; e.g., the software that runs trains, medical equipment, etc. IIRC this is mandated by law, and as far as I know hasn’t proliferated past those few industries as yet (mostly due to lack of cheap analysis tools I think).

  345. This is horrific.

    The scientist(s) have told the programmer results they want. He has worked all weekend to adjust the data to produce the “correct” results and hide the cooling after certain dates.

    This is not just the smoking gun: this is the bullet in motion being scored by the barrel of the gun.

  346. ——————-
    CITE: SidViscous (09:33:44) :

    “Somebody should Slashdot this, it is exactly the sort of thing the programming geeks there will understand and appreciate. They may also be able to provide valuable insight.”

    I’ve been watching Slashdot for this story to come up, obviously this is just the sort of thing that SHOULD show up there.

    It won’t. Slashdot is filled with pro-AGW types, and controlled by them. I wouldn’t be surprised if it never shows up there at all.
    —————————-

    hahaha

    “it is exactly the sort of thing the programming geeks there will understand and appreciate” and “Slashdot is filled with pro AGW types”

    do you get it? oh please this is so ridiculous!

  347. If you read the code and have any practical programming expertise at all, and if you read all the emails, these documents themselves have little incriminating evidence. The issue isn’t really whether or not they altered the proxy data, it’s whether or not the historical temperature data they used was accurate. Right now there isn’t any damning (in a legal sense) evidence in that regard.

    There are unresolved issues in AGW (e.g. is the water vapor feedback really as huge as the models suggest, what about the data the CRU lost, etc) but I believe the false furor these documents have spawned has really hurt the credibility of global warming skeptics. If you have any sort of expertise at all, it’s obvious the documents themselves aren’t nearly so damning as they appear on the surface.

    In the end, I hope investigation is done into the real inconsistencies, but I worry these documents have damaged the chances of that happening (if I were prone to conspiracy theories I’d say that was why the CRU wasn’t more active in denying the validity of the documents, to lure skeptics into shooting themselves in the foot, but I doubt the decision was that simple).

    The reality is that global warming is happening; the real question is how fast. I don’t have a stance on this myself (I’m not a scientist, nor do I have access to the right data), but I’m hoping we have enough time that we don’t have to sacrifice world economic growth to fix this problem.

    Interestingly enough, water vapor levels in the atmosphere are the real driving force here; it’s widely believed that the relatively small amount of warming from CO2 causes ocean water to evaporate faster, producing a feedback where the warming temperatures cause more and more water vapor to enter the atmosphere (water vapor is a huge greenhouse gas). The alleged validity of this theory is the defining factor on whether global warming is truly catastrophic or not.

  348. If I wrote code like this I would be facing jail time (I am a financial programmer).

    This is criminal.

  349. Joeedh

    Feel free to disagree with any or all of my points.

    1. Tree rings were used to reconstruct past temperatures
    2. In order to do that one must calibrate against known temperatures
    3. Briffa produced a reconstruction that showed that there were serious calibration issues.
    4. Jones & Mann used a “trick” to “hide” this fact

    To the extent that any money was obtained by Jones and/or Mann based on the “trick” (given that they knowingly hid facts that were at least pertinent, even material, to the issue at hand) then that is *obtaining financial advantage by deception*

    I’m pretty sure that’s a crime in most countries around the world. Certainly is in the UK (although it may be called something slightly different).

  350. joeedh, with all due respect, you state that “I don’t have a stance on this myself (I’m not a scientist, nor do I have access to the right data)”, then you confidently propound theories regarding natural feedback loops, the existence of a trend in global warming, and whether this file is damning or not. Allow me to assume that you are also not a programmer.

    As someone with a PhD in Chemical Engineering, I at least have a solid grounding in scientific programming (10+ years) and the scientific method (15+ years). I do not presume to propound theories on the science involved (I only maintain a healthy scepticism regarding *any* scientific theory, mine included), but I can judge that this project is typical of a university-grade project, and it simultaneously damages the claim that these scientists have access to the extraordinary evidence required to support their extraordinary claims.

    Again, with all due respect, who on earth are you to judge?

  351. This is the Tim Mitchell who wrote the code that Harry is dealing with:

    “……Although I have yet to see any evidence that climate change is a sign of Christ’s imminent return, human pollution is clearly another of the birth pangs of creation, as it eagerly awaits being delivered from the bondage of corruption (Romans 8:19-22).
    Tim Mitchell works at the Climatic Research Unit, UEA, Norwich, and is a
    member of South Park Evangelical Church.”

    from this article:

    http://www.e-n.org.uk/1129-Climate-change-and-the-christian.htm

    google the following sentence for his bio, link
    “In 1997 I moved to Norwich to carry out the research for a PhD at the
    Climatic Research Unit (CRU) of the University of East Anglia. ..”

    Tim Mitchell bio (a little bit changed to CAPS by me):
    In 1997 I moved to Norwich to carry out the research for a PhD at the
    Climatic Research Unit (CRU) of the University of East Anglia. My subject
    was the development of climate scenarios for SUBSEQUENT USE BY RESEARCHERS investigating the impacts of climate change. I was supervised by Mike Hulme and by John Mitchell (Hadley Centre of the UK Meteorological Office). The PhD was awarded in April 2001.

    http://www.cru.uea.ac.uk/~timm/personal/index.html – Cached

    These guys found him FIRST . Weeks ago! (about 2/3 down the page)

    http://www.tickerforum.org/cgi-ticker/akcs-www?post=118625&page=13

    To quote Harry (Dr Ian ‘Harry’ Harris – HARRY_READ_ME.txt):
    “So what the hell did Tim do?!! As I keep asking.”

    Whilst many people of faith are excellent, dedicated professional scientists, I have a few doubts that an evangelical eco-Christian (my label) – who is obviously passionate and committed to the above, clearly believes in human pollution/corruption, and is apparently anticipating Christ’s return to earth amid impending climate armageddon – is as open as he may think he is, or should be, to both sides of the debate (i.e. the theory could be wrong!).

    http://www.e-n.org.uk/2625-Day-after-tomorrow.htm

    “The librarian chooses to rescue an old Bible, not because he believes in
    God, but because its printing was ‘the dawn of the age of reason’. In this
    film we see how far we have fallen. Lost, we retreat into a virtual world
    where disaster becomes entertainment and the unreal seems more real than reality itself. ‘For whom tolls the bell? It tolls for thee.’

    Dr. Tim Mitchell, climate scientist”

    Where is Tim, Why can’t ‘Harry’ just ask him?

    Dr Tim obviously left CRU around 2004, as his published research papers dry up.
    2004: Dr. Tim Mitchell, formerly a scientist, now a student at LTS.

    London Theological Seminary
    Evangelical Protestant college for the training of preachers and pastors.
    Provides degrees up to Masters level. includes course details and resources.

    http://www.ltslondon.org

    This guy has an opinion on the code:

    http://di2.nu/200912/01.htm

    Don’t forget the church he worshiped at (the irony):
    South Park (Al Gore – ManBearPig – South Park episode).
    Need to save my $ for those CO2 taxes.

  352. harpo:

    The tree ring data did turn out to be faulty, and modern models are robust without them. It’s not like they were the only proxies used; there are ice cores, coral reef samples, etc.

    The ring data fit the temperatures before 1960. Correcting them to match the observed temperatures after 1960 sounds a bit off, but I’m not a statistician so I’m not totally sure. You could argue that humans had changed the world so much that the tree ring inconsistency was caused by us (this view would fit the relevant scientist’s worldview quite well, though I disagree with it), but ultimately tree rings were judged to be too unreliable to rely on in the latest models.

    The “spike” in the hockey graph comes from temperature records, it’s not affected by the proxy data (which is inherently less accurate and more error-prone). My original point is that the “spike” is the important part, and if you assume the temperature records are correct, they do show a significant global temperature increase, that does seem to correspond to the models.

    That’s why I said the real question was the temperature record accuracy, not the proxy data. Since so many sites and blogs fell into the fallacy of concentrating on the proxy data, I worry that skeptics have shot themselves in the foot, resulting in a significantly weakened position in the long run, and no check on the climate change activists.

    My position has morphed to “well, it is happening; the question is how fast; drastic measures will do more harm than good, especially in developing countries that might not be able to bear the economic strain.” But it’s hard to tell; there are so many shadings, and both sides are biased to hell and occasionally lie their asses off.

    I suspect the truth is somewhere in between, but finding it seems impossible without access to the raw data of temperature stations across the world (and a statistician skilled enough to normalize the data and analyze it).

  353. Doug:

    I’m taking a little more active stance since I’ve done more research (which seems to show that both sides are full of crap).

    I’m a software developer (admittedly about half self-taught) in computer graphics. My experience with university CG code hasn’t been much better than what I saw in these files, which is why I was so confused that people were blowing up at it. I’ve also seen much *worse* code in my work; it happens all the time in the commercial world.

    As for the water vapor feedback, I was simply reporting it (I’m not fully convinced of its validity myself, and I thought I implied as much). Just look at the NASA news release on it from last year (I’ve not really deciphered the paper it was based on, it *seemed* weak but I’m not qualified enough to say).

    Anyway, I stand on my original point. I don’t want the skeptic movement to die (I think they ask some very interesting questions, and serve as a check on the climate change activists, and since I think both sides are full of crap I don’t want either one to win outright). But I’m afraid this will come back and bite them in the long term.

  354. The sum of the squares going negative is very revealing.

    The sum of the squares is a basic calculation in statistics, and is one of the steps taken in doing a regression or correlation. It is a measure of the amount of error; lack of correlation is another way of saying it.

    If the sum of the squares is very large it means there is a lot of error and almost no correlation at all. If the sum of the squares gets really large, and they are using integers in the calculation, then the sum of the squares will overflow and go negative. Whatever word size they are using in the program – 16-bit, 32-bit, etc. – when it is maxed out, the value wraps around to a negative number.

    This means the errors they are calculating are so large that they max out the integer being used to store the sum of the squares.
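For readers without a programming background, the wraparound described in this comment is easy to demonstrate. A minimal sketch in Python, standing in for the Fortran of anomdtb.f90 – the 32-bit width and the input values are illustrative assumptions, not taken from the CRU code:

```python
# Accumulate a sum of squares in a simulated signed 32-bit integer, the way
# fixed-width Fortran arithmetic behaves. Once the running total exceeds
# 2**31 - 1, the sign bit flips and the "sum of squares" goes negative.
def wrap32(x):
    """Reduce x to a signed 32-bit two's-complement value."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

def sum_of_squares_32bit(values):
    total = 0
    for v in values:
        total = wrap32(total + v * v)
    return total

# A single residual of 50,000 squares to 2.5e9, which exceeds 2**31 - 1
# (about 2.15e9), so the stored total wraps to a large negative number:
print(sum_of_squares_32bit([50000]))  # -1794967296
```

Fortran integers do the same thing silently: a sum of non-negative squares that prints as “very, very negative” means the accumulated error terms were large enough to exhaust the integer’s range.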

  355. If it wasn’t a hacker, but a whistleblower, my money is on whoever was writing these notes.

    Think about it – the stress, the ‘Twilight Zone Syndrome’ mentioned in trying to deal with it – SWEARING in the rem notes!

    If it was me trying to deal with this, and I had Phil Jones and Michael Mann up my rear, I could totally see myself, jacked up on too much coffee, no sleep, and a head full of total bullshit just deciding one night to hit the reset button on the whole charade.

    It’s funny how they talk of ONE email taken out of context, when clearly “Hide The Decline” was literally their mantra, spoken and written hundreds of times daily.

    Let this trigger the end of the UN, Socialism as a whole, the arrogant, shrill nonsense of the ‘Green’ movement, and Taxation-as-Industry in our time.

    Let’s quit whining, and get back to work – there is an entire galaxy to explore.

Comments are closed.