Climategate: hide the decline – codified

WUWT blogging ally Ecotretas writes in to say that he has compiled a compendium of code segments whose programmer comments point to places where data may be corrected, modified, adjusted, or busted. Some of the HARRY_READ_ME comments are quite revealing. For those who don’t understand computer programming, don’t fret: the comments by the programmer tell the story quite well, even if the code itself makes no sense to you.

http://codyssey.files.wordpress.com/2009/02/software_bug.jpg

To say that the CRU code might be “buggy” would be… well, I’ll just let CRU’s programmer tell you in his own words. (A short sketch of what the briffa_sep98_d.pro “artificial correction” actually does to a series follows the list.)


  • FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro

    FOIA\documents\osborn-tree6\mann\oldprog\maps15.pro

    FOIA\documents\osborn-tree6\mann\oldprog\maps24.pro

    ; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions

    ; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually

    ; plot past 1960 because these will be artificially adjusted to look closer to

    ; the real temperatures.

  • FOIA\documents\harris-tree\recon_esper.pro

    ; Computes regressions on full, high and low pass Esper et al. (2002) series,

    ; anomalies against full NH temperatures and other series.

    ; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N

    ;

    ; Specify period over which to compute the regressions (stop in 1960 to avoid

    ; the decline

  • FOIA\documents\harris-tree\calibrate_nhrecon.pro

    ;

    ; Specify period over which to compute the regressions (stop in 1960 to avoid

    ; the decline that affects tree-ring density records)

    ;

  • FOIA\documents\harris-tree\recon1.pro

    FOIA\documents\harris-tree\recon2.pro
    FOIA\documents\harris-tree\recon_jones.pro

    ;

    ; Specify period over which to compute the regressions (stop in 1940 to avoid

    ; the decline

    ;

  • FOIA\documents\HARRY_READ_ME.txt

    17. Inserted debug statements into anomdtb.f90, discovered that

    a sum-of-squared variable is becoming very, very negative! Key

    output from the debug statements:

    (..)

    forrtl: error (75): floating point exception

    IOT trap (core dumped)

    ..so the data value is unbfeasibly large, but why does the

    sum-of-squares parameter OpTotSq go negative?!!

  • FOIA\documents\HARRY_READ_ME.txt

    22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software

    suites - let's have a go at producing CRU TS 3.0! since failing to do that will be the

    definitive failure of the entire project..

  • FOIA\documents\HARRY_READ_ME.txt

    getting seriously fed up with the state of the Australian data. so many new stations have been

    introduced, so many false references.. so many changes that aren't documented. Every time a

    cloud forms I'm presented with a bewildering selection of similar-sounding sites, some with

    references, some with WMO codes, and some with both. And if I look up the station metadata with

    one of the local references, chances are the WMO code will be wrong (another station will have

    it) and the lat/lon will be wrong too.

  • FOIA\documents\HARRY_READ_ME.txt

    I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as

    Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO

    and one with, usually overlapping and with the same station name and very similar coordinates. I

    know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh!

    There truly is no end in sight.

  • FOIA\documents\HARRY_READ_ME.txt

    28. With huge reluctance, I have dived into 'anomdtb' - and already I have

    that familiar Twilight Zone sensation.

  • FOIA\documents\HARRY_READ_ME.txt

    Wrote 'makedtr.for' to tackle the thorny problem of the tmin and tmax databases not

    being kept in step. Sounds familiar, if worrying. am I the first person to attempt

    to get the CRU databases in working order?!!

  • FOIA\documents\HARRY_READ_ME.txt

    Well, dtr2cld is not the world's most complicated program. Wheras cloudreg is, and I

    immediately found a mistake! Scanning forward to 1951 was done with a loop that, for

    completely unfathomable reasons, didn't include months! So we read 50 grids instead

    of 600!!! That may have had something to do with it. I also noticed, as I was correcting

    THAT, that I reopened the DTR and CLD data files when I should have been opening the

    bloody station files!!

  • FOIA\documents\HARRY_READ_ME.txt

    Back to the gridding. I am seriously worried that our flagship gridded data product is produced by

    Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station

    counts totally meaningless. It also means that we cannot say exactly how the gridded data is arrived

    at from a statistical perspective - since we're using an off-the-shelf product that isn't documented

    sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?

    Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding

    procedure? Of course, it's too late for me to fix it too. Meh.

  • FOIA\documents\HARRY_READ_ME.txt

    Here, the expected 1990-2003 period is MISSING - so the correlations aren't so hot! Yet

    the WMO codes and station names /locations are identical (or close). What the hell is

    supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)

  • FOIA\documents\HARRY_READ_ME.txt

    Well, it's been a real day of revelations, never mind the week. This morning I

    discovered that proper angular weighted interpolation was coded into the IDL

    routine, but that its use was discouraged because it was slow! Aaarrrgghh.

    There is even an option to tri-grid at 0.1 degree resolution and then 'rebin'

    to 720x360 - also deprecated! And now, just before midnight (so it counts!),

    having gone back to the tmin/tmax work, I've found that most if not all of the

    Australian bulletin stations have been unceremoniously dumped into the files

    without the briefest check for existing stations.

  • FOIA\documents\HARRY_READ_ME.txt

    As we can see, even I'm cocking it up! Though recoverably. DTR, TMN and TMX need to be written as (i7.7).

  • FOIA\documents\HARRY_READ_ME.txt

    OH FUCK THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm

    hitting yet another problem that's based on the hopeless state of our databases. There is no uniform

    data integrity, it's just a catalogue of issues that continues to grow as they're found.

  • FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro

    printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'

    printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'

    printf,1,'Reconstruction is based on tree-ring density records.'

    printf,1

    printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'

    printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'

    printf,1,'will be much closer to observed temperatures then they should be,'

    printf,1,'which will incorrectly imply the reconstruction is more skilful'

    printf,1,'than it actually is. See Osborn et al. (2004).'

  • FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro

    printf,1,'IMPORTANT NOTE:'

    printf,1,'The data after 1960 should not be used. The tree-ring density'

    printf,1,'records tend to show a decline after 1960 relative to the summer'

    printf,1,'temperature in many high-latitude locations. In this data set'

    printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'

    printf,1,'this means that data after 1960 no longer represent tree-ring'

    printf,1,'density variations, but have been modified to look more like the'

    printf,1,'observed temperatures.'

  • FOIA\documents\osborn-tree6\combined_wavelet_col.pro

    ;

    ; Remove missing data from start & end (end in 1960 due to decline)

    ;

    kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)

    sst=prednh(kl)

  • FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro

    ; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the

    ; EOFs of the MXD data set. This is PCR, although PCs are used as predictors

    ; but not as predictands. This PCR-infilling must be done for a number of

    ; periods, with different EOFs for each period (due to different spatial

    ; coverage). *BUT* don’t do special PCR for the modern period (post-1976),

    ; since they won’t be used due to the decline/correction problem.

    ; Certain boxes that appear to reconstruct well are “manually” removed because

    ; they are isolated and away from any trees.

  • FOIA\documents\osborn-tree6\briffa_sep98_d.pro

    ;mknormal,yyy,timey,refperiod=[1881,1940]

    ;

    ; Apply a VERY ARTIFICAL correction for decline!!

    ;

    yrloc=[1400,findgen(19)*5.+1904]

    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$

    2.6,2.6,2.6]*0.75 ; fudge factor

    (...)

    ;

    ; APPLY ARTIFICIAL CORRECTION

    ;

    yearlyadj=interpol(valadj,yrloc,x)

    densall=densall+yearlyadj

  • FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro

    ;

    ; Plots density ‘decline’ as a time series of the difference between

    ; temperature and density averaged over the region north of 50N,

    ; and an associated pattern in the difference field.

    ; The difference data set is computed using only boxes and years with

    ; both temperature and density in them – i.e., the grid changes in time.

    ; The pattern is computed by correlating and regressing the *filtered*

    ; time series against the unfiltered (or filtered) difference data set.

    ;

    ;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE

    ; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***

  • FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro

    ;

    ; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions

    ; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually

    ; plot past 1960 because these will be artificially adjusted to look closer to

    ; the real temperatures.

    ;

  • FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro

    ; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered

    ; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which

    ; gives a zero mean over 1881-1960) after extending the calibration to boxes

    ; without temperature data (pl_calibmxd1.pro). We have identified and

    ; artificially removed (i.e. corrected) the decline in this calibrated

    ; data set. We now recalibrate this corrected calibrated dataset against

    ; the unfiltered 1911-1990 temperature data, and apply the same calibration

    ; to the corrected and uncorrected calibrated MXD data.

  • FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro

    ; No need to verify the correct and uncorrected versions, since these

    ; should be identical prior to 1920 or 1930 or whenever the decline

    ; was corrected onwards from.

  • FOIA\documents\osborn-tree5\densplus188119602netcdf.pro

    ; we know the file starts at yr 440, but we want nothing till 1400, so we

    ; can skill lines (1400-440)/10 + 1 header line

    ; we now want all lines (10 yr per line) from 1400 to 1980, which is

    ; (1980-1400)/10 + 1 lines

    (...)

    ; we know the file starts at yr 1070, but we want nothing till 1400, so we

    ; can skill lines (1400-1070)/10 + 1 header line

    ; we now want all lines (10 yr per line) from 1400 to 1991, which is

    ; (1990-1400)/10 + 1 lines (since 1991 is on line beginning 1990)
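
For readers who want to see what the briffa_sep98_d.pro excerpt above actually does to a series, here is a minimal Python sketch of the same arithmetic. This is a re-expression of the two quoted IDL lines, not CRU’s code: the yrloc/valadj values are copied from the excerpt, the density series below is invented purely for illustration, and numpy.interp stands in for IDL’s interpol.

import numpy as np

# Values copied from the quoted briffa_sep98_d.pro excerpt (the "fudge factor").
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))   # 1400, 1904, 1909, ..., 1994
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

# Purely illustrative density series (NOT real MXD data): flat, with a post-1960 decline.
x = np.arange(1400, 1995)                       # years covered by the toy series
densall = np.where(x > 1960, -0.02 * (x - 1960), 0.0)

# The two quoted lines: interpolate the adjustment onto every year, then add it on.
yearlyadj = np.interp(x, yrloc, valadj)         # IDL: yearlyadj = interpol(valadj, yrloc, x)
densall_adj = densall + yearlyadj               # IDL: densall = densall + yearlyadj

# The adjustment is roughly zero before the 1930s, dips slightly negative,
# then climbs to +1.95 from the mid-1970s on, more than cancelling the toy decline.
for yr in (1900, 1940, 1960, 1980, 1994):
    i = yr - 1400
    print(yr, round(densall[i], 2), "->", round(densall_adj[i], 2))

Whatever one thinks of the justification, the effect is plain: post-1960 values are pushed upward by an amount chosen by hand in valadj, exactly as the comment “Apply a VERY ARTIFICAL correction for decline!!” says.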




Tilo Reber
November 25, 2009 3:52 pm

“The poor schmuck who wrote these comments is going to be blamed for the whole fiasco…”
I don’t think so. It’s pretty clear from the comments that he has been told what to do and that he doesn’t like it. He clearly states that he has been told to make up data.
I think that the threat to bring law enforcement into this is an idle threat. CRU has way too much to lose to start a legal process against the whistle blower. If they do, the whistle blower’s defense attorney will have access to anything and everything that is in the CRU archives – as though he doesn’t already have enough to justify whistle blowing. CRU would like to have this thing crawl under a rug and go away, not to have it fought out in a highly publicised legal battle.

SteveSadlov
November 25, 2009 3:52 pm

Nicholas, there is no such thing as a free lunch.

b_C
November 25, 2009 3:53 pm

Among the alarmist crowd, I can readily see where all this controversy about code is leading …
http://tinyurl.com/y9zpzlw

November 25, 2009 4:00 pm

Nicholas Alexander (15:24:49)
[snip] No one here is suggesting wilful and wholesale pollution is a good thing – far from it. I work for a major international oil company and I do not have space to tell you about the time, effort and resources we spend to minimise our impact on the environment – steps which cost $$, by the way.
The issue is whether the science that has been presented as “proven” is, in fact, flawed.
Turning to your last paragraph:
“Free, clean energy is available if we invest in the right things and create solutions. The oil industry is not interested in free energy. But humanity will evolve more quickly.”
This is wrong on so many levels… “free” energy? Eh? So all the staff who work in this fabulous power plant work for nothing? “Invest” – erm – this implies money movement from those who have it (investors) to an enterprise in the expectation of a return…
The oil industry is not interested in free energy – erm – no, I guess we are not. I offer no defence of the need to make a profit, pay dividends and prop up the investment funds that support millions of pensions…
Humanity will evolve more quickly? [snip] No response to this, as I have no clue what you are talking about!

Detestible
November 25, 2009 4:02 pm

In case people haven’t found it yet, there is another Zip file in the documents folder entitled “mbh98-osborn”; it is 44.6 megs when extracted and has other .tar files within it.
Maybe some of the more knowledgeable people here would care to give that a looksy.

fred
November 25, 2009 4:12 pm

Tilo Reber 15:52:27 ” CRU would like to have this thing crawl under a rug and go away, not to have it fought out in a highly publicised legal battle.”
I have no doubt that is what they would like. I doubt that is what they will get.
I work with lawyers (US admittedly) almost every day. Every one I know would be after this guy like a duck on a junebug if they have to go to court – unless he has his butt covered.

fred
November 25, 2009 4:18 pm

Robert Wykoff (14:30:24) :
Could you point out that line of code to me? Six degrees per doubling is crazy. I’d like to see what calculation it goes into.

Nero
November 25, 2009 4:19 pm

Another tidbit for digestion: If you look at the file hadcrut3_gmr+defra_report_200503.pdf in the disclosures, you’ll find a report to a funding agency titled “Development of the global surface temperature dataset HadCRUT3” by Philip Brohan, John Kennedy, Simon Tett, Ian Harris and Phil Jones. It’s dated March 2005. From the document routing header information it seems this was a deliverable from two contracts, one called “Revised optimally averaged global and hemispheric land and ocean surface temperature series including HadCRUT3 data set” and the other “Report on HadCRUT3 including error estimates”, both with the same investigators as on the report itself. There’s a magic contract number, MS-RAND-CPP-PROG0407, that when fed into Google comes up with a number of other reports suggesting that this was an omnibus dataset gathering and update project funded at CRU by DEFRA (the UK Department for Environment, Food and Rural Affairs).
This is the description of the reported activities: “Since the last update, which produced HadCRUT2 [2], important improvements have been made in the
marine component of the dataset [3]. These include the use of additional observations, the development of comprehensive uncertainty estimates, and technical improvements that enable, for instance, the production of gridded fields at arbitrary resolution. This document is a report on work to produce a new dataset version, HadCRUT3, which will extend the advances made in the marine data to the global dataset. The work is being managed in the Hadley Centre, but part of the work to be done needs expertise from CRU, so a contract has been placed with CRU to fund them to work on the project in collaboration with Hadley Centre staff. ”
The final paragraph gives a purported status: “We are making good progress towards the production of an updated version of the global historical surface
temperature dataset HadCRUT. This new version will be based on improved observational data, will have comprehensive error estimates, and will have associated local and global average time-series that are produced using fully tested methods. ”
Note again that this was submitted in March 2005. Now we have the following from poor old Harry’s READ ME, at least notionally dated to 2006+:
“22. Right, time to stop pussyfooting around the niceties of Tim’s labyrinthine software suites – let’s have a go at producing CRU TS 3.0! since failing to do that will be the definitive failure of the entire project..”
This looks like ‘Harry’ (possibly Ian Harris, possibly his coder) is in fact attempting to generate the dataset referenced in the report, having to go back in part to previous work by ‘Tim’ to do so, since intermediate data had been discarded and some of the original perhaps ‘lost’. In spite of the reported ‘good progress’, he’s still deep in the trenches, and apparently continued so until some time in 2009. One wonders how happy the DEFRA folks were with the actual status of the work.

anaon
November 25, 2009 4:32 pm

So Global Warming was man made after all! 🙂

jgfox
November 25, 2009 4:32 pm

Googling “Climategate” on the web gave nearly 3 million hits.
One response in Computer World on an article about how to prevent this type of hacking said it all for Wattsupsters.
Submitted by Anonymous on November 25, 2009 – 16:23.
“Excellent advice.
Still I thank God, the spirit realm, and the Angel of Hacking who saved us from the plot to harm us economically through the man made global warming hoax.
Thank You!
Thank You!
Thank You!
A lifetime of thank you’s is not enough.”
To which we all say, “Amen”.

Thomas
November 25, 2009 5:25 pm

James Corbett’s message to the hijacked environmental movement

barbee butts
November 25, 2009 5:34 pm

Looks like we found our whistle blower ~but don’t tell. LOL

crosspatch
November 25, 2009 5:39 pm

Anyone have any idea of the purpose of the variable called “Cheat” in
cru-code/linux/mod/ghcnrefiter.f90 ??
It appears to be used in an adjustment; the write-out of the adjusted data seems to be commented out in this version (maybe debugging output?), but the adjusted value is stored back into the array Addit(XYear).

do XYear = 1,NYear ! adjust addit data
if (Addit(XYear).NE.MissVal) then
New=Cheat+(Multi*(Addit(XYear)**Power))
! write (99,"(i4,3f10.2)"), XYear,Stand(XYear),Addit(XYear),New !
Addit(XYear)=New
end if
end do
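
For those who don’t read Fortran, the quoted loop is equivalent to the Python sketch below. The names are taken from the excerpt; the excerpt does not show where Cheat, Multi, Power or MissVal get their values, so the numbers here are placeholders only.

MISS_VAL = -9999.0                     # placeholder for MissVal; the real sentinel isn't shown
cheat, multi, power = 0.5, 1.0, 1.0    # placeholder values; their provenance isn't in the excerpt

addit = [10.2, MISS_VAL, 11.7]         # toy yearly values standing in for Addit(1..NYear)

# Equivalent of the quoted loop: every non-missing year is replaced by
# Cheat + Multi * value**Power, i.e. a power-law rescale plus a constant offset.
for i, value in enumerate(addit):
    if value != MISS_VAL:
        addit[i] = cheat + multi * (value ** power)

print(addit)    # [10.7, -9999.0, 12.2] with the placeholder values above

So whatever Cheat holds, it acts as a constant offset added to every adjusted value; the interesting question is where its value comes from, which this excerpt alone does not answer.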

crosspatch
November 25, 2009 5:50 pm

Curious:

;
; Reads in the gridded Hugershoff MXD data, plus the regional age-banded and
; regional Hugershoff series and attempts to adapt the gridded Hugershoff
; data to have the same low-frequency variability as the ABD regions.
; The procedure is as follows:
;
; HUGREG=Hugershoff regions, ABDREG=age-banded regions, HUGGRID=Hugershoff grid
; The calibrated (uncorrected) versions of all these data sets are used.
; However, the same adjustment is then applied to the corrected version of
; the grid Hugershoff data, so that both uncorrected and corrected versions
; are available with the appropriate low frequency variability. There is some
; ambiguity during the modern period here, however, because the corrected
; version has already been artificially adjusted to reproduce the largest
; scales of observed temperature over recent decades – so a new adjustment
; would be unwelcome. Therefore, the adjustment term is scaled back towards
; zero when being applied to the corrected data set, so that it is linearly
; interpolated from its 1950 value to zero at 1970 and kept at zero thereafter.

;
; (1) Compute regional means of HUGGRID to (hopefully) confirm that they
; give a reasonably good match to HUGREG. If so, then for the remainder of
; this routine, HUGREG is replaced by the regional means of HUGGRID.
;
; (2) For each region, low-pass filter (30-yr) both HUGREG and ABDREG,
; and difference them. This is the additional low frequency information
; that the Hugershoff data set is missing.
;
; (3) To each grid box in HUGGRID, add on a Gaussian-distance-weighted
; mean of nearby regional low frequency, assuming that the low frequency
; information obtained from (2) applies to a point central to each region.
;
; (4) Compute regional means of the adjusted HUGGRID and confirm that they
; give a reasonable match to ABDREG.
;
; For some regions (CAS, TIBP) the low frequency signal is set to zero because
; the gridded data gives a quite different time series than either of the
; regional-mean series. Also, for those series limited by the availability
; of age-banded results, I set all values from 1400 to 50 years prior to the
; first non-missing value to zero, and then linearly interpolate this 50 years
; and any other gaps with missing values. Any missing values at the end of
; the series are filled in by repeating the final non-missing value.

Source:
osborn-tree6/mann/abdlowfreq2grid.pro
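
The 1950-to-1970 taper described in that header can be made concrete with a small sketch. This only illustrates the sentence “linearly interpolated from its 1950 value to zero at 1970 and kept at zero thereafter”, assuming a yearly time axis; the function and array names are made up for the illustration, not taken from the file.

import numpy as np

def taper_adjustment(years, adjustment):
    # Full adjustment before 1950, linear ramp down to zero between 1950 and 1970,
    # zero from 1970 onwards.
    weight = np.clip((1970.0 - years) / (1970.0 - 1950.0), 0.0, 1.0)
    return adjustment * weight

# Illustrative only: a constant +0.4 low-frequency adjustment over 1900-1994.
years = np.arange(1900, 1995)
adj = np.full(years.shape, 0.4)
tapered = taper_adjustment(years, adj)

for yr in (1940, 1950, 1960, 1970, 1990):
    print(yr, round(tapered[yr - 1900], 2))    # 0.4, 0.4, 0.2, 0.0, 0.0

The stated reason for the taper is that the corrected grid “has already been artificially adjusted to reproduce the largest scales of observed temperature over recent decades”, so a second adjustment on top of it is deliberately faded out.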

November 25, 2009 6:06 pm


jgfox (16:32:31) :
Googling “Climategate” on the web gave nearly 3 million hits.

Just performed Google search on “Hide the decline” (within double quotes) –
– showed OVER 2.7 million hits!
.
.

Charles Higley
November 25, 2009 6:23 pm

A while back somebody generated a US temperature record for the 1900s using 5 consistently rural stations from around the country. It showed clearly that the 1930s were the warmest; temperatures dropped until the late 70s, zigzagged slowly upward until 1998, and then leveled off and declined. The 1998 peak was equal only to 1953, when we were already cooling – still about 0.5 deg less than 1938! If you take the fabricated warming trend from 1978 and cut it in half, it looks a lot like this more realistic data. At the very least, the data was corrupted with UHI, which was not adequately removed, but it appears that this was not good enough and they had to make things more drastic and radical. It’s like an addict who slowly takes more and more: “it’s just not warm enough yet!”

_Jim
November 25, 2009 6:23 pm


Nicholas Alexander (15:24:49) :
Free, clean energy is available if we invest in the right things and create solutions.

Using the web as a resource, please cost out what it would take to construct on-site wind and solar and associated storage facilities (for when the sun does not shine or when the wind does not blow) to power a 1,500 sq. ft. house in the north, the south, and the southwest. Wind and solar charts are available to assist in planning for various times of the year. Bulk steel and aluminum pricing is likewise available. These costs will NOT include the engineering time (for calculations, the creation of fabrication drawings, et al.) that will be required to actually build these components.
Putting all this together will in effect create a BOM (bill of materials); also include an estimate of the machining costs and the on-site installation costs (usual rates for tradesmen, a crane for the wind generator, etc.). Again, we are not including engineering costs.
Finish that, and next let’s scale it up for industrial use at a small factory …
Are you capable of this sort of exercise, or does it just stop at wishful and idyllic but nonetheless empty platitudes?
.
.

JimInIndy
November 25, 2009 6:23 pm

Twenty years as a Big 8 CPA computer systems audit manager. This garbage would have gotten the front doors locked, the SEC notified, and a forensic audit started.

Rattus Norvegicus
November 25, 2009 6:27 pm

All you people complaining about how Briffa and that crowd are trying to hide something in their MXD analysis, read this:
Briffa, K.R., Schweingruber, F.H., Jones, P.D., Osborn, T.J.,
Shiyatov, S.G. and Vaganov, E.A. 1998: Reduced sensitivity of
recent tree-growth to temperature at high northern latitudes. Nature
391, 678–82.
Yep, a great way to hide something is to publish about it in a paper in Nature. It’s real smart, clever even: hide it in plain sight and establish an active line of research on this interesting observation.

November 25, 2009 6:31 pm

_Jim (18:23:13) :
yup – my point exactly – it really pisses me off when people talk about “free” anything. OT, but I heard someone praising the “free UK health care system” – with no reference to the bloody tax we pay!!! Sheesh – liberals!

November 25, 2009 6:37 pm

Rattus Norvegicus, RC refugee at (18:27:52) writes :
All you people complaining about how Briffa and that crowd are trying to hide something in their MXD analysis

Do they explain ‘artificial adjustments’ in those pubs?
Do they explain the reason for the values of the series of terms appearing in the code?
Are these values related in any way to physical processes in tree-ring growth?
.
.

November 25, 2009 6:37 pm

Rattus,
It’s been mentioned that Keith Briffa appears to be at least somewhat concerned about the shenanigans. Maybe he’s a straight shooter; I don’t know. People can make up their own minds about him. Here’s a good place to start: click

Christopher Byrne
November 25, 2009 6:53 pm

My creative other half came up with this. I thought it was quite funny:
http://www.freeimagehosting.net/uploads/6fa0eea5a0.jpg

John M
November 25, 2009 6:55 pm

Rattus Norvegicus (18:27:52) :

All you people complaining about how Briffa and that crowd are trying to hide something in their MXD analysis, read this:
Briffa, K.R., Schweingruber, F.H., Jones, P.D., Osborn, T.J.,
Shiyatov, S.G. and Vaganov, E.A. 1998: Reduced sensitivity of
recent tree-growth to temperature at high northern latitudes. Nature
391, 678–82.

Noted.
http://www.climateaudit.org/?p=529

Clive
November 25, 2009 6:56 pm

Nicholas Alexander.
Lots of things sound good, but wind and solar will never save us. There is no way to keep humanity warm, fed, industrious and ALIVE in the northern parts of the Northern Hemisphere without fossil fuels of some sort … or nuclear. It just can never happen, for logistical and engineering reasons.
Unless, of course, we kill off 90 percent of humanity. Perhaps you will volunteer for agathusia … or perhaps the more distasteful, aschimothusia?? ☺ Step right up. No waiting. Take one for the Gipper, and you will spend eternity with virgin Goracles. It’s true. ☺
Solar and wind (or whatever the heck you are talking about … perpetual motion perhaps ☺ ) may be great for back-to-the-lander acreage owners, but not for the masses in big northern cities when it is -35°C.
Solar and wind indeed have applications, but for the masses on this planet they are nothing more than eco-weenie dreamin’.
It is hard to hide the decline in common sense on this earth. ☺
Except here at WUWT … where common sense prevails. Keep up the fight, Anthony. Well done.
