WUWT blogging ally Ecotretas writes in to say that he has compiled a compendium of programming code segments containing comments by the programmer that suggest places where data may be corrected, modified, adjusted, or busted. Some of the HARRY_READ_ME comments are quite revealing. For those who don't understand computer programming, don't fret: the programmer's comments tell the story quite well even if the code itself makes no sense to you.

To say that the CRU code might be “buggy” would be…well I’ll just let CRU’s programmer tell you in his own words.
- FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps15.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps24.pro
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
- FOIA\documents\harris-tree\recon_esper.pro
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
- FOIA\documents\harris-tree\calibrate_nhrecon.pro
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline that affects tree-ring density records)
;
- FOIA\documents\harris-tree\recon1.pro
FOIA\documents\harris-tree\recon2.pro
FOIA\documents\harris-tree\recon_jones.pro
;
; Specify period over which to compute the regressions (stop in 1940 to avoid
; the decline
;
- FOIA\documents\HARRY_READ_ME.txt
17. Inserted debug statements into anomdtb.f90, discovered that
a sum-of-squared variable is becoming very, very negative! Key
output from the debug statements:
(..)
forrtl: error (75): floating point exception
IOT trap (core dumped)
..so the data value is unbfeasibly large, but why does the
sum-of-squares parameter OpTotSq go negative?!!
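For readers puzzled by a sum of squares going negative: one plausible mechanism (a guess at the failure mode, not a diagnosis of anomdtb.f90) is accumulating the squares in a fixed-width signed integer, which silently wraps on overflow. A minimal sketch:

```python
def wrap32(x):
    """Wrap an integer into signed 32-bit range, as a Fortran INTEGER*4 would."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

# Squares of "unfeasibly large" data values exceed 2**31 - 1 and wrap,
# so a running sum-of-squares can come out very, very negative.
op_tot_sq = 0
for v in [50000, 60000, 70000]:  # illustrative values only
    op_tot_sq = wrap32(op_tot_sq + wrap32(v * v))
print(op_tot_sq)  # negative, despite summing squares
```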
- FOIA\documents\HARRY_READ_ME.txt
22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software
suites - let's have a go at producing CRU TS 3.0! since failing to do that will be the
definitive failure of the entire project..
- FOIA\documents\HARRY_READ_ME.txt
getting seriously fed up with the state of the Australian data. so many new stations have been
introduced, so many false references.. so many changes that aren't documented. Every time a
cloud forms I'm presented with a bewildering selection of similar-sounding sites, some with
references, some with WMO codes, and some with both. And if I look up the station metadata with
one of the local references, chances are the WMO code will be wrong (another station will have
it) and the lat/lon will be wrong too.
- FOIA\documents\HARRY_READ_ME.txt
I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as
Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO
and one with, usually overlapping and with the same station name and very similar coordinates. I
know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh!
There truly is no end in sight.
- FOIA\documents\HARRY_READ_ME.txt
28. With huge reluctance, I have dived into 'anomdtb' - and already I have
that familiar Twilight Zone sensation.
- FOIA\documents\HARRY_READ_ME.txt
Wrote 'makedtr.for' to tackle the thorny problem of the tmin and tmax databases not
being kept in step. Sounds familiar, if worrying. am I the first person to attempt
to get the CRU databases in working order?!!
- FOIA\documents\HARRY_READ_ME.txt
Well, dtr2cld is not the world's most complicated program. Wheras cloudreg is, and I
immediately found a mistake! Scanning forward to 1951 was done with a loop that, for
completely unfathomable reasons, didn't include months! So we read 50 grids instead
of 600!!! That may have had something to do with it. I also noticed, as I was correcting
THAT, that I reopened the DTR and CLD data files when I should have been opening the
bloody station files!!
- FOIA\documents\HARRY_READ_ME.txt
Back to the gridding. I am seriously worried that our flagship gridded data product is produced by
Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station
counts totally meaningless. It also means that we cannot say exactly how the gridded data is arrived
at from a statistical perspective - since we're using an off-the-shelf product that isn't documented
sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?
Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding
procedure? Of course, it's too late for me to fix it too. Meh.
- FOIA\documents\HARRY_READ_ME.txt
Here, the expected 1990-2003 period is MISSING - so the correlations aren't so hot! Yet
the WMO codes and station names/locations are identical (or close). What the hell is
supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)
- FOIA\documents\HARRY_READ_ME.txt
Well, it's been a real day of revelations, never mind the week. This morning I
discovered that proper angular weighted interpolation was coded into the IDL
routine, but that its use was discouraged because it was slow! Aaarrrgghh.
There is even an option to tri-grid at 0.1 degree resolution and then 'rebin'
to 720x360 - also deprecated! And now, just before midnight (so it counts!),
having gone back to the tmin/tmax work, I've found that most if not all of the
Australian bulletin stations have been unceremoniously dumped into the files
without the briefest check for existing stations.
- FOIA\documents\HARRY_READ_ME.txt
As we can see, even I'm cocking it up! Though recoverably. DTR, TMN and TMX need to be written as (i7.7).
- FOIA\documents\HARRY_READ_ME.txt
OH FUCK THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm
hitting yet another problem that's based on the hopeless state of our databases. There is no uniform
data integrity, it's just a catalogue of issues that continues to grow as they're found.
- FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro
printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'
printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'
printf,1,'Reconstruction is based on tree-ring density records.'
printf,1
printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'
printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'
printf,1,'will be much closer to observed temperatures then they should be,'
printf,1,'which will incorrectly imply the reconstruction is more skilful'
printf,1,'than it actually is. See Osborn et al. (2004).'
- FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,'IMPORTANT NOTE:'
printf,1,'The data after 1960 should not be used. The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'
- FOIA\documents\osborn-tree6\combined_wavelet_col.pro
;
; Remove missing data from start & end (end in 1960 due to decline)
;
kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
sst=prednh(kl)
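For non-IDL readers, the where call above selects the indices of years 1402-1960 and returns their count in n; a rough NumPy equivalent (the yrmxd and prednh arrays here are hypothetical stand-ins, not the real data):

```python
import numpy as np

yrmxd = np.arange(1400, 1995)        # hypothetical year axis for the MXD data
prednh = np.zeros(yrmxd.size)        # hypothetical NH predictor series

# NumPy analogue of IDL's kl = where((yrmxd ge 1402) and (yrmxd le 1960), n):
kl = np.where((yrmxd >= 1402) & (yrmxd <= 1960))[0]
n = kl.size                          # IDL returns this count via the n argument
sst = prednh[kl]                     # series truncated at 1960, "due to decline"
```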
- FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro
; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
; but not as predictands. This PCR-infilling must be done for a number of
; periods, with different EOFs for each period (due to different spatial
; coverage). *BUT* don't do special PCR for the modern period (post-1976),
; since they won't be used due to the decline/correction problem.
; Certain boxes that appear to reconstruct well are "manually" removed because
; they are isolated and away from any trees.
- FOIA\documents\osborn-tree6\briffa_sep98_d.pro
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
(...)
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
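For those who don't read IDL, interpol does linear interpolation, so the quoted block linearly interpolates the hard-coded valadj adjustments onto each year and adds them to the density series. A rough NumPy sketch of the same arithmetic (densall here is a zero placeholder, not real MXD data):

```python
import numpy as np

# Anchor years for the adjustment: 1400, then every 5 years from 1904 to 1994
yrloc = np.concatenate(([1400.0], 1904.0 + 5.0 * np.arange(19)))

# The hard-coded "fudge factor" values from the quoted IDL, scaled by 0.75
valadj = 0.75 * np.array([0, 0, 0, 0, 0, -0.1, -0.25, -0.3, 0, -0.1,
                          0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6])

x = np.arange(1400, 1995)                  # target years
densall = np.zeros(x.size)                 # placeholder density series

# Equivalent of yearlyadj = interpol(valadj, yrloc, x); densall = densall + yearlyadj
yearlyadj = np.interp(x, yrloc, valadj)
densall = densall + yearlyadj
```

The effect is easy to see: the adjustment is zero through the early record and ramps up to 2.6 * 0.75 = 1.95 by 1994, pushing the late-20th-century values upward.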
- FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro
;
; Plots density 'decline' as a time series of the difference between
; temperature and density averaged over the region north of 50N,
; and an associated pattern in the difference field.
; The difference data set is computed using only boxes and years with
; both temperature and density in them - i.e., the grid changes in time.
; The pattern is computed by correlating and regressing the *filtered*
; time series against the unfiltered (or filtered) difference data set.
;
;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
- FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
- FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
- FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro
; No need to verify the correct and uncorrected versions, since these
; should be identical prior to 1920 or 1930 or whenever the decline
; was corrected onwards from.
- FOIA\documents\osborn-tree5\densplus188119602netcdf.pro
; we know the file starts at yr 440, but we want nothing till 1400, so we
; can skill lines (1400-440)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1980, which is
; (1980-1400)/10 + 1 lines
(...)
; we know the file starts at yr 1070, but we want nothing till 1400, so we
; can skill lines (1400-1070)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1991, which is
; (1990-1400)/10 + 1 lines (since 1991 is on line beginning 1990)
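The skip-and-read arithmetic in those comments is straightforward to check; a small sketch, assuming (as the comments state) one header line followed by one line per decade starting at the file's first year:

```python
def lines_to_skip(file_start_year, want_from_year, years_per_line=10):
    """Header line plus one line per decade before the wanted range."""
    return (want_from_year - file_start_year) // years_per_line + 1

def lines_to_read(first_year, last_line_year, years_per_line=10):
    """Inclusive count of decade lines from first_year to last_line_year."""
    return (last_line_year - first_year) // years_per_line + 1

# First file: starts at yr 440, data wanted from 1400 to 1980
print(lines_to_skip(440, 1400), lines_to_read(1400, 1980))    # 97 59

# Second file: starts at yr 1070; 1991 sits on the line beginning 1990
print(lines_to_skip(1070, 1400), lines_to_read(1400, 1990))   # 34 60
```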
“The poor schmuck who wrote these comments is going to be blamed for the whole fiasco…”
I don’t think so. It’s pretty clear from the comments that he has been told what to do and that he doesn’t like it. He clearly states that he has been told to make up data.
I think that the threat to bring law enforcement into this is an idle threat. CRU has way too much to lose to start a legal process against the whistle blower. If they do, the whistle blower's defense attorney will have access to anything and everything that is in the CRU archives - as though he doesn't already have enough to justify whistle blowing. CRU would like to have this thing crawl under a rug and go away, not to have it fought out in a highly publicised legal battle.
Nicholas, there is no such thing as a free lunch.
Among the alarmist crowd, I can readily see where all this controversy about code is leading …
http://tinyurl.com/y9zpzlw
Nicholas Alexander (15:24:49)
[snip] no one here is suggesting wilful and wholesale pollution is a good thing – far from it. I work for a major international oil company and I do not have space to tell you about the time, effort and resources we spend to minimise our impact on the environment – steps which cost $$ by the way.
The issue is whether the science that has been shown to be “proven” is, in fact, flawed.
Turning to your last paragraph:
“Free, clean energy is available if we invest in the right things and create solutions. The oil industry is not interested in free energy. But humanity will evolve more quickly.”
this is wrong on so many levels… "free" energy? eh? so all the staff who work in this fabulous power plant work for nothing? "invest" - erm - this implies money movement from those who have it (investors) to an enterprise in the expectation of a return…
The oil industry is not interested in free energy - erm - no - I guess we are not. I offer no defence to the need to make a profit, pay dividends and prop up the investment funds that support millions of pensions…
Humanity will evolve more quickly? [snip] No response to this as I have no clue what you are talking about!
In case people haven’t found it yet, there is another Zip file in the documents folder entitled “mbh98-osborn” the size is 44.6 megs when extracted, and has other .tar files within it.
Maybe some of the more knowledgeable people here would care to give that a looksy.
Tilo Reber (15:52:27): "CRU would like to have this thing crawl under a rug and go away, not to have it fought out in a highly publicised legal battle."
I have no doubt that is what they would like. I doubt that is what they will get.
I work with lawyers (US admittedly) almost every day. Every one I know would be after this guy like a duck on a junebug if they have to go to court – unless he has his butt covered.
Robert Wykoff (14:30:24) :
Could you point out that line of code to me. Six degrees per doubling is crazy. I’d like to try to see what calculation it goes into.
Another tidbit for digestion: If you look at the file hadcrut3_gmr+defra_report_200503.pdf in the disclosures, you'll find a report to a funding agency titled "Development of the global surface temperature dataset HadCRUT3" by Philip Brohan, John Kennedy, Simon Tett, Ian Harris and Phil Jones. It's dated March, 2005. From the document routing head information it seems this was a deliverable from two contracts, one called "Revised optimally averaged global and hemispheric land and ocean surface temperature series including HadCRUT3 data set" and the other "Report on HadCRUT3 including error estimates", both with the same investigators as on the report itself. There's a magic contract number, MS-RAND-CPP-PROG0407, that when fed into Google comes up with a number of other reports suggesting that this was an omnibus dataset gathering and update project funded at CRU by DEFRA (UK Dept. for Environment, Food and Rural Affairs).
This is the description of the reported activities: “Since the last update, which produced HadCRUT2 [2], important improvements have been made in the
marine component of the dataset [3]. These include the use of additional observations, the development of comprehensive uncertainty estimates, and technical improvements that enable, for instance, the production of gridded fields at arbitrary resolution. This document is a report on work to produce a new dataset version, HadCRUT3, which will extend the advances made in the marine data to the global dataset. The work is being managed in the Hadley Centre, but part of the work to be done needs expertise from CRU, so a contract has been placed with CRU to fund them to work on the project in collaboration with Hadley Centre staff. ”
The final paragraph gives a purported status: “We are making good progress towards the production of an updated version of the global historical surface
temperature dataset HadCRUT. This new version will be based on improved observational data, will have comprehensive error estimates, and will have associated local and global average time-series that are produced using fully tested methods. ”
Note again that this is submitted in March, 2005. Now we have the following from poor old Harry’s READ ME, at least notionally dated to 2006+:
“22. Right, time to stop pussyfooting around the niceties of Tim’s labyrinthine software suites – let’s have a go at producing CRU TS 3.0! since failing to do that will be the definitive failure of the entire project..”
This looks like 'Harry' (possibly Ian Harris, possibly his coder) is in fact attempting to generate the dataset referenced in the report, having to go back in part to previous work by 'Tim' to do so since intermediate data had been discarded, and some of the original perhaps 'lost'. In spite of the reported 'good progress', he's still deep in the trenches, and apparently continued so until some time in 2009. One wonders how happy the DEFRA folks were with the actual status of the work.
So Global Warming was man made after all! 🙂
Googling "Climategate" on the web gave nearly 3 million hits.
One response in Computer World on an article about how to prevent this type of hacking said it all for Wattsupsters.
Submitted by Anonymous on November 25, 2009 – 16:23.
“Excellent advice.
Still I thank God, the spirit realm, and the Angel of Hacking who saved us from the plot to harm us economically through the man made global warming hoax.
Thank You!
Thank You!
Thank You!
A lifetime of thank you’s is not enough.”
To which we all say, “Amen”.
James Corbett’s message to the hijacked environmental movement
Looks like we found our whistle blower ~but don’t tell. LOL
Anyone have any idea of the purpose of the variable called “Cheat” in
cru-code/linux/mod/ghcnrefiter.f90 ??
It appears to be used in an adjustment; the writing out of the adjusted data seems to be commented out in this version (maybe debugging output?), but the adjusted value is stored back into the array Addit(XYear).
do XYear = 1, NYear  ! adjust addit data
  if (Addit(XYear).NE.MissVal) then
    New = Cheat + (Multi*(Addit(XYear)**Power))
!   write (99,"(i4,3f10.2)"), XYear, Stand(XYear), Addit(XYear), New
    Addit(XYear) = New
  end if
end do
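Whatever 'Cheat' holds, the loop itself is a simple power-law rescaling applied in place to every non-missing value; a hedged Python sketch of the same logic (the parameter values and the missing-value sentinel below are invented for illustration, not taken from ghcnrefiter.f90):

```python
MISS_VAL = -9999.0  # invented sentinel; the real code uses its own MissVal

def adjust(addit, cheat, multi, power, miss_val=MISS_VAL):
    """Mirror of the quoted loop: New = Cheat + Multi * value**Power,
    applied to every element that is not the missing-value sentinel."""
    return [v if v == miss_val else cheat + multi * (v ** power)
            for v in addit]

# Illustrative call with made-up parameters
print(adjust([10.0, MISS_VAL, 20.0], cheat=1.0, multi=2.0, power=1.0))
# -> [21.0, -9999.0, 41.0]
```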
Curious:
Source:
osborn-tree6/mann/abdlowfreq2grid.pro
Just performed Google search on “Hide the decline” (within double quotes) –
– showed OVER 2.7 million hits!
A while back somebody generated a US temperature record for the 1900s using 5 consistently rural stations from around the country. It showed clearly that the 1930s were the warmest; we dropped until the late 70s and have been zigzagging slowly up until 1998, and then leveled off and declined. The 1998 peak was equal to only 1953, when we were already cooling - still about 0.5 deg less than 1938! If you take the fabricated warming trend from 1978 and cut it in half, it looks a lot like this more realistic data. At the very least, the data was corrupted with UHI, which was not adequately removed, but it appears that this was not good enough and they had to make things more drastic and radical. It's like an addict who slowly takes more and more: "it's just not warm enough yet!"
Using the web as a resource, pls cost-out what it would take to construct on-site wind and solar and associated storage facilities (for when the sun does not shine or when the wind does not blow) to power a 1500 sq. foot house in the north, the south, and the southwest. Wind and solar charts are available to assist in planning for various times of the year. Bulk steel and aluminum pricing is likewise available. These costs will NOT include engineering time (for calculations, the creation of fabrication drawings et al) that will be required to actually build these components.
Putting all this together will create in effect a BOM (bill of materials); also include an estimate of the machining costs and the on-site installation costs (usual costs for tradesmen, a crane for the wind generator etc). Again, we are not including engineering costs.
Finish that, and next lets scale that up for industrial use at a small factory …
Are you capable of this sort of exercise or does it just stop at wishful and idyllic but nonetheless empty platitudes?
Twenty years as a Big 8 CPA computer systems audit manager. This garbage would have gotten the front doors locked, the SEC notified, and a forensic audit started.
All you people complaining about how Briffa and that crowd are trying to hide something in their MXD analysis, read this:
Briffa, K.R., Schweingruber, F.H., Jones, P.D., Osborn, T.J.,
Shiyatov, S.G. and Vaganov, E.A. 1998: Reduced sensitivity of
recent tree-growth to temperature at high northern latitudes. Nature
391, 678–82.
Yep, a great way to hide something is to publish about it in a paper in Nature. It's real smart, clever even: hide it in plain sight and establish an active line of research on this interesting observation.
_Jim (18:23:13) :
yup - my point exactly - it really pisses me off when people talk about "free" anything. OT but heard someone praising the "free UK health care system" - with no reference to the bloody tax we pay!!! Sheesh - liberals!
Do they explain ‘artificial adjustments’ in those pubs?
Do they explain the reason for the values of the series of terms appearing in the code?
Are these values related in any way to physical processes in tree-ring growth?
Rattus,
It’s been mentioned that Keith Briffa appears to be at least somewhat concerned about the shenanigans. Maybe he’s a straight shooter, I don’t know. People can make up their own minds about him. Here’s a good place to start: click
My creative other half came up with this. I thought it was quite funny:
http://www.freeimagehosting.net/uploads/6fa0eea5a0.jpg
Rattus Norvegicus (18:27:52) :
Noted.
http://www.climateaudit.org/?p=529
Nicholas Alexander.
Lots of things sound good, but wind and solar will never save us. There is no way to keep humanity warm, fed, industrious and ALIVE in the northern parts of the Northern Hemisphere without fossil fuels of some sort… or nuclear. It just can never happen, from logistical and engineering aspects.
Unless, of course, we kill off 90 percent of humanity. Perhaps you will volunteer for agathusia … or perhaps the more distasteful, aschimothusia?? ☺ Step right up. No waiting. Take one for the Gipper, and you will spend eternity with virgin Goracles. It's true. ☺
Solar and wind (or whatever the heck you are talking about..perpetual motion perhaps ☺ ) may be great for back-to-the-lander acreage owners, but not for the masses in big northern cities when it is -35°C.
Solar and wind indeed have applications, but for the masses on this planet they are nothing more than eco-weenie dreamin'.
It is hard to hide the decline in commonsense on this earth. ☺
Except here at WUWT … where commonsense prevails. Keep up the fight Anthony. Well done.