WUWT blogging ally Ecotretas writes in to say that he has made a compendium of programming code segments containing comments by the programmer that suggest places where data may be corrected, modified, adjusted, or busted. Some of the HARRY_READ_ME comments are quite revealing. For those who don't understand computer programming, don't fret: the comments by the programmer tell the story quite well even if the code itself makes no sense to you.

To say that the CRU code might be “buggy” would be…well I’ll just let CRU’s programmer tell you in his own words.
- FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps15.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps24.pro
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
- FOIA\documents\harris-tree\recon_esper.pro
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
- FOIA\documents\harris-tree\calibrate_nhrecon.pro
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline that affects tree-ring density records)
;
- FOIA\documents\harris-tree\recon1.pro
FOIA\documents\harris-tree\recon2.pro
FOIA\documents\harris-tree\recon_jones.pro
;
; Specify period over which to compute the regressions (stop in 1940 to avoid
; the decline
;
- FOIA\documents\HARRY_READ_ME.txt
17. Inserted debug statements into anomdtb.f90, discovered that
a sum-of-squared variable is becoming very, very negative! Key
output from the debug statements:
(..)
forrtl: error (75): floating point exception
IOT trap (core dumped)
..so the data value is unbfeasibly large, but why does the
sum-of-squares parameter OpTotSq go negative?!!
- FOIA\documents\HARRY_READ_ME.txt
22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software
suites - let's have a go at producing CRU TS 3.0! since failing to do that will be the
definitive failure of the entire project..
- FOIA\documents\HARRY_READ_ME.txt
getting seriously fed up with the state of the Australian data. so many new stations have been
introduced, so many false references.. so many changes that aren't documented. Every time a
cloud forms I'm presented with a bewildering selection of similar-sounding sites, some with
references, some with WMO codes, and some with both. And if I look up the station metadata with
one of the local references, chances are the WMO code will be wrong (another station will have
it) and the lat/lon will be wrong too.
- FOIA\documents\HARRY_READ_ME.txt
I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as
Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO
and one with, usually overlapping and with the same station name and very similar coordinates. I
know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh!
There truly is no end in sight.
- FOIA\documents\HARRY_READ_ME.txt
28. With huge reluctance, I have dived into 'anomdtb' - and already I have
that familiar Twilight Zone sensation.
- FOIA\documents\HARRY_READ_ME.txt
Wrote 'makedtr.for' to tackle the thorny problem of the tmin and tmax databases not
being kept in step. Sounds familiar, if worrying. am I the first person to attempt
to get the CRU databases in working order?!!
- FOIA\documents\HARRY_READ_ME.txt
Well, dtr2cld is not the world's most complicated program. Wheras cloudreg is, and I
immediately found a mistake! Scanning forward to 1951 was done with a loop that, for
completely unfathomable reasons, didn't include months! So we read 50 grids instead
of 600!!! That may have had something to do with it. I also noticed, as I was correcting
THAT, that I reopened the DTR and CLD data files when I should have been opening the
bloody station files!!
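For non-programmers, the class of bug Harry describes there is easy to illustrate. Below is a minimal sketch in Python (not CRU's Fortran; the read_grid stand-in and the 50-year span are assumptions for illustration): monthly grids must be read in a nested year-and-month loop, and dropping the inner loop silently reads one-twelfth of the data.

# Hypothetical illustration of the missing-months loop bug (not CRU code).
YEARS = range(1901, 1951)   # 50 years to scan forward
MONTHS = range(1, 13)       # 12 months per year

def read_grid(year, month=None):
    """Stand-in for reading one monthly grid from file."""
    return (year, month)

# Buggy scan: the months loop is missing, so only 50 grids are read.
grids_buggy = [read_grid(y) for y in YEARS]

# Corrected scan: one grid per month, i.e. 50 * 12 = 600 grids.
grids_fixed = [read_grid(y, m) for y in YEARS for m in MONTHS]

print(len(grids_buggy), len(grids_fixed))   # 50 600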
- FOIA\documents\HARRY_READ_ME.txt
Back to the gridding. I am seriously worried that our flagship gridded data product is produced by
Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station
counts totally meaningless. It also means that we cannot say exactly how the gridded data is arrived
at from a statistical perspective - since we're using an off-the-shelf product that isn't documented
sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?
Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding
procedure? Of course, it's too late for me to fix it too. Meh.
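To make Harry's worry concrete, here is a minimal sketch of linear interpolation over a Delaunay triangulation, the kind of off-the-shelf gridding he describes. This is Python/SciPy with made-up station data, not CRU's IDL; it shows only the technique: every grid cell inside the triangulation gets a value no matter how sparse the surrounding stations are, which is why the station counts stop meaning much.

# Sketch of Delaunay-based linear gridding (illustrative data, not CRU's).
import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(0)
lons = rng.uniform(-10.0, 10.0, 30)    # 30 fake station longitudes
lats = rng.uniform(40.0, 60.0, 30)     # ...and latitudes
temps = 15.0 - 0.5 * (lats - 40.0)     # fake temperatures, cooling northward

# LinearNDInterpolator triangulates the stations (Delaunay) and then
# interpolates linearly within each triangle.
interp = LinearNDInterpolator(np.column_stack([lons, lats]), temps)

glon, glat = np.meshgrid(np.linspace(-10, 10, 41), np.linspace(40, 60, 41))
grid = interp(glon, glat)              # NaN outside the stations' convex hull
print(np.nanmean(grid))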
- FOIA\documents\HARRY_READ_ME.txt
Here, the expected 1990-2003 period is MISSING - so the correlations aren't so hot! Yet
the WMO codes and station names/locations are identical (or close). What the hell is
supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)
- FOIA\documents\HARRY_READ_ME.txt
Well, it's been a real day of revelations, never mind the week. This morning I
discovered that proper angular weighted interpolation was coded into the IDL
routine, but that its use was discouraged because it was slow! Aaarrrgghh.
There is even an option to tri-grid at 0.1 degree resolution and then 'rebin'
to 720x360 - also deprecated! And now, just before midnight (so it counts!),
having gone back to the tmin/tmax work, I've found that most if not all of the
Australian bulletin stations have been unceremoniously dumped into the files
without the briefest check for existing stations.
- FOIA\documents\HARRY_READ_ME.txt
As we can see, even I'm cocking it up! Though recoverably. DTR, TMN and TMX need to be written as (i7.7).
- FOIA\documents\HARRY_READ_ME.txt
OH FUCK THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm
hitting yet another problem that's based on the hopeless state of our databases. There is no uniform
data integrity, it's just a catalogue of issues that continues to grow as they're found.
- FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro
printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'
printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'
printf,1,'Reconstruction is based on tree-ring density records.'
printf,1
printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'
printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'
printf,1,'will be much closer to observed temperatures then they should be,'
printf,1,'which will incorrectly imply the reconstruction is more skilful'
printf,1,'than it actually is. See Osborn et al. (2004).'
- FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,'IMPORTANT NOTE:'
printf,1,'The data after 1960 should not be used. The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'
- FOIA\documents\osborn-tree6\combined_wavelet_col.pro
;
; Remove missing data from start & end (end in 1960 due to decline)
;
kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
sst=prednh(kl)
- FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro
; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
; but not as predictands. This PCR-infilling must be done for a number of
; periods, with different EOFs for each period (due to different spatial
; coverage). *BUT* don't do special PCR for the modern period (post-1976),
; since they won't be used due to the decline/correction problem.
; Certain boxes that appear to reconstruct well are "manually" removed because
; they are isolated and away from any trees.
- FOIA\documents\osborn-tree6\briffa_sep98_d.pro
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
(...)
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
- FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro
;
; Plots density 'decline' as a time series of the difference between
; temperature and density averaged over the region north of 50N,
; and an associated pattern in the difference field.
; The difference data set is computed using only boxes and years with
; both temperature and density in them - i.e., the grid changes in time.
; The pattern is computed by correlating and regressing the *filtered*
; time series against the unfiltered (or filtered) difference data set.
;
;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
- FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
- FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
- FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro
; No need to verify the correct and uncorrected versions, since these
; should be identical prior to 1920 or 1930 or whenever the decline
; was corrected onwards from.
- FOIA\documents\osborn-tree5\densplus188119602netcdf.pro
; we know the file starts at yr 440, but we want nothing till 1400, so we
; can skill lines (1400-440)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1980, which is
; (1980-1400)/10 + 1 lines
(...)
; we know the file starts at yr 1070, but we want nothing till 1400, so we
; can skill lines (1400-1070)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1991, which is
; (1990-1400)/10 + 1 lines (since 1991 is on line beginning 1990)
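The header arithmetic in those comments is easy to check. A small sketch in Python, assuming (as the comments state) one line per decade plus one header line:

# Check the line-skipping arithmetic quoted from densplus188119602netcdf.pro.
def lines_to_skip(first_year, want_from=1400, years_per_line=10, header=1):
    """Decadal lines before the wanted start year, plus the header line."""
    return (want_from - first_year) // years_per_line + header

def lines_to_read(first_want, last_want, years_per_line=10):
    """Inclusive count of decadal lines covering the wanted span."""
    return (last_want - first_want) // years_per_line + 1

print(lines_to_skip(440))           # (1400-440)/10 + 1 = 97
print(lines_to_read(1400, 1980))    # (1980-1400)/10 + 1 = 59
print(lines_to_skip(1070))          # (1400-1070)/10 + 1 = 34
print(lines_to_read(1400, 1990))    # 1991 sits on the line beginning 1990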
“Averaging day and night temperatures has been reported as enhancing perceived warming in England as daytime alone shows less, or no, warming.” – Wasp
This rings bells – have read something similar about day and night temperatures before, probably at CA. Was it that increased night temperatures are an urban heat island effect? Arg, anyone remember this one?
(Met Office was announcing today that 2009 begs to be hotter again – BBC are happy to jump in as if being “one of 10 hottest years” is unusual – when you’re on top of a big hill, every step is a high one!)
Does anyone seriously think that a non-partisan review board can be put together to sort all of this out? 95% of those who would be appointed to it already believe that there is no need because “the science is already done”. The only way to shake things up is to come up with a serious class action suit, see it through the courts and get a big payout. Maybe some of the people squeezed out of jobs or blocked from publishing in professional journals might have a case??? After all, they have suffered professional and personal trauma. Also get the courts to mandate a review of all the information and findings. I am not a lover of our activist court system in the US, but it can mandate that things get done when the politicians refuse to move. And we have a Supreme Court that might be sympathetic.
Someone asked what this Harry thing is all about. From my reading of things, he is trying to duplicate the results of the original 1.0 and 2.0 versions in a new version called 3.0. It appears that along the way, they really did lose either some data, the instructions on how this works, or the people that originally ran it (Tim??), or the code, or several of the above. My take is that Harry was assigned to take what code he could find and make v3.0. To do that he had to first recreate the original results. If it couldn’t reproduce the original results, then it would be obvious that something was wrong. YMMV.
Anyone interested in trees should read up on earthworms and their reintroduction into North America by the European colonists.
Here is a gem;
DOI 10.1007/s10530-009-9523-3
Tree rings detect earthworm invasions and their effects in northern hardwood forests
Evan R. Larson, Kurt F. Kipfmueller, Cindy M. Hale, Lee E. Frelich, Peter B. Reich
Biological Invasions (2009) ISSN 1387-3547 (Print) 1573-1464 (Online)
[ http://www.youtube.com/watch?v=nEiLgbBGKVk ]
REPLY: already on the main page of WUWT but thanks – Anthony
Do not the two plots in the following reference illustrate the “trick”?
http://www.uea.ac.uk/mac/comm/media/press/2009/nov/homepagenews/CRUupdate
If it’s that bad, maybe it’s time to start thinking about shooting ourselves. Anyone here got any good reason why not?
Getting a negative sum of squares was a neat trick.
Colin W (12:32:58) :
As a software engineer I ask: where are the test cases that prove the proper functioning of all this code? Surely there could be test data sets that could be fed in to check that the output is as expected, before applying the code to real data? Test the handling of missing stations, duplicated stations, and wildly varying Tmin/Tmax to generate alerts during generation.
If this is not done, then a simple programming bug introduced when modifying code will remain hidden and be very difficult to discover. Only code reviews or blind luck would catch it, without test cases.
Perceptive comment. I doubt the climate ‘scientists’ involved ever found time or advice for this sort of rigor.
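To make Colin W's point concrete, here is a sketch of the kind of regression test he means, using Python's built-in unittest. The duplicate-station check and the station records are hypothetical stand-ins, not CRU code; the point is that a known-bad input should trip an alert before any real data is processed.

import unittest

def find_duplicate_stations(stations, tol=0.05):
    """Hypothetical check: flag pairs with the same name and near-identical coordinates."""
    dupes = []
    for i, a in enumerate(stations):
        for b in stations[i + 1:]:
            if (a["name"] == b["name"]
                    and abs(a["lat"] - b["lat"]) <= tol
                    and abs(a["lon"] - b["lon"]) <= tol):
                dupes.append((a["name"], a["wmo"], b["wmo"]))
    return dupes

class TestStationChecks(unittest.TestCase):
    def test_duplicate_pair_is_flagged(self):
        stations = [
            {"name": "COBAR", "lat": -31.50, "lon": 145.80, "wmo": None},
            {"name": "COBAR", "lat": -31.48, "lon": 145.83, "wmo": 12345},  # made-up WMO code
        ]
        self.assertEqual(len(find_duplicate_stations(stations)), 1)

    def test_distinct_stations_pass(self):
        stations = [
            {"name": "COBAR", "lat": -31.50, "lon": 145.80, "wmo": None},
            {"name": "BOURKE", "lat": -30.09, "lon": 145.94, "wmo": 67890},  # made-up WMO code
        ]
        self.assertEqual(find_duplicate_stations(stations), [])

if __name__ == "__main__":
    unittest.main()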
I just saw this issue discussed on CNN for the first time. The spin was amazing. They stated the old 2500 scientists support AGW and mentioned Obama was still planning on going to Copenhagen and committing to a 17% reduction in emissions by 2020 and 80% by 2050. They also quoted polls that stated 72% of Americans believed in AGW.
So, the game is on …
I hate to say it, but some of the code cited in FOIA\documents\osborn-tree6\briffa_sep98_d.pro does not exist in my copy of that file. Specifically the line which creates the densall object is not in that file. Also, if you look at that file you will see that the yearlyadj values are *NOT* used in the code. At one point they were, but in the extant version of the code that line is commented out.
Re: Robert Wykoff 14:30:24
Cripes!! That’s 6 degrees Fahrenheit per doubling!
Robinson (11:58:42) :
> Ironically, the University of East Anglia has a Computer Science department:
> The fact is that they have the expertise on campus to engineer a decent piece of software.
Yes, but do they teach software engineering?
I like to distinguish between science and engineering as:
Science is the act of developing new tools (e.g. wheels, thermometers, IR sensors)
Engineering is the act of developing new systems out of tools created by scientists. (e.g. cars, weather stations, remote sensing satellites).
I know several very good computer scientists who could never become good software engineers, they get distracted by new things and don’t focus on the system at hand.
The lack of polish, inattention to detail, likely development on smaller datasets than what Harry is using, etc., show up throughout Poor Harry’s Diary. The overflowing sum of squares in the standard deviation calculation is likely a good example.
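One classic way a sum of squares “goes negative” is fixed-width signed integer overflow. A minimal sketch (Python with NumPy mimicking 32-bit arithmetic; whether anomdtb.f90 actually used a 32-bit integer accumulator is an assumption, and the variable name merely echoes Harry’s OpTotSq):

import numpy as np

# With a fixed-width signed type, squaring a large value wraps around
# to a negative number instead of raising an error.
value = np.int32(50000)              # an "unfeasibly large" data value
with np.errstate(over="ignore"):     # silence NumPy's overflow warning
    op_tot_sq = value * value        # 2.5e9 exceeds the int32 max (~2.147e9)
print(op_tot_sq)                     # -1794967296: the "sum of squares" is negative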
I sincerely hope the police investigating the hack/leak from CRU have retained a backup of the CRU server at the time the crime was reported, so that this code can be verified to be exactly what CRU have been working with.
Otherwise we might find in the course of the investigation that CRU’s server has unfortunately suffered a catastrophic and unrecoverable failure.
Whoops.. global warming ate my hard drive.
Rattus Norvegicus (14:58:28) :
Try
\documents\harris-tree\briffa_sep98_e
I’ve made a good living as a computer programmer, and I do stuff like that all the time. If you saw my code, you’d see comments like “The customer insists that I do this even though I know it makes the outcome skewed in a [positive/negative] direction.”
In one case, I was adding on to an existing program that needed to find the volume of spheres, to calculate material loss (metal). The original program had the volume of a sphere as (3/4)(PI)(R³). Obviously, the fraction had been flipped by mistake. When I brought this to the customer’s attention, I was told to leave it alone! For if I changed it, the LOSS would’ve counted for LESS (because the volume would be correct, and higher), and thus the loss reimbursement (which translated into local stimulus dollars) would be lower! So I added a comment that said: “Of course I know the correct formula for the volume of a sphere, but the customer insists on the wrong formula.”
I have numerous examples of customers insisting on SLOWER-PERFORMING processes, because they didn’t want to raise the expectations of the end users, who might learn to expect data to come back “instantaneously.” So I would add comments like “If you ever want this process to perform better, do X, Y and Z. The customer has requested such-n-such roadblock.”
My point: this happens ALL THE TIME, and the programmer is almost ALWAYS working “for someone else,” not for themselves: “just doing as I’m told.”
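For the record, the flipped fraction in that anecdote is easy to quantify. A small sketch (the radius is arbitrary): the wrong formula understates the volume by a factor of (3/4)/(4/3) = 9/16.

import math

def sphere_volume(r):
    """Correct formula: V = (4/3) * pi * r**3."""
    return (4.0 / 3.0) * math.pi * r ** 3

def sphere_volume_flipped(r):
    """The customer's flipped fraction: (3/4) * pi * r**3."""
    return (3.0 / 4.0) * math.pi * r ** 3

r = 2.0
print(sphere_volume(r))           # ~33.51
print(sphere_volume_flipped(r))   # ~18.85, exactly 9/16 of the true volume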
Proving a graph was produced by a programmer who was working with approximations, estimates and a specified goal may say something about that programmer and the process. The code can be replaced, the programmer can find another job.
But the fact remains, humanity is polluting the world and could easily do something about that. Global warming and climate change are symptoms of pollution. It is not very progressive to carry on poisoning the air.
Free, clean energy is available if we invest in the right things and create solutions. The oil industry is not interested in free energy. But humanity will evolve more quickly.
Quondam (14:53:36) :
Do not the two plots in the following reference illustrate the “trick”?
http://www.uea.ac.uk/mac/comm/media/press/2009/nov/homepagenews/CRUupdate
– yes – you can see how the proxy data stops in 1960 or 1980, and doesn’t follow the real-temp
– without the two black lines, you don’t actually get the hockey-stick
– but I think the up-tick on the red-line is the ‘trick’
Claude Harvey (14:02:31) :
http://i49.tinypic.com/2gy8w9v.jpg
there you live, don't you?
Anthony
Joseph in Florida’s and KlausB’s messages above concerning ‘Climategate II’ in New Zealand deserve a look and a post of their own.
KlausB’s linked pdf – of manipulated temp records – shows a direct connection with CRU…
“Dr Jim Salinger (who no longer works for NIWA [National Institute of Water & Atmospheric Research]) started this graph in the 1980s when he was at CRU (Climate Research Unit at the University of East Anglia, UK) and it has been updated with the most recent data.”
REPLY: I’ll have a look – A
1st. The HARRY_READ_ME file is a collection of notes from the programmer (unknown) to Ian ‘Harry’ Harris.
J.Ferg (09:55:10) :
E.M. Smith is better qualified to answer this, and I believe GISTemp does the same thing with station temps. You note that months with 9999 are ignored, but what isn’t obvious from that is that there may be only one day of readings missing. The obvious thing to do would be to take the average of the adjacent days to salvage the month, but they don’t; they just write the month off.
Talking of E.M. Smith, I believe he’s fixed the -ve sum of squares problem, though I’ve not been over there to find out.
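A minimal sketch of the two policies J.Ferg contrasts (Python; the 9999 sentinel follows the comment, while the day-level layout and five-day "month" are assumptions for illustration):

MISSING = 9999  # sentinel for a missing daily reading

def monthly_mean_strict(days):
    """Write the month off if any day is missing (the policy described)."""
    if MISSING in days:
        return None
    return sum(days) / len(days)

def monthly_mean_salvage(days):
    """Fill an isolated gap with the mean of its neighbours, then average."""
    filled = list(days)
    for i, v in enumerate(filled):
        if v == MISSING and 0 < i < len(filled) - 1:
            filled[i] = (filled[i - 1] + filled[i + 1]) / 2
    if MISSING in filled:
        return None  # gap at an edge, or consecutive gaps: still unusable
    return sum(filled) / len(filled)

days = [12.0, 13.5, MISSING, 14.0, 12.5]   # toy 5-day "month"
print(monthly_mean_strict(days))            # None: whole month discarded
print(monthly_mean_salvage(days))           # 13.15: month salvaged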
Hank Hancock (12:39:10) :
I’m so happy I’d put my drink down, you would have owed me a new keyboard & monitor 😉
DaveE.
American Thinker has an EXCELLENT article that uncovers the code that actually shows tampering with tree ring proxy temperatures to make them match instrument temperatures from 1930 to 1994 in a folder attributed to Michael Mann, and between 1904 and 1994 in other code titled briffa_sep98_d.pro and briffa_sep98_e.pro.
From the article:
http://www.americanthinker.com/2009/11/crus_source_code_climategate_r.html
In fact, workarounds for the post-1960 “divergence problem,” as described by both RealClimate and Climate Audit, can be found throughout the source code. So much so that perhaps the most ubiquitous programmer’s comment (REM) I ran across warns that the particular module “Uses ‘corrected’ MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
Here’s the “fudge factor” (notice the brash SOB actually called it that in his REM statement):
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75 ; fudge factor
These two lines of code establish a twenty-element array (yrloc) comprising the year 1400 (base year, but not sure why needed here) and nineteen years between 1904 and 1994 in half-decade increments. Then the corresponding “fudge factor” (from the valadj matrix) is applied to each interval. As you can see, not only are temperatures biased to the upside later in the century (though certainly prior to 1960), but a few mid-century intervals are being biased slightly lower. That, coupled with the post-1930 restatement we encountered earlier, would imply that in addition to an embarrassing false decline experienced with their MXD after 1960 (or earlier), CRU’s “divergence problem” also includes a minor false incline after 1930.
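For readers who want to see the arithmetic, here is the same construction transcribed into Python (NumPy's interp standing in for IDL's interpol; the zero-valued reconstruction series is a made-up placeholder, not CRU data):

import numpy as np

# Transcription of the IDL: 1400 plus 1904..1994 in 5-year steps (20 knots).
yrloc = np.concatenate(([1400], np.arange(19) * 5.0 + 1904))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1, 0.3,
                   0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

x = np.arange(1400, 1995)                  # years of the series being adjusted
densall = np.zeros_like(x, dtype=float)    # placeholder reconstruction values

# interpol(valadj, yrloc, x) linearly interpolates the adjustment to each year.
yearlyadj = np.interp(x, yrloc, valadj)
densall = densall + yearlyadj

print(yearlyadj[x == 1904])   # 0.0: no adjustment at the first knot
print(yearlyadj[x == 1994])   # 1.95: i.e. 2.6 * 0.75 added at the last knot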
And the former apparently wasn’t a particularly well-guarded secret, although the actual adjustment period remained buried beneath the surface.
Plotting programs such as data4alps.pro print this reminder to the user prior to rendering the chart:
IMPORTANT NOTE: The data after 1960 should not be used. The tree-ring density records tend to show a decline after 1960 relative to the summer temperature in many high-latitude locations. In this data set this “decline” has been artificially removed in an ad-hoc way, and this means that data after 1960 no longer represent tree-ring density variations, but have been modified to look more like the observed temperatures.
Others, such as mxdgrid2ascii.pro, issue this warning:
NOTE: recent decline in tree-ring density has been ARTIFICIALLY REMOVED to facilitate calibration. THEREFORE, post-1960 values will be much closer to observed temperatures then (sic) they should be which will incorrectly imply the reconstruction is more skilful than it actually is. See Osborn et al. (2004).
Care to offer another explanation, Dr. Jones?
The debate about the contents of HARRY_READ_ME.txt and the validity of the programming and modelling techniques is something only experts can argue over.
However, all the lay person needs to know about the programming (and they can verify this for themselves from the HARRY_READ_ME.txt file) is that this file is a THREE YEAR journal of a CRU programmer describing everything he tried with the data and models in an attempt to REPRODUCE existing results CRU had published. Comments in the file make it clear that “HARRY” tried FOR THREE YEARS (2006-2009) to recreate CRU’s published results AND FAILED.
Do you all see the REAL significance of this? It is absolutely fatal to the credibility of anything CRU has produced.
What we have here is a documented THREE year effort by a CRU programmer who had access to all the data, all the code, and all the people who developed the code and the models, and still he could NOT duplicate CRU’s OWN results. If he can’t, it simply means that CRU’s results cannot be reproduced even by CRU itself, so there is no point in anyone else even trying: CRU have proven it is a waste of time, and in doing so have proven their own results are plain rubbish. That means any “peer reviewed” document CRU produced, along with any other papers that cited the CRU papers, is based on data that CRU themselves can’t verify.
Besides that, the absolutely sorry state of data handling and software management that HARRY_READ_ME.txt reveals, the utter and total mess of CRU data and software, is WHY CRU has not released its data and model software.
Given that CRU is one of the most cited sources of climate data, if not the most cited, and that trillions of dollars of economic policy are being set upon it, the importance of what the HARRY_READ_ME.txt file reveals becomes scary.
A very nice layman’s summary of some of the issues in the HARRY_READ_ME.txt can be found here
http://www.devilskitchen.me.uk/2009/11/data-horribilis-harryreadmetxt-file.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+TheDevilsKitchen+%28The+Devil%27s+Kitchen
Copenhagen will go on pretending that nothing has happened because it is part of what documentary filmmaker Ann McElhinney calls “the moveable feast.”
But I do not despair that “we are too late.” The global warming alarmists are at the stage where they are gearing up to demand far more than millions for research and obedient lip service. Now they are gunning for trillions in payments and global governance to enforce the payments. There is going to be pushback from the nation states of the developed world against this, and those doing the pushing are looking for evidence. They have reasons to protect the CRU leak story from being suppressed, as the German princes had reasons to protect Martin Luther from being burned as a heretic.
John Peter (10:55:08) : “…What do you think about this one? It looks as if the “data adjustment contamination” has infected New Zealand as well….”
Looks like they need to call in Phil Jones’s dog to take care of that nasty old ‘uncorrected’ data.