WUWT blogging ally Ecotretas writes in to say that he has made a compendium of programming code segments containing comments by the programmer that suggest places where data may be corrected, modified, adjusted, or busted. Some of the HARRY_READ_ME comments are quite revealing. For those who don't understand computer programming, don't fret: the programmer's comments tell the story quite well even if the code itself makes no sense to you.

To say that the CRU code might be “buggy” would be…well I’ll just let CRU’s programmer tell you in his own words.
- FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps15.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps24.pro
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
- FOIA\documents\harris-tree\recon_esper.pro
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
- FOIA\documents\harris-tree\calibrate_nhrecon.pro
;; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline that affects tree-ring density records)
;
- FOIA\documents\harris-tree\recon1.pro
FOIA\documents\harris-tree\recon2.pro
FOIA\documents\harris-tree\recon_jones.pro
;; Specify period over which to compute the regressions (stop in 1940 to avoid
; the decline
;
- FOIA\documents\HARRY_READ_ME.txt
17. Inserted debug statements into anomdtb.f90, discovered that a sum-of-squared variable is becoming very, very negative! Key
output from the debug statements:
(..)
forrtl: error (75): floating point exception
IOT trap (core dumped)
..so the data value is unbfeasibly large, but why does the
sum-of-squares parameter OpTotSq go negative?!!
- FOIA\documents\HARRY_READ_ME.txt
22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software suites - let's have a go at producing CRU TS 3.0! since failing to do that will be the
definitive failure of the entire project..
- FOIA\documents\HARRY_READ_ME.txt
getting seriously fed up with the state of the Australian data. so many new stations have been introduced, so many false references.. so many changes that aren't documented.
Every time a cloud forms I'm presented with a bewildering selection of similar-sounding sites, some with
references, some with WMO codes, and some with both. And if I look up the station metadata with
one of the local references, chances are the WMO code will be wrong (another station will have
it) and the lat/lon will be wrong too.
- FOIA\documents\HARRY_READ_ME.txt
I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO
and one with, usually overlapping and with the same station name and very similar coordinates. I
know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh!
There truly is no end in sight.
- FOIA\documents\HARRY_READ_ME.txt
28. With huge reluctance, I have dived into 'anomdtb' - and already I have that familiar Twilight Zone sensation.
- FOIA\documents\HARRY_READ_ME.txt
Wrote 'makedtr.for' to tackle the thorny problem of the tmin and tmax databases not being kept in step. Sounds familiar, if worrying. am I the first person to attempt
to get the CRU databases in working order?!!
- FOIA\documents\HARRY_READ_ME.txt
Well, dtr2cld is not the world's most complicated program. Wheras cloudreg is, and I immediately found a mistake!
Scanning forward to 1951 was done with a loop that, for completely unfathomable reasons, didn't include months! So we read 50 grids instead of 600!!!
That may have had something to do with it. I also noticed, as I was correcting THAT, that I reopened the DTR and CLD data files when I should have been opening the bloody station files!!
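For those keeping score, the "50 grids instead of 600" is simple arithmetic: the scan loop stepped through years but never through months, so fifty years of monthly grids were read as fifty records rather than six hundred. Here is a rough sketch of that kind of bug, in Python rather than the original Fortran; the 1901 start year is my assumption, inferred only from the fifty-grid count.

# Illustrative sketch only (Python, not the original CRU Fortran) of a scan
# loop that forgets the month dimension. The 1901 start year is assumed.
reads = 0

def read_grid():
    # Stand-in for reading one monthly grid record from the file.
    global reads
    reads += 1

# Buggy scan: one read per year, 1901-1950 -> 50 grids.
for year in range(1901, 1951):
    read_grid()
print("buggy scan read", reads, "grids")      # 50

# Corrected scan: one read per month, 50 years x 12 months -> 600 grids.
reads = 0
for year in range(1901, 1951):
    for month in range(1, 13):
        read_grid()
print("corrected scan read", reads, "grids")  # 600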
- FOIA\documents\HARRY_READ_ME.txt
Back to the gridding. I am seriously worried that our flagship gridded data product is produced by Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station
counts totally meaningless. It also means that we cannot say exactly how the gridded data is arrived
at from a statistical perspective - since we're using an off-the-shelf product that isn't documented
sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?
Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding
procedure? Of course, it's too late for me to fix it too. Meh.
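The "Delaunay triangulation - apparently linear" gridding Harry is worried about is, in generic terms, ordinary piecewise-linear interpolation of scattered station values over a triangulation. A minimal sketch of that general technique (Python/SciPy, not CRU's actual IDL routine; the station coordinates and anomaly values below are invented):

# Generic illustration of linear interpolation over a Delaunay triangulation
# of scattered stations. Not CRU code; stations and values are made up.
import numpy as np
from scipy.interpolate import LinearNDInterpolator

stations = np.array([[10.0, 55.0], [12.5, 57.0], [11.0, 59.5], [14.0, 56.0]])  # lon, lat
anomalies = np.array([0.3, -0.1, 0.7, 0.2])         # station temperature anomalies

interp = LinearNDInterpolator(stations, anomalies)  # builds the triangulation internally

# Evaluate on a regular half-degree grid; points outside the station hull come back NaN.
lons, lats = np.meshgrid(np.arange(10.0, 14.5, 0.5), np.arange(55.0, 60.0, 0.5))
gridded = interp(lons, lats)

Note that nothing in such a scheme keeps track of how many stations actually contribute to a given grid box, which is presumably why Harry says the station counts become meaningless.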
- FOIA\documents\HARRY_READ_ME.txt
Here, the expected 1990-2003 period is MISSING - so the correlations aren't so hot! Yet the WMO codes and station names/locations are identical (or close). What the hell is
supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)
- FOIA\documents\HARRY_READ_ME.txt
Well, it's been a real day of revelations, never mind the week. This morning I discovered that proper angular weighted interpolation was coded into the IDL
routine, but that its use was discouraged because it was slow! Aaarrrgghh.
There is even an option to tri-grid at 0.1 degree resolution and then 'rebin'
to 720x360 - also deprecated! And now, just before midnight (so it counts!),
having gone back to the tmin/tmax work, I've found that most if not all of the
Australian bulletin stations have been unceremoniously dumped into the files
without the briefest check for existing stations.
- FOIA\documents\HARRY_READ_ME.txt
As we can see, even I'm cocking it up! Though recoverably. DTR, TMN and TMX need to be written as (i7.7).
- FOIA\documents\HARRY_READ_ME.txt
OH FUCK THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm hitting yet another problem that's based on the hopeless state of our databases. There is no uniform
data integrity, it's just a catalogue of issues that continues to grow as they're found.
- FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro
printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'
printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'
printf,1,'Reconstruction is based on tree-ring density records.'
printf,1
printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'
printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'
printf,1,'will be much closer to observed temperatures then they should be,'
printf,1,'which will incorrectly imply the reconstruction is more skilful'
printf,1,'than it actually is. See Osborn et al. (2004).'
- FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,'IMPORTANT NOTE:'
printf,1,'The data after 1960 should not be used. The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'
- FOIA\documents\osborn-tree6\combined_wavelet_col.pro
;; Remove missing data from start & end (end in 1960 due to decline)
;
kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
sst=prednh(kl)
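The where() line above simply builds an index of years from 1402 to 1960 and discards everything after the decline sets in. A line-for-line Python equivalent, with hypothetical input arrays:

# Python equivalent of the IDL where() truncation above; yrmxd and prednh
# here are hypothetical stand-ins for the real inputs.
import numpy as np

yrmxd = np.arange(1400, 1995)                                  # hypothetical year axis
prednh = np.random.default_rng(2).standard_normal(yrmxd.size)  # hypothetical NH series

kl = np.where((yrmxd >= 1402) & (yrmxd <= 1960))[0]  # indices kept, ending in 1960
sst = prednh[kl]                                      # series truncated at the decline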
- FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro
; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
; but not as predictands. This PCR-infilling must be done for a number of
; periods, with different EOFs for each period (due to different spatial
; coverage). *BUT* don't do special PCR for the modern period (post-1976),
; since they won't be used due to the decline/correction problem.
; Certain boxes that appear to reconstruct well are "manually" removed because
; they are isolated and away from any trees.
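For readers unfamiliar with the jargon: PCR is principal component regression, i.e. the target series is regressed on the leading principal components (EOF amplitudes) of the predictor field rather than on the raw grid boxes. A bare-bones sketch of the generic technique in Python with synthetic data; this is only an illustration, not the osborn-tree6 IDL code.

# Bare-bones principal component regression (PCR) on synthetic data.
# Generic illustration only; not CRU's mxd_pcr_localtemp.pro.
import numpy as np

rng = np.random.default_rng(0)
mxd = rng.standard_normal((80, 200))   # 80 years x 200 grid boxes of MXD (synthetic)
temp = rng.standard_normal(80)         # 80 years of box temperature (synthetic)

# Leading EOFs/PCs of the centred predictor field via SVD.
mxd_c = mxd - mxd.mean(axis=0)
u, s, vt = np.linalg.svd(mxd_c, full_matrices=False)
n_pcs = 5
pcs = u[:, :n_pcs] * s[:n_pcs]         # principal component time series

# Ordinary least squares of temperature on the retained PCs.
design = np.column_stack([np.ones(temp.size), pcs])
coefs, *_ = np.linalg.lstsq(design, temp, rcond=None)
reconstruction = design @ coefs        # PCR estimate of the temperature series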
- FOIA\documents\osborn-tree6\briffa_sep98_d.pro
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
(...)
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
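For those who don't read IDL: yrloc and valadj define an adjustment curve pinned at 1400 and then every five years from 1904 to 1994, interpol() linearly interpolates that curve onto the years of the series, and the result is simply added to the density data. A Python sketch of the same arithmetic, using the values quoted above (the year axis x and the density series are placeholders):

# Python sketch of the quoted IDL: interpolate the hard-coded adjustment onto
# the series' years and add it. valadj/yrloc are copied from the snippet;
# x and densall are placeholder inputs for illustration.
import numpy as np

yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904))   # 1400, 1904, 1909, ..., 1994
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75  # "fudge factor"

x = np.arange(1400, 1995)                    # placeholder year axis for the density series
densall = np.zeros(x.size)                   # placeholder MXD density series

yearlyadj = np.interp(x, yrloc, valadj)      # equivalent of IDL interpol(valadj, yrloc, x)
densall = densall + yearlyadj                # adjustment added straight onto the data

The adjustment is zero before the mid-20th century and climbs to about +1.95 (2.6 x 0.75) by the 1990s, which is what "correction for decline" means in practice here.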
- FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro
;; Plots density 'decline' as a time series of the difference between
; temperature and density averaged over the region north of 50N,
; and an associated pattern in the difference field.
; The difference data set is computed using only boxes and years with
; both temperature and density in them - i.e., the grid changes in time.
; The pattern is computed by correlating and regressing the *filtered*
; time series against the unfiltered (or filtered) difference data set.
;
;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
- FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
;; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
- FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
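In plainer terms the header describes a two-pass calibration: fit proxy against temperature over a common window (1911-1990), apply that fit to the whole proxy series, then repeat on the "corrected" version. A deliberately simplified sketch of that general idea with made-up series; this is not the CRU procedure itself, which works on gridded, filtered fields.

# Simplified, generic "calibrate over a window, apply everywhere" sketch.
# Made-up series; not the calibrate_correctmxd.pro procedure itself.
import numpy as np

years = np.arange(1881, 1995)
rng = np.random.default_rng(1)
mxd = rng.standard_normal(years.size)                      # stand-in proxy series
temp = 0.5 * mxd + 0.2 * rng.standard_normal(years.size)   # stand-in temperatures

cal = (years >= 1911) & (years <= 1990)                    # calibration window from the header
slope, intercept = np.polyfit(mxd[cal], temp[cal], 1)      # linear calibration fit

calibrated = slope * mxd + intercept                       # fit applied to the full series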
- FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro
; No need to verify the correct and uncorrected versions, since these
; should be identical prior to 1920 or 1930 or whenever the decline
; was corrected onwards from.
- FOIA\documents\osborn-tree5\densplus188119602netcdf.pro
; we know the file starts at yr 440, but we want nothing till 1400, so we
; can skill lines (1400-440)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1980, which is
; (1980-1400)/10 + 1 lines
(...)
; we know the file starts at yr 1070, but we want nothing till 1400, so we
; can skill lines (1400-1070)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1991, which is
; (1990-1400)/10 + 1 lines (since 1991 is on line beginning 1990)
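The line-counting comments are easy to check. With one decade of values per line plus a header line, skipping to 1400 in a file that starts at year 440 means skipping (1400-440)/10 + 1 = 97 lines, and reading 1400 through 1980 means (1980-1400)/10 + 1 = 59 lines. A quick Python check of that arithmetic:

# Quick check of the decade-per-line arithmetic quoted above (Python, not IDL).
first_year, start_year, last_year = 440, 1400, 1980
lines_to_skip = (start_year - first_year) // 10 + 1   # 96 decade lines + 1 header = 97
lines_to_read = (last_year - start_year) // 10 + 1    # decades 1400..1980 inclusive = 59
print(lines_to_skip, lines_to_read)                   # 97 59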
“I know enough about programming though no expert, just from my own limited experience the commands and data input are to quote:” Crap crap ‘..”
Anyone who could write “the commands are crap” knows NOTHING about programming.
I have worked as a programmer or other kind of software engineer since 1982, so I am quite familiar with what goes on in software development. On that basis I can tell you all with great confidence: you are ALL reading HARRY_READ_ME out of context. There is no justification for claiming that this exposes man-made global warming as a fraud. Likewise, there is no justification for claiming that this shows that all the data on which the theory is based is junk.
Rather, most of the posts in this blog are a frenzied exercise in the fallacy of quoting out of context.
As I said on http://www.climateaudit.org/?p=2530#comment-183123
“No professional computer programmer can take the AGW proposition with its forecast seriously when these things are based on the most fallible of all man’s creations – the computer program; especially when the underlying programs are coded by non-professional programmers and not independently audited.
All this discussion on tree rings, SST, Ice Cores, Satellite data etc. is minor in comparison to that simple observation. (It’s still fascinating though).”
Oh alright! So I am having a smug “told-you-so” moment:-)
Nice work here folks, very nice. It’s tragic that we can’t get the MSM to cover this kind of news. They do a horrible injustice to our country with their blatant bias.
Hopefully this evidence of fraud and malfeasance can still get out and stop Obama from passing the absurd Cap and Trade law.
Once again many people fall for the selective data trap. Computer programmers and scientists will often feed artificial data into their code to test the response under a known set of conditions. Deliberately highlighting small portions and comments of many different programs does not accurately represent the total. If you would like to gloat and say I told you so, go ahead: download the data, write the code and analyze it yourself. There is nothing here to warrant any kind of discussion.
I am a theoretical physicist and I am not as surprised as most of the readers seem to be. This is taken out of a context and the only conclusion that I can draw is that we need more data.
Grid and interpolation problems that the poor guy mentions simply cannot be avoided when one deals with real world data – and sometimes the only way to fix the fitting functions is to insert points by hand. That can lead to a minimal uncertainty of the end result but it’s the only way to do it.
Also, you have to understand that you feel very frustrated when you work nights and weekends on your projects and code. I also write all kinds of comments in my personal log files. Science is difficult. If we had all the data and knew all the explanations there would be no point in doing research. There is a reason why it’s called ‘climate science’. By the time things get into textbooks countless hours of work have been done by researchers. And even then it’s not perfect. This particular British group is not the only one in the world dealing with the global warming problem. It’s worrying because there exists a global consensus among researchers that climate change is real and human-made. Even if Jones’ group is inventing data in such a way as to make it more dramatic it will be contradicted by the rest of the scientific community and it will affect their reputation.
My point: Based on this log alone there is little to conclude except that Harry is frustrated and that more data is needed. Luckily, there are currently several international projects in progress that will dramatically increase the amount of climatological data in the near future.
For people who want to read more here is a link that puts things in context:
http://www.realclimate.org/index.php/archives/2009/11/the-cru-hack-context/
I challenge people who get all emotional when arguing against climate change to click on the above link and read the text. I would also suggest that they ask themselves if they can really have an opinion about such an issue if they happen to not understand even the basic science behind it.
http://www.sciencedaily.com/releases/1998/08/980814065506.htm
Why discount tree data after 1960? That seems to be the crux. They say tree rings started *not* matching direct temp. readings. Maybe because CO2 was distorting tree ring growth versus the previous hundreds of years?
And of course they used tree data because it goes back thousands of years, while direct measurement in any reliable sense goes back probably only 200 or so, and that in limited locations.
Sunshine into the process is a very good thing. If this model is worthless it doesn’t mean climate isn’t being changed – it just means this model doesn’t actually speak to the issue. There should be no sides here – it’s a serious concept and deserves science.
I would like to second MG’s point: scientists are always throwing out certain bits of data as nonsense, and sometimes (like Einstein did) they think their model is right and the “data” wrong. Data is always already theory-laden. There may or may not have been some malfeasance here, but the view that “the data” is always king is just naive.
If you would like to see some REAL climate data charts look at:
http://noglobalwarming.info/CanadianClimate.html
I have plotted data available from Environment Canada for a bunch of cities in Canada, most from 1940 to present. There is a constant trend (no warming or cooling) from 1940 – 1980, a small rise from 1980 – 2000 and then a decline since then. Real data does not show hockey stick patterns or any correlation to CO2 levels.
John, please look at where and when you have graphed temperature data. It is only in Canada, and it is only from 1948 to 2008. Sadly I feel as though you do not understand the concept of global warming and climate change. Global warming is considered to be a global phenomenon, not just a point in the northern hemisphere. If you had even looked at your graphs, the only ones which show cooling are the summer averages of Canada and the graph for Vancouver (the farthest south). You are only looking at the summer averages, which is probably less than half of the whole dataset. Also, your graphs seem to support the theory of the Polar Amplification Effect (PAE), which is a byproduct of global warming. This is where the poles tend to feel planetary trends quicker than a global response. You can see this in the Resolute Bay data (closest to the poles), which from 1948 to present shows a 1.5 degree increase. Your graphs also seem to cover only a short time period of when humans have been adding CO2 to the atmosphere. A more appropriate time period would be to graph temperature from the start of the industrial revolution (end of the 18th century to start of the 19th century) to the present, since this is when we started adding CO2 to the atmosphere. Also, please look up what the hockey stick graph actually is. This graph covers the past 1000 years to the present, not the 70-ish years you have plotted. If you can come up with a better way to reconstruct temperatures from before accurate records, please share it with the rest of the world, since the best and the brightest apparently couldn’t. Please do ask more questions and read some papers on climate change before you make such sweeping statements.
The sad thing is that, even though these emails ultimately amount to a non-story, denialists will be citing them falsely for years, just like Limbaugh’s volcano fabrication. They read the snippets and quotes and completely ignore the fact that the ellipses on either side are placed very strategically by people who want the emails to be considered scandalous.
My technical writing professor said it correctly: “Question what you read, because you can manipulate statistics till your hand gets sticky.”
From these notes it sounds like the programmer is struggling with (1) messy data, and (2) a number of necessary modeling assumptions (whether accurate or not based on the literature) being applied to the data. These are universal issues facing any statistical programmer/researcher.
Taking these out of context without looking at the whole is very misleading. I had a programmer working for me become self-righteous and convinced that I was up to no good, because they didn’t understand the assumptions and model tweaks that the rest of the research literature suggested and other analyses required.
That said, all of this should be open to cross-validation and replication, with all assumptions made transparent to peer review.
Imagine if we could see the ‘clean-up’ procedures that big-Pharma uses internally to support approval of a billion-dollar drug…no pressure on an analyst there to throw out an outlier!
While getting my MBA my statistics instructor presented the class with a cute little pocket-sized book with the tongue-in-cheek title, “How to Lie with Statistics.” I have since lost that book, to my dismay. From what I can fathom, it appears CRU used it as their business model. Am tempted to ask if I could have/purchase one of their copies.
Cayman, look here for your book: http://www.amazon.com/gp/product/0393310728/ref=pd_lpo_k2_dp_sr_1?pf_rd_p=486539851&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=039309426X&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=13ECR2DHBZDJ1SXX6EQ4
quote:
Mike Nicholson (10:32:34) :
Can anyone tell me why Copenhagen is still going ahead?? !
Yes, I’m afraid I think I can.
It appears that this was always about the money.
As they say “follow the money”.
This is a pretty good explanation:
http://www.marketoracle.co.uk/Article15471.html
Suddenly everything starts to make sense.
The village idiot of the day is Robert Gibbs. Even with all of the evidence from Climategate showing that temperature records were manipulated, Robert Gibbs proclaims climate change is settled science. What a doofus. You would think that guy has been locked in a closet for years.
Robert Gibbs, let’s see what happens if your boss pushes to pass Cap and Tax or goes to Copenhagen and signs a treaty for us to pay other countries trillions of dollars for an issue that is not determined to be an issue after all.
Thanks to the Wall Street Journal for being about the only mainstream press outlet investigating and giving us insight into the issue.
I am a programmer. I have been an amateur meteorologist my entire lifetime. Global warming or climate change associated with man has been and will always be a complete and utter farce. The emails can be debated… so be it. The program cannot. It is exact and indicates a complete lack of ethics and professionalism. If this program were used in a pharma or healthcare application, the applicants would die harsh and cruel deaths, nearly immediately. The programmer, the company and the industrial complex that promoted the application would be sued out of existence and the executives sentenced to long stints in prison. I think that the same is due these folks that have been playing loosely with an unsuspecting ‘global’ public, to the order of fraud and racketeering on the public dole. Get ’em!
I would like Dodge, Bulldog, regression, and Craig Moore to read this and give me their opinion…
http://www2.sunysuffolk.edu/mandias/global_warming/global_warming_misinformation_reasons.html
If you have been through grad school you will realize scientists are always after the truth, even if it does not support their hypothesis….
It’s SO clear that AGW is one of the biggest frauds ever perpetrated in the name of ‘science’. In this blog, the professional programmers and the Alarmist Loons are given equal space. I suppose that’s fair, but not very productive. Anyone with a working brain can tell who’s who.
No one has yet mentioned Michael Crichton’s ‘State of Fear’ (HarperCollins, 2004). Sure, it’s an adventure novel, and certainly not ‘science’, per se. Yet the meticulous research that went into it (thoroughly documented in the bibliography) makes it clear that the whole AGW enterprise is ‘agenda-driven’ by large money and political power.
Peer-review can now be seen for the farce that it is: ‘my peers are the people who pay for the garbage I produce, and share in the government largess which is our just reward’.
I’m sickened. A trillion dollars for this insanity?
The most pessimistic posts are probably right: it is too late to stop the train. The ‘science’ is ‘settled’. And Science is the big loser. We may as well go back to witchcraft, astrology, and eugenics.
The site referred to by ‘aceandgary’ is DEFINITELY a must-see. It uses MSM sources to ‘prove’ that ‘the science is settled’.
What is much more disturbing is the agenda, revealed for all to see: AGW alarmists are shown to be creatures of the political Left – globalists, socialists, the same ‘share the wealth’ folks that promised us ‘Change’.
‘From each according to his ability, to each according to his needs’. Chilling. The fact that otherwise well-intentioned people could still believe this isn’t surprising, but it speaks to what has become of education. The Orwellian nightmare has become reality.
‘If you have been through grad school you will realize scientists are always after the truth.’ So naive it’s not worth comment. Maybe ‘aceandgary’ meant GRADE school.
Google is up to speed now: type in ‘cli’ and the FIRST auto-suggest is ‘climate-gate’, 10,400,000 hits!
And Real Climate is terrified! They’re so worried, they feel the need to refute Crichton’s FICTION.
I’m starting to love carbon dioxide; my houseplants are getting Greener. Credit goes to Al Gore’s Information Superhighway.
Keep it up; they’re AFRAID now!
Anti-Climax-Gate
Novice question. Why does everyone assume the code notes are the original and not added later by the hacker/whistleblower to really grab attention to certain parts for people like you?
Stupid question? Can you change those notes without affecting the program?