WUWT blogging ally Ecotretas writes in to say that he has made a compendium of programming code segments that show comments by the programmer suggesting places where data may be corrected, modified, adjusted, or busted. Some of the HARRY_READ_ME comments are quite revealing. For those who don’t understand computer programming, don’t fret: the comments by the programmer tell the story quite well even if the code itself makes no sense to you.
To say that the CRU code might be “buggy” would be…well I’ll just let CRU’s programmer tell you in his own words.
- FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps15.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps24.pro
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
- FOIA\documents\harris-tree\recon_esper.pro
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
- FOIA\documents\harris-tree\calibrate_nhrecon.pro
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline that affects tree-ring density records)
;
- FOIA\documents\harris-tree\recon1.pro
FOIA\documents\harris-tree\recon2.pro
FOIA\documents\harris-tree\recon_jones.pro
;
; Specify period over which to compute the regressions (stop in 1940 to avoid
; the decline
;
- FOIA\documents\HARRY_READ_ME.txt
17. Inserted debug statements into anomdtb.f90, discovered that
a sum-of-squared variable is becoming very, very negative! Key
output from the debug statements:
(..)
forrtl: error (75): floating point exception
IOT trap (core dumped)
..so the data value is unbfeasibly large, but why does the
sum-of-squares parameter OpTotSq go negative?!!
- FOIA\documents\HARRY_READ_ME.txt
22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software
suites - let's have a go at producing CRU TS 3.0! since failing to do that will be the
definitive failure of the entire project..
- FOIA\documents\HARRY_READ_ME.txt
getting seriously fed up with the state of the Australian data. so many new stations have been
introduced, so many false references.. so many changes that aren't documented.
Every time a cloud forms I'm presented with a bewildering selection of similar-sounding sites, some with
references, some with WMO codes, and some with both. And if I look up the station metadata with
one of the local references, chances are the WMO code will be wrong (another station will have
it) and the lat/lon will be wrong too.
- FOIA\documents\HARRY_READ_ME.txt
I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as
Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO
and one with, usually overlapping and with the same station name and very similar coordinates. I
know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh!
There truly is no end in sight.
- FOIA\documents\HARRY_READ_ME.txt
28. With huge reluctance, I have dived into 'anomdtb' - and already I have
that familiar Twilight Zone sensation.
- FOIA\documents\HARRY_READ_ME.txt
Wrote 'makedtr.for' to tackle the thorny problem of the tmin and tmax databases not
being kept in step. Sounds familiar, if worrying. am I the first person to attempt
to get the CRU databases in working order?!!
- FOIA\documents\HARRY_READ_ME.txt
Well, dtr2cld is not the world's most complicated program. Wheras cloudreg is, and I
immediately found a mistake!
Scanning forward to 1951 was done with a loop that, for completely unfathomable reasons, didn't include months! So we read 50 grids instead
of 600!!! That may have had something to do with it. I also noticed, as I was correcting
THAT, that I reopened the DTR and CLD data files when I should have been opening the
bloody station files!!
- FOIA\documents\HARRY_READ_ME.txt
Back to the gridding. I am seriously worried that our flagship gridded data product is produced by
Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station
counts totally meaningless. It also means that we cannot say exactly how the gridded data is arrived
at from a statistical perspective - since we're using an off-the-shelf product that isn't documented
sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?
Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding
procedure? Of course, it's too late for me to fix it too. Meh.
- FOIA\documents\HARRY_READ_ME.txt
Here, the expected 1990-2003 period is MISSING - so the correlations aren't so hot! Yet
the WMO codes and station names /locations are identical (or close). What the hell is
supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)
- FOIA\documents\HARRY_READ_ME.txt
Well, it's been a real day of revelations, never mind the week. This morning I
discovered that proper angular weighted interpolation was coded into the IDL
routine, but that its use was discouraged because it was slow! Aaarrrgghh.
There is even an option to tri-grid at 0.1 degree resolution and then 'rebin'
to 720x360 - also deprecated! And now, just before midnight (so it counts!),
having gone back to the tmin/tmax work, I've found that most if not all of the
Australian bulletin stations have been unceremoniously dumped into the files
without the briefest check for existing stations.
- FOIA\documents\HARRY_READ_ME.txt
As we can see, even I'm cocking it up! Though recoverably. DTR, TMN and TMX need to be written as (i7.7).
- FOIA\documents\HARRY_READ_ME.txt
OH FUCK THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm
hitting yet another problem that's based on the hopeless state of our databases. There is no uniform
data integrity, it's just a catalogue of issues that continues to grow as they're found.
- FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro
printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'
printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'
printf,1,'Reconstruction is based on tree-ring density records.'
printf,1
printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'
printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'
printf,1,'will be much closer to observed temperatures then they should be,'
printf,1,'which will incorrectly imply the reconstruction is more skilful'
printf,1,'than it actually is. See Osborn et al. (2004).'
- FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,'IMPORTANT NOTE:'
printf,1,'The data after 1960 should not be used. The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'
- FOIA\documents\osborn-tree6\combined_wavelet_col.pro
;
; Remove missing data from start & end (end in 1960 due to decline)
;
kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
sst=prednh(kl)
- FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro
; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
; but not as predictands. This PCR-infilling must be done for a number of
; periods, with different EOFs for each period (due to different spatial
; coverage). *BUT* don’t do special PCR for the modern period (post-1976),
; since they won’t be used due to the decline/correction problem.
; Certain boxes that appear to reconstruct well are “manually” removed because
; they are isolated and away from any trees.
- FOIA\documents\osborn-tree6\briffa_sep98_d.pro
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
(...)
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
- FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro
;
; Plots density ‘decline’ as a time series of the difference between
; temperature and density averaged over the region north of 50N,
; and an associated pattern in the difference field.
; The difference data set is computed using only boxes and years with
; both temperature and density in them – i.e., the grid changes in time.
; The pattern is computed by correlating and regressing the *filtered*
; time series against the unfiltered (or filtered) difference data set.
;
;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
- FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
- FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
- FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro
; No need to verify the correct and uncorrected versions, since these
; should be identical prior to 1920 or 1930 or whenever the decline
; was corrected onwards from.
- FOIA\documents\osborn-tree5\densplus188119602netcdf.pro
; we know the file starts at yr 440, but we want nothing till 1400, so we
; can skill lines (1400-440)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1980, which is
; (1980-1400)/10 + 1 lines
(...)
; we know the file starts at yr 1070, but we want nothing till 1400, so we
; can skill lines (1400-1070)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1991, which is
; (1990-1400)/10 + 1 lines (since 1991 is on line beginning 1990)
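For readers who don't read IDL, the truncation quoted above from combined_wavelet_col.pro ("end in 1960 due to decline") amounts to nothing more than a boolean mask on the year axis. A minimal, purely illustrative Python sketch: yrmxd and prednh are the names taken from the excerpt, but the series themselves are made up here.

```python
import numpy as np

# Hypothetical stand-ins for the arrays used in combined_wavelet_col.pro
yrmxd = np.arange(1400, 1995)                                  # year axis
prednh = np.random.default_rng(1).normal(size=yrmxd.size)      # some NH series

# IDL: kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
kl = np.where((yrmxd >= 1402) & (yrmxd <= 1960))[0]
sst = prednh[kl]   # everything after 1960 is simply dropped before further analysis
```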
It would be fun to have these guys under oath in a deposition with a lawyer who really understands code and an expert to consult. Bottom line — the code will be shown to be crap and the results were predetermined.
It’s what’s known as a dog’s breakfast.
Expose the code and bust the Anti-Trust Climate Team
I suggest that we offer Phil Jones the opportunity to run this code and produce his temperature “product” while the process is being videotaped and broadcast live publicly. He’ll be allowed a time period of perhaps 24 hours to replicate his “robust” temperature results without “fudging” them.
I would suggest the results would be “busted” rather than “robust”.
Shiny
Edward
Nice Tricks!!
It’s almost as if whoever wrote these notes wanted us to find them.
It just gets better and better (worse for them, that is). The question remains: is this enough to overcome the momentum acquired by the AGW side since 2005? Senator Inhofe’s commitment to the skeptic’s cause and his ranking minority seat in the Environment and Public Works Committee looms large. But why won’t the mainstream media report on the matter?
Perhaps a simple question. Science is all about repeatability and it is quite clear from these programmers’ comments that the CRU was quite obviously “cooking the books”. So, it stands to reason, if we want a reasonably true temperature record for the globe, from now back to whenever, what would it take to do that? For the current temperature records pains would have to be taken to either use only station data that is free of UHI effect, or (somehow) to figure out what the heat signature of each station is in relation to the rural surroundings, and use that as an offset of some kind. For temperature records of the past, why are trees used when they so obviously have a mixed signal (more things affect tree growth than simply temperature) instead of human writings – humans have been recording and reporting on temperature and climate for thousands of years.
What would it take to start over, and make a completely independent temperature record?
Hi Anthony,
last dinner at Copenhagen!
I love it!
http://i47.tinypic.com/mrxszt.png
Somebody should Slashdot this, it is exactly the sort of thing the programming geeks there will understand and appreciate. They may also be able to provide valuable insight.
The last two lines stand out. I wonder what was shown in the data between the year 440 and 1400? Medieval Warm Period maybe??????
Karma is such a heinous bitch.
ouch… it’s worse than I thought!
I believe they were using black powder not smokeless in that gun that was found.
Simplified code for CRU to use (from an average programmer with no climatology expertise):
For intYear = 1400 to 2009;
floatGlobalMeanTemperature = floatGlobalMeanTemperature + WHATEVER_THE_HELL_YOU_WANT_IT_TO_BE;
intYear++
next
Print “Holy Crap! We’re all going to die!”
“a sum-of-squared variable is becoming very, very negative!”
This is an interesting “trick.” Has this issue been figured out yet? Buggy code is hard to unravel even when you are the one that wrote it, but if you don’t have it all and it is someone else’s code (maybe several others) the reason for this may not be discoverable. Did Harry ever figure it out?
“Delaunay triangulation” This seems to mean the code located points outside the region they should have been in, and it was someone else’s poorly documented code (“off-the-shelf”) so Harry couldn’t figure out what it did or how.
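On the first question, one plausible mechanism — offered purely as an illustration, since the README never pins it down — is a sum-of-squares accumulator held in a 32-bit integer: once the running total passes 2**31 − 1 it wraps around and turns negative even though every input is positive. A minimal Python sketch of that failure mode (the values and the variable name are made up, modelled on OpTotSq):

```python
import numpy as np

# Illustration only: a sum of squares accumulated in 32-bit integers wraps
# around silently once it exceeds 2**31 - 1 and comes out "very, very negative".
values = np.full(1000, 8000, dtype=np.int32)   # an "unfeasibly large" value, repeated
squares = values * values                      # 64,000,000 each -- still fits in int32
op_tot_sq = np.sum(squares, dtype=np.int32)    # the running total overflows and wraps

print(op_tot_sq)   # a large negative number, despite every input being positive
```

Whether that, a rogue missing-value sentinel, or something else entirely is what bit Harry is not something the README settles.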
hmmm (09:14:27) :
It’s almost as if whoever wrote these notes wanted us to find them.
Positively Freudian. I know enough about programming, though I’m no expert, to see just from my own limited experience that the commands and data input are, to quote, “crap, crap”.
But gavin says we’re taking this out of context…..
bahahahaha!
Get better results by outsourcing the coding to medieval Timbuktu
“; we know the file starts at yr 1070, but we want nothing till 1400, so we”
Hide the MWP?
Oh dear, poor chap. I hope this doesn’t turn into a CLM for him (no, that’s not yet another climate acronym, it’s a programmer one: http://www.jargon.net/jargonfile/c/CLM.html)
Apart from Harry, who was doing his best, these guys should be in jail. They knew exactly what the code was (or wasn’t) doing.
Harry spilled the beans.
Can there be any doubt that this trail of crumbs was laid deliberately? I think the identity of the whistleblower may be coming into focus.
However, it is curious that the coder makes no reference to bringing the problems to the attention of Jones et al.
It’s not just drivel, it’s mendacious drivel.
The pot of gold in Climategate. While we knew via Hansen’s scripts and Fortran 77 that something beyond incompetence, probably fraud, was afoot, Mann never released anything; but McIntyre’s reconstructions are a strong indication of fraud.
This project file of commentary is the sort of file, only much larger, I’ve often used when maintaining old code I’d not developed originally. Usually it’s to establish hierarchies and patterns of attack; this file includes comments better placed in the files themselves, but it gives every indication, to me, of authenticity.
The weight of the evidence following Climategate is inescapable.
“If you torture data sufficiently, it will confess to almost anything.”
-Fred Menger
Totally amazing. I’m no programming expert, but I do routinely construct optical system models for ray trace simulations, and even they require detailed comments on every element in the system, or else even months later I cannot keep track of why the hell I included some element. Now that is orders of magnitude more straightforward than actually writing the code, but I certainly understand the concept that the comments had better explain in plain English (and evidently also in some Australian terminology) what it is supposed to do, so anyone can tell.
Well, I have always believed that the failure to observe Nyquist in sampling strategies was at the heart of this climate data inadequacy, and reading about all their ad hoc inclusions and exclusions of data sites clearly shows what total BS they are cooking up.
And if I were paying the bills for that stuff, I would really like my money back, and also to see some jail (gaol) time included in the correction process for the varmints purveying this rubbish as scientific data gathering.
In-bloody-credible! I’ve had to rework crappy code on occasion and discovered that you “can’t get there from here”. I sympathize with Harry. I can’t sympathize with the people who have foisted this garbage on us as “science”.
And yet the lap-dog press will continue to embargo the controversy, Obama will “get one more step closer to climate change legislation”, and the scientists (sic) will offer lame but “scientific” excuses.
Maybe the way to nail the bastards will be the old-fashioned way: mail fraud and IRS transgressions. I would imagine there have been snail-mail crimes of some sort, and I’m positive such large amounts of money led to some malfeasance on Mann’s and others’ part. This type of personality would never be clean in their personal lives while perpetrating such monumental fraud in academia.
I spent a decade or so as a professional programmer. Let’s just say that the excerpts revealed above would not have only gotten me fired, they would have gotten me blackballed and unable to work in the industry forever. And rightfully so.
Was the data that the code uses also released?
With the data, one could remove the fudge factors from the code and see what really comes out.
The tone of the comments does raise the question as to whether their author helped release the information.
Data, I don’t need no stinking data!
Having worked as a research assistant at Columbia University, I find the only thing that surprises me is the childish language and the tone of all the emails. Scholarship is corrupt to the core and that’s why I decided never to get a PhD.
The number one corruption is that no thesis is approved for research unless the research passes a political litmus test. Translation: Unless you already have a conclusion in mind that does not seriously detract from that of your colleagues, you have no hope of being published.
Now, I am published (only one paper), but it is in the field of Philosophical History, it did not require any grant money, and it passed the political litmus test among Orientalists detracting from those whose background is Occidental. Translation: the paper is perceived as “progressive” so it is good to go.
Of course it’s a bad idea to base policy upon modern scholarship in highly controversial areas. Unless proposed research is designed to meet, substantiate, or reflect progressive ideals, you haven’t got a chance.
Mainstream scholarship in recent history has got some very obvious things dead wrong because of progressive politics. Just as we heard all the lies, from imminent ozone disaster to racial integration increasing property values (believe what you will about the social merit of integration; the economics is quite obvious), they are wrong again about yet another one.
However, their arrogance is astonishing in the email and programming notes. It should be no surprise they fudge evidence and conclusions–that’s 95% of scholarship, where position is more important than the truth. The surprise is that they all seem to be knowingly complicit in deception and not care. Everyone whom I worked with believed their crap!
Context:
“ Around 1996, I became aware of how corrupt and ideologically driven current climate research can be. A major researcher working in the area of climate change confided in me that the factual record needed to be altered so that people would become alarmed over global warming. He said, “We have to get rid of the Medieval Warm Period.”
Dr. David Deming (University of Oklahoma)
Oh, good lord!…. and this mob was given a grant of 13 million pounds of British taxpayers’ money to do climate science?
…. I think the only thing “done” here was the British taxpayer. They were well and truly “done” over. :-(
Funny in that of all the text emails in the Zip file the night this got released, this is the file I read first. And kept reading. The comments looked to be pretty damning, but I wasn’t sure which “official” curve this program contributed to. If this was just a program for some obscure and not relied upon graph, it meant little. If this was something actively promoted by HadCRU then it’s a big deal.
“Somebody should Slashdot this, it is exactly the sort of thing the programming geeks there will understand and appreciate. They may also be able to provide valuable insight.”
I’ve been watching Slashdot for this story to come up, obviously this is just the sort of thing that SHOULD show up there.
It won’t. Slashdot is filled with pro-AGW types, and controlled by them. I wouldn’t be surprised if it never shows up there at all. Even the “science” section is glaringly silent on the issue.
Yes they are the type that would understand this, but for many it would mean exposing their preconceptions to themselves, and that won’t happen.
There is a reason why people will defend the crappy deal they got from a used car salesman.
Another paragraph from the Harry_Read_Me that inspires confidence:
“You can’t imagine what this has cost me – to actually allow the operator to assign false
WMO codes!! But what else is there in such situations? Especially when dealing with a ‘Master’
database of dubious provenance (which, er, they all are and always will be).
False codes will be obtained by multiplying the legitimate code (5 digits) by 100, then adding
1 at a time until a number is found with no matches in the database. THIS IS NOT PERFECT but as
there is no central repository for WMO codes – especially made-up ones – we’ll have to chance
duplicating one that’s present in one of the other databases. In any case, anyone comparing WMO
codes between databases – something I’ve studiously avoided doing except for tmin/tmax where I
had to – will be treating the false codes with suspicion anyway. Hopefully.”
This is like the train wreck that you don’t want to watch, but you can’t look away from.
BOTO (09:17:11) :
Hi Anthony,
last dinner at Copenhagen!
I love it!
http://i47.tinypic.com/mrxszt.png
Great picture, but I have a question. Which one is Judas?
Judging by the state of the databases with a seemingly unmanaged history and no “method” in use to ensure integrity, it would be literally impossible to make sense of it all in a second pass. I now understand why they resist the release of code and data so strongly. It isn’t that they can’t, it’s that they’re just plain embarrassed by the hideous mess that would be exposed. I said “would”, I mean “has been”.
I know why Warmists deny that global warming is not man-made. They are living in a Hollywood dream world and they can’t wake up.
My question is how the HARRY_READ_ME file fits into the greater scope of CRU’s data analysis. It’s not clear to me whether this was some side project for the programmer to analyze a problematic data subset or if this represents his frustrations with the main CRU data set. There is certainly a story here; I just don’t know if there are more chapters in this novel.
If this isn’t enough to get MainStreamMedia to actually do their work – then I’ve lost any hope for humanity!
You know, I have asked at realclimate.org how they did software quality control. The post was censored, of course.
My guess is that they would not be able to verify their data/models/implementation, so the operational verification would be something along the lines of:
This output looks funny, this can’t be right. Let’s look at the software and fix it.
Even without malice in complicated software this will lead to software that confirms what the scientist expects, because then it won’t look ‘funny’.
I wrote something along those lines at rc, and apparently it hit very close to the mark, because it was censored.
Reed Coray (09:35:23) :
as far as I know, the one at the very right (your president…)
”22. Right, time to stop pussyfooting around the niceties of Tim’s labyrinthine software
suites – let’s have a go at producing CRU TS 3.0! since failing to do that will be the
definitive failure of the entire project..”
That will be Tim Osborn, at a guess, and Harry seems to think this manipulation of the data is vital for the ‘project’. The smoking gun is in the data, which is why they won’t give it up without a fight.
One should just run this code on some totally random data series and plot the output.
Limbaugh tearing into Obama on the AGW Hoax !!!!
RUSH: Now we go over to the Universe of Lies. This afternoon, President Obama press conference with the Indian prime minister. Here’s a portion of Obama’s opening remarks.
OBAMA: We’ve made progress in confronting climate change. I commended the prime minister for India’s leadership in areas like green buildings and energy efficiency, and we agreed to a series of important new efforts: A clean energy initiative that will create jobs and improve people’s access to cleaner, more affordable energy; a green partnership to reduce poverty through sustainable and equitable development and an historic effort to phase out subsidies for fossil fuels. With just two weeks until the beginning of Copenhagen, it’s also essential that all countries do what is necessary to reach a strong operational agreement that will confront the threat of climate change while serving as a stepping-stone to a legally binding treaty.
RUSH: The president of the United States has just said in an internationally televised press conference that he is going to continue seeking a resolution to a problem that doesn’t exist. There is no man-made global warming. But it doesn’t matter because he exists in the Universe of Lies. And man-made global warming is only a means to an end to him. It is simply a way to once again chip away at the size of this country and the wealth of this country and to make sure that he is able to enact legislation that will allow him to raise everyone’s taxes so that he can begin even more redistribution of wealth. It’s just a mechanism like all liberal ideas are. They are dressed up in flowery, compassionate language to hide the deceit — the insidiousness — of their real intentions. This is mind-boggling, mind-boggling. It’s a hoax! It has been totally made up! It’s been known since Thursday. This is Tuesday.
The president of the United States, in an internationally televised press conference, says, “We gotta move further and we gotta get closer to Copenhagen with a working agreement. We gotta confront climate change.” There isn’t any! The whole concept of “climate change” is a fraud because climate “changes” constantly, and we’re not responsible for it. We don’t have that kind of power. We can’t do it! If somebody says, “Make it warmer tomorrow,” you can’t. If somebody says, “Make it colder tomorrow,” you can’t. If somebody says, “Make it rain tomorrow,” you can’t — and yet to listen to these people, we’re doing all of that and we’re doing it so fast that we are going to destroy our climate. It’s a hoax! It’s a fraud! There is no climate change. There is no global warming. There never has been any man-made global warming. How else can it be said?……..”.
http://www.rushlimbaugh.com/home/daily/site_112409/content/01125109.guest.html
If a physicist were to submit a paper without showing the math, that paper would (I assume) be rightly ridiculed and sent back with a “show your work” rebuke. It doesn’t seem right that one can hide one’s work in software, and then casually dismiss the absence of documented code upon submitting a paper as these yahoos have done. And yet, that seems exactly the way mainstream climatology works. Do any other sciences permit one to hide calculations in a program and then not publish said program with the paper?
This is just astonishing, and it’s evolving into a serious scandal.
I don’t think most of the public even realize that the computer code upon which these “models” are based isn’t revealed to other scientists for checks/verification/improvement. What’s UEA CRU’s basis for the validity of its predictions? “Trust us, we’re a really nice bunch of English climatologists”???
Is this real??????? hot stuff !!!!!!
Oops, forgot ‘)’ was a valid URL-char. http://www.jargon.net/jargonfile/c/CLM.html
Trying to look at some of this charitably: It does seem to me that the “fudging” of the decline was done to avoid the calibration of tree proxy to real temperature being thrown off by the known (but unexplained) divergence post 1960, rather than as a fudge to the dataset itself (although the results of that calibration set the offset of the result, of course). If there is a known problem with part of one dataset, it’s reasonable enough to ignore it when you’re trying to calibrate the good parts with something else. The real question (which I don’t know the answer to) is how good is the correlation in the bits that you do believe are comparable (i.e. 1850-1960)?
But even I, stretching my charitable nature to its limit, am pretty worried by the fact that he doesn’t believe in the core gridding algorithm, and the poor quality of the data seems to be all-pervasive, not just noise from a few bad stations. You can’t blame “Harry” for this – quite the opposite – but it does seem quite incredible that something of such huge importance should have been given so little resource and basic software project management for so long.
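To make the “calibrate on the good part only” idea concrete, here is a minimal Python sketch with entirely made-up series: the proxy-to-temperature regression is fitted only on the pre-1960 overlap and then applied to the whole record. It shows the mechanics being debated above, not CRU’s actual procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1850, 2000)

# Hypothetical instrumental series and a proxy that tracks it until 1960,
# then diverges downward (the "decline").
temp = 0.003 * (years - 1850) + rng.normal(0, 0.1, years.size)
proxy = temp + rng.normal(0, 0.1, years.size)
proxy[years > 1960] -= 0.004 * (years[years > 1960] - 1960)

calib = years <= 1960                                 # "stop in 1960 to avoid the decline"
slope, intercept = np.polyfit(proxy[calib], temp[calib], 1)
reconstruction = slope * proxy + intercept            # then applied to every year
```

The calibration itself is untouched by the post-1960 divergence, which is the charitable reading; what the reconstruction then implies about post-1960 skill is the contested part.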
It reminds me of Jones public statement:
“So apart from the removal of a few troublesome editors, blocking some FOI requests, deleting emails, blocking a few contrarians from publication, peer reviewing each others papers, cherry picking trees and modifying code to hide the decline: please tell me – exactly what is wrong with our science?”
Michael Alexis wrote:
Simplified code for CRU to use (from an average programmer with no climatology expertise):
For intYear = 1400 to 2009;
floatGlobalMeanTemperature = floatGlobalMeanTemperature + WHATEVER_THE_HELL_YOU_WANT_IT_TO_BE;
intYear++
next
Print “Holy Crap! We’re all going to die!”
Fell off my chair laughing!!!!!! No more, PLEASE, it HURTS!!!!!!!!!!
Do we know what CRU “software” we are talking about here? The first comments seem to be for ten year old proxy reconstructions, and some of the later ones the temperature “product(s)”
Is there any relationship to, or does this provide any insight into GISS software? It seems that Phil Jones made some winking statement about the CRU temperature anomalies being remarkably similar to GISS, “as they should be” or something similar.
The programming is clearly bad and any output from the software described here would have to be questionable. However, it is very difficult to assess risk and estimate error without knowing which software has problems and for what it is currently used (or which AGW dogmas it supports).
Looking at the various values for valadj and the way they are applied, it appears that they always lower the values (whatever they are) for the period 1930 to 1955 and increase them for the period 1955 to 1999. I wonder why.
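That claim is easy to check by evaluating the quoted briffa_sep98_d.pro adjustment directly. A small Python sketch, with np.interp standing in for IDL’s interpol; yrloc and valadj are copied from the excerpt above, everything else is illustrative:

```python
import numpy as np

# IDL: yrloc=[1400,findgen(19)*5.+1904]
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))
valadj = 0.75 * np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                          0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6])

# IDL: yearlyadj=interpol(valadj,yrloc,x)
years = np.arange(1900, 2000)
yearlyadj = np.interp(years, yrloc, valadj)

for yr in (1920, 1930, 1940, 1950, 1960, 1970, 1980, 1990):
    print(yr, round(float(np.interp(yr, yrloc, valadj)), 3))
```

Evaluated this way, the adjustment is mildly negative from the early 1920s to the mid-1940s, then climbs steadily and flattens at about +1.95 from the mid-1970s onward — close to, but not exactly, the 1930-1955/1955-1999 split described above.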
Two interesting coder notes in HARRY_READ_ME.txt are:
“This still meant an awful lot of encounters with naughty Master stations, when really I suspect nobody else gives a hoot about. So with a somewhat cynical shrug, I added the nuclear option – to match every WMO possible, and turn the rest into new stations (er, CLIMAT excepted). In other words, what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad, but I really don’t think people care enough to fix ‘em, and it’s the main reason the project is nearly a year late.”
“You can’t imagine what this has cost me – to actually allow the operator to assign false WMO codes!! But what else is there in such situations? Especially when dealing with a ‘Master’ database of dubious provenance (which, er, they all are and always will be).”
This is pretty funny
[ http://www.youtube.com/watch?v=nEiLgbBGKVk&feature=player_embedded# ]
REPLY: on the main WUWT page
http://politics.slashdot.org/comments.pl?threshold=5&mode=nested&commentsort=0&op=Change&sid=1451926
This is not some random crack from the outside. This is almost certainly an inside leak. 61 Mb is nothing. The probability that any 61 Mb of data, pulled off a file or email server, would contain this much salient and inculpatory information is virtually nil.
This data was selected by someone who knew what they were doing, what to look for and where to find it.
Chris (09:36:01) :
My question is how the HARRY_READ_ME file fits into the greater scope of CRU’s data analysis. It’s not clear to me whether this was some side project for the programmer to analyze a problematic data subset or if this represents his frustrations with the main CRU data set. There is certainly a story here; I just don’t know if there are more chapters in this novel.
—————————————————–
Well that’s a simple problem to solve then….. Just send CRU a FOI request for the data and meta data and check it out;-)
O.T. but:
Did you know that the BBC held a secret meeting at which it decided NOT to give balanced time to GW and anti-GW viewpoints? Apparently, the communiqué said:
“The BBC has held a high-level seminar with some of the best scientific experts, and has come to the view that the weight of evidence no longer justifies equal space being given to the opponents of the consensus.”
http://burningourmoney.blogspot.com/2007/06/bbc-bias.html
James Hastings-Trew (09:16:45) :
If you have looked at the massive differences between the timespans quoted in HARRY_READ_ME.txt for stations like Elko, NV / Charleston, SC / Red Bluff, CA (and dozens of other sites) and what exists today, it’s going to take years to recover, providing that the original submitted paper forms have NOT been destroyed.
It’s going to have to be done. Too much damage has occured, and the digital databases are not to be trusted.
Do you (or anyone else for that matter) personally know where the original paper forms submitted are kept?
And where to obtain a legitimate copy of Tom Karl’s 1990 USHCN database?
[snip]
“These huckstering snake-oil salesmen and “global warming” profiteers — for that is what they are — have written to each other encouraging the destruction of data that had been lawfully requested under the Freedom of Information Act in the UK by scientists who wanted to check whether their global temperature record had been properly compiled. And that procurement of data destruction, as they are about to find out to their cost, is a criminal offense. They are not merely bad scientists — they are crooks. And crooks who have perpetrated their crimes at the expense of British and U.S. taxpayers”. (Lord Monckton)
SIR, American Thinker tied these events to ACORN in their article today. http://www.americanthinker.com/2009/11/acorning_the_climate_change_mo.html
Correct me if I’m wrong, but this isn’t actually `model’ code. It’s code for producing the temperature record. The model code must look even worse given the nature of its inherent complexity.
All I can say is, WOW. The way in which this programmer commented his work makes me think that he was somewhat frustrated with the fraud that he was being asked to commit. Really, we are now past the point of scientific debate. The skeptics need to have some good lawyers in their camp to help resolve some of these issues. Without taking these people to court, they are simply not going to come clean about what they have been doing.
Gee, I wonder what the GISS data sets are like? Since they won’t release them, despite an FOI request, it makes you wonder.
I used to sincerely believe in the global warming theory. But one thing that’s troubled me in the past few years were the shrill responses to skepticism regarding it.
Now, I’m not a scientist by profession–but shouldn’t there be room for debate in discussing the following counter arguments:
1. Correlation does not prove causation
2. A true independent, critical peer review
From a brief, cursory reading of these emails, it sounds to me like the head of research wants to make the collected data fit the hypothesis.
It’s human nature to be biased and partisan. I wonder if it could be possible to collect the raw data and have it examined by qualified people with no ideological axe to grind–and let the chips fall where they may?
What the global warming proponents want to do is redesign our entire economic system and create layers and layers of bureaucracy in order to regulate almost every aspect of our lives.
I believe in being good custodians of our planet and the environment of course. It is a good thing to eliminate pollutants that are proven to be harmful. As far as CO2 goes, nothing has been proven yet.
I’ve done programming in a scientific research environment using large codes to do computational physics. All programs had to be well organized, debugged and verified by more than one researcher/grad student. They were also highly commented and even had descriptive documents on the side. Those procedures were absolutely necessary to keep everything in line and to eliminate mistakes, which are easy to make when the machinery is so complex.
If the Harry file is any indication, then I’d say the results from these programs can’t be considered professional-level science at all.
Been looking into the code and data held in the \documents\cru-code\f77\mnew directory. Here is what I have found:
The data sets are master.dat.com, master.src.com. master.src.com is the important file. Don’t open these without changing the extension to .txt otherwise Windows interprets them as executables, and you won’t be able to view them properly anyway. Could send copies capable of being opened in Windows. These contain monthly weather station data with one row per year. I don’t know the exact nature of these files, but some of the data does relate to sunlight duration. A site in Finland suggests master.src.com is temperature related, but there’s a lot of speculation flying around the Internet regarding the leaked files at the moment, so can’t be certain.
There are 3526 stations in all and 2578488 monthly observations. -9999 in a field means the observation for that month is absent. There are 269172 (10%) missing observations in master.dat.com and 14226 complete missing years. The programs are designed to completely ignore any years with no observations. In total there are 200649 rows (years) of observations, which should equate to 2407788 months; however, due to some years having up to 11 missing months, there are 2309316 monthly observations used. Now what’s interesting is how these missing months are processed. Programs such as split2.f, where a year contains one or more missing months, actually invent the figures using the following heuristic:
If a month is missing try to infill using duplicate. if duplicates both have data, then takes a weighted average, with weights defined according to inverse of length of record (1/N)
That’s from the comment at the start of split2.f
What this really means is more than 4% of the data is being completely fabricated by at least some of the Fortran data processing programs. If this were done in other disciplines this would be extremely questionable.
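Taking the quoted split2.f comment at face value (the Fortran itself isn’t reproduced above), the infill rule would look roughly like the following Python sketch — a hypothetical interpretation, not the actual CRU routine:

```python
def infill_missing_month(value_a, length_a, value_b, length_b):
    """Hypothetical inverse-record-length weighted infill for one missing month,
    following the heuristic described in the split2.f comment: when both
    duplicate stations have data, average them with weights of 1/N, so the
    shorter record counts for more."""
    w_a, w_b = 1.0 / length_a, 1.0 / length_b
    return (w_a * value_a + w_b * value_b) / (w_a + w_b)

# e.g. a 30-year duplicate reading 12.4 and a 90-year duplicate reading 11.8:
print(infill_missing_month(12.4, 30, 11.8, 90))   # ~12.25, weighted toward the shorter record
```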
Also noticed quite a few programs, especially in the documents\cru-code\idl\pro directory, that are designed to process data deemed anomalous, though this isn't necessarily suspicious.
This is the header comment from documents\cru-code\idl\pro\quick_interp_tdm2.pro
; runs Idl trigrid interpolation of anomaly files plus synthetic anomaly grids
; first reads in anomaly file data
; the adds dummy gridpoints that are
; further than distance (dist)
; from any of the observed data
; TDM: the dummy grid points default to zero, but if the synth_prefix files are present in call,
; the synthetic data from these grids are read in and used instead
What is ‘synthetic data’ and why might it be applied to dummy gridpoints away from genuine observation points? This could be a recognised statistical procedure, or data massaging, or creating more observations out of thin air to skew certainty levels, just can’t tell and don’t have time to look at anything else in depth right now. Like it says, e-mails can be open to interpretation but it’s the code and what it does to the raw data which really matters. The comment in the Mann code described in the link below is a work-around to a recognised issue with dendrochronology data. During 1960s the correlation coefficient between tree growth rate and temperature altered.
The recent ERBE results are really significant; the discrepancy between IPCC modelled values and the real-world figures is quite something.
http://wattsupwiththat.com/2009/11/22/cru-emails-may-be-open-to-interpretation-but-commented-code-by-the-programmer-tells-the-real-story/
It appears that the barbarians are closing in on Dr Mann’s castle. He appears to be damaged goods now.
http://www.mcall.com/news/all-a1_5warming.7097398nov25,0,5616440.story
“Judas” is at the very right…
http://i49.tinypic.com/2gy8w9v.jpg
Sorry, but Slashdot will not be of much help. They have become so liberal over at Slashdot that they have drunk the global warming koolaid by the gallon.
REPLY: They did carry the initial hacking story, I don’t see why they would not carry this.
In electrical engineering, any unique code used is included in the methods portion.
Because engineers expect to be able to investigate each others’ claims, instead of letting them be hidden in software black boxes.
Of course, that is engineering, not “science”.
“OH *UCK THIS. It’s Sunday evening, I’ve worked all weekend, and just when I thought it was done I’m
hitting yet another problem that’s based on the hopeless state of our databases. There is no uniform
data integrity, it’s just a catalogue of issues that continues to grow as they’re found.”
So this tells us what the quality of the data is.
We have already shown how Mann uses the guesses from one tree’s rings to overrule a set of global temperature means.
This is more like voodoo and palm reading than science.
I am so not surprised at his use of “weights” to strengthen samples that fit the dogma.
Back to the gridding. I am seriously worried that our flagship gridded data product is produced by Delaunay triangulation – apparently linear as well. As far as I can see, this renders the station…
Well, this brings me back to a question I asked here a long time ago: how is average global temperature calculated? Somebody replied, “It’s called gridding.”
This happens to be something I know a bit about through my GIS work. Delaunay triangulation (TIN production) would be the last thing I would expect them to use for this sort of a model. Anyway, one must peek into the code to see how the sausage is made, no?
Think about all those attractive colorful thematic maps of the globe, all gridded…
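For what it’s worth, “linear interpolation on a Delaunay triangulation” is what generic off-the-shelf routines do: SciPy’s griddata with method='linear', for example, triangulates the station locations and interpolates barycentrically inside each triangle. The CRU code itself uses IDL’s routines, not SciPy, so this is only an illustration of the technique, with invented stations and anomalies:

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
station_lonlat = rng.uniform([-10.0, 40.0], [30.0, 70.0], size=(50, 2))  # made-up stations
station_anom = rng.normal(0.0, 1.0, size=50)                             # made-up anomalies

grid_lon, grid_lat = np.meshgrid(np.arange(-10, 30, 0.5), np.arange(40, 70, 0.5))
gridded = griddata(station_lonlat, station_anom, (grid_lon, grid_lat), method="linear")

# Cells outside the stations' convex hull come back as NaN, and nothing in the
# output records how many stations actually informed each grid cell -- which is
# presumably what Harry means by the station counts becoming meaningless.
```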
It’s actually quite entertaining to read the programmer’s notes while playing M4GW’s “Hide the Decline” (hide the decline) in the background.
….. it stops you from crying.
Will there be a Congressional Investigation?
Comment by JamesD
Wednesday, 25 Nov 09 @ 12:24 PM
There will be no investigations with a Democrat congress. No committee chairman will allow it. I hope some of the heavyweight scientists will come to the defense of scientific integrity but I’m not holding my breath. For a long time this naked emperor has been parading down the street. The press and, particularly, the educated public all agree, he’s wearing a fine suit of clothes.
http://carboneconomy.economist.com/
http://carboneconomy.economist.com/content/programme
Steve Gavin at RealClimate has refused more than 6 times to post the following message. Are you willing to present this very important point for me?
JCS says:
Your comment is awaiting moderation.
25 November 2009 at 12:58 PM
Gavin,
I have repeatedly tried to post comments on your website asking you to respond to the following statement:
If the first rule of science is to question everything, and another fundamental rule is that no hypothesis can be proven true, regardless of imposing the precautionary principle, why are the first rule and another fundamental rule being discarded, and any SCIENTIST (SKEPTIC) vilified or censored for practicing what can only be considered good science?
I am a Climate scientist with a degree in applied science wildlife biology and a masters in climate change and sustainability and I am not convinced by anything I have read, seen, studied or experimented that there is a definitive correlation between CO2/Greenhouse gases and climate variability.
Also I have read much of this information from this hack/screw up/whatever and I clearly see what I would consider malpractice and unethical collusion. Particularly in the case where advice is given on ways to avoid taxes, data has been significantly fudged, code is manipulated and the peer review process stacked (more so than usual).
I believe in Climate Variability; I also believe that humans as a whole cause irreparable environmental harm to the planet. However, I am skeptical of the hypothesis that is Anthropogenic Climate Change and believe that more research and a more open and debatable approach need to be undertaken to achieve real results in understanding this. Why is this wrong, and why are so many other skeptics with the same opinion vilified and persecuted? Why have you censored more than 6 previous posts I attempted to put up on this topic?
Can you not see how this topic risks the credibility of science as a whole!!!!!!
Eric (09:40:29) :
In fairness that is too general a statement. It is important to be precise and specific, otherwise folks at RealClimate who actually really know their stuff will simply rip you to shreds. Certain critical pieces of station data have been requested. Certain pieces of code have been requested. More generally, authors of climate science research papers have been asked to post their raw data and their code in a way that will allow a complete replication of their results by interested third parties. It is the institutional and individual refusal to do these simple things that has caused the questioning of the motives of climate scientists in general.
I don’t have to be a programmer to understand this:
printf,1,'The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'
They have committed fraud. Plain and simple.
Meanwhile, I note that the Washington Post and WSJ have picked up the story, reporting not “it’s taken out of context,” but rather “it’s starting to look like they manufactured the data.” There is yet hope that the MSM will take it up and run with it.
For, even though the MSM is firmly in the AGW camp, they will not be able to resist a juicy exposé. Juicy makes ratings and sells advertising. They will ignore the fact that they have been made fools of.
How do we know that the HARRY_READ_ME.txt file and the data he’s working on represent the official temperature output of CRU?
Maybe he was put to work on a special dataset that has been corrupted somehow?
Surely there is more than only 1 guy doing this stuff at CRU??
Completely missed the point; these were entered/typed up AS the code was being written/debugged/maintained/retrofitted.
It is almost obvious you have never ‘coded’.
Here is the output of some code from a Briffa-related file. If you run a flat temperature graph through it, it gives a Hockey Stick. If you run an inconveniently divergent tree ring graph through it, it acts as a trick to hide the decline.
http://i49.tinypic.com/m9vcxv.jpg
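For the curious, that ramp can be reproduced from the valadj table quoted later in this thread; a minimal sketch (Python; the coefficients are transcribed from the leaked code quoted further down, while the flat input series is invented):

import numpy as np

# The adjustment table quoted further down the thread ("fudge factor"), transcribed as-is.
yrloc = np.array([1400] + list(1904 + 5 * np.arange(19)))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1, 0.3, 0.8,
                   1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

years = np.arange(1400, 1995)
flat = np.zeros(years.size)                        # a perfectly flat synthetic "proxy" series
adjusted = flat + np.interp(years, yrloc, valadj)  # IDL's interpol() is linear, like np.interp

print(adjusted[years == 1900][0], adjusted[-1])    # ~0.0 in 1900, ~1.95 by 1994

Whatever goes in, the late twentieth century comes out bent upward by up to about two degrees.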
In reading through this … stuff … I found my mind seemed to be stuck in a goto loop, continuously replaying that YouTube video — “Hide the decline … hide the decline”.
And then I come to this gem —
What the hell is supposed to happen here? Oh yeah – there is no ‘supposed’, I can make it up. So I have 🙂
This is what happens when you approach a problem with a preconceived belief system in place describing what’s happening. Data doesn’t conform to your belief? No problem – fix it so the problem goes away.
Sounds like they have some real PEBKAC errors over there. That is Problem Exists Between Keyboard and Chair.
hmmm (09:14:27) :
> It’s almost as if whoever wrote these notes wanted us to find them.
I disagree, though he is writing for future readers. Faced with the same morass I likely would have kept something similar to Poor Harry’s Diary. It would be useful to my manager to help chase after the data providers and document what I had been doing for the next performance review. It would be useful to me as something to refer to when faced with those “I thought I fixed that already” moments (there’s at least one of those comments in his diary). And it would be useful to cram down the original authors’ throats someday.
For a system this complicated, it’s difficult to keep some of the system issues straight in the comments, another reason it’s worthwhile to have a separate document. (Ideally that would be a design specification, but none came with the code. When porting software like this, I generally start with the first things that need to run and end with the things that pull it all together. In the future I’m going to add a pass to scan through everything first and try to get a sense of what it does. That was my starting point decades ago, but with experience and skill, programmers can develop a “disdain” for the whole and can quickly find their way to the core problems. Fortunately I’ve never had to work on something as much of a mess as this.)
Hilarious:
‘discovered that a sum-of-squared variable is becoming very, very negative!’
Don’t they know ‘i’??? As imaginary as some theories?
For Europe the subject is less than hilarious!
Schellnhuber is the adviser of Chancellor Merkel and of the President of the European Commission, Barroso!
A position with extreme power.
Now look at his very recent writings, with Mann, Schneider, Rahmstorf and others:
http://www.copenhagendiagnosis.com/
and on the reports in the media:
http://www.n-tv.de/politik/dossier/Noch-gibt-es-Hoffnung-article79835.html
http://www.spiegel.de/wissenschaft/natur/0,1518,663045,00.html
Anthony–
can’t express the gratitude we all owe you for pulling these programmer comments out of the code, so revealing. Question– the ‘FOI’ header, where did they come from? what do they signify?
I did some numerical coding for my MSc-equivalent thesis, and in theory it is possible to obtain a negative number by adding positive ones if the total is large enough to overflow and set the most significant bit (the sign bit, which has the value 1 for negative numbers). Under those conditions, an untreated overflow produces negative numbers.
If the code does not handle that, the results are no good. The solution is to use data types with more bytes, and some study of the numerical stability of the algorithms employed is a must. However, it’s amazing that such a newbie error could be made in one of the most prestigious research institutions, so another explanation could be possible. We did not hit this problem in our work because we used algorithms that we knew to be stable with the sets of data employed. Normalization of the data also helped 🙂
However, it amazes me that an organization with so much at stake in numbers did not get some basic reference texts* and had to resort to searching Wikipedia for algorithms to calculate great-circle distances!
* Such as this, edited in 1992: http://www.amazon.com/Numerical-Recipes-FORTRAN-Scientific-Computing/dp/052143064X
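For what it’s worth, the great-circle formula Harry went looking for is only a few lines of standard code; a minimal haversine sketch (Python, not CRU’s implementation; the example coordinates are invented):

import math

def great_circle_km(lat1, lon1, lat2, lon2):
    # Haversine formula on a sphere with the mean Earth radius (6371 km).
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2.0 * 6371.0 * math.asin(math.sqrt(a))

# Two invented station locations a short distance apart.
print(round(great_circle_km(52.62, 1.24, 52.65, 1.30), 2))  # roughly 5 km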
Maybe it’s time to GPL the codes… Make them a true public venture where everyone can see what’s being done.
No, rob, it’s not hot stuff. No one cares. Outside this narrow little circle of people who care about truth, nobody gives a f—
AGW plays to some sort of primitive pre-rational impulse or mythic structures that some people have … and that’s it. There’s no science here. Just stories.
Expect green taxes. No nuclear power stations. Shutting down the coal fired stations we do have.
“My three main goals would be to reduce human population to
about 100 million worldwide, destroy the industrial infrastructure
and see wilderness, with it’s full complement of species,
returning throughout the world.”
-Dave Foreman,
co-founder of Earth First!
More here: The Green Agenda
The warmists are essentially the forces of the counter-enlightenment – quite literally the people who want to put the lights out. Do you think they’ll be swayed by truth?
I keep seeing this line in the code;
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
but what does it actually mean?
Observed Temperatures?
Homogenized Temperatures?
Fudge Factor Temperatures?
Not really sure what to make of this statement as yet…
Having been a developer for 25 years (I’ve written a fair amount of Fortran code), I can sympathize with the programmer. He’s been asked to produce consistent output that appears plausible from a train wreck of ratty source data collections. The collections don’t agree where they overlap, so he’s been told which ones to use over which time periods using what appears (to him) to be a completely arbitrary basis; to make it all fit together he has to “fudge” the numbers and write code to handle each set differently. It’s as if he’s been asked to generate the data for a bank’s corporate tax return using checkbook register copies obtained from customers and the bank’s stock price for the last year, along with general ledger entries from three different internal accounting systems. The results of his efforts are then to be used at the annual shareholder’s meeting in two weeks’ time, and if they’re wrong, the CEO will make him the fall guy.
Programmers are normally a meticulous bunch, and this is a programmer complaining bitterly about the crap he’s been handed. Further, he’s upset that he’s been asked to produce plausible results in short order by people who as scientists are supposed to care about accuracy as much as he does, but clearly don’t care in this situation so long as the results come out looking like what they want to see. His notes clearly indicate that he wants everyone to know what he was facing and what sort of shenanigans had to be done to make it all look good.
What the public is finally getting to see is just how inconsistent and error-prone all the source data is and just how much manipulation is going on behind the scenes. Everyone has been led to believe that the measurement data sets are pristine, accurate and from data sources shown to be reliable. It’s assumed they are good enough to be used to make decisions that affect millions of people and cost billions of dollars. What the public is now seeing is a real shock, and people are coming to a common-sense conclusion: the data is crap, and the results can’t be trusted at all.
I feel for the programmer….I would not want to be in his shoes.
The story was on Slashdot on Nov 20th. http://politics.slashdot.org/story/09/11/20/1747257/Climatic-Research-Unit-Hacked-Files-Leaked
The ‘science’ behind the AAM narrative can now be seen clearly for what it is: a cloak to conceal the true intentions of the political classes.
I suppose the false cloak of science was only meant to cover the political aims until the point had been reached when the political narrative no longer needed a cloak to hide behind.
The BBC.
The university of East Anglia.
The climatic research unit.
The UK meteorological office(met office).
The Hadley climate research centre.
The US link being the Goddard Institute for Space Studies (GISS).
All of these institutions are involved in the scandal to at least some extent, and by some strange coincidence all of them are fanatical believers in and supporters of the AAM narrative: they report the science as settled and beyond doubt, they indulge in attacking sceptical scientists and flout the rules when it suits them, and they feel they are above reproach and audit, perhaps because they have powerful friends?
Of all the world-reach media platforms, the BBC has been perhaps the most fanatical in peddling the anti-CO2, anti-capitalist, anti-industrial, anti-free-market stories of the ‘eco industry’. It has used its nearly unlimited resources to provide a tsunami of supposed evidence, however thin and patchy, using state-of-the-art visual manipulation and the propaganda arts to full effect. The link to all of them is money and/or political affiliation; they are a closed loop of interconnected people with a common agenda, showing the arrogance of those who know they have friends in high places, powerful and influential allies.
In attacking the fake cloak of science we are in fact not directly attacking the puppet masters driving the entire narrative, like a bull attacking the matador’s cloak. The political forces need this sea change in our civilisation, and they feel they cannot be open and honest with us about the reasons for the required changes, so they cover it in lies. Whatever the real reasons for the massive reordering of our entire civilisation are, I suspect that AAM is not one of them.
To get at the truth we must expose the entire chain and the links in that chain. The political classes are moving fast now, faster perhaps than they expected to move, because the foundation of their scam, i.e. a warming planet, is not happening. Without the cloak of fake science to cover it, the real movers are exposed for the world to see; if the political classes refuse to alter their policies when the science is exposed as fraudulent, we will know who is behind the cloak.
John F. Hultquist (09:21:16) :
‘“a sum-of-squared variable is becoming very, very negative!”
This is an interesting “trick.”…. ‘
I had this happen using a common spreadsheet – which shall remain unnamed here – several years ago. I no longer use any spreadsheet for any serious numbers. Underflow and overflow conditions can occasionally break things. Commercial programs are particularly poor at documenting such things, and worse at fixing them.
On another note, the QC on the climate history data “Harry” is trying to work with is terrible. Whoever was responsible for supervising the primary data entry should have been running consistency checks, possibly every day, before forwarding it to Hadley or wherever. This is particularly important if the data is keyed from written records. Typos, misreadings etc. can creep in, and often, due to the character of the data recording methods, no useful filters can be employed to catch subtle errors during the entry process (e.g. instead of a proper data entry system the data was keyed into a spreadsheet line by line) – and Harry obviously is trying to deal with such ugly data.
OMG! The sum of squares parameter going negative? This isn’t possible! Even if all computed values from the model deviate negatively from the actual data, the squares of these numbers are positive values and thus the sum of squares is always positive.
I had a similar problem fitting titration data back in graduate school where the fitting statistics weren’t working out. It turned out I had coded in an incorrect equation for a derivative, neglecting to multiply by ln(2) in one line out of a thousand lines of code. It was plainly obvious something was wrong with my model from output plots, but wasn’t obvious in the code.
I feel for the programmer, but it’s his job to straighten out glaring problems, particularly when the output is a mathematical impossibility.
jamespapsdorf (09:40:10):
The problem is that quoting Limbaugh, Beck et al. will get us nowhere – far too easy to dismiss them as politically motivated, and they have no credibility with around 50% of the population (in the USA) and of course a much lower number globally (remember – this is a global issue).
No – I think the solution to this is independent review, as called for by Lawson in the UK and, I understand, by one of the Senators in the US.
Attack with facts, deconstruct the code issues and eventually the MSM might, just might, start to run with it.
My opinion – for what it’s worth – I think we are too late.
ralph (09:51:07) : writes:
O.T. but:
Did you know that the BBC held a secret meeting that decided NOT to give balanced time to GW and anti-GW viewpoints. Apparently, the communique said:
“The BBC has held a high-level seminar with some of the best scientific experts, and has come to the view that the weight of evidence no longer justifies equal space being given to the opponents of the consensus.”
This is only partly correct, my friend.
For over 3 years I have been trying to elicit answers from both Mark Thompson (Director General) and Sir Michael Lyons (Trust Chairman). All I had received was sophistry and obfuscation, until I engaged the help of my MP.
Recently it came to light that a report had been commissioned in June 2007 jointly by the Trust and BBC Board of Management entitled “From Seesaw to Wagon Wheel-Safeguarding Impartiality in the 21st Century”. It concluded: ‘There may be now a broad scientific consensus that climate change is definitely happening and that it is at least predominantly man-made… the weight of evidence no longer justifies equal space being given to the opponents of the consensus’.
(SO THEY HAVEN’T EVEN TRIED TO MAKE A SECRET OF THIS…JUST SHOWS THEIR ARROGANCE!)
Despite this damning evidence from their own report, they steadfastly cling to the belief that their impartiality is intact as required by the BBC Charter. Such is their state of denial that Sir Michael Lyons has even tried to deliberately mislead my MP despite evidence I have to the contrary.
In light of this I have posed the question, through my MP: “On whose authority did the BBC cease to be an impartial Public Service Broadcaster, as required by its Charter, and become the judge, jury and sponsor of such dangerously specious political dogma so eloquently described as ‘…the consensus…’?”
Answer comes there none! I believe it is time for the BBC to be subjected to an enquiry on this matter.
JCS (10:01:28) :
Steve Gavin at RealClimate has refused more than 6 times to post the following message, are you willing to present this very important point for me?
JCS says:
Your comment is awaiting moderation.
25 November 2009 at 12:58 PM
Gavin,
I have repeatedly tried to post comments on your website asking you to respond to the following statement:
If the first rule of science is to question everything, and another fundamental rule is that no hypothesis can be proven true, regardless of imposing the precautionary principle, why is the first rule and another fundamental rule being discarded, and
XXXXXXX
Real climate says all comments are shut off for 2 days. The adverse comments outnumber the puff comments 10:1. It looks like the outrage is being posted over there.
CEI has sued Gavin Schmidt for working on the blog instead of doing NASA work. There are years of FOIA requests in the queue at NASA GISS waiting to be released.
Just to be graphic, Gavin Schmidt is pimpin’ global warming when he is not doing the work he is paid to do.
Human Resource managers have a problem when people are on the job and doing work for themselves. Earlier a mod named “eric” was doing all the comments and then they shut down. Yesterday Gavin posted a request for someone to volunteer to help.
The blog post should distinguish between the various bits of code.
The stuff in folders like osborn-tree6\mann and files like harris-tree\recon1.pro most likely consists of programs used to generate material for peer-reviewed articles.
This is quite different and distinct from the code used to produce the HADCRUT3 temperature series. I am not positive, but it does appear that the HARRY_READ_ME.txt file is about the “CRU Code” as most of us would interpret it — the code used to produce the HADCRUT temperature record.
I am a bit worried, though. Nothing is showing in the MSM. The folks at RC seem to be moving on as if nothing happened. It’s almost as if they have all agreed to never speak of it again. The only ones discussing this are us and the likes of Glenn Beck and Limbaugh. This reminds me of “1984”, a surreal situation where everybody knows the truth but everybody pretends that they don’t and keeps on shouting that the world is going to burn. I cannot believe that such an opportunity to kill the AGW theory is just going away as if it never happened.
What is going on?, is the world mad, or are we?
Re. the “sum-of-squared variable is becoming very, very negative”.
I’d imagine that was due to an overflow.
For the non-programmer folks: there are various data types that can be used to represent numbers (signed integer, unsigned integer, float, double etc.) but they aren’t capable of representing arbitrarily large numbers. If you exceed the limit, the value wraps around and you generally end up with a large negative number.
i.e. MAX_NUMBER + 1 -> -(MAX_NUMBER + 1)
The fix is simply to use a type that allows larger numbers.
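A minimal sketch of that failure mode (Python with NumPy, using a 32-bit accumulator to stand in for an old Fortran INTEGER*4; the data values are invented, and this is only the overflow hypothesis discussed here, not a confirmed diagnosis of what happened to OpTotSq):

import numpy as np

data = np.full(10_000, 40_000, dtype=np.int32)  # implausibly large "station values"

correct = (data.astype(np.int64) ** 2).sum()    # 64-bit sum of squares: stays positive
wrapped = np.cumsum(data ** 2, dtype=np.int32)  # 32-bit running total: silently wraps around

print(correct)        # 16,000,000,000,000
print(wrapped.min())  # the running sum repeatedly goes "very, very negative"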
Obvious explanation (09:22:20) :
“But gavin says we’re taking this out of context…..”
I’m amused by the use of the king of spin to explain away the “inconvenient truths”. It’s sort of like asking the fox if he knows what the commotion in the henhouse was caused by. “Nothing here to see, move on”
Just watched ‘The Cloud Mystery’ on YouTube about Svensmark’s work on Cosmic Rays and their creation of the aerosols on which clouds form. Interesting that his experiment was conducted in Copenhagen? – You don’t think December’s meeting is a distraction ploy while a certain group smash his lab up??
>>” Bernie (10:01:40) :
In fairness that is too general a statement. It is important to be precise and specific, otherwise folks at RealClimate who actually really know their stuff will simply rip you to shreds. Certain critical pieces of station data have been requested. Certain pieces of code have been requested. More generally, authors of climate science research papers have been asked to post their raw data and their code in a way that will allow a complete replication of their results by interested third parties. It is the institutional and individual refusal to do these simple things that has caused the questioning of the motives of climate scientists in general.”
Has UEA/CRU released any of their modeling code to the public? The impression gathered from reading the “liberated” e-mails and programming code is that they were doing everything possible to block open peer review of their work, and were instead trying to keep as much of their data (which maybe should be termed “data” given what we’re learning about its quality) and “secret sauce” code concealed from other scientists, let alone non-professional lay scholars and the public.
It’s a little shocking — why on earth isn’t every single piece of modeling code originating in university and public labs released to the public and subjected to open review by computer scientists, code writers, mathematicians and other climatologists? To increasingly learn that it’s not is a surprise even to me. How widespread is this kind of concealment?
HSBC, Citibank and the CIA can keep their internal climate code secret. But there’s no justification for universities and academics to conceal theirs. It’s a subversion of the scientific process. And it demands that the question of motive be answered.
The Pennsylvania State University has a prepared statement you may request by calling the Office of Public Information at 814-865-7517. I have it if you can’t get through.
After reading all of this, I would readily bet that Harry is the whistleblower. His words are those of someone becoming really angry, and less and less confident in the scientific integrity of the team he was working for. I cannot put myself in his shoes, but it is not difficult to imagine that, some day, possibly discovering the FOIA stuff, he decided that it was too much.
OT: Someone please educate Dr. Jeff Masters over at Weather Underground. If you can. I am beginning to wonder if he is someone who is in bed with these same groups of people because he gleefully talks about warming while hiding cooling. Just look at this blog.
http://www.wunderground.com/blog/JeffMasters/comment.html?entrynum=1389
“Mike (09:38:32) :
One should just run this code on some totally random data series and plot the output.”
Given the sustained bias of the “synthetic” corrections, I wonder if it would actually be possible to output anything BUT warming. Would it be possible to put in made up declining data and see what happens? I suspect it would come out as a hockey stick anyway.
I sincerely hope we will find out, and interview the person who wrote Harry. His insight into what was going on was invaluable.
Now I am convinced it was an inside job. Seeing the exasperation in the comments paints for me a very convincing picture of who the leaker was. Some code monkey, who was probably also doing double duty as an IT tech. He (or she) finally got fed up with the constant demands of the heads to do things that fly in the face of both ethical scientific procedure, and worse still, best practice computer programming.
I guess they worked him one Sunday too many, or gave him black marks on his review because he couldn’t get the computer to say what they wanted. The programmer went off the deep end and decided to start compiling a file of some of the more incriminating skeletons in the CRU’s closet.
Can anyone tell me why Copenhagen is still going ahead?? !
I have worked as a professional programmer for more than 20 years, and I think that the language in these comments is strange, to say the least. I mean – I have often been swearing over poorly documented spaghetti code – (almost) as bad as this one – but I have NEVER put the swearing into writing. In my opinion, this stinks. It seems that the author of these comments WANTED the world to see them. He is certainly writing to another audience than his fellow programmers. So there are two possibilities: 1) Either, the programmer (Harry?) is the whistleblower, or 2) This is a trap.
Hultquist, Bosseler:
I’m not a programmer so I’ll probably say this wrong. I came across a commenter the other day (wish I could remember where) who seemed to have an explanation for the sum-of-squared variable going negative. He said it was a common error for inexperienced programmers to make: repeatedly incrementing a variable until the sign bit gets changed. Kind of an overflow problem. He took it as an indicator of the quality of the coding….
Henry chance:
I notice that Joel Shore has also stopped posting as of 11/19.
I’ve warned Joel before that every post has a time/date stamp.
>>” Paul (10:12:48) :
I keep seeing this line in the code;
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
but what does it actually mean?
Observed Temperatures?
Homogenized Temperatures?
Fudge Factor Temperatures?
Not really sure what to make of this statement as yet…”
I’ve presumed it means the following: ‘Given the underlying dataset, our model predicts that the years from 1960 onwards should have been warmer…. warmer, that is, than they actually were. If we therefore publish the “unpolished” results from this model, people will be able to compare the predicted temps to the actual temps…. and it’ll become clear that our model makes poor predictions. Then they’d conclude that its predictions about the future obviously cannot be relied upon….’ And we wouldn’t want anybody doubting the model, would we?!
That’s a personal supposition; informed enlightenment requested.
Great reporting. With any luck CERN will finally nail it all with their CLOUD experiments.
“Do any other sciences permit one to hide calculations in a program and then not publish said program with the paper?”
Yes and no. Journals have differing standards; sometimes they’re enforced, sometimes they’re not; sometimes they would go to the length you describe, sometimes not. This particular crew at Hadley has taken heat because the policy implications of their work imply a heavy right to know on the part of the public, and especially of people who wanted to and were capable of reviewing the methods. At which point the journals’ policies on the matter were examined and they were prevailed upon to actually enforce them, which they often didn’t, which led to numerous and multiplying efforts to either get the information through enforcement of their policies or subsequently FOIA requests, which has eventually led to… this.
But if we’re talking about a study in biology on the contents of the feces of some frog in the far corners of the jungle, it’s likely the journal it’s submitted to wouldn’t enforce their policy, and it’s likely no one would care. People cared here, and these guys, having been lifetime academics and thus never actually having to earn a living via the quality of their work, thought they could simply blow off their detractors.
Hysteria, I’m not so sure. O’Reilly has, I believe, said that Global Warming is real and something needs to be done about it. So did McCain during his bid for the Presidency. That indicates a fraction of the conservative base had been convinced this was a real issue that needed to be addressed.
This revelation of subterfuge and skulduggery has certainly made a bunch of conservatives re-think this position, and I’m sure libertarians and a handful of liberals as well. Beat this drum loud enough, often enough, and the support base for global warming hysteria will once again return to little more than tree-hugging alarmists. But the time to act is NOW, before any more talk of ‘cap and trade’ or Copenhagen concessions make their way through Congress.
A bit OT, but maybe not too far. A question for the legal beagles out there: When a close circle of researchers conspire to block another researcher’s publication, would that not be tortious interference under the law?
It is, however, going to make interesting reading in the history books, on a number of different levels, perhaps even on a par with Piltdown Man(n).
http://en.wikipedia.org/wiki/Piltdown_Man
The worst possible scenario now is that the Sun will continue down its path of declining activity, CERN’s CLOUD will pan out to support Svensmark, and the global climate will enter territory we don’t really understand, signalling real trouble. And all because some overzealous hypothesis funding gravy-trained the world’s climate databases into a spaghetti-coded event horizon.
This is a perfect example of Murphy’s Law striking mankind due to pure greed.
New internet meme: “Harry_Read_ME”
Examples:
HRM: You may have the chart upside down.
It was a HARRY_READ_ME job.
After an industrial accident: Their control code was still waiting for Harry to read it.
If only Harry were here to read this.
Etc.
BREAKING
http://www.guardian.co.uk/environment/georgemonbiot/2009/nov/25/monbiot-climate-leak-crisis-response
Over at Connolley’s blog I posted:
“The Emails show that Jones and Mann can’t be trusted. HARRY_READ_ME shows that the code is incompetent and the code itself shows manual adjustments that have no scientific basis. This is sufficient evidence to call for a third party review of the entire CRU methodology. ”
To which I got two replies:
——
PR Guy – what papers was the HARRY_READ_ME code used on? Have you any evidence it was used at all?
Posted by: Chris S. | November 25, 2009 12:07 PM
—–
PR Guy,do you even know which dataset/product the HARRY_READ_ME code is dealing with?
Posted by: Adam | November 25, 2009 12:46 PM
—-
To which I responded:
“Chris S and Adam, these are very reasonable questions. Perhaps you should submit a FOIA to find out. I’m sure we all agree that answers to these sorts of questions are vital and should not be obstructed.”
This last comment was deleted by William (or maybe the moderator, if there is a moderator). The Team never lets points get scored against them on their court.
CLIMATE GATE IN NEW ZEALAND!
J. Salinger’s climate faction caught in a temperature swindle:
http://nzclimatescience.net/index.php?option=com_content&task=view&id=550&Itemid=1
Here NZ temperature graph before and after “adjusments”:
http://www.klimadebat.dk/forum/vedhaeftninger/newzealand.jpg
Salinger has changed the NZ temperature trend for the 20th century from 0.06 K to 0.92 K!
The team behind these findings will now move on to other countries.
WAY TO GO!
I’ve done some programming myself. I used to put in swear words in the code all the time.
In fact, I would even use the F word and variations thereof for variable and object names, LOL.
Hi Folks
What do you think about this one? It looks as if the “data adjustment contamination” has infected New Zealand as well. Look at
http://www.climatescience.org.nz/
and click on Link at: CLIMATEGATE IN NEW ZEALAND? – TEMPERATURE RECORDS MANIPULATED Science
It is incredible to read how New Zealand’s National Institute of Water & Atmospheric Research (NIWA) seems to have managed to make a “hockey stick” out of their raw data which apparently shows that there has been no warming of any consequence since 1850. One wonders who else has been involved in this game.
I guess this proves their point that Global Warming truly is man-made. Totally made up, in fact, by a few.
I mean seriously, [snip]????? “yearlyadj”? Temp proxy declines so just add a ramp to the values??????
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
(…)
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
I had a really scary thought…
What if the problem isn’t with the tree rings, but with the temperature series?
What if the temperature series is actually off by 2.6C (high) since 1940?
Does that mean we’re actually 2C or more below the 1940s temps?
Are we screwed?
The major problem with climate science (along with much of academia) is that they don’t use ‘Software Engineers’ following good software development practices to develop the programs, models, etc. Mostly it’s just PhDs hacking stuff together. They are very smart… but software engineering is not their expertise.
Ask them where the requirements documents, design documents, review documents, test plans, test results, configuration control plans, etc. are.
Now they are asking for $100 billion to be spent based, to some extent, upon software that has not passed any formal testing…
JCS
“Steve Gavin at RealClimate has refused more than 6 times to post the following message, are you willing to present this very important point for me?”
LOL. JCS, welcome to the club of thousands who Gavin has moderated out because he finds their comments inconvenient. Gavin is a part of the AGW cabal. He is on the distribution list for many of the e-mails from CRU gate. Some of those comments make it clear that Jones and others consider Gavin as the guy that runs interference for them. I have written a small piece about how debates are orchestrated at RC here:
http://reallyrealclimate.blogspot.com/
HARRY_READ_ME is a great work of stream-of-consciousness literature, and perhaps Harry was aware of it. So when they asked him to delete it, after he got all excited about it being included in a Freedom of Information Act package that was then denied release… he said to himself:
“No, so holp me Petault, it is not a miseffectual whyancinthinous riot of blots and blurs and bars and balls and hoops and wriggles and juxtaposed jottings linked by spurts of speed: it only looks as like is as damn it; and, sure, we ought really to rest thankful that at this deleteful hour of dungflies dawning we have even a written on with dried ink scrap of paper at all to show for ourselves, tare it or leaf it, (and we are lufted to ourselves as the soulfisher when he led the cat out of the bout) after all that we lost and plundered of it even to the hidmost coignings of the earth and all it has gone through and by all means, after a good ground kiss to Terracussa and for wars luck our lefftoff’s flung over our home homeplate, cling to it as with drowning hands, hoping against all hope all the while that, by the light of philosophy, (and may she never folsage us!) things will begain to clear up a bit one way or another within the next quarrel of an hour and be hanged to them as ten to one they will too, please the pigs, as they ought to categorically, as, strickly between ourselves, there is a limit to all things so this will never do.” – James Joyce (“Finnegans Wake” 1939)
Run these numbers, see if that makes a hockey stick…………
http://spreadsheets.google.com/ccc?key=0Ah4XLQCleuUYdFIxMnhMNnlXb2JQcDZUendjUXpWWUE&hl=en
Are there perhaps some people around still denying this fairly decent evidence of poor science? Is there some term we could use for them perhaps?
;mknormal,yyy,timey,refperiod=[1881,1940]
;
“; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
(…)
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj”
************************************
Hmm. Are there edited versions out there? My file:
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
;
oplot,!x.crange,[0.,0.],linestyle=1
;
plot,[0,1],/nodata,xstyle=4,ystyle=4
;legend,['Northern Hemisphere April-September instrumental temperature',$
; 'Northern Hemisphere MXD',$
; 'Northern Hemisphere MXD corrected for decline'],$
; colors=[22,21,20],thick=[3,3,3],margin=0.6,spacing=1.5
legend,['Northern Hemisphere April-September instrumental temperature',$
'Northern Hemisphere MXD'],$
colors=[22,21],thick=[3,3],margin=0.6,spacing=1.5
;
end
Reed Coray (09:35:23) : “http://i47.tinypic.com/mrxszt.png
Great picture, but I have a question. Which one is Judas?”
Easy. They all are. 🙂
If I produced work like that I would want to delete it also rather than give it to an auditor.
Short of a detailed explanation and the full production of a model establishing this as nothing more than an interim piece of work, Jones should be fired for this alone.
He probably should also be held to account for the fraud that this is. This is beyond incompetence for somebody in Jones’s position if he can’t bring something forward to mitigate it!
This would also appear to establish Jones as a liar in suggesting the “to hide the decline” behavior only dealt with modifying a graphic.
Very disappointing. slashdot.org has only posted the original Nov 20th story about the fact that files had been hacked (or leaked) from the Hadley CRU. No follow-up about the content or the code that is being found.
Global economic nightfall, perhaps aided and abetted by programmers, and not a peep out of slashdot?
That isn’t considered very professional in my business, to be honest.
Excuse me, Mr. Monbiot. Nobody has given me a copper-coated zinc penny.
I want my weather back. Get it?
Dishman (11:00:55) :
Are we scrooged?
Probably.
Depends upon whether you can trust that the GHCN (with Jones/Karl), online now, is not the mangled mess HARRY was tasked with, while supposing that the MasterDB will somehow miraculously appear in a pristine state.
Now, just close your eyes, click your heels 3 times, and say “there’s no place like home, there’s no place like home”.
If we don’t fix the science and clean out the “climate caves” we will have more of this bogus in the future:
http://heliogenic.blogspot.com/2009/11/can-it-get-any-more-hysterical.html
http://www.solarcycle24.com/stereobehind.htm
Zzzzzz…….snore….zzzzz…..snuck, snore….zzzzzz
documents\cru-code\linux\cruts
This code is used to convert new data into the new CRU 2.0 data format (.cts files).
There is another version of this code in cru-code\alpha which is, per comment in the readme file, intended for running on the “Alphas”
Data can come in from text files or Excel spreadsheet files (or actually Excel spreadsheets written out to text files from Excel). These programs are designed to read multiple climate data file formats, including:
GHCNv2
CLIMAT (Phil Jones format)
MCDW
CLIMAT (original)
CLIMAT (AOPC-Offenbach)
Jian’s Chinese data from Excel (appears to be text output from Excel)
CRU time-series file format – with the comment “(but not quite right)”
Data files for running these code files are not available in this archive.
Software engineering comment – this collection of programs – very large source code files – is implementing a crude database management system. Most of the source code is uncommented and undocumented. From a s/w engineering perspective, it would have seemed wise to have used an existing DBMS that had been extensively tested and verified. Instead, the approach chosen results in extremely large amounts of custom code being written. There is no evidence provided of software quality assurance (SQA) procedures being applied, such as a test plan, test scenarios, unit testing, test-driven development and so forth. It would most likely have been quicker and more reliable to use an existing, proven DBMS.
The goal of the software is to eventually calculate the anomalies of the temperature series from the 1961-1990 mean.
Because station reporting data is often missing, the code works to find nearby stations and then substitute those values directly or through a weighting procedure. In effect, the code is estimating a value for missing data. Station data will be used as long as at least 75% of the reporting periods are present (or stated the other way, up to 25% of the data can be missing and missing data will be estimated).
The linux\_READ_ME.txt file contains an extensive description. Of interest, stations within 8km of each other are considered “duplicates” and the data between the stations is “merged”. I have a question about this which may not really matter – but there is no attempt to determine whether the nearby stations are correlated with one another. It is possible, for example, that one station is near a body of water (and less volatile) and another is on the roof of a fire station (see surfacestations.org). Or the stations could be at different elevations. In my town, the official weather reporting station moved 4 times over the past century – from downtown in a river valley to, eventually, up on a plateau next to a windy airport. These locations would today fall within the 8km bounding area. My concern is that this could skew results in an unpredictable way. Then again, it could be that situations like the one I describe are rare and would have negligible impact on the calculations.
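To make those two rules concrete, here is an illustrative restatement (Python; the 1961-1990 base period and the 75% threshold are the ones described above, while the function name and example data are invented; this is not CRU’s code):

import numpy as np

def station_anomalies(years, temps, base=(1961, 1990), min_fraction=0.75):
    """Return anomalies against the base-period mean, or None if fewer than
    75% of the station's base-period values are present (NaN marks a gap)."""
    years = np.asarray(years)
    temps = np.asarray(temps, dtype=float)
    in_base = (years >= base[0]) & (years <= base[1])
    if np.isfinite(temps[in_base]).mean() < min_fraction:
        return None                            # too many gaps: reject the station
    return temps - np.nanmean(temps[in_base])  # anomaly against the 1961-1990 mean

# Invented example: a complete record with a gentle warming trend.
yrs = np.arange(1950, 2000)
print(station_anomalies(yrs, 10.0 + 0.01 * (yrs - 1950))[:3])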
Robinson: I may have exaggerated a bit. I only did it for one particular employer, and it was because I was working 14+ hours a day and being placed on 24/7 pager duty with no extra compensation (imagine getting called during your grandmother’s funeral or while in Church on Christmas Eve – yes, it happened to me). I survived 20 rounds of layoffs at that company during the dotcom crash.
So in order to relieve a bit of frustration, I had a bit of fun with my source code.
And one of these programs was literally a SPAM DIALER, which was a robo caller that would annoy people with telemarketing promotions. It was quite efficient, it was able to call about 20,000 people a day (with 2 T1s).
Yes, I confess that I programmed a spam dialer during the dotcom crash in order to pay the bills. And there were swear words in the source code! 🙂 (and I don’t feel the least bad about it).
The company is now dead and bankrupt and I danced a little jig when I found out about it a few years later!
ManBearPig
It is, however, going to make interesting reading in the history books, on a number of different levels, perhaps even on a par with Piltdown Man(n).
In spite of all this, a BBC program announcement for this evening, ref Copenhagen, is: “Can President Obama save the Planet?” I despair.
“a sum-of-squared variable is becoming very, very negative!”
…nice … the only way to get a large negative with a sum of squares is if the numbers are “imaginary” [no pun intended – i.e. i = sqrt(-1) ]
…but an ironic comment nonetheless, given the second meaning of “imaginary” numbers (i.e. they made them up)
The real meat of this whole deal is likely to be found in the code & how the data has been manipulated. Not that this is necessarily illegal, but it will clearly show that the science is not nearly as settled as it is reported to be (especially for making trillion dollar decisions based on POS code). It would also appear, based on all the information I have seen so far, that the magnitude of warming over the recent historical record (roughly 130 years) may be less (possibly significantly less) than has been represented.
The significance of the last statement cannot be over-emphasized. This is what needs to be determined (or re-determined) ASAP. I wish I had more time, as I would delve into it.
The original data isn’t necessarily needed to do this. If a synthetic dataset of reasonable similarity were created & then run through the code, you could look at the output compared to the input & get a sense of how the data has been distorted. With several synthetic datasets built on different assumptions, you could test the sensitivity to different aspects of the input. With that information in hand, you could probably create reasonable scalings to estimate what the original data looked like & how it has been distorted & presented to the public.
This approach is somewhat similar to what Steve McIntyre did with the hockey stick de-bunking, where he eventually showed that even inputting a random number sequence resulted in a “hockey stick”. If someone who has the time & skills could take this same approach with this code, it might not just kill a “hockey stick” but AGW in total – in other words, it is possible that no matter what raw data you put in, the global temp trend always comes out increasing. Don’t dismiss this possibility! If this hypothesis were borne out, AGW would be categorically dead. Given the implication, it certainly seems worth the time & effort.
The above scenario is interesting to contemplate, but the most likely scenario is that the actual warming is less than represented. Keep in mind, if you just do a logarithmic curve fit to the data as currently presented (temp & CO2 have a logarithmic relationship theoretically), the sensitivity in terms of degrees per CO2 doubling is already substantially below the IPCC numbers (see my post on “spencer on finding a new climate sensitivity marker”, 10-5-09). So, if the actual temp trend is flatter than currently represented coming out of CRU, it means the sensitivity is even smaller still. Of course, that would mean that going forward there is no way to represent CO2 as a significant problem, and AGW as a “problem” is dead.
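To put a number on that last point, a toy sketch (Python; the CO2 trajectory and the two warming totals are made-up round figures, not measurements, and the fit is the crude logarithmic one described above):

import numpy as np

co2 = np.linspace(290.0, 390.0, 131)      # rough 1880-2010 CO2 path in ppm (illustrative)
for total_warming in (0.8, 0.5):          # total warming over the record, deg C (illustrative)
    temp = total_warming * (co2 - co2[0]) / (co2[-1] - co2[0])
    sens = np.polyfit(np.log2(co2 / co2[0]), temp, 1)[0]
    print(total_warming, round(sens, 2))  # a flatter record fits a smaller deg C per doubling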
Hiding the Decline: Part 1 – The Adventure Begins (Eric S. Raymond)
http://esr.ibiblio.org/?p=1447
I’d like to know what the “decline” is. Is it temperature?
documents\cru-code\f77\mnew\sh2sp_m.for
This program, sunh2sunp, converts the “sun hours monthly time series to sun percent (n/N)”. I do not have access to the cited reference used for calculation so have not yet determined if the code is implemented correctly.
However, in the odd situation where the calculation exceeds 100%, the code, surprisingly, checks for this but then leaves the incorrect value in place:
c decide what to do when % > 100
if(sunp(im).gt.100)sunp(im)=sunp(im)
For non programmers this says, in simplified form
if x > 100, then let x = x
Normally, if a value is incorrect, the error is either flagged or (if, as perhaps in this case, the excess is only round-off error) the value is clamped, in which case we might expect something like:
if x > 100, then let x = 100
which would force the value of x to never exceed 100.
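Or, restated in Python (purely an illustration of the two patterns, not CRU’s code):

def buggy_sun_percent(x):
    if x > 100:
        x = x             # the pattern in sh2sp_m.for: a no-op, the bad value survives
    return x

def clamped_sun_percent(x):
    return min(x, 100)    # the fix one would expect: cap the value at 100

print(buggy_sun_percent(103), clamped_sun_percent(103))  # 103 vs 100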
The purpose of this program and how it fits into any analysis is not yet understood. The program appears to date back to 1997 (and probably went out of usage by 2003) and it may no longer be in use. It is entirely possible that the above error condition never occurred – and consequently, this defect in the software would have no impact on the results.
In a separate code file (sp2cld_m.for), the above test is implemented correctly:
IF(CLD(im).GT.80.0) CLD(im)=80.0
The file exhibits poor Fortran coding standards such as:
ratio=(REAL(sunp(im))/1000)
IF(RATIO.GE.0.95) CLD(im)=0
Note the lower case ‘ratio’ and the upper case ‘RATIO’ variable names.
The variables XLAT and RATIO are not declared. Similarly for iy, iy1, iy2. Fortran permits this practice (implicit typing) unless IMPLICIT NONE is specified, and automatically assigns the type based on the first letter of the variable name: names beginning with I through N become integers, while A through H and O through Z become type ‘real’. Use of this feature is discouraged because the compiler is then unable to flag typographical errors – instead of warning about an undeclared variable, it just defines a new one. This can result in erroneous program operation if such a typo occurs. Note – this is a software engineering issue and is not the source of any identified execution errors in this program.
Note – the issues I cite do not mean the program executed incorrectly. They are more indications of poor programming practices. And I believe we, the people, deserve the utmost care and professionalism in a matter as important as this.
I have to say, speaking as an OBI/Hyperion system consultant, the biggest issues we have on virtually every project are:
1) Serious data issues
2) People don’t understand the data in the first place.
I feel for Harry.
Amazing !!!
I can’t believe that this is all real !
The REAL IRONY in this DEBACLE is the INTERNET that GORE INVENTED is going to hasten HIS DEMISE.
Forgive me for jumping in here, as it’s a bit OT.
Richard A. said:
‘But if we’re talking about a study in biology on the contents of the feces of some frog in the far corners of the jungle, it’s likely the journal it’s submitted to wouldn’t enforce their policy, and it’s likely no one would care.’
Not so! To the contrary. Without documenting how many frogs you were studying, and where (not forgetting a control group, heh!) and at what time (dd/mm/yy), you’d get your paper sent back.
All this comes under ‘Material and Methods’.
Next you have ‘Results’.
Thats where all your numbers go, and the stats.
The point is, especially in biology, that anybody must be able to go where you went, do exactly as you’ve done, and come up with the same results (given a dead frog here and there …)
Then you can talk about what you’ve done and what your results mean.
That is why I, a retired zoologist, find these revelations so utterly distressing. If you don’t provide the data on which you’ve built your hypothesis, how can it ever be replicated? How can it be confirmed or refuted?
Science is about replication of what you discovered – it’s not about secret knowledge which only the select are allowed to share.
I am dismayed at the huge disservice these people have done to science.
M.A.DeLuca (09:40:16) wrote :
“If a physicist were to submit a paper without showing the math, that paper would (I assume) be rightly ridiculed and sent back with a “show your work” rebuke. It doesn’t seem right that one can hide one’s work in software, and then casually dismiss the absence of documented code upon submitting a paper as these yahoos have done. And yet, that seems exactly the way mainstream climatology works. Do any other sciences permit one to hide calculations in a program and then not publish said program with the paper?”
M.A., I am a medical research scientist with five peer reviewed abstracts published in leading journals. Last year one of the papers I co-authored was selected for oral presentation (a high honor).
Before we can even begin a study, we face a panel of experts called an IRB (Institutional Review Board). They review our hypothesis, proposed methodology, demographics, and inclusion criteria we intend to use in the study. This is our first peer review. If we don’t pass this review, the study is dead.
While we conduct the study, we must be absolutely careful to follow the study protocol approved by the IRB. If we discover anything that needs to be changed in the protocol, we must stop the study and go back to the IRB to request a protocol change approval. We can’t simply say “we’ll make adjustments here and there to fix the problem.” The IRB holds another review. If we can’t get their approval, the study is dead.
When the study is completed, we then submit it to the journal for publication. We are required to disclose all data and methods sufficient to reproduce our results. Typically, we create a resource package containing the database (in XLS format to facilitate import into any database), queries, and formulas. If formulas are calculated using computer programs, we supply the source code. There is no concept of hiding behind intellectual property in the disclosure – if we can’t provide the means to reproduce the study using our methodologies, the study is summarily rejected. By journal requirements, we must make the same package available to any doctor or center requesting it. We can charge a reasonable processing fee to defray the costs of providing the package.
We do not choose who the reviewers will be. It is during this official peer review process that we must respond to any and all questions from the reviewers. Sometimes we are asked to include additional information in the abstract or fix up citations and other presentation issues. Assuming we pass the peer review, the abstract is published. All of the above applies even to retrospective studies (most climate studies are retrospective).
From what I’m seeing, it appears that the climate journals have little to no independence in peer review and bow quite low to peer pressure. When people say “peer reviewed” in the context of climate, I laugh and remind them it’s a good ol’ boy network. Bring a case of Mann’s favorite beer and it’ll get published.
This all may be a distraction, to keep our eye off the Copenhagen ball.
Make no mistake, they have not given up. Quite the contrary.
“Obama says ‘step closer’ to climate deal”
http://news.ninemsn.com.au/article.aspx?id=975599
They have given us the sacrificial goat, which has served its purpose. And, while we have a feeding frenzy, the real beast walks by unhindered, and barely noticed.
In less than 8 hours, google hits on “Climategate” have gone from 160,000 to 24,200,000. Hockey stick anyone? Isn’t 24,200,000 about the same as WUWT hits? Coincidence or WATT?
The bug shown traipsing across this code (in the picture) … looks a lot like the assassin beetles that buzz into our house every summer. If so, this picture is apt.
Wow, instead of cutting out the lines before 1400, they should have used a low-pass filter. They know how to use a high pass filter already, so why not a low pass filter?
How hard is it really to read a thermometer? From what they say, the temperature read on any thermometer is not the real temperature… that’s news to me! I’d better get a copy of their code, because I have many thermometers, RTDs and thermocouples in my lab.
Damn it, for all my life I thought water froze at 0 Celsius and boiled at 100 Celsius at 1 atmosphere… got to go back to school to learn the new rules provided by this bunch of people.
Proof again (if any was needed) that HADCRUT is untrustworthy for climatology! We should rely only on satellite temperatures, from 1979 onward, since their provenance and processing are better documented, better maintained, and independently validated (UAH vs RSS).
All this blog communicating/ranting is fine but I’m here to tell ya that the Copenhagen ’support change’ bits are rolling on the radios. Sirius Left channel 146 are running them big time. Bill Press, Alex Bennett, Thom Hartmann, Lynn Samuels, Mark Thompson, Mike Malloy and others… google ‘em up, call ‘em up, get email addresses from the sirius left site and give ‘em a hard time. I do, every day!
They hang up on me, they know me too well, but you smart people can get through to a real load of people at a perfect time. They are a lot of fun to mess with.
this is so bad.
the global financial ramifications of this fraud are incomprehensibly huge. even when one considers damages incurred to date, let alone the future damages of pending legislation, there are at least hundreds of billions of dollars that have been bilked from taxpayers worldwide to fund this insidious mess. it must be the largest single fraud ever perpetrated.
anybody an expert on class action lawsuits to recover damages and put the fraud that is global climate change on trial?
at least in that case the world may be able to subpoena these groups to get at the truth once and for all…
makes one sick to think what a handful of politicians and “scientists” have been able to do to all of us and nearly every industry.
as a builder, the entire LEED certification process is nearly fully predicated on data and conclusions put forth by these groups, which has resulted in enormous additional costs on nearly every public construction project. what a sham.
i’m all for sustainability, but fraud is fraud.
-patternbuilder
When I was a student, we had to turn in our source code for our projects.
We couldn’t just turn in some output and say “see, I got the right answer”. The source code and the output was evaluated. The code had to be properly commented.
It’s obvious these programmers never expected any outsiders to view the source code. Is the science as sloppy as the code? By all appearances, yes.
Seems to me that “calibration” of tree ring data is a bit of a joke.
For the modern periods where we have the highest quality records available we see the “recalibration” where the numbers get “fudged” (the scientific term used in the code is “fudge factor”) in order to make a “calibration” work that makes the pre 1900 data reflect cooler temperatures than if the data had been calibrated to post 1900 data.
Not only do we have the “hide the decline at the end” (which can be argued as a plausible treatment if young tree rings are somehow not “ripe”), but the pre-1900 manipulation is just the sort of thing you need to do in order to straighten the handle of the hockey stick, isn’t it?
New Zealand Icebergs —
More than 100 icebergs that were first spotted off the coast of Macquarie Island, an Australian territory around 900 miles south east of Tasmania, are now thought to be only 200 miles away from New Zealand’s south coast.
This is only the second time in 78 years that large Antarctic icebergs have been sighted so far north.
“While the size of the icebergs has attracted a lot of attention, it is not unusual for icebergs to be found in these waters,” a spokesperson for Maritime New Zealand told CNN, who continued to say that alerts for smaller icebergs are not uncommon.
But a half-kilometer wide iceberg visible from New Zealand’s coast would represent a very rare occurrence.
“An iceberg that size this far north is pretty significant,” Philip Duncan, Head Weather Analyst of the New Zealand-based Weather Watch Center told CNN.
It is thought that the current flotilla of icebergs came off the Ross Ice shelf between 2000 and 2002, the same period that produced the 2006 icebergs.
The question now is what caused the huge fresh water icebergs to break off from an Antarctic Ice shelf and what has allowed them to travel so far north.
“A lot of people are saying it was due to a very cold snap a few years ago in Antarctica that caused more ice than usual and the outer regions of that ice snap off each summer,” said Duncan.
SOOOO — despite all the global warming claims, Antarctica was suffering very cold snaps in 2000 to 2002, enough so that more ice and snow deposited, leading to large icebergs breaking off. In other words, all the warming nuts who point to icebergs and scream “The ice is melting, we’re all going to die” failed to check the weather, which was COLDER, had more ICE, and naturally led to more bergs breaking off the Ross Ice Shelf.
Ironically, the University of East Anglia has a Computer Science department:
The fact is that they have the expertise on campus to engineer a decent piece of software. I’m not saying it would necessarily work that way of course. Most of the good practice in design and implementation I have learnt since leaving University, not while I was an undergraduate. In industry you literally won’t have a paycheck if things don’t work.
The problem is that nobody outside of a small circle of users was asked to audit the software, or, I suspect, to contribute to its design or development. It’s quite stunning that its output is being used as “evidence” (cast iron!) forming the basis of trillion dollar government programmes.
But anyway, we don’t know that the program is broken. It probably produces the desired output ;).
The more I read about this, the more disgusted I become. While in college I studied under Chris McKay somewhat (Dr. Terraforming), and the “tricks” that keep showing up here are items that he told me to not do. He always told me to go from “base principles instead of pinning.” What it seems like to me, is that the observed data doesn’t fit the model, so the data is artificially pinned to meet the expectations of the model. Disgusting.
Love the euphemisms. “Corrected…” “We can skill lines…?”
While it’s not over by a long shot, this is a taste of vindication for all of those who refused to worship at the altar of Gaia!!! And cheers to the people here and elsewhere that do the legwork for so many of us. Keep fighting the good fight!!!!
“Pieter F (09:16:40) :
… why won’t the mainstream media report on the matter?”
…Because they’re all waiting for each other to be the first to break the story properly, which will take a huge investment and a massive gamble with their credibility to pull off.
When the Telegraph exposed our MPs’ expenses they were assigning up to 60 journalists to cover it, and check all the facts before each publication. That’s why all the main papers are still only putting the CRU fraud in sidebars and in non-staffers’ opinion pieces, and not in their headlines.
RE Hank Hancock and Viv Evans
Hank is a perfect example of what it’s like when you’re actually held accountable for your work. Note Hank’s field: Medical Research. I agree Viv, these are the standards that should be met. But quite obviously they aren’t always met, or none of us would be here, right now, reading this stuff.
Wow.
*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
Just…wow….
Folks, I do forecasting for an XX billion $$ corp. The program is written in FORTRAN and has over a million lines of code. It was written by master programmers and math geeks. The program is documented in 10 volumes and the code contains extremely well-defined notes for variable and process outcomes. This was built by a private corporation to manage X billions in yearly expenditures.
To see and read anything that looks like this mess is quite disturbing. The company would go broke with this type of programming.
BTW – Anthony, this is the best site on climate I have ever seen (for 2 years now), and to all of you other science geeks, hats off for your contributions. From the girl in the northwest to the guys that can’t figure out what the sun really does to our climate, I really enjoy & learn from your inputs and knowledge. I am anal about being analytical and one of these days, one of these days Alice – bang – to the moon. I’ll contribute to this most excellent adventure.
Timothy
Swiss Bob (10:38:34) :
For those of you who don’t know, the CLOUD experiment at CERN is meant to test the connection between cosmic rays (mostly galactic in origin) and cloud nucleation, and hence cloud formation. The rate of cosmic rays entering the atmosphere is modulated by the strength of the sun’s magnetic field, carried outward by the solar wind. During strong sunspot activity the solar magnetic field strengthens, deflecting more cosmic rays away from Earth. When the sun is quiet (as it is now) that shielding weakens, allowing more cosmic rays in. To summarize the theory:
Quiet Sun -> more cosmic rays -> more clouds
Active Sun -> fewer cosmic rays -> fewer clouds
http://algorelied.com/?p=3246
Follow that link to a post on my blog speculating that “Harry Read Me” is possibly connected to CRU staffer Ian (Harry) Harrison, who has “data manipulation” as part of his job description.
It may help in shedding more light on this discussion.
Got to nail these guys, otherwise, to use an apt quote “For if we fail, the whole world will sink into a new Dark Age, made more sinister, and perhaps more prolonged, by the light of a perverted science.”
Who is Harry?
Possibly Ian (Harry) Harris – not Harrison as suggested above.
http://www.cru.uea.ac.uk/cru/people/
Incredible.
A scientist defends climate data he knows has been tampered with and calls for an investigation at the same time. That’s what I call “turning on a dime”.
“Scam artist” gets a whole new meaning.
http://climaterealists.com/index.php?id=4457&utm_source=feedburner&utm_medium=feedutm_campaign=Feed%3A+co2sceptics%2Fnews+%28CO2sceptics+News+Blog%29
Question for the warmists, if you saw data like this having been used for building an aeroplane, would you fly in that aeroplane?
Only in their world are tree-thermometers better than real thermometers. Amazing!
http://www.youtube.com/watch?v=deiOUtn5Gh8
*************************************
Robinson (11:58:42) :
The fact is that they have the expertise on campus to engineer a decent piece of software. I’m not saying it would necessarily work that way of course. Most of the good practice in design and implementation I have learnt since leaving University, not while I was an undergraduate. In industry you literally won’t have a paycheck if things don’t work.
**************************************
Judging from the comments, it probably isn’t the programmer that’s bad, it’s the data. It appears he was being pushed to achieve a certain outcome. If he was disgusted and frustrated enough, he just might do something a little rash like push the code and other info into the wild.
http://www.ems.psu.edu/sites/default/files/u5/Mann_Public_Statement.pdf
“Michael Mann’s articles have been published in well respected peer reviewed scientific journals”
Mann brings a new meaning to “scientific peer reviewed journals”
Connected to the station count problem, this was one of my jaw-dropping moments so far in HARRY. It confirms the problem I diagnosed in GHCN concerning station “sphere of influence” effects.
“Worked out an algorithm from scratch. It seems to give better answers than the others, so we’ll go
with that. Also decided that the approach I was taking (pick a gridline of latitude and reverse-
engineer the GCD algorithm so the unknown is the second lon) was overcomplicated, when we don’t
need to know where it hits, just that it does. Since for any cell the nearest point to the station
will be a vertex, we can test candidate cells for the distance from the appropriate vertex to the
station. Program is stncounts.for, but is causing immense problems.
The problem is, really, the huge numbers of cells potentially involved in one station, particularly
at high latitudes. Working out the possible bounding box when you’re within cdd of a pole (ie, for
tmean with a cdd of 1200, the N-S extent is over 20 cells (10 degs) in each direction. Maybe not a
serious problem for the current datasets but an example of the complexity. Also, deciding on the
potential bounding box is nontrivial, because of cell ‘width’ changes at high latitudes (at 61 degs
North, the half-degree cells are only 27km wide! With a precip cdd of 450 km this means the
bounding box is dozens of cells wide – and will be wider at the Northern edge!
Clearly a large number of cells are being marked as covered by each station. So in densely-stationed
areas there will be considerable smoothing, and in sparsely-stationed (or empty) areas, there will be
possibly untypical data. I might suggest two station counts – one of actual stations contributing from
within the cell, one for stations contributing from within the cdd. The former being a subset of the
latter, so the latter could be used as the previous release was used.
Well, got stncounts.for working, finally. And, out of malicious interest, I dumped the first station’s
coverage to a text file and counted up how many cells it ‘influenced’. The station was at 10.6E, 61.0N.
The total number of cells covered was a staggering 476! Or, if you prefer, 475 indirect and one direct.”
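For anyone trying to follow what Harry is describing there, here is a rough sketch in Python of the idea as I read it (a reconstruction from the notes above, emphatically not the actual stncounts.for): for each half-degree cell, take the cell corner nearest the station, compute the great-circle distance, and count the cell as ‘influenced’ if that distance is within the correlation decay distance (cdd).

import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Haversine great-circle distance in kilometres.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

def cells_influenced(stn_lat, stn_lon, cdd_km, cell_deg=0.5):
    # Count grid cells whose nearest corner lies within cdd_km of the station.
    # Per the notes, the nearest point of a cell to an outside station is taken
    # to be one of its corners, so testing corner distances is enough.
    # This brute-forces every cell on the globe rather than working out the
    # awkward high-latitude bounding box the notes agonise over.
    count = 0
    nlat, nlon = int(180 / cell_deg), int(360 / cell_deg)
    for i in range(nlat):
        south = -90 + i * cell_deg
        for j in range(nlon):
            west = -180 + j * cell_deg
            corners = [(south, west), (south, west + cell_deg),
                       (south + cell_deg, west), (south + cell_deg, west + cell_deg)]
            nearest = min(great_circle_km(stn_lat, stn_lon, la, lo) for la, lo in corners)
            if nearest <= cdd_km:
                count += 1
    return count

# The station Harry mentions (10.6E, 61.0N), with the precip cdd of 450 km
print(cells_influenced(61.0, 10.6, 450.0))

At 61 degrees north the half-degree cells really are only about 27 km wide, so a 450 km radius sweeps up a startling number of them, which is exactly the smoothing effect Harry is worrying about.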
Ray (12:20:28) :
Only in their world are tree-thermometers better than real thermometers. Amazing
Yep. The tree-ring data from one tree has the power to overrule satellite and human-gathered readings. You betcha.
As a software engineer I ask: where are the test cases that prove the proper functioning of all this code? Surely there could be test data sets that could be fed in and check that the output is as expected, before applying it to real data? Test handling missing stations, duplicated stations, wildly varying Tmin/Tmax to generate alerts during generation.
If this is not done then a simple programming bug introduced when modifying the code will remain hidden and be very difficult to discover. Only code reviews or blind luck would catch it without test cases.
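For what it’s worth, here is a minimal sketch of the kind of test I mean (hypothetical function and field names, nothing from the CRU code):

import unittest

def merge_stations(records):
    # Hypothetical ingest step: combine station records keyed by WMO code,
    # rejecting duplicates and flagging missing values (-9999 here) as None.
    merged = {}
    for rec in records:
        wmo = rec["wmo"]
        if wmo in merged:
            raise ValueError(f"duplicate station {wmo}")
        merged[wmo] = [None if v == -9999 else v for v in rec["temps"]]
    return merged

class MergeStationTests(unittest.TestCase):
    def test_duplicate_station_rejected(self):
        recs = [{"wmo": 10010, "temps": [1.0]}, {"wmo": 10010, "temps": [2.0]}]
        with self.assertRaises(ValueError):
            merge_stations(recs)

    def test_missing_value_flagged(self):
        recs = [{"wmo": 10020, "temps": [3.5, -9999, 4.1]}]
        self.assertEqual(merge_stations(recs)[10020], [3.5, None, 4.1])

if __name__ == "__main__":
    unittest.main()

A few dozen checks like these, run before every release, would have caught a lot of what the read-me describes long before it reached a published dataset.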
Calibrate to land only poleward of 20N?
That takes the cake.
Finally, someone mentioned Nyquist! It is about time.
Jonny B. Good (12:22:06) :
Purchase proprietary data products from independent sources: $12,000
Pay programmer to adjust 1940’s and hide the decline: $85,000
Pay publishers to publish the results: 1 case of beer ($18.95)
Travel to Tahiti: $9,462
Take Andy Revkin out to dinner: $350 (including wine and cheese)
Senate investigation: priceless!
I’m starting to think “climategate” is going to turn into a $100,000,000 (or so) grant to CRU for software engineers to clean up the mess. Jones will be the sacrificial lamb, but he’ll walk with a huge severance package since he brought the huge grant in.
Never let a good crisis go to waste, you know.
Someone should see if MythBusters would like to take a crack at re-constructing the series, Just for Kicks.
REPLY: There’s no explosions or high speed crashes involved, doubtful they’d be interested. Climate science can be excruciatingly dull compared to TV science. – A
Mark (11:33:11) :
“I’d like to know what the “decline” is. Is it temperature?”
I’ll take a stab at your question with an offer to have anyone else correct it. The decline is not in the temperature record, but in the proxy data (tree rings, etc.). During a “calibration period”, the temp data is matched up to proxy data for good correlation. The CRU problem is the well-known “divergence” that occurred between the proxy data and the temperature record, i.e., the temperatures went up, but the proxy data went down. To hide that decline in the proxy data, the temperature was spliced onto the end of the proxies at a convenient point (1940-1960). Voila, no more divergence.
Which begs the question: if the proxies diverged after this period, why couldn’t they have diverged before the era of instrumental temp records? Any takers?
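To make the splice concrete, here is a toy sketch (my own illustration; the real CRU processing is in IDL and far more involved):

import numpy as np

def splice_proxy_with_instrumental(years, proxy, instrumental, cutoff=1960):
    # Illustrative only: keep the proxy up to the cutoff year, then continue
    # with the instrumental record, which is what the description above of
    # "hiding the decline" amounts to. Not the actual CRU code.
    return np.where(years <= cutoff, proxy, instrumental)

years = np.arange(1900, 2000)
proxy = np.concatenate([np.linspace(0.0, 0.4, 60),   # tracks warming to 1960
                        np.linspace(0.4, 0.0, 40)])  # then "declines"
instrumental = np.linspace(0.0, 0.8, 100)            # keeps warming
blended = splice_proxy_with_instrumental(years, proxy, instrumental)

Plot the blended series and the post-1960 decline has simply vanished from view.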
Of the people that work there, what names do not appear in the emails, and what names do not appear as part of a program or data filename?
Slashdot comments has an interesting link to a Finnish TV documentary where the reporters discovered the ‘climate scientists’ had flipped a temperature chart that was showing cooling to show warming instead.
How do you say Gotcha! in Finnish?
Does anyone even know what this code is supposed to do?
We ‘programming geeks’ try to comment our code so we can understand it when we go back to revise it, sometimes years later. Since we expect that we are the only ones who will ever read it, we tend to be very honest in our comments, especially “why” we include/exclude/fudge something.
Fudge factors are normal when trying to account for ‘real world’ data with an imperfect model. Anyone involved in modelling knows this. You have to have some way to account for factors that are not understood. The idea is to find a way to bring the results of the model into line with the real world’s data. It is never supposed to be used to twist the data to create an artificial world, which is what this nightmare is trying to do.
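To make that distinction concrete, a toy sketch (entirely my own, not from any CRU file): tuning a free parameter of the model against observations is ordinary calibration; hard-coding a correction onto the data so it matches the model is the abuse described above.

import numpy as np

rng = np.random.default_rng(1)
obs_years = np.arange(1900, 2000)
obs = 0.006 * (obs_years - 1900) + rng.normal(0, 0.1, obs_years.size)

def model(years, sensitivity):
    # A deliberately crude one-parameter "model"
    return sensitivity * (years - 1900)

# Legitimate: tune the model parameter so the model best fits the observations
best_sensitivity = np.polyfit(obs_years - 1900, obs, 1)[0]

# The abuse: leave the model alone and "correct" the observations instead
fudge = model(obs_years, 0.002) - obs   # whatever offset forces agreement
obs_adjusted = obs + fudge              # the data now "confirms" the model exactly

In the first case the data constrains the model; in the second the model has quietly replaced the data.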
Ron de Haan (12:15:08) :
Great link Ron.
You’ve got to love Singer.
“The Climategate disclosures over the past few days, consisting of some thousands of emails between a small group of British and US climate scientists, suggest that global warming may be man-made after all – created by a small group of zealous scientists!”
Averaging day and night temperatures has been reported as enhancing perceived warming in England as daytime alone shows less, or no, warming.
This was highlighted this summer, which was dull and miserable. Our esteemed Met Office (see the list of usual warmist suspects) put out some spin that it was warmer than usual when they included the night.
As trees grow in the sunlight while eating CO2, perhaps the tree ring proxies reflect only daytime conditions. This would make them different from the “real” temperatures if these included the night. This might hide any decline in daytime growth of trees if spliced on after 1960.
yep – I noted the past tense without surprise. Maybe in academia you could do that in code, but not as a professional in the private sector. Not for long anyway.
Robinson (11:17:25) :
I’ve done some programming myself. I used to put in swear words in the code all the time.
In fact, I would even use the F word and variations thereof for variable and object names, LOL.
That isn’t considered very professional in my business, to be honest.
M.A.DeLuca (10:42:18) :
“Hysteria, I’m not so sure. O’Reilly has, I believe, said that Global Warming is real and something needs to be done about it. So did McCain during his bid for the Presidency. That indicates a fraction of the conservative base had been convinced this was a real issue that needed to be addressed.”
I saw a comment somewhere (here?) a couple of days ago that McCain has, in the past two months, backed off from his support of CAGW.
Sorry,
Statistics sans Frontières (10:24:11)
juan (10:35:09)
‘I’d imagine that was due to an overflow.’
This type of incident is clearly signaled with a different error message, and the programmer would know; at least I hope he knows what he is doing.
A very, very negative number is still a number and not an overflow!
Ray (12:20:28) :
“Only in their world are tree-thermometers better than real thermometers. Amazing!”
—
it is actually much worse than that.
Certain special kinds of tree-thermometers that show the desired signal (identified b/c they show the desired signal) are better than actual thermometers. In the 1960s these special tree-thermos stop showing the desired signal and real thermometers become better.
There is no exaggeration in the above statement. No need for it. It is absolutely unbelievable to me that this crap was published.
Climategate: Alarmism Is Underpinned by Fraud
November 25, 2009 – by Ian Plimer
http://pajamasmedia.com/blog/climategate-alarmism-is-underpinned-by-fraud-pjm-exclusive/
It is as if climate research is a small field that someone decided to exploit for political purposes. But instead of doing things professionally, they just made up data and fudged methods. The fact that it was fudged together on a shoestring by some researcher up all night is all the more to the cause’s benefit. If they’d decided to do things rigorously and professionally, there would be no results to speak of; they’d have to say, “come back in 10 years and see if we have anything we can reliably speak of then.”
But these emails were gotten illegally, and CA and the others are pseudo- and non-scientific blogs, and Mandia was right.
Thus saith an AGW proponent in a comment on my blog.
LOL
People are already talking about the legal implications of all this.
Joanne Nova makes a salient point when she says: “Australia is in the extraordinary position of passing legislation that is known to be based on fraudulent science”.
Canada Free Press says ‘Greens to be held to account’.
I remain unconvinced that this story will ever see the light of day in the MSM. The Canadian and US governments are well on track to introduce a 20% decrease in CO2 by 2020, and I think Copenhagen will move us dangerously close to a global agreement in principle. The science no longer matters. The money, research and legislation already dedicated to this will pass by sheer inertia. We are too late to stop it. I just talked to my local MP here in Canada and he was so politically evasive it’s not even funny, and he is a die-hard conservative. I got the real sense that the fix is in and nothing can stop it. Too little, too late. Only one thing remains. Mass revolt! By mid January we will see people in the streets, perhaps even dying over this issue! The Copenhagen agreement is the most draconian shift of power and sovereignty I have ever read. Folks, you are on the edge of losing all your rights!
A new Hockey Stick emerging (II):
http://www.blogpulse.com/trend?query1=michael+mann&label1=Mann+s+Trend+Curve&query2=phil+jones&label2=Jones+Trend+Curve&query3=climategate&label3=ClimateGate+Curve&days=60&x=0&y=0
I have done my fair share of HyperCard programming (loved that language). HyperTalk supports most standard programming structures such as “if-then” and “repeat”. The “if-then” structure is so flexible that it even allows “case”-structured code. The code “F*** This” seems to be missing its “If…” part. What comes after the “Case if…” in order to fill in the “…then F*** This” statement?
I can think of a few “If” fill ins.
“Case if field (found out) then (F*** This)” comes to mind.
And my apologies for very rusty Hypertalk. It has been FOREVER since I have used it. And to really show my age, I cut my computer teeth on a WANG that had the motherboard taking up the entire basement wing of the old VA hospital in Portland. Those were the days. The old WANG terminals wouldn’t let you use swear words. If you did the programmer had put in subroutines that gave you a lecture on using foul language at work.
I suggest you try working for a defense contractor on DOD, DHS coding projects then. You obviously need to get out of the house more often.
This is no trap. I am sympathize with Harry completely. Been there, done that. This is nothing new…
oops .. meant to say: “I can sympathize with Harry”
I may not understand any of the arguments, but never have I been more convinced about something than I am here. This is shocking.
A must see. http://www.youtube.com/watch?v=cTGLpqFGyYM
Has anyone noticed this reasoning regarding FOIA in the file FOIA\jones-foiathoughts.doc
“Options appear to be:
1. Send them the data
2. Send them a subset removing station data from some of the countries who made us pay in the normals papers of Hulme et al. (1990s) and also any number that David can remember. This should also omit some other countries like (Australia, NZ, Canada, Antarctica). Also could extract some of the sources that Anders added in (31-38 source codes in J&M 2003). Also should remove many of the early stations that we coded up in the 1980s.
3. Send them the raw data as is, by reconstructing it from GHCN. How could this be done? Replace all stations where the WMO ID agrees with what is in GHCN. This would be the raw data, but it would annoy them.”
The second option seems quite revealing….
Robinson (11:58:42) :
Ironically, the University of East Anglia has a Computer Science department
Apparently & according to the emails this dept. has more FOI requests than CRU!
Go figure!
I heard some time ago that there is no such thing as a global average temperature. The idea is meaningless and about as useful as calculating the average phone number out of the phone book. It looks like the folks at the CRU have produced numbers that are about as useful as the average phone number.
” 1Spectre4U (10:31:54) :
Now I am convinced it was an inside job.”
Imagine that you wanted to evade an FOI request, and all future ones, one would do a round of cleaning of programs, data files and emails.
BUT, you would want to make sure you didn’t throw anything important away.
So have a dedicated recycling bin. Put everything potentially dodgy in there, and go through it to make sure that you are not throwing away anything you might need.
Put programs in there; then one at a time upload them and read all the read me comments. Keep the ‘censored’ one.
Someone could raid the trash and find out what is being thrown away.
“I have worked as a professional programmer for more than 20 years, and I think that the language in these comments is strange, to say the least. I mean – I have often been swearing over poorly documented spaghetti code”
Having written hundreds of thousands of lines of code, much of it in, believe it or not, QBasic, I can assure you, these comments in the code don’t shock nor surprise me. I wrote much worse in my code. When it’s 3am, you’ve been staring at a screen for 40 hours straight, and you are trying to debug subroutines someone else wrote, sometimes in a different country, trust me, you’ll write some serious stuff in the comments!
I think that Phil Jones should get 10 to 20 tree rings in jail for this.
RE:
Mark (11:33:11) :
I’d like to know what the “decline” is. Is it temperature?
Hi Mark,
No. It is a decline in a series of values that are supposed to match temperatures. Actually, these numbers are functions of some tree ring width or density. They use these numbers as ‘proxies’ for old temperature, i.e., numbers that are supposedly the best they can come up with for temperatures, given that there were no thermometers then…
These proxies do a decent job at matching actual temperatures for the earlier part of the short period (couple of centuries) for which we have thermometer data. But somewhere in the middle of the century, there is a problem. The proxies and the temperature take very different paths, temperatures going well up, and proxies going down. They don’t pick up the last 50 years’ warming at all. That’s why many question their use as proxies for temperature going back 2000 years: how do we know if trees picked up warming signals then if they do not do so now?
This is called the ‘divergence problem’. Climate Audit has nice posts on this. I know that Climate Audit can be a tough read for the non-scientifically inclined, but it is the best site out there for these questions.
for example:
http://www.climateaudit.org/?p=570
NPR has actually published a somewhat balanced story, with quotes from Christy and Curry:
http://www.npr.org/templates/story/story.php?storyId=120846593
Here’s a link to the BBC story on the CRU “hack”:
http://news.bbc.co.uk/2/hi/science/nature/8370282.stm
This was originally posted at the start of last weekend. It went from
being a “Science and Environment” entry to “Technology” to both
and now it’s buried again under the “Technology” header.
So far, the text hasn’t changed from when they first posted it.
Stop by there via the link to beef up the internal “hit” counters to
keep even this pitiful bit of BBC coverage active.
See this take on the HARRY_READ_ME file:
http://vulgarmorality.wordpress.com/2009/11/25/scientsts-arent-science-and-science-isnt-a-method/
I found the “APPLY ARTIFICIAL CORRECTION” code that I couldn’t find in my downloaded copy of
FOIA\documents\osborn-tree6\briffa_sep98_d.pro
In my archive it’s in \documents\harris-tree\briffa_sep98_e
This file starts out with the comment:
; PLOTS ‘ALL’ REGION MXD timeseries from age banded and from hugershoff
; standardised datasets.
; Reads Harry’s regional timeseries and outputs the 1600-1992 portion
; with missing values set appropriately. Uses mxd, and just the
; “all band” timeseries
;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
Petition the Law Makers to Stop this Scam
Take it out of EPA’s Hand
They say we’ll lose a little Freedom, what’s the fuss
Freedom is Cheap, there’s plenty of it
It was only bought with the others blood
The Stick is broken Open your eyes
The MWP was grafted Upside Down
Real Temperatures can’t be found
Stop believing, The IPCC lies
Set my CO2 FREE, STOP insanity
I Will Not go down that California Road to Prosperity
The Question remains, the Question is bound
What side will You be on, LAW MAKER
As the Team goes Down
Will You fight for my Liberties
Or sell your Soul for a piece of gold?
So, make your case CO2, that “Evil Gas”, Rules the Climate if you can
Or declare CO2 a non pollutant
Petition the Law Makers to Stop this Scam
By FIRE ANT
So, Harry read_me, have you sold Your soul for a piece of gold?
Phil Jones is the director of the CRU. His own email correspondence (even before this leak) said that the CRU had lost original raw data files (even though the CRU was founded, in part, to document and create a record of historical temperatures). Comments in the source code indicate they lost their entire database of cloud data prior to 1995. Comments in the HARRY_READ_ME file show that they had no source code management system in place – taking 3 years to get the code working again after the original author disappeared (died perhaps?). The source code itself used to collect and process temperature data (I am a s/w engineer and have been staying up ’til midnight each night to do preliminary source code reviews) is awful and does not meet any semblance of modern software quality standards that would inspire confidence in accuracy or maintainability.
The internal organizational culture, per the emails, is a dysfunctional mix of paranoia and tribalism. (I have an MBA too and have had a lot of training on organizational behavior issues and dysfunctional management)
He must be held accountable for his own and his staff’s loss of data and poor quality tools. (Would you fly in an airplane if the plane’s aeronautical design model was written like this code?)
On the basis of his utter failure as a manager, he must be fired.
Even George Monbiot is now calling for his removal within days.
“”” Adam Sullivan (11:58:14) :
Seems to me that “calibration” of tree ring data is a bit of a joke.
For the modern periods where we have the highest quality records available we see the “recalibration” where the numbers get “fudged” (the scientific term used in the code is “fudge factor”) in order to make a “calibration” work that makes the pre 1900 data reflect cooler temperatures than if the data had been calibrated to post 1900 data. “””
A tree of any significant (climatically) size, is a relatively voluminous three dimensional object.
If you core drill such a tree to obtain a sequence of samples of tree rings, it is the sampling equivalent of sticking a drill into some arbitrarily chosen rock face at some altitude somewhere in the Rocky Mountains (well, maybe the Alps if you are European), studying the rock samples from the surface to the extent of the drill, and then claiming to know the age, or temperature, or some other historical information relating not only to the Rocky Mountains (or Alps) but also to a very large area surrounding those mountains.
Anyone who has seen a sizeable tree cross-section in a museum or a park display can clearly see that, even in a single section, the tree rings are far from uniform thickness, and one would expect every other property of the material to change around the tree, depending on whether it was on the sun side or shade side of the trunk. Throw in the additional change in cross-section with height, and you can see that a core sample is an extremely poor sample of a large three-dimensional object.
About the one thing we can say about that core sample, is that we are fairly confident of the age of each ring layer. Then certain compositional changes in each layer could be indicative of other parameters, such as C14 dating each ring sample, to get a picture of C14 production rates, since the real age of the sample is known.
But as a thermometer; try putting a wooden tongue depressor in your mouth, and then asking the doctor to look at it to see if you are running a fever.
It is a totally crazy idea to believe that tree rings, through thick and thin, can tell the temperature under which they were laid down, uncorrupted by any other variable such as available water, sunlight, soil nutrients and so on.
But for a real crazy idea; pray tell me how a sum of squared values (of a real physical variable presumably) can ever go negative; let alone continue to do so.
Has anybody ever actually read, on a real instrument with the suffix “meter” or “ometer”, any imaginary number or even a complex number? What physical processes yield observable values, ones that can actually be measured, that are imaginary or complex?
Engineers at least tend to take the position that if they can get an answer (mathematically), it must be the real answer; so they don’t worry much about existence theorems. No engineer could care whether an absolutely convergent infinite series converges; but mathematicians feel they have to prove that is true.
So nyet ! on your increasingly negative sum of squares parameter; it’s a gremlin in your spaghetti code.
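For the record, there is one mundane gremlin that can make a sum of squares print as a large negative number without any error message at all: accumulating it in a fixed-width signed integer that silently wraps around (the default INTEGER in most Fortran compilers is 32 bits). Whether that is what actually bit the CRU code I have no idea, but the wraparound effect itself is easy to demonstrate:

import numpy as np

# A single implausibly large value, squared in 32-bit integer arithmetic
val = np.array([50000], dtype=np.int32)
print(val * val)          # [-1794967296]: wrapped around, silently

# Accumulating squares of ordinary-looking values does the same thing once
# the running total crosses 2**31 - 1
vals = np.full(400, 2500, dtype=np.int32)
running = np.cumsum(vals.astype(np.int32) ** 2, dtype=np.int32)
print(running[-1] < 0)    # True: the running sum of squares has gone negative

So the mathematician is right that no genuine sum of squares can be negative; the point is that a sloppily typed accumulator is not computing a genuine sum of squares.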
If this is what Dr Phil had to do to get a graph that looks like the American temp trends we’ve really got to wonder which cherries have gone into their pies.
That’s another way of saying that if CRUtemp has such a tenuous grasp on reality, and it’s the same as the other two leading temp indices, then presumably they must be wrong too?
BOTO (09:17:11) :
Hi Anthony,
last dinner at Copenhagen!
I love it!
http://i47.tinypic.com/mrxszt.png
Great picture, but I have a question. Which one is Judas?
I must object, I really must. This artwork does NOT include Prime Minister Kevin Rudd of my country Australia (the one with the dodgy data – see above).
Now Kevvy is a Friend of the Chairman at Copenhagen and, as he is wont to do, a great strutter on the international stage, especially when the big fellas (like Obama) are there too. He believes unquestioningly in the IPCC and is about to wreck the country that 2 generations of my family have fought for with a Carbon Pollution Reduction Scheme law.
This cretin really deserves to be in the picture – so please fix!
PS He would do a beautiful JUDAS.
All these nasty interpretations of the CRU programmers’ notes are really just a simple problem of linguistic translation. You folks simply do not understand “AGW speak”. Let me enlighten:
“Artificially adjusted” means “teasing a signal out of random noise – I’ll know when I’ve found it by the smile on Dr. Jones face”.
“Real” means “something the general public can measure with their own thermometers”.
“Very, very negative” means “very, very positive” (it’s sarcasm, you dolts!)
“Dummy stations” means “smart stations – the ones on which to place the greatest statistical weight – they weren’t slipped into the base for no reason,” (sarcasm again).
“False references” is a slang term meaning “all data that ever came out of Australia”.
“Oh fuck this” means “This body of work is so pure and elegant that I have developed an unnatural attraction for it.”
Roger Knights (13:02:50) :
M.A.DeLuca (10:42:18) :
“Hysteria, I’m not so sure. O’Reilly has, I believe, said that Global Warming is real and something needs to be done about it. So did McCain during his bid for the Presidency. That indicates a fraction of the conservative base had been convinced this was a real issue that needed to be addressed.”
I saw a comment somewhere (here?) a couple of days ago that McCain has, in the past two months, backed off from his support of CAGW.
Your premise is wrong on two points:
1) Conservatives do not have a list of talking points they must believe in.
2) The term conservative means different things to different people.
Senator McCain is a Republican, not a conservative. I don’t know many people who would call McCain a conservative, except for a handful of issues.
O’Reilly has said it’s obvious pollution must be doing something to the planet — which shows he believes greenhouse gases are pollutants.
1) I don’t understand the technical issues referred to in the above notes and I suspect none of the people saying ‘this is proof of fraud’ understand them either.
2) I can’t believe anyone is surprised by coders swearing and getting frustrated with the code they are working on. It doesn’t mean anything in itself, except that everyone hates their job sometimes.
Conclusion: I am not able to draw a conclusion from this data.
Anyone who thinks they can draw a conclusion from it is just seeing what they want to see.
“Robinson (11:58:42) :
Ironically, the University of East Anglia has a Computer Science department:
……………………………………………………………
The fact is that they have the expertise on campus to engineer a decent piece of software.”
I tried to get a computing mathematician/statistician M.Sc. student for a summer or for a project while I was part of UCL. No way. The departments were not keen on letting the little birds out of the nest to do actual problems. It was too difficult to mark their assessment if they actually helped scientists at the coal-face.
There is a Statistics unit at UCL, and in most other large Universities, who will aid you in your stats. The main problem is the ‘road to Cork’ problem; too many scientists take their data to the stats people at the end of a study, rather than before they start. Slight changes in experimental design make all the difference to what statistics one can apply. I had a good experience with them, it is disconcerting when they tell you they are not interested in your data, but in what you want your data to be able to explain in a testable manner.
We shall have to go back to basics, I’m afraid, and teach the Ph.D. students both ethics and statistics right at the beginning, as it is apparent that the mentoring system has failed massively.
Can you imagine what it must be like for the Ph.D.s and post-docs at UEA now? They have screwed up their whole lives by association.
Just for the sake of fairness, please bear in mind that ‘hide the decline’ is referring to the ‘divergence problem’
– the divergence problem is the fact that the tree-ring data doesn’t track temperature after about 1960.
– the tree-ring growth has actually declined since then in many, but not all, NH tree-ring data sets.
– so the hockey team hides this by stopping the plots in 1960 or 1980, and also seems to mix in a bit of real temperature, in order to give their proxy plots a nice up-tick at the end, as if to say ‘the plot would have continued to go up had we not stopped here…’
– although the decline has been known about for 10 years or more (mentioned in the 1998 Nature article), the reasons for this decline are not known….
– so I do think it is under-hand to ‘hide’ this decline, even if it is ‘hidden in plain view’ so to speak….
– one could speculate that if the proxy data doesn’t track temps reliably in the current period, then it may not track temps in historical times either…..
Also, the programmer Harry is dealing with getting the HADCRU temp v3.0 going…not hockey-sticks…
Yeah, I know, I’m an a$$#0!e, but I’ll tell you what’s going to happen.
The poor schmuck who wrote these comments is going to be blamed for the whole fiasco…UNLESS he can PROVE that he passed these complaints along to Jones. Otherwise Jones will be SHOCKED, SHOCKED, I tell you, to find out that these things were going on and he was not told.
Whoever you are you’d better start looking for cover, because they are coming after you.
Write it in stone.
“One of the most damaging emails was sent by the head of the climatic research unit, Phil Jones. He wrote “I can’t see either of these papers being in the next IPCC report. Kevin and I will keep them out somehow – even if we have to redefine what the peer-review literature is!”
Censorship and manipulation.
Hey folks, we have another climategate. This one is down in New Zealand.
http://nzclimatescience.net/index.php?option=com_content&task=view&id=550&Itemid=1
“There have been strident claims that New Zealand is warming. The Inter-governmental Panel on Climate Change (IPCC), among other organisations and scientists, allege that, along with the rest of the world, we have been heating up for over 100 years. But now, a simple check of publicly-available information proves these claims wrong. In fact, New Zealand’s temperature has been remarkably stable for a century and a half. So what’s going on?” Researchers find records adjusted to represent ‘warming’ when raw data show temperatures have been stable.
The pdf file is unbelievable. It should also be worldwide news.
I wish we had “publicly available” information here in the USA. 🙁
Found on Icecap:
http://icecap.us/images/uploads/global_warming_nz_pdf.pdf
A comparison between long-term raw and adjusted temperatures from New Zealand’s NIWA.
Climategate Episode II – Upping The Down Under ?
The corollary to my 14:13:55 is that you will know they have his cojones in a vise when he starts waffling about how it wasn’t really that bad and people are “taking it out of context”.
Can’t blame him.
Probably been said elsewhere but:
“Here, the expected 1990-2003 period is MISSING – so the correlations aren’t so hot! Yet
the WMO codes and station names /locations are identical (or close). What the hell is
supposed to happen here? Oh yeah – there is no ‘supposed’, I can make it up. So I have :-),”
This is dynamite
Bring on the inquisition
; Certain boxes that appear to reconstruct well are “manually” removed because
; they are isolated and away from any trees
LoL.
:-))
I am a professional mathematician. My only experience with Fortran was a GE program built to compute the path of a sounding rocket (GEMASS).
I make no claim to be a programmer, aside from my elementary excursions into computing pi to 2000 digits, using an early version of Pascal, and some early experiments with the Mandelbrot set.
The code which has been published is laughably bad. No wonder the authors refused all requests for the publication of the code.
The problem is not in the emails. The problem is in the code.
There is no evidence that the code was ever reviewed, either by a competent programmer or by a peer review agency. This is not science; this is a political crusade masquerading as science.
When one writes code, it is for the purpose of uniform treatment of a collection of data, as one does with inventory code, accounts receivable code, or any code which processes and interprets raw data.
That is, of course, all that a computer can do: a staggering number of computations in order to make sense of otherwise unintelligible data.
Just think of the aerodynamic computations which are now routine: one builds a teraflop machine and tests all of the variations in air flow around an aircraft fuselage, obviating the necessity for model building and wind tunnel testing. The reason: making fiddling changes in the fuselage amounts to a trivial tweak of the code, rather than the skills of a model maker. Aircraft are now designed and built by CAD/CAM, because it is cheaper!
When one fudges the code in order to alter the raw data, this is called fraud. This is as bad as the Nobel Prize given out a century ago for curing cancer.
The programmers were not able to compute the distances between temperature sensing sites? Absurd. This is a trivial exercise in spherical trigonometry.
The inability to use the appropriate language to handle text versus numbers?
This is a very expensive joke, at our expense.
If you cannot include in your documentation the use of unique temporary file cache names, the explicit format of your data files, the precise algorithm used to process the data, and the exact parameters which are used to smooth your data, you have garbage.
Smoothing data is a routine exercise; it is used all the time in handling my favorite kind of data, which is the collection of recorded magnitudes of astronomical objects (the search for variable stars, invisible orbiting planets of other stars, and so on). All such data must be smoothed, because measurement is inherently subject to uncertainty (thank you, Gauss). The standard normal curve of error always applies.
The measurement of temperature has only been possible for three centuries or less. When Thomas Jefferson first began his recordings, that was a new thing in the United States. Maximum-minimum thermometers are even newer, and thermoelectric temperature sensors newer still.
All attempts to infer temperatures from before the first calibration of temperature on a numerical scale are just that: inferences. And unless one is prepared to defend and separate the effects of rainfall, cloud cover, human activity, animal activity, volcanic activity, to say nothing of other variables which are involved with plant and animal growth, we are left with huge gaps in what we know, as differentiated from what we infer.
There is literally no excuse for this ridiculous trash. All of the participants must be immediately prohibited from any further employment in any scientific endeavor or publication in any scientific journal.
Johnathan Dumas:
“These proxies do a decent job at matching actual temperatures for the earlier part of the short period (couple of centuries) for which we have thermometer data.”
There are some issues with this. Dendroclimatology has not progressed to the point where they can walk up to a tree, examine the external factors (treeline, soil, elevation, bark) and determine whether it will have a respectable chance of being a decent proxy prior to the coring. Nor is it typical to attempt a true calibration after the fact. (By sequestering some data, etc.)
So they tend to take far more samples than they end up including in the final analysis – because lots of the trees “Don’t appear to have any temperature signal.”
When you combine that with the divergence of the self-same trees that were previously considered decent proxies as time progresses, you are running across a completely fatal flaw. It is a very strong sign that you’re observing some correlation – but don’t have true causation.
They’re coring 100 trees (that all meet the “best criteria” for siting etc.) and picking the 10 with the best correspondence with local temperature. Then they sweep the re-evaluation of the exact same trees under the rug when they fail to continue correlating (see: Ababneh), as well as obstructing others who would like to re-core the identical trees.
They’re making the fallacy of assuming they have a valid proxy because it happens to line up sometimes.
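The selection effect is easy to demonstrate with pure noise. A toy sketch (my own construction, nothing from the CRU archive): generate random “proxies” containing no temperature information at all, keep only the ones that happen to correlate with temperature over the calibration window, and the average of the keepers will track temperature during calibration and wander off everywhere else.

import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1850, 2000)
temp = 0.005 * (years - 1850) + rng.normal(0, 0.1, years.size)  # toy "temperature"

calib = years >= 1950                              # calibration window
proxies = rng.normal(0, 1, (100, years.size))      # 100 proxies of pure noise

# Keep the 10 noise series that best "correlate with local temperature"
corrs = [np.corrcoef(p[calib], temp[calib])[0, 1] for p in proxies]
keep = np.argsort(corrs)[-10:]
reconstruction = proxies[keep].mean(axis=0)

# Inside the calibration window the screened average looks temperature-like;
# outside it, it is just noise (and will "diverge" from any new data).
print(np.corrcoef(reconstruction[calib], temp[calib])[0, 1])    # clearly positive
print(np.corrcoef(reconstruction[~calib], temp[~calib])[0, 1])  # near zero

That is the screening fallacy in a dozen lines: the “signal” was manufactured by the selection step, not measured by the trees.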
I found the line of code in the program that calculates Temperature
Tw = 58 + 6 * Log(CO2/ 280) / Log(2)
REPLY: It appears they are assuming a baseline CO2 value of 280 ppm.
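Taking the quoted line at face value (and assuming “Log” is the natural log, so Log(x)/Log(2) is log base 2, with CO2 in ppm and the output presumably in degrees F), it is just a logarithmic curve pinned to 58 at 280 ppm and adding 6 per doubling:

import math

def tw(co2_ppm):
    # The quoted expression, exactly as written: Tw = 58 + 6 * Log(CO2/280) / Log(2)
    return 58 + 6 * math.log(co2_ppm / 280.0) / math.log(2.0)

for c in (280, 380, 560):
    print(c, round(tw(c), 2))   # 280 -> 58.0, 380 -> 60.64, 560 -> 64.0

If that line really is used the way the commenter suggests, the “temperature” it produces is a hard-wired function of CO2 alone, which rather assumes the answer to the question it is presumably meant to investigate.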
Pieter F (09:16:40) :
“It just gets better and better (worse for them, that is). The question remains: is this enough to overcome the momentum acquired by the AGW ”
Not at all, it was never about the science, so why should it?
“Michael Mann’s articles have been published in well respected peer reviewed scientific journals”
English translation . . me and my buds do the peer-review-each-other thing and since each back is well scratched, we can have our papers published and keep out the papers of non members of our mutual admiration and back scratching club.
Or something like that . . . and we do the same thing when we write the IPCC reports
“Averaging day and night temperatures has been reported as enhancing perceived warming in England as daytime alone shows less, or no, warming.” – Wasp
This rings bells – have read something similar about day and night temperatures before, probably at CA. Was it that increased night temperatures are an urban heat island effect? Arg, anyone remember this one?
(The Met Office was announcing today that 2009 is set to be one of the hottest years yet again – the BBC are happy to jump in as if being “one of the 10 hottest years” is unusual – when you’re on top of a big hill, every step is a high one!)