The Smoking Code, part 2

Climategate Code Analysis Part 2

There are four common concerns that have been raised in my previous post that I would like to officially address concerning the CRU’s source code.

If you only get one thing from this post, please get this. I am only making a statement about the research methods of the CRU and trying to show proof that they had the means and intent to falsify data. And, until the CRU’s research results can be verified by a 3rd party, they cannot be trusted.

Here are the four most frequent concerns dealing with the CRU’s source code:

  1. The source code that actually printed the graph was commented out and, therefore, is not valid proof.
  2. No proof exists that shows this code was used in publishing results.
  3. Interpolation is a normal part of dealing with large data sets; this is no different.
  4. You need the raw climate data to prove that foul play occurred.

If anyone can think of something I missed, please let me know.

The source code that actually printed the graph was commented out and, therefore, is not valid proof.

Had I done a better job with my source analysis, I would have found that a later revision of the briffa_sep98_d.pro source file (linked in my previous post), contained in a different working tree, shows the fudge-factor array playing a direct role in the (uncommented) plotting of the data.

Snippet from: harris-tree/briffa_sep98_e.pro (see the end of the post for the full source listing)

;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
  ;
  ; Now plot them
  ;
  filter_cru,20,tsin=densall,tslow=tslow,/nan
  cpl_barts,x,densall,title='Age-banded MXD from all sites',$
    xrange=[1399.5,1994.5],xtitle='Year',/xstyle,$
    zeroline=tslow,yrange=[-7,3]
  oplot,x,tslow,thick=3
  oplot,!x.crange,[0.,0.],linestyle=1
  ;
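
For anyone who doesn’t read IDL, here is a rough Python/numpy translation of the two “correction” lines above. This is my sketch, not CRU’s toolchain; the flat densall series is invented purely to show the mechanics.

```python
import numpy as np

# Rough numpy translation of the two IDL "correction" lines -- a sketch,
# not CRU's actual pipeline. The flat densall series is invented purely
# to show the mechanics; the real values come from the MXD data files.
x = np.arange(1400, 1995)                      # years, as in the listing
densall = np.zeros(x.size)                     # stand-in for the MXD series

yrloc = np.concatenate(([1400.], np.arange(19) * 5. + 1904.))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

# IDL: yearlyadj=interpol(valadj,yrloc,x)
yearlyadj = np.interp(x, yrloc, valadj)        # adjustment for every year

# IDL: densall=densall+yearlyadj
densall = densall + yearlyadj                  # the "correction" is additive
```

The point of the sketch: interpol spreads the 20 hand-picked adjustment values across every year, and the result is added straight onto the data series. Nothing before the 1920s is touched at all, the mid-century is nudged down slightly, and everything from 1974 on is lifted by the full 1.95.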

Now, we can finally put this concern to rest.

Interpolation is a normal part of dealing with large data sets; this is no different.

This is partially true: the issue isn’t that the CRU researchers used interpolation. The issue is the weight of the valadj array with respect to the raw data. valadj simply introduces too large an influence on the original data to do anything productive with it.
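
For reference, interpol(valadj, yrloc, x) with three arguments performs plain piecewise-linear interpolation: each year between two knot years gets a straight-line blend of the two adjustment values. A minimal Python sketch (the helper name lerp is mine):

```python
def lerp(t, x0, y0, x1, y1):
    """Value at t on the straight line through (x0, y0) and (x1, y1)."""
    return y0 + (y1 - y0) * (t - x0) / (x1 - x0)

# Two adjacent knots from the listing (values include the 0.75 scaling):
# 1964 -> 1.7 * 0.75 = 1.275 and 1969 -> 2.5 * 0.75 = 1.875.
# A year between the knots gets a proportional blend of the two values.
adj_1966 = lerp(1966, 1964, 1.7 * 0.75, 1969, 2.5 * 0.75)
print(round(adj_1966, 3))
```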

Here is the graph I plotted of the valadj array. When we’re talking about trying to interpret temperature data that grows on the scale of tenths of a degree over a period of time, “fudging” a value by as much as 1.95 (the 2.6 peak scaled by 0.75) is going to have a significant impact on the data set.
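
If you want to check those magnitudes yourself, this short Python/numpy sketch (my substitution for the original IDL) rebuilds the adjustment array exactly as the listing defines it, including the 0.75 scaling:

```python
import numpy as np

# Rebuild valadj exactly as briffa_sep98_e.pro defines it, including the
# 0.75 scaling applied on the same line as the "fudge factor" comment.
yrloc = np.concatenate(([1400.], np.arange(19) * 5. + 1904.))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

peak = valadj.max()                  # the largest adjustment, 2.6 * 0.75
peak_years = yrloc[valadj == peak]   # the years that get the full boost
print(round(float(peak), 2), peak_years.astype(int).tolist())
```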

No proof exists that shows this code was used in publishing results.

Correct! That’s why I have always taken the following stand: enough proof exists that the CRU had both the means and the intent to falsify data. This means that none of their research results can be trusted until they are verified. Period.

The fact that the “fudge-factor” source code exists in the first place is reason enough for alarm. Hopefully, they didn’t use fudged results in the CRU research results, but the truth is, we just don’t know.

You need the raw climate data to prove that foul play occurred.

This assumes the raw data are valid, which I maintain they probably are. Several people question the validity of the climate data-gathering methods used by the different climate research institutions, but I am not enough of a climate expert to have an opinion one way or the other. Furthermore, it simply doesn’t matter whether the raw climate data are correct or not for demonstrating the extreme bias the valadj array forces on the raw data.

So, the raw data could be actual temperature data or corporate sales figures; the result is the same: a severe manipulation of the data.
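
That claim is easy to demonstrate. In this Python/numpy sketch (made-up noise standing in for the “sales figures”), the same uplift appears no matter what series you feed in:

```python
import numpy as np

# Feed pure noise -- a stand-in for "corporate sales figures" -- through
# the same additive "correction". The late-20th-century uplift appears
# anyway, because it never depended on the input data.
rng = np.random.default_rng(42)

yrloc = np.concatenate(([1400.], np.arange(19) * 5. + 1904.))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

x = np.arange(1400, 1995)
series = rng.normal(0.0, 1.0, x.size)          # any data at all
adjusted = series + np.interp(x, yrloc, valadj)

# The adjustment alone accounts for a fixed lift from 1974 onwards,
# whatever the underlying series was.
uplift = (adjusted - series)[x >= 1974]
print(round(float(uplift.mean()), 2))
```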

Full Source Listing

As promised, here is the entire source listing for: harris-tree/briffa_sep98_e.pro

   1. ;
   2. ; PLOTS 'ALL' REGION MXD timeseries from age banded and from hugershoff
   3. ; standardised datasets.
   4. ; Reads Harry's regional timeseries and outputs the 1600-1992 portion
   5. ; with missing values set appropriately.  Uses mxd, and just the
   6. ; "all band" timeseries
   7. ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
   8. ;
   9. yrloc=[1400,findgen(19)*5.+1904]
  10. valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
  11.   2.6,2.6,2.6]*0.75         ; fudge factor
  12. if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
  13. ;
  14. loadct,39
  15. def_1color,20,color='red'
  16. plot,[0,1]
  17. multi_plot,nrow=4,layout='large'
  18. if !d.name eq 'X' then begin
  19.   window, ysize=800
  20.   !p.font=-1
  21. endif else begin
  22.   !p.font=0
  23.   device,/helvetica,/bold,font_size=18
  24. endelse
  25. ;
  26. ; Get regional tree lists and rbar
  27. ;
  28. restore,filename='reglists.idlsave'
  29. harryfn=['nwcan','wnam','cecan','nweur','sweur','nsib','csib','tib',$
  30.   'esib','allsites']
  31. ;
  32. rawdat=fltarr(4,2000)
  33. for i = nreg-1 , nreg-1 do begin
  34.   fn='mxd.'+harryfn(i)+'.pa.mean.dat'
  35.   print,fn
  36.   openr,1,fn
  37.   readf,1,rawdat
  38.   close,1
  39.   ;
  40.   densadj=reform(rawdat(2:3,*))
  41.   ml=where(densadj eq -99.999,nmiss)
  42.   densadj(ml)=!values.f_nan
  43.   ;
  44.   x=reform(rawdat(0,*))
  45.   kl=where((x ge 1400) and (x le 1992))
  46.   x=x(kl)
  47.   densall=densadj(1,kl)     ; all bands
  48.   densadj=densadj(0,kl)     ; 2-6 bands
  49.   ;
  50.   ; Now normalise w.r.t. 1881-1960
  51.   ;
  52.   mknormal,densadj,x,refperiod=[1881,1960],refmean=refmean,refsd=refsd
  53.   mknormal,densall,x,refperiod=[1881,1960],refmean=refmean,refsd=refsd
  54. ;
  55. ; APPLY ARTIFICIAL CORRECTION
  56. ;
  57. yearlyadj=interpol(valadj,yrloc,x)
  58. densall=densall+yearlyadj
  59.   ;
  60.   ; Now plot them
  61.   ;
  62.   filter_cru,20,tsin=densall,tslow=tslow,/nan
  63.   cpl_barts,x,densall,title='Age-banded MXD from all sites',$
  64.     xrange=[1399.5,1994.5],xtitle='Year',/xstyle,$
  65.     zeroline=tslow,yrange=[-7,3]
  66.   oplot,x,tslow,thick=3
  67.   oplot,!x.crange,[0.,0.],linestyle=1
  68.   ;
  69. endfor
  70. ;
  71. ; Restore the Hugershoff NHD1 (see Nature paper 2)
  72. ;
  73. xband=x
  74. restore,filename='../tree5/densadj_MEAN.idlsave'
  75. ; gets: x,densadj,n,neff
  76. ;
  77. ; Extract the post 1600 part
  78. ;
  79. kl=where(x ge 1400)
  80. x=x(kl)
  81. densadj=densadj(kl)
  82. ;
  83. ; APPLY ARTIFICIAL CORRECTION
  84. ;
  85. yearlyadj=interpol(valadj,yrloc,x)
  86. densadj=densadj+yearlyadj
  87. ;
  88. ; Now plot it too
  89. ;
  90. filter_cru,20,tsin=densadj,tslow=tshug,/nan
  91. cpl_barts,x,densadj,title='Hugershoff-standardised MXD from all sites',$
  92.   xrange=[1399.5,1994.5],xtitle='Year',/xstyle,$
  93.   zeroline=tshug,yrange=[-7,3],bar_color=20
  94. oplot,x,tshug,thick=3,color=20
  95. oplot,!x.crange,[0.,0.],linestyle=1
  96. ;
  97. ; Now overplot their bidecadal components
  98. ;
  99. plot,xband,tslow,$
 100.   xrange=[1399.5,1994.5],xtitle='Year',/xstyle,$
 101.   yrange=[-6,2],thick=3,title='Low-pass (20-yr) filtered comparison'
 102. oplot,x,tshug,thick=3,color=20
 103. oplot,!x.crange,[0.,0.],linestyle=1
 104. ;
 105. ; Now overplot their 50-yr components
 106. ;
 107. filter_cru,50,tsin=densadj,tslow=tshug,/nan
 108. filter_cru,50,tsin=densall,tslow=tslow,/nan
 109. plot,xband,tslow,$
 110.   xrange=[1399.5,1994.5],xtitle='Year',/xstyle,$
 111.   yrange=[-6,2],thick=3,title='Low-pass (50-yr) filtered comparison'
 112. oplot,x,tshug,thick=3,color=20
 113. oplot,!x.crange,[0.,0.],linestyle=1
 114. ;
 115. ; Now compute the full, high and low pass correlations between the two
 116. ; series
 117. ;
 118. perst=1400.
 119. peren=1992.
 120. ;
 121. openw,1,'corr_age2hug.out'
 122. thalf=[10.,30.,50.,100.]
 123. ntry=n_elements(thalf)
 124. printf,1,'Correlations between timeseries'
 125. printf,1,'Age-banded vs. Hugershoff-standardised'
 126. printf,1,'     Region    Full   <10   >10   >30   >50  >100'
 127. ;
 128. kla=where((xband ge perst) and (xband le peren))
 129. klh=where((x ge perst) and (x le peren))
 130. ts1=densadj(klh)
 131. ts2=densall(kla)
 132. ;
 133. r1=correlate(ts1,ts2)
 134. rall=fltarr(ntry)
 135. for i = 0 , ntry-1 do begin
 136.   filter_cru,thalf(i),tsin=ts1,tslow=tslow1,tshigh=tshi1,/nan
 137.   filter_cru,thalf(i),tsin=ts2,tslow=tslow2,tshigh=tshi2,/nan
 138.   if i eq 0 then r2=correlate(tshi1,tshi2)
 139.   rall(i)=correlate(tslow1,tslow2)
 140. endfor
 141. ;
 142. printf,1,'ALL SITES',r1,r2,rall,$
 143.   format='(A11,2X,6F6.2)'
 144. ;
 145. printf,1,' '
 146. printf,1,'Correlations carried out over the period ',perst,peren
 147. ;
 148. close,1
 149. ;
 150. end

210 thoughts on “The Smoking Code, part 2”

  1. MotleyCRU saws off the bottom of the data deck, jacks up decline.
    FudgeBoostFactor2.5 sends hockeystick soaring over the Tower of Babel.
    Only the Insider knows for sure who did what.

  2. The “fudge factor” part is just to scale the ‘artificial’ array. I’m pretty sure, looking at the code, that it’s just to help line things up. That isn’t a pass — the programmer is a total amateur on the comments alone, but that part doesn’t mean they were fudging the big picture, just that they wanted to use a single variable (err, constant in this case) to help line things up.

    The bigger picture is that the inevitable ‘blade part’ of every graph these guys produce is based on a different cocktail of data than the stick part. They go on and on about how this is ‘ok’, and how this is ‘known’, but it isn’t up to climatologists to decide what is ok with math and statistics. It is obviously trying to squeeze desired results from data.

    The world seems happy to ignore this — whatever, I guess they are tired of oil or voting or something. Regardless, this is shit science. Actually it isn’t even that, it is tainted before it even gets past the math, and probably before it even gets past writing down the data. The ‘science’ never had a chance.

    Ahhh, it’s depressing, I’m just going to go to bed (umm, not with Shell though).

  3. This just in:
    EPA Set to Declare Carbon Dioxide a Public Danger

    http://www.foxnews.com/politics/2009/12/06/epa-set-delare-carbon-dioxide-public-danger/

    Only benefit of Cap-and-trade is that someone makes money on the deal.

    excerpt:
    Such an “endangerment” decision is necessary for the EPA to move ahead early next year with new emission standards for cars. EPA Administrator Lisa Jackson has said it could also mean large emitters such as power stations, cement kilns, crude-oil refineries and chemical plants would have to curb their greenhouse gas output.

    Who is John Galt?

  4. Finally, news about the solar minimum.

    “The sunspot count continues to be at the lowest level in a century. Looking back in the past, there appears to be a strong link between an inactive sun and a cooler earth. The sun is the source of incoming heat on this planet. It has cycles that we haven’t been around long enough to study and understand clearly.”

    “Climategate” Exposes Global Warming Hoax

    http://www.courierpress.com/news/2009/dec/06/quotclimategatequot-exposes-global-warming-hoax/

  5. There are so many concerns related to this that the only way to understand what has been going on at CRU is complete open access to all the data and codes. I wouldn’t be surprised to find that there was (and probably still is) so little control over the codes that all versions previously used no longer exist and would have to be rewritten from paper records (if those still exist). Without a QA system in place that would have ensured that all code and data were written, documented, maintained, used and archived properly, I suspect it will be impossible to untangle the CRU mess.

  6. Are the EPA mad? How on earth can CO2 be a Public Danger?

    OK – got it! We breathe out and the CO2 content is around 40,000 parts per million. So, as we are panicking about Mauna Loa CO2 levels climbing (at present 387.75 parts per million), why not issue draconian laws when the magic figure of 40,000 parts per million can be quoted?

    Solution is we all [6+ Billion of us] stop breathing! Problem solved! Danger over!

    EPA – Environmental Panic Attackers

  7. From another thread:

    Prosecutor: And where did you find the defendant, Constable Platypus?
    Bobby: Be’ind the ware’ouse, standin’ in the shadders by the back gate.
    Prosecutor: Was the gate locked when you arrived?
    Bobby: Hit was. There was a chain wif a padlock through two o’ the links.
    Prosecutor: And what did the defendant have in his hands?
    Bobby: ‘E ‘ad that pair o’ bolt cutters, Hex’ibit B. (points at evidence table)
    Prosecutor: And how did the defendant explain his presence at the back gate of the warehouse, standing in the dark, with these bolt cutters in his possession?
    Bobby: [looking at his notebook] ‘E said, “I were just practisin’, gov’nor. These ‘ere bolt cutters ‘ave been commented out.”
    (laughter)
    Prosecutor: And had they been commented out?
    Bobby: Someone ‘ad put a semicologne on ‘em wif a piece o’ chalk, yerse.
    Prosecutor: How long would it take for the defendant to remove the semicolon and cut the chain?
    Barrister: Objection, M’lud! PC Platypus is not an expert on chalk.
    Judge: Over-ruled. You may answer the question.
    Bobby: Habout ‘arf a second.
    (laughter)
    Prosecutor: Thank you, Constable Platypus.

    CRU has been caught with weapons of math destruction in their code. That is sufficient to establish intent, even without showing they were actually used, any more than the defendant’s bolt cutters. Possession of such devices is sufficient to establish guilt.

  8. I am anxious to see more analysis of this code…the problems with it appear to be enormous…a full accounting of those problems is the only way to put to rest any talk of CRU as being innocent of the charge of criminally bad science.

  9. Kiss off the next three years. The current administration isn’t interested in facts. They’re on a mission to ram an agenda down America’s throat while they can. We need to look at how we can recover after 2012 (if we’re lucky)

  10. NH 1910 temp = 1970 temp in 1974

    http://www.denisdutton.com/newsweek_coolingworld.pdf

    According to http://scienceblogs.com/deltoid/2009/12/quote_mining_code.php

    The plot without the correction comes very close to 1910–1970.

    AJStrata’s work shows 1910 very close to 1970.

    Here is the NH Hadcrut plot from BOM

    http://reg.bom.gov.au/cgi-bin/climate/change/global/timeseries.cgi?graph=global_t&region=nh&season=0112&ave_yr=0

    The correction IMO is to make the proxies match the “adjusted temperature”.

    I would just about bet my left nut that if we ran the “virus” backwards through the Hadcrut we would be very close to the actual raw temperatures, and that would correlate well with the individual long rural records.

  11. Phillip Bratby (23:23:43)

    Yeah, but given the apparent mess that is the CRU’s electronic data, even with complete access to all the files, it doesn’t look good. e.g. what’s in
    crucode/idl/pro/README_GRIDDING.txt

    “Then use quick_interp_tdm2.pro
    on the secondary variable, with synth_prefix supplied, to create the new grids.
    Bear in mind that there is no working synthetic method for cloud, because Mark New
    lost the coefficients file and never found it again (despite searching on tape
    archives at UEA)
    and never recreated it. This hasn’t mattered too much, because
    the synthetic cloud grids had not been discarded for 1901-95, and after 1995
    sunshine data is used instead of cloud data anyway.”

    Wonder what else is on or gone from the archives…

  12. The Ad Hominem Attack Heard Round the World.

    I’m thinking about re-uploading this video with this new title, unless someone else beats me to it.

  13. I watched that when it aired. Have to say it was train-wreck journalism at its best (or worst!).

    Watson is an arse but I found it quite funny that he got worked up so easily and the other guy fed off this :)

    mailman

  14. The smartest kid in the class (CRU) just got caught cheating. Why are his frat brothers (GISS, NOAA) claiming that everything is ok because they all put down the same answers on the test? If CRU takes the real data and adds fudge.dat, then aren’t they all just selling different flavors of fudge?

  15. When investigating a conspiracy to commit a crime, motive, method, and opportunity are the main areas to focus on.

    This excellent post provides evidence that a method was available to the people involved in the Climategate cabal, and the emails show that they had the motivation and the opportunity to do it.

    Now all we need is the ‘body’ – the raw data – to show a crime was actually committed. The Climategate emails show that Jones et al were colluding to stop people finding it, by refusing FOIA requests and even threatening to destroy the data should the worst come to the worst.

    Once the raw data is found, the IPCC charts can be reconstructed and compared to the ones in the IPCC documents. If there is a discrepancy, then a solid case can be built for them to answer.

    However, from the context of what we have seen, it is clear that the UEA CRU/GISS/IPCC were not conducting science – they were just trying to find evidence to support their belief in CAGW and doing everything in their power to prevent contrary evidence from other scientists being published.

    Science is about facts and the truth and the Climategate evidence shows the ‘team’ had little respect for either.

  16. “Emails that rocked climate change campaign leaked from Siberian ‘closed city’ university built by KGB

    An investigation by The Mail on Sunday has discovered that the explosive hacked emails from the University of East Anglia were leaked via a small web server in the formerly closed city of Tomsk in Siberia.

    Computer hackers in Tomsk have been used in the past by the Russian secret service (FSB) to shut websites which promote views disliked by Moscow.

    In 2002, Tomsk students were said to have launched a ‘denial of service’ attack at the Kavkaz-Tsentr portal, a site whose reports about Chechnya angered Russian officials.

    The FSB office in Tomsk put out a special Press release saying that what the students had done was a legitimate ‘expression of their position as citizens, one worthy of respect’.”

    http://www.dailymail.co.uk/news/article-1233562/Emails-rocked-climate-change-campaign-leaked-Siberian-closed-city-university-built-KGB.html

  17. The BBC’s Environment Correspondent has told commentators on his blog to stop posting comments about ClimateGate! Apparently that’s because the alleged falsification of research is boring and no-one wants to read about it – so stop nit-picking. Seriously:

    “I’ve another request, too – if you can restrain yourselves from plastering this thread with stuff about ClimateGate, please do.

    There are more than 700 comments on the previous thread, the vast majority related to it. I know from e-mails that some readers find endless picking over of climate science repetitive and boring – and when they do, they don’t read through the comments. Fresh, pertinent and interesting are my suggestions.”

    BBC – Richard Black’s Earth Watch

    http://www.bbc.co.uk/blogs/thereporters/richardblack/2009/12/copenhagen_countdown_3_days.html

  18. jorgekafkazar, please may I have your permission to use this bit for my quote of next week?

    “CRU has been caught with weapons of math destruction in their code.”

    Thanks in advance.

  19. http://www2.macleans.ca/category/opinion/mark-steyn-opinion/

    Back in the summer, I wrote in a column south of the border:

    …“If you’re 29, there has been no global warming for your entire adult life. If you’re graduating high school, there has been no global warming since you entered first grade. There has been no global warming this century. None. Admittedly the 21st century is only one century out of the many centuries of planetary existence, but it happens to be the one you’re stuck living in.”

    In response to that, the shrieking pansies of the eco-left had a fit. The general tenor of my mail was summed up by one correspondent: “How can you live with your lies, dumbf–k?” George Soros’s stenographers at Media Matters confidently pronounced it a “false claim.” Well, take it up with Phil Jones. He agrees with me. The only difference is he won’t say so in public.

    Which is a bit odd, don’t you think?…

  20. Brian Johnson uk (23:27:41) :
    Are the EPA mad? How on earth can CO2 be a Public Danger.

    The EPA is composed of bureaucrats, with a very few genuine scientists tossed in to give the appearance of a science-oriented gummint agency.

    Some time ago, I was monitoring a methane burn at a landfill and had an EPA inspector claim that the burner was inoperative because he couldn’t see the flames. When I told him that pure methane burned at such a high temperature the flames were only visible in the infrared spectrum, he harrumphed and thrust his hand into the plume to *prove* I was lying.

    I drove him to the hospital to have his second-degree burns treated.

  21. Good analysis so far Robert. I agree with you on the Modus Operandi side of things – motive, means and opportunity.

    The next thing is to try to run the program with and without the valadj ‘adjustments’ using made up test data. Then see what the difference is. The data values can be the same for all the dates but different for each file because you’re testing the code not trying to replicate a published chart.

    After that maybe the next possible step is to use real data and see if the produced chart corresponds to anything published. The real data should be obtainable via FOI. One for Steve M perhaps!

    harryfn (harry filename?) is an array of data filenames found here:


    29. harryfn=['nwcan','wnam','cecan','nweur','sweur','nsib','csib','tib',$
    30. 'esib','allsites']

    My guess is they correspond as follows:

    nwcan = North West Canada
    wnam = Western North America
    cecan = Central Canada
    nweur = North West Europe
    sweur = South West Europe
    cs and es could be ISO country codes but ns and ti are not.

  22. CRU has been caught with weapons of math destruction in their code.

    This is a quotable quote! ;)

    Larry

  23. This is not the first post about code for fudging.
    Indeed, if it is claimed it was never used, then why was it written in the first place?
    I think we should continue demanding the code and raw data under the FOIA.
    That’s the only way this is going to get investigated properly.

  24. Wow, I really think you guys missed the boat with all that code up there. I’ve come up with a simple equation that even a 12 year old can understand that explains EVERYTHING:

    AGW = (OT + (DFU/$)RG + AGF) / CAPBSIF

    For those who need help sorting this out, this shows that AGW is the sum of:

    OT – Original Temperature Findings
    DFU/$ – Tells us how much money Mann, Jones, et al, get per each Degree they can Fudge Upward (and subsequently this leads to RG, Research Grants).
    AGF – Al Gore’s Carbon Footprint

    And this is very important, so that we get the numbers right – after we sum all of that up, you MUST divide by CAPBSIF, that is, the # of computer-animated polar bears you can show stranded on ice floes.

  25. Fudge Factors have always had me worried.

    Once picked up some real-time safety critical code from a major defence player, now, very thankfully, no longer with us, which had fudge_factor_1 to fudge_factor_10 declared and used throughout.

    Very, very scary in a plane, but this is on a global scale B)

  26. Perhaps part 3 could address a couple of issues? First, you need to redraw the graph of valadj with the 0.75 multiplier included – so 2.6 goes down to a mere 1.95! Second, I should like to see a mathematical description of the interpolation code, as I don’t understand exactly what it does.

    Thanks,
    Rich.

  27. I fear this Climategate is nothing more than a fart in a bottle! The blogosphere can yell whatever it wants… in Europe the Copenhagen conference has to come out with a drastic reduction of CO2 to save the climate. On the official news channels there is almost no mention of the hacked e-mails or of Climategate. I also have the feeling that nobody is interested in what those emails could reveal.
    In Europe the whole anthropogenic warming circus simply goes on with never-before-seen energy. And as the cherry on top, more and more European people start to believe them. It has been a very warm autumn, just as the year before, and the year before that, and… The official weathermen are promoting the idea that this is not coincidental anymore. Not normal. Global warming is overwhelming, they say!
    So in Europe the AGW Copenhagen show can benefit from the reality in nature, and no ripple could disturb the already-achieved success of the summit, whatever the source code of whatever climate model contains or not.

  28. As the article points out, even if this code wasn’t used, the mere fact that it was there points to a need to “adjust” data. This is not interpolation.
    And even though this code seems to have been for the treemometer data, that doesn’t mean there weren’t other shenanigans going on with the temperature dataset.
    AJStrata’s work seems to point out that the raw data doesn’t confirm the warming that the GISS and CRU “adjustments” show, but no doubt, if cornered, they’ll tell us that a “fudge factor” has to be added in to cover some inadequacy or other in the GHCN dataset.
    This whole thing is too big for people to be going off on their own tangents in analysing the data. What we need is to coordinate our efforts, standardise how the data is treated, and come up with a single alternative analysis.
    We can’t wait three years for the Met Office to announce their re-analysis, and anyway, one visit to their site shows where their prejudices lie.
    We already have a large number of capable people posting to this site, and their email addresses are known to this site, so I’d suggest that a section of the site or another effort like the surfacestations.org work be got up and running.
    I have skills in website and application programming, and am currently learning R, so if there was anything I could do to help with this effort, I’d be glad to.

  29. From previous post:
    Climate Heretic (16:18:03): I disagree with you.
    woodentop (16:01:48): Sorry pal, but that’s rubbish.

    The need for source code readability, maintainability and quality certainly depends on the context in software engineering. In addition, this is not the main point here: let us assume that the source code is poor quality, as you suggest – does that help us in any way? NO! I think we need a much more PRAGMATIC point of view here, KILL YOUR DARLINGS (in software): we need to COMPILE the code and EXECUTE the code to determine WHAT it is doing!

    Complaining that the source code is poor quality is a dead end – please stop doing that!

  30. No wonder the CRU are delaying the publishing of the FULL raw station data, and the specific computer code. If this is a sample of the real code then the coding is a “complete mess” and I suspect that the CRU are embarrassed by the poor quality of their work.
    Poor Harry has been given the impossible task of re-writing some kind of coding that can be applied to the raw data to recreate a temperature record that bears some resemblance to their “value added” data.
    IT’S HILARIOUS!
    To use a chess term: ZUGZWANG.
    They have been forced to make a move but whatever they do will cause their doom.
    By the way, Jonathan Porritt (a very active AGW campaigner) has just been on the BBC accepting that natural variation may have a large influence on the climate. You wouldn’t have heard that a few weeks ago.

  31. I’ve not gone through the code in detail but if that’s what passes for clear modern programming then I’m in the wrong business (I’m a programmer). Looks more like a BASIC program for a ZX81.

    mikey (07:38:46) : on a different story posted this link

    http://news.bbc.co.uk/1/hi/world/south_asia/8387737.stm

    IPCC – out by 300 years and using non-peer reviewed information!
    Is this the first time the BBC has been so critical of anything done by the IPCC?

  32. Two points –
    Be careful – if the adjustments are in hundredths of a degree, you will look stupid when that comes out.
    It’s obvious to me that the Met Office has said their data rework will take 3 years so they can see what happens to the climate (sorry – weather) in that time. So they don’t know and they are waiting to see what happens next – like all of us.

  33. osborn-tree6\mann\mxdgrid2ascii.pro 10/03/2008

    endif else begin
      ml=where(finite(mxdfd) eq 0,nmiss)
      if nmiss gt 0 then mxdfd(ml)=-9.99
      printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'
      printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'
      printf,1,'Reconstruction is based on tree-ring density records.'
      printf,1
      printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'
      printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'
      printf,1,'will be much closer to observed temperatures then they should be,'
      printf,1,'which will incorrectly imply the reconstruction is more skilful'
      printf,1,'than it actually is. See Osborn et al. (2004).'
      printf,1
      printf,1,'Osborn TJ, Briffa KR, Schweingruber FH and Jones PD (2004)'
      printf,1,'Annually resolved patterns of summer temperature over the Northern'
      printf,1,'Hemisphere since AD 1400 from a tree-ring-density network.'
      printf,1,'Submitted to Global and Planetary Change.'
      ;
      printf,1,'Grid resolution is 5 by 5 degrees, for the Northern Hemisphere only,'
      printf,1,'with the first value at ',g.x[0],' W',g.y[0],' N,'
      printf,1,'then moving eastwards to ',g.x[g.nx-1],' and then southwards to ',g.y[g.ny-1],'.'
      printf,1
      printf,1,'Missing value is -9.99'
      printf,1
      for iyr = 0 , mxdnyr-1 do begin
        print,mxdyear[iyr],format='($,I5)'
        printf,1,mxdyear[iyr],format='(I4)'
        printf,1,mxdfd[*,*,iyr],format='(10F8.2)'
      endfor
      print
    endelse

  34. Heard on the Andrew Marr show on the BBC this morning from the US Ambassador to the UK:

    ‘Climategate’ has not been picked up on in the US. There you have it. Nothing to see here. Move along please. Have I been in a dream for the past two weeks or so? I am sure there are Americans who have been blogging like mad. I know Fox News has. Oh, I know – he was talking about most of the American mainstream media, wasn’t he?

  35. From previous post:
    Carsten Arnholm, Norway (02:57:28) : It really does not matter whether the people writing this code were “smart” or not. What matters is that the result is of very poor quality.

    No. That’s complete nonsense! We need to COMPILE the code, EXECUTE the code, and REVERSE ENGINEER how it is supposed to work together with the RAW data files in the dump in order to CONCLUDE exactly what it is doing.

    Complaining that the source code is poor quality does not help us in any possible way; that’s certainly a dead end, a RED HERRING that draws attention away from the central issue, which is whether they have ADAPTED the code to the AGW hypothesis. Imagine that we manage to find accurate digital proof that the code reveals CONVENIENT ADJUSTMENTS – that would really be something.

  36. Mamma mia, here I go again, my my, why can’t I resist it.

    On the subject of Al Capone
    Dear Robert

    Maybe at present there is insufficient evidence for murder but maybe there is sufficient evidence for tax evasion. (my view is they are banged to rights)

    I believe the raw data is available for the Central England temperature series from the mid-1600s to 1973, compiled by Professor Manley. Hadcrut have turned the benign series into a hockey stick graph.

    With all the enquiries around something plain and simple is needed again like the NZ fiddling.

    It may seem parochial to concentrate on the uk but that is where two of the enquiries are to be held.

  37. Robert (OP), the point you haven’t dealt with, which several people pointed out in the previous posting, is that this code was either used as a “thought experiment” test of the calibration procedure for the Briffa tree ring data (the filename indicates this), or as a way of bootstrapping a correlation process, both of which are perfectly reasonable things to do.

    Here’s Gavin of RC on the subject (which was quoted by “Norman” in comments on your previous posting):

    “It was an artificial correction to check some calibration statistics to see whether they would vary if the divergence was an artifact of some extra anthropogenic impact. It has never been used in a published paper (though something similar was explained in detail in this draft paper by Osborn). It has nothing to do with any reconstruction used in the IPCC reports.”

    And indeed, in the same set of comments, “Morgan” pointed out that the Osborn et al. paper explicitly describes this step:

    “To overcome these problems, the decline is artificially removed from the calibrated tree-ring density series, for the purpose of making a final calibration. The removal is only temporary, because the final calibration is then applied to the unadjusted data set (i.e., without the decline artificially removed). Though this is rather an ad hoc approach, it does allow us to test the sensitivity of the calibration to time scale, and it also yields a reconstruction whose mean level is much less sensitive to the choice of calibration period.”

    I’m not sure which one of these your particular code snippet is doing, but either seems a perfectly reasonable explanation to me – and both require the code to be added and then removed again. The lazy programmer’s way of doing this is by commenting and uncommenting.

    Let me give you an analogy: My professional sphere is in real-time video streaming software. In one of my bits of server code there is a mode (switchable by configuration, not comments, but never mind…) where it can corrupt every Nth byte of every Mth packet; we use this to test the resilience of the client device to network errors.

    Now let’s imagine some ‘hacker’ breaks in and steals this code, and for some reason the probity of our software was internationally contentious (not likely, but still…), and people start picking over the code looking for “juicy bits” without understanding the full context. Maybe they find something like this:

    // Corrupt every 'spacing' bytes
    void Buffer::corrupt(unsigned int spacing)
    {
        for(unsigned int i=0; i<length; i+=spacing)
            *((unsigned char *)start+i) ^= 0xA5;
    }

    Imagine the furore! Deliberate corruption/falsification of video streams! Programmer uses hacky C-style casts in C++! What is the significance of this mysterious A5 value?!

    Let me be clear: I think there are issues about scientific openness here, especially given the importance of the output, but really, this isn't one of them.

  38. may be OT. In previous thread bill (18:55:00) pointed out :
    “If it’s a moving average then there should only be 10years centred on the date averaged. This allows you to use up to 4years 11months from ends”
    I have now extended the graph to show up to 2005, which of course shows a noticeable drop for CET.
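
    For what it’s worth, the centred moving average bill describes can be sketched as follows (Python, invented series; a real 10-year window is even, which needs a convention for centring – an odd window is used here for simplicity):

```python
import numpy as np

def centred_moving_average(series, window):
    """Centred moving average; endpoints where the full window does not
    fit are left as NaN instead of being padded or extrapolated."""
    series = np.asarray(series, dtype=float)
    out = np.full(series.shape, np.nan)
    half = window // 2
    for i in range(half, len(series) - half):
        out[i] = series[i - half:i + half + 1].mean()  # window centred on i
    return out

smooth = centred_moving_average([1, 2, 3, 4, 5, 6, 7], window=3)
```

    The NaN endpoints make bill’s point exactly: you cannot honestly centre a window on dates within half a window of the end of the series.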

  39. My question would be… what kind of configuration management system or scheme was the code stored in at UEA, and can the history of changes to the code be tracked from release to release?

    Certainly, if trillions of dollars are at stake, along with the birth of a new industry (to dwarf all other industries), they’d at least have tried to maintain a bare-minimum, entry-level standard of code configuration management and documentation to cover their hineys, wouldn’t they?

    If this is not the case, is it acceptable practice in the scientific community to not have past code versions stored/documented, which produced specific charts and data?

    All our code is under CM where I work, but I’ve always chuckled about the fact that nobody has ever had to use it to quantify something from the past – we’ve never had to rely on our CM system, and with not much at stake it can seem like a waste. However, this particular exercise makes it quite clear to me that, despite my arrogance and my laughing about the usefulness of it, I now clearly and completely get it. I understand why, at a minimum, we attempt to practice this “bare minimum” of due diligence. This is my newly self-imposed personal professional biaach slap, and it is a good one. My face is red but it is smiling :)

    So, in summary, under what degree of CM did UEA keep this seemingly important code? I mean, it must have one hefty history, eh?

  40. As a layman who is wary of all government agencies and quangos, I would have thought that the route to finding out the truth about science-related matters is for all major science to be open-sourced through each country’s accepted science bodies, like The Royal Society, whereby any scientist can check their area of expertise, any party can access and assess the quality of the raw code, and anyone can view and comment on the results given. Maybe science does not want real science any longer!

  41. I’ve been programming in IDL to produce plots for scientific papers for about 10 years, and this pro clearly isn’t producing a plot for publication. The pro just plots to screen – if it were for publication, it would write the graph out as a postscript file to submit to the publisher. To me, this looks like someone experimenting with the data, which doesn’t really mean anything.

  42. Gumby, as I work for a software company, I had similar thoughts. One would think, with as much money at stake, and the fact that this is a project taken on at the highest levels of academia, that they would have some sort of control on versions of their code – but alas, having never worked in an academic setting, my guess is they have no concept of this, and it probably never even crossed their minds that they would need it. This is the benefit of working in a make-believe world as opposed to ours, the real one – there is no accountability.

  43. AlanG (01:01:07) :
    harrfn (harry filename?) is an array of data filenames found here:

    harryfn=['nwcan','wnam','cecan','nweur','sweur','nsib','csib','tib',$
    'esib','allsites']

    My guess is they correspond as follows:

    nwcan = North West Canada
    wnam = Western North America
    cecan = Central Canada
    nweur = North West Europe
    sweur = South West Europe
    cs and es could be ISO country codes but ns and ti are not.

    nsib = Northern Siberia?
    csib = Central Siberia?
    tib = ??
    esib = Eastern Siberia?

  44. Most of the media reports dwell on “the leaked / stolen emails” when in fact they should speak more accurately of “the leaked/stolen emails and code”. (I suppose code is just too difficult and makes many in their audiences switch off).

    “The emails tell us about the mindset; the code was the weapon that was used to corrupt the data.” The message about the emails is out, but not the message about the smoking code. This thought led me to wonder whether journalists are unwilling to deal with the Climategate information because it does not arrive on their desks prepackaged as a press release from a reliable source.

    Would it be possible for Watts Up, Climate Audit and Bishop Hill to put together an official press release from the environmental sceptic movement, written in journalistic language, using phrases that encapsulate ideas in everyday terms – phrases such as “Climategate: the leaked emails and code”, “the smoking code”, “magic numbers”? Something that a non-scientist hack could convert into an article without making a fool of themselves.

    What the sceptics need is a feed into the mainstream media that will put this important information across to the very large number of people who do not go on to the web for their news and do not go near websites that have a lot of technical language.

    Who is it that organises the sceptic conferences in New York? – this might be a good role for this organisation.

  45. Invariant (03:31:01) :

    From previous post:
    Carsten Arnholm, Norway (02:57:28) : It really does not matter whether the people writing this code were “smart” or not. What matters is that the result is of very poor quality.

    No. That’s complete nonsense! We need to COMPILE the code, EXECUTE the code, REVERSE ENGINEER how it is supposed to work together with the RAW data files in the dump in order to CONCLUDE exactly what it is doing.

    No need to shout. I was commenting on your assertion that “smart” people wrote poor code, which indeed is wrong and a distraction.

    Of course we need to figure out what the code is doing, nobody suggested anything else.

  46. @Jim G
    If, as Fox news says, the EPA Declare Carbon Dioxide a Public Danger, then where does that leave us?
    At a guess: in a long hot summer we’ll be drinking flat beer or flat coke, and Champagne will be banned from sports winners’ rostrums.
    Since we all breathe out CO2 I guess that makes us all public dangers and we will have to wear breathing equipment to filter out the CO2 from our exhaled breath returning only the inert gases to the atmosphere – each day we will take our cartridges of liquefied CO2 for disposal at $20 a pop ($5 to Al Gore).
    Oh, we may all have to wear warning signs and have audible alerts that sound to signal our presence to blind people who won’t be able to hear us above their own audible alerts and through wearing their own breathing sets.
    Pets will no longer be allowed except maybe kangaroos (because they don’t emit methane http://news.bbc.co.uk/1/hi/uk/7551125.stm), but then pets are on the way out anyway because they have a huge carbon footprint (http://www.telegraph.co.uk/earth/environment/climatechange/6416683/Pet-dogs-as-bad-for-planet-as-driving-4x4s-book-claims.html) and we can no longer afford the land to produce their food because so much is given over to biodiesel crops that there is barely enough left over for human food, mainly rice.

  47. woodfortrees (Paul Clark) (03:38:54) :

    After the horse escapes from the barn, a lot of thought can be given to how it had been secured and whether the door was safe.

    As a particle physicist I have written a lot of code – Fortran, that is, because that was the language of my time. We used to put a C in the first column of a comment line. If either of the two scenarios you give were true, there should have been a comment about it, of the form:
    ; temporary fudge to test the bla bla.

    or

    ; temporary fudge to test extraneous forcings

    and for good measure

    ; to be removed from normal runs.

    The reason is that a lot of graduate students get hold of codes, and it is not the job of the programmer to set riddles, but to leave a clear path for the next person wading in, checking and using the code.

    Since these caveats are not there, the simplest explanation is that the code was used or would be used as is for data processing, certainly by an unsuspecting graduate student.

    I believe in KISS (keep it simple, stupid).

  48. It seems (paranoia perhaps) that all the major search engines are trying to play the skeptics (sites, news, etc.) down. Has anybody else noticed? Delayed news stories when typing ‘climategate’, etc.?

  49. The BIG question is why the ‘fudge’ code was there in the first place.

    The excuses I’ve heard are:

    a) “to see what effect it would have”. But why would you need a computer program to tell you what effect multiplying a number by 2.5 would have?

    b) “to facilitate calibration of the data”. Hang on there. If you don’t have reliable data for a period, you leave that period out of the calibration – you don’t substitute ‘made-up’ data.
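
    For anyone wondering what the ‘fudge’ does mechanically: IDL’s interpol(valadj,yrloc,x) (quoted in the post above) is linear interpolation of a coarse adjustment curve onto every year, and the result is simply added to the series (densall=densall+yearlyadj). Here is a Python sketch of that mechanism – with invented values, NOT the actual valadj/yrloc arrays from the CRU file:

```python
import numpy as np

# Illustrative values only -- not the real CRU arrays.  A coarse
# adjustment curve defined at a few years is interpolated onto every
# year, then added to the series, mirroring the IDL lines
#   yearlyadj=interpol(valadj,yrloc,x)
#   densall=densall+yearlyadj
yrloc  = np.array([1900.0, 1950.0, 2000.0])  # years where adjustments are defined
valadj = np.array([0.0, 0.0, 2.5])           # adjustment at those years

x = np.arange(1900, 2001)                    # yearly time axis
yearlyadj = np.interp(x, yrloc, valadj)      # like IDL interpol(valadj, yrloc, x)

densall = np.zeros_like(x, dtype=float)      # a flat series, for illustration
adjusted = densall + yearlyadj               # values after 1950 ramp up to +2.5
```

    On a series whose real variation is tenths of a unit, an interpolated ramp reaching 2.5 obviously dominates whatever the underlying data were doing.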

  50. Sheppard:
    ” the decline Jones so urgently sought to hide was not one of measured temperatures at all, but rather figures infinitely more important to climate alarmists – those determined by proxy reconstructions. “

  51. IDL_chap:

    To me, this looks like someone experimenting with the data, which doesn’t really mean anything.

    It speaks volumes about the mindset of those involved.

  52. Peter (04:23:00) :

    The BIG question is why the ‘fudge’ code was there in the first place

    Absolutely spot on. And the word itself ‘fudge’ speaks volumes all by itself.

  53. Anna V: Does not

    ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********

    amount to much the same caveat? Certainly if I wanted to actually fudge something without anyone knowing, a 15-star comment wouldn’t be my first thought.

  54. geo (00:08:53) :

    “The smartest kid in the class (CRU) just got caught cheating. Why are his frat brothers (GISS, NOAA) claiming that everything is ok because they all put down the same answers on the test?”

    LOL!

  55. The term Weapons of Maths Destruction is excellent! Hats off to its creator.
    There are many parallels between the “Dodgy Dossier” of Iraqi WMDs and the reports of the IPCC.
    At the time of the “Sexed Up” campaign that led to the last Gulf war, I like many others was convinced of the validity of the claims made.
    Hindsight brought different perceptions!
    We were cynically misled then and the evidence is mounting that we are being manipulated again.
    To be fooled once is unfortunate. To be fooled twice is just plain stupid!
    Let’s pray that this episode does not throw up another Dr Kelly.

  56. Carsten Arnholm, Norway (04:18:08) : I was commenting on your assertion that “smart” people wrote poor code, which indeed is wrong and a distraction.

    OK. Then we agree. Indeed, not all smart developers write spaghetti code, but my personal experience is that it is sometimes difficult to understand source code written by the “smart” people who “manage to juggle 40 balls (functions, variables)” without making mistakes. Another point is, of course, that if you intend to do something illegal, then you would possibly try to hide it in a mess…

  57. @ Ripper (23:51:23) ,
    the link to quote mining code was interesting.
    But as a defence it has some holes.
    “This doesn’t seem to be a smoking gun so much as a gun that hasn’t been fired.”
    Why write code that isn’t intended to be used? (Or not destroy it, or very clearly mark it as not to be used?)
    There is obviously a lot of very poor file management – file naming systems leave a lot to be desired, and so on – so it is quite feasible to suggest that a piece of code written for no good reason might not be clearly marked, and I get the impression that some of the original code wasn’t exactly well written anyway.
    But why have the team that wrote the code or had it written not been asked specifically what this piece of code was all about? Why have they volunteered no information?
    OK, maybe we have some sub judice issues.

    In the end we can have all sorts of possible excuses for why such code has been written, but unless the raw data is available to test, we can’t sort out whether it has been used or not.

    It is a valid caution that it may or may not have been used maliciously. It is possible to postulate that it could have been used to illustrate something or other.
    So, how can we know whether it has been used or not?
    Simple: run the raw data through it and see if it matches published work (which shouldn’t be necessary – the science ought to have required that the papers were presented with the raw data and the code…) except we are told we don’t have all the raw data (though some suspect that this might not be the whole truth either).
    We could take published data and run it backwards through this code to see what sort of “raw” data sets we come up with, but we always have that problem: we don’t know that what we get is raw data, because we don’t have the raw data.
    About the only chance I see is if we have work available that we can run backwards through the code to generate some reconstituted “raw” data.

    What we are looking for is a sequence of reconstituted data that overlaps with later data that was so “fixed”, i.e. some data from the 1960s forward that was part of published data without the decline fix, and that is also used in later work where the fix was in. Or not, as the case may be.
    It seems to me that had this code been applied only to new data we might have a problem, but if the fix has a retroactive element to it, as suggested, then this might indeed be possible.

  58. See Al Gore’s response to Climategate here

    [snip - we don't run the Hitler parody video here - think "deniers" - A]

  59. Carlo (02:49:45) :

    > osborn-tree6\mann\mxdgrid2ascii.pro 10/03/2008

    What was your point in posting this? Just that it has the word ARTIFICIALLY in it?

    All the code does is print out a couple of arrays to different destinations in different formats.

  60. woodfortrees (Paul Clark) (04:34:04) :

    Anna V: Does not

    ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********

    amount to much the same caveat? Certainly if I wanted to actually fudge something without anyone knowing, a 15-star comment wouldn’t be my first thought.

    Not so, in my opinion of course. I do not think these people were really aware that what they were doing was unscientific. They had the market cornered and they were sure they were right.

    Anthony does not like religious analogies, but, in my opinion, the sociology applies to these “scientists”. Were the people counting the number of angels that could fit on a pinhead aware of the preposterousness of their case? When you start by believing that angels exist and can be counted, then anything goes. When you start believing that there is unprecedented twentieth century warming then it is obvious that the data is wrong and has to be corrected for. Simple.

  61. I do not even know what code was used for what purpose. To me the damnation lies in Harry’s README text file and the quality of the code, which is not up to standard.

  62. Robert notes:

    Here is the graph I plotted of the valadj array. When we’re talking about trying to interpret temperature data that changes on the scale of tenths of a degree over time, “fudging” a value by 2.5 is going to have a significant impact on the data set.

    He’s assuming that the adjustments are in units of °C, but the code makes no references to temperature at all. It does make several references to tree rings and density, but I have no idea what sort of units or range of values those include.

    I fear that Robert is too interested in proving his suspicions and has forgotten that good scientists are as skeptical of their own work as they are of others.

    It would be nice to see some actual plots produced by this code.

    It would also be nice to have time to read up on the “divergence problem” in order to understand reasons why it appears to be only a recent phenomenon, or if similar issues are lost in past noise.

    -Ric

    — New pages! See http://home.comcast.net/~ewerme/wuwt/toc-2009-11.html

  63. What we are looking for is a sequence of reconstituted data that overlaps with later data that was so “fixed” i.e. some data from the 1960’s forward that was part of published data that did not have the decline fixed in it that is also used in later work where the fix was in.

    “The fix was in” — in underworld parlance — describes it pretty succinctly…

  64. Paul Clark,

    I think you raise a reasonable point. Or, it had occurred to me that an easier answer from CRU is that, “…this was a test version, not the version of the program that we actually used…”

    But, then the next logical step is to release the actual version of the software AND the raw data and prove they work with no fudging.

    Without the release, we are free to think the worst.

  65. Another interesting quote from the UK Telegraph this morning:

    “The other is that the ugly, drum-like concrete building at the University of East Anglia which houses the CRU is named after its founder, the late Hubert Lamb, the doyen of historical climate experts. It was Professor Lamb whose most famous contribution to climatology was his documenting and naming of what he called the Medieval Warm Epoch, that glaring contradiction of modern global warming theory which his successors have devoted untold efforts to demolishing. If only they had looked at the evidence of those Siberian trees in the spirit of true science, they might have told us that all their efforts to show otherwise were in vain, and that their very much more distinguished predecessor was right after all.”

    A bit ironic you might say.

  66. As we all know, CRU will be very open from now on. So I suggest that the first thing they tell us is:
    WHO IS HARRY?

  67. P Gosselin (04:42:20) :

    > [Shouting toned down] You’all got to read Marc Sheppard’s piece I linked above!

    Normally, I hate posts like this, however, given the traffic of late, I confess I skipped over your first comment and this convinced me to check it out. I’m impressed. I’m impressed so much that I’m compelled to say

    ME TOO!

    Ordinarily I hate “Me too” posts, apologies to those more pure than I am. :-)

    Seriously, this is so much better than Robert’s analysis (though it spends less time looking at code) that future discussion should be driven by Sheppard’s claims.

    One thing that programmers (who have time) could do is look for places in the code where the proxy data is switched to instrumental data, that’s at 1980 for Mann’s proxy, and 1960 for Briffa’s.

    The link bears repeating.

    http://www.americanthinker.com/2009/12/understanding_climategates_hid.html

    That references (through a Google search!?)

    http://www.americanthinker.com/2009/11/crus_source_code_climategate_r.html

  68. I’m lost…

    The author said:

    “The source code that actually printed the graph was commented out and, therefore, is not valid proof.”

    and

    “Now, we can finally put this concern to rest.”

    So what’s the conclusion? The code is valid or not?

  69. No – when the very election of a (liberal/socialist/“green-focused” environmental presidential) administration is going to be based on whether or not you can claim that there are “very dangerous” angels dancing on that pin, and how many trillion dollars in taxes can be charged based on how many angels are dancing on that pin, THEN you need to do anything you can to claim as many angels as possible.

    Because, to these ecotheists, the “end” justifies ANY means.

  70. Peter (04:53:56) :

    And, slightly OT, here’s “Scientists behaving badly, Part 3”:

    The editorial in the current issue of the journal, “Nature”, talks about ‘denialists’.
    When one of the premier scientific journals starts using language like that, it’s clear that the age of reason is well and truly over.

    http://www.nature.com/nature/journal/v462/n7273/full/462545a.html

    No, Peter, I don’t think so. Only the age of this co-conspirator journal – and Science Mag is soon to follow. End your subscriptions and subscribe to the valid science “publications” on the internet like WUWT, CA, tAV, Chiefio, and others. Give them the same subscription fee that you would have given – before they went with THE FIX – to every print mainline science publication. This would be one huge shift in economic power.

    Add to that decision, another one to either take back or end your membership in every professional scientific society that approved THE FIX under the guise (masquerade) of AGW, global warming, climate change.

    THE FIX I refer to is that Obama (probably) is going to Copenhagen to revel in his “citizen of the world, not the USA” self-aggrandizing title as the agreement on our CO2-pollution fees (in whatever form the EPA decides) is sent to an unelected UN bureaucracy (unaccountable to any sovereign nation) that is ready to rule the world. For what purposes? This is as important a “scientific” topic as the rotten code and the traitorous pseudo-scientists. May they all rot in jail before their time in hell. They have been agents of this treason.

  71. I was perusing the Bishop Hill blog when I came across mention of another blog where the blogger was performing an analysis of the code. See:

    http://www.jgc.org/blog/

    There are several posts on the subject of the CRU code that are most fascinating, including references to several buggy bits of code. Easy to read for someone not into IDL. Goes beyond the usual references to the Harry_Read_Me.txt file.

  72. Deliberate, calculated, and possibly incompetent while doing so.
    This is indeed the smoking gun, er, code…

  73. “No proof exists that shows this code was used in publishing results.”

    If we run the program and it produces the results of any of their published graphs, that is proof enough that the program was used.

  74. Robert, can you produce the figures from the contemporary code? I can see if they match the relevant article.

  75. AlanG (01:01:07) :
    harrfn (harry filename?) is an array of data filenames found here:

    harryfn=['nwcan','wnam','cecan','nweur','sweur','nsib','csib','tib',$
    'esib','allsites']

    My guess is they correspond as follows:

    nwcan = North West Canada
    wnam = Western North America
    cecan = Central Canada
    nweur = North West Europe
    sweur = South West Europe
    cs and es could be ISO country codes but ns and ti are not.

    nsib = Northern Siberia?
    csib = Central Siberia?
    tib = ??
    esib = Eastern Siberia?

    tib = Tibet?

  76. Al Gore heckled on Youtube…

    It may actually take this kind of protest to wake up the sheeple.

    What I see here is some young frustrated skeptics resorting to making a protest on amateur video and publishing it on Youtube in response to the blatant fascist propaganda being pumped out by mainstream western media.

    Make no mistake about it, when deniers are vilified and attacked, when scientists are prevented from publishing opposing data in journals, when the media ruthlessly shuts down and covers up all opposing views, when the media continues publishing pro-anthropogenic stories daily, when skeptics are called a*ssholes on national television (and the BBC allows it to go on the air …oops a mistake…sure, and I was born yesterday):

    We are dealing with rampant FASCISM.

    It would all be so funny if it was on a Monty Python show but the sad thing is that this is REALLY happening.

    The proof is that, like all FASCISTS, the people swept up in this movement are totally, absolutely convinced – they have no compunction about following the “end justifies the means” path. What is amazing is that these are mostly “liberals” who are orchestrating mass media manipulation and suppression of free speech (yes, when I find countless posts MODERATED out of existence and a complete refusal to debate Climate Change, I think we can call this suppression of free speech). Since when did “liberals” turn to the dark side of tyranny and dictatorship – we know what is best for you – so “Shut up”, as Dr Watson at the CRU put it!

  77. Regardless of what the raw data might show, hiding it IS foul play. They are a government agency and legally subject to FOIA requests, which they have ignored. I say prosecute them and lock them up. They are asking us to spend fortunes without providing the raw data!

  78. OT

    Just seen in the Daily Mail – “Emails that rocked climate change campaign leaked from Siberian ‘closed city’ university built by KGB”

    Read more:

    http://www.dailymail.co.uk/news/article-1233562/Emails-rocked-climate-change-campaign-leaked-Siberian-closed-city-university-built-KGB.html

    REPLY:
    Now all we need are the street videos showing a furtive figure scurrying from CRU, hopping on his bike and pedalling madly the hundred miles to the Russian Embassy in London.

  80. Carsten Arnholm, Norway (04:08:46)
    “………
    nsib = Northern Siberia?
    csib = Central Siberia?
    tib = ??
    esib = Eastern Siberia?”

    tib = Tibet ?

  81. I think there should be a heat map displayed on any temperature graph indicating the % of data that has been “interpolated”, i.e. “made up out of nothing”… a second version of this could show how many station records were used for each year’s plot…
    This would make clear the amount of uncertainty that truly exists in the study of climate data…
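
    The suggestion above is easy to prototype. Here is a minimal, hypothetical sketch (Python, with made-up infill masks – real use would need per-station provenance flags, which is precisely what the published products don’t give us):

```python
def interpolated_fraction(mask_by_year):
    """mask_by_year maps year -> list of booleans, one per grid cell or
    station record, True where the value was interpolated (infilled)
    rather than measured.  Returns the infilled fraction per year."""
    return {year: sum(mask) / len(mask) for year, mask in mask_by_year.items()}

# Invented masks, purely for illustration.
fractions = interpolated_fraction({
    1960: [False, False, True, False],   # 1 of 4 values infilled
    1990: [True, True, True, False],     # 3 of 4 values infilled
})
```

    A per-year fraction like this could feed directly into the proposed heat map overlay.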

  82. This quote from the BBC is worth keeping:

    ”The IPCC relied on three documents to arrive at 2035 as the “outer year” for shrinkage of glaciers.
    They are: a 2005 World Wide Fund for Nature report on glaciers; a 1996 Unesco document on hydrology; and a 1999 news report in New Scientist.

    Incidentally, none of these documents have been reviewed by peer professionals, which is what the IPCC is mandated to be doing…..

    But in a joint statement, some of the world’s leading glaciologists, who are also participants in the IPCC, have said: “This catalogue of errors in Himalayan glaciology… has caused much confusion that could have been avoided had the norms of scientific publication, including peer review and concentration upon peer-reviewed work, been respected.”…

    “Under strict consideration of the IPCC rules, it should actually not have been published as it is not based on a sound scientific reference.

    http://news.bbc.co.uk/2/hi/south_asia/8387737.stm

    Let’s see: they screwed up the reports of melting glaciers, the polar bear population is increasing, not decreasing, and the CRU temperature data is extremely suspect and certainly does not meet “strict consideration of the” scientific method.

    Yup, the science is settled – settling into a pool of quicksand, or is that a peat bog?

  83. OK. Then we agree. Indeed not all smart developers write spaghetti code, but my personal experience is that it is sometimes difficult to understand the source code written by the “smart” people that “manage to juggle 40 balls (functions, variables)” without making mistakes. Another point is, of course, that if you intend to do something illegal, then you would possibly try to hide it in a mess…

    Indeed, as a Software Developer, that is my experience too. I’ve seen the most amazingly tortuous (in terms of cyclomatic complexity) code come from the smartest people.

  84. When someone claims: “You need the raw climate data to prove that foul play occurred”, they misunderstand science. If something cannot be independently reproduced, it is mere opinion. Only reproducible results are science. In light of the published source code showing manipulation of the now-missing raw data, honesty demands that any research based on the data published by this group MUST BE SUSPECT, and the burden of proof lies with them to substantiate the basis for any claims or projections they make.

    If their work is reproducible, this should not be a problem.

  85. wood for trees:

    Mann’s nature trick of “hiding the decline” signifies the appropriate method to cover up the declining temperature data.

    It’s no harder to imagine than calibrating satellite data to match ground stations when the satellites didn’t show a warming trend. If anything they should have been calibrated downwards – if satellites drift, they get closer to Earth and read higher temperatures. The official explanation is exactly the opposite of the truth.

  86. Keep it simple.

    You are CO2.
    You are the danger.
    You need to be deleted.

    They want you to self delete.

    Will you self delete?

  87. The way this closed circle of statistics manipulators (I can’t call them climatologists) corrupts and deletes/manipulates data, then foments excuses about it, is like the accounting procedures at Enron.

  88. “REPLY:
    Now all we need are the street videos showing a furtive figure scurrying from CRU, hopping on his bike and pedalling madly the hundred miles to the Russian Embassy in London.”

    These days, I find “Pravda” more truthful than say “The Economist.”

    Whence this madness?

  89. osborn-tree6\README

    06/26/1998

    MXD and TRW data, optimised regions (best of Harry’s, decline
    and volcano regions), for
    (1) comparison with T & P
    (2) regional volcanic responses
    (3) regional decline

    rd_allmxd1.pro reads in raw MXD data ready for use
    latlon.pro used to convert text to numeric
    pl_mxdlocations.pro plots location of MXD sites
    mkregions.pro allocates MXD sites into defined regions

  90. Debunking Briffa’s Version of the Hockey Stick
    Jim Lindgren • December 6, 2009 12:18 am

    Among the most ethically challenged of the scientists at the Climate Research Unit at the University of East Anglia is Keith Briffa. A couple of months ago, the Bishop Hill blog retold in detail the sad story about Briffa’s own version of the Hockey Stick, which he was able to keep alive by his attempts to prevent other scientists from discovering what he had done with the data, a practice facilitated by biased journals that refused to apply their own rules.

    http://volokh.com/

  91. Jeremy (07:25:22) :

    “What is amazing is that these are mostly “liberals” who are orchestrating mass-media manipulation and suppression of free speech (yes, when I find countless posts MODERATED out of existence and a complete refusal to debate Climate Change, I think we can call this suppression of free speech). Since when did “liberals” turn to the dark side of tyranny and dictatorship – we know what is best for you – so “Shut up”, as Dr Watson at the CRU put it!”

    Thank you Jeremy, this is what I’ve been trying to say for a while. I’ve caught much grief here on this scientific site for labeling “left or lib” as a political attack. Forget left vs. right, let’s just look around the liberal left side of the spectrum (where this movement was born and nurtured) and ask –

    “why suppress the debate on this subject for over a decade?” “why turn your back on the current Climategate issue”

    Pamela – it’s not left vs. right, it’s what the heck happened amongst the lib left to create an atmosphere that condones this type of conduct?

    Left, right, green, blue – none of that matters – since when is it OK to turn a blind eye (MSM) to a worldwide trillion-dollar scam that will truly hurt people and economies?

  92. http://www.woodfortrees.org/plot/hadcrut3vgl/from:1979/to/plot/hadcrut3vgl/from:1979/to/trend/plot/uah/from:1979/to/plot/uah/from:1979/to/trend/plot/rss/from:1979/to/plot/rss/from:1979/to/trend

    Why would CRU need to fudge lines of code if their temperature record since 1979 shows warming that is fairly consistent with RSS and UAH. Certainly skeptics Spencer and Christy are not using CRU code.

    Here is a 9 minute video that shows how ridiculous the Climate Gate emails really are:

    http://www.desmogblog.com/another-look-stolen-emails

    As I said when this story first broke, the wolves were given a fake piece of meat.

  93. Having been involved in programming, software design and systems analysis for the better part of 5 decades, I find it unimaginable that a supposedly first-rate research facility would tolerate this code. We were using software design procedures in the 60s to keep track of the software design. This was to verify that what we were doing was right, reduce errors in code changes, and ease maintenance of the software.

    The only time we would ever keep commented-out code in the source stream was so that it would be readily available to re-insert for whatever reason at a later date. Otherwise it was deleted. PERIOD.

  94. A STEP BACK

    I am not a computer type but the impression I get from this whole discussion about the computer codes is:

    #1. It is very messy, badly documented and badly programmed.
    #2. It is a very poor, unprofessional job.
    #3. Poor Harry could not untangle it.

    If this is an example of sophisticated, top-flight, leading-edge programming done to model climate, I would not let these people program a simple game for a three-year-old, much less trust their programming skills enough to base world-changing policies on them.

    The whole messy code is the smoking gun. It looks like these guys could not program their way out of a paper bag.

    I had originally thought that climate modeling was done by some of the best computer programmers in the world. Are these the people actually doing the climate modeling programs? If so, the bad programming alone should make any computer-savvy person horrified.

  95. From the Sunday Times in London

    “Almost a month before they were posted on a website popular with climate-change sceptics, the hacked data was sent to a BBC weatherman who had previously expressed his doubts about climate science on his blog.”

    “The BBC has now confirmed that Paul Hudson received some of the documents on October 12. But no story was broadcast or printed by Mr Hudson and the corporation”

    Great to see our national broadcaster on the case; left to them, this story would never have seen the light of day.

  96. woodfortrees (Paul Clark) (03:38:54) : Seconded again

    stevemcintyre (07:10:48) : sounds like a good idea.

  97. Robinson (07:48:06) : I’ve seen the most amazingly tortuous (in terms of cyclomatic complexity) code come from the smartest people.

    Thanks. I try to write clear code myself, though. Actually, a nice analogy is that many of the developers who really care about maintainability, readability and code quality have more in common with marketing people than most hard-core scientists and engineers…

  98. I’m suspicious of any manipulation of the data. I can see the case for simple corrections, such as the kind done when moving a station (running the new and old stations in parallel for a period to adjust the old records to the new), and I can see similar corrections for UHI where there is comparative data from just outside the heat island to measure the creep in temperature. But if there is statistically significant warming over a century and a half of recorded temperatures, it will show through without bending and coaxing. If not, it is reasonable to conclude that there is no significant trend. Most seem to agree that there has been somewhere around a degree of warming over a century and a half, but the question is: is it merely from climbing out of the LIA, or is it caused by man’s activities? Let’s see the historical record plotted, warts and all, and see what we are likely dealing with.

  99. (These days, I find “Pravda” more truthful than say “The Economist.”

    Whence this madness?)

    Very good question. Perhaps it is the human need to have a uniting religion to help feel “connected” through “fellowship” (the so-called God gene). This can be easily exploited by politicians like Al Gore, who is a theology major.

    Each to his own I like to say, but in this case, this AGW religion would destroy the US economy by driving up energy prices for no good reason, and spending the money on useless CO2 sequestration.

    I fear for our futures. Will we end up having street fights between eco freaks and the millions of unemployed that surely will result from economic socialism? Will we see guns come out to greet the eco freaks, and not just hostile words? I don’t know, but I think we are going to find out how “patriotic” Americans react to the destruction of their freedom. Maybe that is why the USA currently is arming itself like crazy with guns and ammunition. Throw in a racial bias against the President, and all hell could break loose. Tea parties indeed!

    Can Mann et al live with what they are producing through their lies??? Have they NO sense of responsibility??? Is there NO attorney general in the USA that sees the need to charge Mann et al with fraud? The clock is ticking.

  100. Layne wrote:

    Layne Blanchard (23:50:50) :

    “Kiss off the next three years. The current administration isn’t interested in facts. They’re on a mission to ram an agenda down America’s throat while they can. We need to look at how we can recover after 2012 (if we’re lucky)”

    The only thing that can stop the progressive elites in the US is a veto proof overhaul of the US congress in 2010. Such an event is not likely. We are the metaphorical live frog being slow-boiled for dinner.

    Most human beings do not learn from history; therefore, we are doomed to re-live it. (Did I steal that? Is a dead Spanish guy calling me an Ar’se right about now?)

    Progressivism is another dangerous religion.

    markm

  101. geo (00:08:53) :
    >The smartest kid in the class (CRU) just got caught cheating. Why are his frat brothers (GISS, NOAA) claiming that everything is ok because they all put down the same answers on the test? If CRU takes the real data and adds fudge.dat, then aren’t they all just selling different flavors of fudge?<

    An excellent analogy.

  102. Robert
    Thanks for the clarity.

    On the “fudge factor” of 2.5, may I recommend adding that since this is interpolated along with the temperature, the units are degrees (C or K) –
    i.e. interpolating the raw data with a 2.5-degree fudge factor.

    Keep us posted for when someone is able to show, by back-calculation or against the raw data, that this fudge factor was actually used in published work or data.
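
    For readers who don’t speak IDL, the two key lines from briffa_sep98_e.pro can be mirrored in Python with `np.interp` (IDL’s `interpol(V, X, U)` corresponds to `np.interp(U, X, V)`). The knot years and adjustment values below are illustrative stand-ins, chosen only so the array tops out at the 2.5 discussed above; the real `yrloc`/`valadj` arrays are in the leaked source.

```python
import numpy as np

# Illustrative stand-ins for the arrays in briffa_sep98_e.pro.
yrloc  = np.array([1400.0, 1900.0, 1940.0, 1970.0, 1994.0])  # knot years
valadj = np.array([0.0,    0.0,   -0.3,    1.2,    2.5])     # adjustment at each knot

x = np.arange(1400, 1995, dtype=float)  # yearly time axis

# IDL: yearlyadj = interpol(valadj, yrloc, x)
yearlyadj = np.interp(x, yrloc, valadj)

# IDL: densall = densall + yearlyadj
# A flat (all-zero) input series isolates the adjustment itself.
densall = np.zeros_like(x) + yearlyadj

print(densall[0], densall[-1])  # 0.0 2.5
```

    Whatever the input series is, the adjustment ramps from nothing in the early centuries to its full value at the modern end.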

  103. Gumby (03:48:06) : My question would be… what kind of configuration management system…

    Andrew (03:56:56) : … they would have some sort of control on versions of their code…

    I have the same question. Given that harry is doing this work 2006-2009, one would think that some type of version control would be used. Having worked in the 90s on commercial database training courses with a team of less than 10, we used version control software to keep track of everything (courses, instructor notes, code examples, etc.). It is really the lazy person’s approach, since once you get it set up your life is immensely easier.

    Perhaps they have version control and we just don’t know it. The Harry readme file is a strong indicator that they do not.

  104. The short version

    002  ; PLOTS ‘ALL’ REGION MXD timeseries
    007  ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
    011  fudge factor
    012  ’Oooops!’
    026  ; Get regional tree lists and rbar
    050  ; Now normalise w.r.t. 1881-1960
    055  ; APPLY ARTIFICIAL CORRECTION
    060  ; Now plot them
    077  ; Extract the post 1600 part
    083  ; APPLY ARTIFICIAL CORRECTION
    088  ; Now plot it too
    097  ; Now overplot their bidecadal components
    105  ; Now overplot their 50-yr components
    115  ; Now compute the full, high and low pass correlations between the two
    145  printf,1,' '
    148  close,1
    150  end

  105. mkurbo,

    Yes it seems I am not the only liberal leaning socialist who is offended by the way extremist fascists are taking over the left.

    Left or Right – to my mind – both become evil and fascist when taken to the extreme. A healthy society is one where we balance Right and Left, and therefore a healthy debate should ALWAYS be encouraged. No matter what side of a position I find myself leaning towards, I would still want to hear ALL the arguments from the other side in case it changes my mind or in case we can find the best middle ground. Nevertheless, in the case of Climategate, it matters not what side of the fence you sit on; the emails have been recognized by CRU as being authentic – they are factual, fact is fact, and conspiracy to commit fraud is fraud, IMHO.

  106. Scott A. Mandia (08:29:35),

    Thanks a lot for posting that insufferable video… NOT.

    The only thing that didn’t make it a complete waste of time is the fact that the apologist who made it obviously cherry-picked a few comments that were arguably arguable.

    But I notice that he never once mentioned things like the CRUcial statements by these scientific fraudsters, who refused to cooperate with legal FOIA requests [at least 40 of them!]; their open admissions of gaming the peer review system, conspiring to ruin journals and individuals that didn’t play their AGW game, the finagling of the code, etc.

    The jamoke who made your video is far less credible than any of the folks he tries to discredit. The reason you can’t do any better than that lame video is because it is clear to even the most casual observer that the tens of millions of dollars funneled into the pockets of these globaloney fabricators, on both sides of the Atlantic, and the constant, all expense paid trips around the world to hobnob with others of their ilk [with no skeptical scientists invited] was more than enough to thoroughly corrupt them.

    What would it take to make the scales fall from your blinkered eyes, and let you see what was going on? The original, raw temperature data? …oh, right. That’s the part that’s missing. Convenient, eh?

  107. Scott A. Mandia (08:29:35) : “Why would CRU need to fudge lines of code if their temperature record since 1979 shows warming that is fairly consistent with RSS and UAH”.

    You’re right for this period of time, but the code here is for the period 1904 to 1999 (or something?), and the adjustment of station data is before 1979.

    I’m not sure why this code was implemented. I can’t exclude that it “hides the decline” between the 1940s and the 1970s if it is implemented for a subset of stations. However, the temperature record after 1979 is irrelevant.

    Do you remember this USHCN station data adjustment:

    http://www.coyoteblog.com/coyote_blog/2007/07/an-interesting-.html

    This is the distribution of adjustment between rural areas and major cities:

    http://www.climateaudit.org/?p=1859

  108. The post is confusing for the average person who reads it.

    Robert Greiner “The source code that actually printed the graph was commented out and, therefore, is not valid proof.”

    Actually, there was a later version of the source code which printed the graph and which was NOT commented out. Here: http://di2.nu/foia/harris-tree/briffa_sep98_e.pro

    So that makes Robert Greiner’s comment above WRONG.

    What he is now saying is: OK, the code that prints the graph, which fudges values by up to 2.5 AND IS NOT COMMENTED OUT IN A LATER VERSION OF THE CODE, would have a significant impact on the data set. (Impact shown in the hockey stick graph above.)

    But we do not know whether it was actually used until we have Briffa’s (of Yamal fame) raw data.

    Have I got that right?

    The code was named (had a heading) “APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE”

    Hiding the decline? Very ARTIFICIALLY?

    I thought that some raw data of Briffa’s has already been obtained by Steve McIntyre? Or are we talking about different raw data?

  109. Robert,

    Once again, this is the same sort of overreaching [snip] that the Team has been doing for years.

    You have not proved anything untoward until you have demonstrated that the effect of this code is incorrect and that it was used to produce published results that are wrong. You have not done that.

    It would appear that the only thing nearly as pathetic as a biased climate scientist writing computer programs, is a biased computer programmer dissecting climate science.

    What you have presented thus far is context-free conjecture – exactly what the Team accuses us of. This crap plays into their hands.

  110. If the CRU lost their original data then what are they working with now? You would think that this would shut down all research history matching.

  111. The one graph shown on this post is completely misinterpreted. Robert Greiner seems to think the values are temperatures, but they are not. They are in MXD units (a tree-ring density index) and are added to an MXD series. So the comments on their magnitude are completely wrong. Furthermore, the whole later part of the series was not used at all (that’s the point of some of the code comments).

    The term “fudge factor” doesn’t indicate cheating, as some commenters imply. It is a term commonly used in an experimental context. You’re trying something out – set the FF to 1 for full effect, to zero for no effect, and then to intermediate values. It helps pick up errors, avoid instabilities, etc. These people do research, which means they write lots of code just to try out ideas. Most of the ideas don’t pan out.
    That’s science.
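
    Nick Stokes’s description of an experimental fudge factor as a dial – zero for no effect, one for full effect – can be sketched like this (the series and correction values are made up for illustration):

```python
def adjusted(series, correction, ff):
    """Apply a candidate correction scaled by an experimental fudge factor ff."""
    return [s + ff * c for s, c in zip(series, correction)]

series     = [0.1, 0.0, -0.2]   # some measured quantity (illustrative)
correction = [0.0, 1.0,  2.5]   # candidate correction (illustrative)

print(adjusted(series, correction, 0.0))  # ff = 0: correction switched off
print(adjusted(series, correction, 1.0))  # ff = 1: full correction applied
```

    Whether the dial in the CRU code was an innocent experiment or something else is exactly the question this thread is arguing about.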

  112. Robinson (07:48:06) :

    “Indeed, as a Software Developer, that is my experience too. I’ve seen the most amazingly tortuous (in terms of cyclomatic complexity) code come from the smartest people.”

    In my working life, before I retired and regained my sanity, I wrote many time-critical hardware drivers (mostly assembly, as C-callable procedures) for various production machinery and telescopes. We had a standing rule that no mathematician, statistician, scientist, or astronomer was allowed to write ANY hardware code! This followed a rather spectacular crash of a telescope that attempted to simultaneously rotate the RA and Dec axes through 360 degrees! Cables??!! We don’t need no stinking cables!!!

  113. All CRU must do is just release everything. If emails are out of context, just show us the context. If this isn’t the production source code, show us the production source code.

    Just obey the FOIA laws and release the documents.

  114. Everybody who has a computer laughs about Harry, poor chap. And he was only trying to understand the geniuses from CRU.
    So let us offer him space in this blog to explain himself; maybe he is smarter than the rest of them put together.

  115. ” Scott A. Mandia (08:29:35) :
    …that shows how ridiculous the Climate Gate emails really are:”

    Agreed.
    One would expect that pompous, ridiculous fools write pompous, ridiculous emails?

  116. Nick Stokes (11:01:38) : The term “fudge factor” doesn’t indicate cheating, as some commenters imply.

    In physics and simulation science, a fudge factor is the factor you need to multiply your equation by to get a perfect match with experiments.

    After I wrote the sentence above I checked with Wikipedia.

    Fudge factors are invented variables whose purpose is to force a calculated result to give a better match to what happens in the real world.

    http://en.wikipedia.org/wiki/Fudge_factor

  117. Invariant (11:53:11) :

    If you derive your equations from the fundamental laws of physics, multiplying by a fudge factor is an unforgivable sin.

    On the other hand, if you are trying to tune your model, you can multiply the thermal conductivity k by the Nusselt number Nu to get the effective thermal conductivity, k*Nu, in, for example, natural (thermally driven) free convection. You are allowed to tune your model with different values of k to match experiments.

    However, it is highly unlikely that anyone would use the wording “fudge factor” for such tuning; a “fudge factor” is always a sign of dishonest science.
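
    Invariant’s legitimate-tuning example is just a multiplication: the effective conductivity in free convection is the molecular conductivity scaled by the Nusselt number. A one-line sketch with illustrative values (k roughly that of still air; the Nu value is arbitrary, since it depends on geometry and flow):

```python
k = 0.026   # W/(m*K), thermal conductivity of still air (approximate)
Nu = 4.0    # Nusselt number for some geometry/flow (illustrative)

k_eff = k * Nu  # effective thermal conductivity with convection
print(k_eff)    # 0.104
```

    The point of the distinction: Nu is a physically meaningful, independently measurable number, whereas a bare “fudge factor” is whatever makes the answer come out right.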

  118. woodfortrees (Paul Clark) (04:34:04) :

    Anna V: Does not

    ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********

    amount to much the same caveat? Certainly if I wanted to actually fudge something without anyone knowing, a 15-star comment wouldn’t be my first thought.

    I have a question for woodfortrees. Why do you assume the author of the comment didn’t want anyone to know about it? When in a den of thieves you don’t try to hide criminal exploits, you brag about them.

  119. Surely you have read Contact by Carl Sagan. Can we view the 165 MB of data from CRU as a message similar to the one received from the star Vega in Contact? Maybe we have just started to read out the prime numbers in our message?

    (We will manage to compile, execute and understand the source code.)

    “Suddenly they pick up a broadcast of non-random signals from the direction of the star Vega. These turn out to be coded prime numbers. This excites great government and military interest; however, Drumlin, now the national science advisor, comes in and commandeers the discovery. The message is found to contain a video signal of Adolf Hitler at the 1936 Berlin Olympics, which was the first television broadcast to be sent into space from Earth. But hidden inside the signal are found to be detailed engineering schematics. After much speculation, it is discovered that these are plans for the construction of a device for intergalactic travel.”

  120. Scott A. Mandia (08:29:35) :

    If you read the emails, or as many as you can – such as Briffa’s severe scepticism and his ambivalence towards Mann – it is odd that such a conclusion is drawn. It’s clear to me that Briffa gave in to the project, or was browbeaten into it. (That might be why he isn’t being implicated in the fraud.)

    they are interesting to read.

    Anyhow, shouldn’t this be a court case in the manner an Inconvenient Truth was taken to court?

  121. Scott A. Mandia (08:29:35)

    “Why would CRU need to fudge lines of code if their temperature record since 1979 shows warming that is fairly consistent with RSS and UAH”.

    reply

    This data did not show a warming trend originally. This was eventually calibrated against the “value added” ground and SST data to give it the warming trend that was being looked for. Satellites are better than thermometers as they cover large areas, so what did they do? They calibrated the readings against biased ground thermometer readings to give exactly the trend they were looking for.

  122. Reed Coray (12:08:13) :
    …
    When in a den of thieves you don’t try to hide criminal exploits, you brag about them.

    I see a different possibility: An honest programmer forced to do dirty work but who protests about it in the comments.

    And also this, truth is invaluable, even to liars.

  123. >>When you start believing that there is unprecedented
    >>twentieth century warming then it is obvious that the
    >>data is wrong and has to be corrected for. Simple.

    Especially when your job and income depend on doing so. Self-perpetuating societies and organisations always develop these traits.


  124. Just to make sure I understand the beef about hide the decline…

    Results from tree-ring analysis show that historic temperature trends diverge from real measurements in the latter half of the 20th century.

    So AGW-ites don’t use that data and instead use real measurements tacked onto the end of the tree-ring data to make the historic temperature charts in the IPCC’s reports. They do this because there were no temperature measurements prior to the 19th century. So they use “proxy” (pseudo-pretend) data for before the 1900s and real data for after.

    Since the tree-ring analysis diverges from measured data, it places the whole history of temperature in IPCC’s reports into question. i.e. if there is divergence now, how can we not assume there was divergence in the past that isn’t shown?

    Past divergence could have been up, or down. Correct?

    If past divergence is similar to recent divergence, then it would have been the case that tree-ring data analysis results are significantly lower than they should be. So it was likely much warmer in the past than is shown on the IPCC’s report.

    Or, it could have been colder.

    Since divergence seems probable, it is not possible to judge the existence or non-existence of a past MWP. So one plot showing the MWP is just as valid as another that doesn’t show it. Both are intractable.

    Is that the logic?

    If that’s all true, then unless ice core data, sediment cores, etc show a more reliable record then the only reasonably accurate temp record we have is the real data starting from the mid 19th century. Correct?

    So…. the beef is that we really don’t know the answer regarding AGW and won’t until we either have better, more reliable predictive models or we really do reach a tipping point and the world as we know it melts into oblivion.

    Given where we are with respect to controlling CO2 I’d say the situation is hopeless, Copenhagen is a waste of time, and we should start moving our cities underground or migrating to lunar settlements asap.

  125. Ok here is what I can make out from the post and the comments:

    1. The code headed “APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE” prints a graph, fudges values by up to 2.5, would have a significant impact on a data set, and produces the hockey stick graph shown above

    2. We do not know whether it was actually used until we have Briffa’s (of Yamal fame) raw data.

    3. woodfortrees (Paul Clark) (03:38:54) quotes Gavin of RC who says that “It was an artificial correction to check some calibration statistics to see whether they would vary if the divergence was an artifact of some extra anthropogenic impact. [can’t understand that – but maybe some statistician can] It has never been used in a published paper (though something similar was explained in detail in this draft paper by Osborn). It has nothing to do with any reconstruction used in the IPCC reports.”

    4. Paul Clark (03:38:54) – says this is reasonable. He programs video streaming software (in C?)

    5. anna v (04:20:35) : who writes Fortran code for particle physics says it is not.

    6. A bit of yes it is – no it aint between the two (and a whole lotta others who chip in – just opinion, a waste of time)

    7. The bottom line – we need the raw data to see whether it was in fact used or not. Gavin has been caught out saying things that were not true by Steve M before.

    8. [I thought that some raw data of Briffa’s has already been obtained by Steve McIntyre? Are we talking about the same raw data?]

    9. stevemcintyre (07:10:48) : Robert, can you produce the figures from the contemporary code? I can see if they match the relevant article.

    What does he mean by “the figures from the contemporary code”? Data from the code? or take Steve’s data and produce the graphs and then Steve M will see if they match the “relevant article”, (which I presume is the Yamal data that he has got?)

    Can someone enlighten me on this? Have I got things right?

  126. Doug (14:42:33) :

    No, you did not understand correctly.

    The AGW proponents worldwide, from HADCRU to NIWA to GISS, have been generally adjusting past temperatures downwards and recent temperatures upwards to create the false impression of ever-increasing warming and an even more rapid increase in the rate of warming. Whenever and wherever it has been possible to obtain raw temperatures not subjected to the Alarmist adjustments, the analysis shows the recent warming was not any worse, in the rate of increase or the absolute increase, than similar cycles which occurred in the 20th Century and earlier centuries, when humanity was not emitting significant quantities of carbon dioxide into the atmosphere. Global Warming, also known as Climate Change, is a manmade fraud of “robust” and “unprecedented” proportions. There is, in fact, not enough carbon dioxide present in the remaining fossil fuel reserves to cause Global Warming, even if you were to burn all of them in only one year. The hoax is ludicrous.

  127. “we need to COMPILE the code, EXECUTE the code to determine WHAT it is doing!”

    Indeed compilation and execution with a dummy dataset would be an initial step. After that one would need to use a known dataset and matching published CRU results to check reproducibility.

    In the majority of the discussion over the code (its evident poor quality) and version control (there appears to be none) one should not forget that the data appears to be in a similar state.

    I would hazard a guess that CRU do not have any precise record of the code and data versions used to obtain any of their published output. It’s all indicative of an exceptionally sloppy working methodology. Was their scientific methodology equally sloppy?

    “Complaining that the source code is poor quality is a dead end – please stop doing that!”

    Agreed but we also need to be alert to the fact that some of the coding errors already identified may well have not insignificant effects on the computed results.
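
    The reproducibility check AJC describes reduces to: run the same processing on a known input and compare against the published series. A toy sketch of that comparison (the processing step and all numbers are stand-ins; the real test would use CRU’s actual code and data):

```python
import numpy as np

def process(raw):
    """Stand-in for the CRU processing chain: convert to anomalies."""
    return raw - raw.mean()

raw = np.array([14.1, 14.3, 14.0, 14.6])           # known input (illustrative)
published = np.array([-0.15, 0.05, -0.25, 0.35])   # what the paper reports (illustrative)

# If rerunning the chain on the known input reproduces the published
# numbers within tolerance, the result is reproducible.
reproducible = np.allclose(process(raw), published)
print(reproducible)  # True
```

    Until a check of this shape can be run against CRU’s published output, the question of what the code actually produced stays open.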

  128. AJC (15:34:13) : “we need to COMPILE the code, EXECUTE the code to determine WHAT it is doing!”

    Indeed compilation and execution with a dummy dataset would be an initial step. After that one would need to use a known dataset and matching published CRU results to check reproducibility.

    Check reproducibility? What on earth for?

    The code HAS been compiled and run on a data set by Robert Greiner above: “the raw data could actually be temperature data or corporate sales figures, the result is the same; a severe manipulation of data.” We know what it is doing!

    What we do not know is if it has ever been used in a published paper or any reconstruction used in the IPCC reports. Gavin Schmidt says emphatically no.

    Now we have to see if that is correct. This may apply to the Yamal data that Steve M has got, so maybe we can verify that.

  129. Anna V: Does not

    ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********

    amount to much the same caveat? Certainly if I wanted to actually fudge something without anyone knowing, a 15-star comment wouldn’t be my first thought.
    ————

    No, but you might if your boss instructed you to write code that fudged data and you were unhappy about it.

  130. I agree with the spirit of the post by JJ and others who are cautioning us not to make strong accusations on aspects of this scandal where we lack really strong evidence. Our side has been given a winning hand, if we will just contain ourselves and let things play out naturally. (Investigations, congressional hearings, further defections, more articles like those in American Thinker and the Weekly Standard appearing in mainstream publications, shocked editorials in mainstream scientific journals, etc.) We need to keep the other side on the defensive by not overreaching and giving them an opportunity to counterpunch.

  131. On the VERY ARTIFICIAL CORRECTION topic:

    Back2Bat (13:56:27) : I see a different possibility: An honest programmer forced to do dirty work but who protests about it in the comments.

    D (16:36:18) : No, but you might if your boss instructed you to write code that fudged data and you were unhappy about it.

    Been there, done that.

    Sometimes the only fix available is a horrible one, and most programmers don’t want sloppy work to reflect on them. So you do it but leave a big enough trail so those that follow don’t think you’re a moron. Which is what I think “Harry” did.

    The question here is whether it was a fix or something more sinister.

    On to the topic of the poor quality of the source code:

    Invariant (02:02:29) :

    Complaining that the source code is poor quality is a dead end – please stop doing that

    I’m not sure I agree. I think the quality of the code does have some bearing. With the number of files involved, basically doing their own DBMS instead of using a commercial one, leaves lots of room for errors and unintended consequences. But yes, it would be good to run it and see what it does. Where’s that data?

    Back in my mainframe days in the 80s you couldn’t put a report program into production without a design review and operations review. I guess no such standards exist in the scientific world.

  132. In the meantime we can examine the logic –

    woodfortrees (Paul Clark) (03:38:54) :
    Robert (OP), the point you haven’t dealt with, ..is that this code was either used as “thought experiment” test of the calibration procedure for the Briffa tree ring data (the filename indicates this), or as a way of bootstrapping a correlation process, both of which are perfectly reasonable things to do

    “Thought experiment” ? What could the thought be? – Perhaps to apply “A VERY ARTIFICIAL CORRECTION FOR DECLINE” to test the calibration procedure for the Briffa tree ring data. We have seen on CA that the calibration procedure for the Briffa tree ring data did indeed lead to “A VERY ARTIFICIAL” hockey stick increase of temperatures, in place of a decline towards the end. So maybe this “Thought experiment”, which is all coded up and ready to go, was put into practice? and this is the very code that did it?

    What I don’t figure out is why this would be a very reasonable thing to do, especially when Gavin Schmidt denies it was ever actually done. Even he doesn’t think it would be a reasonable thing to actually put into practice.

    As for it being “a way of bootstrapping a correlation process”, I don’t understand that at all. It doesn’t make sense to me, but perhaps it would to someone more knowledgeable.

    woodfortrees (Paul Clark) (04:34:04) : Anna V: Does not

    ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********

    amount to much the same caveat? Certainly if I wanted to actually fudge something without anyone knowing, a 15-star comment wouldn’t be my first thought

    But the assumption is that the person was thinking about fudging something without anyone knowing. Maybe the person was pretty sure that no one would get to know of it except the person or persons who were supposed to? Maybe the thought that anyone else might see it never crossed his mind?

    Would Jones have written that he was going to delete data and advise others to do so if he believed it would be published for all the world to see?

    If he felt he was absolutely secure, as did Jones, Briffa etc., he would be more concerned about clarity than concealment. You can tell he wasn’t a particularly brilliant programmer.

  133. “Past divergence could have been up, or down. Correct?”

    Correct.

    Although the only divergence that we have documentation of is down …

    If the effect that is causing the current downward divergence is also operating anywhere in the past data, then previous warm periods would be clipped as well. Such a divergence could be caused by some real effect (such as a temp limit in the trees response to temp), or it could be an effect of the Team’s analysis methods. Given that their obvious intent is to clip pre-modern warm peaks, the divergence that requires ‘hiding the decline’ could be an artifact of ‘hiding the incline’ of the past.

  134. What emerges from the source code, data losses, emails, etc., is a sense that these climatologists believed they were playing a larger game than science: they were good shepherds, bringing us out of the dark. The same, with few qualifications, can be said of the media – which explains its spotty and off-center reporting of Climategate – and, for that matter, of many politicians and governments.

    See “Climategate: The good shepherds”:

    http://vulgarmorality.wordpress.com/2009/12/06/climategate-the-good-shepherds/

  135. “What emerges from the source code, data losses, emails, etc., is a sense that these climatologists believed they were playing a larger game than science: they were good shepherds, bringing us out of the dark.” vulgarmorality

    Bingo!

    Bring on the cold! Lord, please freeze some sense into people, yet not my will but Thine. Amen.

  136. Quote from Fox News Sunday panel discussion:

    “With regard to this Email issue … If you come back to the reality, what we find here is scientists trying to change the language, at times, to make it less controversial in terms of the political discussion, but not to change the facts on the ground; the facts on the ground being that, you know, we continue to emit too many carbon dioxides, and CO2, and other gasses and as a result we are capturing the heat from the sun and global warming is a reality….” Juan Williams, 6 Dec 2009.

    I believe this is a good example of the honest opinion that most of the current ‘elite’ media have of the hypothetical Carbon Dioxide Crisis and Climategate in particular.

    [REPLY - Maybe honest. But likely in error. (I like Juan.) It seems CRU may well have changed the "facts on the ground", namely the 1930s. And whether or not we emit too much CO2, which may have very little effect on temperature, has yet to be determined. ~ Evan]

  137. vulgarmorality (17:48:19) :

    What emerges from the source code, data losses, emails, etc., is a sense that these climatologists believed they were playing a larger game than science: they were good shepherds, bringing us out of the dark. The same, with few qualifications, can be said of the media – which explains its spotty and off-center reporting of Climategate – and, for that matter, of many politicians and governments.

    See “Climategate: The good shepherds”:

    http://vulgarmorality.wordpress.com/2009/12/06/climategate-the-good-shepherds/
    ================

    Here’s the full article, my favorite article ever on this whole business, which deserves a thread of its own because of its importance in derailing the blame-the-left meme here that Pam and others like savethesharks have objected to.

    The blame falls on “the anointed” — the self-consciously “aware and concerned,” highly educated cognitive elite who resonate with one another across all sectors of society and buy into one another’s rationales, tactics, credibility, and value-priorities.

    One of their main concerns is to avoid being consigned by their fellows in this group into the ranks of the “benighted” (the crudely selfish and ignorant), which is why accusations of being tainted by Big Oil or right-wing think tanks or Fox News or IDers or flat-earthers make such powerful and commonly used weapons by the groups’ mind-guards in keeping the rank and file in line.

    IOW, their motivations are partly idealistic, but also partly social and psychological, in that they want to be part of the leading edge of a high-status in-group, and also want to nourish and bask in the feeling of self-approbation that this reflected self-worth, and this perception of acting idealistically, gives them.

    Well, I could go on like this for pages, but enough.

    =================
    Climategate: The Good Shepherds

    http://vulgarmorality.wordpress.com/2009/12/06/climategate-the-good-shepherds/

    Belief in conspiracy theories, let me suggest, is more a matter of personality than of evidence. Temperamentally, I’m a conspiracy skeptic. I doubt there are many people on earth who can be devilishly clever.

    So when it comes to Climategate – the scandal triggered by the unauthorized release of thousands of emails and documents from East Anglia University’s Climatic Research Unit – I reject explanations that involve Machiavellian behavior. I can’t see the CRU as the hub of a global campaign to impose the political triumph of green policies.

    But the real explanation may turn out to be more serious and dangerous, because it casts a far wider net. It involves the climate bureaucrats at CRU and their American allies at NASA, NOAA, and elsewhere, many agencies in many governments and international organizations, and the mainstream media virtually everywhere. In my opinion, these people didn’t conspire together. They just think alike.

    They subscribe to a particular story about themselves and human society which is prevalent among highly educated people, and may well be the greatest threat to liberal democracy today. My name for the story is “rationalism”; Thomas Sowell called it the “unconstrained vision.” Some, including many who embrace it, associate this cluster of dogmas with the political left – but I believe it transcends such archaic labels.

    I want to be clear about this. I hold that many climatologists, politicians, and journalists share a number of operating assumptions, which in effect allows them to coordinate their actions without resorting to conspiracies. That these assumptions are self-serving is undeniable but here beside the point. They support the story of the elites as the good shepherds, and this in turn endows the believer with the moral authority for practically any action.

    Here are the logical pillars for the story of the good shepherds:

    A few of us are wise and good, but the average person is foolish and easily misled.
    The only moral imperative is human development, and the only path to human development is power in the hands of the wise and good.
    Information must be used by the wise and good, but withheld from the public to avoid panic and confusion.
    Society is a tissue of outworn traditions and superstitions, and must be rationalized according to scientific principles.
    Opposition to the wise and good can only come from selfish, corrupt forces and their dupes.

    Evidence of these principles in action abounds in the Climategate affair, and would fill more space than I have in this post – the CRU documents alone are 160 MB. What follows is by necessity selective and illustrative, which is to say, partial and incomplete.

    First, the climate scientists. We should think of them as scientist-bureaucrats, combining the analytic inclination of the former and the primal hunger for funding and prestige of the latter. Becoming saviors of the earth by using their educated brains must have been, from both perspectives, impossible to resist. Presidents and prime ministers were now their audience. Further, the names in the CRU documents comprise a surprisingly small group – maybe 50 persons, the power elite of climatology.

    Their emails depict a world misled by false prophets, in sore need of guidance: “I trust that history will give us all proper credit for what we’re doing here.” As good shepherds, they sought to keep control of the IPCC process, which – as ferocious turf warriors – they intuited to be of supreme strategic importance. If, to control the IPCC, journal editors must be purged, or the peer review process corrupted – well, the moral imperative trumped such quibbles. Critics were unscientific barbarians, whom one wishes to pummel and in whose death one rejoices. They must be denied data at all costs.

    The CRU group perpetrated fraud and abuses in perfectly good faith, out of concern for their flock.

    The IPCC represented the commanding heights of their work. It too made news, and provided cover to politicians who advocated costly good shepherd policies and needed a global authority for this purpose. The 2007 IPCC report obliged with a “Summary for Policymakers” brimming with authoritative dictums – “There is high agreement and much evidence” recurs like a mantra – and making the leap to policy recommendations. (By contrast, in the full report the words “uncertain” and “uncertainty” appear “1,300 times in 900 pages.”)

    The IPCC chair, Rajendra Pachauri, is nothing of a scientist but very much of a Torquemada, who responded thusly to criticism by a skeptical Bjorn Lomborg: “What is the difference between Lomborg’s views and Hitler’s?” Not surprisingly, Pachauri’s response to Climategate has focused on the “unfortunate” “illegal act” of divulging the CRU documents.

    For politicians, global warming is like manna from heaven. Unlike wars, recessions, or hurricanes, the crisis will come, if at all, in the far future, long after they have retired. Yet it allows them to make messianic speeches, demand increased powers, and hammer their opponents without mercy or restraint. They can point to the IPCC reports and play the good shepherds free of political risk.

    The role demands the use of unbridled language, as Mark Steyn amusingly demonstrates. These are elites talking to their foolish publics. They presume simple-minded exaggerations are all such people will understand. Critics are dismissed as illiterates – “flat earth” types, according to the UK’s Gordon Brown – or villains. They, the good shepherds, are wiser and nobler: thus Brown, Nicolas Sarkozy, and – possibly – President Obama transcend mere politics and assume the robes of philosopher-kings.

    Finally, the media. The story of the good shepherds is identical to the ideology of news, which assumes that, without journalists, the public will wallow in self-satisfied ignorance. Global warming was the sweetest kind of journalistic enterprise. It demanded that people be educated against their will. It inspired constant flattery and cajoling from the ultra-smart scientific set.

    Some years back the vice president of the Royal Society appealed to “all parts of UK media” to avoid skepticism about global warming. Shadowy people “on the fringes, with financial support from the oil industry” might try to corrupt journalists; they must resist. (Interestingly, the released documents reveal strong “financial support from the oil industry” in CRU research.) NYT science correspondent Andrew Revkin appears as “Andy” in the CRU emails. He is asked by climatologist Michael Mann, who is heaping scorn on those debunking his findings: “Fortunately, the prestige press doesn’t fall for this sort of stuff, right?”

    Climategate has been another blow to the skull of mainstream journalism. Coverage has been scant and bizarrely slanted. The best in my opinion has been the WaPo. Worst by far has been the BBC, which has become a sort of Pravda of global warming – calling it, in one particularly strange post-Climategate story, a “major cause of conflict in Africa.” But the typical MSM reaction has been muttering or silence. One need only recall the uproar from the Pentagon papers or the leaked Bush-era domestic surveillance materials, to realize how unnatural this behavior is.

    Against its own business interests, the media is looking away from a scandal. The reason, I suggest, isn’t conspiratorial but ideological. Journalists, like climatologists and politicians, despise the public and wish to become society’s good shepherds.

    The picture that emerges is that of elites in different domains supporting and reinforcing each others’ impermeability to public opinion. Climatologists demand funding and the silencing of reasonable criticism. Politicians promote huge government programs and relegate reasonable opposition to the Flat Earth Society. Journalists can deal in doomsday and be flattered by powerful and brilliant individuals. Nowhere, in all this, is there a place for the voter or the marketplace. Ordinary people are foolish and must be protected from themselves.

    And that should be the great concern of all. The story of the good shepherds leaves no room for liberal democracy – for a multiplicity of choices by free citizens. It’s top-down. It’s nakedly authoritarian. That so many smart people, in so many influential places, have bought into it should give one pause.

    I’d almost prefer an honest conspiracy.

  138. ‘Hide the decline’ with precorrected data files?

    If you grep for ‘artifi’ in the documents folder and its subfolders, you get 32 files.
    A hardcoded correction for the ‘decline’ (with the factors in the source) is found only in briffa_sep98_d.pro (dated 03/99) and briffa_sep98_e.pro (dated 09/98).
    It seems that later on CRU used another technique for the correction values: they are precalculated in the data files themselves.

    In more recent sources like data4alps.pro (dated 08/2008) you find:
    ;
    ; Writes an ASCII file with data (gridded, yes/no extended, corrected,
    ; yes/no ABD-adjusted, calibrated) for input to the Arctic synthesis update.
    ;
    doinfill=0 ; use PCR-infilled data or not?
    doabd=1    ; use ABD-adjusted data or not?
    docorr=1   ; use corrected version or not? (uncorrected only available
               ; for doinfill=doabd=0)
    missval=-9.99
    ;
    ; Get the calibrated data
    ;
    win
    ;
    print,'Reading reconstructions'
    if doabd eq 0 then begin
      if doinfill eq 0 then begin
        restore,'calibmxd5.idlsave'
        ; Gets: g,mxdyear,mxdnyr,fdcalibu,fdcalibc,mxdfd2,timey,fdseas
        if docorr eq 0 then fdcalibc=fdcalibu   ; <-- look here
      endif else begin
        restore,'calibmxd5_pcr.idlsave'
        ; Gets: g,mxdyear,mxdnyr,fdcalibc,timey,fdseas
      endelse
    endif else begin
      if doinfill eq 0 then begin
        restore,'../mann/calibmxd5_abdlow.idlsave'
        ; Gets: g,mxdyear,mxdnyr,fdcalibu
        print,'PROBABLY WANT THIS ONE'
        if docorr eq 0 then fdcalibc=fdcalibu   ; <-- look here
      endif else begin
        restore,'../mann/calibmxd5_abdlow_pcr.idlsave'
        ; Gets: g,mxdyear,mxdnyr,fdcalibu
      endelse
    endelse

    The statement

    if docorr eq 0 then fdcalibc=fdcalibu

    seems to overwrite the precorrected values (fdcalibc) with the uncorrected ones (fdcalibu) only when docorr = 0.
    Further processing here and in many other source files uses only fdcalibc.

    If you grep for 'calibmxd5.idlsave' (one data file), you get 34 matches. This data file is written in only one place; look at pl_calibmxd4.pro (dated 09/99):
    ;
    ; Now compute the calibrated values now that all boxes have coefficients
    ;
    for iyr = 0 , mxdnyr-1 do begin
      fdcalibu(*,*,iyr)=fdalph(*,*)+fdbeta(*,*)*fdcalib(*,*,iyr)
      fdcalibc(*,*,iyr)=fdalph(*,*)+fdbeta(*,*)*fdcorrect(*,*,iyr)
    endfor
    ;
    ; Now save the data for later analysis
    ;
    save,filename='calibmxd5.idlsave',$
      g,mxdyear,mxdnyr,fdcalibu,fdcalibc,mxdfd2,timey,fdseas
    ;
    end

    The variable fdcorrect is computed in only two source files, both named pl_decline.pro; the more recent one is dated 03/2004.

    I would like to examine these source files more precisely, but my IDL knowledge is limited; it's a travesty…
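    For readers who don't speak IDL, the flag logic quoted above from data4alps.pro can be sketched in Python. This is only an illustrative paraphrase, not CRU code: the function name is invented, and the point is simply that the 'corrected' series is the default and the uncorrected one is substituted only when docorr is switched off.

```python
# Illustrative Python paraphrase of: "if docorr eq 0 then fdcalibc=fdcalibu"
# (function name and data values are invented for this sketch)
def select_series(fdcalibu, fdcalibc, docorr=1):
    """Return the series that downstream plotting would use."""
    if docorr == 0:
        fdcalibc = fdcalibu  # fall back to the uncorrected values
    return fdcalibc          # further processing uses only this series

uncorrected = [0.1, 0.2, -0.3]
corrected   = [0.1, 0.2,  0.4]

assert select_series(uncorrected, corrected) == corrected       # default path
assert select_series(uncorrected, corrected, 0) == uncorrected  # docorr=0 only
```

    In other words, unless someone deliberately sets docorr=0, everything downstream sees the precorrected data.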

  139. Oops, I submitted too fast.
    pl_decline.pro:

    ;
    ; Now apply a completely artificial adjustment for the decline
    ; (only where coefficient is positive!)
    ;
    tfac=declinets-cval
    fdcorrect=fdcalib
    for iyr = 0 , mxdnyr-1 do begin
      fdcorrect(*,*,iyr)=fdcorrect(*,*,iyr)-tfac(iyr)*(zcoeff(*,*) > 0.)
    endfor
    ;
    ; Now save the data for later analysis
    ;
    save,filename='calibmxd3'+fnadd+'.idlsave',$
      g,mxdyear,mxdnyr,fdcalib,mxdfd2,fdcorrect
    ;
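    A note for those reading along: in IDL the expression "zcoeff > 0." is the maximum operator, so the adjustment is subtracted only where the coefficient is positive. A minimal Python sketch of that loop (with invented numbers, and one grid box instead of a 2-D grid) looks like this:

```python
# Minimal sketch of the pl_decline.pro adjustment loop, per year, for one
# grid box. In IDL, "(zcoeff > 0.)" means max(zcoeff, 0), so boxes with a
# negative coefficient are left untouched. All values here are invented.
def apply_decline_correction(fdcalib, tfac, zcoeff):
    return [v - t * max(zcoeff, 0.0) for v, t in zip(fdcalib, tfac)]

values = [1.0, 1.0, 1.0]        # flat series, for clarity
tfac   = [0.0, 0.5, 1.0]        # per-year "decline" offsets

# Positive coefficient: later years are pulled down by tfac
assert apply_decline_correction(values, tfac, 1.0) == [1.0, 0.5, 0.0]
# Negative coefficient: max(zcoeff, 0) clamps to zero, nothing changes
assert apply_decline_correction(values, tfac, -2.0) == [1.0, 1.0, 1.0]
```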

  140. Phil Jones’ famous e-mail (‘Mike’s Nature trick’, ‘to hide the decline’) is time-stamped “Date: Tue, 16 Nov 1999 13:31:15 +0000”.
    The source pl_decline.pro from FOIA\documents\osborn-tree6\mann\oldprog is dated 05-01-2000 (dd-mm-yyyy).
    There might be a relation between the e-mail and the source file.

  141. Did you see the code comment at the start?

    ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********

    Now, there are two possible explanations here:

    1. A comment in case at a later date they forgot they were committing fraud.

    2. A comment to make clear the code… applied a very artificial correction for decline.

    I guess which you believe depends on one’s pre-existing view of the integrity of the scientists in question.

  142. J. Bob (08:31:36) : I find it unimaginable a supposedly 1st rate research facility would tolerate this code. [...] The only time we would ever keep commented code in the source stream, is that it would be readily available to re-insert for whatever reason at a later date. Otherwise it was deleted. PERIOD.

    Many scientists and engineers have no interest in the extra effort needed to generate readable code; they are more interested in the results of the calculations than in keeping the source code elegant and beautiful. As long as they understand the code themselves they are happy; they simply ignore readability for the next guy. This behavior is nasty but true…

    But that’s not the major point. The major point is that poor source code is completely irrelevant here, imagine New York Times headlines “Copenhagen cancelled due to unreadable source code”… Well, that’s not so likely is it? :-)

  143. A closer look at the sources reveals a sort of daisy chaining of multiple programs to create the data files. First there is ‘calibmxd1’, ‘calibmxd2’ … up to ‘calibmxd5’.

    The source “pl_decline.pro” (where the correction is done) reads ‘calibmxd2’ and writes ‘calibmxd3’. “pl_calibmxd4” reads ‘calibmxd3’ and writes ‘calibmxd5’, which contains the corrected and uncorrected values.
    Plotting of the various graphs is based on ‘calibmxd5’; mostly the variable fdcalibc is used for preparing the plot. fdcalibc contains the ‘adjusted’ values ‘to hide the decline’.
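    That daisy chain can be sketched as a toy Python pipeline. The file contents below are placeholders (the real files are IDL save sets); only the read/write order follows the description above.

```python
# Toy sketch of the daisy chain: pl_decline reads calibmxd2 and writes
# calibmxd3 (adding the corrected series); pl_calibmxd4 reads calibmxd3
# and writes calibmxd5 with BOTH versions. Contents are invented strings.
def pl_decline(store):
    data = store['calibmxd2']                        # read stage 2
    store['calibmxd3'] = {'fdcalib': data,           # pass through
                          'fdcorrect': 'adjusted'}   # add corrected copy

def pl_calibmxd4(store):
    stage = store['calibmxd3']                       # read stage 3
    store['calibmxd5'] = {'fdcalibu': stage['fdcalib'],
                          'fdcalibc': stage['fdcorrect']}

store = {'calibmxd2': 'raw-calibrated'}
pl_decline(store)
pl_calibmxd4(store)

# Plotting then reads calibmxd5 and, by default, uses fdcalibc,
# i.e. the 'adjusted' series.
assert store['calibmxd5']['fdcalibc'] == 'adjusted'
```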

  144. Richard & others,

    I think the “thought experiment” was something along the lines of “what if the tree data didn’t suffer from the decline – how would that affect the correlation of the whole dataset to actual temperature?”. That doesn’t mean in this case that they ignored the decline, only that they wanted to know what its effect on their process was.

    The “bootstrapping” process is described in that Osborn paper quotation I posted above, but here’s how I understand what they did (as through a glass, darkly):

    1) Recognise (as is well known) that the post 1960 tree data doesn’t track real temperatures, and that if you include them in your analysis it’s going to screw things up. Assuming you just don’t want to give up and go home at this point, you have to do something about it.

    2) Then there are two ways of working out a correlation and a calibration between the tree ring widths/densities and real temperatures even though you know the post-1960 results are dodgy:

    a) Chop them off and only use the ones before 1960 (there is code that does this elsewhere)
    b) Fudge the post-1960 temperatures so they track real temperatures better.

    I’ll discuss the benefits and dangers of each of those later…

    3) This gives you a calibrated mapping – that is to say, it allows you to work out a temperature for a given ring width or density.

    4) Then you apply this mapping to the original *unmodified* data to give you your temperature reconstruction. You know the output of this is going to diverge post 1960, and needs to be replaced with real temperatures (or “hide the decline”, if you must), but at least it gives you something. Obviously you would document the process you did, as Osborn et al. did in their paper – which apparently was never published, so maybe the reviewers thought it was all a bit too dodgy – but it was all out in the open.

    OK, so let’s come back to 2(a) and (b) – the choice of whether to chop (“bobbit”, as we say in programming ;-) or fudge.

    2(a), bobbiting, gives you a more accurate view of the correlation/mapping for the period you leave in, but because the period is shorter it will be more sensitive to that particular period – as Osborn’s comment indicates.

    2(b), fudging, is really pretty horrible, granted, but it does leave the *short-term* post-1960 variations in place to correlate with, even though you’ve messed with the long-term trend – so it gives you a little less sensitivity to the time period.

    So when I say it’s “reasonable”, what I mean is, it’s a reasonable way to make the best of some data which has known, documented problems in recent years. Extracting information from imperfect data is what science (and signal processing) is all about. As long as it was done in the open, it’s fine. Now granted the code wasn’t published, and I agree it should be, but Osborn did summarise what it did in the paper.
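    To make the 2(a) route concrete: the steps 1–4 above amount to fitting a linear mapping on the pre-1960 overlap only, then applying it to the whole unmodified series. Here is a rough Python sketch with invented numbers (the actual CRU work was in IDL, gridded, and far more elaborate):

```python
# Sketch of option 2(a): calibrate tree-ring density against instrumental
# temperature using only pre-1960 years, then apply that linear mapping to
# the full (unmodified) density record. All data below are invented.
def fit_linear(xs, ys):
    # ordinary least squares for y = alpha + beta * x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    return my - beta * mx, beta

years   = [1940, 1950, 1959, 1970, 1980]
density = [1.0,  2.0,  3.0,  3.0,  2.0]   # diverges after 1960
temps   = [10.0, 11.0, 12.0]              # instrumental, pre-1960 only

pre1960 = [d for y, d in zip(years, density) if y < 1960]
alpha, beta = fit_linear(pre1960, temps)

# Step 4: apply the calibrated mapping to the *unmodified* full series
recon = [alpha + beta * d for d in density]
assert abs(recon[0] - 10.0) < 1e-9 and abs(recon[2] - 12.0) < 1e-9
```

    The post-1960 part of recon will still show the divergence; the argument above is only about how the calibration coefficients are obtained.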

    (BTW, I don’t know if the bit about programming video software in C (actually it’s C++) was meant as a jibe about my qualification to comment – if so, fair enough, if that’s all there was. But I have also spent the last two years building and maintaining woodfortrees.org, which gives me some insight into some of the concepts here, I think. But I don’t claim to be a Real Statistician!)

  145. One more thing I should add: I’m well aware that the result of removing the decline (in either way I described above) means the calibration point will be higher, and so historical values will be reconstructed lower, possibly reducing the MWP.

    Whether you think that matters depends whether you think the same process that is causing the decline now was also happening in the Middle Ages. If it’s purely temperature related, it might have done; if it’s some other anthropogenic or random effect, maybe it didn’t. I don’t know either way.

    But this an issue in dendrochronology which has been out in the open and much discussed (not least by Steve M) for a decade now. This code doesn’t change that argument in either direction, it just (predictably) demonstrates that CRU did have code to do what they said they were doing in their papers. Which in a way is a good thing, although it would have been better if the code was published to begin with – maybe they and others will learn that lesson now.

  146. Paul, I think that is a very good analysis. The “fudge” may have been more legitimate than it first appears. valadj is said to be a component of a principal components analysis. In some ways that is just another set of numbers to optimise a fit, but it may bring in more non-temp data.

  147. This was not convincing to me as I don't understand it. Could you please explain what you have found in simpler terms, understandable to somebody who doesn't understand a shit about programming? And please use simple words. I'm a stupid Swedish journalist, you know.

  148. Somebody somewhere knows how that code was used and if was in fact used to influence any peer reviewed journal articles.

    For them to simultaneously tout peer review, withhold data and methodology, and make the dubious claim that there is no ‘proof’ that this code was ever used is Kafkaesque.

    If this code made it into a published article, those who allowed it were engaged in fraud. For them to remain silent on the issue at this time should be the final nail in the coffin of their professional careers. For those who peer-reviewed any such articles, it should be a deathblow for their careers as well. Claiming to have peer reviewed something is different from actually having done so. That is tantamount to fraud as well.

    Somebody needs to produce a list of any papers that incorporated this data manipulation program. Failure by CRU/EAU to do so should be the end of tax funded research at this institution.

    This is really sick stuff.

  149. Tenuc (00:11:04) :

    To carry your analogy a bit further; we’ve got emails where they discuss disposing of the body (raw data), and now the body is missing.

    We’ve not only got method, motive and opportunity, we’ve got a missing body and emails where they openly discuss their intent to dispose of it.

    I don’t know how the science community works, but I know several prosecutors who would love to have this much to go on in the courtroom. If this were a criminal case (at least in the U.S.) we’d be talking about pleading guilty and turning state’s evidence in exchange for a life sentence. Somehow in the scientific community, a bunch of the co-conspirators bear character witness, with the understanding that that will be sufficient to render a verdict of not guilty.

    Finally, after having read the comments here, it seems this code is potentially a bit more innocuous than what I first interpreted, per se. However, in light of the missing raw data, emails that display a clear sense of contempt for the scientific method and actual criminal intent with respect to FOI, this code should be just another stick in the funeral pyre. Let us hope.

  150. WFT:

    “1) Recognise (as is well known) that the post 1960 tree data doesn’t track real temperatures, and that if you include them in your analysis it’s going to screw things up. Assuming you just don’t want to give up and go home at this point, you have to do something about it.”

    [snip]? You’re trying to come up with a proxy for real temperatures. If you are truly ‘recognising (as is well known) that post-1960 tree data doesn’t track real temperatures’, then it is time to give up and go home.

    That, or direct considerable effort to locating the mechanism that causes the divergence – taking care to diligently consider all possible mechanisms, not just ones that you can bluff and bluster into operating only post 1960.

    Those are the two legitimate, scientific responses to the ‘divergence problem’. The rest is nothing more or less than fraud dressing itself up as science. Ignoring adverse results is not any more scientific than hiding them, and it is only ever so slightly less egregious.

  151. JJ: I think if you find the tree data does track temperatures pretty well for 110 years, but doesn’t for the last 40, you could argue it’s not Game Over and try to do the best you can. But I agree, it certainly demands a rain check, and I haven’t yet seen anything that really satisfies me to explain it, either (but I’ve only been looking at this particular issue for a couple of weeks).

    My real point is just this: this whole issue with divergence was already well known to those in the field, and all that’s been “discovered” here is the code that implements techniques (dubious or not) that were already described by the authors in various papers. It’s interesting to discuss the finer points of dendroclimatology as to whether what they did was valid; but to claim this is a SMOKING GUN THAT RENDERS THE WHOLE OF AGW A FARCE (paraphrasing) is just overkill.

  152. WFT,

    “JJ: I think if you find the tree data does track temperatures pretty well for 110 years, but doesn’t for the last 40, you could argue it’s not Game Over and try to do the best you can.”

    That is not science! Anything can be proven by simply ignoring data that disagree with the desired conclusion. That is religionism. Faith healers have an impressive track record, if you don’t count the dead people.

    “But I agree, it certainly demands a rain check, and I haven’t yet seen anything that really satisfies me to explain it, either (but I’ve only been looking at this particular issue for a couple of weeks).”

    Don’t worry. The Team has been searching for the cause of the divergence with the same dogged determination that OJ has been applying toward finding the real killer. I’m sure an answer is just around the corner.

    “My real point is just this: this whole issue with divergence was already well known to those in the field, …”

    Should not some of your ‘real point’ be unmitigated anger that they have been ignoring adverse results for a decade?

    “… and all that’s been “discovered” here is the code that implements techniques (dubious or not) that were already described by the authors in various papers.”

    I agree that we don’t know the context of the code snippet dissected here, and have argued to that effect above. We risk losing the force of the point by perseverating over how they did it (if in fact that is what this code snippet represents), rather than concentrating on the fact that THEY DID IT.

    JJ

  153. woodfortrees (Paul Clark) (08:55:48) :

    JJ: I think if you find the tree data does track temperatures pretty well for 110 years, but doesn’t for the last 40, you could argue it’s not Game Over and try to do the best you can.

    I am not a scientist but this seems like nonsense to me because I do not see how data for 110 years can tell us anything definitive about conditions on a planet that has been around for an estimated 4.5 billion years. Am I wrong? Do we have any data that goes back 4.5 billion years?

  154. I think we are jumping to conclusions here. The dump contains 177 FORTRAN files, 655 PRO files, 24 RAW files, 848 RW files, 365 DAT files and 1157 OUT files – it’s obvious that the whistleblower who prepared this dump did not select a large number of irrelevant files…

  155. Just a note about the adjustment factor array. It sure looks similar to the correction factors “tobs” that is in use by our US temperature data massagers.

    Has anyone checked this out?

  156. JJ (09:30:52) :
    “I agree that we dont know the context of the code snippet dissected here, and have argued to that effect above. We risk losing the force of the point by perserverating over how they did it (if in fact that is what this code snippet represents), rather than concentrating on the fact that THEY DID IT.”

    I agree with your statement. But one argument was: “the corrected values are not used”.
    A little survey of the sources shows that corrected values are used all over the place, not only in briffa_sep98_e.pro. Usually there is a hint not to plot beyond 1960, but why “adjust” these values if you do not want to use them in any context?
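    One way to sketch that kind of survey (a hypothetical illustration: the directory name and file contents below are stand-ins built from the two lines quoted at the top of the post, not the real layout of the leaked harris-tree working tree) is a simple grep over the sources:

    ```shell
    # Recreate a miniature stand-in for the leaked working tree.
    # (The file name matches the one discussed; the contents are the
    # two lines quoted at the top of the post.)
    mkdir -p harris-tree
    cat > harris-tree/briffa_sep98_e.pro <<'EOF'
    yearlyadj=interpol(valadj,yrloc,x)
    densall=densall+yearlyadj
    EOF

    # List every IDL routine that applies the adjusted series:
    grep -rl "yearlyadj" harris-tree --include="*.pro"
    ```

    Run against the real dump, the same grep would list each .pro file that touches yearlyadj; here it finds only the one stand-in file.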

  157. “To carry your analogy a bit further; we’ve got emails where they discuss disposing of the body (raw data), and now the body is missing.”

    I see many claims to that effect…but where’s the PROOF?!

    It’s completely dishonest to make a subjective interpretation of an email that is taken out of context and then bandy that about as the truth.

    I see many claiming “We have emails showing they are hiding data!”

    No, you don’t… You have emails that SUGGEST they might…but the fact is they could also suggest many other things.

    I’m not sure exactly what data you speak of, but much of the data that was supposedly being “suppressed” is in fact available from the sources it originally came from!

    CRU collected data from various sources…so they might very well have been justified in saying it was not their place to give it out …..or there are many many legit reasons for them to “suppress” it.

    My biggest problem in this whole scandal is so called “skeptics” taking bits and pieces out of context and bellowing “THIS IS ABSOLUTE PROOF OF XXXX” ….based on a SUBJECTIVE INTERPRETATION.

    We need a thorough investigation and ALL The facts to come out before we can make any absolute claims.

  158. I just want to clarify that I think there is certainly a lot to investigate here, and there may have been wrongdoing etc.

    …but many people are trying to claim it is cut and dry:

    That there is absolute proof of wrongdoing… or they are blowing up the extent and severity of said wrongdoing…

    …and those people are either lying to everyone else or lying to themselves.

    They are a disgrace to the very idea of being skeptical.

  159. Another thing:

    So many are loudly proclaiming the fact that the scientists have motivation to fake the data…

    Where’s the proof of this?

    Where’s the proof that they are making more money than they would if they didn’t support the “conspiracy”?

    Also I’m surprised that some still don’t get it:

    The point is NOT “Was the data processed (or output or whatever) by this code published or used”

    the point is “Was this output data used WITHOUT people being told that it was processed by this function”

  160. WFT,

    “They were only trying to go back a 1000 years or so, so 110 years of data is a fair proportion of the total.”

    Which remains wholly irrelevant, given that 40-60 years don’t track temps at all. If 25-30% of the instrumented period bears zero or negative relationship to the tree rings, then the premise of a positive tracking relationship is falsified. Tree rings as a proxy are invalidated unless and until that is sorted out appropriately.

    TKI,

    “I agree with your statement. But one argument was: “the corrected values are not used”.”

    That was not my argument. My argument is that we do not know what, if anything, they were used for, let alone if they were ever used for anything published.

    “A little survey of the sources shows that corrected values are used all over the place, not only in briffa_sep98_e.pro.”

    Which tells us precisely nothing WRT the above.

    “Usually there is a hint not to plot beyond 1960, but why “adjust” these values if you do not want to use them in any context?”

    That is a question. Before making claims, one needs answers.

    JJ

  161. woodfortrees (Paul Clark) (15:00:37) :

    D: They were only trying to go back a 1000 years or so, so 110 years of data is a fair proportion of the total.

    —-

    11%? I’m sorry, I still think it’s nonsense. I’m a programmer and I would never draw any conclusions after analyzing 11% of the data.

    And why are we only going back 1000 years? Am I the only one that finds this odd??

  162. IMPORTANT? I think… A few days back I found the old charts and details for the Aussie temperatures going back many years; now they’ve gone missing. BOM is tidying up :-) hmmm?
    However, the page I downloaded from now goes to this:
    ftp://ftp2.bom.gov.au/anon/home/bmrc/perm/climate/temperature/annual/
    I downloaded the zips just in case, but my PC cannot open some of the others.
    This topic seems to dovetail with the similar story above.
    So: somebody? Anybody! Feel free to save and examine it all :-)
    What I saw before was 3(?) charts with a suspicious lack of rises in them, from places like Koolgardie, west Aus, and inland Qld.

  163. JJ (17:52:10) :
    “I agree with your statement. But one argument was: “the corrected values are not used”.”

    “That was not my argument. My argument is that we do not know what, if anything, they were used for, let alone if they were ever used for anything published.”

    Misunderstanding here, sorry for that; I didn’t think that was your argument.

    Many of the sources using adjusted values generate graphs, I think it’s likely a few plots were published. This point needs to be examined.

  164. woodfortrees (Paul Clark) (03:31:29) :
    (BTW, I don’t know if the bit about programming video software in C (actually it’s C++) was meant as a jibe about my qualification to comment – if so, fair enough, if that’s all there was. But I have also spent the last two years building and maintaining woodfortrees.org, which gives me some insight into some of the concepts, here, I think. But I don’t claim to be a Real Statistician!)

    Absolutely not! I meant no disrespect. I mentioned your programming video software in C (C++) because that’s what you said you did. I do know about your woodfortrees, which I use occasionally but don’t always understand.

    I was trying to sum up the blog and reason things out to the best of my ability. Thank you for the enlightenment.

    However I look at this: “So when I say it’s “reasonable”, what I mean is, it’s a reasonable way to make the best of some data which has known, documented problems in recent years.”, differently. And this is the way I look at it from a simple scientific principle perspective: (also through a glass darkly)

    We make a hypothesis – say these bristlecone pine tree ring widths are a measure of the temperature. We measure and compile the data, plot the temperature curve, and find that from 1960, when we have instrument records, the temperatures of our proxies diverge from those records: our temperatures go down whereas the thermometer readings go up. To me this indicates the hypothesis is weakened, and strictly speaking we should discard it.

    The proper scientific attitude would be not to say, “hey, this data has problems in recent years, but for the 900 years previously, when we had nothing to check it with, it was spot on.” No: it doesn’t agree with the temperature records, so our hypothesis is probably wrong and we can’t be confident that it represents the temperatures of the previous 900 years accurately.

    As it happens we do have something to compare these proxies with for the previous 900 years, and they do not agree with that data. And what is that? 771 studies from 458 research institutions in 42 countries showing that the Medieval Warm Period did exist, that it was widespread across the globe, and that it was on average about 0.75 C warmer than today.
    See here for the graphic on this: http://wattsupwiththat.com/2009/12/04/jo-nova-finds-the-medieval-warm-period/

    Against this we have the evidence of a few trees in a grove in the Rockies, which does not agree with the temperature records and which says no. This is confirmed and corroborated by another lonely tree in Siberia, with its own peculiar history, supposed to stand proxy for the temperatures of the whole world.

    Which would you believe? Which evidence is more credible?

    You say these emails do not reveal anything new. The “problem” has already been discussed at length in papers. But discussing the “problem” doesn’t make it right. The problems with the “problem” have also been pointed out at length. The emails may not provide ammunition against AGW, but they certainly provide vindication for the concerns voiced at length by the sceptics and frustratingly stonewalled.

  165. Another point: why would proxy tree ring data from a grove in the Rockies represent global temperatures? It would be well-nigh miraculous if they did. So even if they accurately represented the temperature history of their own grove (neck of the woods, so to speak), which is dubious as other factors also influence tree rings, would it be logical or meaningful to graft global temperature records onto the end of them?

  166. debreuil:

    Please do not denigrate “[snip] science”! …. I studied wildlife biology in college and took a very difficult course in scatology and am quite interested in paleoscatology.

    We all need to remember where the “science” of Climatology came from — basically it was spawned as an obscure branch of geography. Before AGW it was a total backwater of academia, populated mostly by those who could not endure the rigors of hard science. The study of geography has always been at odds with the “hard” sciences, mostly because geography tends to inject a distinctly “liberal” or “social science” aspect into them. My observation is that it is easy to see how climatologists have little regard for the scientific method (they never really learned it!) and how they are easily politicized and radicalized.

  167. This entire thread is an example of the uninformed leading the uninformed.

    Consider:

    “valadj” is some sort of filter, plotted in the link. “yrloc” is an array of years, from 1400 to 1994. So the output from this filter is a TIME adjustment, as suggested by the variable name, “yearlyadj”. It isn’t changing the temperatures at all, contrary to the stupid conclusion of the watts-up author and the sheep piling on.

    I repeat: the code has no impact on temperature, it is dealing with time, not temperature, and I suspect a gap between 1400 and 1904 in some data set.

    The comment:

    “valadj, or, the “fudge factor” array as some arrogant programmer likes to call it is the foundation for the manipulated temperature readings. It contains twenty values of seemingly random numbers. We’ll get back to this later.”

    is lunacy, it isn’t operating on temperature data!
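    For anyone who wants to test that reading directly: IDL’s interpol(v, x, u) resamples the values v (given at the abscissae x) onto the new abscissae u, so the years in yrloc and x only decide where the adjustment curve is sampled; the resulting values are added to the data series densall, as in the snippet quoted at the top of the post. A minimal NumPy sketch, using np.interp as the analogue of interpol — the knot years (1400, then 1904–1994, matching the gap noted above) follow the comments here, but the adjustment values are illustrative placeholders, not the originals:

    ```python
    import numpy as np

    # Illustrative stand-ins: 20 knot years with the 1400-to-1904 gap
    # mentioned above, and a 20-element adjustment array (placeholder
    # values, NOT the ones in the leaked file).
    yrloc = np.concatenate(([1400.0], np.arange(1904.0, 1995.0, 5.0)))
    valadj = np.array([0.0] * 5 + [-0.1, -0.25, -0.3, 0.0, -0.1,
                                   0.3, 0.8, 1.2, 1.7, 2.5,
                                   2.6, 2.6, 2.6, 2.6, 2.6])

    x = np.arange(1400.0, 1995.0)        # one sample per year, like the series

    # IDL: yearlyadj = interpol(valadj, yrloc, x)  ->  NumPy analogue:
    yearlyadj = np.interp(x, yrloc, valadj)

    # The adjustment is then ADDED to the data values (densall),
    # not to the years, per the quoted snippet densall=densall+yearlyadj:
    densall = np.zeros_like(x)           # dummy data series for illustration
    densall = densall + yearlyadj
    ```

    With these placeholders, yearlyadj carries one adjustment value per year (zero early on, rising toward the end of the series), and it is that array of values, not the time axis, which gets folded into densall.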

Comments are closed.