CRU Emails "may" be open to interpretation, but commented code by the programmer tells the real story

When the CRU emails first made it into news stories, there was an immediate reaction from the head of CRU, Dr. Phil Jones, over this passage in an email:

From a yahoo.com news story:

In one leaked e-mail, the research center’s director, Phil Jones, writes to colleagues about graphs showing climate statistics over the last millennium. He alludes to a technique used by a fellow scientist to “hide the decline” in recent global temperatures. Some evidence appears to show a halt in a rise of global temperatures from about 1960, but is contradicted by other evidence which appears to show a rise in temperatures is continuing.

Jones wrote that, in compiling new data, he had “just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (i.e., from 1981 onwards) and from 1961 for Keith’s to hide the decline,” according to a leaked e-mail, which the author confirmed was genuine.

Dr. Jones responded:

However, Jones denied manipulating evidence and insisted his comment had been taken out of context. “The word ‘trick’ was used here colloquially, as in a clever thing to do. It is ludicrous to suggest that it refers to anything untoward,” he said in a statement Saturday.

OK, fine, but how, Dr. Jones, do you explain this?

There is also a file of code in the collection of emails and documents from CRU. A commenter named Neal on Climate Audit writes:

People are talking about the emails being smoking guns, but I find the remarks in the code, and the code itself, more of a smoking gun. The code is so hacked around to give predetermined results that it shows the bias of the coder. In other words: make the code ignore inconvenient data to show what I want it to show. The code, after a quick scan, is quite a mess. Anyone with any pride would be too ashamed to let it out for public viewing. As examples [of] bias, take a look at the following remarks from the MANN code files:

Here’s the code with the comments left by the programmer:

function mkp2correlation,indts,depts,remts,t,filter=filter,refperiod=refperiod,$
datathresh=datathresh
;
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
;

pro maps12,yrstart,doinfill=doinfill
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;

And later, the same programming comment appears again in another routine:

;
; Plots (1 at a time) yearly maps of calibrated (PCR-infilled or not) MXD
; reconstructions
; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;

 

You can claim that an email you wrote years ago was “taken out of context”, but a programmer makes notes in the code to document what the code is actually doing at that stage, so that anyone who looks at it later can figure out why this function doesn’t plot past 1960. In this case, it is not allowing all of the temperature data to be plotted. Growing-season data (the summer months, when new tree rings are formed) past 1960 is thrown out because “these will be artificially adjusted to look closer to the real temperatures”, which implies some post-processing routine.
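
To make the mechanics concrete, here is a minimal Python sketch (not the CRU code; the variable names are made up for illustration) of what a “don’t plot past 1960” rule amounts to: the series is simply masked at 1960 before plotting, so whatever the post-1960 values contain never appears in the figure.

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical stand-ins for a calibrated MXD reconstruction of growing-season temperature
years = np.arange(1400, 1995)
mxd_recon = np.random.randn(years.size)   # placeholder values, not real data

keep = years <= 1960                      # the cutoff described in the code comments
plt.plot(years[keep], mxd_recon[keep])
plt.xlabel("Year")
plt.ylabel("Reconstructed growing-season temperature anomaly")
plt.title("Series truncated at 1960 before plotting (illustration only)")
plt.show()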

Spin that, spin it to the moon if you want. I’ll believe programmer notes over the word of somebody who stands to gain from suggesting there’s nothing “untoward” about it.

Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result.

For more details, see Mike’s Nature Trick

UPDATE: By way of verification….

The source files with the comments that are the topic of this thread are in this folder of the FOI2009.zip file

/documents/osborn-tree6/mann/oldprog

in the files

maps12.pro

maps15.pro

maps24.pro

The first two files are dated 1/18/2000, and the maps24 file 11/10/1999, so it fits timeline-wise with Dr. Jones’ email mentioning “Mike’s Nature trick”, which is dated 11/16/1999, six days later.

UPDATE2: Commenter Eric at the Climate Audit Mirror site writes:

================

From documents\harris-tree\recon_esper.pro:

; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
;

Note the wording here “avoid the decline” versus “hide the decline” in the famous email.

===============

I’ll give Dr. Jones and CRU the benefit of the doubt; maybe these are not “untoward” issues, but these things scream for rational explanations. Transparency, and the ability to replicate all of this years ago, would have gone a long way towards correcting problems and/or assuaging concerns.





480 Comments
Manfred
November 23, 2009 3:28 am

BBC loses all credibility.
An appalling and disturbing piece of nepotism:
http://news.bbc.co.uk/2/hi/science/nature/8371597.stm

Fred Lightfoot
November 23, 2009 3:29 am

E.M.Smith (00.47.58)
Still got access to that Cray? Wishful thinking.

debreuil
November 23, 2009 3:30 am

Posted this on CA, but I see I’m not the only one wondering here. I think it is a dataset merge and then an attempt at normalizing, but some of it is by year, so pretty weird.
Weirdest? This is printed when the program runs. Sometimes when programming you fudge like this to get clues to where you might be off; it is more legit if it warns you on output. Still, the final part sounds like they aren’t about to change it. Not sure what to think.
printf,1,’NOTE: recent decline in tree-ring density has been ARTIFICIALLY’
printf,1,’REMOVED to facilitate calibration. THEREFORE, post-1960 values’
printf,1,’will be much closer to observed temperatures then they should be,’
printf,1,’which will incorrectly imply the reconstruction is more skilful’
printf,1,’than it actually is.
….original msg….
What is the ‘decline’ thing anyway? It is in a lot of code, seems to involve splicing two data sets, or adjusting later data to get a better fit. Mostly (as a programmer), it seems like a ‘magic number’ thing, where your results aren’t quite right, so you add/multiply by some constant rather than deal with the real problem. Aka “a real bad thing to do” : ).
\FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro
printf,1,’Osborn et al. (2004) gridded reconstruction of warm-season’
printf,1,’(April-September) temperature anomalies (from the 1961-1990 mean).’
printf,1,’Reconstruction is based on tree-ring density records.’
printf,1
printf,1,’NOTE: recent decline in tree-ring density has been ARTIFICIALLY’
printf,1,’REMOVED to facilitate calibration. THEREFORE, post-1960 values’
printf,1,’will be much closer to observed temperatures then they should be,’
printf,1,’which will incorrectly imply the reconstruction is more skilful’
printf,1,’than it actually is. See Osborn et al. (2004).’
\FOIA\documents\osborn-tree6\briffa_sep98_d.pro
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
\FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro
; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
; but not as predictands. This PCR-infilling must be done for a number of
; periods, with different EOFs for each period (due to different spatial
; coverage). *BUT* don’t do special PCR for the modern period (post-1976),
; since they won’t be used due to the decline/correction problem.
; Certain boxes that appear to reconstruct well are “manually” removed because
; they are isolated and away from any trees.
\FOIA\documents\osborn-tree6\combined_wavelet_col.pro
;
; Remove missing data from start & end (end in 1960 due to decline)
;
kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
sst=prednh(kl)
\FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
\FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses “corrected” MXD – but shouldn’t usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
\FOIA\documents\osborn-tree6\mann\oldprog\pl_decline.pro
;
; Now apply I completely artificial adjustment for the decline
; (only where coefficient is positive!)
;
tfac=declinets-cval
fdcorrect=fdcalib
for iyr = 0 , mxdnyr-1 do begin
fdcorrect(*,*,iyr)=fdcorrect(*,*,iyr)-tfac(iyr)*(zcoeff(*,*) > 0.)
endfor
;
; Now save the data for later analysis
;
save,filename=’calibmxd3.idlsave’,$
g,mxdyear,mxdnyr,fdcalib,mxdfd2,fdcorrect
;
end
\FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro
;
; Plots density ‘decline’ as a time series of the difference between
; temperature and density averaged over the region north of 50N,
; and an associated pattern in the difference field.
; The difference data set is computed using only boxes and years with
; both temperature and density in them – i.e., the grid changes in time.
; The pattern is computed by correlating and regressing the *filtered*
; time series against the unfiltered (or filtered) difference data set.
;
;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
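
One reading note on the osborn-tree6\mann\oldprog\pl_decline.pro excerpt above: in IDL the “>” symbol is the element-wise maximum operator, so (zcoeff(*,*) > 0.) means “zcoeff floored at zero”, i.e. the artificial decline term is subtracted only where the coefficient is positive, just as the comment says. A rough Python equivalent of that loop (a sketch only; the array shapes and values here are hypothetical, not the actual CRU data) would be:

import numpy as np

# Hypothetical shapes, for illustration only
nlon, nlat, nyr = 72, 36, 120
fdcorrect = np.zeros((nlon, nlat, nyr))   # calibrated MXD field to be "corrected"
zcoeff = np.random.randn(nlon, nlat)      # per-gridbox decline coefficients
tfac = np.random.randn(nyr)               # decline time series minus a reference value

# Subtract the decline term, but only where the coefficient is positive;
# np.maximum(zcoeff, 0.0) plays the role of IDL's "zcoeff(*,*) > 0."
for iyr in range(nyr):
    fdcorrect[:, :, iyr] -= tfac[iyr] * np.maximum(zcoeff, 0.0)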

P Gosselin
November 23, 2009 3:33 am

Brazen!
“We also have a data protection act, which I will hide behind.”
http://directorblue.blogspot.com/2009/11/milli-vanilli-of-science-hacked-emails.html

Oscar Bajner
November 23, 2009 3:38 am

Sample from Harry_readme.txt
“In other words, the *anom.pro scripts are much more recent than the *tdm
scripts. There is no way of knowing which Tim used to produce the current
public files. The scripts differ internally but – you guessed it! – the
descriptions at the start are identical. WHAT IS GOING ON? Given that the
‘README_GRIDDING.txt’ file is dated ‘Mar 30 2004’ we will have to assume
that the originally-stated scripts must be used. ”
I could help the Hadley Harries with this. There is this software thing called “revision control” and I could set it up in an afternoon for them. (CVS, SVN, GIT, BZR … whatever they like) In two afternoons I could set up a really neat, scriptable control system. They could spend more time debugging and less time on [snip]
Thing is, I can’t give up my day job right now – I just got a promotion and from today I am in charge of the rotary buffer! W00t! Wal-Mart rocks.

Stuck-Record
November 23, 2009 3:39 am

“Alan the Brit (01:37:00) :
I am absolutely loving this! He he!!! The foul stench of bovine faecal contaminated science from the CRU is stomach turning. Of course they’ll point to the recent bad storms in Cumbria as hard evidence of Climate Change,”
They already have. 10 o’clock news and again on Today prog this morning.

Scouse Pete
November 23, 2009 3:49 am

It’s interesting, the BBC’s confused policy at the moment. They have reopened their blog site this morning with the specific message that no links to or extracts from the emails will be allowed. Yet Nigel Lawson was allowed to talk about it on BBC R4 this morning! (already linked above) And in The Mail this morning, 3 pages of stuff about it: a new story on Phil Jones (“Pioneer or Junk Peddler”) and then a Christopher Booker piece of 2 pages.
Nigel Lawson also has a piece in The Times this morning. So it seems it’s only Aunty Beeb fighting with its internal conflict on this issue, in which its editorial policy has been compromised due to its Editor-in-Chief, the DG Mark Thompson, being hoodwinked by Al Gore back in 2007 when he attended the personal presentation of his flawed PowerPoint presentation to BBC staff. Ever since then, he has dictated their current toothless policy from the top – in my opinion.
Time for him to go. Time for the BBC Trust to get involved.

debreuil
November 23, 2009 3:49 am

It says important note, but I guess I missed the memo.
\FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,’IMPORTANT NOTE:’
printf,1,’The data after 1960 should not be used. The tree-ring density’
printf,1,’records tend to show a decline after 1960 relative to the summer’
printf,1,’temperature in many high-latitude locations. In this data set’
printf,1,’this “decline” has been artificially removed in an ad-hoc way, and’
printf,1,’this means that data after 1960 no longer represent tree-ring
printf,1,’density variations, but have been modified to look more like the
printf,1,’observed temperatures.’

Ashtoreth
November 23, 2009 3:52 am

As a software engineer with 30 years’ experience, some of it working with government scientists, that code is horribly, horribly familiar…
The problem is that research scientists have done a programming course at some point in time. 99% consider themselves good coders as a result. 98% of them are wrong…
The initial flaw seems to be in the way they intend to use the software: it’s only for them (often not even for their colleagues), and as such is completely uncontrolled. Often changes are made without any record, changes on changes… and after a while, they aren’t sure any more why things happen the way they do…
Documentation? We don’t need that, it’s my programme, I know what it does. Maybe. Will you in 5 years? Experience shows you don’t…
This code is a classic example of this way of programming. Now fortunately, much of this type of coding is only used by one person, not designed for input to anything critical, as an aid for a researcher, for whom results trump everything. So while it’s bad practice, it doesn’t have too many disastrous effects. This time, however, it’s being used for predictions costing hundreds of billions of dollars…
Monckton is absolutely correct: we need to take the raw data and the calculations, and build new, verified models and data sets to see what is happening BEFORE we spend all this money. If these DO show AGW, fair enough. My money is on any AGW being so small it’s lost in the noise.

rbateman
November 23, 2009 4:00 am

Climate Change? Hah.
THIS is what I call real Climate Change:
Date  Id  Name  State  Latitude  Longitude  MaxTemp(ºF)  MinTemp(ºF)  ObsTemp(ºF)  Precip(in)
1892-01-03 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-04 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-05 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.02
1892-01-06 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-07 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.54
1892-01-08 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.32
1892-01-09 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.05
1892-01-10 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-11 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-12 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-13 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1892-01-14 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.005
1892-01-15 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0.005
nothing but precip data until…..
1934-09-27 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-09-28 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-09-29 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-09-30 47290 RED BLUFF CA 40.1833 -122.233 9999 9999 9999 0
1934-10-01 47290 RED BLUFF CA 40.1833 -122.233 85.0 51.0 79.0 0
1934-10-02 47290 RED BLUFF CA 40.1833 -122.233 82.0 51.0 79.0 0
1934-10-03 47290 RED BLUFF CA 40.1833 -122.233 84.0 54.0 76.0 0
1934-10-04 47290 RED BLUFF CA 40.1833 -122.233 91.0 50.0 76.0 0
1934-10-05 47290 RED BLUFF CA 40.1833 -122.233 100.0 47.0 79.0 0
1934-10-06 47290 RED BLUFF CA 40.1833 -122.233 82.0 62.0 66.0 0
1934-10-07 47290 RED BLUFF CA 40.1833 -122.233 74.0 54.0 64.0 0.11
crua6[/cru/cruts/version_3_0/db/testmergedb] grep -n ‘RED BLUFF’ tmp.0*.*
tmp.0612081519.dat:28595: 725910 401 1223 103 RED BLUFF USA 1991 2006 101991 -999.00
tmp.0702091122.dtb:171674: 725910 401 1223 103 RED BLUFF USA 1878 1980 101878 -999.00
tmp.0704251819.dtb:200331: 725910 401 1223 103 RED BLUFF USA 1878 2006 101878 -999.00
tmp.0704271015.dtb:254272: 725910 401 1223 103 RED BLUFF USA 1878 2006 101878 -999.00
tmp.0704292158.dtb:254272: 725910 401 1223 103 RED BLUFF USA 1878 2006 101878 -999.00
crua6[/cru/cruts/version_3_0/db/testmergedb]
The first file is the 1991-2006 update file. The second is the original
temperature database – note that the station ends in 1980.
It has *inherited* data from the previous station, where it had -9999
before! I thought I’d fixed that?!!!
Yeah, baby, you fixed it all right.
1 station data set smoked on the CRU barbie.

Robinson
November 23, 2009 4:02 am

As a Software Developer, I know that programmers often impart their stream of consciousness into the code in the form of comments. But from reading the above (particularly debreuil’s quotes), it seems clear to me there’s quite a substantial confirmation bias in their method.

John Finn
November 23, 2009 4:05 am

So come on, folks – time to nominate your favourite email. I realise we’re totally spoilt for choice but which ones stand out.
The ‘nature trick’ is definitely a contender and is far more damaging than Phil Jones and the press are trying to make out, but this only confirmed what I knew anyway.
The surprise for me was the Trenberth effort which includes the immortal lines “but the data are surely wrong. Our observing system is inadequate. ”
Comments on sceptic blogs often suggest that the warmers think that if the data doesn’t agree with the models then the data must be wrong. I always thought this was an unfair exaggeration. But it’s true. These people are beyond satire.

Cassandra King
November 23, 2009 4:12 am

The BBC are hiding behind the very flimsy excuse of “legal reasons” for why they cannot reveal details of the emails. The excuse is so transparently dishonest that I wonder if they are so desperate to deny people a chance to see the evidence that they would risk using such an obviously false reason for withholding the data.
I suspect that the BBC science and environment departments and reporters are very deeply involved with the scientists at the heart of the scandal; it must be clear that the BBC are covering up for the fraudsters for as long as it takes to either create a backup story or let the story fade away.
Whatever the motives of the BBC and their reporters, the longer the delay, the more suspicious that delay becomes. Perhaps the BBC are willing to take the risk of stonewalling and delaying the actual release of the data, considering the damage that releasing that data will have on the BBC.

dodgy geezer
November 23, 2009 4:17 am

Interesting quote from the baffled programmer trying to make sense of it all, and finally guessing.. (Harry’s txt)
“…The results are depressing. For Paris, with 237 years, +/- 20% of the real value was possible with even 40 values. Winter months were more variable than Summer ones of course. What we really need, and I don’t think it’ll happen of course, is a set of metrics (by latitude band perhaps) so that we have a broad measure of the acceptable minimum value count for a given month and location. Even better, a confidence figure that allowed the actual standard deviation comparison to be made with a looseness proportional to the sample size.
All that’s beyond me – statistically and in terms of time. I’m going to have to say ’30’.. it’s pretty good apart from DJF. For the one station I’ve looked at….”

jh
November 23, 2009 4:19 am

The previous Beeb link didn’t work for me; perhaps this might (the 07:35 am spot):
http://news.bbc.co.uk/today/hi/today/newsid_8373000/8373594.stm
Lawson suggests funding body NERC and VC should enquire into issues raised
– pigs might fly

Martyn B
November 23, 2009 4:22 am

I have to say this tickled me from HARRY_READ_ME.txt
“So, once again I don’t understand statistics. Quel surprise, given that I haven’t had any training in stats in my entire life, unless you count A-level maths.”

Stephen Wilde
November 23, 2009 4:24 am

Given the volume of material and the number of serious issues to be considered how can any of the participants show up in public without a barrage of embarrassing questions ?
Has the entire AGW community just been neutralised for all practical purposes ?
Unless we see an entirely new set of personnel, climate science will be frozen in the spotlight and cannot progress.

Arthur Glass
November 23, 2009 4:24 am

“…we are having trouble to express the real message of the reconstructions – being scientifically sound in representing uncertainty…”
Now ‘trouble to express’ *does* sound like an ESL grammar error. ‘Trouble expressing’ would be standard grammar.

Basil
Editor
November 23, 2009 4:25 am

Nick Stokes (20:34:52) :
I may be dense here, but what’s the issue? The red comment says “don’t plot beyond 1960″, because the results are unreliable. So is there any indication that anyone has plotted beyond 1960? This came up on the Bishop Hill thread, where he drew attention to an email by Tim Osborn where he said that they never plot some treering set beyond 1960 because of a divergence issue. Turns out that that is what Briffa/Osborn say also in Briffa et al 2001. This Briffa/Osborn context may be unrelated, but it seems to me that it may simply just mean what it says. Don’t plot beyond 1960 using this code. And people don’t.

Nick,
I think you are hanging your hat on the paleo/divergence issue. But it looks to me like HARRY_READ_ME is about the code used in CRU TS. I’m not certain about that, but I think we need to know. If so, then the Briffa et al literature acknowledging the divergence in paleo time series really doesn’t apply here. I.e., the “adjust for the decline” in the “Harry” code, and “Mike’s Nature Trick” are two different things.

debreuil
November 23, 2009 4:25 am

Ok, haven’t done fortran in 20 years, but if I read this right, it is creating a weighting hash for each 5 year period starting in 1904 (two arrays, 1st is year, second is weighting). The forties area are multiplied by as much as -.3, then in 1960 the ‘fudge’ creeps positive, up to 2.6 in 1980 onwards. It then interpolates this over the data. Please correct if this is wrong…
1904 0.
1909 0.
1914 0.
1919 0.
1924 0.
1929 -0.1
1934 -0.25
1939 -0.3
1944 0.
1949 -0.1
1954 0.3
1959 0.8
1964 1.2
1969 1.7
1974 2.5
1979 2.6
1984 2.6
1989 2.6
1994 2.6
1999 2.6
original code (\FOIA\documents\osborn-tree6\briffa_sep98_d.pro)
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,’Oooops!’
;
yearlyadj=interpol(valadj,yrloc,timey)
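
For anyone who wants to check that reading, here is a rough Python rendering of those last three IDL lines (a sketch only: it assumes IDL’s interpol does plain linear interpolation, and timey is a hypothetical stand-in for the series’ year axis). Note that yrloc pairs the first valadj entry with the year 1400 and the remaining nineteen with 1904 through 1994 in five-year steps, and that the whole vector is scaled by 0.75 before being interpolated:

import numpy as np

# Python sketch of the quoted briffa_sep98_d.pro lines, not the original IDL.
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904))   # 1400, 1904, 1909, ..., 1994
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75   # "fudge factor"

if yrloc.size != valadj.size:
    raise ValueError("Oooops!")               # mirrors the length check in the IDL code

timey = np.arange(1400, 1995)                 # hypothetical year axis for the density series
yearlyadj = np.interp(timey, yrloc, valadj)   # per-year adjustment, linearly interpolated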

November 23, 2009 4:26 am

A better case can’t be made for open science, open software… and honesty. Never trust programs done behind closed doors.
When the secrecy around temperature data, and the software that manipulates it, is higher than that for nuclear weapons design, there is something very wrong with the whole of climate science. The hoax is designed to scam people out of money and make others very rich off the hoax.
Science may never recover from this… Where is the virtue in science fraud?

November 23, 2009 4:26 am

rbateman (04:00:20) :
Sounds very interesting – could you perhaps explain a little more?
Thanks 🙂
Frank

Basil
Editor
November 23, 2009 4:32 am

Ashtoreth (03:52:14) :
Monkton is absolutely correct, we need to take the raw data, the calculations, and build new, verified models and data sets to see what is hapenning BEFORE we spend all this money.

While I agree, we need to realize that there is no longer any raw data, at least at CRU. So it — the raw data — will have to be acquired all over again. Given that this is now international politics, and not just academics cooperating in the interest of disinterested science, that may no longer be possible.
Can anyone tell me what the relationship between CRU TS and HadCRUT is? While there has been a bit of a kerfuffle over the fact that CRU is not Hadley, do the latter use data from the former in their product?

jh
November 23, 2009 4:34 am

This is Lawson’s think tank
http://www.thegwpf.org/
Membership a minimum £100
Lots of names you know on the advisory board

John Finn
November 23, 2009 4:39 am

Just watched the Politics Show on BBC1 (UK).
Fred Singer and Bob Watson (Chief Environmental Scientist) were interviewed by Andrew Neil. Not exactly a trouncing, but Singer got the easier ride and probably edged it. Watson was surprisingly agreeable and suggested an enquiry should be set up which looks into a) the ‘hacking’ of the emails and b) the contents of the emails.
