CRU Emails "may" be open to interpretation, but commented code by the programmer tells the real story

When the CRU emails first made it into news stories, there was an immediate reaction from the head of CRU, Dr. Phil Jones, over this passage in an email:

From a yahoo.com news story:

In one leaked e-mail, the research center’s director, Phil Jones, writes to colleagues about graphs showing climate statistics over the last millennium. He alludes to a technique used by a fellow scientist to “hide the decline” in recent global temperatures. Some evidence appears to show a halt in a rise of global temperatures from about 1960, but is contradicted by other evidence which appears to show a rise in temperatures is continuing.

Jones wrote that, in compiling new data, he had “just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (i.e., from 1981 onwards) and from 1961 for Keith’s to hide the decline,” according to a leaked e-mail, which the author confirmed was genuine.

Dr. Jones responded.

However, Jones denied manipulating evidence and insisted his comment had been taken out of context. “The word ‘trick’ was used here colloquially, as in a clever thing to do. It is ludicrous to suggest that it refers to anything untoward,” he said in a statement Saturday.

OK, fine. But then, Dr. Jones, how do you explain this?

There is also a file of code in the collection of emails and documents from CRU. A commenter named Neal on Climate Audit writes:

People are talking about the emails being smoking guns, but I find the remarks in the code, and the code itself, more of a smoking gun. The code is so hacked around to give predetermined results that it shows the bias of the coder. In other words: make the code ignore inconvenient data to show what I want it to show. The code, after a quick scan, is quite a mess. Anyone with any pride would be too ashamed to let it out for public viewing. As examples [of] bias, take a look at the following remarks from the MANN code files:

Here’s the code with the comments left by the programmer:

function mkp2correlation,indts,depts,remts,t,filter=filter,refperiod=refperiod,$
datathresh=datathresh
;
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
;

pro maps12,yrstart,doinfill=doinfill
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;

and later, the same programmer comment appears again in another routine:

;
; Plots (1 at a time) yearly maps of calibrated (PCR-infilled or not) MXD
; reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;

 

You can claim that an email you wrote years ago was “taken out of context”, but a programmer makes notes in code to document what the code is actually doing at that stage, so that anyone who looks at it later can figure out why this function doesn’t plot past 1960. In this case, it is not allowing all of the temperature data to be plotted. Growing-season data (the summer months, when new tree rings are formed) past 1960 is thrown out because “these will be artificially adjusted to look closer to the real temperatures”, which implies some post-processing routine.

Spin that; spin it to the moon if you want. I’ll believe the programmer’s notes over the word of somebody who stands to gain from suggesting there’s nothing “untoward” about it.

Either the data tells the story of nature or it does not. Data that has been “artificially adjusted to look closer to the real temperatures” is false data, yielding a false result.

For more details, see Mike’s Nature Trick

UPDATE: By way of verification….

The source files with the comments that are the topic of this thread are in this folder of the FOI2009.zip file

/documents/osborn-tree6/mann/oldprog

in the files

maps12.pro

maps15.pro

maps24.pro

The first two files are dated 1/18/2000, and the maps24 file 11/10/1999, so it fits the timeline of Dr. Jones’s email mentioning “Mike’s Nature trick”, which is dated 11/16/1999, six days later.

UPDATE2: Commenter Eric at the Climate Audit Mirror site writes:

================

From documents\harris-tree\recon_esper.pro:

; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
;

Note the wording here “avoid the decline” versus “hide the decline” in the famous email.

===============

I’ll give Dr. Jones and CRU the benefit of the doubt; maybe these are not “untoward” issues, but these things scream for rational explanations. Transparency, and the ability to replicate all this years ago, would have gone a long way toward either correcting problems or assuaging concerns.




November 24, 2009 12:49 pm

Anthony responded:

Trees do not make thermometers in all instances.

I agree absolutely. So, when we know that trees don’t work as thermometers, as in those post-1960 trendlines, it seems to me to be a better choice to substitute actual temperature readings. Is there a better solution?

Liebig’s law holds that plant and tree growth is limited by the least available growth factor. In some years that may be temperature, with a warm summer spurring growth, but in others it can be lack of water or lack of nutrients, and for long-lived species like Bristlecone pine, yes, even years of low CO2 can be a limiting growth factor. Bristlecones tend to frequent high and dry climates, where water can be just as influential as temperature.

Has anyone found any natural spot outside of a cave and above water where CO2 is the limiting factor? Even in the thinner atmosphere and drier climes of the Sierra Nevada, there is plenty of CO2 for bristlecones. Walter Muller wrote in one of the intro to botany texts I had that there was no shortage of CO2 even before the Industrial Revolution, and after, no chance. Do you know of any work that suggests bristlecone, or any other tree used in dendrochronology, would have faced a CO2 limitation in the past 80 million years?
My point was that something other than temperature limited the growth of the trees after 1960. Another correspondent in this thread says it was drought. I cannot imagine CO2 being the limitation, but were that so, it would still suggest the data after 1960 should be discounted, and Mann and his colleagues were wise to do so.

Without knowing which of the factors was in shortest supply in a year, or stretch of years, you can’t make a blanket claim that tree ring widths reflect only the temperature record. Untangling it all is not possible without the metadata, and we don’t have it in most cases. – Anthony

Exactly the point. For much of the time used, dendrochronology records may act as a proxy for temperature measurements, and in fact there is good correlation at many points where data come from other sources to corroborate. That correlation fails after 1960 in the dendro data, and so it would be misleading at best to include it without noting it is known to be unrepresentative at that point. For the purposes of a summary paper, why include that post-1960 data at all, especially since it was discussed at some length in earlier papers focusing on that issue?
It would be good to have dendrochronology records from many different forests in many different climates and diverse geological areas. Perhaps you could spearhead a research effort to look for contraindicating data, where it might be found? It would be fun to find trees under the Sahara sands, and it would be particularly informative to be able to see how well temperature tracked with desertification in those now-desert areas. I have hopes that with NASA’s and ESA’s ground-penetrating radar data, and stable governments to allow access, such data may be found and studied.

P Wilson
November 24, 2009 1:17 pm

Yet dendrochronology can differ significantly from instrumental data: in the case of Yamal, just 12 trees that showed a warming trend were selected, against the rejection of 34 in the same area that didn’t. Prior to this, Briffa argued that dendroclimatology is fraught with problems and that the Medieval Warm Period was probably as warm as today.
On the basis of these 12 trees, Briffa declares that the MWP was quite cold, but on the basis of the older 34 trees, the MWP was at least as warm as today.
It’s a hard one to fathom.

Ben
November 24, 2009 1:18 pm

To those who say that this is common practice in coding: here’s the opinion of a chemical engineer and programmer.
It is a very bad thing to merge datasets from two different instruments unless they are well calibrated. Every time you replace an environmentally sensitive instrument, the EPA has a stated range you have to calibrate it within (5% for CO2 measurements, as low as 0.5% of range for incinerator temperatures). This is extremely important for proxy data that is replaced by measured data. You leave the proxy running alongside the measured data in order to compare them. Unfortunately, the proxy is sometimes way off. You don’t then tack them together and say (for example) “we had a 20% increase in flaring after the flowmeter was installed”; you say “our calculations were understating our flaring by 20%”.
If the proxy can be adjusted to reasonably predict the value, then you can adjust it for ALL PRIOR DATA and continue. However, if your proxy is completely uncorrelated with the real data, or the adjustments are too extreme (as they are in the temperature proxy), then you start your graph at the installation of your instrument and throw out your proxy data as worthless.
Doing otherwise shows a lack of scientific integrity.
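[Editor’s note: the recalibration Ben describes — fit the proxy to the instrument over their overlap, then apply that one fixed correction to all prior proxy data — can be sketched in Python. The function name and the ordinary-least-squares form are illustrative assumptions, not anything taken from the CRU files.]

```python
def recalibrate_proxy(proxy, measured):
    """Fit proxy to instrument over their overlap with ordinary least
    squares, then return a correction to apply to ALL the proxy data,
    early values included (rather than splicing the two series).
    `proxy` and `measured` are equal-length sequences over the overlap."""
    n = len(proxy)
    mx = sum(proxy) / n
    my = sum(measured) / n
    sxx = sum((x - mx) ** 2 for x in proxy)
    sxy = sum((x - mx) * (y - my) for x, y in zip(proxy, measured))
    slope = sxy / sxx
    intercept = my - slope * mx
    # The same affine correction is used for every proxy value, old or new.
    return lambda x: slope * x + intercept
```

If the fitted correction is extreme, or the residuals are large, that is the signal (per Ben’s rule) to discard the proxy rather than adjust it.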

P Wilson
November 24, 2009 1:30 pm

Actually its easy to fathom. Just reject difficult data from the peer reviewing process

eaglewingz08
November 24, 2009 1:33 pm

Don’t you understand the facts can be wrong just as long as the story is true or SHOULD be true. (See Rathergate for full explication of liberal/progressive psychosis).
Since capitalism must fail and should fail and socialism/communism must win and should win, then global warming/climate change, as the stalking horse of the commies must be supported and all facts ‘massaged’ to fall into line (or at least into a hockey stick power point presentation). Those ‘reporters’ and politicians and bloggers who believe this is much ado about nothing (remember Van Jones and Anita Dunn and ACORN videos were so characterized) are whistling past the graveyard. Progressivism is built on lies, global warming as one of its most malicious creations is built by lying meretricious ‘scientist’ propagandists.

P Wilson
November 24, 2009 1:52 pm

Here’s a fix: If tree rings show continued growth but temperatures are on the decline, use them as proxies for temperature. If they show limited growth but temperatures are increasing according to the instrumental record, then use the instrumental record. That way, you get a *smooth* upward curve

E.M.Smith
Editor
November 24, 2009 4:05 pm

Kathleen S. (05:09:55) :
I had always been taught that one should use scientific method to collect, collate data. Strange that these over funded people seem to ignore even the most basic of scientific methods!
For those at CRU, go to this site to learn how to implement said method – http://www.sciencebuddies.org/mentoring/project_scientific_method.shtml (it has a pretty, colourful bubble step plan for them and everything!!! >;))
I don’t have colorful bubbles but I do have colorful images of CO2 concentrations, greenhouse gas transmission windows (CO2’s is nearly saturated) in addition to the introduction to the scientific method. I wrote it for people new to the field and a select class of scientists who have forgotten it. See Science, Method, Climatology, and Forgetting the Basics.

Andrew
November 24, 2009 5:32 pm

This is interesting:
From the program file FOI2009/FOIA/documents/harris-tree/briffa_sep98_e.pro
;
; PLOTS 'ALL' REGION MXD timeseries from age banded and from hugershoff
; standardised datasets.
; Reads Harry's regional timeseries and outputs the 1600-1992 portion
; with missing values set appropriately. Uses mxd, and just the
; "all band" timeseries
;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
; Now normalise w.r.t. 1881-1960
;
mknormal,densadj,x,refperiod=[1881,1960],refmean=refmean,refsd=refsd
mknormal,densall,x,refperiod=[1881,1960],refmean=refmean,refsd=refsd
;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
;
; Now plot them
;
‘Oooops!’ indeed. I wonder what the ‘plot’ was?
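[Editor’s note: for readers who don’t speak IDL, here is a rough Python rendering of what the quoted lines appear to do. NumPy’s `interp` stands in for IDL’s `interpol` (both default to linear interpolation); the function name is my own, and this is a sketch of the quoted fragment, not the full program.]

```python
import numpy as np

# The adjustment nodes from the quoted IDL: 20 values at years
# 1400, 1904, 1909, ..., 1994, each scaled by 0.75.
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75
assert len(yrloc) == len(valadj)  # the IDL's "Oooops!" length check

def apply_artificial_correction(years, series):
    """Interpolate the adjustment onto each year and add it to the
    series, mirroring yearlyadj=interpol(valadj,yrloc,x) followed by
    densall=densall+yearlyadj."""
    yearlyadj = np.interp(years, yrloc, valadj)
    return series + yearlyadj
```

Note what the node values imply: the “correction” is zero or slightly negative up to the 1930s, then ramps steeply positive after about 1950, peaking at 2.6 × 0.75 = 1.95 by the 1980s.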

Tardkiller
November 25, 2009 2:10 am

Guido Fawkes Blog is now covering this.
His weekly readership is more than some of the MSM papers’ monthly figures, and as for his monthly figures, well, let’s just say it’s well read.
http://order-order.com/2009/11/25/time-to-defund-crus-global-cooling-deniers/

tonydej
November 25, 2009 3:24 am

If I did not know better, from reading this extraordinary thread, I might give in to the urge to say: Behold the World, like unto a hockey stick: Man has lost his Way and embraced False Prophets and the End of the World is nigh, and the Lord will send Flood and Fire to purge it, (unless we all bow down and accept the need for a ‘co-ordinated response’).

Ian W
November 25, 2009 7:58 am

The ‘decline’ shows up a problem with the proxies, and that is a problem for the entire edifice of AGW research.
The proxies failed validation against reality.
This means that they are USELESS, and ALL the papers that use these proxies for temperature (or whatever) are of no value; they are fakes. It is no wonder that CRU didn’t respond to the plant scientist who pointed this out.
All the arguments about the statistical significance of the number of trees in Yamal are a total waste of time – the base data used by those statistics was just random noise. So Briffa’s ‘seminal paper’ might as well be just invented values – it is a worthless fairy story.
What I find totally disappointing is that this aspect has not been flagged up and shouted from the rooftops.
So I will type it again – ALL papers that base their conclusions on proxy data are now worthless. None of their conclusions should be used until there is complete and independent validation that the proxies used represent the variable they claim to be proxies for.
Failure to validate proxies is a FUNDAMENTAL failure of scientific method, and should have been picked up in peer review – but we can see now that ‘peer review’ was just a way of research centers passing on ‘tricks’ to each other to hide the proxy failure!
This was the real reason that Phil Jones had to “hide the decline” – it showed that the whole AGW edifice based on the papers from CRU was built on invalid assumptions.
WHY IS NO-ONE POINTING THIS OUT?

Rob
November 25, 2009 9:13 am

I don’t understand how you can still call conspiracy on AGW after this. I mean, surely if there was one there would be evidence in these emails. Have you found Al Gore checking in to make sure everyone’s toeing the party line? No?
Then the best you have (and I think this is a bit of a stretch) is some positive affirmation in one or two data-sets. No evidence of a New World Order conspiracy, no admission of a hoax. Unless I’ve missed something?

edward
November 25, 2009 9:23 am

Expose the code and bust the Anti-Trust Climate Team
Busted not Robust!
Shiny
Edward

John Ferguson
November 25, 2009 9:45 am

Been looking into the code and data held in the \documents\cru-code\f77\mnew directory. Here is what I have found:
The data sets are master.dat.com and master.src.com; master.src.com is the important file. Don’t open these without changing the extension to .txt, otherwise Windows interprets them as executables, and you won’t be able to view them properly anyway. I could send copies capable of being opened in Windows. These contain monthly weather station data with one row per year. I don’t know the exact nature of these files, but some of the data does relate to sunlight duration. A site in Finland suggests master.src.com is temperature related, but there’s a lot of speculation flying around the Internet regarding the leaked files at the moment, so I can’t be certain.
There are 3526 stations in all and 2578488 monthly observations. -9999 in a field means the observation for that month is absent. There are 269172 (10%) missing observations in master.dat.com and 14226 completely missing years. The programs are designed to completely ignore any years with no observations. In total there are 200649 rows (years) of observations, which should equate to 2407788 months; however, due to some years having up to 11 missing months, there are 2309316 monthly observations used. Now, what’s interesting is how these missing months are processed. Programs such as split2.f, where a year contains one or more missing months, actually invent the figures using the following heuristic:
If a month is missing try to infill using duplicate. if duplicates both have data, then takes a weighted average, with weights defined according to inverse of length of record (1/N)
That’s from the comment at the start of split2.f
What this really means is that more than 4% of the data is being completely fabricated by at least some of the Fortran data-processing programs. If this were done in other disciplines it would be extremely questionable.
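[Editor’s note: a hedged Python sketch of the infill heuristic as John describes it. The function name, the -9999 sentinel handling, and the (value, record-length) representation of a duplicate station are assumptions for illustration; split2.f itself is Fortran and may differ in detail.]

```python
MISSING = -9999  # sentinel used in the station files for an absent month

def infill_month(primary, duplicates):
    """If a month is missing, fill from a duplicate station record; if
    several duplicates have data, take an average weighted by 1/N,
    the inverse of each duplicate's record length in years.
    `duplicates` is a list of (value, record_length) pairs."""
    if primary != MISSING:
        return primary
    have = [(v, n) for v, n in duplicates if v != MISSING]
    if not have:
        return MISSING  # nothing to infill from
    weights = [1.0 / n for _, n in have]
    return sum(w * v for w, (v, _) in zip(weights, have)) / sum(weights)
```

The 1/N weighting means a short duplicate record pulls the infilled value toward itself more strongly than a long one does.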
Also did notice quite a few programs, especially in the documents\cru-code\idl\pro directory are designed to process data deemed anomalous, though this isn’t necessarily suspicious.
This is the header comment from documents\cru-code\idl\pro\quick_interp_tdm2.pro
; runs Idl trigrid interpolation of anomaly files plus synthetic anomaly grids
; first reads in anomaly file data
; the adds dummy gridpoints that are
; further than distance (dist)
; from any of the observed data
; TDM: the dummy grid points default to zero, but if the synth_prefix files are present in call,
; the synthetic data from these grids are read in and used instead
What is ‘synthetic data’ and why might it be applied to dummy gridpoints away from genuine observation points? This could be a recognised statistical procedure, or data massaging, or creating more observations out of thin air to skew certainty levels, just can’t tell and don’t have time to look at anything else in depth right now. Like it says, e-mails can be open to interpretation but it’s the code and what it does to the raw data which really matters. The comment in the Mann code described in the link below is a work-around to a recognised issue with dendrochronology data. During 1960s the correlation coefficient between tree growth rate and temperature altered.
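[Editor’s note: as a purely illustrative guess at what the quoted header describes — not a transcription of quick_interp_tdm2.pro — the dummy-gridpoint step might look like the following; all names and the point representation are assumptions.]

```python
import math

def add_dummy_gridpoints(obs, grid, dist, synthetic=None):
    """Grid points farther than `dist` from every observation get a
    dummy anomaly: zero by default, or a value from a synthetic grid
    when one is supplied (the synth_prefix case in the header).
    `obs` is a non-empty list of (x, y, value); `grid` is [(x, y)]."""
    out = list(obs)
    for i, (gx, gy) in enumerate(grid):
        nearest = min(math.hypot(gx - ox, gy - oy) for ox, oy, _ in obs)
        if nearest > dist:
            value = synthetic[i] if synthetic is not None else 0.0
            out.append((gx, gy, value))
    return out
```

Whether this amounts to a recognised statistical safeguard (anchoring the interpolation far from data) or to manufacturing observations depends entirely on where the synthetic values come from, which is exactly John’s question.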
The recent ERBE results are really significant, the discrepancy between IPCC modelled values and the real world figures is quite something.
http://wattsupwiththat.com/2009/11/22/cru-emails-may-be-open-to-interpretation-but-commented-code-by-the-programmer-tells-the-real-story/

TitaniumDragon
November 25, 2009 1:19 pm

“Ok fine, but how Dr. Jones, do you explain this?”
You need to understand the data source in question before you jump to conclusions the way you did. Without understanding the data, you cannot understand the program, because the program is made to deal with data.
The data source in question is tree-ring data. If you go back to the paper which originally generated it, it states that after 1960 tree rings no longer correlate accurately with temperature. This is hardly surprising, given the increase in herbicide, pesticide, and other pollution which occurred during the 1950s and has continued through to today; we know from the record that things which simulate pollution (volcanic activity, for example) can have a similar impact on tree-ring data.
So, fundamentally, they don’t use the data after 1960 for the same reason that you wouldn’t use data in a chamber which is supposed to be held at a constant humidity if the coolant pipes broke and the humidity rose to 100% – the data has been corrupted. Claiming that this is something shady is to not understand why the “trick” was not used in the first place.
The reason it isn’t untoward isn’t that they didn’t exclude the post-1960 tree-ring data (they did), but that excluding the post-1960 tree-ring data was correct – it no longer correlated with every other temperature record, especially the most reliable one, the instrumental record. It’s not really relevant, because the instrumental record post-1960 is quite good, so losing the proxy is not really an issue.
And indeed, this is well known. The ORIGINAL PAPER regarding the proxy data indicated it was useless post-1960. The decline in question is that the tree rings are smaller than they should be for the summer temperatures of the environments they were in, as compared to how thick they were previously.

November 25, 2009 4:19 pm

Code doesn’t lie. I think the answer is simple. We have a computer model to predict future climate. Run it backwards: can it predict the past accurately? If so, we have a good approximation of the natural world. If not, then you have a failed equation. That means back to the drawing board. Hire a better programmer, or better yet, make it open source. This is science, not politics. Use the scientific method; it works pretty well.

Gail Combs
November 25, 2009 4:41 pm

Gordon Walker said:
“….And Yamal is an arctic wasteland where trees grow for about 15% of the year.
15% of 15% is just over 2%!
How much of a representative sample of the earth’s climate is that?”

Yeesh, that’s a little over a month! Just this summer I went close to three weeks without rain in July. With such a short growing period, the influence of factors other than temperature would be exaggerated, since there is no time for “smoothing” over the growing season. On top of that, didn’t someone mention only 10 trees were used? Heck, my forester took core samples of 25 when he estimated the age and growth of my stand of timber!
(By the by, expect those fire ants in Virginia soon; they are already showing up north of Raleigh, NC, and frost DOES NOT kill them….)

TitaniumDragon
November 25, 2009 5:59 pm

Gail, you should never, ever eat a number like “10 trees” without thinking about it first. How many trees did they SAMPLE to start out with? That’s the real question. And how did they choose them, for that matter?
You need to remember that there are VERY stringent requirements on what trees you can use – they have to have been and always been healthy, had to have been taller than nearby trees their whole usable life, etc., because if they fail any of those you have severe confounding factors.
So in reality, 10 trees may be perfectly fine, so long as they were the correct 10 trees to sample.
And they aren’t the only metric used anyway. Not sure where that idea came from… oh wait, yes I do. People who have no idea what they’re talking about (or, liars with agendas – pretty common in the denier community).
See also this very post, where he whines about something which is well understood by people in the know but OMG CONSPIRACY to people who are clueless.

Gail Combs
November 26, 2009 1:22 am

E.M.Smith said
“JNL (22:29:08) : I’m a statistical programmer for “BIG PHARMA” . For every new drug application, the FDA requires that we give them:…
Nice List.
FWIW, I’ve done “qualified installs” for Pharma companies.
What the non-Pharma folks might not know: For every single bit of hardware and software used for all the stuff JNL listed, it must be installed “just so”. Every Single Step of Every Single Procedure must be defined in advance. Even if it is just “Open box. Plug in cord. Turn power switch on.”…”

As a Certified Quality Engineer/Chemist who also worked in FDA-audited factories, I can verify what is said here. FDA tells you how many mice, how many grams, and how to test…. EVERYTHING must be documented and verified with double sign-offs.
The “AGW climate” work shown in this release is so sloppy it does not qualify as “science” – FRAUD maybe, but science, no.

Gail Combs
November 26, 2009 2:19 am

John Finn said:
“So come on, folks – time to nominate your favourite email. I realise we’re totally spoilt for choice but which ones stand out. “
There are so many to choose from but I like this the best because it so clearly states the problems with AGW:
[mailto:geoengineering@xxxxxxxxx.xxx
] *On Behalf Of *David
Schnare
*Sent:* Sunday, October 04, 2009 10:49 AM
*Cc:* Alan White; geoengineering@xxxxxxxxx.xxx
*Subject:* [geo] Re: CCNet: A Scientific Scandal Unfolds
Gene:
I’ve been following this issue closely and this is what I take
away from it:
1) Tree ring-based temperature reconstructions are fraught with
so much uncertainty, they have no value whatever.
It is
impossible to tease out the relative contributions of rainfall,
nutrients, temperature and access to sunlight. Indeed a single
tree can, and apparently has, skewed the entire 20th century
temperature reconstruction.
2) The IPCC peer review process is fundamentally flawed if a
lead author is able to both disregard and ignore criticisms of
his own work, where that work is the critical core of the
chapter.
It not only destroys the credibility of the core
assumptions and data, it destroys the credibility of the larger
work – in this case, the IPCC summary report and the underlying
technical reports. It also destroys the utility and credibility
of the modeling efforts that use assumptions on the relationship
of CO2 to temperature that are based on Britta’s work, which is,
of course, the majority of such analyses.
As Corcoran points out, “the IPCC has depended on 1) computer
models, 2) data collection, 3) long-range temperature
forecasting and 4) communication. None of these efforts are
sitting on firm ground.

Nonetheless, and even if the UNEP thinks it appropriate to rely
on Wikipedia as their scientific source of choice, greenhouse
gases may (at an ever diminishing probability) cause a
significant increase in global temperature. Thus, research,
including field trials, on the leading geoengineering techniques
are appropriate as a backstop in case our children find out that
the current alarmism is justified.
David Schnare

Can’t be more blunt than that!

Gail Combs
November 26, 2009 4:27 am

Old Gasser said
“I see a striking corollary in supervision for contract maintenance of floors, …
Exit Questions: Which group was the most trainable; reliable; took most pride in work? Bonus Question: Who most often needed to be counseled regarding personal hygiene?”

The ghetto kids of course! That is why I fired a degreed Chemist, passed over lots of recent grads in Chemistry and hired a construction worker as a lab assistant. (He was also the only one dressed neatly and cleanly) He turned out to be a great worker and very trainable. Best move I ever made.
From what I can see these academic “Scientists?” wouldn’t last in an industrial setting because of their shoddy work habits. I certainly would fire them.

David in Florida
November 26, 2009 4:43 am

TitaniumDragon (17:59:10) :
“People who have no idea what they’re talking about (or, liars with agendas – pretty common in the denier community).
… where he whines … people who are clueless.”
TitaniumDragon must be a climate scientist, likely a professor. Rather than debate he seems to prefer to berate.

Gail Combs
November 26, 2009 4:51 am

E.L. said
“The smoking gun is comments in code that specifically state, “adjusted to look closer to
the real temperatures.” 0.o
Yes sir, they must be really pulling the wool over people’s eyes by adjusting things to reflect real temperatures.
Social scientists are missing an opportunity to study irrational human behavior here.”

Yes, they must be really pulling the wool over people’s eyes by adjusting things to reflect real temperatures – when they are adjusting the bristlecone data so it reflects the real temperatures and then claiming it is a reliable proxy for GLOBAL temperatures.
It is the “things” they are busy adjusting that are questionable; just ask an accountant about adjusting things, i.e. cooking the books.

Gail Combs
November 26, 2009 5:07 am

TitaniumDragon (17:59:10) :
Gail, you should never, ever eat a number like “10 trees” without thinking about it first. How many trees did they SAMPLE to start out with? That’s the real question. And how did they choose them, for that matter.
You need to remember that there are VERY stringent requirements on what trees you can use – they have to have been and always been healthy, had to have been taller than nearby trees the whole usable life, ect. because if they fail any of those you have severe confounding factors.
So in reality, 10 trees may be perfectly fine, so long as they were the correct 10 trees to sample.”

The minimum sample size should be at least 25, preferably 30, for this type of attribute data. 10 trees should only be considered a preliminary study, to see if the idea is worth pursuing, and nothing more.
I do not have to “THINK” about it. I have been dealing with sampling plans for 30 years. The sample size sucks, period. Heck, we pulled more samples when qualifying a lot of shampoo caps that came out of the same plastics mold! Something as variable as a tree needs a larger sample size than I would use for a plastic cap (n=25).
I would want n=25 trees over at least 100 different locations, correlated to accurate temperature measurements, to consider the tree-ring temperature proxy reliable. Otherwise it is just hand-waving, as the divergence after 1960 shows.
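[Editor’s note: as a rough illustration of the sample-size arithmetic Gail is invoking, here is the standard formula for the smallest n that pins down a mean to a given margin of error. The standard deviation is an assumed input, not anything estimated from the tree-ring data.]

```python
import math

def min_sample_size(sigma, margin, z=1.96):
    """Smallest n so that a 95% confidence interval for a mean has
    half-width `margin`, given a between-unit standard deviation
    `sigma`: the textbook n = (z*sigma/margin)^2, rounded up."""
    return math.ceil((z * sigma / margin) ** 2)
```

For example, with a between-tree standard deviation equal to the margin of error you can tolerate times 2.5, the formula lands right at n = 25 – the same order as the sampling-plan minimum Gail cites, and well above 10 or 12 trees.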