The Stokes-Kaufman contamination protocol – a 'sticky' wicket

Over at Climate Audit, Steve McIntyre has found yet another inexplicable inclusion of a hockey-stick-shaped proxy in the PAGES2K paper. What is most interesting is that the proxy plot panel recalls the one Steve plotted for Yamal, where a single proxy sample went off the rails as an apparent outlier and seemed to dominate the set. Since even a grade-school student could pick this proxy out in one of those “which one of these is not like the others?” test questions, one wonders whether this particular proxy was preselected by Kaufman specifically for its shape, or whether the team simply bungled the most basic of quality-control inspections. Of course, when Steve asked those questions, Nick Stokes showed up to defend the indefensible, and hilarity ensued.

Steve McIntyre writes:

==============================================================

Kaufman and paleo peer reviewers ought to be aware that the recent portion of varve data can be contaminated by modern agriculture, as this was a contentious issue in relation to Mann et al 2008 (Upside Down Mann) and Kaufman et al 2009. Nonetheless, Kaufman et al 2013 (PAGES), despite dozens of coauthors and peer review at the two most prominent science journals, committed precisely the same mistake as his earlier article, though the location of the contaminated data is different.

The contaminated series is readily identified as an outlier through a simple inspection of the data. The evidence of contamination by recent agriculture in the specialist articles is completely unequivocal. This sort of mistake shouldn’t be that hard to spot even for real climate scientists.

Here is a plot of the last nine (of 22) Arctic sediment series. One of these series (top left – Igaliku) has the classic shape of the contaminated Finnish sediment series (often described as upside down Tiljander). Any proper data analyst plots data and inspects outliers, especially ones that overly contribute to the expected answer. The Igaliku series demands further inspection under routine data analysis.


Figure 1. Plot of last nine (of 22) Kaufman et al Arctic sediment series. The Igaliku proxy is total pollen accumulation.

The Igaliku series is plotted separately below. It is also available at a NOAA archive here, which actually contains one additional recent value plotted in red. The NOAA archive contains many other measurements: it is unclear why Kaufman selected pollen accumulation rate out of all the available measurements.

The resolution of the data set is only 56 years (coarser than the stated minimum of 50 years) and it has only three values in the 20th century. The 1916 value was lower than late medieval values, but the series surged dramatically in the late 20th century.


Figure 2. PAGES2K Igaliku series.

Igaliku, in Greenland, was the location of the Norse settlement founded by Erik the Red and is of archaeological interest. Sediment series from Lake Igaliku have been described in three specialist publications in 2012:

Massa et al 2012. Journal of Paleolimnology. A multiproxy evaluation of Holocene environmental change from Lake Igaliku, South Greenland. (Not presently online.) (Update: online here, h/t Mosher. I’ve added a paragraph from this text referring to pollen accumulation.)

Massa et al 2012. QSR. A 2500 year record of natural and anthropogenic soil erosion in South Greenland. Online here.

Perren et al 2012. Holocene. A paleoecological perspective on 1450 years of human impacts from a lake in southern Greenland. Online here.

The three articles clearly demonstrate that the sediments are contaminated as climate proxies.

Igaliku has been re-settled in the 20th century and modern agricultural practices have been introduced. The specialist publications make it overwhelmingly clear that modern agriculture has resulted in dramatic changes to the sediments, rendering the recent portion of the Igaliku series unusable as a climate proxy. Here are some quotes from the original article.

============================================================

Read Steve’s entire essay here: More Kaufman Contamination
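As an aside, the kind of quick outlier screen Steve describes (plot every series, flag the one whose modern values diverge from the rest) is easy to automate. Here is a minimal Python sketch on synthetic stand-in data; the series names, the spike size, and the z-score screen are my own illustrative choices, not anything from PAGES2K:

```python
import numpy as np

rng = np.random.default_rng(0)

# Nine synthetic "proxy" series; one gets an artificial modern spike,
# mimicking a contaminated uptick. All values here are made up.
series = {f"site_{i}": rng.normal(0, 1, 100) for i in range(8)}
spiked = rng.normal(0, 1, 100)
spiked[-5:] += 8.0            # exaggerated late-20th-century surge
series["igaliku_like"] = spiked

def modern_z(values, n_modern=5):
    """z-score of the modern mean relative to the pre-modern portion."""
    pre, modern = values[:-n_modern], values[-n_modern:]
    return (modern.mean() - pre.mean()) / pre.std(ddof=1)

scores = {name: modern_z(v) for name, v in series.items()}
outlier = max(scores, key=scores.get)
print(outlier)  # the spiked series scores far above the rest
```

Even a screen this crude flags an Igaliku-style series for inspection, which is exactly Steve’s point: the anomaly is visible without any sophisticated statistics.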

Nick showed up to argue that the Igaliku series really isn’t contaminated by agriculture at all, and is currently engaged in a multi-front battle of deny, duck, and cover. The obstinacy on display to avoid admitting the obvious is diamond hard. This isn’t unusual, as Nick was associated with CSIRO, where admissions aren’t part of the government-funded manual. Gadfly and racehorse comparisons were bandied about, and now Steve has taken to calling Nick “racehorse,” much in the same vein as Tamino and his self-proclaimed “bulldog” status.

To say that watching this is entertaining would be an understatement. Meanwhile, there have been many updates and a piling-on of additional evidence for contamination. Nick is now reduced to rebutting Steve with Bill Clinton-style questions (“It depends on what the meaning of the word ‘is’ is.”), such as: “Could you say exactly what you mean by ‘contaminated core’?”

Here is my contribution that I left as comments:

==================================================================

Anthony Watts Posted Apr 30, 2013 at 12:09 AM

Steve writes:

The three articles clearly demonstrate that the sediments are contaminated as climate proxies.

Igaliku has been re-settled in the 20th century and modern agricultural practices have been introduced.

=======================================================

By way of support for this, photos can tell you a lot.

Google Earth’s aerial view clearly shows the developed agriculture signature:

And from the ground, hay bales in Igaliku from the Wikipedia page on Igaliku:

The slope of the land drains right into the lake, and along that slope there is clearly human agricultural development.

O’Rourke and Solomon (1976) found that total pollen influx was a direct function of sediment influx in varved sediments from Seneca Lake, New York.

Given the drainage pattern of the land, it seems like a clear case of sediment contamination to me.

Kaufman has followed his rules, which are to use proxies which:

“(5) exhibit a documented temperature signal, and (6) are published in peer-reviewed literature as a proxy for temperature”

One wonders, though, whether Igaliku was preselected for the shape of its data, without any other considerations.

===============================================================

Note: the Google Earth image is of the town near the fjord, the Wikipedia picture of the lake where sediment was sampled is in the highland just to the NW of the town. You can inspect the map here and see the lake (which is ice-covered in the satellite photo):

http://maps.google.com/maps?q=60.987778,-45.420833&ll=61.009153,-45.439453&spn=0.053414,0.185394&t=h&z=13

Update: here is another view of the lake from the ground, showing agriculture all around the catch basin, thanks to Nick and anonymoose http://www.panoramio.com/photo/12426959

===============================================================

    • EdeF
      Posted Apr 30, 2013 at 12:27 AM

      Igaliku reminds me of the small farms in the Okanogan River valley of central Washington state. Note that dirt would wash into the lake from the several roads going up to the higher country.

      • Anthony Watts
        Posted Apr 30, 2013 at 12:37 AM

        Exactly. Basically, what agriculture does is increase the pollen catch-basin area through land-use change. Fighting runoff and erosion is always an issue with agriculture.

        With a larger area near the lake having undergone land-use change, more runoff, and therefore more pollen, will be funneled into the lake. Kaufman was probably never a farmer and wouldn’t get this, or maybe he simply didn’t want to, since that uptick looks so “elegant” when fitting the theory to the data.

  1. Anthony Watts
    Posted Apr 30, 2013 at 1:14 AM

    Figure 2 from PAGES 2K has an interesting pollen bump from about 1150-1400.

    I think I’ve found a proxy for that. Modern day Igaliku is on the same site as Garðar, Greenland, which had a period of growth during the MWP.

    Garðar was the seat of the bishop in the Norse settlements in Greenland.

    Garðar had enough success as a town for the Catholic Church to install a permanent bishop and to build a cathedral there. The first bishop of Garðar, Arnaldur, was ordained by the Archbishop of Lund in 1124. He arrived in Greenland in 1126 and in the same year started construction of the cathedral, devoted to St. Nicholas, patron saint of sailors.

    To support something like that, you need a successful agricultural base. People that are starving don’t have time for such luxuries.

    Bishop Álfur was ordained in 1368 and served as last bishop of Garðar until 1378. The Greenland diocese disappeared in the 1400s, when the ship departures from Norway stopped.

    If you look at this table of bishops, it seems to correlate with that bump in the pollen data, which then dives after 1400.

    Bishop                   Years served
    Arnaldur (first bishop)  1124–1126
    Bishop Arnaldur          1126–1150
    Jón Knútur               1153–1186
    Jón Árnason              1189–1209
    Þór Helgi                1212–1230
    Nikulás                  1234–1242
    Ólafur                   1242–mid-1280
    Þór Bokki                1289–1309
    Bishop Árni              1315–1347
    Álfur (last bishop)      1368–1378

    Source: http://en.wikipedia.org/wiki/Gardar,_Greenland

    A timeline is here: http://www.greenland-guide.gl/leif2000/history.htm

    Bishops would seem to be a proxy for the success of the town, and the success of the town had to rely on the sea and agriculture. When the climate turned colder, the agriculture failed, as we have heard about other areas of southern Greenland.

    Of course the pollen bump due to agriculture would have been smaller then than now, since they had no mechanization to amplify the area they could till and plant.

===========================================================

Bishop Hill might like the Bishops proxy, but Mosher added the real clincher:

===========================================================

Steve Mosher Posted Apr 29, 2013 at 2:39 PM

http://www.academia.edu/2367255/A_multiproxy_evaluation_of_Holocene_environmental_change_from_Lake_Igaliku_South_Greenland_of_environmental_change_from_Lake_Igaliku_South_Greenland._Massa_C._Perren_B._Gauthier_E._Bichet_V._Petit_Ch._Richard_H

“Norse farmers settled southern Greenland ~985 AD (Jones 1986), including the area around Lake Igaliku, which was used for grazing and hay production. Following the disappearance of the Norse ~1450 AD, Igaliku was resettled during the 18th century (Arneborg 2007) and large-scale agriculture, based on sheep farming, was developed in the 1920s (Austrheim et al. 2008). Consequently, the response to climate change over the last millennium was overprinted by land-use effects (Gauthier et al. 2010; Massa et al. 2012; Perren et al. 2012). However, the consideration of human-induced changes at Lake Igaliku in light of the entire Holocene ecosystem development provides new insights about their magnitude. Relative to the preceding Holocene shifts, the vegetation was slightly impacted by land clearance and grazing, and exhibits a small decrease in woody taxa abundance (from 60 to 45%). Until ~1335 AD, the related soil erosion, documented by high TOC/TN and MAR values, clearly compounds the long-term increasing trend (Fig. 6). Contrary to the other studied variables, the diatom assemblages indicate that the lake ecology was not significantly impacted, and that the changes are within the range of natural Holocene variability. Both in terms of lake ecology and soil erosion, the period since 1988 AD is likewise unprecedented in the context of the Holocene by a magnitude and rate of change greater than the previous 9,500 years. The digging of drainage ditches for hayfields caused a dramatic increase in MAR, which reached unprecedented values. The use of nitrogen fertilizers on these fields (200–250 kg ha⁻¹ yr⁻¹ of N, Miki Egede pers. commun.) have outpaced the natural buffering capacity of Lake Igaliku, resulting in a sharp rise in the mesotrophic diatom, Fragilaria tene

=============================================================

Nick, in classic Mannian style, refuses to concede. Go help him out at the Climate Audit thread, More Kaufman Contamination, which is sure to become a classic.

This Stokes-Kaufman incident seems to be a case of land use effect denial.

James Bull
May 2, 2013 12:01 am

I love the idea of using bishops as a proxy for climate, it just appeals to my sense of humour and leads to a host of irrelevant questions…
How accurate are they to the nearest tenth of a decree?
Do they fluctuate with how orthodox/liberal they are?
Would it work for Dalai Lamas/Grand Mufti?
Where are these bishop/temp proxies listed?
James Bull

richardscourtney
May 2, 2013 2:06 am

Nick Stokes:
In this thread at May 1, 2013 at 4:17 pm, for the stated and clear purpose of clarifying your words, I asked you three simple questions about the work you are defending.
http://wattsupwiththat.com/2013/05/01/the-stokes-kaufman-contamination-protocol-a-sticky-wicket/#comment-1293944
Were I peer reviewing the paper then inadequate answers to those questions would have resulted in my recommending the paper be rejected for publication.
My original post stated each of the questions together with my reasons for each question.
At May 1, 2013 at 4:45 pm, you replied to my post
http://wattsupwiththat.com/2013/05/01/the-stokes-kaufman-contamination-protocol-a-sticky-wicket/#comment-1293968
I was grateful that you replied but surprised that you had bothered because your reply did not mention, answer or address any of my questions.
Therefore, at May 1, 2013 at 5:04 pm, I responded with a post which demonstrated I had read and understood your reply but was disappointed you had not answered my questions.
http://wattsupwiththat.com/2013/05/01/the-stokes-kaufman-contamination-protocol-a-sticky-wicket/#comment-1293981
I then retired to bed hoping that upon rising this morning you might have provided the clarification I requested. Upon rising I found your answer, to which I am now replying. It is at May 1, 2013 at 5:30 pm
http://wattsupwiththat.com/2013/05/01/the-stokes-kaufman-contamination-protocol-a-sticky-wicket/#comment-1293999
Your reply hints at – but evades – parts of all my questions, and it changes what you have been saying.
My questions pertained to your saying of data assurance

If that was necessary, it was for the original journal to demand it.

Your latest post addressed to me says in total

Richard C,

” But he claimed to be doing novel work when all he was doing was reusing old data”

The Pages2k network is quite explicit about their scope:

” The ‘2k Network’ of the IGBP Past Global Changes (PAGES) project aims to produce a global array of regional climate reconstructions for the past 2000 years”

Many scientists have labored for many years to create this data. There is now enough that a regional reconstruction can be made, though that in itself is a big task. If everyone who essayed a reconstruction had to experimentally recreate the data, there would be no end to it. And it isn’t science. Science progresses because people can build on what others have done.

Nick Stokes, that answer is a disgrace.
1.
Science progresses because people BUILD ON what others have done.
Science is assessed by replicating what others have done.
Science does NOT consist of re-hashing what others have done.
2.
Nobody is required to “experimentally recreate the data” but every scientist is required to obtain NEW data. Merely playing with existing data is NOT science unless it is to investigate the quality of that data or to demonstrate error in earlier interpretation(s) of that data.
3.
Data may be overseen by an agreed authority; e.g. NPL is responsible for maintaining calibration standards in the UK. Other than that, scientists are responsible for the selection, quality, and use of the data they employ. It was NOT the duty of “the journal” (as you originally said) or of the PAGES project (as you now say) to validate the data used by “the Pages2k consortium”. As I said, they chose that data, they used that data, and the results they published are theirs. So,
ALL responsibility for the work is theirs and that includes any effects of the data they chose to use.
Nick Stokes, you are not defending “the Pages2k consortium”, you are attacking science.
On the basis of your words to me I am wondering why you are doing this.
Is your intention to ensure maximum publicity for the inadequacies and gross flaws of the paper by “the Pages2k consortium”?
If so, then I congratulate you on the success you are achieving.
Richard

mpainter
May 2, 2013 3:09 am

For Richard Courtney:
Good luck on trying to pin down Nick Stokes.

wayne Job
May 2, 2013 3:13 am

Nick Stokes,
I am a very old Aussie engineer and grew up in awe of the wonderful work of our CSIRO; the last two decades have seen me slowly shun, then discount totally, anything they say as bulltish. Your methods and ethics toward real science are shining brightly in your postings here and in your research. Most thinking Australians now take anything the CSIRO or the Met Office says with a very large grain of salt; you and your ilk have done some serious damage to the credibility of the science community.
Eat some humble pie and do real science without spin; you may find it liberating but somewhat less profitable.

Paul Dennis
May 2, 2013 3:24 am

Richard Courtney has a good point here. Any scientist, whether an individual or part of a group, has to take full responsibility for the data they are using. It doesn’t matter whether it is data they have collected themselves or data others have published. They need to be objective and critical. Having set selection criteria, the PAGES2K group then needs to evaluate data quality and can’t delegate this to the journal, reviewers, or original authors.
The Igaliku data looks anomalous and this should have been an immediate red flag. There’s a factor of 8 increase in both PAR and MAR over the past 200 years. The common factor driving these is an 8x increase in the original authors’ estimate of sediment accumulation rate (SAR). It doesn’t take long to look at the data and note that: (i) the factor of 8 increase in the SAR is not in accord with their near-surface Pb-210 and Cs-137 data, which indicate a factor of 3 change in SAR; and (ii) the sediment they are coring is between 70 and 80% porosity with up to 80 wt% water content. Such sediments cannot have well-packed mineral structures and are liable to settling during coring. A small degree of settling over just the initial few cm will have a significant impact on calculated sediment accumulation rates. Ergo I conclude that the sharp uptick in the PAR is not robust. This is before we even start to discuss the effects of land use and vegetation changes, etc. I’ve documented this over at CA.
One shouldn’t underestimate the role of common sense, gut feeling, and the skills necessary to quickly evaluate a data set and ask key questions. Igaliku is a good example. The top 15 cm of sediment has near constant water content, wet bulk density, total organic C, N and S and yet we are told that the sediment accumulation rate has increased by a factor of 8 over this same interval. Wouldn’t anyone want to ask serious questions about what is going on here rather than simply infer it’s a result of the combination of land use and climatic changes?
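Paul Dennis’s core point reduces to a single multiplication: pollen accumulation rate (PAR) is pollen concentration times sediment accumulation rate (SAR), so an overstated SAR inflates PAR one-for-one. A back-of-envelope sketch in Python, using purely illustrative numbers rather than the actual core measurements:

```python
# If the age model implies an 8x SAR increase but the radiometric data
# (Pb-210, Cs-137) support only 3x, the PAR uptick is inflated by the ratio.
pollen_concentration = 1000.0   # grains per cm^3, held constant for the example

sar_reported = 8.0              # relative SAR from the authors' age model
sar_radiometric = 3.0           # relative SAR implied by Pb-210 / Cs-137

par_reported = pollen_concentration * sar_reported
par_radiometric = pollen_concentration * sar_radiometric

inflation = par_reported / par_radiometric
print(f"PAR uptick inflated by a factor of {inflation:.2f}")  # 2.67
```

Because the pollen concentration cancels out, any disagreement between the two SAR estimates passes straight through to the PAR uptick, which is why the radiometric cross-check matters.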

richard verney
May 2, 2013 3:31 am

Ricket says:
May 1, 2013 at 10:07 am
//////////////////////////////////////
Is not the correct approach to both include and exclude any proxy that is thought to be an outlier or contaminated (ie., of ‘dubious’ quality and relevance), and at the same time to explain the reasons why the proxy in question is considered unreliable?
Extrapolations with both results should be shown together with the reason why the author prefers the extrapolated set with the ‘dubious’ proxy excluded over the extrapolated set with the ‘dubious’ proxy included (or vice versa depending upon how the author wishes to present his case).
The reader can then form a valid assessment as to which data set is to be preferred rather than being guided by the bias of the author.
Later research may demonstrate that the ‘dubious’ proxy was not contaminated and is properly relevant for inclusion, or it may further confirm its status as a rogue which, if included, would contaminate the extrapolated data set.

Steve T
May 2, 2013 4:11 am

Nick Stokes says:
May 1, 2013 at 5:30 pm
Richard C,
” But he claimed to be doing novel work when all he was doing was reusing old data”
The Pages2k network is quite explicit about their scope:
” The ‘2k Network’ of the IGBP Past Global Changes (PAGES) project aims to produce a global array of regional climate reconstructions for the past 2000 years”
Many scientists have labored for many years to create this data. There is now enough that a regional reconstruction can be made, though that in itself is a big task. If everyone who essayed a reconstruction had to experimentally recreate the data, there would be no end to it. And it isn’t science. Science progresses because people can build on what others have done.
****************************************************************************************************
Or, in this case NOT done.
It is akin to builders adding a second storey to an existing single-storey building without checking that the foundations are suitable for the extra load. Is this acceptable? No. The builders have the responsibility of checking that anything used in their work is fit for purpose.
Steve T

richardscourtney
May 2, 2013 4:33 am

Steve T:
re your post at May 2, 2013 at 4:11 am.
Thankyou!
That is a brilliant explanation of the issue for onlookers who are not versed in the scientific method. I wish I had thought of it and am grateful to you for providing it.
Richard

Rob Ricket
May 2, 2013 7:58 am

richard verney says:
May 2, 2013 at 3:31 am
“Is not the correct approach to both include and exclude any proxy that is thought to be an outlier or contaminated (ie., of ‘dubious’ quality and relevance), and at the same time to explain the reasons why the proxy in question is considered unreliable?”
“Extrapolations with both results should be shown together with the reason why the author prefers the extrapolated set with the ‘dubious’ proxy excluded over the extrapolated set with the ‘dubious’ proxy included (or vice versa depending upon how the author wishes to present his case).”
Your argument is worthy of consideration in light of (thanks to Steve’s work) documented cases of scientific fraud in the Climate Science community. While the “inclusion/exclusion” model might assist in reducing selective sampling, it would also muddy the water with unreliable data and tedious qualifiers.
My argument against inclusion is predicated on a dispassionate view of proper scientific methodology, where unreliable data is naturally discarded. In my opinion, the disqualification process for specific samples from the selected series should be explained in the methodology section and remain isolated from the results. The key here (we are dealing with a credibility issue) is to make a compelling case for including or excluding specific samples.

richardscourtney
May 2, 2013 9:01 am

Rob Ricket and Richard Verney:
I interrupt your interesting discussion in hope of helping. And I do it using the method so clearly demonstrated to me by Steve T.
The issue is sampling and data selection.
If the sample were random, then all of the sample should be used.
But the sample is not random. Indeed, it is deliberately selected for a purpose.
Random samples from a random sample of the world’s lakes were NOT obtained. Lakes were selected as being indicative of climate change and not of other factors. At least, they were selected on the basis that other factors can be deconvoluted from their indications of climate change.
Therefore, sample selection criteria were – and needed to be – decided prior to obtaining the samples.
The question then arises as to whether there could be post hoc reasons for rejecting a sample which was obtained according to the criteria. A statistician may say no, but a scientist says, YES! And this scientific answer exists irrespective of whether the data are to be subjected to a statistical analysis.
Such a post hoc decision for rejection of data occurs when the selection criteria are observed to have not accounted for an unforeseen circumstance.
(a)
A statistician says this does not matter because “the data are the data” and altering the decision on what constitutes the data biases analysis of the data.
(b)
A scientist says this does matter because he only wants the data which will reveal what he does not know and he wants to find out: indeed, that is why he established the selection criteria.
So, in the style of Steve T, I will consider the issue by discussion of a hypothetical analogy. In this case, a scientific study intended to assess tyre wear on GM cars.
There is a data set of tyre wear on cars which traveled down a particular street, and the data includes the makes and ages of the tyres, the mileage of the cars, and the manufacturers of the cars.
The scientist decides to use the existing data set.
1.
He sets selection criteria as using data from all the GM cars.
2.
He then assesses the quality of the data by comparing sub-sets of his selected data and by obtaining additional data (or another but similar data set) and comparing them.
3.
The data seems sound but one datum is an outlier which he investigates.
4.
The investigation of the outlier either
(a) finds nothing untoward
or
(b) finds the tyres of the outlier had previously been on a Ford car and were transferred to the GM car on the day before the data was obtained.
5.
In the case of 4(a) there is no discernible reason to reject the datum.
6.
In the case of 4(b) the datum is clearly misleading: it mostly indicates tyre wear on a Ford car and would distort the investigation of tyre wear on GM cars because – in this sample – the Ford wear is much greater than the GM wear. Hence, he rejects this datum from his study and his report explains why he rejected it post hoc.
As Paul Dennis explains at May 2, 2013 at 3:24 am. the Igaliku data used by “the Pages2k consortium” clearly indicates effects of agriculture which is much greater than the indication of climate change.
http://wattsupwiththat.com/2013/05/01/the-stokes-kaufman-contamination-protocol-a-sticky-wicket/#comment-1294466
Hence, the Igaliku data should have been rejected for use in the study by “the Pages2k consortium” and the report of that study should have provided the explanation and data which required the rejection.
Richard

Rob Ricket
May 2, 2013 12:06 pm

Richard C.
Thanks for the substantive contribution to the discussion and I find no cause to disagree with anything you’ve written. In fact, the Watts et al papers are predicated on the rejection of unreliable data. Methinks the hard core Statisticians need to bend a bit to accommodate what passes as common sense to most folks.
Some folks at CA have fallen into the trap of taking a hard line regarding inclusion during an audit of X-study and then conversely arguing for exclusion during the audit of Y-study. Clearly (as you note in the tyre example) a well-documented explanation for including or excluding data from a set of proxies is preferable to blanket inclusion of suspect data.

M Courtney
May 2, 2013 12:30 pm

atheok says: May 1, 2013 at 5:04 pm:
I stand by my defence of Nick Stokes as worthy of some respect. Remember, I was responding to the idea that he was a villain of the same credibility as Tiljander Mann; that was unfair.
You ask what definition of “reasoned” I thought that he did well.
I meant, “2. To talk or argue logically and persuasively.”
Logically: He does argue logically.
Persuasively: Nick Stokes doesn’t persuade me or you but even so, persuasive is a subjective term.
He would go down well at the Guardian, SkS or RealClimate.
And he is more rational in his use of arguments than many at those places.

Roger Knights
May 2, 2013 4:43 pm

Here’s a very relevant quote from a fortnight ago. I wish I’d thought to post it earlier in the thread:

Doug Proctor says:
April 14, 2013 at 12:20 pm
I’ve been going on about what I call Procedural Certainty vs Representational Certainty. I would be appreciative if others could comment on this idea.
Procedural Certainty is that certainty which results from the mathematical methodology. In the temperature profile case, it means the error we speak of that results from how we take the data, modify it, adjust it, combine it, average it, smooth it and present it. It takes our methods of dealing with the uncertainty we attribute to each element and how the variability of these elements (as we define) may modify the whole (possibly using a Monte Carlo simulation). In conversation, the Procedural Certainty may be expressed something like this, “If we do what we do repeatedly with the tools we have and assumptions we made for tree-rings in these areas, the result is like this 95% of the time.”
The Representational Certainty is that certainty which deals with the amount of correlation between what we are getting out of our work and what we are trying to determine, i.e. with “reality”. In the case of Mann or Marcott or the recent varve work, the Representational Certainty is an indication of how well the patterns of small samples, locally sourced, tree-rings, alkenones or varve thickness reflect temperatures of a wide region. We might say of Representational Certainty, “Tree-ring widths increase and decrease along with regional temperature changes 80% of the time, and in proportion to the amount of change, also 80% of the time.”
For the example of tree-rings, you can see that the two types of Certainty are not similar. The first is valid for the purposes of the study, but only the second is valid for the purposes of understanding how the present relates to the past (and, therefore, for policy matters).
Procedural Certainty is what I suggest the IPCC trumpets. The skeptics argue about Representational Certainty. The MSM and public think they are the same thing, and they can be, but most times they are not, and may be quite different.
For climate science, to bring one in tune with the other requires correlation studies. For Mann and the others, there are several ways to do this.
One would be to take the proxy data (easiest for tree-rings) of the last 150 years from a wide variety of places that have close, decent instrumental temperature records and compare the two.
Another would be to take a similar but location-wise different numerical and geographic distribution of instrumental temperature data and process them in the same procedural way as that done for the proxies and compare the results to each other.
A third way would be to take the instrumental data and snip it into pieces (“samples”) representing the proportional time-length of individual pieces of the proxies (i.e. 11 for varves, 73 for the alkenones of Marcott), leaving the spaces as they develop. This last would be trickier, as to deal with the overlap portions one would have to randomly assign portions of the overlap data to the samples.
We hear much of proxy studies but little of proxy-reality correlations. When both Mann and Marcott splice the instrumental period onto their proxies, they present the viewer with a fait accompli inference that the proxy record has a Representational Certainty equal to the instrumental, Procedural Certainty. It does not, which is why we cry foul. As we should.
The most egregious recent examples of the fallacy of equating Procedural with Representational Certainty have been the Lewandowsky papers showing skeptics to be flat-Earth conspiracists, but the greatest of them was the paper claiming that 97% of scientists support CAGW and the IPCC narrative. Procedurally both claims are correct, in that what they did produced those results; Representationally, neither is.
A paper or two demonstrating the difference in theory and practice would go a long way toward showing on what foundations of sand the edifice of Anthropogenic Global Warming has been built.

May 4, 2013 9:15 am

M Courtney says: May 2, 2013 at 12:30 pm
atheok says: May 1, 2013 at 5:04 pm:
I stand by my defence of Nick Stokes as worthy of some respect. Remember, I was responding to the idea that he was a villain of the same credibility as Tiljander Mann; that was unfair.
You ask by what definition of “reasoned” I thought he did well.
I meant, “2. To talk or argue logically and persuasively.”
Logically: He does argue logically.
Persuasively: Nick Stokes doesn’t persuade me or you but even so, persuasive is a subjective term.
He would go down well at the Guardian, SkS or RealClimate.
And he is more rational in his use of arguments than many at those places.

M Courtney:
Based on your refinements, I agree with you.
Sort of reminds me about debating events where the debaters are given pro/con positions and undertake their debates with dexterity, sincerity, intelligence and force whether or not they personally believe their assigned stances.
Yes, Nick displays intelligence, verbal dexterity, reason and persuasion in his repartee while trying to keep the pot stirred without his opponents gaining direction and traction. Which is why Nick keeps trying to avoid getting nailed down on specifics.
Setting aside fabrications committed through deception, evasion, and circular argument, I am hard pressed to remember Nick ever stating a direct lie or ad hominem. My personal belief is that any method of deception is still deception. Everyone may run a red light at some time, but that makes it neither right nor legal; when the consequences are death and destruction, the full force of the law should, and most likely eventually will, be brought to bear.
The CAGW religion is absolutely selfish without regard for who they trample in their flight from reality. Whether destroying people’s legitimate careers or causing world starvation, the CAGW faith is blind to all consequences but their personal gain.
As for Nick, I do not know his actual thoughts, as he evades getting nailed down on his beliefs; it may be that Nick’s incisive logic twists get him banned rather quickly from the less tolerant discussion sites, leaving Nick to practice his arts on us at the very tolerant ones. It may also be that the participants and arguments here and at other skeptic sites are more Nick’s mettle. Consider Nick having a debate with CAGW believers: would Nick get any debate satisfaction or mental exercise, even at RC?
A great-uncle of mine and his wife argued constantly; whichever one spoke first, the other would immediately take the opposite side. Yes, their arguments were nonstop and quite draining to us visitors, but neither of them would think of moving more than a few feet without making sure the other was coming, or at least arguing against the idea. I never saw them more than a room apart. I couldn’t imagine a forty- or fifty-year marriage like that, but neither uncle nor aunt thought their loud arguments unusual or considered trying to stop.
I have a close friend who absolutely loves a good argument, but she prefers to argue her own beliefs and will steer topics toward ones where she can take her preferred side. Once, when she was arguing with one of my relatives, I leaned in and advised that she wouldn’t win the discussion because my relative didn’t, or couldn’t, recognize he was losing. As I refuse to enter any argument with her, she continued verbally beating up my relative. Beggars can’t be choosers, I guess.