Those who don't learn from Yamal are condemned to repeat it – Marcott’s YAD061

You’d think academics in the upside down Mann climate proxy world would pay attention, and not repeat the same mistakes of the past. Apparently not. WUWT readers surely recall the Yamal YAD06 (the most influential tree in the world) and the core sample YAD061.

Core YAD061, shown in yellow highlight, the single most influential tree

Steve McIntyre points out the YAD061 equivalent in Marcott et al, where a single sample contributed the majority of the uptick.

He writes:

TN05-17 is by far the most influential Southern Hemisphere core in Marcott et al 2013 – it’s Marcott’s YAD061, so to speak. Its influence is much enhanced by the interaction of short-segment centering in the mid-Holocene and non-robustness in the modern period. Marcott’s SHX reconstruction becomes worthless well before the 20th century, a point that they have not yet admitted, let alone volunteered.

Marcott’s TN05-17 series is a bit of an odd duck within his dataset. It is the only ocean core in which the temperature is estimated by Modern Analogue Technique on diatoms; only one other ocean core uses Modern Analogue Technique (MD79-257). The significance of this core was spotted early on by ^.

TN05-17 is plotted below. Rather unusually among Holocene proxies, its mid-Holocene values are very cold. Centering on 4500-5500 BP in Marcott style results in this proxy having very high anomalies in the modern period: closing at a Yamalian apparent anomaly of over 4 deg C.

Figure 1. TN05-17.

In the most recent portion of the Marcott SHX, there are 5 or fewer series, as compared to 12 in the mid-Holocene. Had the data been centered on the most recent millennium and extended back (e.g. by Hansen’s reference station method, a lowbrow method), then there would have been an extreme negative contribution from TN05-17 in the mid-Holocene, but its contribution to the average would have been less (divided by 12, instead of 4). As shown below, TN05-17 pretty much by itself contributes the positive recent values of the SHX reconstruction. Its closing anomaly (basis 4500-5500 BP) is 4.01 deg. There are 4 contributing series – so the contribution of TN05-17 to the SHX composite in 1940 is 4.01/4, more than the actual SHX value. The entire increase in the Marcott SHX from at least 1800 AD on arises from the increased influence of TN05-17 – the phenomenon pointed out in my post on upticks.
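
The arithmetic in that last paragraph is easy to verify. The 4.01 deg closing anomaly and the series counts come straight from the post; a two-line Python check shows how much the thinning proxy roster inflates one core’s weight:

```python
# TN05-17's share of the SHX composite mean, using the numbers quoted above
closing_anomaly = 4.01        # deg C closing anomaly, basis 4500-5500 BP

print(closing_anomaly / 4)    # ~1.00 deg: its share with only 4 modern series
print(closing_anomaly / 12)   # ~0.33 deg: its share if all 12 series reported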

Read the entire post here: http://climateaudit.org/2013/04/10/the-impact-of-tn05-17/
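
For readers who want to see the centering mechanism for themselves, here is a minimal Python sketch (with entirely made-up numbers, not Marcott’s data or code): a proxy that happens to run cold in the 4500-5500 BP reference window comes out with large positive anomalies everywhere else.

```python
import numpy as np

# Hypothetical proxy: flat at 12 deg C except for an anomalously cold
# mid-Holocene stretch. Entirely synthetic - not TN05-17's actual data.
age_bp = np.arange(0, 11000, 20)             # years before present
temp = np.full(age_bp.shape, 12.0)
ref = (age_bp >= 4500) & (age_bp <= 5500)    # Marcott-style reference window
temp[ref] -= 4.0                             # cold mid-Holocene

# Short-segment centering: subtract the proxy's own 4500-5500 BP mean,
# so its cold window becomes the zero line for the whole series.
anomaly = temp - temp[ref].mean()

print(f"modern (0 BP) anomaly: {anomaly[0]:+.1f} deg C")  # prints +4.0
```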


73 Comments
Jean Parisot
April 11, 2013 7:34 am

Again!

wwschmidt
April 11, 2013 7:43 am

It’s not hard to see why they keep repeating the same mistake. For them, it’s not about getting the “science” right, it’s about providing support for an ossified ideological belief. This technique is the only way they have left with which they can massage the numbers so that they show what they want them to show.

April 11, 2013 7:43 am

The -2000 SUV caused the warming.

Beta Blocker
April 11, 2013 7:44 am

Steve McIntyre on CA: “TN05-17 is plotted below. Rather unusually among Holocene proxies, its mid-Holocene values are very cold. Centering on 4500-5500 BP in Marcott style results in this proxy having very high anomalies in the modern period: closing at a Yamalian apparent anomaly of over 4 deg C.”

Another way of saying this is that it’s Yanomalous.

Jeff Alberts
April 11, 2013 7:44 am

Those who don’t learn from Yamal are condemned to repeat it – Marcott’s YAD061

You’re assuming they DIDN’T use Yamal as a learning experience. They learned how to make a stick where there are none. Such things can ONLY be deliberate.

Jean Parisot
April 11, 2013 7:58 am

How was this proxy treated in his thesis?

Gary
April 11, 2013 8:07 am

Besides the all too human reaction of ignoring your critics, what would cause climate researchers to continually re-use questionable data series in their analyses? I suspect it’s because they are mostly cloistered in offices running computers rather than out in the field collecting data. Biologists and ecologists know their organisms — how they behave and what influences their existence — from close contact and observation. I’ve done both field and laboratory data collection in ecological and climate research and I understand the differences by experience. Many climate scientists seem to come from backgrounds other than life sciences and just may not have the same intuition about data generated from samples of tree cores, mud, rock, ice, fossils, and chemicals. Numbers thus become reality even though they are actually derivative of reality. Maybe this habit also explains the blind trust in modeling and its subtle elevation over observation. What else can explain how they completely ignore warnings about the context of a data series except ignorance of a larger perspective? [And don’t say deliberate fraud. That’s going too far, so don’t even go there.]

Kasuha
April 11, 2013 8:12 am

One thing that surprises me about that paper, which nobody has mentioned yet, is the way the error range is calculated. How is it possible that the 20th century, where only about 10% of the proxies are used to calculate the mean, has about the same error range as the whole rest? Doesn’t each missing proxy extend the error by the uncertainty about what that proxy’s contribution would be, should that proxy have a value for that period?
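
(For what it’s worth, Kasuha’s intuition checks out under the simplest possible assumptions. If the proxies were independent with equal uncertainty sigma, a simplification of whatever Marcott’s Monte Carlo actually does, the standard error of an N-proxy mean scales as sigma/sqrt(N), so dropping from 12 contributing proxies to 4 should widen the band by roughly 70%:)

```python
import math

sigma = 1.0                           # per-proxy uncertainty, arbitrary units
for n in (12, 4):
    print(n, sigma / math.sqrt(n))    # 12 -> 0.29, 4 -> 0.50
```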

Steve Keohane
April 11, 2013 8:23 am

“You’d think academics in the upside down Mann climate proxy world would pay attention, and not repeat the same mistakes of the past.”
Maybe ‘climatology’ SOP is a mistake.

steveta_uk
April 11, 2013 8:40 am

Bit puzzled by this. I thought Steve Mc had shown that the uptick was an artifact of the processing and wasn’t in any of the proxies. And now, he’s identified a specific proxy with an uptick.
Can’t be both, surely?

trafamadore
April 11, 2013 8:43 am

Here is the data part of the abstract from the Marcott paper:
“Here we provide a broader perspective by reconstructing regional and global temperature anomalies for the past 11,300 years from 73 globally distributed records. Early Holocene (10,000 to 5000 years ago) warmth is followed by ~0.7°C cooling through the middle to late Holocene (<5000 years ago), culminating in the coolest temperatures of the Holocene during the Little Ice Age, about 200 years ago. This cooling is largely associated with ~2°C change in the North Atlantic."
Exactly, which part of this abstract is wrong? Because I don't see much about the last 100 years. The only time it comes up is in the discussion, where they are comparing their data to data that they reference from others, which includes the instrument record.

trafamadore
April 11, 2013 8:50 am

Wow, impressed with the speed in the new site.

dp
April 11, 2013 8:59 am

It is pretty obvious the damning data and methods get re-used because there are just so many damning data and methods. It is a matter then of re-using them or turning to valid science which would not tell the story they wish told. It is not only hockey sticks all the way down – it is hockey sticks all the way up. Where there is a willing MSM there will be hockey sticks.

John B
April 11, 2013 9:02 am

trafamadore
No one is saying there is anything wrong with the quote you reference from the abstract, and if the authors had presented their paper in those terms during their press blitz, then there would not have been the same negative reaction.
However, that would still leave two factors to consider. Without the uptick the paper would have been so unremarkable that it is highly unlikely it would even have been published, and certainly not in a prestigious journal.
It also fails to consider the use of what the layman could only describe as creative, if not downright deceitful, statistical techniques within the paper.

Louis Hooffstetter
April 11, 2013 9:13 am

trafamadore says:
“Here is… part of the abstract from the Marcott paper:
‘Here we provide a broader perspective by reconstructing regional and global temperature anomalies for the past 11,300 years from 73 globally distributed records. Early Holocene (10,000 to 5000 years ago) warmth is followed by ~0.7°C cooling through the middle to late Holocene (<5000 years ago), culminating in the coolest temperatures of the Holocene during the Little Ice Age, about 200 years ago. This cooling is largely associated with ~2°C change in the North Atlantic.'
Exactly, which part of this abstract is wrong?"
The part that's wrong is the very next sentence of the abstract (which you conveniently left out):
"Current global temperatures of the past decade have not yet exceeded peak interglacial values but are warmer than during ~75% of the Holocene temperature history."
This was their money quote for the press releases, intended to make headlines around the world (and it did). They made this statement both in the abstract and in numerous press releases knowing full well that their data did not support it. It was a lie, which makes it scientific fraud.

Fred from Canuckistan
April 11, 2013 9:19 am

When there are no consequences for getting away with poor scientific method, you are inclined to repeat the process.
If you do it again, especially after the errors of your ways are repeatedly pointed out to you, it is fraud.

toml
April 11, 2013 9:19 am

It would be very interesting to repeat Marcott’s analysis dropping out one proxy at a time. It’s a pretty standard sensitivity analysis method.
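
A leave-one-out pass is only a few lines of code. The sketch below runs on synthetic proxies sharing a common timeline (so it glosses over Marcott’s age-model perturbations entirely), just to show the shape of the test:

```python
import numpy as np

rng = np.random.default_rng(0)
proxies = rng.normal(0.0, 0.5, size=(12, 500))  # 12 fake proxies x 500 steps
proxies[0, -20:] += 4.0                         # give one proxy a late uptick

full = proxies.mean(axis=0)                     # the stack with everything in
for i in range(len(proxies)):
    loo = np.delete(proxies, i, axis=0).mean(axis=0)  # stack without proxy i
    print(f"drop proxy {i:2d}: stack changes by up to {abs(full - loo).max():.2f}")
```

On data like these, dropping the uptick proxy moves the late end of the stack several times more than dropping any other series, which is exactly the kind of non-robustness such a test is meant to expose.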

April 11, 2013 9:27 am

Reblogged this on Climate Ponderings.

April 11, 2013 9:28 am

What’s driving me crazy in the climate sciences is the same crazy-making in marketing a la Lewandowsky or whatever his name is: so much work is done and presented that is MATHEMATICALLY correct, in that correct PROCEDURES were followed, but is not REPRESENTATIVELY correct, i.e. what it shows either is a poor representation of reality or is invalid in that its actual, not mathematically correct, uncertainty is terrible.
Lewandowsky/whatever worked his numbers but they do not represent the skeptic community. The actual uncertainty by any standard other than internal mathematics is huge; the results are invalid. Marcott and Mann worked their numbers but had to splice on the instrument record because the proxy data fails to match the instrument data; the proxy-to-instrument data correlation is hugely uncertain.
“97%” of scientists support CAGW: this is another mathematically correct statement BY PROCEDURE but not by representation. It is definitely uncertain and (by experience) invalid. But the math is good.
Perhaps statistics has a term for this, but I’ll give one: Procedural Certainty vs Representational Certainty.
Innocent people go to jail because of the lack of public recognition of the difference. Procedural certainty says you murdered someone because you knew him and the murder weapon (sans fingerprints) was found at your house. Representational certainty says you murdered someone because the closed circuit cameras filmed you doing it.
There is a fundamental gap between procedural and representational certainty. Proxies are all about procedure, while instrumental are/can be representational. The step between is filled with uncertainty.
It is this gap that drives me to say Marcott and Mann misrepresent both science and reality. They speak – along with the IPCC – of 95% certainty, but what they mean is procedural certainty. The average citizen and political leader are not educated and suspicious enough of experts of any stripe to know the difference, because in life we are concerned with representational certainty, not procedural.
To wit: The man who says there is no tiger in the bush because it is night and he can’t see it, his normal procedure for detecting hungry predators, is recognized as a short-lived fool by those who know that simply using that “proxy” means little at night. Procedurally the tiger isn’t there, but representationally, the tiger is ready for a meal.
The best thing personally about the outrageous climate debate is that it has spurred me to think deeply about not just science and the world, but why people do what they do, especially how they sleep at night (Al Gore sleeps well on a soft pillow of money, and David Suzuki sleeps well with the conviction that, on death, he will be received by God as one of His Humble Saints, if you were wondering what I had decided.)

steveta_uk
April 11, 2013 9:42 am

Doug Proctor, this difference was something I questioned some time ago, when a new set of temp reconstructions had been published, and each one had an error band with the usual 95% certainty.
But the different reconstructions had quite different, non-overlapping values, including the error bands. And the climate scientist involved didn’t seem to understand when I asked how they could claim that the true value lies within these error bands despite the non-overlapping results. And I was told that I didn’t understand the math. Daft.

geran
April 11, 2013 9:50 am

Doug Proctor says:
April 11, 2013 at 9:28 am
>>>>>>
Exactly! Well stated!

DirkH
April 11, 2013 10:13 am

Gary says:
April 11, 2013 at 8:07 am
“What else can explain how they completely ignore warnings about the context of a data series except ignorance of a larger perspective? [And don’t say deliberate fraud. That’s going too far, so don’t even go there.]”
Well, they had the best expert for this kind of analysis on board: Marcott, who had already used it in his doctoral thesis – with a completely different result. So “ignorance of a larger perspective” can be ruled out.
Which leaves…

markx
April 11, 2013 10:17 am

trafamadore says: April 11, 2013 at 8:43 am
Here is the data part of the abstract from the Marcott paper:
“Here we provide a broader perspective by reconstructing regional and global temperature anomalies for the past 11,300 years from 73 globally distributed records. Early Holocene (10,000 to 5000 years ago) warmth is followed by ~0.7°C cooling through the middle to late Holocene (<5000 years ago), culminating in the coolest temperatures of the Holocene during the Little Ice Age, about 200 years ago. This cooling is largely associated with ~2°C change in the North Atlantic."
Exactly, which part of this abstract is wrong? Because I don't see much about the last 100 years. The only time it comes up is in the discussion, where they are comparing their data to data that they reference from others, which includes the instrument record.

Marvelous stuff, Tramp!
Yes, I am sure their whole paper was only about showing that the early Holocene was hotter than today, then it got a lot colder…
They probably did not really mean to finish on this note:

Global temperature, therefore, has risen from near the coldest to the warmest levels of the Holocene within the past century, reversing the long-term cooling trend that began ~5000 yr B.P.
Climate models project that temperatures are likely to exceed the full distribution of Holocene warmth by 2100 for all versions of the temperature stack (35) (Fig. 3), regardless of the greenhouse gas emission scenario considered (excluding the year 2000 constant composition scenario, which has already been exceeded). By 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario (35) based on our Standard 5×5 plus high-frequency addition stack (Fig. 3).

StanleySteamer
April 11, 2013 10:30 am

I tell my Stats students this all the time: Excel (SPSS, SAS, name your software) is not smart enough to know that you are stupid.

markx
April 11, 2013 10:33 am

Doug Proctor says: April 11, 2013 at 9:28 am
“Al Gore sleeps well on a soft pillow of money, and David Suzuki sleeps well with the conviction that, on death, he will be received by God as one of His Humble Saints, if you were wondering what I had decided.”
David Suzuki also sleeps well on a soft pillow of money, albeit a smaller one:
David Suzuki and his registered charity:
• More than US$8 Million in donations in 2010
• US$4 million as his salary
• 16 lobbyists on the payroll
• US$81 million in donations since 2000.
http://ezralevant.com/2011/12/davids-details.html
http://fairquestions.typepad.com/rethink_campaigns/david-suzuki-foundation-70-million.html
