Guest Post by David Middleton
The Gorebots are all atwitter about this new paper…
Science 8 March 2013:
Vol. 339 no. 6124 pp. 1198-1201
DOI: 10.1126/science.1228026
A Reconstruction of Regional and Global Temperature for the Past 11,300 Years
Shaun A. Marcott, Jeremy D. Shakun, Peter U. Clark, Alan C. Mix
Surface temperature reconstructions of the past 1500 years suggest that recent warming is unprecedented in that time. Here we provide a broader perspective by reconstructing regional and global temperature anomalies for the past 11,300 years from 73 globally distributed records. Early Holocene (10,000 to 5000 years ago) warmth is followed by ~0.7°C cooling through the middle to late Holocene (<5000 years ago), culminating in the coolest temperatures of the Holocene during the Little Ice Age, about 200 years ago. This cooling is largely associated with ~2°C change in the North Atlantic. Current global temperatures of the past decade have not yet exceeded peak interglacial values but are warmer than during ~75% of the Holocene temperature history. Intergovernmental Panel on Climate Change model projections for 2100 exceed the full distribution of Holocene temperature under all plausible greenhouse gas emission scenarios.
Marcott et al., 2013 is behind a paywall; however, the supplementary materials include a link to their proxy data.
This paper appears to be a textbook example of creating a Hockey Stick by using a low resolution time series for the handle and a high resolution time series for the blade…
Let’s test one of the 73 proxies.
I picked ODP-1019D, a marine sediment core from just offshore of the California-Oregon border, because it has a long time series, is an annual reconstruction and has a nearby long instrumental record (Grants Pass, OR).
ODP-1019D has a resolution of 140 years. Grants Pass is annually resolved…
Let’s filter Grants Pass down to the resolution of the Marcott et al. reconstruction…
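For anyone who wants to try this at home, here is a minimal sketch of that kind of filtering (my own illustration in Python, not the exact workflow behind the figures; the file name and column names are hypothetical placeholders): block-average the annual station record into bins matching the ~140-year spacing of ODP-1019D.

import numpy as np
import pandas as pd

# Collapse an annual station record to the ~140-year sample spacing of ODP-1019D
# by block-averaging. "grants_pass_annual.csv" with columns year/temp_c is a
# placeholder for whatever annual series you have on hand.
grants_pass = pd.read_csv("grants_pass_annual.csv")

resolution_yr = 140
bins = np.arange(grants_pass["year"].min(),
                 grants_pass["year"].max() + resolution_yr,
                 resolution_yr)

# Each 140-year block of annual values collapses to one mean. Since the Grants
# Pass record is shorter than 140 years, it reduces to a single data point.
coarse = (grants_pass
          .groupby(pd.cut(grants_pass["year"], bins, include_lowest=True),
                   observed=True)["temp_c"]
          .mean())
print(coarse)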
Grants Pass sure looks very anomalous relative to the rest of the Holocene… Right?
Well, not so fast. ODP-1019D only has a 140-yr resolution. The record length at Grants Pass is less than 140 years. So, the entire Grants Pass record would be a single data point in the ODP-1019D record…
While the most recent ~140 years might be warmer than most of the rest of the Holocene in this particular area, does anyone else notice what I did?
The Grants Pass/ODP-1019D area has been warming at a fairly steady rate for 6,500 years…
I don’t know how many of these proxies I will have time to analyze… Probably not very many. Maybe this could become a WUWT crowd-sourcing project.
[Title revised per notes below, 03-11-2013. Mod]
Surely one of the comments above has questioned how you can see a 30- or 50-year rise in a contemporary temperature series within a 140-year resolution proxy? Sorry, I don’t have the time to get through all the comments. Is this a debunking or silliness?
see Robert Rohde to Andy Revkin
I’m no scientist, but as soon as I read over the SI it had me thinking that the study cannot treat intervals of less than a few centuries with high confidence at all. Thus, how can Mann & friends already be saying that the latest hockey stick shows something decisive about an unprecedented “rate” of temp. change in any part of the 20th century? We can’t know from this study what rates of temp. change might have existed before the 20th century… (and it’s not clear how they got their hockey stick blade at all). Pre-20th-century intervals are heavily smeared/blurred with this method. One could not know if there were rapid increases (or decreases) within prior periods shorter than 300-400 years.
fyi, Robert Rohde notes that the study’s methods lack resolution in 100-400 year intervals, hence one cannot directly compare recent temp. changes to past centennial events:
Theo Goodwin says:
March 11, 2013 at 4:20 pm
Latitude says:
March 11, 2013 at 2:31 pm
“David, James and Hank have been all over it…..have a look
http://suyts.wordpress.com/2013/03/11/the-dagger-in-the-heart-maybe-a-remedial-explanation-of-marcotts-hs-blade-mikey-whats-that-about-a-dagger/”
Everyone has to read this one. Your great-grand can understand it, and it is brilliant.
==============================================================
Theo, thank you for the kind words. It’s really appreciated. It takes me back to a time when I regularly played here!
Anthony, you are welcome to any, all, or part of the post if you wish. I was worried that I hadn’t got quite the right mix between being explanatory enough and sophisticated enough. But, after a few comments on my blog and a few from people like Theo, I’m satisfied I hit what I aimed for.
• • •
[Reply: As long as we’re being so kissy-face, I should mention that your kind comments are always appreciated here. — mod.]
I think that the Suyts article referenced above by Theo Goodwin is excellent for the layman to grasp these parlour tricks. What’s going on here is so dishonest, for a subject so important, that it would put many a Victorian seance charlatan to shame.
I would encourage Anthony to explore the possibility of carrying that post here at WUWT to give it some more exposure.
Monckton of Brenchley says (March 11, 2013 at 4:06 pm): “In my expert review of the next daft of the IPCC’s Fifth Assessment Report I’ll make sure proper attention is given to this point.”
OK, now I’m going to be up all night wondering if that was deliberate…
“When hockey players are choosing a hockey stick they tend to be pretty picky about what they like, and what they simply can not stand to play with. The decision mainly comes down to the “feel” of the hockey stick. “Feel” is the term used to describe the player’s ability to accurately sense and control the puck with the hockey stick that they are using. This includes both offensive and defensive maneuvers, which consist of but are not limited to puck-handling, shooting, passing, reach and poke-checking. As a player develops these skills, they also acquire a better sense of which stick will do the trick for them.”
http://www.hockeygiant.com/hockey-giant-buying-guide-how-to-buy-hockey-sticks.html
If you are graphing two different things, two different graphs should be used. This “graph” is not a graph at all, but merely 100% falsehood. The Autobahn analogy above is very apt.
What sort of calibration was used for which of these 73 “proxies?” Another falsehood.
“Climate Science” needs a trial before a jury, with an aggressive prosecuting attorney. Let’s hope EPA does something stupid enough to achieve this goal…
Dave, have a look at TN057-17. It ranges up to 1950 but swings wildly around the mean, with variations > 4°C between samples. It is sensitive to something, but it doesn’t appear to be temperature, and it makes one of its wild excursions right at 1950 (BP = 0). It has me scratching my head as to why it was included as one of the proxies. What are your thoughts?
Regards,
Hank Hancock
[Reply: As long as we’re being so kissy-face, I should mention that your kind comments are always appreciated here. — mod.]
==============================================
“Kissy-face”. That’s a first for me. Thanks mod]. 😐
ZootCadillac says:
March 11, 2013 at 9:35 pm
I think that the Suyts article referenced above …..
======================================
Thanks Zoot,
I went back and re-read the post. I agree! It’s pretty good! I think it strikes a fair balance between showing and telling. It’s tricky to do so. I’d be more gracious and whatnot, but mod cured me of that! 🙂
Anthony is always welcome to grab what he can use.
Skiphil says:
March 11, 2013 at 8:21 pm
…Few if any of the proxies offer any (visual) support for the blade of the hockey stick, so what is in the stats “food processor” that produces such an outcome?
Turn up the heat with Viadgraph, the new climate virility drug from Hyde D. Klein!
David Middleton: “I don’t think Marcott et al. did this; but I haven’t read the paper.”
Good to see you could come to conclusions about the paper without reading it. So much easier that way.
The paper talks about resolution in a very weird way (short excerpt provided under fair use):
“Because the relatively low resolution and time-uncertainty of our data sets should generally suppress higher-frequency temperature variability, an important question is whether the Holocene stack adequately represents centennial- or millennial-scale variability. We evaluated this question in two ways. First, [they mistakenly simulated with white noise instead of red noise], with essentially no variability preserved at periods shorter than 300 years…. Second, spectral analysis indicates that the variance of the Holocene proxy stack approaches that of the global CRU-EIV reconstruction of the past 1500 years (2) at millennial time scales and longer (figs. S20 and S23).”
In other words, they found no preservation of variability at short periods even with the wrong kind of simulation (white noise, not red; a red-noise simulation would have shown more variability), but that’s OK because the spectral analysis of their reconstruction matches the spectral analysis of the CRU 1500-year reconstruction (as if that means anything). Did they even notice that the lack of preservation of variability means they can go no further? But they go further:
“Our global temperature reconstruction for the past 1500 years is indistinguishable within uncertainty from the Mann et al. (2) reconstruction”
So apparently their methodology wins 2 goals to 1. Finally, they have the temerity to add some fake variability back in as red noise in Figure 3 (after using white noise incorrectly), where they show that the 2000-2009 period is well within the temperature “probability distribution” (I hate those things) of their proxy series plus some red noise.
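For what it’s worth, here is a small, self-contained illustration (mine, not the authors’ Monte Carlo; the AR(1) persistence of 0.98 is just an assumed value) of why the white-versus-red-noise choice matters when asking how much variability survives a ~300-year smoothing:

import numpy as np

rng = np.random.default_rng(0)
n_years, window = 11_300, 300

# White noise: no year-to-year persistence.
white = rng.normal(size=n_years)

# "Red" noise: an AR(1) process; phi = 0.98 is an assumed persistence, for illustration.
phi = 0.98
red = np.zeros(n_years)
for t in range(1, n_years):
    red[t] = phi * red[t - 1] + rng.normal()

def surviving_variance_fraction(x, window):
    """Fraction of a series' variance left after a boxcar running mean."""
    smooth = np.convolve(x, np.ones(window) / window, mode="valid")
    return smooth.var() / x.var()

print("white:", surviving_variance_fraction(white, window))  # ~1/300 -- almost nothing survives
print("red:  ", surviving_variance_fraction(red, window))    # a much larger fraction survives

Testing signal recovery against white noise therefore stacks the deck toward finding nothing at short periods, which is the commenter’s point above.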
This paper is pathetic. Was it worth $20? Under normal circumstances for entertainment, yes. But they are using this to maintain the drumbeat of propaganda in the IPCC report, not to add to greater understanding of climate. To answer some people above, Marcott is clearly the next Mann. The climaterati will circle the wagons around him and defend this crap to the death. He will bring in new grants and go on speaking tours.
I wonder if we can look at this from another angle. Looking at the instrumental record as far back as one can (individual records), is there any statistical juice to be extracted from the frequency of spikes or oscillations in decadal or multidecadal smoothings? Maybe the 60-yr cycles? Can we say that in any smoothed record we can have spikes of, say, 3+ sigma three times in a century? From discussion of the proxy graph here and elsewhere, it is clear that it has been smoothed perforce by 160+/- year averaging. If it is 100% certain that there were spikes sticking up above this band, then there should be a “spikey” strip added to the top and bottom of the gray ribbon. It would at least remove the simple-minded chatter about unprecedented temperature highs… blah blah. Wm Briggs? Steve Mac?
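As a crude starting point for sizing that “spikey” strip: a boxcar average of width W knocks a rectangular spike of duration d < W down to roughly d/W of its amplitude. A back-of-envelope, with the 160-year figure assumed from the discussion above:

# Back-of-envelope attenuation of short spikes by a 160-year boxcar mean.
W = 160                              # assumed smoothing width, years
for d in (10, 30, 50, 100):          # spike durations, years
    print(f"a {d}-yr, 1.0 °C spike survives as ~{d / W:.2f} °C after a {W}-yr mean")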
So the time resolution for the paleo data is 140 years on the horizontal axis.
The entire data set on the horizontal axis is 16000 years. In other words the possible error for the paleo data is less than 1% of the horizontal axis.
You could shift the blue line to the left or right by this amount and you would never notice it.
This quibble is meaningless in the interpretation of the long-term data compared with the very rapid rise in temperature as measured by instruments for the last century and a half.
I see the word “unprecedented” and know immediately that what follows is twaddle.
@- David Middleton – Re:- previous atmospheric CO2 levels
“Two words: Plant stomata.”
Two more: radiocarbon dating.
If there had been any significant changes in atmospheric CO2 levels during the Holocene it would show up in a very profound correction necessary in the radiocarbon dating curve. The only reason that radiocarbon dating works without a big correction is that solar activity and the carbon cycle have been extremely stable for the recent period.
Until anthropogenic effects intervened….
Give them credit for their honesty and integrity in providing the proxy data on which they based their statistical analysis for all to replicate and review. They may have erred in their approach and overall conclusion, but they never strayed from the true scientific method.
As for all the other Big Climate dogs that ate their own homework: may you continue to itch and scratch in purgatory, you flea-bitten mongrels of politicised, post-normal science.
The reader should be pre-warned by looking at the author list:
Shakun, Mix et al (2013) Science.
In honour of Mike’s Nature Trick™ we now have the Shakun-Mix method.
The paper is pay-walled… The data and general description of the methods are not. I generally don’t purchase access to papers when the data are freely available. However, it is obvious from their reconstruction that they did not reconstitute a high-frequency signal in their proxy stacks; nor did they filter the higher-frequency proxy and instrumental signals down to the bandwidth of the low-frequency proxies.
Nor does anything in the supplemental information suggest that they homogenized the bandwidth of the time series.
I’m giving them some benefit of the doubt that maybe they addressed this issue in the paper itself; however, Robert Rohde’s comments suggest that they did not. He did say, though, that they acknowledged the resolution issues.
So we’re left with a press release and media frenzy claiming that the last few decades are essentially unprecedented in the Holocene, even though the paper, the data and the supplemental information don’t support this assertion.
***
Nick Stokes says:
March 11, 2013 at 4:46 pm
However, the spike of the last century does have special status, apart from being well observed. It follows the insertion of 350 Gtons C into the atmosphere from fossil fuels. That hasn’t happened before.
****
“It follows the change from .00028 to .00039 CO2 of the atmosphere from fossil fuels.”
There, fixed it for ya.
δ13C has been quite variable throughout the Holocene. Antarctic ice core derived δ13C values are just as poorly resolved as CO2 concentrations.
See figure 5 in Dusting for Fingerprints in the Holocene.
The actual temporal resolution of the reconstruction is on the order of 400 years. The specific marine core I looked at had a resolution of 140 years.
It’s not a “quibble” about potential errors; it’s basic signal theory. A signal with a 400-year resolution will not reflect century-scale changes, much less decadal-scale changes.
Robert Rohde explained the principle very clearly to Andy Revkin…
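To make the point concrete, here is a toy demonstration with purely synthetic data (my own construction, not the Marcott et al. proxies): inject a 100-year, 1 °C warm event into an 11,300-year series and reduce it to a ~400-year effective resolution.

import numpy as np

years = np.arange(11_300)                                        # synthetic "years before present"
spike = np.where((years >= 5_000) & (years < 5_100), 1.0, 0.0)   # a 100-yr, 1 °C warm event

window = 400                                                     # assumed effective resolution
smoothed = np.convolve(spike, np.ones(window) / window, mode="same")

print("raw amplitude:     ", spike.max())       # 1.0 °C
print("smoothed amplitude:", smoothed.max())    # 0.25 °C, smeared over ~500 years

Sampled every 400 years on top of that, the event is at best one slightly warm point, which is the sense in which the blade and the handle are not comparable.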
@- David Middleton
“δ13C has been quite variable throughout the Holocene. Antarctic ice core derived δ13C values are just as poorly resolved as CO2 concentrations.”
Radiocarbon dating uses the C14 isotope, but I expect you knew that.
The C13/C12 ratio is a measure of the relative role of biomass and geology in the carbon cycle. If the TOTAL atmospheric CO2 level had varied much in the past, however, it would have diluted the C14 formed from nitrogen, and the radiocarbon correction curve would be much more varied to reflect this.
Leaf stomata are an epigenetic response to many more parameters than just CO2 levels.
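A back-of-envelope check of the dilution argument above (my own arithmetic, not izen’s): assume the added CO2 carries essentially no C14 (e.g., old oceanic or geologic carbon), ignore ocean exchange and re-equilibration (so this is an upper bound), and use the 280 → 390 ppm change quoted earlier in the thread as a yardstick.

import math

LIBBY_MEAN_LIFE = 8033            # years; the conventional radiocarbon constant

def apparent_aging(co2_before_ppm, co2_after_ppm):
    """Apparent radiocarbon-age offset from diluting atmospheric C14 with C14-free CO2."""
    fraction_modern = co2_before_ppm / co2_after_ppm
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

print(round(apparent_aging(280, 390)), "years of apparent aging")   # roughly 2,700 years

An undetected Holocene CO2 swing of that size would, on this crude view, leave a very large and very visible wobble in the calibration curve.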
Excuse my ignorance but what annual event off the coast of Oregon creates layers of sediment that accurately measure temperature?
This is why the people who reconstruct plant stomata CO2 chronologies control for sunlight, precipitation and other environmental factors.
δ14C has also been highly variable throughout the Late Pleistocene and Holocene.