Guest Post by David Middleton
The Gorebots are all atwitter about this new paper…
Science 8 March 2013:
Vol. 339 no. 6124 pp. 1198-1201
DOI: 10.1126/science.1228026
A Reconstruction of Regional and Global Temperature for the Past 11,300 Years
Shaun A. Marcott, Jeremy D. Shakun, Peter U. Clark, Alan C. Mix
Surface temperature reconstructions of the past 1500 years suggest that recent warming is unprecedented in that time. Here we provide a broader perspective by reconstructing regional and global temperature anomalies for the past 11,300 years from 73 globally distributed records. Early Holocene (10,000 to 5000 years ago) warmth is followed by ~0.7°C cooling through the middle to late Holocene (<5000 years ago), culminating in the coolest temperatures of the Holocene during the Little Ice Age, about 200 years ago. This cooling is largely associated with ~2°C change in the North Atlantic. Current global temperatures of the past decade have not yet exceeded peak interglacial values but are warmer than during ~75% of the Holocene temperature history. Intergovernmental Panel on Climate Change model projections for 2100 exceed the full distribution of Holocene temperature under all plausible greenhouse gas emission scenarios.
Marcott et al., 2012 is behind a paywall; however the supplementary materials include a link to their proxy data.
This paper appears to be a text book example of creating a Hockey Stick by using a low resolution time series for the handle and a high resolution time series for the blade…
Let’s test one of the 73 proxies.
I picked ODP-1019D, a marine sediment core from just offshore of the California-Oregon border, because it has a long time series, is an annual reconstruction, and has a nearby long instrumental time series record (Grants Pass, OR).
ODP-1019D has a resolution of 140 years. Grants Pass is annually resolved…
Let’s filter Grants Pass down to the resolution of the Marcott et al. reconstruction…
Grants Pass sure looks very anomalous relative to the rest of the Holocene… Right?
Well, not so fast. ODP-1019D only has a 140-yr resolution. The record length at Grants Pass is less than 140 years. So, the entire Grants Pass record would be a single data point in the ODP-1019D record…
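A quick numerical sketch makes the point (illustrative Python with made-up numbers, not the actual Grants Pass data): block-averaging an annual record shorter than the proxy's resolution leaves you exactly one proxy-scale data point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an annual instrumental record roughly the
# length of Grants Pass (~124 years). Values are made-up anomalies.
years = np.arange(1890, 2014)                      # 124 annual samples
anoms = 0.005 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)

# Block-average down to ODP-1019D's 140-year resolution. The entire
# instrumental record fits inside a single 140-year block.
resolution = 140
n_points = int(np.ceil(years.size / resolution))
block_mean = anoms.mean()                          # the one proxy-scale value

print(n_points)  # -> 1
```

At proxy resolution, the whole modern record collapses into that single `block_mean` value.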
While the most recent ~140 years might be warmer than most of the rest of the Holocene in this particular area, does anyone else notice what I did?
The Grants Pass/ODP-1019D area has been warming at a fairly steady rate for 6,500 years…
I don’t know how many of these proxies I will have time to analyze… Probably not very many. Maybe this could become a WUWT crowd-sourcing project.
[Title revised per notes below, 03-11-2013. Mod]
David, James and Hank have been all over it…..have a look
http://suyts.wordpress.com/2013/03/11/the-dagger-in-the-heart-maybe-a-remedial-explanation-of-marcotts-hs-blade-mikey-whats-that-about-a-dagger/
…and here
http://suyts.wordpress.com/2013/03/10/the-hockey-stick-resurrected-by-marcott-et-al-2012/
Astonishing… climate science really has nothing to do with science anymore. How on earth can this pass any review when the error in this new hockey stick is this bad?
Thanks for exposing this infantile attempt at fraud. I hope they will get excommunicated from their universities ASAP.
Best explanation for hockey-sticking I have seen. Thank you. Book-marked and saved.
Unless bandwidth is reasonably constant across the entire time period, the claims on rates mean nothing. Grafting a low bandwidth reconstruction onto a high bandwidth time series makes for garbage.
I look forward to the crowd-sourcing.
So far this looks like the equivalent of the “one tree in Yamal” phenomenon 🙂
“This paper appears to be a text book example of creating a Hockey Stick by using a low resolution time series for the handle and a high resolution time series for the blade…”
You established your point beautifully. No, Marcott and friends did not use exactly Mikey’s “Nature trick” but they showed imagination and invented another trick that has the same effect as Mikey’s “Nature trick.” It is a “trick” in the bad sense of the word. It misrepresents data. It gives an illusion of something not only important but ominous where the truth is altogether mundane.
You have to wonder if some of these climate scientists, so-called, are ex-announcers for the NBA. You know, “He shoots from downtown! and puts it in!!!”
Elegant work!!!
Title and text should all say “2013” not “2012” for Marcott et al. (2013)
So this paper is yet another example of the trick of attaching (adjusted, of course) temperature data to the end of a proxy record. Not surprising that we get another hockey stick, though this one seems a little bent — perhaps it’s been used as a weapon since we last saw it?
Era of the Pharaohs: Climate was HOTTER THAN NOW, without CO2. And yet… Alexandria was NOT a flooded island. Weird.
http://www.theregister.co.uk/2013/03/11/holocene_was_warmer/
That was my initial citation. For some reason, I changed it to 2012. However, it does appear that the correct citation should be 2013. Hell, some days I can’t even remember what day it is… 😉
Can we see the “peer reviewers comments” that passed this crazy paper ?
The time resolution of individual proxy fossil data is greater than the entire period of surface temperature measurements. You cannot draw any conclusions. An analogy would be comparing your monthly averaged speed of car travel with one trip down a German autobahn yesterday.
Thanks for the link to Supplementary Materials. At first glance I don’t see the code.
Steven Mosher–I know you often ask for code. Would you be interested in asking for it here? I mention that because if Steve McI asks for it it will no doubt set off head explosions in the Alarmosphere.
Can you imagine if people like those who are on this site (thanks) were not here??
These guys would get away with this!!!
Oh wait,…. Al Gore,
stupid me
Isn’t there a penalty for ‘high-sticking’ in hockey?
Alfred
It looks like you placed the point for Grant’s pass at year 2000, but if it is really a 124-year-long record, shouldn’t you have placed the center point of that data at T = (2013 - 62), or the year 1951, to maintain the correct position in time?
This highlights a question I have been asking about for some time: are all of our temp profiles comparing the past to the present a case of “Apples and Oranges”?
The Greenland core data etc. looks to me like a data set with a >70 year averaging function. All the short-term highs AND lows have been disappeared. If we were to apply such a function to the last 200 years, I betcha we will see what we see here: nada of interest.
When can we expect to see the hockey-stick team putting ‘Cows’ into their graphs, or at least, splicing thermometers onto their asses ??/sarc
http://wattsupwiththat.com/2013/03/08/a-bridge-in-the-climate-debate-how-to-green-the-worlds-deserts-and-reverse-climate-change/#more-81728
Most grateful to David Middleton for pointing out the chalk-and-cheesy technique of splicing hi-res real-world data on to lo-res proxy data. In my expert review of the next daft of the IPCC’s Fifth Assessment Report I’ll make sure proper attention is given to this point.
Just one small correction to the text: where the head post says there has been fairly steady warming for 6500 years it should perhaps say 4500 years.
It may be more helpful to look at this data in terms of high frequency (or short wavelength) and low frequency (or long wavelength). The Grant’s Pass temperature record as shown appears to have no low frequency component to it, so it apparently contains no trend data on a scale (or wavelength) of one hundred years. Consequently, the Grant’s Pass temperature records would not merit even a single data point. Only the marine sediment proxy (ODP-1019D) would appear to have any low frequency (and, therefore, trend) information associated with it.
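The frequency-domain point above is easy to check numerically. A minimal sketch (Python, with a synthetic signal; the 400-year window is an assumption based on the reconstruction's effective resolution discussed later in the thread) showing that century-scale variability simply does not survive proxy-scale averaging:

```python
import numpy as np

# A pure century-scale oscillation, sampled annually for 10,000 years.
t = np.arange(10000)
centennial = 0.5 * np.sin(2 * np.pi * t / 100)     # +/- 0.5 C, 100-yr period

# Smooth with a 400-year boxcar, a stand-in for the reconstruction's
# effective resolution. 400 years = 4 full cycles, so every window
# averages the oscillation to essentially zero.
window = np.ones(400) / 400
smoothed = np.convolve(centennial, window, mode="valid")

print(round(centennial.std(), 3))   # -> 0.354: plenty of variability going in
print(round(smoothed.std(), 6))     # -> 0.0: the centennial signal is gone
```

Any trend slower than the window survives; anything on the scale of a century or less is low-pass filtered away, which is exactly why the proxy stack carries no trend information at that wavelength.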
Latitude says:
March 11, 2013 at 2:31 pm
David, James and Hank have been all over it…..have a look
http://suyts.wordpress.com/2013/03/11/the-dagger-in-the-heart-maybe-a-remedial-explanation-of-marcotts-hs-blade-mikey-whats-that-about-a-dagger/
=====================================================
Thanks Lat!
David, have a check. Feel free to use/borrow/take whatever you feel can better articulate the thought. You’re spot on! If you’re like me you had to stare at it for a while to see what the heck those pinheads did.
As Lat mentioned, if you want to see a different view, and a bit more heady for the stat nerds, Hank did a bang up job here….. http://suyts.wordpress.com/2013/03/10/the-hockey-stick-resurrected-by-marcott-et-al-2012/
Latitude says:
March 11, 2013 at 2:31 pm
“David, James and Hank have been all over it…..have a look
http://suyts.wordpress.com/2013/03/11/the-dagger-in-the-heart-maybe-a-remedial-explanation-of-marcotts-hs-blade-mikey-whats-that-about-a-dagger/”
Everyone has to read this one. Your great-grand can understand it and it is brilliant.
I saw this same technique applied a few years ago to a “hockey stick” of CO2 levels. They had grafted data from Antarctic ice cores to the modern Mauna Loa record showing an alarming increase in CO2 levels. Of course, in the desert of Antarctica snow gets blown and sifted by wind for years before it is compacted to ice so it is low-resolution compared to Mauna Loa data. Just another way to lie with statistics.
“This paper appears to be a text book example of creating a Hockey Stick by using a low resolution time series for the handle and a high resolution time series for the blade…”
That’s a reasonable criticism. And it shows a proper appreciation of what a HS is – a proxy shaft, with all its faults, and an instrumental blade. They go together.
However, the spike of the last century does have special status, apart from being well observed. It follows the insertion of 350 Gtons C into the atmosphere from fossil fuels. That hasn’t happened before. And when arguing about whether the rise has produced temperatures higher than at some point in the Holocene, and whether some past times might have been warmer, remember that there are at least 3500 more Gtons to burn.
“It’s worserer than we thunk! Hockey stickerish with super extra scary sauce!!!”
Given the simplicity of the peer review we just witnessed at http://suyts.wordpress.com/2013/03/11/the-dagger-in-the-heart-maybe-a-remedial-explanation-of-marcotts-hs-blade-mikey-whats-that-about-a-dagger/
…what on earth possessed Marcott et al. to value being included in AR5 enough to risk being ridiculed by everyone else?
Oops! of course the reviewers are…”D***ers”. What a parallel reality!
“This paper appears to be a text book example of creating a Hockey Stick by using a low resolution time series for the handle and a high resolution time series for the blade…”
yup.
However, it’s trivially true, since the blade is always higher resolution than the shaft.
The shaft almost always suffers from a loss of high frequency signal.
The real job will be addressing how they accounted for the potential loss of high frequency signal.
You’ve merely pointed out what they admit. In a clear way, I’ll add.
I think one of the problems is that there are just so many climate scientists.
In order to get published, in order to stick out so that professorships are offered, individual scientists have to extend the envelope and publish even weirder stuff than has gone before in order to get noticed.
Nobody wants to see yet another journal article that says “we ran a climate model and came up with +3.25C by 2100”. That’s been done 100 times already.
Nobody wants to see yet another article that says there is some uncertainty regarding the past climate. That’s been done 100 times already. Medieval Warm Period – 200 times already. Ice cores – 50 times.
Global warming will bring back Unicorns – now that is going to get you noticed. Global warming will extend Ebola infections to London by 2020 – now that is going to get you to some conferences.
Replicate Mann’s phony hockey stick – done a dozen times already – but Mann will put a good word in for you when you need a fellowship or an Assistant Professor position. Nuff said.
Posted this at bishophill – think that this may be appropriate here too.
It’s already been accepted by Climate Psyientists that traditional paleo-proxies such as tree-rings haven’t performed to specification for the last half-century or so.
The reasons for this are steeped in highly complex post-normal calculations involving quasi-mechanical teleconnections; a branch of funding-mathematics that transforms a political viewpoint into simple graphical outputs that can be displayed in Powerpoint by many and, it has been rumoured, even in Excel by a few highly-skilled eggheads!
The problem of how to fix the Anthropocenic proxy-problem occupied the waking and sleeping hours of some of the world’s brightest Nobel laureates-in-waiting for days on end until Dr Mann had his Eureka moment!
With hindsight his solution was both simple and brilliant, so simple that it didn’t need explaining and so brilliant that it illuminates the way for current groundbreakers such as Marcott 2012 et al.
I refer, of course, to Mike’s ‘TRICC’ (commonly misspelt as Trick) – an acronym for Transfer of Revenue Into Climate Coffers.
His thinking – I hypothesise – went a bit like this –
(A) The Paleo-proxies represent temperature.
(B) Recent proxies don’t give us the correct temperatures.
(C) But we’ve got the temperatures we need without using proxies.
thus (and this bit is pure Genius)
(D) We’ll use temperature records as a proxy for.. drumroll … TEMPERATURE!
Dr Marcott et Al (izzat thanks for the Internet-thingy Al?) have indeed stood on the shoulders of a giant.
Nick Stokes says:
March 11, 2013 at 4:46 pm
, remember that there are at least 3500 more Gtons to burn.
================
I’ve always wondered what type of person would look at this and think “Oh no, we’re getting warmer”…..
http://www.foresight.org/nanodot/wp-content/uploads/2009/12/vostok.png
Mosher–Do you have their code? It might explain how they deal with the loss of high frequency signal.
Aw, Nick, your fears provoke you to ‘fake but accurate’.
======
How do you know that hasn’t happened before? Ever hear of an exposed coal seam? Coal deposits have been exposed, eroded, and naturally burning for millions of years. How much coal has been released into the atmosphere in the past? Similarly, we only see the petroleum deposits which still exist — petroleum has been seeping to the surface for a long time also.
And 4,000 years ago, a lot of forests got burned off when agriculture began. I have no idea how many Gtons of C were released by that… nor how many times before that were continent-wide forests burned off.
It seems to me that Andrew Revkin’s caveats with regard to Marcott and friends substantially agree with the views expressed on this forum:
“Because the analysis method and sparse data used in this study will tend to blur out most century-scale changes, we can’t use the analysis of Marcott et al. to draw any firm conclusions about how unique the rapid changes of the twentieth century are compared to the previous 10,000 years. The 20th century may have had uniquely rapid warming, but we would need higher resolution data to draw that conclusion with any certainty. Similarly, one should be careful in comparing recent decades to early parts of their reconstruction, as one can easily fall into the trap of comparing a single year or decade to what is essentially an average of centuries. To their credit Marcott et al. do recognize and address the issue of suppressed high frequency variability at a number of places in their paper.”
However, Marcott and friends, and especially the project manager at NSF, present the work as if it is the hockey stick resurrected. They know what they have done, know that it was wrong, and know that they should apologize for it. Of course, now it is too late after the pro-CAGW media blitz, except for the necessary apology to the American taxpayer (NSF, ya’ know).
If I am in error in this post, I would appreciate correction.
Nick Stokes says:
March 11, 2013 at 4:46 pm
It follows the insertion of 350 Gtons C into the atmosphere from fossil fuels… remember that there are at least 3500 more Gtons to burn.
===========
so 91% is still in the ground!! so much for “peak oil”.
no use leaving the stuff in the ground. dig it up and pay down the debt so our children and grandchildren won’t be mortgaged to the Rothschilds. Otherwise, it is no different than being born into slavery.
Or if that doesn’t grab you, dig it up so we don’t send hundreds of trillions of $$ to the middle east to put nuclear weapons in the hands of folks that have been told by god they need to destroy us to save us.
You are going to need something a whole lot bigger than a fossil fuel powered air-conditioner to deal with a nuke in the neighborhood. 911 was a warning of what is likely to follow if you continue to pour your gold into the hands of your enemies. And if you think de-industrializing will make them like you better, well the poor need saving along with the rich.
Nick Stokes;
However, the spike of the last century does have special status, apart from being well observed. It follows the insertion of 350 Gtons C into the atmosphere from fossil fuels. That hasn’t happened before.
>>>>>>>>>>>>>>>>>>
It also follows the invention of the wheel. That hasn’t happened before. And writing. That hasn’t happened before. And sailing ships. That hasn’t happened before either.
Get real. The “spike” isn’t actually a spike unless you torture the data to make it look like a spike, and all sorts of things happened before it besides CO2 increases. Not to mention that the spike has had a blunt flat top for 17 years despite more gigatons C in the last 17 years than EVER before. Not to mention that CO2 has been higher in the past and it matters not one bit what the source was, it could have come from pink faeries or visitors from Alpha Centauri…it did diddly squat to temps so trying to claim that because it came from fossil fuels makes it different is just nonsense. It doesn’t even rise to nonsense.
I’m greatly depressed that climate “science” has been reduced to such chicanery. I’m also angry that peer review has been responsible for the publishing of such nonsense. Doesn’t anyone bother to examine the information presented in peer review?
AnonyMoose says:
March 11, 2013 at 6:01
“How do you know that hasn’t happened before?”
Because no evidence has been provided that it has. JP
The trailing average of an instrumental record is more “apples & apples” with a geologic proxy.
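That apples-to-apples idea can be sketched in a few lines (illustrative Python; the series is made up and `trailing_average` is a hypothetical helper, not anything from the paper):

```python
import numpy as np

def trailing_average(values, window):
    """Trailing (backward-looking) mean: each point is the average of the
    preceding `window` samples, the way a slow proxy integrates over time."""
    values = np.asarray(values, dtype=float)
    out = np.full(values.size, np.nan)
    for i in range(window - 1, values.size):
        out[i] = values[i - window + 1 : i + 1].mean()
    return out

# Made-up annual anomalies; a 140-yr trailing mean would need a record
# longer than any single station's, which is rather the point.
series = np.linspace(0.0, 1.0, 50)
smoothed = trailing_average(series, 10)
print(smoothed[-1])   # the mean of the final 10 values
```

Only after this kind of integration does the instrumental record live on the same footing as a geologic proxy.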
I still feel compelled to say …
For the period 1960-2012, CO2 goes from 316-385 ppm and HADCRUT3 global from ~287.7-288.4K.
So while CO2 increases 21%, temperature increases 0.24%. What part of the 0.24% is due to man-converted CO2?
Lord Monckton, I am truly humbled by your compliments. In this particular area, the warming appears to have begun around 4500 BC. This would make ~6,500 years of warming.
Bill Illis says:
March 11, 2013 at 5:07 pm
“I think one of the problems is that there are just so many climate scientists.”
Aside from the unregulated flood of funding into climate science and all things CAGW, the huge expansion in the number of graduate programs in climate science has been unconscionable. More radicals tenured for life and forever begging/threatening for taxpayer support.
Steve, I appreciate the kind words and always welcome your constructive criticism.
My view on handling the high frequency loss in the proxy data is to filter the higher frequency signal down to the lowest frequency proxy series, unless there’s something that can be used as a deconvolution operator to “recover” the higher frequencies of the low frequency proxy signal. I don’t think Marcott et al. did this; but I haven’t read the paper. From what I can tell, their stacking procedure appeared to be very flawed. A big like stacking raw shot records without proper move out corrections.
Steven Mosher says:
March 11, 2013 at 5:01 pm
“The real job will be addressing how they accounted for the potential loss of high frequency signal.”
Done. And what the accounting reveals is not pretty.
http://suyts.wordpress.com/2013/03/11/the-dagger-in-the-heart-maybe-a-remedial-explanation-of-marcotts-hs-blade-mikey-whats-that-about-a-dagger/
Two words: Plant stomata.
A typo in my reply to Steve Mosher…
A bit like stacking raw shot records without proper move out corrections.
I take it, then, that livestock grazing in Greenland has never been better.
I’m sorry Dave….
“Lord Monckton, I am truly humbled by your compliments”
“but I haven’t read the paper”
Brandon had a comment at CA with links to graphs of all of the individual proxies for the Marcott et al. (2013) study:
http://climateaudit.org/2010/02/03/the-hockey-stick-and-milankovitch-theory/#comment-403822
It doesn’t prove anything without proper statistical analysis but it does indicate how “curious” it is that the paper could get such a hockey stick out of these proxies! Few if any of the proxies offer any (visual) support for the blade of the hockey stick, so what is in the stats “food processor” that produces such an outcome?
Surely one of the comments above has questioned how you can see a 30- or 50-yr rise in contemporary temps in a proxy series with 140-yr resolution? Sorry, I don’t have the time to get through all the comments. Is this debunking or silliness?
see Robert Rohde to Andy Revkin
I’m no scientist but as soon as I read over the SI it had me thinking that the study cannot treat intervals of less than a few centuries with high confidence at all. Thus, how can Mann & friends already be saying that the latest hockey stick shows something decisive about an unprecedented “rate” of temp. change in any part of the 20th century? We can’t know from this study what rates of temp. change might have existed before the 20th century…. (and it’s not clear how they got their hockey stick blade at all). The pre-20th centuries are much smeared/blurred with this method. One could not know if there were rapid increases (or decreases) in certain prior periods under 300-400 years.
fyi, Robert Rohde notes that the study’s methods lack resolution in 100-400 year intervals, hence one cannot directly compare recent temp. changes to past centennial events:
Theo Goodwin says:
March 11, 2013 at 4:20 pm
Latitude says:
March 11, 2013 at 2:31 pm
“David, James and Hank have been all over it…..have a look
http://suyts.wordpress.com/2013/03/11/the-dagger-in-the-heart-maybe-a-remedial-explanation-of-marcotts-hs-blade-mikey-whats-that-about-a-dagger/”
Everyone has to read this one. Your great-grand can understand it and it is brilliant.
==============================================================
Theo, thank you for the kind words. It’s really appreciated. It takes me back to a time when I regularly played here!
Anthony, you are welcome to any, all, or part of the post if you wish. I was worried that I hadn’t got quite the right mix between being explanatory enough and sophisticated enough. But, after a few comments on my blog and a few from people like Theo, I’m satisfied I hit what I aimed for.
• • •
Reply: As long as we’re being so kissy-face, I should mention that your kind comments are always appreciated here. — mod.]
I think that the Suyts article referenced above by Theo Goodwin is excellent for the layman to grasp these parlour tricks. What’s going on here is so dishonest for a subject so important that it would put many a Victorian seance charlatan to shame.
I would encourage Anthony to explore the possibility of carrying that post here at WUWT to give it some more exposure.
Monckton of Brenchley says (March 11, 2013 at 4:06 pm): “In my expert review of the next daft of the IPCC’s Fifth Assessment Report I’ll make sure proper attention is given to this point.”
OK, now I’m going to be up all night wondering if that was deliberate…
“When hockey players are choosing a hockey stick they tend to be pretty picky about what they like, and what they simply can not stand to play with. The decision mainly comes down to the “feel” of the hockey stick. “Feel” is the term used to describe the player’s ability to accurately sense and control the puck with the hockey stick that they are using. This includes both offensive and defensive maneuvers, which consist of but are not limited to puck-handling, shooting, passing, reach and poke-checking. As a player develops these skills, they also acquire a better sense of which stick will do the trick for them.”
http://www.hockeygiant.com/hockey-giant-buying-guide-how-to-buy-hockey-sticks.html
If graphing two different things, two different graphs should be used. This “graph” is not a graph at all, but merely 100% falsehood. The Autobahn analogy above is very apt.
What sort of calibration was used for which of these 73 “proxies?” Another falsehood.
“Climate Science” needs a trial before a jury, with an aggressive prosecuting attorney. Let’s hope EPA does something stupid enough to achieve this goal…
Dave, have a look at TN057-17. It ranges up to 1950 but swings wildly around the mean with variations > 4C between samples. It is sensitive to something but it doesn’t appear to be temperature and makes one of its wild excursions right at 1950 (BP=0). It has me scratching my head as to why it was included as one of the proxies. What are your thoughts?
Regards,
Hank Hancock
Reply: As long as we’re being so kissy-face, I should mention that your kind comments are always appreciated here. — mod.]
==============================================
“Kissy-face”. That’s a first for me. Thanks mod]. 😐
ZootCadillac says:
March 11, 2013 at 9:35 pm
I think that the Suyts article referenced above …..
======================================
Thanks Zoot,
I went back and re-read the post. I agree! It’s pretty good! I think it does a fair balance between showing and telling. It’s tricky to do so. I’d be more gracious and whatnot, but mod cured me of that! 🙂
Anthony is always welcome to grab what he can use.
Skiphil says:
March 11, 2013 at 8:21 pm
…Few if any of the proxies offer any (visual) support for the blade of the hockey stick, so what is in the stats “food processor” that produces such an outcome?
Turn up the heat with Viadgraph, the new climate virility drug from Hyde D. Klein!
David Middleton: “I don’t think Marcott et al. did this; but I haven’t read the paper.”
Good to see you could come to conclusions about the paper without reading it. So much easier that way.
The paper talks about resolution in a very weird way (short excerpt provided under fair use):
“Because the relatively low resolution and time- uncertainty of our data sets should generally suppress higher-frequency temperature variability, an important question is whether the Holocene stack adequately represents centennial- or millennial- scale variability. We evaluated this question in two ways. First, [they mistakenly simulated with white noise instead of red noise], with essentially no variability preserved at periods shorter than 300 years…. Second, spectral analysis indicates that the variance of the Holocene proxy stack approaches that of the global CRU-EIV reconstruction of the past 1500 years (2) at millennial time scales and longer (figs. S20 and S23).”
In other words, they found no preservation of variability for short periods even with the wrong kind of simulation (white noise, not red; a red noise simulation would have shown more variability), but that’s OK because the spectral analysis of their reconstruction matches the spectral analysis of the CRU 1500-year reconstruction (like that means anything). Did they even notice that the lack of preservation of variability means they can go no further? But they go further:
“Our global temperature reconstruction for the past 1500 years is indistinguishable within uncertainty from the Mann et al. (2) reconstruction”
So apparently their methodology wins 2 goals to 1. Finally, they have the temerity to add some fake variability back in as red noise in figure 3 (after using white noise incorrectly). In figure 3 they show that the 2000-2009 period is well within the temperature “probability distribution” (I hate those things) of their proxy series plus some red noise.
This paper is pathetic. Was it worth $20? Under normal circumstances for entertainment, yes. But they are using this to maintain the drumbeat of propaganda in the IPCC report, not to add to greater understanding of climate. To answer some people above, Marcott is clearly the next Mann. The climaterati will circle the wagons around him and defend this crap to the death. He will bring in new grants and go on speaking tours.
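The white-versus-red-noise objection above can be checked numerically. A sketch (Python; the AR(1) coefficient of 0.9 and the 140-year window are illustrative assumptions, not the paper's values) showing that proxy-scale smoothing suppresses white noise far more than red noise of the same variance, so a white-noise simulation understates how much real variability could hide under the smoothing:

```python
import numpy as np

rng = np.random.default_rng(42)
n, window = 20000, 140

# White noise, unit variance.
white = rng.normal(0.0, 1.0, n)

# AR(1) "red" noise scaled to the same unit marginal variance.
# phi = 0.9 is an illustrative persistence, chosen for the demo.
phi = 0.9
red = np.empty(n)
red[0] = rng.normal()
for i in range(1, n):
    red[i] = phi * red[i - 1] + rng.normal(0.0, np.sqrt(1.0 - phi**2))

# 140-year boxcar smoothing, as a low-resolution proxy would impose.
box = np.ones(window) / window
white_sm = np.convolve(white, box, mode="valid")
red_sm = np.convolve(red, box, mode="valid")

# Smoothing nearly wipes out the white noise (std shrinks by ~1/sqrt(140))
# but leaves the red noise with several times more residual variability,
# because red noise carries more low-frequency power.
print(white_sm.std(), red_sm.std())
```

Both series start with the same variance; after smoothing, the red series retains roughly four times the standard deviation of the white one in this setup.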
I wonder if we can look at this from another angle. Looking at the instrumental record as far back as one can (individual records), is there any statistical juice to be extracted from the frequency of spikes or oscillations of decadal, or multidecadal, smoothings? Maybe the 60-yr cycles? Can we say that in any smoothed record, we can have spikes of, say, 3+ sigma 3 times in a century? From discussion of the proxy graph here and elsewhere, it is clear that it has been smoothed perforce by 160+/- year averaging. If it is 100% certain that there were spikes sticking up above this band then there should be a “spikey” strip added to the top and bottom of the gray ribbon. It would at least remove the simple-minded chatter about unprecedented temperature highs… blah blah. Wm Briggs? Steve Mac?
So the time resolution for the paleo data is 140 years on the horizontal axis.
The entire data set on the horizontal axis is 16,000 years. In other words, the possible error for the paleo data is less than 1% of the horizontal axis.
You could shift the blue line to the left or right by this amount and you would never notice it.
This quibble is meaningless in the interpretation of the long-term data compared with the very rapid rise in temperature as measured by instruments for the last century and a half.
I see the word “unprecedented” and know immediately that what follows is twaddle.
@- David Middleton – Re:- previous atmospheric CO2 levels
“Two words: Plant stomata.”
Two more: radiocarbon dating.
If there had been any significant changes in atmospheric CO2 levels during the Holocene it would show up in a very profound correction necessary in the radiocarbon dating curve. The only reason that radiocarbon dating works without a big correction is that solar activity and the carbon cycle have been extremely stable for the recent period.
Until anthropogenic effects intervened….
Give them credit for their honesty and integrity in providing the proxy data on which they based their statistical analysis for all to replicate and review. They may have erred in their approach and overall conclusion, but they never strayed from the true scientific method.
As for all the other Big Climate dogs that ate their own homework: may you continue to itch and scratch in purgatory, you flea-bitten mongrels of politicised, post-normal science.
The reader should be pre-warned by looking at the author list:
Shakun, Mix et al (2013) Science.
In honour of Mike’s Nature Trick ™ we now have the Shakun-Mix method.
The paper is pay-walled… The data and general description of the methods are not. I generally don’t purchase access to papers when the data are freely available. However, it is obvious from their reconstruction that they did not reconstitute a high frequency signal in their proxy stacks; nor did they filter the higher frequency proxy and instrumental signals down to the bandwidth of the low frequency proxies.
Nor does anything in the supplemental information suggest that they homogenized the bandwidth of the time series.
I’m giving them some benefit of the doubt that maybe they addressed this issue in the paper itself; however, Robert Rohde’s comments suggest that they did not, although he did say that they acknowledged the resolution issues.
So we’re left with a press release and media frenzy claiming that the last few decades are essentially unprecedented in the Holocene, even though the paper, the data and the supplemental information don’t support this assertion.
***
Nick Stokes says:
March 11, 2013 at 4:46 pm
However, the spike of the last century does have special status, apart from being well observed. It follows the insertion of 350 Gtons C into the atmosphere from fossil fuels. That hasn’t happened before.
****
“It follows the change from 0.00028 to 0.00039 CO2 of the atmosphere from fossil fuels.”
There, fixed it for ya.
δ13C has been quite variable throughout the Holocene. Antarctic ice core derived δ13C values are just as poorly resolved as CO2 concentrations.
See figure 5 in Dusting for Fingerprints in the Holocene.
The actual temporal resolution of the reconstruction is on the order of 400 years. The specific marine core I looked at had a resolution of 140 years.
It’s not a “quibble” about potential errors; it’s basic signal theory. A signal with a 400-year resolution will not reflect century-scale changes, much less decadal-scale changes.
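The attenuation is easy to demonstrate with a toy example (a sketch of the principle, not Marcott et al.’s actual processing): smooth an annual series containing a short warm spike with a 400-year moving average, and the spike all but disappears.

```python
import numpy as np

# Toy annual temperature series: flat, with a 50-year, 1 degC warm excursion
signal = np.zeros(11300)
signal[5000:5050] = 1.0

# Crude stand-in for a ~400-year-resolution proxy: a 400-year moving average
window = 400
smoothed = np.convolve(signal, np.ones(window) / window, mode="same")

print(signal.max())              # 1.0
print(round(smoothed.max(), 3))  # 0.125 -- the spike is attenuated ~8x
```

A 50-year event seen through a 400-year filter retains only 50/400 of its amplitude; a decadal event would retain even less.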
Robert Rohde explained the principle very clearly to Andy Revkin…
@- David Middleton
“δ13C has been quite variable throughout the Holocene. Antarctic ice core derived δ13C values are just as poorly resolved as CO2 concentrations.”
Radiocarbon dating uses the C14 isotope, but I expect you knew that.
The C13/C12 ratio is a measure of the relative role of biomass and geology in the carbon cycle. If the TOTAL atmospheric CO2 level had varied much in the past, however, it would have diluted the C14 formed from nitrogen, and the radiocarbon correction curve would be much more varied to reflect this.
Leaf stomata are an epigenetic response to many more parameters than just CO2 levels.
Excuse my ignorance but what annual event off the coast of Oregon creates layers of sediment that accurately measure temperature?
This is why the people who reconstruct plant stomata CO2 chronologies control for sunlight, precipitation and other environmental factors.
δ14C has also been highly variable throughout the Late Pleistocene and Holocene.
@- elmer
Excuse my ignorance but what annual event off the coast of Oregon creates layers of sediment that accurately measure temperature?
Summer.
It’s not an annual event, like a varved lake bed. It’s the continuous deposition of the shells of forams, nannofossils and other critters. The stable isotope concentrations of several elements are effectively geochemical thermometers.
This particular core has a resolution of about 140 years. The average deposition rate was ~0.5 mm/yr. The core was sampled every 5-10 cm (50-100 mm). So each sample represented 100-200 years of deposition.
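As a sanity check on that arithmetic, using only the figures quoted above:

```python
# Sediment-core sample resolution from the figures quoted above:
# ~0.5 mm/yr deposition, sampled every 5-10 cm (50-100 mm)
deposition_mm_per_yr = 0.5
sample_spacing_mm = (50, 100)

years_per_sample = tuple(s / deposition_mm_per_yr for s in sample_spacing_mm)
print(years_per_sample)  # (100.0, 200.0)
```

Each physical sample thus averages roughly one to two centuries of deposition, before any dating uncertainty is considered.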
You have to ask: if the science is as settled as claimed, and the situation as urgent as we are told, given ‘climate doom’, why did the authors not make this research freely and easily available rather than hide it behind a pay-wall? That way the majority of people could access it, and given it’s ‘settled’, all they could really do is admire it anyway, so it’s not as if the authors have to fear having errors found, is it?
Call me sceptical, but I cannot help but feel that if the situation were as they claim it to be, they would make a lot more effort to get the research, including the data, out to as many people as possible. After all, it’s about ‘saving the planet’.
When I look at that Marcott graph of temperature over the last 11,000 years, I see a huge warming period when CO2 levels were very low. Wonder how that happened?…
izen says:
Summer.
Thanks that explains everything.
The money quote is in the supplement, where it discusses the transfer function gain (the relationship between the real signal and the sampled/estimated signal of the real signal):
“The gain function is near 1 above ~2000-year periods, suggesting that multi-millennial variability in the Holocene stack may be almost fully recorded. Below ~300-year periods, in contrast, the gain is near-zero, implying proxy record uncertainties completely remove centennial variability in the stack. Between these two periods, the gain function exhibits a steady ramp and crosses 0.5 at a period of ~1000 years.” (Marcott et al., supplementary materials, p. 24)
There you have it. You cannot resolve components/events below 300 years in the output, as they are swamped by the noise/uncertainty.
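For what it’s worth, a simple N-year moving average reproduces the general shape of that gain curve. The sketch below uses N = 600 years, an illustrative value chosen so the gain crosses 0.5 near a 1000-year period; it is not Marcott et al.’s actual filter.

```python
import numpy as np

# Frequency response (gain) of an N-point moving average at period T:
# |sin(pi*N/T) / (N*sin(pi/T))| -- the standard moving-average gain formula
N = 600  # assumed illustrative smoothing width (years)

def gain(period_yrs):
    f = 1.0 / period_yrs  # cycles per year
    return abs(np.sin(np.pi * N * f) / (N * np.sin(np.pi * f)))

for period in (300, 1000, 2000, 5000):
    print(period, round(gain(period), 2))
```

This prints a gain near zero at a 300-year period, about 0.5 at 1000 years, and approaching 1 for multi-millennial periods, the same qualitative behavior the supplement describes.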
@- David Middleton
“This is why the people who reconstruct plant stomata CO2 chronologies control for sunlight, precipitation and other environmental factors.”
A key environmental factor that alters plant stomata is temperature.
So what temperature record do the people who reconstruct plant stomata CO2 chronologies use as the control data?
The Marcott-et-al-2013 reconstruction perhaps…..
@-“δ14C has also been highly variable throughout the Late Pleistocene and Holocene.”
Holocene C14 has varied by about 3% from cosmic ray variations.
There is no sign of the sort of variations that would indicate significant changes in total atmospheric carbon since that seen during the melt from the last glacial period.
Confirmation bias may be at work here.
http://www.sciencedirect.com/science/article/pii/S0277379113000553
Stomatal proxy record of CO2 concentrations from the last termination suggests an important role for CO2 at climate change transitions
Hank: “Studying the proxies, I discovered that only nine of the 73 proxies contained data that extended to 1950. Of those nine, only two contained data that extended to 2000.”
So, Marcott et al claim what they claim on the basis of two proxies!?! And even those two appear to be cooling in the later part of the 20th century!
There is no evidence that temperature variations affect stomatal densities under natural growth conditions, and they have rarely exhibited effects on stomatal densities in controlled experiments.
Temperature variations do not affect stomatal frequencies (stomatal index).
See Figure 1 of Marchal, 2003.
δ14C dropped from about 90‰ to about -25‰ from ca. 5500 BC (~7500 yrs. ago) to ca. 500 AD (~1500 yrs. ago), then climbed to about 15‰ by the 1800’s. The recent δ14C depletion is no more anomalous relative to the rest of the Holocene than the δ13C depletion is.
Hello Ivan, yes, that’s correct. All nine proxies extend to 1960, eight extend to 1980, seven to 1990, and two to 2000. There was a slight typo in the text of my article. It should have said nine proxies that extended past 1950. For statistical reasons, it was the ones that extended past 1950 that were the focus of my analysis.
I wonder if there’s an email out there saying, “We’ve got to get rid of the Holocene Warm period…”
Re izen says: March 12, 2013 at 9:24 am
David Middleton says: March 12, 2013 at 9:36 am
elmer says: March 12, 2013 at 11:00 am
Hope I haven’t left anyone out. Another complexity in interpreting sediment data is “bioturbation”, or the churning of the top layer of sediment by benthic creatures. If I recall correctly, that would be about the top 10 centimeters or so. The bioturbation induces an averaging over the number of years based on bioturbation depth divided by sedimentation rate.
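The implied averaging can be put in numbers using figures already quoted in this thread (a ~10 cm mixing depth, as recalled above, and the ~0.5 mm/yr deposition rate quoted for ODP-1019D):

```python
# Bioturbation smoothing ~= mixing depth / sedimentation rate
mixing_depth_mm = 100     # ~10 cm of churned surface sediment (as recalled above)
sed_rate_mm_per_yr = 0.5  # deposition rate quoted for ODP-1019D

averaging_years = mixing_depth_mm / sed_rate_mm_per_yr
print(averaging_years)  # 200.0
```

So bioturbation alone would smear each depth interval over roughly two centuries, on top of the sampling resolution discussed earlier.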
I thought the 8.2kyr event was the fastest fall and rise in temperature through the Holocene?
There have been multiple fast rises and falls since the last ice age, documented in high-resolution, high-quality speleothems. And these events have been global, as they have been found synchronized in various locations around the world. The recent rise is not exceptional, neither in its time frame nor in its extent.
“However, the spike of the last century does have special status, apart from being well observed.”
Hmmm… this is pretty accurate. The portion of the squiggly line drawings that resembles a spike has indeed been well observed.
The evidence that the spike resembles any kind of reality in any way, has not.
Andrew