Spurious Varvology

Guest Post by Willis Eschenbach

As Anthony discussed here at WUWT, we have yet another effort to re-animate the long-dead “hockeystick” of Michael Mann. This time, it’s Recent temperature extremes at high northern latitudes unprecedented in the past 600 years, by Martin P. Tingley and Peter Huybers (paywalled), hereinafter TH2013.

Here’s their claim from the abstract.

Here, using a hierarchical Bayesian analysis of instrumental, tree-ring, ice-core and lake-sediment records, we show that the magnitude and frequency of recent warm temperature extremes at high northern latitudes are unprecedented in the past 600 years. The summers of 2005, 2007, 2010 and 2011 were warmer than those of all prior years back to 1400 (probability P > 0.95), in terms of the spatial average. The summer of 2010 was the warmest in the previous 600 years in western Russia (P > 0.99) and probably the warmest in western Greenland and the Canadian Arctic as well (P > 0.90). These and other recent extremes greatly exceed those expected from a stationary climate, but can be understood as resulting from constant space–time variability about an increased mean temperature.

Now, Steve McIntyre has found some lovely problems with their claims over at ClimateAudit. I thought I’d take a look at their lake-sediment records. Here’s the raw data itself, before any analysis:

Figure 1. All varve thickness records used in TH2013. Units vary, and are as reported by the original investigator. Click image to embiggen.

So what’s not to like? Well, a number of things.

To start with, there’s the infamous Korttajarvi record. Steve McIntyre describes this one well:

In keeping with the total and complete stubbornness of the paleoclimate community, they use the most famous series of Mann et al 2008: the contaminated Korttajarvi sediments, the problems with which are well known in skeptic blogs and which were reported in a comment at PNAS by Ross and I at the time. The original author, Mia Tiljander, warned against use of the modern portion of this data, as the sediments had been contaminated by modern bridgebuilding and farming. Although the defects of this series as a proxy are well known to readers of “skeptical” blogs, peer reviewers at Nature were obviously untroubled by the inclusion of this proxy in a temperature reconstruction.

Let me stop here a moment and talk about lake proxies. Down at the bottom of most every lake, a new layer of sediment is laid down every year. This sediment contains a very informative mix of whatever was washed into the lake during a given year. You can identify the changes in the local vegetation, for example, by changes in the plant pollens that are laid down as part of the sediment. There’s a lot of information that can be mined from the mud at the bottom of lakes.

One piece of information we can look at is the rate at which the sediment accumulates. This is called “varve thickness”, with a “varve” meaning a pair of thin layers of sediment, one for summer and one for winter, that comprise a single year’s sediment. Obviously, this thickness can vary quite a bit. And in some cases, it’s correlated in some sense with temperature.

However, in one important way lake proxies are unlike, say, ice core proxies. The daily activities of human beings don’t change the thickness of the layers of ice that get laid down. But everything from road construction to changes in farming methods can radically change the amount of sediment in the local watercourses and lakes. That’s the problem with Korttajarvi.

And in addition, changes in the surrounding natural landscape can change the sediment levels. Many things, from burning of local vegetation to insect infestation to changes in local water flow, can radically change the amount of sediment in a particular part of a particular lake.

Look, for example, at the Soper data in Figure 1. It is more than obvious that we are looking at some significant changes in the sedimentation rate during the first half of the 20th Century. After four centuries of one regime, something happened. We don’t know what, but it seems doubtful that a gradual change in temperature would cause a sudden step change in the amount of sediment combined with a change in variability.

Now, let me stop right here and say that the inclusion of this proxy alone, even ignoring the obvious madness of including Korttajarvi, should totally disqualify the whole paper. There is no justification for claiming that it is temperature related. Yes, I know it gets log transformed further on in the story, but get real. This is not a representation of temperature.

But Korttajarvi and Soper are not the only problem. Look at Iceberg, three separate records. It’s like one of those second grade quizzes—”Which of these three records is unlike the other two?” How can that possibly be considered a valid proxy?

How does one end up with this kind of garbage? Here’s the authors’ explanation:

All varve thickness records publicly available from the NOAA Paleolimnology Data Archive as of January 2012 are incorporated, provided they meet the following criteria:

• extend back at least 200 years,

• are at annual resolution,

• are reported in length units, and

• the original publication or other references indicate or argue for a positive association with summer temperature.

Well, that all sounds good, but these guys are so classic … take a look at Devon Lake in Figure 1 (it’s DV09). Notice how far back it goes? 1843, which is 170 years ago … so much for their 200-year criterion.

Want to know the funny part? I might never have noticed, but when I read the criteria, I thought “Why a 200-year criterion?” It struck me as special pleading, so I looked more closely at the only one it applied to and said huh? Didn’t look like 200 years. So I checked the data here … 1843, not 200 years ago, only 170.
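For anyone who wants to check that kind of thing themselves, here’s a minimal sketch of the arithmetic in Python. The only real number in it is the 1843 start of DV09; the other start years are made up purely for illustration.

# Check each record against the paper's stated 200-year criterion.
# Only the DV09 start year (1843) is real; the others are placeholders.
RECORD_START_YEARS = {
    "DV09 (Devon Lake)": 1843,      # from the archived data
    "Hypothetical record A": 1400,
    "Hypothetical record B": 1375,
}

PUBLICATION_YEAR = 2013
MIN_SPAN_YEARS = 200                # the paper's stated criterion

for name, start in RECORD_START_YEARS.items():
    span = PUBLICATION_YEAR - start
    verdict = "meets" if span >= MIN_SPAN_YEARS else "FAILS"
    print(f"{name}: starts {start}, spans {span} years -> {verdict} the 200-year criterion")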

Man, the more I look, the more I find. In that regard, both Sawtooth and Murray have short, separate segments tacked onto the end of their main data. Perhaps by chance, both of them will add to whatever spurious hockeystick has been formed by Korttajarvi and Soper and the main players.

So that’s the first look, at the raw data. Now, let’s follow what they actually do with the data. From the paper:

As is common, varve thicknesses are logarithmically transformed before analysis, giving distributions that are more nearly normally distributed and in agreement with the assumptions characterizing our analysis (see subsequent section).

I’m not entirely at ease with this log transformation. I don’t understand the underlying justification or logic for doing that. If the varve thickness is proportional in some way to temperature, and it may well be, why would temperature instead be proportional to the logarithm of the thickness?

In any case, let’s see how much “more nearly normally distributed” we’re talking about. Here are the distributions of the same records, after log transformation and standardization. I use a “violin plot” to examine the shape of a distribution. The width at any point indicates the smoothed number of data points with that value. The white dot shows the median value of the data. The black box shows the interquartile range, which contains half of the data. The vertical “whiskers” extend to 1.5 times the interquartile range above and below the black box.
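For those who want to play along at home, here’s a minimal sketch in Python of how a plot like the one in Figure 2 below can be put together. The data is randomly generated for illustration, not the actual varve records; matplotlib’s violinplot does the density smoothing, and the median dot, interquartile box and whiskers described above are drawn by hand.

import numpy as np
import matplotlib.pyplot as plt

# Random stand-ins for the log-transformed, standardized records (NOT the actual varve series).
rng = np.random.default_rng(0)
skewed = rng.lognormal(mean=0.0, sigma=0.8, size=500)      # long positive tail
skewed = (skewed - skewed.mean()) / skewed.std()           # standardize to mean 0, sd 1
normal = rng.normal(size=500)                              # comparison distribution
data = [skewed, normal]
labels = ["Skewed record", "Random normal"]

fig, ax = plt.subplots()
ax.violinplot(data, showextrema=False)                     # width = smoothed density

for i, d in enumerate(data, start=1):
    q1, med, q3 = np.percentile(d, [25, 50, 75])
    iqr = q3 - q1
    lo = max(d.min(), q1 - 1.5 * iqr)                      # whisker ends, capped at the data
    hi = min(d.max(), q3 + 1.5 * iqr)
    ax.vlines(i, q1, q3, lw=6, colors="black")             # interquartile box
    ax.vlines(i, lo, hi, lw=1, colors="black")             # whiskers
    ax.scatter(i, med, color="white", zorder=3)            # median dot

ax.set_xticks(range(1, len(data) + 1))
ax.set_xticklabels(labels)
plt.show()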

Figure 2. Violin plots of the data shown in Figure 1, but after log transformation and standardization. Random normal distribution included at lower right for comparison.

Note the very large variation between the different varve thickness datasets. You can see the problems with the Soper dataset. Some datasets have a fairly normal distribution after the log transform, like Big Round and Donard. Others, like DV09 and Soper, are far from normal in distribution even after transformation. Many of them are strongly asymmetrical, with excursions of four standard deviations being common in the positive direction. By contrast, they often vary by only half of that in the negative direction, two standard deviations. When an underlying dataset is that far from normal, that’s always a good reason for further investigation in my world. And if you are going to include them, the differences in which way they swing from normal (excess positive over negative excursions) affect both the results and their uncertainty.

In any case, after the log transformation and standardization to a mean of zero and a standard deviation of one, the datasets and their average are shown in Figure 3.
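In code terms, that step looks something like the minimal sketch below. The two “records” are random stand-ins on a common set of years, not the actual data; real records of differing lengths would also need their missing years handled.

import numpy as np

def log_standardize(thickness):
    """Log-transform a varve-thickness series, then scale it to mean 0, sd 1."""
    x = np.log(np.asarray(thickness, dtype=float))   # thicknesses must be positive
    return (x - x.mean()) / x.std()

# Hypothetical example: two made-up "records" on a common set of years.
years = np.arange(1400, 2001)
rng = np.random.default_rng(1)
record_a = rng.lognormal(mean=1.0, sigma=0.5, size=years.size)
record_b = rng.lognormal(mean=2.0, sigma=0.3, size=years.size)

standardized = np.column_stack([log_standardize(record_a),
                                log_standardize(record_b)])
simple_average = standardized.mean(axis=1)           # a simple average of the kind shown in Figure 3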

Figure 3. Varve thickness records after log transformation and standardization.

As you can see, the log transform doesn’t change the problems with e.g. the Soper or the Iceberg records. They still do not have internal consistency. As a result of the inclusion of these problematic records, all of which contain visible irregularities in the recent data, even a simple average shows an entirely spurious hockeystick.

In fact, the average shows a typical shape for this kind of spurious hockeystick. In the “shaft” part of the hockeystick, the random variations in the chosen proxies tend to cancel each other out. Then in the “blade”, the random proxies still cancel each other out, and all that’s left are the few proxies that show rises in the most recent section.
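You can see the effect with a toy simulation of my own (this is not the TH2013 method, just an illustration): average a batch of pure-noise “proxies” together with a few whose modern ends are disturbed, and a hockeystick appears in the mean even though no temperature signal was put in anywhere.

import numpy as np

# Toy demonstration: 15 pure-noise "proxies" plus 3 whose recent portion is disturbed.
rng = np.random.default_rng(42)
n_years, n_noise, n_disturbed = 600, 15, 3

proxies = [rng.normal(size=n_years) for _ in range(n_noise)]
for _ in range(n_disturbed):
    p = rng.normal(size=n_years)
    p[-80:] += np.linspace(0.0, 4.0, 80)    # ramp in the recent portion only
    proxies.append(p)

average = np.mean(proxies, axis=0)
print("Mean of the 'shaft' (first 500 years): %.2f" % average[:500].mean())
print("Mean of the 'blade' (last 50 years):   %.2f" % average[-50:].mean())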

My conclusions, in no particular order, are:

• The authors are to be congratulated for being clear about the sources of their data. It makes for easy analysis of their work.

• They are also to be congratulated for the clear statement of the criteria for inclusion of the proxies.

• Sadly, they did not follow their own criteria.

• The main conclusion, however, is that clear, bright-line criteria of the type that they used are a necessary but not sufficient part of the process. There are more steps that need to be followed.

The second step is the use of the source documents and the literature to see if there are problems with using some parts of the data. For them to include Korttajarvi is a particularly egregious oversight. Michael Mann used it upside-down in his 2008 analysis. He subsequently argued it “didn’t matter”. It is used upside-down again here, and the original investigators said don’t use it after 1750 or so. It is absolutely pathetic that after all of the discussion in the literature and on the web, including a published letter to PNAS, once again Korttajarvi is being used in a proxy reconstruction, and once again it is being used upside-down. That’s inexcusable.

The third part of the proxy selection process is the use of the Mark I eyeball to see if there are gaps, jumps in amplitude, changes in variability, or other signs of problems with the data.

The next part is to investigate the effect of the questionable data on the final result.

And the final part is to discuss the reasons for the inclusion or the exclusion of the questionable data, and its effects on the outcome of the study.

Unfortunately, they only did the first part, establishing the bright-line criteria.

Look, you can’t just grab a bunch of proxies and average them, no matter if you use Bayesian methods or not. The paleoproxy crowd has shown over and over that you can artfully construct a hockeystick by doing that, just pick the right proxies …

So what? All that proves is yes indeed, if you put garbage in, you will assuredly get garbage out. If you are careful when you pack the proxy selection process, you can get any results you want.

Man, I’m tired of rooting through this kind of garbage, faux studies by faux scientists.

w.

91 Comments
Chris Wright
April 14, 2013 4:13 am

It’s easy to joke about “upside-down Mann” etc etc but this whole issue strikes me as being absolutely outrageous.
As I understand it, Mann’s method in Mann 2008 simply used absolute values and ignored the sign. Amazing, but true. His method had the hockey stick assumption (with the blade going upwards) built in. This is obviously unacceptable, but it could conceivably have been an honest mistake. However, Mann is clearly aware of this problem and he has done nothing to rectify it. Because of this, what might have been an honest mistake becomes scientific misconduct or even outright fraud.
It’s inconceivable that the present authors were unaware of the upside-down problem. If so, then they would also be open to accusations of scientific misconduct or fraud.
I’m assuming that the Korttajarvi upside-down claim can be proven. If so, then there would be a clear-cut case of scientific misconduct against both Mann and the present authors. Of course, the fact that Tiljander clearly warned that the data could not be used for 19th and 20th century reconstructions, due to local industrial activity, would hardly help their case.
There does appear to be a strong case for scientific misconduct or fraud. So why are none of the bodies who are supposed to be protecting the integrity of science doing anything? I think we know the answer to that.
By the way, I think nature does tend to be logarithmic. Some of you may be familiar with Benford’s law. It explains why, for instance, the first pages of books of logarithms were more grubby than the rest. If you look at a list of share prices you should find that around a third of the prices begin with the number one. It’s quite remarkable. It seems that any list of measurements of natural variation follows the law. It has actually been used to detect financial fraud. Who knows, maybe it could be used to detect climate science fraud…..
Quite possibly Benford’s law arises because natural processes do tend to follow a logarithmic law.
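As a quick illustration of the first-digit frequencies Benford’s law predicts, here is a minimal sketch using a synthetic lognormal sample rather than any real price or proxy data.

import numpy as np

# First-digit (Benford) check on a synthetic sample that spans several orders
# of magnitude; any list of positive measurements could be substituted.
rng = np.random.default_rng(7)
values = rng.lognormal(mean=5, sigma=2, size=100_000)

first_digits = (10 ** (np.log10(values) % 1)).astype(int)   # leading digit of each value
observed = np.bincount(first_digits, minlength=10)[1:10] / len(values)
expected = np.log10(1 + 1 / np.arange(1, 10))               # Benford's predicted frequencies

for digit, (obs, exp) in enumerate(zip(observed, expected), start=1):
    print(f"digit {digit}: observed {obs:.3f}, Benford {exp:.3f}")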
Chris

TerryMN
April 14, 2013 4:58 am

Willis – this paper does not use the Korttajarvi proxy “upside down.” The problem is that they use the contaminated portion post 1720.
The 3 proxy series that can be generated from the core are thickness, X-Ray density lightsum, and X-Ray density darksum. The densities represent contribution of plants vs. minerals in the sediment – I forget which is which. Mann used one of the density series “upside down” in that it has an inverse relationship w/temp.
If TH2013 are only using thickness, they’re using the correct orientation – but should NOT have used values in the modern, contaminated portion of the series. Regards – Terry

Robert of Ottawa
April 14, 2013 5:52 am

I don’t understand the “butterfly” plots.
I think the log transformation is to account for layer thickness compression.

April 14, 2013 6:03 am

johnmarshall says:
April 14, 2013 at 2:48 am
Thanks Willis.
But what does ”embiggen” mean? Does ”Enlarge” just about cover it?
==========================================================
A classic Simpsons line:
“A noble spirit embiggens the smallest man”
– Jebediah Springfield
http://www.urbandictionary.com/define.php?term=Embiggen

McComber Boy
April 14, 2013 6:21 am

“These and other recent extremes greatly exceed those expected from a stationary climate, but can be understood as resulting from constant space–time variability about an increased mean temperature.” I’m not sure why Mike B the Chead in Switz got snipped for commenting, but this sentence, to me, is the crux of the problem.
It doesn’t matter where the varves of the valves meet the cool of the day if your premise, the very basis of your research, is wrong. We do not live in a static climate. When this is the assumption of TH2013 and they go looking for it, why of course they will find it. In essence they are saying that they just felt it was too warm. So they read the hype and looked for confirmation of the hype among the writings of other confirmation confabulists and data contortionists.
It becomes evident daily that we need more Liu Yu’s doing his ten years of Tibetan field trips and analyzing his own data and fewer folks throwing the work of others into the grand bingo hopper of science and pasting up the data of others even when it’s upside down and backward.

GregK
April 14, 2013 6:21 am

A possible explanation for the non-normal and non-lognormal distributions of varve thicknesses may be that the data for single sites include several populations with different means/medians which result from different climatic conditions. For instance in the Big Round data the varve thicknesses between around 1580 and 1780 are thinner than for the periods before and after, suggesting lower rainfall [maybe] during this period. More to the point, the data from 1580 to 1780 seem to form a separate population and should not be lumped together with the earlier and later data in a statistical analysis.
And as for this…
“As is common, varve thicknesses are logarithmically transformed before analysis,..”
The authors need to show that varve thicknesses are log normally distributed before applying a log transformation.
Otherwise they are doing it simply because….. “As is common, varve thicknesses are logarithmically transformed before analysis,..”
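A minimal sketch of one such check, using a made-up lognormal sample in place of a real varve record: a Shapiro-Wilk normality test on the raw and on the log-transformed values.

import numpy as np
from scipy import stats

# Made-up lognormal "thicknesses" standing in for a real varve record.
rng = np.random.default_rng(3)
thickness = rng.lognormal(mean=0.5, sigma=0.6, size=300)

for label, sample in [("raw", thickness), ("log-transformed", np.log(thickness))]:
    stat, p = stats.shapiro(sample)          # small p => reject normality
    print(f"{label}: Shapiro-Wilk W = {stat:.3f}, p = {p:.3g}")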

oMan
April 14, 2013 6:55 am

Great take-down, Willis. It only takes a few bad apples (proxies like Korttajarvi, Soper etc) to contaminate the whole barrel (the spurious “blade” on the average of them all).

April 14, 2013 7:12 am

The failure to follow their own proxy criteria is worse than you portray. The readily available online abstract for Ogac Lake specifically says a temperature record IS NOT DISCERNABLE. It was used anyway.
The big giveaway is the lack of an LIA in the conclusion, which comes partly from data massage and partly from proxy selection. For example, the criteria meant the recent Braya So multi-proxy series from Greenland, published in Nature last year, was omitted. It has a three-year resolution, and of course shows the LIA as well as fairly continuous warming since its end around 1800, not 1900. Just like the Greenland Alley ice core does. And this paper’s selection of proxies doesn’t.

Elizabeth
April 14, 2013 7:22 am

Any paper that has to be withdrawn methinks.

Elizabeth
April 14, 2013 7:25 am

Another paper that has to be withdrawn!

John West
April 14, 2013 7:29 am

That last “criteria”: “the original publication or other references indicate or argue for a positive association with summer temperature” is hardly a criterion at all. I’m sure I could find “references” that “argue for” phlogiston or just about anything else.
Obviously, they only actually used one criterion: will it aid in realizing results “consistent with” Catastrophic Anthropogenic Global Warming.

j ferguson
April 14, 2013 7:34 am

Thank you Chris Wright for alerting us to Benford’s Law. What an astonishing law it is.

Downdraft
April 14, 2013 7:51 am

So another proxy study employing proxies with questionable qualities, and including data they should know is garbage, to which was spliced the instrumental record for only the last few years, and then the expected claim is made that those years were the hottest in the series. But those spliced records are not in the series. It appears they had this all planned out before they started.
It would be interesting to overlay the central England record on top of all of the proxies and see how they compare. As a starter, if there is no close comparison in the overlap years, toss it all out.
Also, I have a gut feeling that the thickness of a lake sediment probably has even less to do with temperatures than tree rings, and more to do with local weather events like heavy rain, forest fires, windstorms, etc. Was any isotope analysis done of the sediments? Where can we see proof that these varves tell us anything about temperature?

Theo Goodwin
April 14, 2013 8:12 am

Willis writes:
“Look, you can’t just grab a bunch of proxies and average them, no matter if you use Bayesian methods or not. The paleoproxy crowd has shown over and over that you can artfully construct a hockeystick by doing that, just pick the right proxies …”
That hurts. That’s the way Alarmists roll. They practice science by press release only. Too bad that Nature has become nothing more than a public relations agency for Alarmists.

April 14, 2013 8:14 am

Look, you can’t just grab a bunch of proxies and average them, no matter if you use Bayesian methods or not.

Exactly. For the same reason that “global temperature” has no physical meaning.
Even IF (and that’s a big if) these proxies are adequate to represent temperature, it’s obvious that they have their own climate regimes. There is no “teleconnection”. Global Warming isn’t global.

Pamela Gray
April 14, 2013 8:15 am

The Willamette Valley lake and wetland sediments would also be affected, but not just by pioneer farmers. The Indian population in that fertile valley figured out, way before pioneers did, that farming was the way to go. They cleared land, burned on a seasonal basis, planted, and harvested their food. And were quite industrious at it! To a large degree, history, and in particular AGW history, fails to acknowledge American Indian land use prior to that of whites. Truth be told, they were as good, and bad, at it as whites were. So much so that these layers would have to be peeled back further than one would think before getting to sediments that are devoid of human influence.

Theo Goodwin
April 14, 2013 8:23 am

What a treasure Willis is. In addition to his first rate knowledge of science and statistics, he takes you right to the lake itself. What a wealth of knowledge and common sense he has.

rgbatduke
April 14, 2013 8:36 am

If you want to show “unprecedented warming over 600 years”, that’s really fishing with dynamite anyway. 600 years is easy, because 1400 was solidly after the MWP. In fact, it is well down the side of the slope from the MWP down to the LIA, which is tied for the coldest single period in the entire Holocene post YD. The start date is not, I’m sure, overtly “cherrypicked”, but the conclusion is utterly unsurprising and probably even true, to the extent that the present represents temperatures last “precedented” in the MWP, where the Earth managed them without the help of anthropogenic CO_2. Run the same study back to (say) 800 CE or 500 BCE, if you could, and you might find that the present even with the problems pointed out by Willis is hardly unprecedented.
Why do we keep seeing that word, unprecedented? Because it is indeed a key term in the dialectic of post-Mannian hockey-shtickery. Go back in time to the very AR report where Mann’s hockey stick was promoted from the first paper of an unknown researcher who was so bad at what he did that he actually wrote his own PCA code in Fortran (instead of using R, or SAS, or Stata, or any of the well-written, debugged, readily available tools) and then tweaked it with the GOAL (as revealed in the Climategate letters) of “erasing the LIA and MWP”. There is a paper and graphic there by (IIRC) Jones or Briffa that clearly showed both, with the present pretty much indistinguishable from the MWP.
How can you convince the world to spend its scarce political capital and economic wealth on “global warming” to the substantial enrichment of selected groups and with the promotion of a profound anti-civilization agenda that de facto freezes 2/3 of the world in abject poverty? Simple. Show that the climate of the present is “unprecedented”, because if it is precedented it confounds the assertion that CO_2 is a necessary adjunct to its unprecedented behavior.
And indeed it is unprecedented, if you get to choose the start point of the interval that you look at. It is unprecedented over the last 100 years. It is unprecedented over the last 200, 300, 400, 500, 600, 700, 800 years. Go back 900 years and you start to hit an interval that was — within our ability to resolve past temperatures via proxies — almost as warm. Go back 1000, 1100, 1200 years and it is not unprecedented at all. Go back 2500 years and it might actually be cooler, not even a match for the warmest decades or centuries in the record. Go back 12,000 years, to the start of the post-YD Holocene, and it not only is not unprecedented, it might be a full degree to degree and a half cooler than the Holocene Optimum — might because going back that far increases the noise of uncertainty from the best of proxies.
What else can we make “unprecedented” with this sort of analysis? Well, pretty much any non-flat curve. Simply go to the end. Proceed backwards from that endpoint to the previous maximum or minimum. All of the change — in either direction — observed in that curve is then “unprecedented” over that interval.
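That recipe is literal enough to write down as a toy function; the numbers below are made up, not from any actual reconstruction.

def unprecedented_span(series):
    """Walk back from the final value and count how many steps it remains
    the largest value seen. Toy rendering of the recipe above."""
    last = series[-1]
    span = 0
    for value in reversed(series[:-1]):
        if value >= last:
            break
        span += 1
    return span

# Any curve that happens to end on an upswing gets a long "unprecedented" span:
toy = [0.2, 0.5, 0.9, 0.4, 0.1, -0.3, 0.0, 0.3, 0.6]   # early peak, modest rise at the end
print(unprecedented_span(toy), "steps back the final value is 'unprecedented'")   # -> 5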
To construct the curve itself that is being analyzed out of data corrupted with confounding influences as Willis deconstructs up above is simply adding insult to a very basic cherrypicking injury. I tend to be very suspicious of temperature proxies that are multifactorial, where the actual cause of the variation is not temperature as a direct agency but as an inferred secondary agency.
Tree rings are an excellent example, as they are a direct proxy of precipitation, not temperature per se. Although there is some correlation between precipitation and mean temperature, it is not a compelling one or a predictable one. Some years it is wet and cold. Some years it is wet and warm. Many years it is hot and dry. Many years it is cold and dry. One can go back in the instrumental, historical record and find multiple instances of all of these combinations occurring, and these aren’t the only two factors that affect tree rings — insect infestations, fungus infestations, tree diseases, fires, volcanic aerosol activity, animal predator/prey cycles — all of these things and more can affect how much or little a tree grows in any given year or decade, and many of these problems are persistent ones. Even if a given species in a given location tends to have a particular association between temperature and a given growth pattern now, it is by no means clear that that association is persistent into the indefinite past on a century time scale.
One small part of this is revealed by the difficulty of cutting contemporary trees and inferring the measured temperature series for the location from its rings. Around here, rings will be packed in pretty nastily during the extreme droughts we had during the 80s, when it was nominally cooler, but in recent years with only one exception if anything it has been nice and wet. If one cuts any of the trees in my yard, or the pin oaks that are 100-150 years old, and tries to parse out the temperature, all you’re likely to actually get isn’t temperature, it is rainfall, and annual rainfall is at best weakly correlated with temperature. This, too, was noted in the climategate emails, where almost exactly this “backyard experiment” was done by a dendroclimatologist (or perhaps his son for a science fair project) with utterly inconclusive results.
Varves sound no better. Ultimately, they sound like they are a proxy not of temperature per se, but of a mix of precipitation and ice melt. As noted, they are easily confounded by land use changes — any sort of land use change (natural OR anthropogenic — forest fires and other natural events might well contribute) can cause years to decades to permanent alterations of runoff patterns as profound as the natural “signal” of water flow alone. One can even imagine years with MINIMAL flow but maximum sedimentation. Is a peak due to a particularly warm but short summer following a particularly long, cold, and snowy winter, so that the ice melt was violent and maximally turbulent (producing a thick sediment layer in a colder than normal year)? Is it due to a normal winter followed by a particularly cold, wet summer, so that flooding and high sedimentation is caused not by ice melt from high temperatures but by latent heat of fusion as excessive cold rain melts more ice than mere dry air sunlight would have done?
Sure, one might find local weak correlations between temperature and sedimentation now, but climate does change and always is changing for non-anthropogenic reasons; it may well have shifted between patterns with the same mean temperature but different precipitation/melt/sedimentation.
Ask the settlers of the Roanoke colony about “climate change”. Seven years of “unprecedented drought” across most of the Southeast US, drought so severe it wiped out whole tribes of native Americans (and the colony), nearly wiped out Jamestown, forced the migration of other tribes searching for water, during a time of unprecedented cold, the coldest stretch of the entire Holocene. We know about the drought from a mix of historical records and from (yes) tree rings. We know about the cold from a mix of historical records and from other proxies. Given prior knowledge from human recorded history, we can look at these tree rings and determine that they were cold and dry rings, not hot and dry rings.
Without that prior knowledge, what would we do? Take a snapshot of rings from (say) the last 100 years where we have decent thermometric data for the Southeast US. We note that over that stretch, there were a handful of periods of drought lasting 2 or more years (this is only 100 years, after all!) and that there was a coincident patch of them in the 80’s (as well as more isolated ones scattered more or less uniformly across the century). The patch gives “drought vs temperature” a small positive correlation in a linear fit. We look back at the rings from the early 1600s, note that they are far more tightly packed than even rings from the 80s and conclude that this was a heat wave in the Southeast US, not a period of bitter cold, dry winters and comparatively short, hot and dry summers.
Sure, people who do this try to correct for this sort of confounding, but it is very difficult and in the end, uncertainties in the process if honestly stated are large enough to erase almost any conclusion concerning temperatures 400 years ago based on multifactorial proxies. Even if a tree species is very positively correlated with temperature locally over the thermometric record, that in no way guarantees that that correlation persists back hundreds of years, because the climate does shift over that sort of time scale and a different climate might have had a different correlation that mimics the one observed now but with entirely different associations between observed patterns. Statistics is always based on the assumption of “independent, identically distributed” sampling, but when sampling from a chaotic dynamical system with persistent structured behavior on century timescales punctuated by comparatively sudden transitions between entirely dissimilar behavioral regimes, this assumption is generally false. You might as well try to extrapolate the behavior of a butterfly based on a series of observations of a pupa in a cocoon, or the behavior of a teen-age adolescent human based on observations of humans sampled from a retirement community. Some of one’s extrapolations might even turn out to be right — teen-agers and retirees both eat, for example — but it is difficult to predict which ones are valid when you cannot go and check, when one’s only knowledge of teen-agers is from inferences drawn from elderly samples.
We are in this state with almost all of climate science. As even the climate research community is starting to openly acknowledge, we are no more capable of explaining the medieval WP based on our observations of the modern WP than a study of current hip hop music explains sock hop music from the 50s. Both are warm. Both are music.
What we do know is that in the MWP, it was warm without anthropogenic CO_2. We know that climate variations at least this warm are entirely possible without any anthropogenic influence whatsoever. This confounds any attempt to infer that the modern warm period is exclusively due to anthropogenic increases in CO_2 based on the fact that it is, in fact, warm. Even people untrained in statistical inference understand this. It is mere common sense.
Which is why it has been, and continues to be, the primary duty of the Hockey Team and its affiliates to demonstrate that the modern period is “unprecedented”, by erasing, ignoring, cherrypicking away the MWP and RWP and the proxy derived record of the entire Holocene, by pretending that we somehow know what temperature it “should” be outside (that is, would be in the complete absence of human influence), by misusing corrupt data sources that do contain a human signal, just not a human signal associated with climate, all to create panic, concern, and a willingness to open one’s pocketbook and grant one’s vote to the proposition that we are causing a disaster so that no price is too big to pay to ameliorate it.
This is, in one way, entirely unprecedented. It is an unprecedented abuse of science. And one day, we will pay for it.
rgb

Theo Goodwin
April 14, 2013 8:59 am

rgbatduke says:
April 14, 2013 at 8:36 am
Spot on. RGB has written a highly informative article that should be adopted as a foundation in the paleoclimatology community.

rgbatduke
April 14, 2013 9:04 am

Example: black box radiation spectrum profile follows the normal curve on the log scale, but on the integer scale it shows a characteristic curve, see e.g. http://en.wikipedia.org/wiki/Planck's_law . So on the log scale it is symmetric — obviously a better scale.
Say what? No, it doesn’t. How are any of:
http://en.wikipedia.org/wiki/File:RWP-comparison.svg
normal? Or if you prefer:
http://commons.wikimedia.org/wiki/File:Planck_law_log_log_scale.png
There is a nice paper online that shows the radiance with only a log representation of the frequency/wavelength, and it still isn’t normal. The Blackbody radiation curve is not just an exponential transform of the normal distribution.
Is the log scale “obviously better” on some other basis? Well the paper I just referred to (by Marr and Wilkin if you want to search for it) argues that there is some point to presenting the Planck curve on alternative scales, but personally I don’t see it. The usual comparison is to the classical theory, e.g. Rayleigh-Jeans, with the ultraviolet catastrophe. R-J isn’t even vaguely normal, and yet it asymptotically describes Planck. The physical mechanism of the cutoff is not any sort of averaging process — it is quantization of the EM field.
Note well this latter point. The reason the normal curve is important is almost exclusively the Central Limit Theorem. That is, it is useful when the quantity being examined is itself a mean derived from some sort of statistical average of a compact distribution. The CLT is so powerful in part because it shows that normal distributions are in some sense a “fixed point” in data transformations — as long as a distribution is sufficiently compact, its mean will be normally distributed, and as long as any transform of that distribution is still sufficiently compact, the mean of the transform will also be normally distributed. I would have made the point in the exact opposite way — the fact that the Planck curve is not particularly normal — or at any rate is highly skewed with a very long tail — suggests that the underlying process is not well described as the average of a compact distribution, and the natural log is not sufficient to compactify it.
Unless I’m missing something. Am I?
rgb

jc
April 14, 2013 9:34 am

rgbatduke says:
April 14, 2013 at 8:36 am
“It is an unprecedented abuse of science. And one day, we will pay for it.”
———————————————————————————————————————
Paying for it now. Have been for years.
At least humanity has. And science. No doubt real scientists will pay the price of science harboring the fake.

Jimbo
April 14, 2013 9:44 am

The paper mentions an unprecedented rise in Western Greenland summer temperatures. Here is a look back at Western Greenland and wintertime temperatures, when we were at the safe CO2 limit. Only the abstract is available so I can’t see what the rise was.

Abstract
July 1937
A period of warm winters in Western Greenland and the temperature see-saw between Western Greenland and Central Europe
Particulars are given regarding the big rise of winter temperatures in Greenland and its more oceanic climate during the last fifteen years. Observations covering sixty years show a marked negative correlation of simultaneous temperatures between western Greenland and the regions around the Baltic Sea.
http://onlinelibrary.wiley.com/doi/10.1002/qj.49706327108/abstract

Steve McIntyre
April 14, 2013 9:52 am

Willis, most of these varve datasets were used in Kaufman et al 2009 and were discussed at CA at the time. The topic was overtaken in November by Climategate. Look at climateaudit.org/tag/varve.
Iceberg Lake attracted particular interest at the time from both of us. See climateaudit.org/tag/loso. In a contemporary post, http://climateaudit.org/2009/09/23/loso-varve-thickness-and-nearest-inlet/ we discussed the Iceberg lake jump in proxy thickness with change in inlet location, a point that you had hypothesized a couple of years earlier and which Loso considered in a subsequent article (without mentioning the prior discussion.)

Steve McIntyre
April 14, 2013 9:54 am

In respect to the Korttajarvi organics, the problem here is that the series is contaminated rather than upside down. Not that the data means very much either way.