Spurious Varvology

Guest Post by Willis Eschenbach

As Anthony discussed here at WUWT, we have yet another effort to re-animate the long-dead “hockeystick” of Michael Mann. This time, it’s Recent temperature extremes at high northern latitudes unprecedented in the past 600 years, by Martin P. Tingley and Peter Huybers (paywalled), hereinafter TH2013.

Here’s their claim from the abstract.

Here, using a hierarchical Bayesian analysis of instrumental, tree-ring, ice-core and lake-sediment records, we show that the magnitude and frequency of recent warm temperature extremes at high northern latitudes are unprecedented in the past 600 years. The summers of 2005, 2007, 2010 and 2011 were warmer than those of all prior years back to 1400 (probability P > 0.95), in terms of the spatial average. The summer of 2010 was the warmest in the previous 600 years in western Russia (P > 0.99) and probably the warmest in western Greenland and the Canadian Arctic as well (P > 0.90). These and other recent extremes greatly exceed those expected from a stationary climate, but can be understood as resulting from constant space–time variability about an increased mean temperature.

Now, Steve McIntyre has found some lovely problems with their claims over at ClimateAudit. I thought I’d take a look at their lake-sediment records. Here’s the raw data itself, before any analysis:

Figure 1. All varve thickness records used in TH2013. Units vary, and are as reported by the original investigator. Click image to embiggen.

So what’s not to like? Well, a number of things.

To start with, there’s the infamous Korttajarvi record. Steve McIntyre describes this one well:

In keeping with the total and complete stubbornness of the paleoclimate community, they use the most famous series of Mann et al 2008: the contaminated Korttajarvi sediments, the problems with which are well known in skeptic blogs and which were reported in a comment at PNAS by Ross and I at the time. The original author, Mia Tiljander, warned against use of the modern portion of this data, as the sediments had been contaminated by modern bridgebuilding and farming. Although the defects of this series as a proxy are well known to readers of “skeptical” blogs, peer reviewers at Nature were obviously untroubled by the inclusion of this proxy in a temperature reconstruction.

Let me stop here a moment and talk about lake proxies. Down at the bottom of most every lake, a new layer of sediment is laid down every year. This sediment contains a very informative mix of whatever was washed into the lake during a given year. You can identify the changes in the local vegetation, for example, by changes in the plant pollens that are laid down as part of the sediment. There’s a lot of information that can be mined from the mud at the bottom of lakes.

One piece of information we can look at is the rate at which the sediment accumulates. This is called “varve thickness”, with a “varve” meaning a pair of thin layers of sediment, one for summer and one for winter, that comprise a single year’s sediment. Obviously, this thickness can vary quite a bit. And in some cases, it’s correlated in some sense with temperature.

However, in one important way lake proxies are unlike, say, ice core proxies. The daily activities of human beings don’t change the thickness of the layers of ice that get laid down. But everything from road construction to changes in farming methods can radically change the amount of sediment in the local watercourses and lakes. That’s the problem with Korttajarvi.

And in addition, changes in the surrounding natural landscape can also change the sediment levels. Many things, from burning of local vegetation to insect infestation to changes in local water flow can radically change the amount of sediment in a particular part of a particular lake.

Look, for example, at the Soper data in Figure 1. It is more than obvious that we are looking at some significant changes in the sedimentation rate during the first half of the 20th Century. After four centuries of one regime, something happened. We don’t know what, but it seems doubtful that a gradual change in temperature would cause a sudden step change in the amount of sediment combined with a change in variability.

Now, let me stop right here and say that the inclusion of this proxy alone, setting aside the obvious madness of including Korttajarvi, should totally disqualify the whole paper. There is no justification for claiming that it is temperature related. Yes, I know it gets log transformed further on in the story, but get real. This is not a representation of temperature.

But Korttajarvi and Soper are not the only problem. Look at Iceberg, three separate records. It’s like one of those second grade quizzes—”Which of these three records is unlike the other two?” How can that possibly be considered a valid proxy?

How does one end up with this kind of garbage? Here’s the authors’ explanation:

All varve thickness records publicly available from the NOAA Paleolimnology Data Archive as of January 2012 are incorporated, provided they meet the following criteria:

• extend back at least 200 years,

• are at annual resolution,

• are reported in length units, and

• the original publication or other references indicate or argue for a positive association with summer temperature.

Well, that all sounds good, but these guys are so classic … take a look at Devon Lake in Figure 1 (it’s DV09). Notice how far back it goes? 1843, which is 170 years ago … so much for their 200-year criterion.

Want to know the funny part? I might never have noticed, but when I read the criteria, I thought “Why a 200-year criterion?” It struck me as special pleading, so I looked more closely at the only record it applied to and said huh? It didn’t look like 200 years. So I checked the data here … 1843, not 200 years ago, only 170.

Man, the more I look, the more I find. In the same vein, both Sawtooth and Murray have short, separate segments tacked onto the end of their main data. Perhaps by chance, both of them will add to whatever spurious hockeystick has been formed by Korttajarvi, Soper and the main players.

So that’s the first look, at the raw data. Now, let’s follow what they actually do with the data. From the paper:

As is common, varve thicknesses are logarithmically transformed before analysis, giving distributions that are more nearly normally distributed and in agreement with the assumptions characterizing our analysis (see subsequent section).

I’m not entirely at ease with this log transformation. I don’t understand the underlying justification or logic for doing it. If the varve thickness is proportional in some way to temperature, and it may well be, why would the temperature be proportional to the logarithm of the thickness?

In any case, let’s see how much “more nearly normally distributed” we’re talking about. Here are the distributions of the same records, after log transformation and standardization. I use a “violin plot” to examine the shape of a distribution. The width at any point indicates the smoothed number of data points with that value. The white dot shows the median value of the data. The black box shows the interquartile range, which contains half of the data. The vertical “whiskers” extend to 1.5 times the interquartile range above and below the black box.

Figure 2. Violin plots of the data shown in Figure 1, but after log transformation and standardization. Random normal distribution included at lower right for comparison.
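
For anyone who wants to play along at home, here’s a minimal sketch in R of how a plot like this can be made. It is not the authors’ code: it assumes the varve records are columns of a data frame I’m calling “proxy”, and it uses the “vioplot” package.

    library(vioplot)

    # log-transform and standardize each record, dropping missing and non-positive values
    logstd <- lapply(proxy, function(x) {
      y <- log(x[!is.na(x) & x > 0])
      (y - mean(y)) / sd(y)                   # mean 0, standard deviation 1
    })
    logstd$Normal <- rnorm(500)               # random normal series for comparison

    # one violin per record, plus the random normal at the far right
    do.call(vioplot, c(unname(logstd), list(names = names(logstd), col = "grey80")))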

Note the very large variation between the different varve thickness datasets. You can see the problems with the Soper dataset. Some datasets, like Big Round and Donard, have a fairly normal distribution after the log transform. Others, like DV09 and Soper, are far from normal even after transformation. Many of them are strongly asymmetrical, with excursions of four standard deviations being common in the positive direction; by contrast, they often vary by only half of that, two standard deviations, in the negative direction. When the underlying dataset is that far from normal, it’s always a good reason for further investigation in my world. And if you are going to include them, the differences in which way they swing from normal (excess positive over negative excursions) affect both the results and their uncertainty.

In any case, after the log transformation and standardization to a mean of zero and a standard deviation of one, the datasets and their average are shown in Figure 3.

Figure 3. Varve thickness records after log transformation and standardization.

As you can see, the log transform doesn’t change the problems with e.g. the Soper or the Iceberg records. They still do not have internal consistency. As a result of the inclusion of these problematic records, all of which contain visible irregularities in the recent data, even a simple average shows an entirely spurious hockeystick.

In fact, the average shows a typical shape for this kind of spurious hockeystick. In the “shaft” part of the hockeystick, the random variations in the chosen proxies tend to cancel each other out. Then in the “blade”, the random proxies still cancel each other out, and all that’s left are the few proxies that show rises in the most recent section.
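
You can see the effect with nothing but noise. Here’s a quick sketch, using simulated data that has nothing to do with the actual proxies: average ten red-noise pseudo-proxies that contain no signal at all, plus two that have a rise tacked onto the last fifty “years”, and a blade pops out of a flat shaft.

    set.seed(42)
    years <- 1400:2000
    n <- length(years)

    # ten autocorrelated pseudo-proxies containing no signal at all
    noise <- replicate(10, as.numeric(arima.sim(list(ar = 0.7), n)))

    # two pseudo-proxies with a rise tacked onto the last 50 "years"
    ramp  <- c(rep(0, n - 50), seq(0, 3, length.out = 50))
    jumpy <- replicate(2, as.numeric(arima.sim(list(ar = 0.7), n)) + ramp)

    # standardize each pseudo-proxy and average them
    proxies <- scale(cbind(noise, jumpy))
    plot(years, rowMeans(proxies), type = "l",
         xlab = "Year", ylab = "Average of standardized pseudo-proxies")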

My conclusions, in no particular order, are:

• The authors are to be congratulated for being clear about the sources of their data. It makes for easy analysis of their work.

• They are also to be congratulated for the clear statement of the criteria for inclusion of the proxies.

• Sadly, they did not follow their own criteria.

• The main conclusion, however, is that clear, bright-line criteria of the type that they used are a necessary but not sufficient part of the process. There are more steps that need to be followed.

The second step is the use of the source documents and the literature to see if there are problems with using some parts of the data. For them to include Korttajarvi is a particularly egregious oversight. Michael Mann used it upside-down in his 2008 analysis. He subsequently argued it “didn’t matter”. It is used upside-down again here, and the original investigators said don’t use it after 1750 or so. It is absolutely pathetic that, after all of the discussion in the literature and on the web, including a published letter to PNAS, Korttajarvi is once again being used in a proxy reconstruction, and once again being used upside-down. That’s inexcusable.

The third part of the proxy selection process is the use of the Mark I eyeball to see if there are gaps, jumps in amplitude, changes in variability, or other signs of problems with the data.

The next part is to investigate the effect of the questionable data on the final result.

And the final part is to discuss the reasons for the inclusion or the exclusion of the questionable data, and its effects on the outcome of the study.

Unfortunately, they only did the first part, establishing the bright-line criteria.

Look, you can’t just grab a bunch of proxies and average them, no matter if you use Bayesian methods or not. The paleoproxy crowd has shown over and over that you can artfully construct a hockeystick by doing that, just pick the right proxies …

So what? All that proves is yes indeed, if you put garbage in, you will assuredly get garbage out. If you are careful when you pack the proxy selection process, you can get any results you want.

Man, I’m tired of rooting through this kind of garbage, faux studies by faux scientists.

w.


91 thoughts on “Spurious Varvology”

  1. What do you think is an appropriate proxy these data represent?
    I think that is a basic call.

  2. “Consensus” climate science has been on life-support for a long time now.
    As new treatments consistently fail, weirder and weirder shamanistic rituals have to be carried out in order to keep the patient from expiring completely.

  3. Good post Willis. How is tighter packing due to greater compression of deeper layers reconciled? There must be a K number assigned to each core to accurately gauge the varve thickness of each layer. Failing to accurately account for differences based on time and compression would likely skew all data toward greater thickness in more recent periods.

  4. Again, nice one, Willis. But you are far more patient than I am; I could not be bothered running down so many of the faults in this ‘study’ – finding that the authors couldn’t be bothered sticking to their own criteria would have been enough for me to call the game off.

  5. When all is said and done, no matter how many studies, no matter how many grants and billions of dollars wasted on “climate research”, doesn’t all of this simply amount to saying, “It’s a teensy, weensy bit warmer than it was at the end of the Mini Ice Age”? All the hockey stick crap was just a justification for keeping the gravy train rolling. Thanks to Steve, Anthony, Willis and others the whole edifice is crumbling. It’s only the scientific illiteracy of politicians, the media and a truly dumb public that is allowing this nonsense to go on as long as it has. It should have died long ago but there was (and is) too much money and power to be had by keeping it alive. Thanks to all who are bringing the BS to light.

  6. Thank you, Mr. Eschenbach and Mr. Watts. Your thorough and insightful summarizing of scientific papers in language a non-science major can understand (er, well, ahem, most of the time…), is much appreciated. This is my first and likely last post (I am not likely to have anything substantial to say), but know that I and, no doubt, thousands [ah, why not speculate — MILLIONS $:)] of us are silently reading and learning, gaining more ammunition for the fight for TRUTH.
    Thanks, too, to all you other cool science geeks who weigh in, here! You rock.

  7. Any ostensibly scientific paper that uses the word “unprecedented” in the abstract is clearly not going to present an objective analysis. And “unprecedented in the past 600 years” is the clincher – attempting to stir emotion over a result that, even if it were accurate, would still be a big “so-what?”. Not that much different from saying the storm we just had was unprecedented in the past three weeks.

  8. Another fine example of junk science propagandists masquerading as scientists. Clearly there is something wrong with at least 4 of these records. Why would anyone who wants to know the truth use them? It’s a sure sign that these folks are not interested in finding out good estimates of past climate.
    Good write up as usual. Wanted to mention, it looks like a spell checker got your valves and varves mixed up Willis.

  9. In Ancient Rome a soothsayer would often study the entrails of a chicken and make sage pronouncements and wise prognostications on what he discerned therein.
    More recently old ladies would study the tea leaves left behind in the bottom of cups to predict romance or tragedy.
    In branches of science like physics and chemistry observation and measurement takes place at astounding degrees of smallness…but this is verifiable and repeatable measurement.
    In ‘paleo climatology’ there is no such accuracy or repeatability, there are no benchmarks or base lines…it’s thirty percent feel, thirty percent gut instinct, thirty percent guesswork and thirty percent fabrication…it’s a farcical pursuit. Look at Michael Mann.

  10. TH2013 say: “The summers of 2005, 2007, 2010 and 2011 were warmer than those of all prior years back to 1400 (probability P > 0.95), in terms of the spatial average.”
    So…what if the spatial averages of their corresponding winters were colder than those of all prior years back to 1400 with Pr>.95? Wouldn’t that make it a wash?

  11. Willis, somehow I find this appropriate.
    Take it as you may…………………accuracy, in the end, is what matters………

  12. Geoff, The term –> “Valve is an anatomical term applied to the shell of those mollusks that have a shell.” Reference: Wikipedia … The incremental deposit of the shell layers are measurable and might imply something about the “climate”, weather, nutrition for the critters, etc. at some time in the past…
    Question for scientists here: would it be true the valve thickness is really measuring how well/poorly the critters were eating at the time? It seems to me there would be many possible variables about food supplies and that not just temperature should be considered. What if a nice hot-spring were nearby at the time and some ate really well, while others across the lake starved. If this is true then valve thickness cannot always be used for a climate proxy. Yes/no ???
    Thanks to Willis, yet again, for great analysis…

  13. Does anyone know: Are these varves from glacier-fed lakes? Not mentioned by Willis, but surely they must be if there is any suggestion that thickness measures temp.

  14. You can smell the desperation. They are going berserk to make their case by any means possible that distracts attention from the satellite record, the only reasonably accurate data we have. No warming for two decades. They can hype the past versus the present all they want, but it really makes no difference. Their proxy studies could be 100% accurate and there STILL would be no warming for the last two decades. They can’t change that, and they can’t change the fact that all the models have predicted (projected, whatever) warming over the last two decades that is entirely absent. Increased drought, absent. Increased severe weather, absent. Decreased ice, on a global basis, also absent. Every piece of data we have that has any semblance of accuracy over the last two decades contradicts all the theory and all the models. The only thing these clowns have left is trying to paint the distant past as meaningful so that nobody looks at the current data.

  15. Nice post Willis, thank you.
    ————————
    tchannon says:
    April 13, 2013 at 8:25 pm
    “What do you think is an appropriate proxy these data represent?
    I think that is a basic call.”
    Agree tchannon. In my experience, sediment yield for a given period, e.g. annual, and for maximum annual sediment yield are first usually an indicator of more runoff, either from more precipitation (including magnitude and rate or intensity) or land use changes. For example, a severe forest fire can cause changes of runoff on the order of about 2 to 10 and increases sediment yield on the order of 100 or more. Second, since the highest sediment yield in a given period is often a few orders of magnitude larger than the arithmetic mean, a skewed distribution such as log-normal, gamma, etc. fits the data better than a line through the untransformed data. The authors may have picked up a book on erosion and sedimentation and grabbed the log-normal distribution. So if I had to pick what a change in varve thickness represents I would first look for runoff and second what caused the change in runoff.

  16. I’d kill the auto-correct if I could, changing “varve” to “valve” in a bunch of spots.
    Fixed … grrrr.
    w.

  17. Thanks. Most informative. But what does it mean when you say about Tiljander that it was used “upside down”? Does it mean that the proxy actually shows that the temperature has dropped?

  18. What if there has been no global warming for the past 17 years? What if climate has changed naturally for the past billion years? What if it was the Sun’s fault?

  19. Leonard, varve thickness was first used by de Geer 120 years ago to measure the rate of retreat of the last glaciation across Sweden. My presumption is that it is a bit of neglected tacit knowledge here that these varves can be closely associated with ice melt at the polar perma-frost boundary. Thus, while there is still the problem of interference by bridge building etc., the association with temp is not so absurd.

  20. Franz-
    I think it literally means upside down, they have changed the sign from positive to negative by “mistake.”
    I worked in complex human genetics for a long time, and many people in that area are geneticists and not mathematicians, so they screw up the math in papers all the time. To me it looks like climate science has the same issue: the datasets are too complex for the scientists to handle and thus many papers do not have the significance claimed.

  21. Oh dear, once again, where is academic rigor?
    You do not do a log transform unless you have reason to do so. Nature tends not to know what a logarithm is and I suspect there are few examples in Nature of logarithmic distributions of anything. More often, it is used to transform data into a look that satisfies the author and is not more than an arbitrary approximation. The logarithm function is often the first recourse when a linear function has problems, but there are many mathematical transforms possible. The skew in the violin plot, especially the fat high tails of several, shows that caution should be used with these log transforms. I would not class these as statistically valid for further analysis.
    I have not read the article (paywall) so these comments might be refuted therein.
    There is a further problem, shown by Ogac in the last figure. The lower values group into boxes at Y-axis values of -2, -1 etc (by eyeball, see the flat lines you can draw from left to right). These might be an artefact of carrying a small number of significant figures on the small numbers a log transform can create. This in turn feeds back on error estimate and in the general case broadens the bounds if done correctly. I do not know the specifics until I read the paper. So I refrain from calling it a cartoon level treatment for now.

  22. Leonard Lane says:
    April 13, 2013 at 10:46 pm


    In my experience, sediment yield for a given period, e.g. annual, and for maximum annual sediment yield are first usually an indicator of more runoff, either from more precipitation (including magnitude and rate or intensity) or land use changes. For example, a severe forest fire can cause changes of runoff on the order of about 2 to 10 and increases sediment yield on the order of 100 or more.
    Second, since the highest sediment yield in a given period is often a few orders of magnitude larger than the arithmetic mean, a skewed distribution such as log-normal, gamma, etc. fits the data better than a line through the untransformed data. The authors may have picked up a book on erosion and sedimentation and grabbed the log-normal distribution. So if I had to pick what a change in varve thickness represents I would first look for runoff and second what caused the change in runoff.

    Thanks for the interesting thoughts, Leonard. One point in showing the distributions after the log transformation was to point out that each distribution is different. Some are quite normal in distribution after the log transformation. However, others are not. In such cases, if the objective is to end up with a generally normal distribution, a gamma or other distribution should be considered in addition to the log.
    My main point was, they claimed they wanted to get a more normal distribution, but in many cases they were not successful.
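    As a rough check of that point (my own sketch, not anything from the paper), one can run a normality test on each record after the log transform, assuming the same hypothetical “proxy” data frame as in the figures above:

    # Shapiro-Wilk p-value for each log-transformed record; tiny p-values mean
    # the record is still far from normal even after the transform
    sapply(proxy, function(x) {
      y <- log(x[!is.na(x) & x > 0])
      if (length(y) > 5000) y <- sample(y, 5000)   # shapiro.test() accepts at most 5000 values
      round(shapiro.test(y)$p.value, 4)
    })

    Where those p-values come out near zero, a gamma or some other fit would be the next thing to try, as you suggest.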
    Finally, I agree that a change in precipitation or runoff would be the first consideration. Second for me would be to look at the location of the core with respect to the location of the inflow. In natural lakes, a delta often builds up around the inlet. If the incoming water cuts through to the other side of the delta, as happens occasionally, the amount of sediment can change drastically.
    See the discussion of Iceberg Lake at ClimateAudit for an example of what I mean.
    Best regards,
    w.

  23. Franz Dullaart says:
    April 13, 2013 at 11:00 pm

    Thanks. Most informative. But what does it mean when you say about Tiljander that it was used “upside down”? Does it mean that the proxy actually shows that the temperature has dropped?

    Indeed, that’s exactly what it means. In addition, the same problem exists for the Iceberg Lake data. Here is the location, along with the nearest stations:

    The three nearest temperature stations are shown on the map, along with the location of the lake. Iceberg Lake is only about 80 km (50 miles) from the coast of Southeast Alaska. Fortunately, the three nearest stations are not far from Iceberg Lake, and it’s in the middle of the three. Here’s the temperature record:

    As you can see, the three stations are well correlated, particularly in the modern period when measurements are likely to be better. Burwash, being inland, has wider swings than the two coastal stations, but they all move together.
    And as you can also see, in the last century the varve thickness of Iceberg Lake is a pretty good proxy for the temperature, with just one leetle, tiny problem. Note that the scale on the right is inverted … which means that when the temperature increases, the varve thickness decreases.
    So Iceberg Lake (“Iceberg” in the figures above) is upside-down as well. I probably should add this to the head post.
    w.

  24. @Charles Gerard Nelson
    “…it’s thirty percent feel, thirty percent gut instinct, thirty percent guesswork and thirty percent fabrication…it’s a farcical pursuit.”
    I love your climatic mathematics. That is the point exactly with the disturbed sediments. The modern size of the disturbed sediments add up to more than 100% of the ‘temperature’ they encoded at some time in the past.
    Let me have a go at it:
    Korttajarvi is upside down – that is 100% wrong. Then the log-thing: 50% wrong. The inclusion of data that does not meet the criteria: 10% wrong. The inclusion of data sources and methods: 50% right. Spelling his name correctly on the paper: 10% bonus (that is a fixture from my high school days).
    So it is 60% right and 160% wrong.
    Final Mark:
    (60+160) / 392 ppm CO2 = 0.561 or 56%
    That’s a D-minus
    How am I doing so far?

  25. Korttajärvi is not a lake somewhere in pristine wilderness. It is a very small lake surrounded by agriculture, fields that slope towards it and possibly at least one pig farm or cattle farm.
    In the fifties, those small farms got tractors and other power equipment, and fertilizers were introduced. As usual, if a small amount of fertilizer seems to be good, more must be even better.
    So the farmers soaked their fields with enormous amounts of fertilizer. And used tractors to stir the surface.
    Choosing the lake Korttajärvi is a deliberate trick to create an anomaly.
    I have driven past Korttajärvi, it is near Jyväskylä, close to our summer cottage.
    There are only two Korttajärvi:s here in Finland, and the other one is called “Alvajärvi-Korttajärvi” because it is actually just Alvajärvi. It would be an even worse option to study.
    Best Regards
    Timo Kuusela

  26. berniel says:
    April 13, 2013 at 11:22 pm

    Leonard, varve thickness was first used by de Geer 120 years ago to measure the rate of retreat of the last glaciation across Sweden. My presumption is that it is a bit of neglected tacit knowledge here that these varves can be closely associated with ice melt at the polar perma-frost boundary. Thus, while there is still the problem of interference by bridge building etc., the association with temp is not so absurd.

    Thanks, Bernie. The association is not absurd … but as the example of Iceberg Lake shows, the relationship is also by no means obvious. My conclusion about Iceberg Lake was that when it is colder and the ice advances, you get more frost heave and “bulldozing” by advancing ice fronts. That makes for MORE sediment when it’s colder, and less sediment when it’s warmer, the opposite of the blanket assumption used in this study.
    In other areas, the link may be between temperature –> rainfall –> increased river flow –> increased sediment. Theoretically, if a warmer climate in the area leads to more rain, we could see a positive correlation between temperature and varve thickness.
    So it seems to me you’d have to do what I did, and compare the varve data to the local temperature, to try to understand what is going on.
    w.

  27. By the way Iceberg Lake is in the near proximity of Hubbard Glacier, the largest tidewater glacier in North America. Hubbard is gaining in mass and has been advancing since the late 1880s. Haven’t seen much reporting on that in SKS or Huffington Post or anywhere else.

  28. Willis: “So what? All that proves is yes indeed, if you put garbage in, you will indeed get garbage out. If you are careful when you pack the proxy selection process, you can get any results you want.”
    As a practical guy, I’m sure you know that if you search through garbage you can find what you need to make just about anything. Climate science is no different 😉
    Perhaps this field should be classified as a sub-section of garbology. http://en.wikipedia.org/wiki/Garbology
    W “Man, I’m tired of rooting through this kind of garbage, faux studies by faux scientists.”
    There seems to be a surge of more and more spurious work getting published recently.
    Their whole story is falling apart and the public are losing interest. It seems like their last desperate attempt to keep the boat afloat is to publish these spurious studies faster than the rest of the world can rebut them.
    FWIW, I think the idea behind the log is that the mud compacts with time, the water being forced out as it settles. It would be interesting to ask how accurate that simplistic model is and how they account for the uncertainty of that process in their overall uncertainty calculations for the study.

  29. Geoff Sherrington says:
    April 13, 2013 at 11:33 pm

    Oh dear, once again, where is academic rigor?
    You do not do a log transform unless you have reason to do so. Nature tends not to know what a logarithm is and I suspect there are few examples in Nature of logarithmic distributions of anything. More often, it is used to transform data into a look that satisfies the author and is not more than an arbitrary approximation. The logarithm function is often the first recourse when a linear function has problems, but there are many mathematical transforms possible. The skew in the violin plot, especially the fat high tails of several, shows that caution should be used with these log transforms. I would not class these as statistically valid for further analysis.

    My point exactly, sir, and I thank you for articulating it so well.

    I have not read the article (paywall) so these comments might be refuted therein.

    Not that I can find. Their reference for the log-transform usage is Loso, M. Summer temperatures during the medieval warm period and little ice age inferred from varved proglacial lake sediments in southern Alaska. J. Paleolimnol. 41, 117–128 (2009), available here.

    There is a further problem, shown by Ogac in the last figure. The lower values group into boxes at Y-axis values of -2, -1 etc (by eyeball, see the flat lines you can draw from left to right). These might be an artefact of carrying a small number of significant figures on the small numbers a log transform can create. This in turn feeds back on error estimate and in the general case broadens the bounds if done correctly. I do not know the specifics until I read the paper. So I refrain from calling it a cartoon level treatment for now.

    I suspect that it reflects the lower limits of resolution for the actual measurement of the varve thickness, rather than the number of significant figures. Here are the values before the log transform (cf. Figure 1):

    > levels(as.factor(round((proxy[,27]),3)))
     [1] "0.08" "0.09" "0.11" "0.12" "0.13" "0.14" "0.15" "0.16" "0.17" "0.18" "0.19" "0.2"  "0.21"
    [14] "0.22" "0.23" "0.24" "0.25" "0.26" "0.27" "0.28" "0.29" "0.3"  "0.31" "0.32" "0.33" "0.34"
    [27] "0.35" "0.36" "0.38" "0.39" "0.4"  "0.42" "0.43" "0.44" "0.45" "0.46" "0.47" "0.5"  "0.51"
    [40] "0.54" "0.56" "0.57" "0.58" "0.59" "0.62" "0.65" "0.69" "0.73" "0.83" "0.93" "0.94" "1.1"
    [53] "1.12" "1.54" "NaN" 

    You can see that a log transform of that would be stepped.
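    As a quick illustration (made-up numbers, not the Ogac data), round some simulated thicknesses to the nearest 0.01 units the way a coarse measurement would, take the log, and you get exactly that kind of stepping:

    # hypothetical varve thicknesses, "measured" (rounded) to the nearest 0.01 units
    thick <- round(rlnorm(300, meanlog = -1.5, sdlog = 0.6), 2)

    # the log transform of the rounded values takes only a handful of discrete levels
    head(sort(unique(round(log(thick), 3))), 10)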
    Thanks,
    w.

  30. An increase in varve thickness is a measure of increased melting, which grows as the temperature increases. Similarly, a decrease in varve thickness means the temperature has dropped.
    The problem is that the samples are being taken in places where varve formation is favourable today. And therefore data is lacking from the period from the year 800 to the year 1000, when the temperature increased.

  31. “For if we are uncritical we shall always find what we want: we shall look for, and find, confirmations, and we shall look away from, and not see, whatever might be dangerous to our pet theories. In this way it is only too easy to obtain what appears to be overwhelming evidence in favor of a theory which, if approached critically, would have been refuted.” — Karl Popper, 1957, The Poverty of Historicism (London: Routledge), p. 124

  32. To be fair to the authors, they are acting to a standard and in a way that is the norm for their area.
    That the standards are so low and the manner of working is such a joke within climate science is another problem.

  33. By the way, I must speak out in favor of log transforms. As a matter of fact, one could seriously consider that logs are Nature’s #1 way of counting, and that our use of integers just shows our local-mindedness. Example: black body radiation spectrum profile follows the normal curve on the log scale, but on the integer scale it shows a characteristic curve, see e.g. http://en.wikipedia.org/wiki/Planck's_law . So on the log scale it is symmetric — obviously a better scale. Another example: financial graphs would be better if plotted logarithmically, because then a doubling of the price is the same y-displacement wherever it goes. And so on. So I would not be quick to jump on the anti-logarithm bandwagon, no, not at all.

  34. Excellent post Willis, as always. Thank you.
    More worrying for me is that these varves patently are not a good proxy for temperature. I say this because the graphs shout out that they are contaminated in individual years. Just look at the sudden spikes in lots of them. No one seriously believes the temperature spiked by, in some cases 5 fold, for a single year or two. Even assuming a log relation to temperature doesn’t make those spikes credible. And the spikes are not consistent across the world, so they are patently not ‘real’ measures of anything – they are local conditions being reflected locally.
    But the criteria Willis put up for proxy inclusion do not indicate any special processing of these spikes.
    Let’s imagine that these really were ‘thermometers’. Would anyone get away with using this data? Of course not. Reviewers would be all over them, calling for a processing step that removed the obvious year of data when an apprentice reported in Fahrenheit instead of Celsius.

  35. Interesting as always, Willis
    I am wondering about the varve proxy approach in general:
    i) So it seems probable that varve thickness varies in some way in sync with summer temperatures. But why would it be linear? Or, why would it be log-linear? What is the physics behind this assumption? And why would a lake in Alaska show the same behavior as a lake in Finland?
    ii) Assuming it has been established that a e.g. log-linear relationship has a physical justification, what is the inaccuracy of each individual temperature “measurement”? 0.1 degree, 1 degree, 2 degrees? 10 degrees? (My guess 1 or 2 degrees)
    iii) In this case, there are 11 “thermometers” all measuring/sampling DIFFERENT signals, i.e. (indirect) temperatures in 11 different locations. How is it then possible to determine some sort of northern average temperature with an accuracy of 0.1 or even 0.01 degree? (I assume this is necessary in order to rank the years.)
    Thanks,
    /Johan

  36. I should have added in response to Geoff Sherrington:
    “More often, it is used to transform data into a look that satisfies the author and is not more than an arbitrary approximation”
    This is what I was thinking – the log process compresses the spikes, and makes the processed data more ‘credible’ in the authors’ eyes.

  37. As a general rule, if samples from presumed equivalent populations that mostly show a normal distribution show skewness or kurtosis, then those populations are subject to other effects than the normal-distribution populations.
    Combining them is poor science.
    Personally, I’d replace faux by cr@p.

  38. Willis! You have misinterpreted Korttajarvi. It is not fixed to varve thickness. From Tiljander 2003:
    “It is now possible to interpret long varve records with one year or seasonal resolution. Variation in the relative X-ray density describes the structure of the sediment and provides information about the sedimentary environment. Varve thickness indicates the rate of sediment accumulation, and magnetic measurements have been used to provide information on the nature of the accumulating mineral particles (e.g. Snowball et al. 1999, 2002).”

  39. As I said above: I think the idea behind the log is that the mud compacts with time, the water being forced out as it settles. It would be interesting to ask how accurate that simplistic model is and how they account for the uncertainty of that process in their overall uncertainty calculations for the study.
    The log is the inverse of the exponential, which is very common in natural processes. So taking the log could be justified. It may warrant further inspection, but I don’t think it’s grounds for shouting and yabooing without reading the paper.
    No point in slinging mud at varvologists, they love the stuff.
    This result looks very weak for the other reasons Willis has highlighted. I’d be inclined to concentrate on that rather than the log issue.

  40. Just how often do you encounter a Google Map aerial wiev that is a nighttime wiev?
    The Korttajärvi region “just happens” to be like that. Like a black hole in the middle of Finland.
    Luckily there is an option: kansalaisen.karttapaikka.fi. It is in black and white aerial photography, but much better than Google Maps. Just choose “ilmakuva” – “aerial image” (you can choose English too) and you can watch closely any part of Finland. Korttajärvi is 10 km / 6 miles north of Jyväskylä, just south of Puuppola. You can see for yourselves how good a lake that is to have such scientific importance.

  41. My daughter wanted to know how on earth I can write the word “view” -“wiev”……
    A facepalm….

  42. Thanks Willis.
    But what does ”embiggen” mean? Does ”Enlarge” just about cover it?

  43. “These and other recent extremes greatly exceed those expected from a stationary climate”
    And there is your problem. You’re denying climate change. [snip] ~mod

  44. It’s easy to joke about “upside-down Mann” etc etc but this whole issue strikes me as being absolutely outrageous.
    As I understand it, Mann’s method in Mann 2008 simply used absolute values and ignored the sign. Amazing, but true. His method had the hockey stick assumption (with the blade going upwards) built in. This is obviously unacceptable, but it could conceivably have been an honest mistake. However, Mann is clearly aware of this problem and he has done nothing to rectify it. Because of this, what might have been an honest mistake becomes scientific misconduct or even outright fraud.
    It’s inconceivable that the present authors were unaware of the upside-down problem. If so, then they would also be open to accusations of scientific misconduct or fraud.
    I’m assuming that the Korttajarvi upside-down claim can be proven. If so, then there would be a clear-cut case of scientific misconduct against both Mann and the present authors. Of course, the fact that Tiljander clearly warned that the data could not be used for 19th and 20th century reconstructions, due to local industrial activity, would hardly help their case.
    There does appear to be a strong case for scientific misconduct or fraud. So why are the bodies who are supposed to be protecting the integrity of science doing nothing? I think we know the answer to that.
    By the way, I think nature does tend to be logarithmic. Some of you may be familiar with Benford’s law. It explains why, for instance, the first pages of books of logarithms were more grubby than the rest. If you look at a list of share prices you should find that around a third of the prices begin with the number one. It’s quite remarkable. It seems that any list of measurements of natural variation follow the law. It has actually been used to detect financial fraud. Who knows, maybe it could be used to detect climate science fraud…..
    Quite possibly Benford’s law arises because natural processes do tend to follow a logarithmic law.
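    For anyone who wants to try it, here is a little sketch of my own (nothing to do with the paper) of what Benford’s law predicts, and of how numbers that span several orders of magnitude come close to it:

    # Benford's law: the predicted frequency of leading digit d is log10(1 + 1/d)
    d <- 1:9
    round(log10(1 + 1/d), 3)                  # digit 1 leads about 30% of the time, digit 9 about 4.6%

    # leading digits of log-normally distributed "measurements" spanning many orders of magnitude
    x <- rlnorm(10000, sdlog = 2)
    lead <- floor(x / 10^floor(log10(x)))     # first significant digit of each value
    round(table(lead) / length(lead), 3)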
    Chris

  45. Willis – this paper does not use the Korttajarvi proxy “upside down.” The problem is that they use the contaminated portion post 1720.
    The 3 proxy series that can be generated from the core are thickness, X-Ray density lightsum, and X-Ray density darksum. The densities represent contribution of plants vs. minerals in the sediment – I forget which is which. Mann used one of the density series “upside down” in that it has an inverse relationship w/temp.
    If TH2013 are only using thickness, they’re using the correct orientation – but should NOT have used values in the modern, contaminated portion of the series. Regards – Terry

  46. I don’t understand the “butterfly” plots.
    I think the log transformation is to account for layer thickness compression.

  47. “These and other recent extremes greatly exceed those expected from a stationary climate, but can be understood as resulting from constant space–time variability about an increased mean temperature.” I’m not sure why Mike B the Chead in Switz got snipped for commenting, but this sentence, to me, is the crux of the problem.
    It doesn’t matter where the varves of the valves meet the cool of the day if your premise, the very basis of your research is wrong. We do not live in a static climate. When this is the assumption of TH2013 and they go looking for it, why of course they will find it. In essence they are saying that they just felt it was too warm. So they read the hype and looked for confirmation of the hype among the writings of other confirmation confabulists and data contortionists.
    It becomes evident daily that we need more Liu Yu’s doing his ten years of Tibetan field trips and analyzing his own data, and fewer folks throwing the work of others into the grand bingo hopper of science and pasting up the data of others even when it’s upside down and backward.

  48. A possible explanation for the non-normal and non-lognormal distributions of varve thicknesses may be that the data for single sites include several populations with different means/medians which result from different climatic conditions. For instance, in the Big Round data the varve thicknesses between around 1580 and 1780 are thinner than for the periods before and after, suggesting lower rainfall [maybe] during this period. More to the point, the data from 1580 to 1780 seem to form a separate population and should not be lumped together with the earlier and later data in a statistical analysis.
    And as for this…
    “As is common, varve thicknesses are logarithmically transformed before analysis,..”
    The authors need to show that varve thicknesses are log normally distributed before applying a log transformation.
    Otherwise they are doing it simply because….. “As is common, varve thicknesses are logarithmically transformed before analysis,..”

  49. Great take-down, Willis. It only takes a few bad apples (proxies like Korttajarvi, Soper etc) to contaminate the whole barrel (the spurious “blade” on the average of them all).

  50. The failure to follow their own proxy criteria is worse than you portray. The readily available online abstract for Ogac Lake specifically says a temperature record IS NOT DISCERNABLE. It was used anyway.
    The big giveaway is the lack of an LIA in the conclusion, which comes partly from data massage and partly from proxy selection. For example, the criteria meant the recent Braya So multi-proxy series from Greenland, published in Nature last year, was omitted. It has a three year resolution, and of course shows the LIA as well as fairly continuous warming since its end around 1800 – not 1900. Just like the Greenland Alley ice core does. And this paper’s selection of proxies doesn’t.

  51. That last “criteria”: “the original publication or other references indicate or argue for a positive association with summer temperature” is hardly a criterion at all. I’m sure I could find “references” that “argue for” phlogiston or just about anything else.
    Obviously, they only actually used one criterion: will it aid in realizing results “consistent with” Catastrophic Anthropogenic Global Warming.

  52. So another proxy study employing proxies with questionable qualities, and including data they should know is garbage, to which was spliced the instrumental record for only the last few years, and then the expected claim is made that those years were the hottest in the series. But those spliced records are not in the series. It appears they had this all planned out before they started.
    It would be interesting to overlay the central England record on top of all of the proxies and see how they compare. As a starter, if there is no close comparison in the overlap years, toss it all out.
    Also, I have a gut feeling that the thickness of a lake sediment probably has even less to do with temperatures than tree rings, and more to do with local weather events like heavy rain, forest fires, windstorms, etc.. Was any isotope analysis done of the sediments? Where can we see proof that these varves tell us anything about temperature?

  53. Willis writes:
    “Look, you can’t just grab a bunch of proxies and average them, no matter if you use Bayesian methods or not. The paleoproxy crowd has shown over and over that you can artfully construct a hockeystick by doing that, just pick the right proxies …”
    That hurts. That’s the way Alarmists roll. They practice science by press release only. Too bad that Nature has become nothing more than a public relations agency for Alarmists.

  54. Look, you can’t just grab a bunch of proxies and average them, no matter if you use Bayesian methods or not.

    Exactly. For the same reason that “global temperature” has no physical meaning.
    Even IF (and that’s a big if) these proxies are adequate to represent temperature, it’s obvious that they have their own climate regimes. There is no “teleconnection”. Global Warming isn’t global.

  55. The Willamette Valley lake and wetland sediments would also be affected, but not just by pioneer farmers. The Indian population in that fertile valley figured out, way before pioneers did, that farming was the way to go. They cleared land, burned on a seasonal basis, planted, and harvested their food. And were quite industrious at it! To a large degree, history, and in particular AGW history, fails to acknowledge American Indian land use prior to that of whites. Truth be told, they were as good, and bad, at it as whites were. To the extent that these layers would have to be peeled back further than one would think before getting sediments that are devoid of human influence.

  56. What a treasure Willis is. In addition to his first rate knowledge of science and statistics, he takes you right to the lake itself. What a wealth of knowledge and common sense he has.

  57. If you want to show “unprecedented warming over 600 years”, that’s really fishing with dynamite anyway. 600 years is easy, because 1400 was solidly after the MWP. In fact, it is well down the side of the slope from the MWP down to the LIA, which is tied for the coldest single period in the entire Holocene post YD. The start date is not, I’m sure, overtly “cherrypicked”, but the conclusion is utterly unsurprising and probably even true, to the extent that the present represents temperatures last “precedented” in the MWP, where the Earth managed them without the help of anthropogenic CO_2. Run the same study back to (say) 800 CE or 500 BCE, if you could, and you might find that the present even with the problems pointed out by Willis is hardly unprecedented.
    Why do we keep seeing that word, unprecedented? Because it is indeed a key term in the dialectic of post-Mannian hockey-shtickery. Go back in time to the very AR report where Mann’s hockey stick was promoted from the first paper of an unknown researcher who was so bad at what he did that he actually wrote his own PCA code in Fortran (instead of using R, or SAS, or Stata, or any of the well-written, debugged, readily available tools) and then tweaked it with the GOAL (as revealed in the Climategate letters) of “erasing the LIA and MWP”. There is a paper and graphic there by (IIRC) Jones or Briffa that clearly showed both, with the present pretty much indistinguishable from the MWP.
    How can you convince the world to spend its scarce political capital and economic wealth on “global warming” to the substantial enrichment of selected groups and with the promotion of a profound anti-civilization agenda that de facto freezes 2/3 of the world in abject poverty? Simple. Show that the climate of the present is “unprecedented”, because if it is precedented it confounds the assertion that CO_2 is a necessary adjunct to its unprecedented behavior.
    And indeed it is unprecedented, if you get to choose the start point of the interval that you look at. It is unprecedented over the last 100 years. It is unprecedented over the last 200, 300, 400, 500, 600, 700, 800 years. Go back 900 years and you start to hit an interval that was — within our ability to resolve past temperatures via proxies — almost as warm. Go back 1000, 1100, 1200 years and it is not unprecedented at all. Go back 2500 years and it might actually be cooler, not even a match for the warmest decades or centuries in the record. Go back 12,000 years, to the start of the post-YD Holocene, and it not only is not unprecedented, it might be a full degree to degree and a half cooler than the Holocene Optimum — might because going back that far increases the noise of uncertainty from the best of proxies.
    What else can we make “unprecedented” with this sort of analysis? Well, pretty much any non-flat curve. Simply go to the end. Proceed backwards from that endpoint to the previous maximum or minimum. All of the change — in either direction — observed in that curve is then “unprecedented” over that interval.
    To construct the curve itself that is being analyzed out of data corrupted with confounding influences as Willis deconstructs up above is simply adding insult to a very basic cherrypicking injury. I tend to be very suspicious of temperature proxies that are multifactorial, where the actual cause of the variation is not temperature as a direct agency but as an inferred secondary agency.
    Tree rings are an excellent example, as they are a direct proxy of precipitation, not temperature per se. Although there is some correlation between precipitation and mean temperature, it is not a compelling one or a predictable one. Some years it is wet and cold. Some years it is wet and warm. Many years it is hot and dry. Many years it is cold and dry. One can go back in the instrumental, historical record and find multiple instances of all of these combinations occurring, and these aren’t the only two factors that affect tree rings — insect infestations, fungus infestations, tree diseases, fires, volcanic aerosol activity, animal predator/prey cycles — all of these things and more can affect how much or little a tree grows in any given year or decade, and many of these problems are persistent ones. Even if a given species in a given location tends to have a particular association between temperature and a given growth pattern now, it is by no means clear that that association is persistent into the indefinite past on a century time scale.
    One small part of this is revealed by the difficulty of cutting contemporary trees and inferring the measured temperature series for the location from its rings. Around here, rings will be packed in pretty nastily during the extreme droughts we had during the 80s, when it was nominally cooler, but in recent years with only one exception if anything it has been nice and wet. If one cuts any of the trees in my yard, or the pin oaks that are 100-150 years old, and tries to parse out the temperature, all you’re likely to actually get isn’t temperature, it is rainfall, and annual rainfall is at best weakly correlated with temperature. This, too, was noted in the climategate emails, where almost exactly this “backyard experiment” was done by a dendroclimatologist (or perhaps his son for a science fair project) with utterly inconclusive results.
    Varves sound no better. Ultimately, they sound like they are a proxy not of temperature per se, but of a mix of precipitation and ice melt. As noted, they are easily confounded by land use changes — any sort of land use change (natural OR anthropogenic — forest fires and other natural events might well contribute) can cause years to decades to permanent alterations of runoff patterns as profound as the natural “signal” of water flow alone. One can even imagine years with MINIMAL flow but maximum sedimentation. Is a peak due to a particularly warm but short summer following a particularly long, cold, and snowy winter, so that the ice melt was violent and maximally turbulent (producing a thick sediment layer in a colder than normal year)? Is it due to a normal winter followed by a particularly cold, wet summer, so that flooding and high sedimentation is caused not by ice melt from high temperatures but by latent heat of fusion as excessive cold rain melts more ice than mere dry air sunlight would have done?
    Sure, one might find local weak correlations between temperature and sedimentation now, but climate does change and always is changing for non-anthropogenic reasons; it may well have shifted between patterns with the same mean temperature but different precipitation/melt/sedimentation.
    Ask the settlers of the Roanoke colony about “climate change”. Seven years of “unprecedented drought” across most of the Southeast US, drought so severe it wiped out whole tribes of native Americans (and the colony), nearly wiped out Jamestown, forced the migration of other tribes searching for water, during a time of unprecedented cold, the coldest stretch of the entire Holocene. We know about the drought from a mix of historical records and from (yes) tree rings. We know about the cold from a mix of historical records and from other proxies. Given prior knowledge from human recorded history, we can look at these tree rings and determine that they were cold and dry rings, not hot and dry rings.
    Without that prior knowledge, what would we do? Take a snapshot of rings from (say) the last 100 years where we have decent thermometric data for the Southeast US. We note that over that stretch, there were a handful of periods of drought lasting 2 or more years (this is only 100 years, after all!) and that there was a coincident patch of them in the 80’s (as well as more isolated ones scattered more or less uniformly across the century). The patch gives “drought vs temperature” a small positive correlation in a linear fit. We look back at the rings from the early 1600s, note that they are far more tightly packed than even rings from the 80s and conclude that this was a heat wave in the Southeast US, not a period of bitter cold, dry winters and comparatively short, hot and dry summers.
    Sure, people who do this try to correct for this sort of confounding, but it is very difficult and in the end, uncertainties in the process if honestly stated are large enough to erase almost any conclusion concerning temperatures 400 years ago based on multifactorial proxies. Even if a tree species is very positively correlated with temperature locally over the thermometric record, that in no way guarantees that that correlation persists back hundreds of years, because the climate does shift over that sort of time scale and a different climate might have had a different correlation that mimics the one observed now but with entirely different associations between observed patterns. Statistics is always based on the assumption of “independent, identically distributed” sampling, but when sampling from a chaotic dynamical system with persistent structured behavior on century timescales punctuated by comparatively sudden transitions between entirely dissimilar behavioral regimes, this assumption is generally false. You might as well try to extrapolate the behavior of a butterfly based on a series of observations of a pupa in a cocoon, or the behavior of a teen-age adolescent human based on observations of humans sampled from a retirement community. Some of one’s extrapolations might even turn out to be right — teen-agers and retirees both eat, for example — but it is difficult to predict which ones are valid when you cannot go and check, when one’s only knowledge of teen-agers is from inferences drawn from elderly samples.
    We are in this state with almost all of climate science. As even the climate research community is starting to openly acknowledge, we are no more capable of explaining the Medieval Warm Period based on our observations of the modern warm period than a study of current hip hop music explains sock hop music from the ’50s. Both periods are warm. Both genres are music.
    What we do know is that in the MWP, it was warm without anthropogenic CO_2. We know that climate variations at least this warm are entirely possible without any anthropogenic influence whatsoever. This confounds any attempt to infer that the modern warm period is exclusively due to anthropogenic increases in CO_2 based on the fact that it is, in fact, warm. Even people untrained in statistical inference understand this. It is mere common sense.
    Which is why it has been, and continues to be, the primary duty of the Hockey Team and its affiliates to demonstrate that the modern period is “unprecedented”: by erasing, ignoring, or cherry-picking away the MWP, the RWP, and the proxy-derived record of the entire Holocene; by pretending that we somehow know what temperature it “should” be outside (that is, what it would be in the complete absence of human influence); and by misusing corrupted data sources that do contain a human signal, just not a human signal associated with climate. All of it serves to create panic, concern, and a willingness to open one’s pocketbook and grant one’s vote to the proposition that we are causing a disaster so dire that no price is too big to pay to ameliorate it.
    This is, in one way, entirely unprecedented. It is an unprecedented abuse of science. And one day, we will pay for it.
    rgb

  58. rgbatduke says:
    April 14, 2013 at 8:36 am
    Spot on. RGB has written a highly informative article that ought to be foundational reading in the paleoclimatology community.

    Example: the black body radiation spectrum profile follows the normal curve on the log scale, but on the linear scale it shows a characteristic curve, see e.g. http://en.wikipedia.org/wiki/Planck's_law . So on the log scale it is symmetric — obviously a better scale.
    Say what? No, it doesn’t. How are any of:
    http://en.wikipedia.org/wiki/File:RWP-comparison.svg
    normal? Or if you prefer:
    http://commons.wikimedia.org/wiki/File:Planck_law_log_log_scale.png
    There is a nice paper online that shows the radiance with a log representation of the frequency/wavelength axis, and it still isn’t normal. The blackbody radiation curve is not just an exponential transform of the normal distribution.
    Is the log scale “obviously better” on some other basis? Well, the paper I just referred to (by Marr and Wilkin, if you want to search for it) argues that there is some point to presenting the Planck curve on alternative scales, but personally I don’t see it. The usual comparison is to the classical theory, e.g. Rayleigh-Jeans, with its ultraviolet catastrophe. R-J isn’t even vaguely normal, and yet it asymptotically describes the Planck curve in the long-wavelength limit. The physical mechanism of the cutoff is not any sort of averaging process — it is quantization of the EM field.
    Note well this latter point. The reason the normal curve is important is almost exclusively the Central Limit Theorem. That is, it is useful when the quantity being examined is itself a mean derived from some sort of statistical average of a compact (finite-variance) distribution. The CLT is so powerful in part because it shows that normal distributions are in some sense a “fixed point” of data transformations — as long as a distribution is sufficiently compact, the mean of a large sample from it will be approximately normally distributed, and as long as any transform of that distribution is still sufficiently compact, the mean of samples of the transform will also be approximately normally distributed. I would have made the point in the exact opposite way — the fact that the Planck curve is not particularly normal — or at any rate is highly skewed with a very long tail — suggests that the underlying process is not well described as the average of a compact distribution, and that the natural log is not sufficient to compactify it.
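    For anyone who wants to check this directly, here is a minimal numerical sketch (Python, with an arbitrary 288 K temperature and an arbitrary 1 micron to 1 mm window, neither taken from the paper) that treats the Planck curve as a density in log-wavelength and computes its skewness; a normal curve would give zero:

    import numpy as np

    # Planck spectral radiance per unit wavelength (SI units); T in kelvin is an
    # arbitrary illustrative choice.
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    def planck(lam, T=288.0):
        return (2.0 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

    # Change variables to u = ln(wavelength) and normalize the curve as a density.
    lam = np.logspace(-6, -3, 20000)        # 1 micron to 1 mm
    u = np.log(lam)
    du = u[1] - u[0]                        # evenly spaced in log-wavelength
    w = planck(lam) * lam                   # B_lambda * (dlambda/du), since dlambda/du = lambda
    w /= w.sum() * du

    mean = (u * w).sum() * du
    var = ((u - mean) ** 2 * w).sum() * du
    skew = ((u - mean) ** 3 * w).sum() * du / var ** 1.5
    print("skewness of the Planck curve in log-wavelength:", round(skew, 2))

    The skewness comes out clearly nonzero (positive, with the long tail toward long wavelengths), so the curve is not simply a normal distribution viewed on the wrong axis.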
    Unless I’m missing something. Am I?
    rgb

  60. @ rgbatduke says:
    April 14, 2013 at 8:36 am
    “It is an unprecedented abuse of science. And one day, we will pay for it.”
    ———————————————————————————————————————
    Paying for it now. Have been for years.
    At least humanity has. And science. No doubt real scientists will pay the price of science harboring the fake.

  61. TerryMN says:
    April 14, 2013 at 4:58 am

    Willis – this paper does not use the Korttajarvi proxy “upside down.” The problem is that they use the contaminated portion post-1720.
    The 3 proxy series that can be generated from the core are thickness, X-Ray density lightsum, and X-Ray density darksum. The densities represent contribution of plants vs. minerals in the sediment – I forget which is which. Mann used one of the density series “upside down” in that it has an inverse relationship w/temp.
    If TH2013 are only using thickness, they’re using the correct orientation – but should NOT have used values in the modern, contaminated portion of the series. Regards – Terry

    Sorry, Terry, but the paper is indeed using it upside-down. Mann, in fact, used all four of the Tiljander series upside down. See here and here for the details.
    w.

    The paper mentions an unprecedented rise in Western Greenland summer temperatures. Here is a look back at Western Greenland and wintertime temperatures, from back when we were within the “safe” CO_2 limit. Only the abstract is available, so I can’t see what the rise was.

    Abstract
    July 1937
    A period of warm winters in Western Greenland and the temperature see-saw between Western Greenland and Central Europe
    Particulars are given regarding the big rise of winter temperatures in Greenland and its more oceanic climate during the last fifteen years. Observations covering sixty years show a marked negative correlation of simultaneous temperatures between western Greenland and the regions around the Baltic Sea.
    http://onlinelibrary.wiley.com/doi/10.1002/qj.49706327108/abstract

  63. Willis, most of these varve datasets were used in Kaufman et al 2009 and were discussed at CA at the time. The topic was overtaken in November by Climategate. Look at climateaudit.org/tag/varve.
    Iceberg Lake attracted particular interest at the time from both of us. See climateaudit.org/tag/loso. In a contemporary post, http://climateaudit.org/2009/09/23/loso-varve-thickness-and-nearest-inlet/ we discussed the Iceberg Lake jump in varve thickness with the change in inlet location, a point that you had hypothesized a couple of years earlier and which Loso considered in a subsequent article (without mentioning the prior discussion).

    In respect to the Korttajarvi organics, the problem here is that the series is contaminated rather than upside down. Not that the data means very much either way.

  65. Thanks Willis.
    ~~~~~~~~~~~
    Geoff Sherrington says:
    April 13, 2013 at 11:33 pm
    “Nature tends not to know what a logarithm is . . .

    Thanks, Geoff. I needed an early morning laugh.
    http://upload.wikimedia.org/wikipedia/commons/thumb/3/36/Spiral_aloe.jpg/800px-Spiral_aloe.jpg
    http://www.empowernetwork.com/zeflow/files/2012/08/shell.jpg?id=zeflow
    Just kidding. I know you know.
    ~~~~~~~~~~~~~~~~
    Janice Moore says:
    April 13, 2013 at 9:17 pm
    “This is my first and likely last post . . .

    And that could be a disservice to the “fight for TRUTH”.
    You know things. Share.
    I thought this was interesting:
    ‘They also serve who only stand and wait.’
    ~John Milton’s Sonnet XIX
    http://forum.quoteland.com/eve/forums/a/tpc/f/99191541/m/8371905596
    ~~~~~~~~~~~~~~~~~~
    Pamela Gray says:
    April 14, 2013 at 8:15 am
    “The Willamette Valley lake and wet land sediments . . .

    See:
    http://www.firescience.gov/projects/04-2-1-115/project/04-2-1-115_04-2-1-115_final_report.pdf
    Title: Historical Fire Regimes of the Willamette Valley, Oregon: Providing a Long-Term, Regional Context for Fire and Fuels Management

    From the introduction: “A fire history study based on lake-sediment records of the last 2000 years and tree-ring records from the Willamette Valley and surrounding lowlands was undertaken with three objectives: …
    The 4th author has been at Central Washington University (Ellensburg) for several years and continues this sort of work in WA and OR, as, I assume, do the others.

  66. j ferguson says:
    April 14, 2013 at 7:34 am
    “Thank you Chris Wright for alerting us to Benford’s Law. What an astonishing law it is.”
    I have a “Word” file graph showing the distribution of leading digits in Bill Clinton’s 13 years of tax returns overlaid with the Benford distribution – the fit is impressive and suggests that Clinton’s tax returns are highly likely to have been correct. (I don’t know how to put this on the WordPress page.) Cooked books tend to deviate from the Benford distribution in their leading digits. The Benford distribution is also the reason that the first pages of the old log-table books were scruffy and dog-eared compared to the rest of the book. Apparently the Benford distribution is the only one that is “scale invariant”, i.e. does not vary with differing units (dollars, Yen, Euros; SI, English units…).
    Taking the logarithm of the varve thickness data probably does more than just change the scale, and it distorts the natural distribution of the data. Perhaps, in some way, a test using the Benford distribution could be another possibility for “auditing” climate statistical manipulations.
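    If anyone wants to try that, a rough sketch (in Python, using made-up, roughly log-uniform data rather than any actual tax returns or varve measurements) of a leading-digit check against the Benford distribution, including the scale-invariance point about changing units, might look like this:

    import numpy as np

    def benford_expected():
        # Benford's law: P(d) = log10(1 + 1/d) for leading digit d = 1..9
        d = np.arange(1, 10)
        return np.log10(1.0 + 1.0 / d)

    def leading_digits(values):
        # First significant digit of each positive value, via the fractional part of log10.
        v = np.abs(np.asarray(values, dtype=float))
        v = v[v > 0]
        return np.floor(10.0 ** (np.log10(v) % 1.0)).astype(int)

    rng = np.random.default_rng(1)
    amounts = np.exp(rng.uniform(0.0, 10.0, 5000))     # spans several orders of magnitude

    digits = leading_digits(amounts)
    observed = np.array([(digits == d).mean() for d in range(1, 10)])
    for d, obs, exp in zip(range(1, 10), observed, benford_expected()):
        print(d, round(obs, 3), "vs Benford", round(exp, 3))

    # Scale invariance: changing units (multiplying by any constant) should
    # barely move the leading-digit frequencies.
    rescaled = np.array([(leading_digits(amounts * 37.2) == d).mean() for d in range(1, 10)])
    print("max shift after rescaling:", round(np.abs(rescaled - observed).max(), 3))

    A log transform, by contrast, is not a change of units, so there is no reason to expect it to leave the digit distribution, or anything else about the shape of the data, untouched.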

  67. Procedural certainty, not representational certainty. Reality is not found simply by number-crunching.
    I am shocked that technically proficient men and women don’t understand the difference between correct math and valid conclusions drawn from math. The sampling is invalid for the purpose; the variation clearly indicates a regional situation.
    A mineral geologist would recognize this pattern: local ore deposits, two of which deserve staking.
    Who would claim that 11 partly incomplete temperature records would accurately reflect the larger regional history, except by accident? Apparently knowledgeable is not the same as smart.

    A great post, Mr. Eschenbach, but you say “…the random variations in the chosen proxies tend to cancel each other out.”
    Actually I would go further. From a visual inspection I suspect that if someone wasted enough of their time obtaining the correlation coefficients between each pair of proxies, even after transformation and standardisation, there would be sufficient evidence to show that the varve thicknesses were not responding to the same stimuli.

  69. “Steve McIntyre says:
    April 14, 2013 at 9:54 am
    In respect to the Korttajarvi organics.”….
    Readers do not in general understand the technical point here (Willis’s point) concerning temperature: Tiljander does not relate varve thickness to temperature, but to other measurements of details within the year.

  70. There has been a barrage of warming garbage of late from the climateer “scientists”. “Hide the decline” has morphed into “Bury the truth”.

    A lot of comments, but you and the readers leave out an important subject: the Little Ice Age. In the 1400s we were near the bottom of the temperature range, so temperatures, despite all the flaws in this study, could well have been lower then and higher today. What was the temperature in, say, 1000 AD, before temperatures started dropping? The basin study I just checked shows the temperature in 1225 was the same as in 1990, so maybe you folks are looking at the trees in this study and ignoring the forest, so to speak.

  72. Janice Moore says:
    April 13, 2013 at 9:17 pm
    Thank you, Mr. Eschenbach and Mr. Watts. Your thorough and insightful summarizing of scientific papers in language a non-science major can understand (er, well, ahem, most of the time…), is much appreciated. This is my first and likely last post (I am not likely to have anything substantial to say), but know that I and, no doubt, thousands [ah, why not speculate — MILLIONS $:)] of us are silently reading and learning, gaining more ammunition for the fight for TRUTH.
    Thanks, too, to all you other cool science geeks who weigh in, here! You rock.
    ++++++++++++++
    My vote for comment of the month…
    If only I had the same sense of restraint, and could write.

    Data transformations are often used in direct response to the technology that generated the data. “Random” errors arising from measurement techniques tend to be proportional to the size of the quantity being measured. The sources of potential error really need to be considered when thinking about the validity and utility of performing data transformations, as does the apparent inherent scatter in the quantity being investigated. Take precipitation measurements as an example. Collections of monthly average rainfall data may contain values from 0 (zero) millimeters to a few hundred. Very high values are likely to be subject to substantial errors of sampling, as well as errors in actually measuring the height of the water in the collector. In dry months such errors will be small. In these circumstances a log scale may be useful, but it falls down for any zero observations. One might thus consider a log(V + C) transform, choosing C such that you feel comfortable with its non-linear scale effects. Clearly subjective in general, but possibly useful. What about a square-root transform? The zero problem vanishes, whilst the de-emphasis of high values is fairly well maintained. If your original data contain negative values you may again need to choose an arbitrary constant to add.
    Simply deciding on a transform in an attempt to improve the “normality” of a set of observations is very ill-advised. Normality is never achieved in practice, and despite its attractive simplification of the arithmetic and algebra of regression, the inferential statistical calculations that generally follow the regression process and the consequences of back-transforming to the original data scale have to be accepted and allowed for. Take care when using transforms!
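    As a small illustration of those trade-offs, here is a sketch (Python, with synthetic rainfall-like numbers chosen purely for illustration) comparing a log(V + C) transform against a square-root transform on data with zeros and a long right tail:

    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic "monthly rainfall": many small values, a few dry months at exactly zero,
    # and a long right tail. Shape and scale are arbitrary choices.
    rain = rng.gamma(shape=0.8, scale=60.0, size=600)
    rain[rng.random(600) < 0.05] = 0.0

    def skewness(x):
        x = np.asarray(x, dtype=float)
        return ((x - x.mean()) ** 3).mean() / x.std() ** 3

    C = 1.0                         # arbitrary offset so the log is defined at zero
    transforms = {
        "raw": rain,
        "log(V + C)": np.log(rain + C),
        "sqrt(V)": np.sqrt(rain),
    }
    for name, x in transforms.items():
        print(f"{name:12s} skewness = {skewness(x):6.2f}")

    Both transforms pull in the long tail, neither produces anything exactly normal, and whatever inference follows still has to be back-transformed to the original scale for interpretation, which is exactly the caution above.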

    My two cents: air pressure in the atmosphere falls off exponentially with altitude, which is to say altitude varies as the logarithm of pressure… another example of nature’s use of logarithms.
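    In case a concrete version helps, the usual isothermal form is P(h) = P0 * exp(-h/H) with a scale height H of roughly 8 km; a few round-number values (illustrative only, not a standard-atmosphere table):

    import numpy as np

    P0 = 101.325      # surface pressure, kPa
    H = 8.0           # approximate scale height, km (isothermal assumption)
    for h_km in (0, 5, 10, 20):
        print(h_km, "km:", round(P0 * np.exp(-h_km / H), 1), "kPa")
    # Inverting: h = -H * ln(P / P0), i.e. altitude goes as the log of pressure.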

  75. berniel
    Thank you for the comment about glacial meltwater as a source of increased runoff and sediment yield. You are correct. Willis is also correct in that glacial advance as well as retreat can cause increased sediment transport. I used precipitation to include rain as well as snow. Also, in places such as the NE USA, rapid snowmelt from warm rain can cause huge floods with ice jams and drastically increased sediment transport. When Lake Bonneville’s dam burst it scoured out huge canyons along the Snake River of the Northwest USA, perhaps the largest yet found. I can’t swear to it until I go back and find a good book on it, but I do believe an ice dike may have been involved in the very rapid flood and huge depths of water. But with a flood of that magnitude we may never know.
    So again, you are correct, precipitation in the form of snow and ice flow/melt can be significant.
    I appreciate your comments and those of Willis.

  76. Leonard Lane says:
    April 14, 2013 at 9:22 pm
    Leonard,
    The great scouring was done by the Columbia River, fed by bursts of water from the Clark Fork River. Lake Missoula was indeed a glacier-formed lake that burst periodically. I remember an article in Scientific American belatedly acknowledging the work of J Harlen Bretz, who figured out the mechanism of the periodic flooding by hiking over the ground on foot. They did discover varves, though I don’t recall use of the term. The varves were located on the Snake River because the huge volume of water in the Columbia caused the Snake to flow upstream, allowing deposition of soil layers as the water flow decelerated. Examination of those layers was key in figuring out what had gone on so many years before.
    pbh

  77. McComber Boy.
    Correct. The Missoula floods took the Clark Fork to the Columbia, and certainly where the Snake River joined the Columbia River, water would have gone everywhere, including up the Snake River. And the Columbia River was scoured several times by the Missoula floods.
    But I was referring to Lake Bonneville (in Utah) and its massive flood scouring the Snake River. As I recall, the Lake Bonneville flood was a single occurrence. Thank you for making the comment more general by including other massive floods in the region.

    Actually I would go further. From a visual inspection I suspect that if someone wasted enough of their time obtaining the correlation coefficients between each pair of proxies, even after transformation and standardisation, there would be sufficient evidence to show that the varve thicknesses were not responding to the same stimuli.
    Not to be picky, but you mean there would be insufficient evidence to conclude that they are responding to the same stimulus (one at a time). It is virtually impossible to conclude a negative in science. The correct use of statistics here is to try to disprove the null hypothesis of no common cause, and to be unable to do so.
    Beyond that, I disagree as long as one includes the sets that are obviously correlated with what is being interpreted as a 19th and 20th century warming. These aren’t correlated with most of the rest of the sites, but they are correlated strongly with each other. I would guess that a few of the other sites might have decent correlation too, but on a much more modest basis — they might well be responding weakly to a common set of conditions e.g. excessive rainfall or drought per year or per decade. This is actually pretty reasonable, as things like ENSO often create correlated patterns of drought quite independent of the temperature, “often” (on a century timescale) drought or flood patterns that last 2 to 5 years and that are quite widespread on a continental basis.
    But your suggestion is an excellent one, and since Willis has the data in hand and with R it is very easy, perhaps he might compute the correlations between all of the datasets pairwise. The reason this is important — and Steve McIntyre can check me on this if he is still listening in — is this: suppose that all of the datasets BUT the ones that are obviously corrupted in the 20th century have a very consistent level of mutual correlation, one that is equally consistent with a weak hypothesis of correlation with e.g. El Nino type phenomena (or some simple model of spatiotemporal correlation of weather on a decadal scale). Then the datasets that are in fact questionable will stand out like a sore thumb with absurdly incorrect correlation properties, outvoted within the study some three to one. Add to that the fact that those sites/studies have other serious problems, discussed even in the literature from which they were drawn, and that is the basis for a comment in the original journal, if not withdrawal of the paper, or — better yet — the publication of a new paper that directly contradicts its conclusions (and supports the null hypothesis of “no observable warming”) on the basis of a sound statistical analysis of the data: one that omits the sites known to be corrupted by human activity such as land use changes AND that fail a clearly stated correlation criterion used to separate the climate signal (which is surely universal) from any land use signals that OVERWHELM the climate signal at selected sites.
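    For what it’s worth, a bare-bones version of that pairwise screen might look like the following (a sketch in Python rather than R, with a hypothetical years-by-sites array called proxies and an arbitrary threshold; handling of missing years and a proper null model are deliberately left out):

    import numpy as np

    def pairwise_screen(series, names, threshold=0.2):
        # series: 2-D array, rows = years, columns = sites (standardized, no gaps).
        # The threshold is arbitrary here; in a real analysis it should come from a
        # stated null model (e.g. spatially correlated weather noise), not a guess.
        r = np.corrcoef(series, rowvar=False)       # site-by-site correlation matrix
        np.fill_diagonal(r, np.nan)
        median_r = np.nanmedian(r, axis=0)          # typical agreement of each site with the rest
        flagged = [n for n, m in zip(names, median_r) if m < threshold]
        return r, median_r, flagged

    # Hypothetical usage, assuming proxies (years x sites) and site_names exist:
    # r, med, suspects = pairwise_screen(proxies, site_names)
    # print("sites that disagree with the rest of the network:", suspects)

    If the bulk of the sites show a consistent, modest level of mutual correlation and a handful (say, the ones with known land-use contamination) fall well outside it, that is exactly the sore thumb described above.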
    I fervently await the day when climate scientists actually learn real statistics, or stats grad students start to go into climate science. So far the best that I can say is that the papers purporting to present detailed statistical studies of complex multivariate data from proxies with multiple influences look like they have been written by rank amateurs, people who have literally taken no more than one or two stats courses and who are utterly clueless about the subject beyond that. Mann writing his own PCA code in FORTRAN for gosh sake. This study, where a mere glance at the data suggests that one had better resolve an obvious bimodal inconsistency before publishing a result that de facto selects one of the two modes as being representative of the climate and the other as not. I mean, one would think that a scientist would want to understand this before risking reputation and career on a premature publication.
    Sadly, one of the most profound effects of the kind of gatekeeping and behind-the-scenes career-ruining revealed by the Climategate letters, where there really is an informal committee of sorts that will try to ruin your career and obstruct your papers (and that has had some success doing so), is that researchers in the field no doubt fear Mann more than they fear publishing a bad result. That is, it is literally safer, career-wise, to publish a paper that confirms a “warming signal”, even if there are obvious, glaring inconsistencies in the data and no adequate explanation of them, than to do the right thing and pursue the inconsistencies at the expense of ending up with a result with no warming, one that fails to reject the null hypothesis.
    What a disaster, if this part of the US actually shows no discernible 19th and 20th century warming, or a barely resolvable 0.3 to 0.5 C warming from the LIA on (cherrypicking the interval where we KNOW global warming to have occurred, just not anthropogenic global warming).
    rgb

  79. rgb@duke said:
    “I fervently await the day when climate scientists actually learn real statistics, or stats grad students start to go into climate science. ”
    Nice one. That would sting a little if Mann & Co. were bright enough to understand your comment.
