Guest sciency schist by David Middleton
Resolution vs. Detection
We geoscientists in the oil & gas industry spend much of our time integrating data sets of vastly different resolutions. Geological data, primarily well logs, are very high resolution (6 inches to 2 feet vertical resolution). Geophysical data, primarily reflection seismic surveys, are of much lower and highly variable resolution, dependent on the seismic velocities of the rocks and the frequency content of the seismic data. The rule of thumb is that a stratigraphic unit must be at least as thick as one-quarter of the seismic wavelength (λ/4) to be resolved.
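The arithmetic behind the rule of thumb is simple. Here is a minimal sketch (the velocity and frequency values are illustrative assumptions, not from any particular survey):

```python
# Minimal sketch of the lambda/4 vertical-resolution rule of thumb.
# Velocity and frequency values are illustrative, not from any survey.

def tuning_thickness(velocity_mps: float, dominant_freq_hz: float) -> float:
    """Minimum resolvable bed thickness (m): one quarter of the
    dominant seismic wavelength, lambda = v / f."""
    wavelength_m = velocity_mps / dominant_freq_hz
    return wavelength_m / 4.0

# Example: a 3,000 m/s interval velocity and a 30 Hz dominant frequency
# give a 100 m wavelength, so beds thinner than ~25 m can be detected
# but not resolved.
print(tuning_thickness(3000.0, 30.0))  # 25.0
```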

Thinner beds can be detected, but not resolved. The same principle applies to normal faults. If the throw (vertical offset) is at least λ/4, the fault can be resolved and the throw can be accurately estimated from the seismic data. If the throw is less than λ/4, the best I can determine is that a fault is probably present.

When we integrate geological and geophysical data, we don’t just splice the well log onto a seismic profile. We convert the well logs into a synthetic seismogram. This is most effectively accomplished using sonic and density logs.

The sonic and density logs are used to calculate acoustic impedance and a reflection coefficient series (RC). The RC is convolved with a seismic wavelet, often extracted from the seismic data near the well. The synthetic seismic traces (3 panels on the right) can then be directly compared to the seismic profile. The resolution difference is quite large. The trough-to-trough interval on the seismic pulse below is about 150 m.
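The workflow is easy to sketch in code. The blocky logs and wavelet parameters below are made-up illustrations (and the depth-to-time conversion a real well tie requires is skipped), but the impedance → reflection coefficient → convolution chain is the one described above:

```python
# Sketch of a synthetic seismogram: sonic/density logs -> acoustic
# impedance -> reflection coefficients -> convolution with a wavelet.
# The logs are invented; a real workflow would also convert depth to
# two-way time using the sonic log.
import numpy as np

def ricker(peak_freq_hz: float, dt_s: float, length_s: float = 0.128) -> np.ndarray:
    """Ricker (Mexican hat) wavelet with the given peak frequency."""
    t = np.arange(-length_s / 2, length_s / 2, dt_s)
    a = (np.pi * peak_freq_hz * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Hypothetical three-layer earth model sampled as blocky logs
velocity = np.repeat([2500.0, 3200.0, 2800.0], 50)  # m/s
density = np.repeat([2300.0, 2550.0, 2400.0], 50)   # kg/m^3

impedance = velocity * density                               # Z = v * rho
rc = np.diff(impedance) / (impedance[:-1] + impedance[1:])   # (Z2-Z1)/(Z2+Z1)

# Convolve the reflection coefficient series with the wavelet to get
# the synthetic trace that is compared to the seismic profile.
synthetic = np.convolve(rc, ricker(30.0, 0.002), mode="same")
```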

How Does This Relate to Climate “Science”?
Signal theory and signal processing principles apply to all signals, not just seismic data. A signal is a time-variant sequence of numbers. Almost all temperature, carbon dioxide and sea level reconstructions employ many of the same signal processing methods as seismic data processing. Deconvolution is particularly essential to ice core carbon dioxide chronologies. Sometimes the signal processing methods are properly employed. Van Hoof et al. (2005) demonstrated that the ice core CO2 data represent a low-frequency, century to multi-century moving average of past atmospheric CO2 levels. They essentially generated the equivalent of a synthetic seismogram from the stomata chronology and tied it to the ice core.
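The Van Hoof et al. style of argument can be illustrated with a toy forward model: push a synthetic high-resolution (stomata-like) CO2 series through a multi-century moving average and compare the result to an ice-core-like record. All numbers below are invented for illustration:

```python
# Toy forward model: a high-resolution atmospheric CO2 series smoothed
# by a multi-century moving average, mimicking firn/ice-core smoothing.
# The series and the 150-yr window are illustrative assumptions.
import numpy as np

years = np.arange(1000, 1500)  # annual time axis
atmos = 280.0 + 10.0 * np.sin(2 * np.pi * (years - 1000) / 100.0)  # "stomata" CO2

window = 150  # assumed smoothing, in years
kernel = np.ones(window) / window
ice_core_like = np.convolve(atmos, kernel, mode="valid")

# The +/-10 ppm century-scale swings survive at only ~+/-2 ppm in the
# smoothed record: the ice core behaves like a moving average.
print(np.ptp(atmos), np.ptp(ice_core_like))
```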

From my geological perspective, most climate “hockey sticks” are the result of the improper integration of high resolution instrumental data (akin to well logs) and lower resolution proxy data (akin to reflection seismic data). Many of these hockey sticks appear to have been the result of a careless, if not reckless, disregard of basic signal processing principles.
Temperature Reconstruction Hockey Sticks

One of the most egregious violations of signal processing principles is the generation of climate reconstruction “hockey sticks” through variations of “Mike’s Nature Trick.”
In the aftermath of the “Climategate” scandal, Penn State conducted an “investigation” of Mike’s Nature Trick. The Penn State whitewash was ludicrous…
After careful consideration of all the evidence and relevant materials, the inquiry committee finding is that there exists no credible evidence that Dr. Mann had or has ever engaged in, or participated in, directly or indirectly, any actions with an intent to suppress or to falsify data.
RA-10 Inquiry Report
It can’t be proven that he intended to suppress or falsify inconvenient data. It’s entirely possible that he accidentally devised a method to suppress or falsify inconvenient data.
This bit here was laughable…
In fact to the contrary, in instances that have been focused upon by some as indicating falsification of data, for example in the use of a “trick” to manipulate the data, this is explained as a discussion among Dr. Jones and others including Dr. Mann about how best to put together a graph for a World Meteorological Organization (WMO) report. They were not falsifying data; they were trying to construct an understandable graph for those who were not experts in the field. The so-called “trick” was nothing more than a statistical method used to bring two or more different kinds of data sets together in a legitimate fashion by a technique that has been reviewed by a broad array of peers in the field.
RA-10 Inquiry Report
The most benign possible interpretation of the “trick” is that they edited part of Keith Briffa’s reconstruction because the tree ring chronology showed that the 1930s to early 1940s were warmer than the late 1990s. So, they just substituted the instrumental record for the tree ring chronology. In the private sector, this sort of behavior isn’t benign… It’s grounds for immediate termination or worse.
I suppose that there is no evidence that they did this with intent to deceive. However, the fact that they called it “Mike’s nature trick” sure makes it seem like this sort of thing was standard operating procedure.
Taking a set of data that shows that the 1930s were warmer than the 1990s and using another data set to reverse that relationship is not bringing “two or more different kinds of data sets together in a legitimate fashion.” It’s a total bastardization of the data.
To see an example of “Mike’s Nature Trick,” go here… Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia
Click this… EIV Temperature Reconstructions
Open up any of the **cru_eiv_composite.csv or **had_eiv_composite.csv files. All of them splice the high frequency instrumental data into the low frequency proxy data. To Mann’s credit, he at least documents this one enough to sort it out.
This statement from their PNAS paper is totally unsupported by proxy reconstructions… “Recent warmth appears anomalous for at least the past 1,300 years whether or not tree-ring data are used. If tree-ring data are used, the conclusion can be extended to at least the past 1,700 years.”
The anomalous nature of the “recent warmth” is entirely dependent on the “tricky” use of the instrumental data. He didn’t use any proxy data post-1850. The eivrecondescription file states “Note: values from 1850-2006AD are instrumental data”.
This image from Mann’s 2008 paper implies that all of the reconstructions are in general agreement regarding the claim that the “recent warmth appears anomalous for at least the past 1,300 years”…

By cluttering up the image with many reconstructions and plastering the instrumental record onto the end of the graph, it’s impossible to see any details.
Here are Mann (Cru_EIV), Moberg 2005, Christiansen & Ljungqvist 2012 (un-smoothed), Ljungqvist 2010 and Esper 2002 (low frequency)…

Zoomed in on post-1800 with HadCRUT4 NH added:

The Modern Warming only appears anomalous because of the higher resolution of the instrumental record and its position at the tail-end of the time series.
Ljungqvist (2010) clearly explained the problem with directly comparing instrumental data to proxy reconstructions.
The amplitude of the reconstructed temperature variability on centennial time-scales exceeds 0.6°C. This reconstruction is the first to show a distinct Roman Warm Period c. AD 1-300, reaching up to the 1961-1990 mean temperature level, followed by the Dark Age Cold Period c. AD 300-800. The Medieval Warm Period is seen c. AD 800–1300 and the Little Ice Age is clearly visible c. AD 1300-1900, followed by a rapid temperature increase in the twentieth century. The highest average temperatures in the reconstruction are encountered in the mid to late tenth century and the lowest in the late seventeenth century. Decadal mean temperatures seem to have reached or exceeded the 1961-1990 mean temperature level during substantial parts of the Roman Warm Period and the Medieval Warm Period. The temperature of the last two decades, however, is possibly higher than during any previous time in the past two millennia, although this is only seen in the instrumental temperature data and not in the multi-proxy reconstruction itself.
[…]
The proxy reconstruction itself does not show such an unprecedented warming but we must consider that only a few records used in the reconstruction extend into the 1990s. Nevertheless, a very cautious interpretation of the level of warmth since AD 1990 compared to that of the peak warming during the Roman Warm Period and the Medieval Warm Period is strongly suggested.
[…]
The amplitude of the temperature variability on multi-decadal to centennial time-scales reconstructed here should presumably be considered to be the minimum of the true variability on those time-scales.
Ljungqvist, 2010

Direct comparisons of the modern instrumental record to the older proxy reconstructions are not robust because the proxy data are of much lower resolution. The proxy data indicate the “minimum of the true variability on those time-scales.” The instrumental data are depicting something closer to actual variability.
The proxy data lack the high frequency component of the signal. Filtering out the high frequency component of a signal attenuates its apparent amplitude.
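A quick synthetic demonstration of that attenuation, assuming simple moving-average smoothing (all numbers invented):

```python
# A short, sharp warm excursion in an annual series nearly vanishes
# once smoothed to proxy-like resolution. Purely illustrative numbers.
import numpy as np

rng = np.random.default_rng(0)
temps = rng.normal(0.0, 0.1, 2000)   # 2,000 years of annual anomalies (deg C)
temps[1000:1050] += 1.0              # a 50-year, +1.0 C excursion

window = 300                         # assumed proxy-like smoothing (years)
smoothed = np.convolve(temps, np.ones(window) / window, mode="same")

print(temps.max())     # ~1.2 C at annual resolution
print(smoothed.max())  # ~0.17 C after smoothing: same event, far smaller amplitude
```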

The direct comparison of instrumental data to proxy data becomes even more problematic when the record length is extended beyond 2,000 years.

The supposedly “four warmest years on record” have occurred about 300 years after the coldest century of the past 100 centuries. This could only be described as a “climate crisis” or “climate emergency” by someone who was totally ignorant of basic scientific principles, particularly Quaternary geology and signal processing. It’s actually a helluva lot better than just about any other possible evolution of Earth’s “climate.”
The longer the record length of the reconstruction, the more important the consistency of the temporal resolution becomes.
“Consistency of the temporal resolution” means that the resolution of the older proxies is consistent with that of the recent proxies. Temporal resolution is a function of the sampling interval…
We believe the greater source of error in these reconstructions is in the proxy selection. As documented in this series, some of the original 73 proxies are affected by resolution issues that hide significant climatic events and some are affected by local conditions that have no regional or global significance. Others cover short time spans that do not cover the two most important climatic features of the Holocene, the Little Ice Age and the Holocene Climatic Optimum.
[…]
We also avoided proxies with long sample intervals (greater than 130 years) because they tend to reduce the resolution of the reconstruction and they dampen (“average out”) important details. The smallest climate cycle is roughly 61 to 64 years, the so-called “stadium wave,” and we want to try and get close to seeing its influence. In this simple reconstruction, we have tried to address these issues.
Andy May WUWT.
This is a table of all of the “used” proxies. They have fairly consistent temporal resolution. They have long record lengths and most, if not all, cover the Holocene Climatic Optimum and Little Ice Age, the warmest and coldest climatic phases of the Holocene. It’s about as close to “apples and apples” as you can get with a >10,000-yr global temperature reconstruction. Andy’s proxies have an average resolution of 75 yrs and an average record length of 11,697 yrs with low standard deviations (by proxy series standards). There is no significant trend of degrading resolution with time, as occurs in most proxy reconstructions.
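As a back-of-the-envelope check on what such resolutions can and cannot see, the shortest resolvable cycle is roughly twice the sampling interval (the Nyquist limit). A sketch using the numbers above:

```python
# Nyquist back-of-the-envelope: the shortest resolvable period is
# about twice the sampling interval.

def shortest_resolvable_period(sample_interval_yrs: float) -> float:
    return 2.0 * sample_interval_yrs

print(shortest_resolvable_period(75.0))  # 150 yrs: too coarse for a ~62-yr cycle
print(shortest_resolvable_period(30.0))  # 60 yrs: roughly what the "stadium wave" needs
```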

Andy’s reconstruction demonstrates that the nadir of the Little Ice Age was the coldest climatic period of the Holocene. This is a feature of every non-hockey stick reconstruction and even most hockey stick reconstructions, including the serially flawed Marcott et al., 2013. It also demonstrates that the modern warming is inconspicuous relative to the Holocene’s pervasive millennial-scale climate signal (Davis & Bohling, 2001).
If you open the Reconstruction References spreadsheet and go to the far right column (Comments), Andy notes whether the proxy was “used” or explains why it was rejected. The three most common reasons for rejecting proxy series were:
- Coarse resolution (denoted as “resolution too big”)
- Not old enough
- Not young enough
Andy could have spliced the instrumental record onto the end of this and made a hockey stick… But that would be fraudulent anywhere outside of academic and government “science”. It’s akin to splicing a well log into a seismic line and calling it an anomaly.
Regarding Marcott, the authors even state that their Holocene reconstruction can’t be directly compared to instrumental data due to resolution differences… Yet they do so anyway.
Q: What do paleotemperature reconstructions show about the temperature of the last 100 years?
A: Our global paleotemperature reconstruction includes a so-called “uptick” in temperatures during the 20th-century. However, in the paper we make the point that this particular feature is of shorter duration than the inherent smoothing in our statistical averaging procedure, and that it is based on only a few available paleo-reconstructions of the type we used. Thus, the 20th century portion of our paleotemperature stack is not statistically robust, cannot be considered representative of global temperature changes, and therefore is not the basis of any of our conclusions. Our primary conclusions are based on a comparison of the longer term paleotemperature changes from our reconstruction with the well-documented temperature changes that have occurred over the last century, as documented by the instrumental record. Although not part of our study, high-resolution paleoclimate data from the past ~130 years have been compiled from various geological archives, and confirm the general features of warming trend over this time interval (Anderson, D.M. et al., 2013, Geophysical Research Letters, v. 40, p. 189-193; http://www.agu.org/journals/pip/gl/2012GL054271-pip.pdf).
Q: Is the rate of global temperature rise over the last 100 years faster than at any time during the past 11,300 years?
A: Our study did not directly address this question because the paleotemperature records used in our study have a temporal resolution of ~120 years on average, which precludes us from examining variations in rates of change occurring within a century. Other factors also contribute to smoothing the proxy temperature signals contained in many of the records we used, such as organisms burrowing through deep-sea mud, and chronological uncertainties in the proxy records that tend to smooth the signals when compositing them into a globally averaged reconstruction. We showed that no temperature variability is preserved in our reconstruction at cycles shorter than 300 years, 50% is preserved at 1000-year time scales, and nearly all is preserved at 2000-year periods and longer. Our Monte-Carlo analysis accounts for these sources of uncertainty to yield a robust (albeit smoothed) global record. Any small “upticks” or “downticks” in temperature that last less than several hundred years in our compilation of paleoclimate data are probably not robust, as stated in the paper.
Real Climate
If “the 20th century portion of our paleotemperature stack is not statistically robust”… why was it included in the publication? The modern instrumental record would be a single data point at the resolution of Marcott’s reconstruction.
Why does this matter?
So, what would it mean, if the reconstructions indicate a larger (Esper et al., 2002; Pollack and Smerdon, 2004; Moberg et al., 2005) or smaller (Jones et al., 1998; Mann et al., 1999) temperature amplitude? We suggest that the former situation, i.e. enhanced variability during pre-industrial times, would result in a redistribution of weight towards the role of natural factors in forcing temperature changes, thereby relatively devaluing the impact of anthropogenic emissions and affecting future predicted scenarios. If that turns out to be the case, agreements such as the Kyoto protocol that intend to reduce emissions of anthropogenic greenhouse gases, would be less effective than thought.
Esper et al., 2005
It matters because the only way to directly compare the instrumental data to the pre-industrial proxy data is to filter the instrumental data down to the resolution of the proxy data. This leads to climate reconstructions with “enhanced variability during pre-industrial times”, which would “result in a redistribution of weight towards the role of natural factors in forcing temperature changes, thereby relatively devaluing the impact of anthropogenic emissions and affecting future predicted scenarios.”
It matters because the advocates of the Anthropocene as a geologic epoch are relying on the Marcott hockey stick.

It matters because hockey sticks are being used to justify policy changes and carbon taxes, and to destroy individual liberty and prosperity.
Most of the asserted evidence that recent climate changes deviate from the norms of the Holocene is equally consistent with being the result of differences in the resolution of paleo-climate data and instrumental records.
Part deux will address carbon dioxide and sea level hockey sticks.
References
Anklin, M., J. Schwander, B. Stauffer, J. Tschumi, A. Fuchs, J. M. Barnola, and D. Raynaud (1997), “CO2 record between 40 and 8 kyr B.P. from the Greenland Ice Core Project ice core,” J. Geophys. Res., 102(C12), 26539–26545, doi: 10.1029/97JC00182.
Christiansen, B. and F.C. Ljungqvist. 2012. “The extra-tropical Northern Hemisphere temperature in the last two millennia: reconstructions of low-frequency variability”. Climate of the Past, Vol. 8, pp. 765-786. www.clim-past.net/8/765/2012/ doi:10.5194/cp-8-765-2012
Davis, J.C., and G.C. Bohling. 2001. “The search for patterns in ice-core temperature curves”. In L.C. Gerhard, W.E. Harrison, and B.M. Hanson, eds., Geological Perspectives of Global Climate Change, pp. 213–229.
Esper, J., E.R. Cook, and F.H. Schweingruber. 2002. “Low-Frequency Signals in Long Tree-Ring Chronologies for Reconstructing Past Temperature Variability”. Science, Volume 295, Number 5563, 22 March 2002.
Esper, J., R.J.S. Wilson, D.C. Frank, A. Moberg, H. Wanner, & J. Luterbacher. 2005. “Climate: past ranges and future changes”. Quaternary Science Reviews 24: 2164-2166.
Etheridge, D.M., L.P. Steele, R.L. Langenfelds, R.J. Francey, J.-M. Barnola and V.I. Morgan. 1998. “Historical CO2 records from the Law Dome DE08, DE08-2, and DSS ice cores”. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tenn., U.S.A.
Finsinger, W. and F. Wagner-Cremer. “Stomatal-based inference models for reconstruction of atmospheric CO2 concentration: a method assessment using a calibration and validation approach”. The Holocene 19,5 (2009) pp. 757–764
Kouwenberg, LLR. 2004. “Application of conifer needles in the reconstruction of Holocene CO2 levels”. PhD Thesis. Laboratory of Palaeobotany and Palynology, University of Utrecht.
Ljungqvist, F.C. 2009. N. Hemisphere Extra-Tropics 2,000yr Decadal Temperature Reconstruction. IGBP PAGES/World Data Center for Paleoclimatology Data Contribution Series # 2010-089. NOAA/NCDC Paleoclimatology Program, Boulder CO, USA.
Ljungqvist, F.C. 2010. “A new reconstruction of temperature variability in the extra-tropical Northern Hemisphere during the last two millennia”. Geografiska Annaler: Physical Geography, Vol. 92 A(3), pp. 339-351, September 2010. DOI: 10.1111/j.1468-0459.2010.00399.x
Mann, Michael, Zhihua Zhang, Malcolm K Hughes, Raymond Bradley, Sonya K Miller, Scott Rutherford, & Fenbiao Ni. (2008). “Proxy-based Reconstructions of Hemispheric and Global Surface Temperature Variations over the Past Two Millennia”. Proceedings of the National Academy of Sciences of the United States of America. 105. 13252-7. 10.1073/pnas.0805721105.
McElwain et al., 2002. “Stomatal evidence for a decline in atmospheric CO2 concentration during the Younger Dryas stadial: a comparison with Antarctic ice core records”. J. Quaternary Sci., Vol. 17 pp. 21–29. ISSN 0267-8179
Moberg, A., D.M. Sonechkin, K. Holmgren, N.M. Datsenko & W. Karlén. 2005. “Highly variable Northern Hemisphere temperatures reconstructed from low-and high-resolution proxy data”. Nature, Vol. 433, No. 7026, pp. 613-617, 10 February 2005.
Van Hoof, Thomas, Karsten A. Kaspers, Friederike Wagner-Cremer, R.S.W. van de Wal, Wolfram Kürschner & Henk Visscher. (2005). “Atmospheric CO2 during the 13th century AD: reconciliation of data from ice core measurements and stomatal frequency analysis”. Tellus B. 57. 10.3402/tellusb.v57i4.16555.
For additional references, see the related Watts Up With That posts.
Addenda

Thanks David, great post. I am a geophysicist of 35 years standing and all this is part of my day job.
The issue of resolution is one key part of the problem. In geostatistics this issue is called the support of the measurement and I have been involved with a lot of research on this topic and published a bit. I have also been heavily involved in seismic inversion and stochastic seismic inversion, so this is very much in my area of expertise (there is another point about low frequency trends in seismic inversion that is very relevant to why climate models behave like they do – that’s a separate topic for another day). It is simply not valid to draw conclusions by comparing 20th century high resolution data to low resolution data over millennia or longer.
I would note some other points too. Marcott is a particularly egregious example of the problem of comparing high and low resolution. Indeed, the BBC still carries a web page with the Marcott reconstruction image with a misleading conclusion, despite my complaining about it. However, it should be noted that the results Marcott presented for his PhD thesis do not include the splicing on of modern day temperature data. The splice was only added for journal publication. Why? Steve McIntyre has quite a lot to say about it.
Regarding tree rings and temperature reconstructions, there is another fundamental problem with methodology here as well. Firstly, the idea that tree rings are simply thermometers is absurd. Ask any forestry expert and they will point out many other limiting factors that influence tree growth. Furthermore, the idea that tree rings (and tree growth) respond linearly to temperature is also absurd. Most living things have an optimal environmental range and any extremes, both higher and lower, will impair growth. For this reason, growth response curves are likely to be inverted-U shaped and therefore multivalued. Reconstructing temperature from such a response curve is essentially impossible, regardless of compounding factors.
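The multivalued point is easy to see with a toy response curve (a hypothetical quadratic, not any actual tree-ring model):

```python
# Toy inverted-U growth response: one observed ring width maps back to
# two candidate temperatures, so the inversion is not unique.

def ring_width(temp_c: float) -> float:
    """Hypothetical growth response peaking at an optimum of 15 C."""
    return max(0.0, 1.0 - 0.01 * (temp_c - 15.0) ** 2)

print(ring_width(10.0), ring_width(20.0))  # 0.75 0.75 -- same width, two temperatures
```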
My final point and ultimate criticism of the hockey-stick reconstructions is based on a simple numerical experiment. If you generate a series of random, trendless but auto-correlated long-term time series, compare them by correlation in their latter parts to a training period (temperature, say, in the 20th century), accept only the best correlations (what Steve M. calls post-hoc screening) and then sum the best results, what you find is that you get a long, downward trending low frequency handle and then an uptick into the modern period caused by the presence of the auto-correlation and the post-hoc selection process. Of course, because the input data were random auto-correlated time series, their sum should tend to a long term trendless series with a mean of zero (assuming that was the chosen mean). Instead, performing this test in a spreadsheet with random auto-correlated time series produces an output with exactly the same features as the tree-ring hockey-stick reconstructions: long, low frequency, slightly down-trending handle; strong uptick into the training period of temperature increase. The hockey-stick is an artefact of the methodology, nothing more.
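The experiment is straightforward to reproduce. Here is a sketch under assumed parameters (the AR(1) coefficient, series count and screening threshold are arbitrary illustrative choices):

```python
# Post-hoc screening of random auto-correlated series: generate
# trendless AR(1) noise, keep only the series whose last century
# correlates with a rising "training" signal, and average the
# survivors. The stack acquires a hockey-stick shape from pure noise.
import numpy as np

rng = np.random.default_rng(42)
n_series, n_years, phi = 1000, 1000, 0.95  # counts and AR(1) coefficient (assumed)

noise = rng.normal(0.0, 1.0, (n_series, n_years))
series = np.zeros_like(noise)
for t in range(1, n_years):
    series[:, t] = phi * series[:, t - 1] + noise[:, t]  # trendless AR(1)

training = np.linspace(0.0, 1.0, 100)  # rising 20th-century-style trend

# Screen: correlate each series' final century with the training trend
corr = np.array([np.corrcoef(row, training)[0, 1] for row in series[:, -100:]])
selected = series[corr > np.quantile(corr, 0.9)]  # keep the top 10%

stack = selected.mean(axis=0)
# Near-flat "handle" for ~900 years, sharp uptick in the screened window
print(stack[:900].mean(), stack[-50:].mean())
```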
I didn’t really get into the smoothing effect of proxies that aren’t like isotope thermometers. δ18O is effectively a temperature measurement. You can mathematically translate it to temperature, like you can mathematically translate sonic & density logs to synthetic seismic traces. Tree rings, pollen, plant stomata, etc. are all effective paleo-climate proxies; but the math is far more uncertain. Although they tend to be high resolution proxies, they are also very noisy and have to be smoothed to be useful.
I also didn’t get into horizontal resolution. The horizontal resolution of multi-proxy reconstructions is degraded by the dating uncertainty and determining how the proxies should be stacked together. My recollection is that there were serious issues with how Marcott adjusted the ages of their proxies.
David, You are right to avoid mixing low resolution proxies with ones with higher resolution. Ocean sediment cores are another example of this problem, with difficulties of dating confounding the results, though varved sediments in fresh water environments may exhibit higher resolutions. I haven’t noticed mention of the work of Loehle yet, who produced a fatally flawed paper published in E&E in 2007, which resulted in a quick “correction” which was not much better, as I pointed out in a Letter to the Editor, published September 2008. That 2008 paper by Loehle is still a favorite among the denialist camp, which I find yet another example of disinformation from that side of the so-called “debate”.
What I have been pondering is, what effect do proxies have on the temperature signal? For a simplified example, let’s consider a temperature signal with annual sampling frequency and a proxy with 50-year resolution.
As a first approximation, we can think of the proxy as a decimator, taking only one point out of every 50 in the signal. But even so, where do we start: from the first point in the series going forward? Or otherwise?
Still, proxy as decimator is probably not correct. More likely proxies act as some sort of low-pass filter, alas with an unknown transfer function.
We can at first assume, maybe, that proxies work as a simple moving average (because physical systems cannot see into the future, it must not be a centered average), in this case with n = 50.
Then I’m not sure how to proceed.
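One way to proceed is simply to code up both candidate models and compare them. A sketch under the comment’s own assumptions (annual signal, n = 50; the test signal is invented):

```python
# Compare the two candidate proxy models: (a) a pure decimator and
# (b) a trailing (non-centered) 50-yr moving average, then sampled.
import numpy as np

years = np.arange(2000)
# Invented test signal: a 200-yr cycle plus a smaller 30-yr cycle
signal = np.sin(2 * np.pi * years / 200.0) + 0.3 * np.sin(2 * np.pi * years / 30.0)

n = 50

# (a) Proxy as decimator: one point every 50 years; alias-prone, and
# the answer depends on where you start (signal[25::n] differs).
decimated = signal[::n]

# (b) Proxy as trailing moving average (uses only past values), then
# sampled every 50 years.
kernel = np.ones(n) / n
trailing = np.convolve(signal, kernel, mode="full")[: len(signal)]
proxy_like = trailing[n - 1 :: n]

# The 30-yr component survives (aliased) in (a) but is strongly
# attenuated in (b), which is why a low-pass model is more physical.
print(np.ptp(decimated), np.ptp(proxy_like))
```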
Donald L. Klipstein
Of all the problems tree rings have for indicating temperature, time resolution is not one of them. Tree rings show down to the year, sometimes to the season, how favorable things were for the tree to grow or to have certain characteristics of its wood. The problem is determining what to attribute variations of tree ring parameters to.
There must be some places (maybe not many) where at least one kind of tree has rings that did (or usually did) a good job of indicating local temperature with smoothing of a year or only a few months. One issue is that local temperature in some of these places may not correlate well with global temperature. Another is that the ability of tree rings to indicate variations of temperature or other factors that affect tree growth could change with new modern factors such as:
- pollution,
- invasive species,
- increase of population of animals eating or otherwise stressing trees due to hunting of their enemies,
_________________________________________________________
“tree growth could change with new modern factors such as pollution”
_________________________________________________________
Pollution is in no way a “modern factor”: tons of cosmic dust fall to Earth every day. Touch any boathouse at sea level or touch a stone in the Himalayas: your hands will turn black from old-fashioned pollution. Who is qualified to distinguish between old-fashioned star dust and “anthropogenic modern pollution”?