UMass Amherst geoscientists reconstruct ‘eye-opening’ 900-year Northeast climate record

Just a single study. Just a single location. Just a single technique. Importance TBD. ~ctm

From EurekAlert! Link to the study and excerpts below the press release.

Geoscientists at UMass Amherst have reconstructed the longest and highest-resolution climate record for the northeastern United States

University of Massachusetts at Amherst

Doctoral students Daniel Miller, in the water, with Helen Habicht and Benjamin Keisling, handle two recaptured sediment traps from an unusually deep lake in central Maine, where they collected 136 sediment samples spanning 900 years to reconstruct the longest and highest-resolution climate record for the Northeastern United States to date. Credit: UMass Amherst

AMHERST, Mass. – Deploying a new technique for the first time in the region, geoscientists at the University of Massachusetts Amherst have reconstructed the longest and highest-resolution climate record for the Northeastern United States, which reveals previously undetected past temperature cycles and extends the record 900 years into the past, well beyond the previous early date of 1850.

First author Daniel Miller, with Helen Habicht and Benjamin Keisling, conducted this study as part of their doctoral programs with their advisors, geosciences professors Raymond Bradley and Isla Castañeda. As Miller explains, they used a relatively new quantitative method based on the presence of chemical compounds known as branched glycerol dialkyl glycerol tetraethers (branched GDGTs) found in lakes, soils, rivers and peat bogs around the world. The compounds can provide an independent terrestrial paleo-thermometer that accurately assesses past temperature variability.

Miller says, “This is the first effort using these compounds to reconstruct temperature in the Northeast, and the first one at this resolution.” He and colleagues collected a total of 136 samples spanning the 900-year record, far more than traditional methods and other sites typically provide, which often yield just one sample per 30-100 years.

In their results, Miller says, “We see essentially cooling throughout most of the record until the 1900s, which matches other paleo-records for North America. We see the Medieval Warm Period in the early part and the Little Ice Age in the 1800s.” An unexpected observation was 10, 50-to-60-year temperature cycles not seen before in records from the Northeast U.S., he adds, “a new finding and surprising. We’re trying to figure out what causes that. It may be caused by changes in the North Atlantic Oscillation or some other atmospheric patterns. We’ll be looking further into it.”

He adds, “We’re very excited about this. I think it’s a great story of how grad students who come up with a promising idea, if they have enough support from their advisors, can produce a study with really eye-opening results.” Details appear in a recent issue of the European Geosciences Union’s open-access online journal, Climate of the Past.

The authors point out that paleo-temperature reconstructions are essential for distinguishing human-made climate change from natural variability, but historical temperature records are not long enough to capture pre-human-impact variability. Further, conventional pollen- and land-based sediment samples used as climate proxies can reflect confounding parameters other than temperature, such as precipitation, humidity, evapotranspiration and vegetation changes.

Therefore, additional quantitative paleo-temperature records are needed to accurately assess past temperature variability in the Northeast United States, the researchers point out. An independent terrestrial paleo-thermometer that relies on measuring two byproducts of processes carried out in branched GDGTs in lake sediment, a method first introduced two decades ago by researchers in The Netherlands, offered a promising alternative, Miller says.

Source organisms are not known for branched GDGTs, he points out, but they are thought to be produced in part by Acidobacteria. “These are compounds likely produced by different algae and bacteria communities in the membrane, or skin,” he notes. “Just like for humans, the skin regulates the organism’s body temperature and these compounds change in response to temperature. So if they grow in summer, they reflect that and the compounds are different than if they were produced in winter. We record the compounds to get the temperature curves. We found there seems to be a huge bloom of these organisms in the fall. After they die, they settle into the lake bottom. We think it’s mainly a fall temperature that we’re detecting.”
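
For readers curious how such compounds get turned into a number, the paper’s reconstruction rests on an MBT′5ME-type index, essentially the fractional abundance of the most-methylated (tetramethylated) 5-methyl brGDGTs. A minimal sketch follows; it is not the authors’ code, the peak areas are invented, and the linear calibration coefficients (which differ between published soil and lake calibrations) are deliberately left symbolic.

def mbt_5me(peaks):
    # peaks: dict of brGDGT peak areas keyed by compound name (5-methyl isomers only).
    # MBT'5ME = (Ia + Ib + Ic) / (Ia + Ib + Ic + IIa + IIb + IIc + IIIa)
    top = sum(peaks[k] for k in ("Ia", "Ib", "Ic"))
    bottom = top + sum(peaks[k] for k in ("IIa", "IIb", "IIc", "IIIa"))
    return top / bottom

# Example with made-up peak areas; a published calibration of the form
# T = a + b * MBT'5ME (coefficients not reproduced here) then converts the index
# to an estimated temperature.
index = mbt_5me({"Ia": 40, "Ib": 10, "Ic": 5, "IIa": 25, "IIb": 10, "IIc": 5, "IIIa": 5})
print(round(index, 2))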

For this work, Miller and colleagues constructed large plastic sediment traps and deployed them about ten feet below the surface of a small, 106-foot-deep lake in central Maine in May 2014. They then dove down to collect a catchment bottle from the bottom of each trap monthly in June, July, August and September, and again the following May, 2015.

Miller says, “This lake is very deep for its small area, with very steep sides. It doesn’t seem to have much mixing of water layers by surface winds. We think that has helped to preserve a bottom water layer with no oxygen year-round, known as anoxia, which helps in the preservation of annual layers in the sediments at the bottom of the lake. It’s rare for a lake to have such fine, thin lines that represent annual deposition, so all you have to do is count the lines to count the years. We double-checked our results with radiocarbon dating and other methods, and it turns out that reconstructing the temperature record this way was successful.”
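
To make “count the lines, then double-check with radiocarbon” concrete, here is a rough sketch of that cross-check; it is illustrative only, not the authors’ workflow, and the depths, varve ages and radiocarbon ages below are invented.

import numpy as np

depth_cm  = np.array([10, 20, 40, 60, 68])        # depths of dated horizons (hypothetical)
varve_age = np.array([130, 270, 520, 760, 880])   # years before coring, from counting layers
c14_age   = np.array([120, 290, 500, 780, 900])   # independent calibrated 14C ages (hypothetical)
c14_err   = np.array([30, 35, 40, 40, 45])        # 1-sigma uncertainties on the 14C ages

# If the laminations really are annual, the varve chronology should agree with
# the radiocarbon dates to within their uncertainties.
residuals = varve_age - c14_age
print(residuals, np.all(np.abs(residuals) < 2 * c14_err))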

Miller and colleagues say this project enjoyed notable support from many quarters, including the UMass Amherst Alumni Association, which supported student fieldwork and data collection in Maine; the geology department at Bates College; funding from the U.S. Geological Survey; and, at UMass Amherst, sophisticated biogeochemistry laboratory equipment, the Joe Hartshorn Memorial Award from the geosciences department, and other assistance from the Northeast Climate Adaptation Science Center.

The researchers conclude that this first paleo-temperature reconstruction coupled with site-specific knowledge from Basin Pond “informs our understanding of climatic variability in the Northeast U.S. beyond the era of human influence” and “contributes to our understanding of the production and fate of brGDGTs” in lake systems.

###

The paper is open access and can be found here.

Abstract

Paleotemperature reconstructions are essential for distinguishing anthropogenic climate change from natural variability. An emerging method in paleolimnology is the use of branched glycerol dialkyl glycerol tetraethers (brGDGTs) in sediments to reconstruct temperature, but their application is hindered by a limited understanding of their sources, seasonal production, and transport. Here, we report seasonally resolved measurements of brGDGT production in the water column, in catchment soils, and in a sediment core from Basin Pond, a small, deep inland lake in Maine, USA. We find similar brGDGT distributions in both water column and lake sediment samples but the catchment soils have distinct brGDGT distributions suggesting that (1) brGDGTs are produced within the lake and (2) this in situ production dominates the down-core sedimentary signal. Seasonally, depth-resolved measurements indicate that most brGDGT production occurs in late fall, and at intermediate depths (18–30 m) in the water column. We utilize these observations to help interpret a Basin Pond brGDGT-based temperature reconstruction spanning the past 900 years. This record exhibits trends similar to a pollen record from the same site and also to regional and global syntheses of terrestrial temperatures over the last millennium. However, the Basin Pond temperature record shows higher-frequency variability than has previously been captured by such an archive in the northeastern United States, potentially attributed to the North Atlantic Oscillation and volcanic or solar activity. This first brGDGT-based multi-centennial paleoreconstruction from this region contributes to our understanding of the production and fate of brGDGTs in lacustrine systems.

6 Conclusions

We find evidence for seasonally biased in situ production of branched glycerol dialkyl glycerol tetraethers (brGDGTs) in a lake in central Maine, NE US. BrGDGTs are mostly produced in September at Basin Pond, and their downward fluxes in the water column peak at 30 m in water depth. A down-core brGDGT-based reconstruction reveals both gradual and transient climate changes over the last 900 years and records cooling and warming events correlated with other Northern Hemisphere records and the NAO and AMO indices. This suggests inland Maine climate is sensitive to hemispheric climate forcing as well as changes in regional atmospheric pressure patterns and North Atlantic sea surface temperatures. Our new MBT′5ME temperature reconstruction, supported by a pollen record from the same site, reveals a prominent cooling trend from AD 1100 to 1900 in this area. Comparison with regional hydroclimate records suggests that despite increasingly cool and wet conditions persisting at Basin Pond over the last 900 years, fire activity has increased. Although recent fire activity is likely anthropogenically triggered (i.e., via land-use change), our results imply an independent relationship between climate and NE US fire occurrence over the study interval. Thus, the paleotemperature reconstruction presented here alongside site-specific knowledge from Basin Pond informs our understanding of climatic variability in the NE US beyond the era of human influence.

HT/David B and Yooper

 

Richard
January 9, 2019 10:07 pm

From a mere layperson who relies on plain common sense to assess the value and authenticity of my ‘betters’ I say “Wow!” So there is brilliant science still being done. How is that data going to be massaged to fit by the higher purpose persons to fit their agenda?

Donald Kasper
Reply to  Richard
January 9, 2019 11:27 pm

By basic logic known from signal processing theory, also used in electronics, we know you cannot reconstruct a signal if you sample at or below twice its frequency, in this case, its periodicity. 900/136 = 6.6 x 2 = 13.2 years. Therefore, 10-year cycles are not discernible with the sampling used. Flat out, no. What happens when you undersample below that limit? You get aliasing effects, seen as anomalous artifacts that show up but are not real. Then, since they are studying organic compounds in an active biomass system, all they have to do is prove no bioturbation (sediment mixing) and no other interfering biochemical reactions. Also, no underwater landslides (slumping), no periodic drying/wetting, no turbidity currents, etc.
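
A back-of-the-envelope version of that argument, as a sketch only (it assumes evenly spaced samples, which a sediment core rarely provides):

record_years = 900
n_samples = 136

mean_spacing = record_years / n_samples    # about 6.6 years per sample
shortest_resolvable = 2 * mean_spacing     # Nyquist limit: about 13.2 years

print(f"mean spacing ~ {mean_spacing:.1f} yr, shortest resolvable period ~ {shortest_resolvable:.1f} yr")
# Any genuine cycle shorter than ~13 yr would be aliased into a spurious longer one,
# which is the basis of the objection to reading 10-year cycles from this series.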

Greg
Reply to  Donald Kasper
January 9, 2019 11:55 pm

Yes, technically impossible, and that figure assumes evenly spaced data, which is almost certainly not the case here. Not a very encouraging testament to their abilities in data analysis.

They also say this is better than other proxies which have confounding effects, yet they do not even know the species responsible for these deposits. That means that they do not understand the mechanism of creation and have NO IDEA whether there are also confounding causes other than temperature. It would seem a pretty safe bet that there will be some. No organic life cycle processes are going to be solely dependent on ONE variable : temperature.

They say they checked their ring counting using carbon dating. That is cart before the horse. If they have a reliable, readable layering as they claim this could be used to calibrate the carbon dating , not the other way around.

As always, I went straight to look for the published data. It seems they have archived it, which is good. Sadly it is at NCDC, which is shut down because of the shutdown.

Steven Mosher
Reply to  Greg
January 10, 2019 12:48 am

Did you read the paper in which the actual dating was done??? Miller 2017

They used prior work

“The age model for Basin Pond is based on 210Pb, varve counts, and five 14C dates and was previously published by Miller et al. (2017). The sediment examined here ranges in age from modern to ∼1100 BP, with a sampling resolution of 4 to 13 years (median: 7) (Miller et al., 2017).”

Miller 2017

“The age model for Basin Pond core BP2014-5D is primarily based on a varve count chronology of the uppermost 80 cm confirmed through radioisotopic dating of the upper sediments and radiocarbon dating of plant macrofossil samples (Fig. 2; Table 1). Varve counts were completed using X-Ray radiograph images with 100 µm resolution. Laminations in the upper 80 cm of the sediment record appear continuous with no apparent hiatuses, and were counted three times for the creation of the varve count chronology with minimal error. The upper 15 cm of sediment were also dated using radioisotopic analysis. Subsamples from core BP2014-3D were freeze dried, homogenized, and measured for 210Pb activity on a Canberra GL2020R Low Energy Germanium Detector, following the methods detailed by Woodruff et al. 2013. Ages for 210Pb measurements were estimated assuming a constant rate of supply of unsupported 210Pb activity (Appleby and Oldfield 1978). In addition, radiocarbon dating of 5 discrete samples were conducted on terrestrial macrofossils (Table 1; Fig. 2). Radiocarbon age estimates were calibrated using the “IntCal13” calibration to years before present in the ‘R’ program ‘BChron’ (Parnell 2016).”

So they used varves primarily, and since the core contained macrofossils (pine needles and wood), they carbon-dated the fossils.
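
For context on the 210Pb step quoted above: the constant-rate-of-supply (CRS) model of Appleby and Oldfield (1978) dates a layer from the inventory of unsupported 210Pb remaining beneath it, t = (1/λ) · ln(A(0)/A(z)). A minimal sketch, with an invented activity profile rather than the study’s data:

import numpy as np

LAMBDA = np.log(2) / 22.3   # 210Pb decay constant (half-life ~22.3 yr)

# Hypothetical unsupported 210Pb inventory per layer (Bq/m^2), top of core first.
layer_inventory = np.array([60.0, 45.0, 30.0, 18.0, 10.0, 5.0, 2.0])

total = layer_inventory.sum()                             # A(0): whole-core inventory
below = total - np.cumsum(layer_inventory)                # A(z): inventory below each layer
ages = np.log(total / np.maximum(below, 1e-9)) / LAMBDA   # CRS age at the base of each layer
print(np.round(ages, 1))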

Reply to  Greg
January 10, 2019 4:20 am

Good comments Greg and Donald. Thank you.

Jaap Titulaer
Reply to  Greg
January 10, 2019 7:10 am

>> Sadly it is at NCDC which is shut down because of the shutdown.

LOL, that is just plain stupid and more an act of ‘Resistance!’.
There is no need for such a website to be down or ‘closed’. The bills have already been paid. Someone pulled a switch to make their own little statement.

Patrick Blasz
Reply to  Jaap Titulaer
January 10, 2019 8:51 am

That is what Leftists do. They close down war memorials that have no active personnel or service needs, they close roads through national forests and the like, again with no active personnel or service needs. These people are weasels just like their hero, little [pruned] weasel (LFW) whose name goes unmentioned.

Clyde Spencer
Reply to  Jaap Titulaer
January 10, 2019 10:46 am

How can you be so cynical? Obviously, there is no one around to put food and water out for the servers!

Reply to  Jaap Titulaer
January 12, 2019 9:12 am

The same spiteful tactic of shutting down public websites has been used by @NOAA & @FCC. There are better alternatives for the data provided by both. Some of this shutdown should be permanent, and we can begin with those who put their interests ahead of the public’s.

D Anderson
Reply to  Greg
January 10, 2019 1:10 pm

No mixing – Lakes in that latitude turn over twice a year.

Stephen Cheesman
Reply to  D Anderson
January 10, 2019 1:46 pm

Not meromictic lakes, of which there are a small number in North America. One is Crawford Lake in Ontario. It does not turn over, so its sediment provides an excellent undisturbed record dating back hundreds of years which has been used for archaeological investigations and revealed a 15th century Iroquoian Village.

D. Anderson
Reply to  D Anderson
January 10, 2019 2:36 pm

What makes these lakes special so they don’t turn over?

Stephen Cheesman
Reply to  D Anderson
January 10, 2019 3:00 pm

From Wikipedia:
>>>>
A meromictic lake may form for a number of reasons:

The basin is unusually deep and steep-sided compared to the lake’s surface area

The lower layer of the lake is highly saline and denser than the upper layers of water
<<<<

In the case of Crawford Lake, the inlet is very low volume and gentle, and so minimizes disturbance of the water during the critical freezing period when turnover normally occurs.

Steven Mosher
Reply to  Donald Kasper
January 10, 2019 12:53 am

136 refers to the number of CORES taken, not the varve count.

Resolution was between 4 and 13 years

“The sediment examined here ranges in age from modern to ∼1100 BP, with a sampling resolution of 4 to 13 years (median: 7) (Miller et al., 2017).”

Easy Tiger
Reply to  Steven Mosher
January 10, 2019 1:27 am

Thank you Mosher
I very much appreciate your work here
You are under appreciated

Clyde Spencer
Reply to  Steven Mosher
January 10, 2019 10:43 am

Mosher,
Even so, there is little support for the claim that a 10-year (11?) cycle can be extracted from what appear to be uneven sampling rates, and for which even the median does not meet the Nyquist criterion.

https://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampling_theorem

krm
Reply to  Steven Mosher
January 10, 2019 11:18 am

“136 refers to the number of CORES taken. not the varve count”
No, they took very few cores and only one was used for the climate reconstruction work. From their paper:
“Subsamples for past climate reconstruction were taken every 0.5 cm from the uppermost 68 cm of core BP2014-5D.” That rate of sampling yields 136 samples.

kevin kilty
Reply to  krm
January 10, 2019 1:21 pm

Thanks.

Jeff Alberts
Reply to  krm
January 11, 2019 6:30 am

Will we hear an apology from Mosher?

Oh, that’s right, elitists don’t apologize.

Jeff Alberts
Reply to  krm
January 11, 2019 6:32 am

“No, they took very few cores and only one was used for the climate reconstruction work. From their paper:”

That’s even less than useless.

Steve O
Reply to  Donald Kasper
January 10, 2019 4:22 am

They have 136 samples, each of which covers some large number of years. I don’t know if each sample covers the entire 900 years, but I don’t see any basis for questioning the resolution they are claiming. They are obviously aware that mixing of sediments would cause problems.

BallBounces
Reply to  Donald Kasper
January 10, 2019 5:18 am

Although I didn’t understand a word they said, I think they meant to say they found ten 50-60 year cycles. It’s ambiguous.

Farmer Ch E retired
Reply to  BallBounces
January 10, 2019 6:38 am

“An unexpected observation was 10, 50-to-60-year temperature cycles not seen before”

BallBounces – My thoughts also that they identified ten temperature cycles of 50-60 years. I haven’t read the source paper so can’t confirm this.

Jeff Alberts
Reply to  Farmer Ch E retired
January 11, 2019 6:35 am

“An unexpected observation was 10, 50-to-60-year temperature cycles not seen before”

No, they found 10 cycles. Zero evidence that anything they found corresponds to temperature, since “Interestingly, variations in MBT′5ME values for the last 100 years do not agree with instrumental observations.”

Don Straitiff
Reply to  Donald Kasper
January 10, 2019 10:50 am

Donald (good name!), I’m not sure I agree with your assessment. It looks to me like they are presenting everything in the time domain, not the frequency domain. I see reference to a ten year resolution, but I interpret this to mean that they believe that they are pegging the accuracy of the year that the sample came from, essentially to the nearest decade. I don’t see that they’re FFTing the data at all, though it would be interesting to do so, just throwing away any peaks you see with a wavelength less than, oh, maybe 50 years.

kevin kilty
Reply to  Don Straitiff
January 10, 2019 1:15 pm

Doesn’t matter. Aliasing applies to the time series just as it would to the spectrum.

Don Straitiff
Reply to  kevin kilty
January 11, 2019 7:51 am

The time series is analogous to what you would get from running the signal through a low-pass filter. It’s true you can’t draw any conclusions about higher-frequency content in the data, but that does not mean the data are totally worthless. It just means the data have limitations, which is true for just about all data.
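
To make the aliasing point concrete: a 10-year cycle sampled every ~7 years does not vanish, it reappears as a spurious longer cycle. A quick sketch, assuming (for simplicity) perfectly even 7-year sampling rather than the uneven spacing of a real core:

true_period = 10.0        # years
sample_spacing = 7.0      # years, roughly the median resolution quoted upthread

fs = 1.0 / sample_spacing     # sampling frequency, cycles per year
f = 1.0 / true_period         # signal frequency
nyquist = fs / 2.0

# Fold the frequency back into [0, Nyquist] to see what the undersampled series shows.
f_alias = abs(f - round(f / fs) * fs)
print(f"Nyquist period ~ {1/nyquist:.0f} yr; a {true_period:.0f}-yr cycle aliases to ~ {1/f_alias:.0f} yr")
# Result: the 10-yr cycle shows up as a ~23-yr cycle in the 7-yr-spaced record.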

Louis Hooffstetter
Reply to  Richard
January 10, 2019 10:07 am

I find this study fascinating, but I would urge caution. The researchers admit: “Source organisms are not known for branched GDGTs, but they are thought to be produced in part by Acidobacteria.”

From https://www.sciencedirect.com/science/article/pii/S0146638012001970:
“Branched glycerol dialkyl glycerol tetraethers (branched GDGTs) are membrane lipids presumably derived from anaerobic, heterotrophic soil bacteria. Empirical studies have shown that the composition of branched GDGTs in topsoils varies in degree of cyclization and methylation (expressed as CBT and MBT, i.e. cyclization and methylation index of branched tetraethers) depending on soil pH and mean annual air temperature (MAT).”

To a layman like me, that’s clear as mud. GDGTs may turn out to be an excellent proxy for paleotemperatures, but until we know more about them I wouldn’t jump to that conclusion just yet.

Joel O'Bryan
January 9, 2019 10:12 pm

They found… (wait for it) ….

noise.

And amazingly…. noise has ups and downs in the plot.
Any EE-trained engineer knows what noise looks like. Apparently these UMass clisci majors missed out on EE courses, as that major requires hard math.

They should take a walk over to the UMass EE department and find an EE prof and show him/her what they have.

Reply to  Joel O'Bryan
January 10, 2019 5:09 am

They hardly need to go to UMass to find a professor when they have Raymond Bradley of MBH98 Hockey Stick fame to help them find hidden meaning.

commieBob
Reply to  Joel O'Bryan
January 10, 2019 5:46 am

Yep.

The basics are super important. I’ve seen lots of work that makes an error in basic science or math which makes everything else invalid. Of course, the author is protected from that realization because she doesn’t actually understand the basics. As a result, when you point out the error, you are accused of hand waving.

January 9, 2019 10:23 pm

No spaghetti graph?

Alan Tomalty
Reply to  Hans Erren
January 9, 2019 10:46 pm

Yes, you have to read the paper from the link, but see my post below.

Alan Tomalty
January 9, 2019 10:45 pm

“Interestingly, variations in MBT′5ME values for the last 100 years do not agree with instrumental observations.” DIRECT QUOTE FROM THE STUDY

Another useless study with wasted taxpayer $.

Edward Hanley
Reply to  Alan Tomalty
January 9, 2019 11:25 pm

That the introductory sentence to section 5.5 should be enough to invalidate the study is remarkable. You either have great instincts or strong confirmation bias. At the very least you may be over-generalizing. Summarizing the study as “useless” overlooks the fact that it supported three doctoral students in learning how to do rigorous data collection, background research, and straightforward, unbiased reporting of data. That would make them part of the mere 0.08% of the world’s population who have that skill. (Whether they use their powers for evil or good in the future is up to them.) Not exactly a “waste of taxpayer $”, and if you were to read the article you’d notice that six separate funding sources are reported, most of them private.

Reply to  Edward Hanley
January 10, 2019 4:34 am

“Edward Hanley January 9, 2019 at 11:25 pm
That the introductory sentence to section 5.5 should be enough to invalidate the study is remarkable. You either have great instincts or strong confirmation bias.”

A backhanded compliment or an outright ad hominem.
Either way, you fail to identify any error in Alan’s comment.
Either way, it exposes your particular bias.

“Edward Hanley January 9, 2019 at 11:25 pm
At the very least you may be over-generalizing. Summarizing the study as “useless” overlooks the fact that it supported three doctoral students in learning how to do rigorous data collection, background research, and straightforward, unbiased reporting of data.”

“supported three doctoral students in learning…”
And what exactly does that prove? That the funds are wasted?

“learning how to do rigorous data collection, background research, and straightforward, unbiased reporting of data.”
Well, your own words describe your views rather well; “You either have great instincts or strong confirmation bias. At the very least you may be over-generalizing.”
When describing a research project these personal claims are valid only between the students, their professor and the review board.

Beyond that personal student/educator interface, those claims are all distraction and have zero validity in published research. Instead it further proves that science by press release makes for weak science.

Louis Hooffstetter
Reply to  ATheoK
January 10, 2019 10:16 am

I applaud these “three doctoral students… learning how to do rigorous data collection, background research, and straightforward, unbiased reporting of data…” because they are following the scientific method, which is something climate ‘scientists’ consistently fail to do.

But in the end, Alan Tomalty is right.

Donald Kasper
Reply to  Alan Tomalty
January 9, 2019 11:29 pm

That is because lake bottoms do not measure air temperature and are going to give a highly damped signal.

D. J. Hawkins
Reply to  Donald Kasper
January 10, 2019 6:04 am

You really didn’t read their methodology, did you?

Reply to  Alan Tomalty
January 10, 2019 9:26 am

Not so useless, Alan. It’s new data. The fact that their proxy reconstruction doesn’t agree with instrumental records (during the only period where instrumental records actually exist!!) is obviously a critical point. It should have been given much more prominence, and the conclusions should have been modified to highlight the inadequacy of the proxy to directly estimate paleotemperatures.

Instead of “hiding the decline”, these authors have progressed to “ignoring the decline”. Which is a positive development (of a sort).

Or you could call it “hiding the decline in plain sight”

Anyway – if you go to the paper and click on the link to “supplement” on the table of contents, you will see that they are moving on to try and get a proxy that does correlate with instrumental temperature.

Reply to  Alan Tomalty
January 10, 2019 9:37 am

That was my first question. How does their data match with instrumental observations? Based on their statement that you quoted, the answer is “Well, we will just have to use Mann’s Nature Trick, then it will match great!*”

*Mann’s Nature Trick, aka Hide the Decline, revealed in the Climategate emails.

January 9, 2019 10:45 pm

What I want to know right off is, what other environmental variables (in addition to temperature) have an effect on the production of these brGDGTs. How many ambiguities does this “study” not reconcile? Is it like the tree-ring proxies that focus exclusively on the temperature effects to explain changes in ring thickness while completely disregarding the effects of CO2 variations in the air and of water availability? How many assumptions have been made in this nominally very insightful and innovative study? It is an essential discipline to list them all, always, in every “study.”

Alan Tomalty
Reply to  Alexander
January 9, 2019 10:47 pm

A useless study. See my comment above

Reply to  Alan Tomalty
January 9, 2019 10:59 pm

Alan,
I agree this study is a useless pile of bovine excrement (like so much of today’s peer-reviewed science), but your posting on another thread questioning the basis of solar fusion destroys your sci cred… completely.
Really.

Reply to  Joel O'Bryan
January 10, 2019 12:10 am

Joel,
Alan’s quote is accurate (I checked, that’s what the paper says) and the conclusion that follows from that is the paper is garbage. Alan could believe the tooth fairy is real and secretly living in his basement, and it would not change the quote or the conclusion one must draw from it.

Uncle Max
January 9, 2019 11:02 pm

Call me a cynic, but if this paper is deemed heresy (aka unhelpful to the cause and Govt grants) it will be ignored, kneecapped, and the NE community of climate alarmists will make sure that NO ONE on this team will be invited to the quarterly New England Climate cotillion.

Donald Kasper
January 9, 2019 11:15 pm

Go to a lake. Sample one place. Call that the paleoclimate of the Northeastern United States. Got it. Geothermometers to determine the temperature of formation of rocks were the hot thing for two decades. One by one, they were all shown to be unreliable. Even the chemicals in rock bubbles in solid quartz are claimed to be unreliable from potential leaching in or out over a geologic age. But now we have organic compounds that tell all.

John F. Hultquist
January 9, 2019 11:15 pm

This sounds interesting. It is a first step in a process.
Maybe in 5 years an evaluation will show it useless, or not.
No need to be negative on the first go round.
Somewhat like potty training a young one.

Donald Kasper
Reply to  John F. Hultquist
January 9, 2019 11:31 pm

All geothermometers for rock formation I know of, such as the famous chlorite geothermometer, are highly debated and shown by others to be unreliable after the initial paper’s introduction. Exception cases that blow the correlation away start showing up, and the overall correlation is very broad.

Donald Kasper
Reply to  John F. Hultquist
January 9, 2019 11:34 pm

Then they should have used O-18/O-16 ratio data in the sediment to get an independent estimate of temperature. For any site where you are drilling subsurface rock or soil, you also need to get the geothermal gradient. If the area is geothermally active or has a high geothermal gradient, then you will have problems, as the method will then be recording heat flux from the subsurface, mostly.
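
For reference, the oxygen-isotope measurement suggested here is conventionally reported in delta notation relative to a standard (VSMOW for waters, VPDB for carbonates):

$$ \delta^{18}\mathrm{O} = \left( \frac{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\text{sample}}}{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\text{standard}}} - 1 \right) \times 1000 \quad \text{(in per mil)} $$

In lake sediments the δ18O of carbonate or biogenic silica reflects both water temperature and the isotopic composition of the lake water, so it is a complementary proxy rather than a drop-in replacement for a molecular one.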

Steve O
Reply to  John F. Hultquist
January 10, 2019 4:34 am

Exactly.

January 9, 2019 11:40 pm

Perhaps there are still student scientists who want to find out the truth.

A far greater problem is what will happen to the data from now on. How much in the way of adjustments, for the “Greater Good” etc., is going to happen?

MJE

observa
January 10, 2019 12:07 am

Don’t these naïve whippersnappers know that the science is settled? Whoever’s supervising them must be sacked forthwith for dereliction of duty.

January 10, 2019 12:50 am

Don’t knock them because their “Elders” have lost their way.

Progress is made by someone not agreeing with those who have solved everything; think Galileo, for example.

MJE

Graemethecat
January 10, 2019 12:59 am

Most interesting is that these authors clearly detected the MWP. I thought the official Warmist dogma was that it didn’t exist.

knr
Reply to  Graemethecat
January 10, 2019 3:02 am

And the Little Ice Age too, both of which Mann and Co. ‘wiped out’, so they are not going to make many friends there.

joe- the non climate scientist
Reply to  Graemethecat
January 10, 2019 5:36 am

The official warmist dogma is that the MWP did exist, but was only regional, spanning Western Europe, the North Atlantic, Greenland and the eastern Atlantic coast.

mikewaite
January 10, 2019 1:49 am

Difficult to understand the negative comments above. This seems, technically, to be a brilliant piece of work. I wonder if it could be applied to, say, Wastwater or Windermere in the UK.
The conclusions agree, IMO, with other proxy-based reconstructions for the NH and North Atlantic region.
Also, the conclusions on fire effects, not necessarily associated with warmer conditions,

– “Furthermore, two recent fire events occurred during the historical period, which is reconstructed as relatively cool and wet (Fig. 9). Therefore, it appears that at Basin Pond, temperature did not exert a major influence over fire occurrence.” –

agree with another report (vaguely remembered, and also I think from the Climate of the Past journal) which found the correlation of wildfire in the NE USA to be more strongly associated with recent cold, wet conditions (and probable deliberate Native American or settler clearance) than previous warm, dry conditions. (I will post a link if I can find it in my disgracefully disordered files.)
There is an ongoing project, global in intention, to coordinate paleo-charcoal deposition events, which should answer the question of whether wildfire events are more likely in the present warming conditions.

Krishna Gans
January 10, 2019 1:57 am

They found 60-year cycles and are excited; read the “skeptic papers” and see, these cycles are nothing new 😀

ralfellis
January 10, 2019 2:32 am

Quote: “Interestingly, variations in MBT′5ME values for the last 100 years do not agree with instrumental observations.”

As Tomalty observed above, this invalidates the entire technique (or the conclusions from the technique).

But we have been here before, haven’t we?

I seem to remember a tree ring record that was a fantastic proxy for temperature, all the way back to the Roman era, but likewise ‘did not agree with instrumental observations’. And so they had to ‘HIDE THE DECLINE’ by splicing in the modern thermometer record – to cover up the poor performance of the proxy.

But how can any honest scientist do this, with a straight face? And claim grants on the back of it? And ruin entire economies on the back of it?

Even a child can see that if your proxy record does not agree with the recent temperature record, then your proxy cannot be recording temperature. It may be recording many things, like rainfall, cloud cover, and pestilence, but it has nothing to do with temperature.

R

.

observa
Reply to  ralfellis
January 10, 2019 6:03 am

“Even a child can see that if your proxy record does not agree with the recent temperature record, then your proxy cannot be recording temperature.”

Picky, picky. Besides, it’s grown-ups who are in charge of interpreting the proxies, so you just tell the child it’s for their own good and these minor distractions go away.

Dave Fair
Reply to  ralfellis
January 10, 2019 9:41 am

Are the instruments measuring UNI?

January 10, 2019 4:23 am

A very interesting paper.
Note this shows a long term cooling trend, from 1300 to 2000.

And the same trend is noted in the papers compared, from China and Africa (Dang et al. and Russell et al.; note also the high late-1900s peak seen in both the New England and China records).

Chart. https://www.clim-past.net/14/1653/2018/cp-14-1653-2018-f07.pdf

“…..The Dang et al. (2018) calibration is based on alkaline Chinese lakes and reconstructs temperatures ranging from 4 to 9 ∘C, while the Russell et al. (2018) calibration is based on African lakes and yields temperatures ranging from 10 to 14 ∘C (Fig. 7)….”

Note also that ‘fire’ is a useless metric as a measure of climate change:
“…..These cooling and wetting trends are surprising given the record of fire history at Basin Pond (Miller et al., 2017), which shows five periods of increased charcoal deposition since AD 1100 (Fig. 9). It is important to note that wildfire activity is a complex phenomena, with multiple factors affecting fire occurrence apart from climate variability (Marlon et al., 2017). …”

https://www.clim-past.net/14/1653/2018/

tt
Reply to  markx
January 10, 2019 9:23 am

Climate may not affect fire frequency much, but fires in a forested environment strongly affect local climate (insolation, wind exposure).

Ragnaar
January 10, 2019 6:28 am

Could use a banner plot of the temperature over time so when this is shared, the MWP and LIA are suggested. Facebook favors the first picture of the story. Reach and impact.

Bill Murphy
January 10, 2019 7:27 am

I have to agree that there is a LOT more work to be done before this can be considered a reliable proxy. It’s well known that prior to about 1000–1500 years ago central Maine was predominately populated by hardwood species which apparently due to natural climate change were replaced by the spruce and fir softwoods seen there today. Additionally, the Spruce Budworm infestations periodically destroy vast areas of forest with resultant widespread fires in a 40 to 60 year cycle (sound familiar?) that has been going on for at least 400–500 years. So there are 2 known natural cycles going on there with 40 to 60 year periods. The obvious questions then are, does the AMO somehow help trigger the Budworm cycle and resultant fires and deforestation or are they independent and the budworm cycles are more related to stand age or some other factor. Is any of this even correlated? How will the death and regrowth of vast swaths of forest every 40–60 years affect this proxy? What effect has the massive growth of the timber and paper industries and the heavily managed forests and watersheds had on this proxy for the last 150 years or so? Did the millions of gallons of carbamate insecticide and oil sprayed on the forests during the last budworm infestation 40 years ago affect anything? There is also some evidence that the gradual warming since mid 19th century could be encouraging the hardwoods to make a comeback. What will this affect? Sounds to me like a classic case of too many variables.

Phineas Sprague
January 10, 2019 8:28 am

I offer that this is an interesting effort. Hal Bornes was counting varves in hanging valleys in the 1960s. Varves can be easily counted. Each varve contains unique information based upon the conditions during that cycle. THIS FACT IS SELF EVIDENT. THE QUESTION IS HOW DOES THIS INFORMATION GET RELEASED IN THE FORM OF DATA? This group would not be jawboning over this effort if it hadn’t been undertaken. Raw data is valuable precisely because it must next be accepted and interpreted. Anyone is entitled to their own interpretation of the data, and thus, if they have the courage, to impress their peers with their brilliance or to regale their peers with the delight of pointing out the fallacies in their interpretation. Good for these up-and-coming scientists. Have the courage to take an idea and run with it.

tty
January 10, 2019 9:12 am

GDGT as a temperature proxy seems to be fairly well understood as to the underlying biological mechanism. Bacteria change the composition of their cell membranes based on the temperature.

The problem is just what temperature is being measured? Did the bacteria live at the surface or below? Or even in the bottom sediments? Are we recording winter, summer or average temperatures?

This seems to be the main problem with using GDGT (TEX86) in marine sediments. They are assumed to record SST but probably aren’t, at least not exclusively. The results are frequently quite aberrant in upwelling areas, where the relation between SST and temperatures at depth is unusual, strongly suggesting that we are dealing with a mixed-depth signal.

Another problem is that bacterial communities vary geographically. For example, modern GDGT temperatures from the Red Sea give a completely different and much lower calibration curve than other areas. For some reason that is unclear to me, it has been assumed that for measuring temperatures in former hothouse climates, the calibration curve from the Red Sea, the only current large deep-sea area with conditions similar to a former hothouse climate, should NOT be used.

There seems to be rather limited data on GDGT from shallow fresh-water sites like this one, so it is difficult to evaluate the results, but generally speaking it seems that the results from such sites have very large uncertainties compared to oceans and large, deep lakes.
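
For readers unfamiliar with the marine proxy tty mentions: TEX86 (Schouten et al., 2002) is an index built from the relative abundances of isoprenoid GDGTs, usually written as

$$ \mathrm{TEX}_{86} = \frac{[\mathrm{GDGT\text{-}2}] + [\mathrm{GDGT\text{-}3}] + [\mathrm{Cren'}]}{[\mathrm{GDGT\text{-}1}] + [\mathrm{GDGT\text{-}2}] + [\mathrm{GDGT\text{-}3}] + [\mathrm{Cren'}]} $$

with a regional or global calibration then mapping the index to sea-surface temperature, which is exactly why the Red Sea calibration issue raised above matters.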

Reply to  tty
January 11, 2019 4:45 pm

tty, thanks for your insight. I’m curious why no one has tried analyzing oxygen isotopes, as is commonly done with marine sediment cores.

Duane
January 10, 2019 10:29 am

The negativity and self-satisfied tut-tutting and tongue clucking by some in this thread is disgusting. This is a scientific study, with real-world data and an interesting conclusion, that remains subject to both peer review and replication. It only adds to knowledge, and does not claim to be the ultimate prehistoric temperature recorder.

But it clearly is worth followup.

The notion that at least the northeastern part of North America had been in an 800+ year cooling trend, then started back upwards around 1900, is certainly both interesting and significant if confirmed. Of course the amount of atmospheric CO2 really didn’t start increasing rapidly until the middle of the 20th century, suggesting that the upturn was likely just the natural end point of a cooling cycle. And the 50-60 year cycles within the overall trend are useful too, and seem to comport with what we’ve known to be true since the mid-1800s. That the indicator species don’t comport exactly with known temperature records doesn’t falsify the data or the study … but it raises other questions, as several commenters above have noted.

I do believe, based on comments in this thread and many others, that a very large proportion of the commenters here at WUWT are not serious about science at all, but are into anti-science or just political science. Maybe no more than a third to a half of the commenters here seem to really give a damn about real science.

Clyde Spencer
Reply to  Duane
January 10, 2019 11:09 am

Duane,
If someone uses science to criticize science, how can they be “anti-science?” It is the character of peer review that new and different claims be able to withstand running the gauntlet. Or, to quote Nietzsche, “That which does not kill you, makes you stronger.” I don’t think that you understand how science and peer review is supposed to work. Appeals to personal authority (divine or otherwise) and ad hominem attacks are not allowed. However, apparent contradictions to known and accepted science are exactly what need to be defended against. Are you familiar with the defense of a dissertation in awarding a doctoral degree?

Duane
Reply to  Clyde Spencer
January 11, 2019 11:20 am

Laughable that you suggest internet comments, coming mainly from people who are not professional scientists or engineers but are just wagging their political tongues, constitute “peer review” or PhD dissertation committee commentary.

A comments page is not for that. A comments page at a purported science website should be for serious commentary, not heckling from the peanut gallery.

Clyde Spencer
Reply to  Duane
January 11, 2019 3:51 pm

Duane,
It is true that there is a mixture of political comment and serious scientific criticism here. However, I suspect that you and most of the readers here are capable of telling the two apart. On the other hand, some of the things that are being criticized have a distinctly political flavor, and it is probably appropriate to call that out, as in a paper starting out using the term “denier.”

Tom Abbott
January 10, 2019 11:43 am

From the article: “The authors point out that paleo-temperature reconstructions are essential for distinguishing human-made climate change from natural variability, but historical temperature records are not long enough to capture pre-human-impact variability.”

According to the IPCC, anything before 1950 would be “pre-human-CO2-impact,” so we actually do have historical temperature records that can tell us whether humans are causing a discernible effect on the atmosphere. We have lots of temperature records from the 1930s, when it was warmer than today, and that warmth was caused by Mother Nature (according to the IPCC). The current warming is no warmer than the 1930s, so Mother Nature is also the cause of the current warming, until proven otherwise, which has not been done.

Thank you very much.

observa
January 10, 2019 3:35 pm

They’re looking in the wrong place because the atomic bombs are going off in the oceans according to a Guardian analysis of findings-
https://www.msn.com/en-au/news/techandscience/warming-oceans-likely-to-raise-sea-levels-30cm-by-end-of-century-–-study/ar-BBS57HV
“The new Science paper analysed four studies published between 2014 and 2017, which corrected for discrepancies between different types of ocean temperature measurements and gaps in measurements.”

Sounds familiar so we’re definitely doomed and it won’t do any good curling up under school desks in the foetal position because you won’t see all the flashes. We were all warned about this with the video of the exploding schoolkiddies but the grown ups wouldn’t listen. Our only chance of Salvation is to cease all fossil fuel extraction forthwith and put it back in the ground or volcanoes to appease Gaia.

Reggie
January 10, 2019 8:52 pm

What?

Chino780
January 11, 2019 12:20 pm

Bill Nye was wrong. The Medieval Warm Period was not just in Europe. LOL.