By P Gosselin on 29. March 2022
A new paper published by the open-access publisher MDPI looks at seven prominent hemispheric and global temperature reconstructions for the past 2000 years (T2k).
The analysis conducted by the authors found that some reconstructions “differed from each other in some segments by more than 0.5 °C” whilst some show negligible pre-industrial climate variability (“hockey sticks”).
Those showing variability would suggest natural factors playing a greater role than those that claim climate had been rather constant over the past 2000 years.
Abstract: Global mean annual temperature has increased by more than 1 °C during the past 150 years, as documented by thermometer measurements. Such observational data are, unfortunately, not available for the pre-industrial period of the Common Era (CE), for which the climate development is reconstructed using various types of palaeoclimatological proxies. In this analysis, we compared seven prominent hemispheric and global temperature reconstructions for the past 2000 years (T2k) which differed from each other in some segments by more than 0.5 °C. Whilst some T2k show negligible pre-industrial climate variability (“hockey sticks”), others suggest significant temperature fluctuations. We discuss possible sources of error and highlight three criteria that need to be considered to increase the quality and stability of future T2k reconstructions. Temperature proxy series are to be thoroughly validated with regards to (1) reproducibility, (2) seasonal stability, and (3) areal representativeness. The T2k represents key calibration data for climate models. The models need to first reproduce the reconstructed pre-industrial climate history before being validated and cleared for climate projections of the future. Precise attribution of modern warming to anthropogenic and natural causes will not be possible until T2k composites stabilize and are truly representative for a well-defined region and season. The discrepancies between the different T2k reconstructions directly translate into a major challenge with regards to the political interpretation of the climate change risk profile. As a rule of thumb, the larger/smaller the pre-industrial temperature changes, the higher/lower the natural contribution to the current warm period (CWP) will likely be, thus, reducing/increasing the CO2 climate sensitivity and the expected warming until 2100.
Reconstructing AD global temperatures to within a band of ±1 °C might be a fool’s errand.
However, by researching a single locality using tried and tested methods, it might be possible to obtain a reasonable assessment of the local area’s natural temperature variability.
The historical accounts would definitely support the existence of the Medieval Warm Period, and the Little Ice Age. I would trust historical accounts as to what was grown where, and when the growing season started and ended, more than individual proxies.
Real science depends on empirical observations, or as you say, “historical accounts”. Models without empirical confirmation are worthless – and I do not mean massaging the numbers.
Lake bottom pollen proxies are very good indicators of which direction the wind mostly blew during the two week flowering period….
I’m curious – what are all the assumptions that have to be made in using such a proxy?
no change in number/size of local plants? location of local plants? Amount of wind or lack of windstorms during that two week period? Size, shape and depth of lake? Amount of flow through the lake? Amount of precipitation and type of precipitation during that period? Local animals and humans?
Proxies always seem to me to carry such a huge margin of error as to be useless. But I’m willing to be convinced otherwise if you can show there are no extraneous factors influencing the proxy – a high bar it seems.
I like data that hits you between the eyes and leaves nothing to the imagination. It’s like the quote ascribed to Rutherford:
So, if you see evidence of significant sea level rise, that demonstrates that the global temperature was warm enough, for long enough, to melt some serious ice. It may not give you exact temperatures or dates, but it sure proves that our beloved Dr. Mann is full of … male bovine excrement.
Unstated are the basic requirements that fire/flood/mankind dramatically change the local plant cover. Not only annually, but also seasonally.
The proxies are so good that you can probably make out single weather events that look like an entire year’s worth of flowering season, with a little sand washed over them from an untimely rainfall.
Last time I worked with scientists who looked at lake sediments, they were unsure how to distinguish seasons and years from weather events.
How they would make out wind direction escapes me for now. Is this something to do with it’s apples in the North and bananas in the South?
I fully agree. To crop data I would add the growth and decline of glaciers as well as sea level as an indicator of long term global temperatures.
In the above graph, Buntgen shows the most variability. I went looking for the Buntgen paper and found “The influence of decision-making in tree ring-based climate reconstructions” link. Buntgen advocates an ensemble approach to tree ring temperature reconstructions. That sounds suspiciously like: Take a bunch of papers and average out the results. The basic question is whether tree rings are even a valid proxy. The paper makes no effort to validate tree rings vs. other proxies.
Buntgen talks about the role of volcanoes in producing large decreases in global temperature. When we look at recent volcanoes, we don’t see the kind of variation shown in Buntgen’s curve in the graph at the top of the page. We really haven’t had a lot of huge volcanic eruptions lately. link
Anyway, Buntgen’s work shows considerable natural variability and that’s interesting all by itself.
There’s no evidence that sea level fluctuated during MWP or LIA as that quote claims. It’s just a bogus post at a website by someone who doesn’t know what they are talking about. Be careful with your sources.
If you follow my link, you find a link that the author bases his statement on. That link cites a pile of reliable sources.
Can’t find any evidence for that amount of movement in those links cB, much as I’d like to have . . . . .
In particular, Hofstede 1991.
Jensen and Louters seem to be based on Hofstede.
Any climate reconstruction should be considered a failure if it doesn’t agree with documentary evidence of ancient climate (reports of growing seasons and crops for example).
You can’t even get people to agree the 70s were cooler
During the 70’s cold spells killed all of the trees in the orange groves around Orlando. That hasn’t happened again until recently.
@Mark: Certainly, the extreme cold of 1983 did what you say, except it was far from all the orange trees. A major reason for the disappearance was the conversion of land for building, which was much more profitable. Also, following that, a bacterial disease of citrus, IIRC imported from Brazil, finally did for the remainder. I had 13 trees survive the cold of 1983, but the disease finally killed off all the rest, the last being a lemon about 3 years ago.
Never simplify a complex cause, or you fall into the trap that climatologists seem to be prone to, as exemplified by the rather good article above!
The dead trees were replanted a few years later.
It may not have been all, but it was all that could be seen from I-4.
Proxy reconstructions of Earth’s temperature are worthless and pointless. History has shown that it is far too easy for unscrupulous individuals to carefully select proxies to achieve the desired result. However, physical evidence doesn’t lie. The physical evidence that today’s temperature is colder than it has been for most of the last 10,000 years is utterly overwhelming. As just one example of this, trees grew at higher latitudes and altitudes during the vast majority of the last 10,000 years.
Yes, the pathway to finding a proxy to suit your research objectives is to firstly determine what your end findings are going to be, then cast about through the arrays of proxies until you find the “Goldilocks” set.
And then weigh them so heavily in the statistics that they overwhelm all the other proxies. You know, the ones that weren’t so well-behaved.
Temperatures were much colder than today during the early Middle Ages (~450-950) and the 600-year Little Ice Age (~1250-1850), both periods of increased volcanic activity.
There are only three ls in my name.
My name has been constantly misspelled throughout my life and I find it irritating. The most amusing misspelling was Toyland.
Bill, how can a name as short as yours be misspelled so much?
I think that some people have never heard my name before. From memory, I have had the following spellings of my name:
How hard can it be? My name only has six letters.
At school, I even had a teacher explain to me that I had misspelled my own name.
I once had somebody try to send me an email using the name BillTolland. When he phoned me, I told him to remove one of the ls in my name. He was still unable to send me the email. I discovered that he had sent the email to BilTolland.
I hate it when people misspell my first name
I hear you. My maiden name was Burt, and you’d be surprised how many different ways that was spelled, particularly when heard over the phone. We got Bart, Bert, Birt, Bort, Birch, and Byrd. We all got in the habit of spelling it every time we gave it to someone.
Apologies if my previous comment added to that irritation, it was intended lightheartedly.
That’s ok, Tony. I found it funny.
“only three ls”
“Surely you jest” My bad!
You spelled Shirley wrong now also.
Billl? Or Tollland? 🙂
“carefully select proxies to achieve the desired result”
“As just one example of this, trees grew at higher latitudes and altitudes during the vast majority of the last 10,000 years”
That’s you selecting a proxy that you like, isn’t it?
If you paid me enough, I could probably collect, cherry-pick and “Mannipulate” enough tree-line data to make a hockey stick, using statistical methods that I don’t really understand (and which are inappropriate for the data in any case) but give me the answers I want. Once the answer is predetermined, the scientific method becomes no more than a cloak to hide the dubious methodology from uncritical readers.
PS I tend to agree with you that tree-lines would be a good proxy for historical temperatures, certainly better than tree-rings. But still, open to data-grooming in search of the “right” answer. For example, I’ve been in sheltered valleys in northern Canada where there are stands of spruce and poplar – hundreds of kilometres beyond the tree-line shown on maps. No doubt enough anomalies like that could be collected by unscrupulous researchers to thoroughly pollute a data set.
MODS – please tell me what word(s) set off the alarm on this comment, which I thought was quite benign in tone. So I can avoid moderator-purgatory in the future!
(Smart Rock…. he he he can’t find the problem) SUNMOD
I would bet it’s the automatic WP bad-word filter with the Sc^nthorpe problem: laT!Tude and alT!Tude.
And then there are the inconvenient archeological sites that appear as Andean and Alpine glaciers recede. Unless we assume prehistoric people dug down through the ice to build the sites, it has been warmer in the past.
The road to hell is paved with proxies.
It’s not only data selection but what baseline temps were at the time. It is fine to say that it looks like the temperature rose or fell by 0.2 degrees (LOL), but from what point? You have no idea what the absolute temperature was at the time. Somehow the use of anomalies has been translated into real, actual, absolute temperatures. As I see it, a 0.5 C rise from a baseline of 10 C is a lot different than a 0.5 C rise from 15 C. I would LOVE to see how these graphs separate if they were shown as absolute temps at the times shown.
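The anomaly-versus-absolute point above can be made concrete with a toy example. Both baselines below are hypothetical numbers chosen purely for illustration:

```python
# Two hypothetical reconstructions with identical anomaly series but
# different (and usually unreported) absolute baselines.
baseline_a = 10.0  # °C, hypothetical baseline of reconstruction A
baseline_b = 15.0  # °C, hypothetical baseline of reconstruction B
anomalies = [-0.3, -0.1, 0.0, 0.2, 0.5]  # shared anomaly series

absolute_a = [baseline_a + x for x in anomalies]
absolute_b = [baseline_b + x for x in anomalies]

# In anomaly space the two curves overlap perfectly; in absolute space
# they sit 5 °C apart at every point, a physically very different story.
```

Plotted in anomaly units the two series would look identical; plotted in absolute units they would separate by a constant 5 °C, which is exactly the separation the commenter would love to see.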
The global average temperature construct has programmed our brains into believing there must be a mechanism acting globally to drive a change.
The argument has boiled down to a global average CO2 increase (i.e. a human cause?), vs an unknown (natural?) causal mechanism acting globally.
This is a false framing of the issue. The proof is in the proxies. They vary wildly from place to place and time to time.
Our conceptualization of a global human or natural mechanism is invalidated. There is much more going on.
Michael “Piltdown” Mann still hasn’t been thrown in jail for his colossal scientific fraud ?
Trump beat Stormy Daniels in court. Mark Steyn scratches his head and checks his bank account. Not saying that Michael Mann is a fraud, but he is not to climate as Stormy is to sex. They’re both paid actors, but she actually performs once in a while.
I will say Mann is a fraud.
Buntgen et al shows natural variability that is reasonable for a change….interesting paper….however if you play “pick the eruption” with his graph, only Tambora really stands out, Pinatubo is lost in noise.
ALL VEI4 and greater volcanic eruptions decrease average anomalous global temperatures because of their injection of SO2 aerosols into the stratosphere.
For a VEI4, the decrease is ~0.2 deg. C., and more for larger eruptions.
Somewhere out there, in the vast black hole of deleted emails and other internet entities is the famous email stating “We have to get rid of the Medieval Warm Period”, sent to Dr. David Deming, supposing him to be “one of them”. The person most likely to have sent the email was NOAA’s Jonathan Overpeck, who in fact did state in ’98 that “the so-called Medieval Warm Period did not exist.” Then Mike Pants-On-Fire Mann came out with his hockey stick graph. It was a fait accompli, and a major assault on science and truth itself.
At the same source I found this interesting paper:
Antarctic Winds: Pacemaker of Global Warming, Global Cooling, and the Collapse of Civilizations by W. Jackson Davis and W. Barton Davis of UC Santa Cruz
In their conclusions they write: “Therefore, the counterfactual framework of causality supports the conclusion that the natural climate cycles identified here and in previous studies, the ACO/AAO [12,13] and the ACWO that drives it (this paper), are the primary cause of the contemporary global warming signal. Atmospheric CO2 may be responsible for a small and declining fraction of the current global warming through its known greenhouse effects, but we conclude that the primary cause is the natural climate cycles explored here. The proportion of anthropogenic and natural contributions to contemporary global warming remains to be established.”
AAO – Antarctic Oscillation
ACO – Antarctic Centennial Oscillation
ACWO – Antarctic Centennial Wind Oscillation
Climate | Free Full-Text | Antarctic Winds: Pacemaker of Global Warming, Global Cooling, and the Collapse of Civilizations | HTML (mdpi.com)
Great find Jeff. Impressive work by Davis & Davis. Has the stamp of certainty all over it.
There are over a thousand research papers; a summary of their results and the locations of the proxy research can be found here.
A typical example
Steig et al. 2013: Last 2000 years
Fegyveresi et al. 2016: Last 5500 years
Fudge et al. 2016: Last 31,000 years
Steig et al. 2013: DeltaO18
Fegyveresi et al. 2016: Bubble number densities in ice core as temperature proxy
Fudge et al. 2016: The temperature history (Figure 2a) is derived by combining the WDC water stable-isotope record [Steig et al., 2013; WAIS Divide Project Members, 2013, 2015] with information from borehole temperatures and nitrogen isotopes.
Steig et al. 2013: Longterm cooling over past 2000 years (oxygen values trend to more negative values). Higher frequency warming peak 950-1100 AD preceded and followed by colder temperatures may correspond to MWP warming.
Fegyveresi et al. 2016: Warm phase 650-1150 AD. Note that (black) peak warming data point at 1200 yrs BP (= 750 AD) was (possibly erroneously) excluded by authors as an outlier.
Fudge et al. 2016: Warm phase 500-1100 AD, superimposed on longterm cooling over past 2000 years
Lots of interesting stuff to look at.
Paleotemperature proxies need two things: good temperature accuracy and good temporal control. I used to work with a proxy that had excellent temperature accuracy but poor temporal control (noble gas solubility in groundwater). Oxygen isotopes from ice cores over the past few millennia have excellent temporal control, but their interpretation in terms of global temperatures is tricky. They are very useful, however, especially concerning the amount of ice locked in at the poles and the timing of major glaciation events. Dendrothermometers have excellent temporal control, but have not demonstrated any skill as thermometers. Other proxies like pollen in sediments and isotopes in marine creatures also have to be calibrated and one has to assume the calibration holds up over time and varying conditions.
It’s almost a Heisenberg uncertainty problem. One can have perfect accuracy or perfect time. Choose one. I hope there’s a proxy that improves on both (clumped isotopes?). Even if it’s local, if you get enough local results, maybe you can approach regional reconstructions, then enough regionals can make a global result.
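The closing idea, stacking enough local results into regional and then global estimates, can be sketched with inverse-variance weighting. The helper function and all three local values below are hypothetical, and note that the combined uncertainty is only honest if the local errors are truly independent, which shared calibration against the same instrumental era would undermine:

```python
import math

def combine_local_estimates(estimates):
    """Inverse-variance weighted mean of independent local temperature
    estimates, each given as (value, one-sigma uncertainty)."""
    weights = [1.0 / (s * s) for _, s in estimates]
    mean = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    sigma = math.sqrt(1.0 / sum(weights))  # valid only for independent errors
    return mean, sigma

# Three hypothetical local reconstructions of the same region and period:
local = [(14.2, 0.5), (13.8, 0.8), (14.5, 0.4)]
regional_mean, regional_sigma = combine_local_estimates(local)
```

The combined sigma comes out smaller than any individual one, which is the appeal of the local-to-regional route, but that shrinkage evaporates as soon as the locals share systematic errors.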
“…have to be calibrated and one has to assume the calibration holds up over time and varying conditions.”
Which inevitably they don’t, and proxy selection leads to hockey sticks when the calibration happens against the modern warming era, which is all the data we have. Temperature proxies have extremely large error margins, but those margins are never truly appreciated, expressed, or frankly even understood.
For tree rings, how do you express error against a factor like reduced sunlight due to neighbouring trees? Or reduced irrigation? Or disease? Or any number of factors that are completely unknown.
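The screening effect described above can be demonstrated with pure noise. The series counts, thresholds, and the ramp used as a stand-in for the instrumental record below are all arbitrary illustrative choices, not anyone’s actual methodology:

```python
import random

random.seed(0)
N_PROXIES, N_YEARS, CAL_YEARS = 500, 1000, 100

# A hypothetical modern "instrumental" warming ramp used for screening.
target = [0.01 * t for t in range(CAL_YEARS)]

def correlation(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def random_walk(n):
    # Pure noise: a random walk contains no climate signal whatsoever.
    out, x = [], 0.0
    for _ in range(n):
        x += random.gauss(0, 0.1)
        out.append(x)
    return out

proxies = [random_walk(N_YEARS) for _ in range(N_PROXIES)]

# Screen: keep only series that correlate with the modern ramp, flipping
# the sign of anti-correlated ones.
kept = []
for p in proxies:
    r = correlation(p[-CAL_YEARS:], target)
    if abs(r) > 0.5:
        kept.append(p if r > 0 else [-v for v in p])

# The survivors agree (by construction) during the calibration window,
# while their uncorrelated earlier wiggles largely cancel in the stack.
composite = [sum(p[t] for p in kept) / len(kept) for t in range(N_YEARS)]
```

Despite the input containing no temperature information at all, the composite rises through the calibration window, which is the sense in which screening against the modern era can manufacture a blade.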
Well, the noble gas temperature (NGT) paleotemperature proxy does not need calibrating (its one charm), but it is a bear to date the temperature signal. It also has a lesser known virtue in that it can make statements about water recharge conditions (drought, storminess, etc.) via the “excess air” component of dissolved noble gases.
As I said, dendro has not yet been shown to be capable of reliable temperature calibration. Its only virtue is that it is a good clock. There are many other proxies that still need calibration but are still superior in temperature performance to dendro.
All proxies have assumptions built into them. The NGT proxy is one I hadn’t come across until now, thanks.
The following dubious claim about NGT in the EPA “Noble Gas Temperature Proxy for Climate Change” document… seems wrong to me. How can the proportion of dissolved gases in the groundwater be independent of the amount of groundwater that was added in each season? I’m guessing they just let it all average out and hence ignore (statistically) the error margins.
It is true that NGTs have assumptions, and that there are different models used for inverting NG concentrations into temperatures that give slightly different results. However, there’s an interesting paper written by yours truly that illustrates that temperature variations within a time series are virtually model independent.
The NGT proxy uses the concentrations of 4 different noble gas isotopes that are derived from air and not from subsurface radioactive decay (typically Ne-20, Ar-36, Kr-84 and Xe-132) to estimate both the temperature dependent equilibrium solubility component acquired by groundwater at the time of recharge and an excess air component. You have to account for mechanically introduced noble gases that dissolve at depth from entrained bubbles (the so-called excess air component). The 4 noble gases have different temperature dependence of solubility and we know the composition of air. In the simplest “unfractionated air” (UA) model, you have 4 data points (the 4 isotopes) and 2 parameters (temperature and excess air), so you have 2 degrees of freedom. You also need to know the altitude of recharge, but this is usually well known for most aquifers. The air saturated water (ASW) component tells you the temperature, and that component is present whenever water is in contact with the atmosphere, regardless of the amount of water that enters the aquifer.
It turns out that Xe is the most temperature sensitive gas and Ne the least. Xe tells you the most about temperature and Ne tells you the most about excess air.
For aquifers during drought conditions, most recharge is due to rare large storms (e.g. hurricanes), which can lead to the entrainment of a lot of bubbles that dissolve at high hydrostatic pressures (i.e., there’s a lot of excess air). So the excess air component can become an average humidity proxy.
One more thing. The NGT proxy is really based on the temperature at the water table in the recharge zone. Water percolates slowly down to the water table and equilibrates at the temperature of the ground near the water table, often 10s of meters down. The mean annual ground temperature is usually the mean annual air temperature, so you average a temperature signal over a few years or so. There are exceptions to this assumption (which I was active in researching), where recharge is so fast that water does not have sufficient time to equilibrate at the ground temperature (e.g. volcanic islands like Maui). These are exotic cases and are not normally used for NGT temperature reconstructions.
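The unfractionated-air (UA) inversion described above, four measured gases fit with two free parameters, can be sketched numerically. All solubility coefficients and mole fractions below are invented placeholders, not real Henry’s-law data; only the structure (Xe most temperature sensitive, Ne least, plus a shared excess-air term) follows the description:

```python
import math

# Hypothetical solubility parameters: a (ccSTP/g), b (1/°C), and an
# approximate mole fraction in air. Invented for illustration only.
GASES = {
    "Ne-20":  (2.0e-7, 0.005, 1.6e-5),   # least temperature sensitive
    "Ar-36":  (1.2e-6, 0.015, 3.2e-5),
    "Kr-84":  (6.0e-8, 0.025, 6.5e-7),
    "Xe-132": (9.0e-9, 0.040, 2.3e-8),   # most temperature sensitive
}

def modeled(T, A):
    """UA model: equilibrium solubility component at recharge temperature
    T plus an excess-air component A scaled by each gas's air abundance."""
    return {g: a * math.exp(-b * T) + A * x for g, (a, b, x) in GASES.items()}

def invert(measured):
    """Brute-force least-squares fit for (T, A): 4 data points and 2 free
    parameters, leaving 2 degrees of freedom as described above."""
    best = (None, None, float("inf"))
    for ti in range(0, 301):          # 0.0 .. 30.0 °C in 0.1 °C steps
        T = ti / 10.0
        for ai in range(0, 101):      # 0 .. 0.005 in steps of 5e-5
            A = ai * 5e-5
            m = modeled(T, A)
            # Relative residuals so all four gases weigh comparably.
            sse = sum(((m[g] - measured[g]) / measured[g]) ** 2 for g in GASES)
            if sse < best[2]:
                best = (T, A, sse)
    return best[0], best[1]

# Synthetic "groundwater sample" recharged at 12.0 °C with excess air 0.002:
sample = modeled(12.0, 0.002)
T_fit, A_fit = invert(sample)
```

With these made-up coefficients the fit recovers the synthetic recharge temperature and excess air exactly, which is only meant to show the mechanics of the two-parameter inversion, not its real-world accuracy.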
“The mean annual ground temperature is usually the mean annual air temperature, so you average a temperature signal over a few years or so.”
And the current temperature of the water dictates how much of the noble gases remains dissolved, so how is there any history of temperature there at all?
+/- 0.5 C? I can’t measure the temperature of a room to better than that. How can we determine the “global temperature” of the ancient past to that degree?
There is probably almost always more than +/-0.5C degrees difference over different parts of any given room
Just try living as a Swiss farmer during the LIA. Suicide your best option. Forget all these temperature maniacs.
As McShane and Wyner (MW) stated more than 10 years ago,
“Second, we take the data as given and do not account for uncertainties, errors, and biases in selection, processing, in-filling, and smoothing of the data as well as the possibility that the data has been “snooped” (subconsciously or otherwise) based on key features of the first and last block. Since these features are so well known, there is absolutely no way to create a dataset in a “blind” fashion.”
And more specifically about Mann’s proxies (though in my opinion the same question can be asked of any other reconstruction):
“The process by which the complete set of 95/93 proxies is reduced to 59/57/55 is only suggestively described in an online supplement to Mann et al. (2008). As statisticians we can only be skeptical of such improvisation, especially since the instrumental calibration period contains very few independent degrees of freedom. Consequently, the application of ad hoc methods to screen and exclude data increases model uncertainty in ways that are unmeasurable and uncorrectable.”
From there (and many more details given in the MW paper) I think you can conclude that all these reconstructions are perfectly valid and correct (sic!), but continue to understate their uncertainty, which might render them completely useless for any climate-related question!
In other words, since “7 Major Temperature Reconstructions Find No Agreement” cannot be a correct answer, as the planet for sure had a single global temperature history in the past, there are either flaws in the reconstructions or bias in the proxy selection, or of course both.
The best reconstruction is obtained from the AMO index at the following link:
Problem 1. The primary graph in this essay has the Y-axis in temperature ANOMALY values. This choice is popular for climate research but scorned by classic scientists. It already covers up some of the errors.
I lack the means to make a graph in actual temperatures, but plead for someone to do this.
Only then can one see how the several guesses really look for comparison.
Problem 2. With comparison graphs like the one that features here, there is rarely any expression of uncertainty about the X-axis, Time. It is likely that there are errors of hundreds of years for some of the proxies, which makes a meal of trying to match peaks to each other and to events like volcanic eruptions.
Problem 3. There is fascinating material from Steve McIntyre about problems with the PAGES guesswork. He shows, for example, a number of individual proxy reconstructions, few of which have a hockey stick shape – but when combined into the composite, magically a hockey stick appears. In earlier eras of scientific history, such a circumstance would cause an outcry of the Piltdown Man type. Now it is hardly noticed by a community that has been educated with dumbed down science. My remedy would be Court case after Court case until proper conduct was resumed. Geoff S
PAGES 2019: 0-30N Proxies « Climate Audit
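Problem 2 above, the unstated X-axis dating uncertainty, can be illustrated with a small Monte Carlo. The peak parameters and the one-sigma dating error below are hypothetical magnitudes chosen for illustration:

```python
import math
import random

random.seed(1)
N_SERIES, N_YEARS = 30, 1000
PEAK_CENTER, PEAK_WIDTH, PEAK_HEIGHT = 500, 30, 1.0
DATING_SIGMA = 100  # hypothetical one-sigma dating error, in years

def gaussian_peak(t, center):
    return PEAK_HEIGHT * math.exp(-((t - center) ** 2) / (2 * PEAK_WIDTH ** 2))

# Every proxy records the SAME warm event at year 500, but each series
# carries an independent dating error before being stacked.
series = []
for _ in range(N_SERIES):
    shift = random.gauss(0, DATING_SIGMA)
    series.append([gaussian_peak(t, PEAK_CENTER + shift) for t in range(N_YEARS)])

composite = [sum(s[t] for s in series) / N_SERIES for t in range(N_YEARS)]
peak_in_composite = max(composite)
# The full-height event present in each individual record is smeared
# down to a broad, low bump in the stack.
```

A shared warm episode that every individual proxy records at full amplitude survives in the composite only as a flattened bump, so dating error alone can make a variable past look quiet.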
Not really important. It’s the future climate that matters, and we know from our models that will kill us all. We’re doomed.
Any reconstruction that shows this as the warmest period in the last 2000 or 8000 years is false
The problem with any study conducted after Hansen started pushing Global Warming is that the science is skewed. Anyone wanting federal research funds, or to get their PhD, knows the required results.
Of course there is no agreement.
We have a hard time agreeing on present temperature data based on extensive measurements and more or less sophisticated test equipment.
And still, sediments, ice cores, tree rings and so on are expected to provide sufficient precision…?
I cannot imagine anything beyond very vague trends being trustworthy.
Models for coupling proxies to temperature are based on some more or less sound theories, which of course are more or less correctly modelled. Then there are a lot of known unknowns, unknown unknowns….
“From No Tricks Zone” ?? About a paper published in the open access bargain basement trash bin section of the scientific literature ??
What more need be said. 99% of the “content” of the “No Tricks Zone” website is pure nonsense. There’s no reason to imagine that this should be any different.
MDPI expects reviews of articles to be returned to the journal’s editor within 7-10 days.
Make of that what you will.