This story broke while Anthony and I were eclipse hunting. ~ctm
By Jennifer on August 21, 2017 in Information
AFTER deconstructing 2,000-year-old proxy-temperature series back to their most basic components, and then rebuilding them using the latest big data techniques, John Abbot and I show what global temperatures might have done in the absence of an industrial revolution. The results from this novel technique, just published in GeoResJ [1], accord with climate sensitivity estimates from experimental spectroscopy but are at odds with output from General Circulation Models.
According to mainstream climate science, most of the recent global warming is our fault – caused by human emissions of carbon dioxide. The rationale for this is a speculative theory about the absorption and emission of infrared radiation by carbon dioxide that dates back to 1896. It’s not disputed that carbon dioxide absorbs infrared radiation; what is uncertain is the sensitivity of the climate to increasing atmospheric concentrations.
This sensitivity may have been grossly overestimated by Svante Arrhenius more than 120 years ago, with these overestimations persisting in the computer-simulation models that underpin modern climate science [2]. We just don’t know; in part because the key experiments have never been undertaken [2].
What I do have are whizz-bang gaming computers that can run artificial neural networks (ANN), which are a form of machine learning: think big data and artificial intelligence.
My colleague, Dr John Abbot, has been using this technology for over a decade to forecast the likely direction of a particular stock on the share market – for tomorrow.
Since 2011, I’ve been working with him to use this same technology for rainfall forecasting – for the next month and season [4,5,6]. And we now have a bunch of papers in international climate science journals on the application of this technique, showing it’s more skilful than the Australian Bureau of Meteorology’s General Circulation Models for forecasting monthly rainfall.
During the past year, we’ve extended this work to build models to forecast what temperatures would have been in the absence of human emissions of carbon dioxide – for the last hundred years.
We figured that if we could apply the latest data mining techniques to mimic natural cycles of warming and cooling – specifically, to forecast twentieth-century temperatures in the absence of an industrial revolution – then the difference between the temperature profile forecast by the models and actual temperatures would give an estimate of the human contribution from industrialisation.
Firstly, we deconstruct a few of the longer temperature records: proxy records that had already been published in the mainstream climate science literature.
These records are based on things like tree rings and coral cores which can provide an indirect measure of past temperatures. Most of these records show cycles of warming and cooling that fluctuated within a band of approximately 2°C.
For example, there are multiple lines of evidence indicating it was about a degree warmer across western Europe during a period known as the Medieval Warm Period (MWP). Indeed, there are oodles of published technical papers based on proxy records that provide a relatively warm temperature profile for this period [7], corresponding with the building of cathedrals across England, and before the Little Ice Age when it was too cold for the inhabitation of Greenland.
I date the MWP from AD 986 when the Vikings settled southern Greenland, until 1234 when a particularly harsh winter took out the last of the olive trees growing in Germany. I date the end of the Little Ice Age as 1826, when Upernavik in northwest Greenland was again inhabitable – after a period of 592 years.
The modern inhabitation of Upernavik also corresponds with the beginning of the industrial age. For example, it was on 15 September 1830 that the first coal-fired train arrived in Liverpool from Manchester, which some claim as the beginning of the modern era of fast, long-distance, fossil-fuelled transport for the masses.
So, the end of the Little Ice Age corresponds with the beginning of industrialisation. But did industrialisation cause the global warming?
In our just published paper in GeoResJ, we make the assumption that an artificial neural network (ANN) trained on proxy temperature data up until 1830 would be able to forecast the combined effect of natural climate cycles through the twentieth century.
We deconstructed six proxy series from different regions, with the Northern Hemisphere composite discussed here. This temperature series begins in 50 AD, ends in the year 2000, and is derived from studies of pollen, lake sediments, stalagmites and boreholes. Typical of most such proxy temperature series, when charted this series zigzags up and down within a band of perhaps 0.4°C on a short time scale of perhaps 60 years. Over the longer, nearly 2,000-year period of the record, it shows a rising trend which peaks in 1200 AD before trending down to 1650 AD, and then rising to about 1980 – then dipping to the year 2000, as shown in Figure 12 of our new paper in GeoResJ.

Proxy temperature record (blue) and ANN projection (orange) based on input from spectral analysis for this Northern Hemisphere multiproxy. The ANN was trained for the period 50 to 1830; test period was 1830 to 2000.
The decline at the end of the record is typical of many such proxy-temperature reconstructions and is known within the technical literature as “the divergence problem”. To be clear, while the thermometer and satellite-based temperature records generally show a temperature increase through the twentieth century, the proxy record, which is used to describe temperature change over the last 2,000 years – a period that predates thermometers and satellites – generally dips from 1980, at least for Northern Hemisphere locations, as shown in Figure 12. This is particularly the case with tree-ring records. Rather than address this issue, key climate scientists have been known to graft instrumental temperature series onto the proxy record from 1980 to literally ‘hide the decline’ [8].
We took the proxy record from the Northern Hemisphere composite, decomposed it through signal analysis, and then used the resulting component sine waves as input into an ANN to generate a forecast for the period from 1830 to 2000.
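In outline, that procedure can be sketched in code. The following is a minimal illustration only – the file name, the number of retained frequencies and the network architecture are placeholders for this sketch, not the configuration actually used in the GeoResJ paper:

```python
# Minimal sketch of the described workflow (not the authors' code):
# decompose the pre-1830 proxy record into its dominant sinusoidal
# components, then train a small neural network on those components
# and project the "natural cycles only" signal through the test period.
import numpy as np
from sklearn.neural_network import MLPRegressor

years = np.arange(50, 2001)                  # AD 50 to 2000, yearly resolution (assumed)
proxy = np.loadtxt("nh_composite.txt")       # hypothetical file: one anomaly per year

train, test = years <= 1830, years > 1830    # train pre-industrial, forecast 1830-2000

# Spectral analysis of the training window: keep the strongest frequencies.
detrended = proxy[train] - proxy[train].mean()
spectrum = np.fft.rfft(detrended)
freqs = np.fft.rfftfreq(detrended.size, d=1.0)        # cycles per year
dominant = np.argsort(np.abs(spectrum))[::-1][:8]     # 8 components, chosen arbitrarily

def sine_features(t):
    """Sine/cosine values of the dominant components at the given years."""
    cols = []
    for k in dominant:
        cols += [np.sin(2 * np.pi * freqs[k] * t), np.cos(2 * np.pi * freqs[k] * t)]
    return np.column_stack(cols)

ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
ann.fit(sine_features(years[train]), proxy[train])

natural_forecast = ann.predict(sine_features(years[test]))   # cf. Figures 12 and 13
```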
Figure 13 from our new paper in GeoResJ shows the extent of the match between the proxy-temperature record (blue line) and our ANN forecast (orange dashed line) from 1880 to 2000. Both the proxy record and our ANN forecast (trained on data that predates the Industrial Revolution) show a general increase in temperatures to 1980, and then a decline.

Proxy temperature record (blue) and ANN projection (orange) for a component of the test period, 1880 to 2000.
The average divergence between the proxy temperature record from this Northern Hemisphere composite and the ANN projection for the period 1880 to 2000 is just 0.09 °C. This suggests that even if there had been no industrial revolution and burning of fossil fuels, there would still have been some warming through the twentieth century – to at least 1980.
Considering the results from all six geographic regions as reported in our new paper, output from the ANN models suggests that warming from natural climate cycles over the twentieth century would be in the order of 0.6 to 1 °C, depending on the geographical location. The difference between output from the ANN models and the proxy records is at most 0.2 °C; this was the situation for the studies from Switzerland and New Zealand. So, we suggest that at most, the contribution of industrialisation to warming over the twentieth century would be in the order of 0.2 °C.
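For the Northern Hemisphere composite discussed above, the headline numbers can be reproduced in principle from the two curves: the “average divergence” reads like a mean absolute difference between proxy and projection over the test window, and the implied industrial-era contribution is the residual warming that the natural-cycle forecast does not account for. Continuing the illustrative arrays from the sketch above (again, an assumption-laden sketch, not the paper’s actual calculation):

```python
# Continuing the hypothetical arrays above (illustrative only, not the paper's code):
# compare proxy and ANN projection over 1880-2000 and express the residual
# warming as an implied contribution from industrialisation.
window = (years >= 1880) & (years <= 2000)
observed = proxy[window]
natural = ann.predict(sine_features(years[window]))

mean_divergence = np.mean(np.abs(observed - natural))   # reported as ~0.09 °C for this composite
natural_warming = natural[-1] - natural[0]              # warming from natural cycles alone
implied_human = (observed[-1] - observed[0]) - natural_warming   # paper argues at most ~0.2 °C

print(f"mean divergence:       {mean_divergence:.2f} °C")
print(f"implied human warming: {implied_human:.2f} °C")
```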
The Intergovernmental Panel on Climate Change (IPCC) estimates warming of approximately 1 °C, but attributes this all to industrialisation.
The IPCC comes up with a very different assessment because they essentially remodel the proxy temperature series, before comparing them with output from General Circulation Models. For example, the last IPCC Assessment report concluded that,
“In the northern hemisphere, 1983-2012 was likely the warmest 30-year period of the last 1,400 years.”
If we go back 1,400 years, we have a period in Europe immediately following the fall of the Roman empire, and predating the MWP. So, clearly the IPCC denies that the MWP was as warm as current temperatures.
This is the official consensus science: that temperatures were flat for 1,300 years and then suddenly kick up sometime after 1830, and certainly after 1880 – with no decline at 1980.
To be clear, while mainstream climate science is replete with published proxy temperature studies showing that temperatures have cycled up and down over the last 2,000 years – spiking during the Medieval Warm Period and then again recently to about 1980 as shown in Figure 12 – the official IPCC reconstructions (which underpin the Paris Accord) deny such cycles. Through this denial, leaders from within this much-revered community can claim that there is something unusual about current temperatures: that we have catastrophic global warming from industrialisation.
To read the full article go here:
HT/Doug
What year does the data cut off? What about the criticism Zeke had on Twitter, where he added the recent warming, which reached 1.6 degrees higher than the first graph? He later made a correction, since the reconstruction did not use tree-ring data.
https://twitter.com/hausfath/status/900105322169118720
The ANN missed the recent warming, as did the proxy data. The ANN looked at what? Who told the ANN to look at the non-proxy temperature data?
The ANN was only run to the year 2000. You can’t “miss” what you don’t cover.
And what recent warming are you talking about? Even NASA and NOAA show no statistically significant warming since 1998…so why would any proxies?
https://wattsupwiththat.com/2017/05/01/global-temperatures-plunge-in-april-the-pause-returns/
For reference
As shown in the last plot of the article.
Most of the proxy record has a piss-poor resolution of 100 to 300 years. Decadal flux is not registered, let alone El Niño peaks.
Jennifer Marohasy wrote:
“Proxy temperature record (blue) and ANN projection (orange) based on input from spectral analysis for this Northern Hemisphere multiproxy. The ANN was trained for the period 50 to 1830; test period was 1830 to 2000.”
An artificial neural network with enough complexity can be fit to reproduce almost any given dataset. Can the S&P500 in 2020, 2025 or 2035 be predicted from the last century of stock prices? For example, the state of the future economy will depend on how much money we waste reducing carbon emissions and how serious the next market crash will be. So what Jennifer is saying is the lousy proxy data for the period 50-1830 contains information that is capable of predicting what global temperature would have done for 1830 to present and into the future. Regardless of what happened to anthropogenic aerosols – or even volcanic aerosols. Regardless of what happened to GHGs. Regardless of whether or not there is another Maunder Minimum or Grand Solar Maximum – unless the ANN has managed to “learn” about the sun’s behavior.
What happens if one changes the structure of the ANN? What happens if the ANN is trained on the more reliable proxy data from about 600 or 1000 to 1850 or 1900 or even 1950 (before aGHGs had a major impact)? The ANN will make different hindcasts and forecasts. How do we know which model is right? We don’t.
ANNs merely find patterns in the chaotic fluctuations in imperfect RECONSTRUCTIONS of past climate (which may or may not contain signals useful for predicting the future) and project them into the future without any understanding of the fundamental physics driving climate change.
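Frank’s point that a sufficiently complex network can be fit to almost anything is easy to demonstrate on synthetic data; the following is a rough sketch with arbitrary settings, unrelated to the paper’s data or model:

```python
# Rough illustration of the overfitting concern (synthetic data, arbitrary
# settings): a high-capacity network can track a signal-free random walk
# closely over its training window, but that tight in-sample fit says
# nothing about its ability to forecast the held-out continuation.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)[:, None]        # scaled "time" axis
walk = np.cumsum(rng.normal(size=2000))         # a random walk: nothing predictable in it

split = 1800                                    # hold out the last 200 steps
net = MLPRegressor(hidden_layer_sizes=(100, 100), max_iter=10000, random_state=0)
net.fit(t[:split], walk[:split])

in_mae = np.mean(np.abs(net.predict(t[:split]) - walk[:split]))
out_mae = np.mean(np.abs(net.predict(t[split:]) - walk[split:]))
naive_mae = np.mean(np.abs(walk[split - 1] - walk[split:]))   # "no skill" persistence forecast

# On a signal-free series, neither forecast has genuine skill: the in-sample
# fit tells you nothing about the future.
print(f"in-sample MAE:     {in_mae:.2f}")
print(f"out-of-sample MAE: {out_mae:.2f}   (persistence baseline: {naive_mae:.2f})")
```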
“So what Jennifer is saying is the lousy proxy data for the period 50-1830 contains information that is capable of predicting what global temperature would have done for 1830 to present and into the future.”
I missed it. Where does she say that?
Mark: I cited a quote directly from Jennifer’s blog.
Frank
Mark: I cited a quote directly from Jennifer’s blog.
Presumably that was a comment from a forum contributor. I really can’t imagine JM would state: “So what Jennifer is saying is the lousy proxy data for the period 50-1830 contains information that is capable of predicting what global temperature would have done for 1830 to present and into the future.”
If that is the case, attributing that statement to JM herself is deceitful, misleading and dishonest.
What would the ANN make of this imperfect data?
http://www.geocraft.com/WVFossils/PageMill_Images/Temp_0-400k_yrs.gif
We say, that’s easy. That the planet operates on about 100,000-year timescales does not preclude it from also operating, at the same time, on shorter timescales.
We know the hockey stick curve was produced by cherry-picking data and applying, badly and obscurely, the statistics of principal component analysis. I.e. a big black box said “Hockey Stick” and that is the end of the story. Or it would have been, were it not for the dogged persistence of Steve McIntyre. The same mindset carried on up to 2017, with D’Arrigo’s cherry picking winkled out again by McIntyre: see https://climateaudit.org/2017/07/11/pages2017-new-cherry-pie/#more-23273
On the Skeptic side of the debate we pride ourselves on holding our science to higher standards of integrity than the other lot.
So I am a little bit worried that
1) Jennifer Marohasy selected only 6 proxies that all show the MWP/LIA characteristics. This obliges her to defend in advance against an accusation of cherry picking.
2) The Neural Network stuff is another big black box and to most observers we have no idea whether it is gold dust or Mannian bias.
It would be so sad if a McIntyre appeared on the AGW side to blow Jennifer out of the water.
Alastair Gray: 1) Jennifer Marohasy selected only 6 proxies that all show the MWP/LIA characteristics. This obliges her to defend in advance against an accusation of cherry picking.
2) The Neural Network stuff is another big black box and to most observers we have no idea whether it is gold dust or Mannian bias.
3) functions chosen to model temperature: sines and cosines, orthogonal polynomials, wavelets, etc.
No matter what proxies you choose, functions you choose, estimation procedure you choose (lasso, least-squares with cross-validation, CNNs), a major problem is that there are not enough observations relative to (or informative about) the oscillation with an apparent period near 1000 years and no well-established mechanism to explain it. From a little before the MWP to now there is only 1 full period; for earlier peaks, the data are extremely sparse. If the temperature rise since 1885 or so is entirely driven by CO2 increase (I am not saying it is!) then we do not have even 1 full period of the hypothetical oscillation.
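One way to see the data-sparsity problem Matthew describes is with a toy least-squares fit: generate roughly one cycle of a synthetic 1,000-year oscillation buried in proxy-like noise, then fit sine/cosine pairs at several assumed periods. The fits are hard to tell apart, so the period (and hence any extrapolation) is poorly constrained. A rough sketch with assumed, synthetic values – nothing here is taken from the paper or the cited reconstructions:

```python
# Toy illustration of the sparse-data problem: with only ~1 cycle of a
# 1,000-year oscillation in noisy data, several assumed periods fit the
# record almost equally well, so the oscillation is poorly constrained.
# Synthetic data and arbitrary parameters throughout.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1000, 2001, dtype=float)                 # roughly one full cycle of record
signal = 0.3 * np.sin(2 * np.pi * years / 1000.0)          # hypothetical 1,000-year cycle
series = signal + rng.normal(scale=0.5, size=years.size)   # buried in proxy-like noise

for period in (800.0, 1000.0, 1200.0):                     # candidate periods to assume
    X = np.column_stack([np.sin(2 * np.pi * years / period),
                         np.cos(2 * np.pi * years / period),
                         np.ones_like(years)])
    coef, *_ = np.linalg.lstsq(X, series, rcond=None)
    rms = np.sqrt(np.mean((series - X @ coef) ** 2))
    print(f"assumed period {period:4.0f} yr: amplitude {np.hypot(coef[0], coef[1]):.2f}, "
          f"residual RMS {rms:.2f}")                        # residuals barely differ
```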
Great article, thanks. Of course, it won’t fly for those who think that because we can accurately predict eclipses, we can accurately predict climate…but we must keep trying.
BTW, ran into this recently, might be nice read for those who like historical proxies:
http://www.gutenberg.org/files/55375/55375-h/55375-h.htm
FAMOUS FROSTS AND FROST FAIRS IN GREAT BRITAIN.
Chronicled from the Earliest to the Present Time.
Might be interesting to compare to the MET’s “official” record…
Not could be, but demonstrably is.
Any human influence is negligible, and we don’t even know the sign, i.e. whether net cooling or warming. The latter is essentially all urban and other heat island effects, not the GHG effect. The issue is whether there are enough local heat islands to add up to a detectable global effect. If there be any, it’s only because of disproportionately weighting those areas in the global “average”.
Gloateus
My understanding is that more of the planet lacks measurement sites than has them, and that many of the measurement stations are subject to an urban heat island effect. I also believe Judith Curry asserts that the UHI effect is negligible provided anomalies are used rather than temperatures.
As a complete scientific thicko, I find that statement puzzling when over half the planet can’t be measured, and yet UHI-infected sites are used to judge vast areas which have neither UHIs nor recording stations.
We cannot possibly apply infected data to the rest of the world, homogenised or otherwise. It is pure guesswork disguised as scientific certainty.
Lastly, there should be error bars in the graphs, for the data and for the models.
“while Anthony and I were eclipse hunting” — where did you find it? I was in Casper, WY.