Modern Ancient Temperatures

OK, no need to torture me, I confess it—I’m a data junkie.

And when I see a new (to me at least) high-resolution dataset, my knees get weak. Case in point? The temperature dataset of the Colle Gnifetti ice core. It has a two-year resolution thanks to some new techniques. Better, it stretches clear back to the year 800. And best, it extends up to near the present, 2006. This lets us compare it to modern datasets. The analysis of the ice core dataset is described in "Temperature and mineral dust variability recorded in two low-accumulation Alpine ice cores over the last millennium" by Pascal Bohleber et al.

Let me start with where Colle Gnifetti is located. Unusually for an ice core record, it's from Europe, specifically in the Alps on the border between Switzerland and Italy.

Figure 1. Location of the ice cores in the study.

This is good because some of the longest thermometer-based temperature records are in Europe.

One interesting thing about the site is that usually, ice core drilling occurs at the literal ends of the earth, in Antarctica and Greenland and the like. But this site is not far from the foot of the Margherita Hut, which is at over 4500 metres elevation.

Figure 2. The best advertisement ever for becoming a glaciologist.

Now, I wanted to see how well the ice core records matched up with the temperature records. So I calculated three records from the Berkeley Earth land-only dataset. (I used the land-only dataset because I’m comparing with a location on land. However, there is only a minimal difference from using their “land and ocean” dataset.)

The first record I calculated is the global temperature anomaly. Next is the northern hemisphere temperature anomaly. Finally, I looked at a roughly square box, 6° of longitude by 4° of latitude, around Colle Gnifetti itself.

Curiously, of these three the best match is with the northern hemisphere data. Figure 3 below shows the comparison. I've linearly adjusted the ice core data to give the best fit to the Berkeley Earth data.

Why linearly adjust it? Because the variance of a single ice core record at one location high on a mountain is different from the variance of, e.g., the northern hemisphere average land temperature. The adjustment lets us compare the records on the same scale.
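The variance-matching step described above can be sketched as an ordinary least-squares fit. This is a minimal illustration using synthetic stand-in series; the names `berkeley_nh` and `ice_core` are placeholders, not the actual data:

```python
import numpy as np

# Minimal sketch with synthetic stand-in series; the real inputs would be the
# overlapping annual anomalies from the ice core and from Berkeley Earth.
rng = np.random.default_rng(0)
berkeley_nh = rng.normal(0.0, 0.3, 100)                          # stand-in thermometer record
ice_core = (berkeley_nh + 0.2) / 1.6 + rng.normal(0, 0.05, 100)  # noisier, different scale

# Ordinary least squares: find a, b minimizing sum((a*ice + b - thermometer)^2).
a, b = np.polyfit(ice_core, berkeley_nh, 1)

# The rescaled ice core series, now variance-matched to the thermometer data.
ice_adjusted = a * ice_core + b
```

`np.polyfit` returns the slope and intercept of the least-squares line; applying them to the ice core series is an operation of the same "multiply by a, add b" form as the Figure 3 adjustment.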

So here is the comparison between thermometer data and the recent end of the Colle Gnifetti ice core data. Data sources are listed on the graph.

Figure 3. Berkeley Earth land-only temperatures and Colle Gnifetti ice core temperatures. Ice core temperatures have been linearly adjusted to give the best fit to the modern data, by multiplying them by about 1.6 and subtracting about 0.2°C. The background is the drilling hut at Colle Gnifetti.

Man, I love it when two totally separate datasets line up in such an excellent fashion. I'd say that the Colle Gnifetti ice core tracks the northern hemisphere temperature very well. The only real anomaly is recent, where the two diverge. No idea what that's about; the answer may lie in either dataset. But look at the excellent agreement of the large peaks and swings in the earlier part of the dataset.

So now that we have the paleo dataset aligned and variance-matched with the modern dataset, we can take a look at the ice core record of temperature variation over the entire time span of the data.

Figure 4. Colle Gnifetti ice core temperature, linearly adjusted to best fit the modern data as shown in Figure 3. The background is the drilling area; lower right is the drilling hut at Colle Gnifetti.

Now, this is a most fascinating temperature dataset. You can see the slow descent from about 1400-1500 into the Little Ice Age, bottoming out at around 1700.

The total range of the fast temperature swings is also quite interesting. For example, in the fifty years from 1190 to 1240, the temperature dropped by 2.3°C.

And the steepness of the natural warming trends is instructive. In 35 years, from 850 to 885, the temperature rose by 2.3°C. Zowie!

To take another look at the warming and the cooling, here’s a graph of the thirty-year trailing trends in the data.

Figure 5. 30-year trailing trends of the temperature record of the Colle Gnifetti ice core.

Current 30-year trailing temperature trends are on the order of 0.1–0.2°C/decade. But as you can see, this rate of warming is hardly unusual in the record. Indeed, twice in the period of record the trend has been four times as large as in the modern era. And the current rate of warming has been exceeded many other times in the past.
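A thirty-year trailing trend like the one plotted in Figure 5 can be computed as a rolling least-squares slope. A minimal sketch, using a synthetic ramp rather than the ice core values:

```python
import numpy as np

def trailing_trend(years, temps, window=30):
    """Slope of an OLS line over the trailing `window` years, in °C/decade,
    at every point that has a full window behind it."""
    out = []
    for i in range(window - 1, len(years)):
        x = years[i - window + 1:i + 1]
        y = temps[i - window + 1:i + 1]
        slope_per_year = np.polyfit(x, y, 1)[0]
        out.append(slope_per_year * 10.0)  # °C/year -> °C/decade
    return np.array(out)

# Sanity check on a pure ramp of 0.02 °C/year: every trailing trend is 0.2 °C/decade.
yrs = np.arange(1900, 2000)
temps = 0.02 * (yrs - 1900)
trends = trailing_trend(yrs, temps)
```

On real, noisy data the trailing trend wiggles rather than sitting at a constant; the point of the sketch is only the bookkeeping of the rolling window and the per-decade units.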

So … what kind of conclusions and questions can we draw from the data? Let me toss Figure 4 up again for reference.

Figure 4 (repeated).

First, what the heck caused the big swings in temperature? The earlier part of this paleo record shows swings that are both much larger and much faster than anything in modern times. Call me crazy, but on a planet with naturally occurring warmings of, e.g., over two degrees C in 35 years (from 850 to 885), I don't see how people can possibly rule out such natural swings when looking at the relatively featureless and certainly in no way anomalous modern record.

This brings up my recommendation for the field of climate science—stop projecting the future and start reflecting on the past. Until we can explain things like why temperatures crashed around the year 1200, falling by 2.3°C in a mere fifty years and then bouncing right back, we have NO BUSINESS MAKING PROJECTIONS.

But I digress … next, we see warming in the data starting at the end of the Little Ice Age, around the year 1700. It occurs in two waves. The first rise in temperature, from about 1700 to 1790, is both faster and larger than the succeeding rise, which was from about 1820 to the present.

In fact, the modern temperature rise, supposedly fueled by CO2, is the slowest rise of the various longer-term temperature increases in this paleo record. Every other temperature rise is steeper … say what? I thought CO2 was supposed to be driving faster warming, but out here in the real world, there’s slower warming.

Next, in this ice core record, the peak of the later “Medieval Warm Period” around 1190 is about the same temperature as at present. However, the earlier peak around 920 is about half a degree warmer than that. It seems that current temperatures are not as unusual as is often claimed.

Finally, the caveats. The main caveat is the underlying assumption of invariability—that ceteris paribus, the past is operating under the same rules as the present. 

For example, to linearly adjust the modern end of the ice core data to best fit the modern temperature data, you multiply it by about 1.6 and subtract about 0.2°C. The figure above assumes that the same relationship held in the past. This is a very reasonable assumption; we know no reason why it wouldn't be so … and yet …

Next caveat? It's only one study. I'd be happy to see more using the improved methods that give biennial resolution.

However, given those caveats, I find it a most instructive dataset.


Here on the Northern California coast, summer is in full swing. Many days, the inland valleys heat up. The heated air rises, pulling cool foggy air in from the cold nearby ocean. This cools the entire sea-facing side of the coastal mountain range, including our couple acres … so today it’s cool and foggy here.

I greatly enjoy the local symmetry. It gets hotter in one place … and it gets colder in another place. Lovely.

Figure 6. Satellite view of where I live, about 6 miles (10 km) inland from Bodega Bay, on the ocean-facing side near the top of the first big ridge in from the coast. Blue flag in the large patch of redwood forest marks the location of our house.

The layer of fog isn’t all that thick, typically maybe a couple thousand feet (600m). This leads to a curious acoustic phenomenon. Sounds down along the coast get “tunnel ducted” all the way up the hill. So even though the ocean is six miles (10km) away from our house as the crow flies, on certain foggy days we can hear the waves breaking on the shore. And sometimes, we can even hear the foghorn out at the end of the breakwater in Bodega Bay, my old commercial fishing home port, calling out its endless paean to the souls of those poor fisherwomen and men who never came back home to their loved ones …

Stay well, dear friends. Life is short, be sure to take the time to take the time.

w.

Further Reading: It’s instructive to compare the listed temperatures with the data in A Chronological Listing of Early Weather Events.

As Usual: I ask that when you comment you quote the exact words you are discussing, so that we can all follow the bouncing ball and avoid misunderstandings.

147 Comments
Moderately Cross of East Anglia
July 24, 2020 2:35 pm

Fascinating. We need to remember it is for those who claim some novel feature is operating today which is different from anything in the past to provide proper, testable, evidence of that claim and not merely assert something new is happening. So far their attempts have been woeful and choosing CO2 as the culprit truly pathetic when evidence that it drives the climate engine is entirely unconvincing and flatly contradicted by past climate changes.

Moderately Cross of East Anglia
Reply to  Willis Eschenbach
July 24, 2020 3:02 pm

Thank you too Willis – I wasn’t aware of the Jones “confession”. Alas today I doubt if anyone in the BBC or media would have the temerity to ask him such a question.

CheshirePete
Reply to  Moderately Cross of East Anglia
July 24, 2020 3:35 pm

The BBC are becoming more blatant in their reporting! It's about a year ago now that the supposed UK record was broken, at 38.7°C at Cambridge University. This site had previously been removed as it failed even the Met Office siting criteria, i.e. buildings had been constructed around the original site. For some strange reason it was reinstated. Perhaps they were getting tired of the Gravesend in Kent station constantly breaking records; that one has now been discontinued after its 20-year monitoring career! I digress! The BBC, on two occasions in the past few days, displayed a big 39°C on screen in their weather forecasts, reminding us of that temperature one year ago. Compared to our current summer, which is abysmal. And one of their presenters, Carol Kirkwood, a few days ago had the big 39 on screen and said, yes, the temperature was 38.7 but we've rounded it up to 39 … I was agog …

Reply to  CheshirePete
July 25, 2020 2:05 am

Next year they will adjust it to a nice round 40 because, well, it's easier to say. And the year after they will say, "remember a couple of years ago when the temperature was approaching 50°C?"

Another Ian
Reply to  Willis Eschenbach
July 24, 2020 3:21 pm

Willis

I just had a look at CCA. The update graphs aren’t showing for me

Reply to  Another Ian
July 24, 2020 6:30 pm

15 updates show for me.

Perhaps an issue local to you/your system

Jon-Anders Grannes
Reply to  Moderately Cross of East Anglia
July 25, 2020 12:42 am

CO2 has been chosen because:
“Affordable energy in ample quantities is the lifeblood of the industrial societies and a prerequisite for the economic development of the others.” — John P. Holdren, Science Adviser to President Obama. Published in Science 9 February 2001

https://www.forbes.com/2009/02/03/holdren-obama-science-opinions-contributors_0203_ronald_bailey.html#1abbec7c6db7

Their ideas and solutions can’t compete with today’s Western Civilizations so they have to destroy it?

Reply to  Moderately Cross of East Anglia
July 25, 2020 7:21 am

Willis and Moderately Cross: Looking at this data and graphs of the last 4 glacial cycles, it appears there are greater and more rapid temperature swings when there is overall cooling. Doesn't this imply that the ice age is forced as the global heat distribution becomes more unstable when temperatures drop, or is that a result of increasing temperature differences between the Poles and lower latitudes? Surely the position of the large land masses will perturb the general circulation of the oceans and atmosphere, therefore perturbing the distribution of heat?

JimG1
July 24, 2020 2:49 pm

“The total range of the fast temperature swings is also quite interesting. For example, in the fifty years from 1190 to 1240, the temperature dropped by 2.3°C.

And the steepness of the natural warming trends is instructive. In 35 years, from 850 to 885, the temperature rose by 2.3°C. Zowie!”

The older the ice core implied temperatures, the wilder the swings. This would lead one to potentially look for some mechanism in the derivation of temperature from ice cores driving that process. Seems that other old proxy data shows similar huge swings the older the data is.

MarkW
Reply to  JimG1
July 24, 2020 5:10 pm

If multiple proxy show wide temperature swings, isn’t it equally likely that there actually were wide temperature swings, rather than assuming that different proxies, from different locations, all show some unknown age related degradation?

JimG1
Reply to  MarkW
July 24, 2020 6:18 pm

Equally, yes, greater, no. Particularly, perhaps ice core proxies. https://www.astrobio.net/climate/ice-cores-may-not… Just one of several that argue adulteration of such samples.

Reply to  MarkW
July 24, 2020 9:46 pm

A logical conclusion might be that higher CO2 causes a moderation of temperature swings.
Warmer at night and in Winter, cooler during the day and in Summer.
Which seems to be what many of us find when we look carefully at the unadjusted temperature data.

Another notable conclusion seems to be related to the recent divergence seen in the first graph.
The hockey stickish spike is not matched in the ice core data set.
Interestingly, this spike does coincide rather closely with the advent of global warming theory and what many of us regard as unwarranted fiddling with the surface data sets in order to force the data into agreement with the CAGW “theory”.

Tony Heller’s graph which plots recent CO2 concentration vs the “adjustments” shows the adjustments to be an artifact of the same efforts evidenced in the first graph…to force compliance between increasing CO2 and an hypothesis which is otherwise completely unsupported by evidence.

Reply to  Nicholas McGinley
July 24, 2020 10:30 pm

Of course, “Warmer at night and in Winter, cooler during the day and in Summer” is also exactly what results from increasing levels of moisture in the air.
Water vapor is of course also a radiative gas, and the effect of moderating temperature swings is what is observed when the humidity of the air increases.
Also, this aspect of the effect of increasing moisture in the air holds true even in circumstances where the effect of changes of phase in the moisture do not take place.

Ian W
Reply to  Nicholas McGinley
July 25, 2020 7:50 am

Enthalpy is a forbidden word in climate ‘science’.
If the intent is actually to measure trapped ‘heat’ in the atmosphere then air temperature is the incorrect metric as it depends on both the energy and the enthalpy of the volume of air.
The correct metric is heat content in kilojoules per kilogram and to calculate it requires that the relative humidity is known.
Comparing temperatures alone is meaningless. An example – a volume of air in a Louisiana bayou misty after a shower at 100% humidity and air temperature 75F has twice the heat content of a similar volume of air in the Arizona desert at close to zero humidity and 100F.

Yet climate ‘scientists’ would call the Arizona air at 100F ‘hotter’ even though it has half the energy content of the 75F Louisiana air!!

With that level of imprecision why should anyone listen to climate ‘scientists’?

Indeed – what is the value of GISTemp or even the Central England Temperature series? They are nice to have but they are measuring the incorrect variable if the intent is to measure ‘trapped heat’ content.

So to answer Willis’ question – a relatively small change in average humidity would cause a 2C change in air temperature. Perhaps a change to a blocking weather pattern with latitudinal jet streams and/or a change in ocean SSTs leading to evaporation changes and more or less humidity. Not CO2.
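The Louisiana/Arizona comparison above can be checked with standard psychrometric formulas. A sketch, with the assumptions labeled: the Magnus approximation for saturation vapor pressure and the textbook moist-air enthalpy formula are my choices here, not from the comment:

```python
import math

def enthalpy_kj_per_kg(temp_c, rh_frac, pressure_hpa=1013.25):
    """Specific enthalpy of moist air, kJ per kg of dry air.
    Uses the Magnus approximation for saturation vapor pressure and the
    standard h = cp*T + w*(L + cpv*T) psychrometric formula."""
    es = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))  # saturation vapor pressure, hPa
    e = rh_frac * es                                           # actual vapor pressure, hPa
    w = 0.622 * e / (pressure_hpa - e)                         # mixing ratio, kg water / kg dry air
    return 1.006 * temp_c + w * (2501.0 + 1.86 * temp_c)

louisiana = enthalpy_kj_per_kg((75 - 32) * 5 / 9, 1.00)   # 75 °F, 100% RH
arizona   = enthalpy_kj_per_kg((100 - 32) * 5 / 9, 0.0)   # 100 °F, ~0% RH
```

With these formulas the saturated 75 °F air carries nearly twice the enthalpy of the hot dry air, consistent with the comment's figures.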

Reply to  Nicholas McGinley
July 25, 2020 8:55 pm

I was not suggesting that changes in CO2 have anything causal to do with changes in the average temp of the air over long spans or even in the recent past.
I do not even think CO2 as an explanation for the noted lessening of swings in temperature holds water. For one thing, the amount of time over which CO2 has been significantly elevated is only about 70 years.

When it comes to explanations for the short term and longer term variations noted on the headline post, postulating a change in SST and a subsequent change in humidity levels runs into a chicken and egg question.
It would seem to require that cooling of the GAT was caused by a warmer SST over the whole planet for a long enough time to make a difference.
Also, an increase in average humidity over the entire planet seems like it would cause an increase in cloudiness, and hence albedo, which would tend to counteract any warming which led to higher sea surface temps.
And then the question remains, what caused the SST to change to begin with?
Can SST change significantly with no change in TOA incoming solar?
Volcanoes are one possibility for a mechanism for that.
Cosmic rays another.
Large changes in ocean circulation patterns another.

One other thing I noticed, was that the ice core data in this study seems to show a brief cooling around the time of the Year Without A Summer.
Some have presented evidence that there was really no such thing.
For that matter, for years warmistas have argued there was no LIA to speak of.

We are never going to get anywhere with most of the so-called “experts” peeing in the science soup.

Jon-Anders Grannes
Reply to  JimG1
July 25, 2020 1:13 am

Volcanoes?

Rick C PE
July 24, 2020 2:57 pm

Willis: Very interesting as usual. It appears Figure 5 has the ordinate units for the trend as °C/century when the data is actually in °C/decade. Is the chart label incorrect?

Abolition Man
July 24, 2020 2:59 pm

Willis,
Thank you for another interesting and illuminating post! Have you considered doing another podcast with Anthony? The first was great but a little too short in my estimation.

Is the Happy Hooker still open or am I dating myself by even asking? I used to try and stop there whenever I headed up Highway One to visit friends in the Gualala area.

July 24, 2020 3:05 pm

At least, they didn’t try to hide the decline.

Reply to  Krishna Gans
July 24, 2020 7:10 pm

Willis is to blame. He forgot to hide the decline in figure 3. : )

Ron Long
July 24, 2020 3:07 pm

Nice catch on this ice core data, Willis. This look at natural variation matches the natural variation, expressed in sea-level variance, that geologists utilize to say there is no climate change signal detectable against natural variation. Stay sane and safe.

Rob_Dawg
July 24, 2020 3:10 pm

Willis,

you say:

> It has a two-year resolution thanks to some new techniques.

How do they know if a particularly warm summer didn’t melt off a dozen years of previous accumulation? I’m sure things like volcanic eruptions can lay down some specific markers but “two-year resolution?”

John VC
Reply to  Rob_Dawg
July 24, 2020 3:47 pm

Interestingly, they have been able to correlate lead (Pb) deposits with historical events in England. Seems that there are known upticks in lead smelting at certain times. I suppose that would be a good aid in resolution.

Reply to  Rob_Dawg
July 24, 2020 4:30 pm

In some Greenland ice cores which experience high snow accumulation, annual timing of isotopes and impurities can be resolved and used for annual layer counting.
Rasmussen, S., K. Andersen, A. Svenson, et al., A new Greenland ice core chronology for the last glacial termination, Journal of Geophysical Research: Atmospheres, Volume 111, Issue D6, 2006. https://doi.org/10.1029/2005JD006079

Kevin kilty
July 24, 2020 3:14 pm

One of the follow-on attempts at reproducing “the hockey stick” was that of Henry Pollack and coworkers at Michigan (Shen and Huang were among these people) from borehole records. I thought their methods contained an element of circular reasoning, but even more important the inverse method of estimating surface temperature from borehole logs has very poor resolution (someone like David Chapman would claim it has good resolution for temperature, i.e. plus or minus 0.5C) and the juxtaposition of anomalies of plus or minus 1.7C amplitude over 50 year spans such as Figure 4 shows from 1100 to 1300 AD, would end up being a flat line. Looking at Figure 4 shows why Pollack and his co-workers found a hockey stick. Their method could not resolve anything but a hockey stick and provides no information not already in the surface temperature records.

July 24, 2020 3:17 pm

“In fact, the modern temperature rise, supposedly fueled by CO2, is the slowest rise of the various longer-term temperature increases in this paleo record. Every other temperature rise is steeper … say what? I thought CO2 was supposed to be driving faster warming, but out here in the real world, there’s slower warming?”

Hey Willis, love your work.
A possible "CO2" explanation is that it's meant to be cooling naturally at present, but CO2 is causing the rise, hence the natural processes buffer the anthropogenic rise? If so, by how much?

But as you say, we need to better understand the past before we draw such conclusions.

July 24, 2020 3:20 pm

Thank you Willis, very interesting article. I am curious…what was the proxy that they used for temperature?

July 24, 2020 3:24 pm

Hottest summers in the last 2000 years were during Roman times

There's a reason the Romans wore togas.

A new study near Sicily shows the sea surface temperatures were a whole two degrees Celsius warmer then. The worst-case scenario of the Paris Agreement has already happened, and it was nearly 2,000 years ago. And instead of being a baked-earth apocalypse, the Roman empire flourished during the warmth and declined as it cooled.

Just in time, I would like to say 😀


July 24, 2020 3:29 pm

I have never been a fan of BEST slice-and-splice methodologies, based on what is happening in Fourier space. It doesn't make sense to throw away the low-frequency spectrum via a low-cut slice to get a "better" climate signal.

When we are looking at 1000 AD to 1700 AD, just how many individual temperature records are in the data? Can't be many. And how many times does BEST cut and splice them?

What if you compared the ice core with a couple individual long temperature records, such as CET?

Reply to  Willis Eschenbach
July 25, 2020 7:23 pm

A fair point, Willis. If you linked to your code, I missed it.
If you linked to the data, I have to redo your work rather than just add to it.
But a fair point.
Thanks for the info.

Reply to  Willis Eschenbach
July 25, 2020 6:59 pm

I am with you on the sawtooth problem. As I was in 2014:

https://wattsupwiththat.com/2014/06/28/problems-with-the-scalpel-method/

But the problem with the sawtooth is the LOSS of INFORMATION, particularly the amount of adjustment, and thus an estimate of the instrument drift. That loss of information is contained in the low frequency content discarded by the frequency high-pass, low-cut scalpel implementation.

Your thought experiment above fails on these points:
1. The amount of a bulk shift is an unknown. It is an uncertain number. Therefore, some of it must get mixed into the data.
2. The thing about Fourier analysis is that you cannot use high-frequency information content to predict the low-frequency content.
3. Consider a mix of short, medium, and long period (low frequency) sine waves. What if I split the temperature record somewhere so that each piece has only a portion of the low-frequency cycle?

That's the thing about the "climate signal": it is information content at very low frequencies. Once I cut a 100-year temperature record into 10-year splices, I have lost the real data on the 20-, 30-, 40-, and 50-year frequencies.

In seismic data processing, we work with band-pass data, 8–100 Hz depending upon the survey. Inversion is the technique of integrating over time to estimate total impedance of the rock column. But you can't do that with just band-pass data: high-frequency errors accumulate. So we use the stacking velocities to provide the low-frequency content that doesn't exist in the reflection data.

Trying to tease a warming or cooling of 0.1–0.2 deg C/decade out of a temperature record is a form of inversion. You integrate the temperature changes over time. But in the climate, the low-frequency content is in the temperature records and nowhere else. The scalpel throws the low frequencies away.

Where are those low frequencies retained to be included in the final product? In the regional homogenization? How? If the homogenization is built upon short snips?
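The loss described above can be demonstrated numerically. A toy illustration (not the actual BEST algorithm): build a record containing only a 50-year cycle, cut it into 10-year pieces, and re-anchor each piece at its own mean, as one must when the offsets between pieces are unknown. Most of the low-frequency variance disappears:

```python
import numpy as np

# A 100-year "record" containing only a 50-year cycle, i.e. pure low-frequency signal.
t = np.arange(100)
signal = np.sin(2 * np.pi * t / 50.0)

# The "scalpel": cut the record into 10-year segments; since the offsets
# between segments are treated as unknown, re-anchor each at its own mean.
pieces = [signal[i:i + 10] - signal[i:i + 10].mean() for i in range(0, 100, 10)]
reassembled = np.concatenate(pieces)

# The between-segment variance (where the 50-year cycle lives) is gone;
# only the small within-segment wiggles survive.
variance_kept = np.var(reassembled) / np.var(signal)
```

With these numbers the re-anchored record keeps only a small fraction of the original variance (well under a fifth); the 50-year cycle has mostly been converted into inter-segment offsets that the demeaning destroyed.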

Willis often talks about going back to the raw data. I agree we should examine data at least as a spot check to make sure conclusions are reasonable. With that in mind, I urge you to investigate the B.E.S.T. page for one of the few Greenland stations that provide data prior to 1940. It is

http://berkeleyearth.lbl.gov/stations/155343

SW. Greenland coastal fjord station, MITTARFIK ILULISSAT (JAKOBSHAVN LUFTHAVN)

The record runs from Jan 1866 to Oct 2013. Long records are precious to climate science. But look what BEST did to it!
There are no fewer than 30 empirical breakpoints in the 35-year period from 1903 to 1938!

From: https://wattsupwiththat.com/2019/01/13/greenland-near-surface-land-air-temperature-data-from-berkeley-earth-present-some-surprises/#comment-2589026

You can’t tease out a climate signal if you slice the record 30 times in a 35 year stretch.

Reply to  Willis Eschenbach
July 27, 2020 10:35 am

Willis, Thanks for the code.

What on Earth do you mean by, First off, there’s no “Fourier analysis”. ?

Any sequential signal (such as a time sequence) has a 1-to-1 correspondence with a frequency-phase-amplitude spectrum. Anything you do in the time domain has consequences in the Fourier domain. The lowest frequency contained in any signal is associated with the length of the signal. The scalpel cuts long signals into shorter ones, thus losing the lowest possible frequencies; it is a low-cut filter.

It is true that when you reassemble the fragments, the resulting conglomerate has low frequencies. But these are counterfeit. They come from the glue, not the fragments. Low-frequency information content cannot be improved by first discarding it.

The “saw-tooth” signal, where the scalpel turns instrument drift into climate signal is a direct result of the loss of low-frequency information content.

Thanks for the R code. I’m about to do some traveling, so it will be a few days before I can concentrate on it.

Reply to  Willis Eschenbach
July 27, 2020 11:52 am

Re the code: particularly the diffbox

I’m reaching back 40+ years to Linear Systems II and Hilbert functions.
The First Difference filter amplifies noise, more so at higher frequencies. So while the example above doesn't have high-frequency noise (well, the bulk shift has a high-freq tail), the sine waves themselves are noiseless.

Not so temperature records. They are noisy, and the noise is relatively high in the (high-freq) daily readings; the rounding to the nearest degree is part of that.

So I’m going to look at this issue with some super imposed noise.

https://dsp.stackexchange.com/questions/55361/proof-that-first-difference-filter-amplifies-noise
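The noise-amplification claim is easy to verify numerically: for white noise, first-differencing doubles the variance, and the filter's amplitude gain 2|sin(πf)| attenuates low frequencies while amplifying high ones. A toy check, unrelated to any particular temperature series:

```python
import numpy as np

rng = np.random.default_rng(42)
noise = rng.normal(0, 1, 100_000)  # white noise stand-in for measurement error

# First-difference filter: y[t] = x[t] - x[t-1]
diffed = np.diff(noise)

# For uncorrelated noise, var(x[t] - x[t-1]) = 2 * var(x).
ratio = np.var(diffed) / np.var(noise)

# Frequency response magnitude of the first difference, f in cycles/sample:
# small at low frequencies, rising to 2 at the Nyquist frequency (f = 0.5).
gain = lambda f: 2 * abs(np.sin(np.pi * f))
```

The variance ratio lands near 2, matching the filter's mean-square gain for white noise, while `gain` shows the low-pass/high-boost shape that makes differenced records look noisier than the originals.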

Reply to  Stephen Rasey
July 25, 2020 4:25 am

“I have never been a fan of BEST slice and splice methodologies based on what is happening in the Fourier space It doesn’t make sense to throw away the low frequency spectrum via a low cut slice to get a “better” climate signal.”

Except we don't do that, and the low frequency is not thrown away.
This is easy to test, since we do the series both with and without SLICING (there is no splicing).

The slicing is fully justified since the stations have changed

You take station A. It is at location x,y. Its identifier code is 1234567.

30 years later they MOVE THE STATION to location X2,Y2, BUT

THEY KEPT THE SAME IDENTIFIER CODE.

All we do is change the identifier code. 1234567b

There REALLY ARE TWO DIFFERENT TIME SERIES

one at location x,y
one at location X2,Y2

everyone else SPLICES THESE TOGETHER, we separate them in two series BECAUSE THEY ARE TWO DIFFERENT STATIONS..

Reply to  Steven Mosher
July 25, 2020 7:17 pm

BEST does not splice?
Then what do you call this?:
http://berkeleyearth.lbl.gov/stations/155343

SW. Greenland coastal fjord station, MITTARFIK ILULISSAT (JAKOBSHAVN LUFTHAVN)

The record runs from Jan 1866 to Oct 2013. Prior to 1950, it must be a precious remote long-term temperature station. BEST puts no fewer than 30 empirical breakpoints in the 35-year period from 1903 to 1938!
But it is all one record, now.

Tonyb
Editor
July 24, 2020 3:32 pm

Willis

Nice work.

Have you seen the latest sea level reconstruction posted at jonova?

http://joannenova.com.au/2020/07/hottest-summers-in-the-last-2000-years-were-during-roman-times/#comments

There appears to be some similarity to the glacier data, with the ups and downs of sea levels as ice melted, reformed, and thermal expansion kicked in. The geographic area is also similar. It also goes back that bit further, so we can see a continuous picture.

Tonyb

John Tillman
Reply to  Tonyb
July 24, 2020 3:50 pm

And the hottest summers of the past 3000 years were in the so-called Minoan Warm Period, and in the past 4000 years during the Egyptian Warm Period, and of the past 5000 to 8000 years in the Holocene Climatic Optimum.

The distressing trend remains cooling down as the Holocene wears on, just like all prior interglacials. Only if the Modern Warm Period, whether dated at 60 or 180 years long, ever gets hotter than the Medieval WP peak of some 1000 years ago will there be a plausible human signature in global average temperature.

Peter Wells
July 24, 2020 3:32 pm

Perhaps the explanation for some of the apparent recent anomalies in the lack of warming is the result of the Milankovitch Cycles beginning to affect us. Past ice core records tend to show a fairly rapid drop off in temperatures as the cold takes over.

Katie
Reply to  Peter Wells
July 24, 2020 6:56 pm

Good point, Peter – obliquity is decreasing – could it be?

Rud Istvan
July 24, 2020 3:37 pm

Willis, superb. Two thoughts.

First and most important, you shine a much longer-term light on the attribution problem (anthropogenic or natural causality), one of the most basic problems for climate models. As I previously posted here several times thanks to MIT Prof. Emeritus Lindzen, the instrument GAST rise from ~1920-1945 is statistically and visually indistinguishable from ~1975-2000. Yet even the IPCC says the former is mostly natural, since there simply was not enough rise in CO2 even per the models. (IIRC, AR4 WG1 SPM fig. 4; too lazy to look it up from previous posts.) You have vastly enhanced the attribution model problem argument by showing even greater past natural variability.

Second, an explanation for the post-~1975 instrument/ice core divergence. In 1975, the world population was 4.061 billion (sorta, per UN). In 2006 it was 6.594 billion. Now, separately, the UN says about 90% of the population is in the Northern hemisphere—because that's where most of the habitable land is, including all of China and India. So running the numbers, the NH gained 2.280 billion people in that time interval—over 50%. What the instrument record/new ice core divergence shows is just the urban heat island effect (UHI) on land surface instrumentation, something well documented many times in many places. I provided an irrefutable Japanese example (Tokyo and adjacent Hachijyo island prefecture) in the essay When Data Isn't in the ebook Blowing Smoke.

Regards

Geoff Sherrington
Reply to  Rud Istvan
July 24, 2020 10:57 pm

Yep, Rud,
A year back on WUWT I also drew attention to UHI. It needs better quantification before researchers attempt attributions of warming to natural or GHG causes. Geoff S

Henry Pool
Reply to  Geoff Sherrington
July 25, 2020 7:26 am

I have said this before: HadCRUT and BEST etc. are doing the assessment of the ‘world temperature’ all wrong.
Your sampling of stations must be
1) equal number, NH and SH
2) together, your stations must balance out to ca. zero latitude
3) 30% of stations must be inland, 70% must be near or at sea.
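For what it’s worth, the three rules above are easy to state as a programmatic check. A minimal sketch, using an illustrative made-up station list rather than any real network:

```python
# Hypothetical check of the three sampling rules above.
# The station list is illustrative only, not a real network.
stations = [
    # (latitude in degrees, is_inland)
    (45.0, True), (-45.0, False),
    (30.0, False), (-30.0, True),
    (10.0, False), (-10.0, False),
]

lats = [lat for lat, _ in stations]
nh = sum(1 for lat in lats if lat > 0)    # rule 1: NH station count
sh = sum(1 for lat in lats if lat < 0)    # rule 1: SH station count
mean_lat = sum(lats) / len(lats)          # rule 2: should be ~0
inland = sum(1 for _, i in stations if i) / len(stations)  # rule 3: ~30%

print(nh == sh, abs(mean_lat) < 1.0, round(inland, 2))
```

With this toy list the counts balance (3 NH, 3 SH), the mean latitude is exactly zero, and a third of the stations are inland.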

If you do it right you might actually get the right answer, like I did (in 2015):
https://documentcloud.adobe.com/link/track?uri=urn:aaid:scds:US:b94c721a-35ca-4b89-b44c-cc648b2d89b4

It is that easy – so why do they keep on doing it wrong?

Jeff Alberts
Reply to  Henry Pool
July 25, 2020 8:01 am

Actually, if you’re averaging any station with a different one, you’re doing it all wrong. Intensive properties.

John Tillman
July 24, 2020 3:44 pm

“I’d say that the Colle Gnifetti ice core tracks the northern hemisphere temperature very well. The only real anomaly is recent when the two diverge. No idea what that’s about.”

Really?

As Judith Curry, herself a BEST collaborator, said in 2011 after the alarmist team’s release of data tortured until confession: “This is ‘hide the decline’ stuff. Our data show the pause, just as the other sets of data do. Muller is hiding the decline.”

July 24, 2020 3:51 pm

The other relevance of Colle Gnifetti to AGW.

It’s where the industrial economy’s climate adventure was born… where it all began.

https://tambonthongchai.com/2018/10/23/a-natural-recovery-from-the-lia/

stinkerp
July 24, 2020 4:09 pm

“The only real anomaly is recent when the two diverge. No idea what that’s about.” The answer may be in either dataset.

Start with Berkeley Earth. It’s corrupted by a large amount of data from poorly-sited stations affected by the urban heat island effect. Compare 1979 to 2006 with UAH satellite temperature measurements and you will likely see a better fit.

Until they systematically analyze each station’s location and filter out the ones measuring local environmental changes like the urban heat island, the terrestrial data cannot be trusted as an accurate measure of global temperature. And before someone raises the point that they adjust the data to account for UHI, that arbitrary adjustment in no way makes the data more accurate. Just toss the poorly-sited stations and use only the data from the well-sited ones.

Reply to  stinkerp
July 25, 2020 7:08 pm


I’m with you. There are, as you say, poorly sited stations. But there is also what they did to GOOD stations.

http://berkeleyearth.lbl.gov/stations/155343

SW. Greenland coastal fjord station, MITTARFIK ILULISSAT (JAKOBSHAVN LUFTHAVN)

The record runs from January 1866 to October 2013. Long records are precious to climate science. But look what BEST did to it! There are no fewer than 30 empirical breakpoints in the 35-year period from 1903 to 1938!!!

This is a station on an Arctic Greenland fjord, and there is little reason to believe it will behave in synchrony with a sparse regional kriging network. Regardless, you cannot tease out any “climate signal” from temperature record snippets only a year or two long.

Michael Jankowski
July 24, 2020 4:19 pm

Don’t remember if it was PAGES2K or someone else, but they used something like a 50-year filter on the old data to smooth it out, then used decadal averages for the 20th century, and pretended it was an apples-to-apples comparison.

Editor
July 24, 2020 4:21 pm

Well done, Willis. Interesting. Thanks for the post.

Stay safe and healthy, all.
Bob

c1ue
July 24, 2020 4:35 pm

Cheers, Willis, for another data point that helps understanding of natural variability.
Can you put up an update on COVID-19 stats? I am curious as to whether the “surge” in reported cases has affected the downward trends that were showing in all the hotspots. Or not.

July 24, 2020 4:49 pm

Interesting article, Willis, but you lost me on two issues:

1) You state “. . . Colle Gnifetti ice core. It has a two-year resolution thanks to some new techniques.” However, the Figure 3 graph that compares the ice core data (red line), linearly adjusted for comparison, indicates it has less temporal resolution than the Berkeley Earth NH land-only temperatures (yellow line), and it certainly does not appear to represent data obtained every two (or even every five) years. Why is there such a slow temporal response in the red line . . . has it been smoothed by some sort of running average?

2) In the first paragraph underneath the caption for Figure 3 you state “I’d say that the Colle Gnifetti ice core tracks the northern hemisphere temperature very well.” That, to me, does not appear to be the case. My eyeball estimate is that the red line and yellow line have about +/- 0.25 C variation, on average, from each other at any given point in time. And in particular, from 1750 to about 1780, from 1830 to 1840, and around 1880, the red and yellow temperature lines disagree with each other by 0.5 to almost 1.0 C. Given that the total range of the plotted temperature anomaly data is about -1.5 C to +1.0 C (a span of 2.5 C), the typical discrepancy between the yellow and red lines is therefore about +/- 10% of the min/max data span, and the worst-case disagreements (excluding data more recent than 1990) would be in the range of 20 – 40% of the min/max data span.
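The percentage figures in point 2 follow directly from the eyeball estimates quoted there. A quick sketch, with all inputs read off Figure 3 by the commenter and not re-measured here:

```python
# Reproduce the commenter's percentage figures from their eyeball estimates.
span = 1.0 - (-1.5)            # plotted anomaly range, deg C (2.5 C total)
typical = 0.25                 # typical red/yellow disagreement, deg C
worst_lo, worst_hi = 0.5, 1.0  # worst-case disagreements, deg C

typical_pct = 100 * typical / span     # typical, as % of the span
worst_pct_lo = 100 * worst_lo / span   # worst-case lower bound, %
worst_pct_hi = 100 * worst_hi / span   # worst-case upper bound, %

print(typical_pct, worst_pct_lo, worst_pct_hi)  # 10.0 20.0 40.0
```

So the quoted ~10% typical and 20 – 40% worst-case figures are internally consistent with the stated span.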

A clarification would be appreciated.

Robert Davis
July 24, 2020 5:01 pm

I love that you find it interesting, Willis, that it is hot in one place and cold in another. Try this one on for size: we live inside a nonlinear, chaotic thermodynamic heat engine.

Reply to  Robert Davis
July 26, 2020 2:13 am

Fractality is evident in a curious observation about Willis’ two graphs, the reconstructions going back 250 years in Fig. 3 and 1200 years in Fig. 4. They both look the same: over the whole time period in each case, there is wider fluctuation at early times and smaller variation later. This is a hallmark of fractality – the same shape emerging on different spatial or temporal scales.
