OK, no need to torture me, I confess it—I’m a data junkie.
And when I see a new (to me at least) high-resolution dataset, my knees get weak. Case in point? The temperature dataset of the Colle Gnifetti ice core. It has a two-year resolution thanks to some new techniques. Better, it stretches clear back to the year 800. And best, it extends up to near the present, 2006. This lets us compare it to modern datasets. The analysis of the ice core dataset is described in Temperature and mineral dust variability recorded in two low-accumulation Alpine ice cores over the last millennium by Pascal Bohleber et al.
Let me start with where Colle Gnifetti is located. Unusual among ice core records, it’s from Europe, specifically in the Alps on the border of Switzerland and Italy.

Figure 1. Location of the ice cores in the study.
This is good because some of the longest thermometer-based temperature records are in Europe.
One interesting thing about the site is that usually, ice core drilling occurs at the literal ends of the earth, in Antarctica and Greenland and the like. But this site is only a short distance from the Margherita Hut, which sits at over 4,500 metres elevation.



Figure 2. The best advertisement ever for becoming a glaciologist.
Now, I wanted to see how well the ice core records matched up with the temperature records. So I calculated three records from the Berkeley Earth land-only dataset. (I used the land-only dataset because I’m comparing with a location on land. However, there is only a minimal difference from using their “land and ocean” dataset.)
The first record I calculated is the global temperature anomaly. Next is the northern hemisphere temperature anomaly. Finally, I looked at a 6° longitude by 4° latitude roughly square box around Colle Gnifetti itself.
Curiously, of these three the best match is with the northern hemisphere data. Figure 3 below shows the comparison. I've linearly adjusted the ice core data to give the best fit to the Berkeley Earth data.
Why linearly adjust it? Because the variance of a single ice core record at one location high on a mountain is different from the variance of, say, the northern hemisphere average land temperature. The adjustment lets us compare the records on the same scale.
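For those who want to see the mechanics, here's a minimal sketch of that kind of linear fit in Python; it's an illustration only, with made-up numbers, not the code actually used for this post.

```python
# Minimal sketch (not the actual analysis code): rescale a proxy series onto an
# instrumental series by ordinary least squares, y ~ a*x + b, over their overlap.
import numpy as np

def fit_proxy_to_instrumental(proxy, instrumental):
    """Return slope a and intercept b such that a*proxy + b best fits instrumental."""
    a, b = np.polyfit(proxy, instrumental, 1)   # degree-1 (linear) least-squares fit
    return a, b

# Hypothetical example with made-up numbers, just to show the mechanics:
rng = np.random.default_rng(0)
instr = rng.normal(0.0, 0.3, 100)                        # stand-in for instrumental anomalies
proxy = (instr + 0.2) / 1.6 + rng.normal(0.0, 0.1, 100)  # noisier proxy on a different scale
a, b = fit_proxy_to_instrumental(proxy, instr)
adjusted_proxy = a * proxy + b                           # proxy now on the instrumental scale
print(f"scale is about {a:.2f}, offset is about {b:.2f}")
```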
So here is the comparison between thermometer data and the recent end of the Colle Gnifetti ice core data. Data sources are listed on the graph.



Figure 3. Berkeley Earth land-only temperatures and Colle Gnifetti ice core temperatures. Ice core temperatures have been linearly adjusted to give the best fit to the modern data, by multiplying them by about 1.6 and subtracting about 0.2°C. The background is the drilling hut at Colle Gnifetti.
Man, I love it when two totally separate datasets line up in such an excellent fashion. I'd say that the Colle Gnifetti ice core tracks the northern hemisphere temperature very well. The only real anomaly is recent, when the two diverge. No idea what that's about. The answer may lie in either dataset. But look at the excellent agreement of the large peaks and swings in the earlier part of the dataset.
So now that we have the paleo dataset aligned and variance-matched with the modern dataset, we can take a look at the ice core record of temperature variation over the entire time span of the data.



Figure 4. Colle Gnifetti ice core temperature, linearly adjusted to best fit modern data as shown in Figure 3. The background is the drilling area; lower right is the drilling hut at Colle Gnifetti.
Now, this is a most fascinating temperature dataset. You can see the slow descent from about 1400-1500 into the Little Ice Age, bottoming out at around 1700.
The total range of the fast temperature swings is also quite interesting. For example, in the fifty years from 1190 to 1240, the temperature dropped by 2.3°C.
And the steepness of the natural warming trends is instructive. In 35 years, from 850 to 885, the temperature rose by 2.3°C. Zowie!
To take another look at the warming and the cooling, here’s a graph of the thirty-year trailing trends in the data.



Figure 5. 30-year trailing trends of the temperature record of the Colle Gnifetti ice core.
Current 30-year trailing temperature trends are on the order of 0.1 – 0.2°C/decade. But as you can see, this rate of warming is hardly unusual in the record. Indeed, twice in the period of record the trend has been four times as large as in the modern era. And the current rate of warming has been exceeded many other times in the past.
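For the technically inclined, here's roughly how such trailing trends can be computed; again a sketch only, not necessarily the exact calculation behind Figure 5.

```python
# Sketch of a 30-year trailing trend: at each point, fit an OLS slope to the data from
# the preceding 30 years and express it in °C/decade. The window length adapts to the
# sample spacing (the ice core is biennial, so a 30-year window holds 15 samples).
import numpy as np

def trailing_trends(years, temps, window_years=30):
    """Return (end_years, trends), each trend being the OLS slope, in °C/decade,
    of the window ending at that year."""
    years = np.asarray(years, dtype=float)
    temps = np.asarray(temps, dtype=float)
    step = years[1] - years[0]                      # 1 for annual data, 2 for biennial
    n = int(round(window_years / step))             # number of samples per window
    end_years, trends = [], []
    for i in range(n, len(years) + 1):
        slope_per_year = np.polyfit(years[i - n:i], temps[i - n:i], 1)[0]
        end_years.append(years[i - 1])
        trends.append(slope_per_year * 10.0)        # °C/yr -> °C/decade
    return np.array(end_years), np.array(trends)
```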
So … what kind of conclusions and questions can we draw from the data? Let me toss Figure 4 up again for reference.



Figure 4 (repeated).
First, what the heck caused the big swings in temperature? The earlier part of this paleo record shows swings that are both much larger and much faster than anything in modern times. Call me crazy, but on a planet with naturally occurring warmings of e.g. over two degrees C in 35 years from 850 to 885, I’m not seeing how people can possibly rule out such natural swings in looking at the relatively featureless and certainly in no way anomalous modern record.
This brings up my recommendation for the field of climate science—stop projecting the future and start reflecting on the past. Until we can explain things like why temperatures crashed around the year 1200, falling by 2.3°C in a mere fifty years, and then bouncing right back, we have NO BUSINESS MAKING PROJECTIONS.
But I digress … next, we see warming in the data starting at the end of the Little Ice Age, around the year 1700. It occurs in two waves. The first rise in temperature, from about 1700 to 1790, is both faster and larger than the succeeding rise, which was from about 1820 to the present.
In fact, the modern temperature rise, supposedly fueled by CO2, is the slowest rise of the various longer-term temperature increases in this paleo record. Every other temperature rise is steeper … say what? I thought CO2 was supposed to be driving faster warming, but out here in the real world, there’s slower warming.
Next, in this ice core record, the peak of the later “Medieval Warm Period” around 1190 is about the same temperature as at present. However, the earlier peak around 920 is about half a degree warmer than that. It seems that current temperatures are not as unusual as is often claimed.
Finally, the caveats. The main caveat is the underlying assumption of invariability—that ceteris paribus, the past is operating under the same rules as the present.
For example, to linearly adjust the modern end of the ice core data to best fit the modern temperature data, you multiply it by about 1.6 and subtract about 0.2°C. The figure above assumes that the same relationship held in the past. This is a very reasonable assumption, and we know of no reason why it wouldn't be so … and yet …
Next caveat? It's only one study. I'd be happy to see more using the improved methods that give biennial resolution.
However, given those caveats, I find it a most instructive dataset.
Here on the Northern California coast, summer is in full swing. Many days, the inland valleys heat up. The heated air rises, pulling cool foggy air in from the cold nearby ocean. This cools the entire sea-facing side of the coastal mountain range, including our couple acres … so today it’s cool and foggy here.
I greatly enjoy the local symmetry. It gets hotter in one place … and it gets colder in another place. Lovely.



Figure 6. Satellite view of where I live, about 6 miles (10 km) inland from Bodega Bay, on the ocean-facing side near the top of the first big ridge in from the coast. Blue flag in the large patch of redwood forest marks the location of our house.
The layer of fog isn’t all that thick, typically maybe a couple thousand feet (600m). This leads to a curious acoustic phenomenon. Sounds down along the coast get “tunnel ducted” all the way up the hill. So even though the ocean is six miles (10km) away from our house as the crow flies, on certain foggy days we can hear the waves breaking on the shore. And sometimes, we can even hear the foghorn out at the end of the breakwater in Bodega Bay, my old commercial fishing home port, calling out its endless paean to the souls of those poor fisherwomen and men who never came back home to their loved ones …
Stay well, dear friends. Life is short, be sure to take the time to take the time.
w.
Further Reading: It’s instructive to compare the listed temperatures with the data in A Chronological Listing of Early Weather Events.
As Usual: I ask that when you comment you quote the exact words you are discussing, so that we can all follow the bouncing ball and avoid misunderstandings.
Fascinating. We need to remember it is for those who claim some novel feature is operating today which is different from anything in the past to provide proper, testable, evidence of that claim and not merely assert something new is happening. So far their attempts have been woeful and choosing CO2 as the culprit truly pathetic when evidence that it drives the climate engine is entirely unconvincing and flatly contradicted by past climate changes.
Indeed, Moderately, the onus is on the alarmists to justify their alarm. Me, I find very little that is unusual about the current climate. I discussed this in my 2010 post yclept Congenital Climate Anomalies. See in particular the updates at the end.
Regards,
w.
Thank you too Willis – I wasn’t aware of the Jones “confession”. Alas today I doubt if anyone in the BBC or media would have the temerity to ask him such a question.
The BBC are becoming more blatant in their reporting! It's about a year ago now that the supposed UK record was broken, at 38.7°C at Cambridge University. This site had previously been removed as it failed even the Met Office siting criteria, i.e. buildings had been constructed around the original site. For some strange reason it was reinstated. Perhaps we were getting tired of the Gravesend in Kent station constantly breaking records; that one has now been discontinued after its 20-year monitoring career....! I digress! The BBC on two occasions in the past few days displayed a big 39°C on screen in their weather forecasts, reminding us of that temperature one year ago, compared to our current summer, which is abysmal. But one of their presenters, Carol Kirkwood, a few days ago had the big 39 on screen to remind us and said, yes, the temperature was 38.7 but we've rounded it up to 39........ I was agog.....
Next year they will adjust it to a nice round 40 because, well it’s easier to say. And the year after they will say, “remember a couple of years ago the temperature was approaching 50ºC?”
Willis
I just had a look at CCA. The update graphs aren’t showing for me
15 updates show for me.
Perhaps an issue local to you/your system
All 15 show for me as well, Ian.
w.
CO2 has been chosen because:
“Affordable energy in ample quantities is the lifeblood of the industrial societies and a prerequisite for the economic development of the others.” — John P. Holdren, Science Adviser to President Obama. Published in Science 9 February 2001
https://www.forbes.com/2009/02/03/holdren-obama-science-opinions-contributors_0203_ronald_bailey.html#1abbec7c6db7
Their ideas and solutions can’t compete with today’s Western Civilizations so they have to destroy it?
Willis and Moderately Cross: Looking at this data and graphs of the last 4 glacial cycles, it appears there are greater and more rapid temperature swings when there is overall cooling. Doesn't this imply that the ice age is forced as the global heat distribution becomes more unstable when temperatures drop, or is that a result of increasing temperature differences between the Poles and lower latitudes? Surely the position of the large land masses will perturb the general circulation of the oceans and atmosphere, therefore perturbing the distribution of heat?
“The total range of the fast temperature swings is also quite interesting. For example, in the fifty years from 1190 to 1240, the temperature dropped by 2.3°C.
And the steepness of the natural warming trends is instructive. In 35 years, from 850 to 885, the temperature rose by 2.3°C. Zowie!”
The older the ice core implied temperatures, the wilder the swings. This would lead one to potentially look for some mechanism in the derivation of temperature from ice cores driving that process. Seems that other old proxy data shows similar huge swings the older the data is.
If multiple proxy show wide temperature swings, isn’t it equally likely that there actually were wide temperature swings, rather than assuming that different proxies, from different locations, all show some unknown age related degradation?
Equally, yes, greater, no. Particularly, perhaps ice core proxies. https://www.astrobio.net/climate/ice-cores-may-not… Just one of several that argue adulteration of such samples.
A logical conclusion might be that higher CO2 causes a moderation of temperature swings.
Warmer at night and in Winter, cooler during the day and in Summer.
Which seems to be what many of us find when we look carefully at the unadjusted temperature data.
Another notable conclusion seems to be related to the recent divergence seen in the first graph.
The hockey stickish spike is not matched in the ice core data set.
Interestingly, this spike does coincide rather closely with the advent of global warming theory and what many of us regard as unwarranted fiddling with the surface data sets in order to force the data into agreement with the CAGW “theory”.
Tony Heller’s graph which plots recent CO2 concentration vs the “adjustments” shows the adjustments to be an artifact of the same efforts evidenced in the first graph…to force compliance between increasing CO2 and an hypothesis which is otherwise completely unsupported by evidence.
Of course, “Warmer at night and in Winter, cooler during the day and in Summer” is also exactly what results from increasing levels of moisture in the air.
Water vapor is of course also a radiative gas, and the effect of moderating temperature swings is what is observed when the humidity of the air increases.
Also, this aspect of the effect of increasing moisture in the air holds true even in circumstances where the effect of changes of phase in the moisture do not take place.
Enthalpy is a forbidden word in climate ‘science’.
If the intent is actually to measure trapped ‘heat’ in the atmosphere then air temperature is the incorrect metric as it depends on both the energy and the enthalpy of the volume of air.
The correct metric is heat content in kilojoules per kilogram and to calculate it requires that the relative humidity is known.
Comparing temperatures alone is meaningless. An example: a volume of air in a Louisiana bayou, misty after a shower, at 100% humidity and an air temperature of 75°F has twice the heat content of a similar volume of air in the Arizona desert at close to zero humidity and 100°F.
Yet climate 'scientists' would call the Arizona air at 100°F 'hotter' even though it has half the energy content of the 75°F Louisiana air!!
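As a rough check of those numbers, here is a minimal sketch using standard psychrometric approximations; the constants and the roughly 5% Arizona humidity are my assumptions, not figures from any measurement.

```python
# Rough check of the bayou-vs-desert example, using textbook psychrometric
# approximations (Magnus saturation-pressure formula; constants are assumptions).
import math

def enthalpy_kj_per_kg_dry_air(temp_c, rel_humidity, pressure_kpa=101.325):
    """Specific enthalpy of moist air, in kJ per kg of dry air."""
    es = 0.6112 * math.exp(17.67 * temp_c / (temp_c + 243.5))  # saturation vapour pressure, kPa
    e = rel_humidity * es                                      # actual vapour pressure, kPa
    w = 0.622 * e / (pressure_kpa - e)                         # mixing ratio, kg water / kg dry air
    return 1.006 * temp_c + w * (2501.0 + 1.86 * temp_c)

louisiana = enthalpy_kj_per_kg_dry_air((75 - 32) / 1.8, 1.00)   # 75 °F at 100% RH
arizona = enthalpy_kj_per_kg_dry_air((100 - 32) / 1.8, 0.05)    # 100 °F at ~5% RH (assumed)
print(f"Louisiana about {louisiana:.0f} kJ/kg, Arizona about {arizona:.0f} kJ/kg")
# Prints roughly 72 vs 43 kJ/kg: not quite double, but the humid 75°F air does
# indeed carry far more heat than the dry 100°F air.
```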
With that level of imprecision why should anyone listen to climate ‘scientists’?
Indeed – what is the value of GISTemp or even the Central England Temperature series? They are nice to have but they are measuring the incorrect variable if the intent is to measure ‘trapped heat’ content.
So to answer Willis' question – a relatively small change in average humidity would cause a 2°C change in air temperature. Perhaps a change to a blocking weather pattern with latitudinal jet streams and/or a change in ocean SSTs leading to evaporation changes and more or less humidity. Not CO2.
I was not suggesting that changes in CO2 have anything causal to do with changes in the average temp of the air over long spans or even in the recent past.
I do not even think CO2 as an explanation for the noted lessening of swings in temperature holds water. For one thing, the amount of time over which CO2 has been significantly elevated is only about 70 years.
When it comes to explanations for the short term and longer term variations noted on the headline post, postulating a change in SST and a subsequent change in humidity levels runs into a chicken and egg question.
It would seem to require that cooling of the GAT was caused by a warmer SST over the whole planet for a long enough time to make a difference.
Also, an increase in average humidity over the entire planet seems like it would cause an increase in cloudiness, and hence albedo, which would tend to counteract any warming which led to higher sea surface temps.
And then the question remains, what caused the SST to change to begin with?
Can SST change significantly with no change in TOA incoming solar?
Volcanoes are one possibility for a mechanism for that.
Cosmic rays another.
Large changes in ocean circulation patterns another.
One other thing I noticed, was that the ice core data in this study seems to show a brief cooling around the time of the Year Without A Summer.
Some have presented evidence that there was really no such thing.
For that matter, for years warmistas have argued there was no LIA to speak of.
We are never going to get anywhere with most of the so-called “experts” peeing in the science soup.
Volcanoes?
http://www.pastglobalchanges.org/products/latest/7164-the-history-of-volcanic-eruptions-since-roman-times
Answered elsewhere on this thread. TL;DR answer is no. See graph.
w.
Willis: Very interesting as usual. It appears Figure 5 has the ordinate units for the trend as C/Century when the data is actually in C/Decade. Is the chart label incorrect?
Thanks, Rick, fixed. One reason I love writing for the web is that my mistakes don’t last long …
w.
Willis,
Thank you for another interesting and illuminating post! Have you considered doing another podcast with Anthony? The first was great but a little too short in my estimation.
Is the Happy Hooker still open or am I dating myself by even asking? I used to try and stop there whenever I headed up Highway One to visit friends in the Gualala area.
At least, they didn’t try to hide the decline.
Willis is to blame. He forgot to hide the decline in figure 3. : )
Nice catch on this ice core data, Willis. This look at natural variation matches the natural variation, expressed in sea-level variance, that geologists utilize to say there is no climate change signal detectable against natural variation. Stay sane and safe.
Willis,
you say:
> It has a two-year resolution thanks to some new techniques.
How do they know if a particularly warm summer didn’t melt off a dozen years of previous accumulation? I’m sure things like volcanic eruptions can lay down some specific markers but “two-year resolution?”
Interestingly, they have been able to correlate lead (Pb) deposits with historical events in England. Seems that there are known upticks in lead smelting at certain times. I suppose that would be a good aid in resolution.
In some Greenland ice cores which experience high snow accumulation, annual timing of isotopes and impurities can be resolved and used for annual layer counting.
Rasmussen, S., K. Andersen, A. Svensson, et al., A new Greenland ice core chronology for the last glacial termination, Journal of Geophysical Research: Atmospheres, Volume 111, Issue D6, 2006. https://doi.org/10.1029/2005JD006079
One of the follow-on attempts at reproducing "the hockey stick" was that of Henry Pollack and coworkers at Michigan (Shen and Huang were among these people), using borehole records. I thought their methods contained an element of circular reasoning, but even more important, the inverse method of estimating surface temperature from borehole logs has very poor resolution (someone like David Chapman would claim it has good resolution for temperature, i.e. plus or minus 0.5°C), and the juxtaposition of anomalies of plus or minus 1.7°C amplitude over 50-year spans, such as Figure 4 shows from 1100 to 1300 AD, would end up being a flat line. Looking at Figure 4 shows why Pollack and his co-workers found a hockey stick. Their method could not resolve anything but a hockey stick, and it provides no information not already in the surface temperature records.
“In fact, the modern temperature rise, supposedly fueled by CO2, is the slowest rise of the various longer-term temperature increases in this paleo record. Every other temperature rise is steeper … say what? I thought CO2 was supposed to be driving faster warming, but out here in the real world, there’s slower warming?”
Hey Willis, love your work.
A possible "CO2" explanation is that it's meant to be cooling naturally at present, but CO2 is causing the rise, hence the natural processes buffer the anthropogenic rise? If so, by how much?
But as you say, we need to better understand the past before we draw such conclusions.
Thank you Willis, very interesting article. I am curious…what was the proxy that they used for temperature?
My bad, John, I thought I’d included the link to the actual article, Temperature and mineral dust variability recorded in two low-accumulation Alpine ice cores over the last millennium
Pascal Bohleber et al.
It’s all explained therein. I’ll add this to the head post.
w.
Hottest summers in the last 2000 years were during Roman times
Just in time, I would like to say 😀
I have never been a fan of BEST slice and splice methodologies, based on what is happening in the Fourier space. It doesn't make sense to throw away the low frequency spectrum via a low-cut slice to get a "better" climate signal.
When we are looking at 1000 AD to 1700 AD, just how many individual temperature records are in the data? Can't be many. And how many times does BEST cut and splice them?
What if you compared the ice core with a couple individual long temperature records, such as CET?
Stephen, that’s why I link to the datasets, so folks can continue the investigation in new and different directions.
As to the CET, it’s never been one of my favorites because it has something like (from memory) 13 splices in the record. Makes it less than useful for multi-year comparisons of this type.
Regards,
w.
A fair point, Willis. If you linked to your code, I missed it.
If you linked to the data, I have to redo your work rather than just add to it.
But a fair point.
Thanks for the info.
Stephen Rasey July 24, 2020 at 3:29 pm
I think the slice methodology has a problem, but not that problem. Try an experiment.
Make up, say, five signals, where each is a mix of short, medium, and long period (low frequency) sine waves. At a random point in each signal, add a fixed amount from that point to the end of the signal to simulate, say, a move of a climate station to a nearby, warmer location.
Now, use the slice method to cut each signal in two. According to you, this should “throw away the low frequency signal”.
However, when you reconstruct an average from the five datasets (each split by the slice method into subsets), you do so by taking the first differences of the datasets, averaging them, and then taking a cumulative sum of the average differences.
And if you do that … you’ll find that it still contains the low frequency signal. Go figure.
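Here's a minimal sketch of that experiment in Python; it's an illustration only (made-up signals and a crude scalpel), not Berkeley Earth's actual pipeline.

```python
# Five stations share one "climate" signal (short, medium, and long-period sines).
# Each gets a step jump at a random time (a station move). The crude "scalpel" here
# cuts each record at its jump, and the series is rebuilt from first differences.
import numpy as np

rng = np.random.default_rng(42)
n = 1000
t = np.arange(n)
climate = (0.5 * np.sin(2 * np.pi * t / 30)      # short period
           + 0.8 * np.sin(2 * np.pi * t / 200)   # medium period
           + 1.5 * np.sin(2 * np.pi * t / 900))  # long period (the "low frequency")

n_stations = 5
diffs = np.empty((n_stations, n - 1))
for s in range(n_stations):
    station = climate + rng.normal(0.0, 0.1, n)  # climate plus measurement noise
    jump_at = rng.integers(100, n - 100)
    station[jump_at:] += 2.0                     # move to a warmer location
    d = np.diff(station)
    d[jump_at - 1] = np.nan                      # the scalpel: the step never enters the diffs
    diffs[s] = d

mean_diff = np.nanmean(diffs, axis=0)            # average first differences across stations
recon = np.concatenate([[0.0], np.cumsum(mean_diff)])

# Compare reconstruction with the true climate signal (both as anomalies):
err = (recon - recon.mean()) - (climate - climate.mean())
print("RMS error:", np.sqrt(np.mean(err ** 2)))  # small compared to the 1.5 long-period amplitude
```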
HOWEVER, as I said, there is a theoretical problem with the method. The problem comes not with low frequency but with sawtooth type signals. Imagine that we paint a Stevenson Screen white every ten years, and neglect it in between. The temperature measurement will start out correct, then get warmer and warmer, until a new coat of white paint cools it down again.
Now, if there’s no trend in the underlying temperature, there will be no trend in the resulting sawtooth wave. But the Berkeley Earth method identifies the discontinuity when the temperature drops because the Screen is just painted, and it cuts that part of the signal out.
As a result, when you reconstruct the signal as described above, it now shows a distinct and totally bogus long-term trend.
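To make that concrete, here's a minimal sketch with assumed numbers; again an illustration only, not Berkeley Earth's code.

```python
# An instrument that drifts warm and gets reset by a fresh coat of paint every ten
# years, over a climate with NO underlying trend. A crude scalpel-style rebuild
# treats every repaint drop as a station discontinuity and discards it.
import numpy as np

n_years = 100
drift_per_year = 0.05                    # assumed warm drift of the neglected screen, °C/yr
repaint_every = 10
t = np.arange(n_years)
measured = drift_per_year * (t % repaint_every)   # sawtooth; true temperature is flat zero

d = np.diff(measured)
d[d < 0] = 0.0                           # the scalpel: repaint drops are cut out entirely
recon = np.concatenate([[0.0], np.cumsum(d)])

trend = np.polyfit(t, recon, 1)[0]
print(f"Spurious trend: {trend:.3f} °C/yr (true trend is exactly zero)")
# The rebuilt series warms at roughly the drift rate, even though nothing real warmed.
```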
I have no idea how much that affects the overall real-world Berkeley Earth result. It does seem that it would affect things like say moving a station affected by UHI to a nearby airport or location out of town. This creates the very kind of “sawtooth” that may be problematic.
Interesting question,
w.
I am with you on the sawtooth problem. As I was in 2014:
https://wattsupwiththat.com/2014/06/28/problems-with-the-scalpel-method/
But the problem with the sawtooth is the LOSS of INFORMATION, particularly the amount of adjustment, and thus an estimate of the instrument drift. That loss of information is contained in the low frequency content discarded by the frequency high-pass, low-cut scalpel implementation.
Your thought experiment above fails on these points:
1. The amount of a bulk shift is an unknown. It is an uncertain number. Therefore, some of it must get mixed into the data.
2. The thing about Fourier analysis is that you cannot use high frequency information content to predict the low frequency content.
3. "a mix of short, medium, and long period (low frequency) sine waves"
What if I split a temperature record somewhere so that each piece has only a portion of the low frequency cycle?
That's the thing about the "climate signal": it is information content at very low frequencies. Once I cut a 100-year temperature record into 10-year splices, I have lost the real data on the 20, 30, 40, and 50 year frequencies.
In seismic data processing, we work with band-pass data, 8–100 Hz depending upon the survey. Inversion is the technique of integrating over time to estimate the total impedance of the rock column. But you can't do that with just band-pass data – high-frequency errors accumulate. So we use the stacking velocities to provide the low frequency content that doesn't exist in the reflection data.
Trying to tease out a warming or cooling of 0.1–0.2 deg C/decade from a temperature record is a form of inversion. You integrate the temperature changes over time. But in the climate case, the low frequency content is in the temperature records — nowhere else. The scalpel throws the low frequencies away.
Where are those low frequencies retained to be included in the final product? In the regional homogenization? How? If the homogenization is built upon short snips?
From: https://wattsupwiththat.com/2019/01/13/greenland-near-surface-land-air-temperature-data-from-berkeley-earth-present-some-surprises/#comment-2589026
You can’t tease out a climate signal if you slice the record 30 times in a 35 year stretch.
Stephen, you say:
Let me recommend that you set all theory aside and actually try it. First off, there’s no “Fourier analysis”. Next, the size of the “bulk shift” is immaterial. Finally, yes, it works with a mix of short, medium and long-period waves. Here’s an R program to show what I mean, but you really should write your own code to test it.
Mosh is right. There’s no low frequency information getting thrown away.
w.
Willis, Thanks for the code.
What on Earth do you mean by "First off, there's no 'Fourier analysis'"?
Any sequential signal (such as a time sequence) has a 1-to-1 correspondence with a frequency-phase-amplitude spectrum. Anything you do in the time domain has consequences in the Fourier domain. The lowest frequency contained in any signal is associated with the length of the signal. The scalpel cuts long signals into shorter ones, thus losing the lowest possible frequencies; it is a low-cut filter.
It is true that when you reassemble the fragments, the resulting conglomerate has low frequencies. But these are counterfeit. They come from the glue, not the fragments. Low frequency information content cannot be improved by first discarding it.
The “saw-tooth” signal, where the scalpel turns instrument drift into climate signal is a direct result of the loss of low-frequency information content.
Thanks for the R code. I’m about to do some traveling, so it will be a few days before I can concentrate on it.
Re the code: particularly the diffbox
I’m reaching back 40+ years to Linear Systems II and Hilbert functions.
The First Difference filter amplifies noise, more so at higher frequencies.
So while the example above doesn't have high frequency noise (well, the bulk shift has a high-frequency tail), the sine waves themselves are noiseless.
Not so temperature records. They are noisy, and the noise is relatively high in the (high frequency) daily readings — the rounding to the nearest degree is part of that.
So I’m going to look at this issue with some super imposed noise.
https://dsp.stackexchange.com/questions/55361/proof-that-first-difference-filter-amplifies-noise
Stephen Rasey July 27, 2020 at 10:35 am
What I mean is that neither Mosh nor I are doing Fourier analysis. You are the first one to mention it.
Say what? All frequencies may be contained in a signal; that is unconnected with the length of the signal. All the length of the signal determines is the longest period that we can reliably detect.
It does NOT mean that there are no lower frequencies involved.
This is not true, and doesn’t get more true by your repeating it over and over. DO THE DAMN EXPERIMENT AND STOP TALKING UNTIL YOU DO!!!
The “saw-tooth” signal, where the scalpel turns instrument drift into climate signal is a direct result of the loss of low-frequency information content.
No, it is NOT from that. That signal loss also occurs when the scalpel method starts chopping up and resetting a LINEAR TREND. It has absolutely nothing to do with low-frequency content.
Welcome. Do the experiment.
w.
A further note. You claim incorrectly that
Take one complete cycle of a sine wave. The lowest frequency is the frequency of the sine wave.
Now, divide that complete cycle into thirds.
What is the lowest frequency in those three resulting signals?
Same. The frequency of the sine wave … DESPITE the shortness of the data, which is less than one cycle long.
w.
Stephen, I took a look at what you said about first differencing increasing the noise. I’m not finding it.
I read the article. It points out that the noise variance doubles when you first difference the data, viz:
However, that’s noise in the first differencing. And obviously, when you use a cumulative sum to reconstruct the signal from the first differences, it perforce must cut the variance of the error in half.
BUT first differencing is not all that’s being done. After that, the (noisy) first difference signals from various stations are averaged.
Then that average is cumulatively summed to reconstruct the underlying anomaly.
Net result from the method is to reduce the noise.
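A quick numerical check of that bookkeeping (a sketch only, not anyone's production code):

```python
# Pure noise, no signal: show that first differencing doubles the variance, but that
# averaging many stations' differences and then cumulatively summing leaves only the
# station-average noise (variance ~ sigma^2 / N), because diff-then-cumsum telescopes.
import numpy as np

rng = np.random.default_rng(1)
n, n_stations, sigma = 10_000, 10, 1.0
noise = rng.normal(0.0, sigma, (n_stations, n))

print(np.var(np.diff(noise[0])))   # roughly 2.0, i.e. twice sigma**2, as the linked article says

recon = np.concatenate([[0.0], np.cumsum(np.diff(noise, axis=1).mean(axis=0))])
print(np.var(recon))               # roughly 0.1 = sigma**2 / n_stations, not a growing random walk
```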
I don’t think Berkeley Earth does it that way, though …
Onwards,
w.
“I have never been a fan of BEST slice and splice methodologies based on what is happening in the Fourier space It doesn’t make sense to throw away the low frequency spectrum via a low cut slice to get a “better” climate signal.”
Except we don't do that, and the low frequency is not thrown away.
This is easy to test, since we do the series both with and without SLICING (there is no splicing).
The slicing is fully justified, since the stations have changed.
You take station A. It is at location X,Y. Its identifier code is 1234567.
30 years later they MOVE THE STATION to location X2,Y2, BUT
THEY KEPT THE SAME IDENTIFIER CODE.
All we do is change the identifier code: 1234567b.
There REALLY ARE TWO DIFFERENT TIME SERIES,
one at location X,Y,
one at location X2,Y2.
Everyone else SPLICES THESE TOGETHER; we separate them into two series BECAUSE THEY ARE TWO DIFFERENT STATIONS.
BEST does not splice?
Then what do you call this?:
http://berkeleyearth.lbl.gov/stations/155343
SW. Greenland coastal fjord station, MITTARFIK ILULISSAT (JAKOBSHAVN LUFTHAVN)
The record runs from Jan 1866 to Oct 2013. Prior to 1950, it must be a precious remote long term temperature station. BEST puts no fewer than 30 empirical breakpoints in the 35 year period from 1903 to 1938!!!
But it is all one record, now.
Willis
Nice work.
Have you seen the latest sea level reconstruction posted at jonova?
http://joannenova.com.au/2020/07/hottest-summers-in-the-last-2000-years-were-during-roman-times/#comments
There appears to be some similarity to the glacier data with the ups and downs of sea levels as ice melted, reformed and thermal expansion kicked in. The geographic area is also similar. It also goes back that bit further so we can see a continuous picture
Tonyb
And the hottest summers of the past 3000 years were in the so-called Minoan Warm Period, and in the past 4000 years during the Egyptian Warm Period, and of the past 5000 to 8000 years in the Holocene Climatic Optimum.
The distressing trend remains cooling down as the Holocene wears on, just like all prior interglacials. Only should the Modern Warm Period for 60 to 180 years ever get hotter than the Medieval WP peak some 1000 years ago will there be a plausible human signature in global average temperature.
Perhaps the explanation for some of the apparent recent anomalies in the lack of warming is the result of the Milankovitch Cycles beginning to affect us. Past ice core records tend to show a fairly rapid drop off in temperatures as the cold takes over.
good point Peter – obliquity is decreasing – could it be ?
Willis, superb. Two thoughts.
First and most important, you shine a much longer term light on the attribution problem (anthropogenic or natural causality), one of the most basic problems for climate models. As I have previously posted here several times, thanks to MIT Prof. Emeritus Lindzen, the instrument GAST rise from ~1920-1945 is statistically and visually indistinguishable from ~1975-2000. Yet even the IPCC says the former is mostly natural, since there simply was not enough rise in CO2, even per the models. (IIRC, AR4 WG1 SPM fig. 4 – too lazy to look it up from previous posts.) You have vastly enhanced the attribution model problem argument by showing even greater past natural variability.
Second, an explanation for the post-1975 instrument/ice core divergence. In 1975, the world population was 4.061 billion (sorta, per the UN). In 2006 it was 6.594 billion. Now, separately, the UN says about 90% of the population is in the Northern Hemisphere—because that's where most of the habitable land is, including all of China and India. So running the numbers, the NH increased by 2.280 billion people in that time interval—over 50%. What the instrument record / new ice core divergence shows is just the urban heat island effect (UHI) on land surface instrumentation, something well documented many times in many places. I provided an irrefutable Japanese example (Tokyo and adjacent Hachijyo island prefecture) in essay When Data Isn't in ebook Blowing Smoke.
Regards
Yep, Rud,
A year back on WUWT I also drew attention to UHI. It needs better quantification before researchers try attributing warming to natural causes or GHGs. Geoff S
I have said this before: Hadcrut and Best and etc. are doing the assessment of the ‘world temperature’ all wrong.
Your sampling of stations must be
1) equal number, NH and SH
2) together, your stations must balance out to ca. zero latitude
3) 30% of stations must be inland, 70% must be near or at sea.
If you do it right you might actually get the right answer, like I did (in 2015):
https://documentcloud.adobe.com/link/track?uri=urn:aaid:scds:US:b94c721a-35ca-4b89-b44c-cc648b2d89b4
How easy is it, but why do they keep on doing it wrong?
Actually, if you’re averaging any station with a different one, you’re doing it all wrong. Intensive properties.
“I’d say that the Colle Gnifetti ice core tracks the northern hemisphere temperature very well. The only real anomaly is recent when the two diverge. No idea what that’s about.”
Really?
As Judith Curry, herself a BEST collaborator, said in 2011 after the alarmist team’s release of data tortured until confession: “This is ‘hide the decline’ stuff. Our data show the pause, just as the other sets of data do. Muller is hiding the decline.”
The other relevance of Colle Gnifetti to AGW.
It’s where the industrial economy’s climate adventure was born… where it all began.
https://tambonthongchai.com/2018/10/23/a-natural-recovery-from-the-lia/
Start with Berkeley Earth. It’s corrupted by a large amount of data from poorly-sited stations affected by the urban heat island effect. Compare 1979 to 2006 with UAH satellite temperature measurements and you will likely see a better fit.
Until they systematically analyze each station’s location and filter out the ones measuring local environmental changes like urban heat island, the terrestrial data is not to be trusted to be an accurate measure of global temperature. And before someone raises the point that they adjust the data to account for UHI, that arbitrary adjustment in no way makes the data more accurate. Just toss the poorly-sited stations and use that data only.
@stinkerp,
I'm with you. There are, as you say, poorly sited stations. But there is also what they did to GOOD stations.
http://berkeleyearth.lbl.gov/stations/155343
SW. Greenland coastal fjord station, MITTARFIK ILULISSAT (JAKOBSHAVN LUFTHAVN)
The record runs from Jan 1866 to Oct 2013. Long records are precious to climate science. But look what BEST did to it! There are no fewer than 30 empirical breakpoints in the 35 year period from 1903 to 1938 !!!
This is a station on an arctic Greenland fjord and there is little reason to believe it will behave in synchronicity with a sparse regional kriging network. Regardless, you cannot tease out any “climate signal” from temperature record snippets only a year or two long.
Don't remember if it was PAGES2K or someone else, but they used something like a 50-yr filter for the old data to smooth it out, then used decadal averages for the 20th century, and pretended it was an apples-to-apples comparison.
Well done, Willis. Interesting. Thanks for the post.
Stay safe and healthy, all.
Bob
Cheers, Willis, for another data point that helps understanding of natural variability.
Can you put up an update on COVID-19 stats? I am curious as to whether the “surge” in reported cases has affected the downward trends that were showing in all the hotspots. Or not.
Interesting article, Willis, but you lost me on two issues:
1) You state “. . . Colle Gnifetti ice core. It has a two-year resolution thanks to some new techniques.” However, the Figure 3 graph that compares the ice core (red line) data, linearized for comparison, indicates it has less temporal resolution than that of the Berkeley Earth NH land-only temperatures (yellow line), and it certainly does not appear to represent data obtained every two (or even every five) years. Why is there the very slow temporal response in the red line . . . has it been smoothed by some sort of running average?
2) In the first paragraph underneath the caption for Figure 3 you state "I'd say that the Colle Gnifetti ice core tracks the northern hemisphere temperature very well." That, to me, does not appear to be the case. My eyeball estimate is that the red line and yellow line have about +/- 0.25 C variation, on average, from each other at any given point in time. And in particular, from 1750 to about 1780, from 1830 to 1840, and around 1880, the red and yellow temperature lines disagree with each other by 0.5 to almost 1.0 C. Given that the total range of the plotted temperature anomaly data is about -1.5 C to +1.0 C (a span of 2.5 C), the typical discrepancy between the yellow and red lines therefore is about +/- 10% of the min/max data span, and the worst-case disagreements (excluding data more recent than 1990) would be in the range of 20 – 40% of the min/max data span.
A clarification would be appreciated.
Hmmm…..
https://www.breitbart.com/europe/2020/07/24/study-mediterranean-sea-was-3-6f-hotter-at-time-of-the-roman-empire/
I love that you find it being hot in one place & another place it is cold interesting Willis. Try this one on for size, we live inside a nonlinear chaotic thermodynamic heat engine.
Thanks, Robert. True, and the key to understanding it is the Constructal Law.
In any case, been there, posted that, see “The Magnificent Climate Heat Engine”.
w.
Fractality is evident in a curious observation about Willis’ two graphs, the reconstructions going back 250 years in fig 3 and going back 1200 years in fig 4. They both look the same. Over the whole time period in each case, there is wider fluctuation at early times and smaller variation later. This is a hallmark of fractality – the same shape emerges on different spatial or temporal scales.
Hi Willis,
The (claimed) biggest recorded volcanic eruption took place in Korea in November of 946 (Mt Paektu). Could this be linked to the drop that looks like it starts right around that date? I can't tell from your graphs how close the dates might be.
Nice, Willis.
When we were writing the paper, I suggested that we do this "type" of study to "cross check" the early part of the record (1753-1850) against available paleo. The idea got rejected.
anyway.
The issue with ice cores, as I am sure you know, is that the core reflects the temperature at the "source" location of the precipitation that caused it.
For this region I have no clue if that is temporally consistent.
The issue is that if you start hunting for the source area, you run into a Texas sharpshooter problem.
Thanks, Mosh. I hadn’t thought about looking at it the other way around, with the paleo confirming the thermometer record reconstruction rather than the other way around.
You’re correct that the core reflects the temperature where the water was evaporated. It’s part of the assumption of uniformity … which, as I said, may or may not be valid for all aspects of the question.
w.
Willis
I suspect that isotopic fractionation occurs with evaporation (and sublimation), condensation, and crystallization. The temperature calibration may be a can of worms.
One oddity
see 1879.
huge el nino,
Steven Mosher:
As with ALL El Ninos, it was caused by a decrease in Atmospheric SO2 levels.
In this instance, the reduction was caused by the Oct. 1873-Mar 1879 “Long Depression,” when essentially all fossil-fuel burning SO2 from smelters, foundries, etc. etc. shut down, cleansing the air, and the SO2 from the 1875 VEI5 Askja finally settled out, forming the usual volcanic-induced El Nino, also due to cleansing of the air. Between the two air-cleansing events, temperatures naturally soared.
huge el nino
Does El Nino have much effect on the N Atlantic (where the moisture source would come from)? Maybe some moisture from the Mediterranean too.
Willis:
Underneath your Figure 4, you ask “what the heck caused the big swings in temperatures”
The answer is the presence or absence of volcanic eruptions. See https://www.osf.io/b2vxp/ for my analysis of the Central England Temperatures Data Set, 1665-2020
Burl, sorry, not happening. Here are all eruptions of VEI>5.
w.
Willis:
If you take the time to visit my link, you will see that VEI4 (and possibly some VEI3) eruptions are also responsible for temperature decreases.
They can’t be ignored, as you are doing.
From Willis’ graph, I see no effect from VEI=>5s. Why would there be any for VEI4s & lower? Meh…..
Beng135:
Any temperature excursion BELOW 0.0 Deg. C. would represent a volcanic effect.
I misread his graph–I thought that they were VEI5 eruptions, instead of > VEI5 eruptions.
So he is IGNORING anything less than a VEI6 eruption. No wonder he discounts volcanic effects, and can’t understand the cause of big swings in temperature.
Check out my link, and see how wrong he is.
Burl Henry July 25, 2020 at 4:34 am
Burl, here you go. All eruptions with VEI 4 and above …
I’m gonna pass on your theory …
w.
Willis:
I am not offering a theory.
I am simply reporting on the measurements from an actual instrumental record, which is much more precise than ice core data.
For example, the Central England Temperatures data set shows at least 38 VEI4 and higher eruptions between 1660 and 1875, each one of which caused measurable cooling. Your graph shows perhaps 10.
Burl Henry July 25, 2020 at 7:50 pm
That’s just a question of the graph scale. In fact, my graph shows no less than SEVENTY-EIGHT VEI4 and higher eruptions from 1660 to 1875. That’s one every three years. If each and every one of them caused “considerable cooling” we’d have frozen to death long ago.
However, we don't have to guess. Instead, we can "stack" the data by aligning the temperature data from each eruption on the eruption date, and looking to see whether ON AVERAGE the temperatures following an eruption were lower.
Read’m and weep … absolutely no visible effect from the eruptions.
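For anyone who wants to run this kind of test themselves, here's a minimal sketch of such a superposed-epoch ("stacking") composite; the window lengths are arbitrary choices, and it's an illustration rather than necessarily the exact code behind my graph.

```python
# Superposed-epoch ("stacking") composite: average the temperature anomaly around each
# eruption year, measured relative to the mean of the years just before the eruption.
import numpy as np

def stack_on_eruptions(years, temps, eruption_years, before=5, after=10):
    """Return (lags, composite): lag 0 is the eruption year; composite is the
    mean anomaly across all eruptions that fit inside the record."""
    temps = np.asarray(temps, dtype=float)
    index = {y: i for i, y in enumerate(years)}
    segments = []
    for ey in eruption_years:
        i = index.get(ey)
        if i is None or i - before < 0 or i + after >= len(temps):
            continue                                  # skip eruptions too close to the ends
        seg = temps[i - before:i + after + 1].copy()
        segments.append(seg - seg[:before].mean())    # anomaly vs the pre-eruption mean
    if not segments:
        raise ValueError("no usable eruptions in the record")
    return np.arange(-before, after + 1), np.mean(segments, axis=0)
```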
No temperature drop after eruptions, sorry.
w.
Hi Willis
That freeze story reminds me now of the Nenana ice breakup, of which we have accurate records since 1917. I know you did a story on that. Do you perhaps have an updated graph that also includes the results of the past 4 years? I was wondering if that downtrend still holds.
Willis:
You say “no temperature drop after eruptions”
May I ask whether you have viewed the link https://www.osf.io/b2vxp/
This instrumental temperature record for Central England shows that essentially EVERY drop in temperatures below 0.0 deg. C. occurred as the result of an eruption, somewhere in the world.
So. we have a major difference in our conclusions on an important topic, which definitely needs to be resolved.
Henry Pool July 26, 2020 at 1:47 pm
I haven’t looked since then, but I will.
Thanks,
w.
Burl Henry July 27, 2020 at 7:44 am
Yes. I thought it was a joke. It doesn’t contain one bit of actual statistical analysis. You just put both on a chart and said “See!”
No, the major difference is not in our conclusions, it is in our methods.
I’m using math and science, and you are using handwaving. I just showed you, by stacking the 78 eruptions during the period you cited, that there is NO sign of any temperature drop following the eruptions. Don’t like that? I wouldn’t either if I had your theory … but that doesn’t change the facts—on average there is no temperature drop following an eruption.
w.
Your comparison of a single geographical data point to the BEST NH average is flawed. You cannot compare one location to many. Any observation/inference drawn is worthless.
Pool,
"Any observation/inference drawn is worthless." UNLESS the average and sample are highly correlated!
You are basically saying that because high temporal-resolution data are rare, and sparsely distributed, there is no point in looking at such data because it can’t be used for anything other than what happened at that location on the ground. Therefore, the homogenization and interpolation used in constructing a data-set such as BEST, is invalid, and averages are worthless.
Maybe you should stick to offering your unqualified services to those who don’t ask for help.
“Therefore, the homogenization and interpolation used in constructing a data-set such as BEST, is invalid, and averages are worthless.”
In this case the averages are worthless.
Of course you can compare one to many. It’s done all the time. Where do you think terms like “above average” and “below average” come from? I tend to agree with Stephen Rasey about BEST. I think they understate the uncertainty in the trend. As for proxy reconstructions, there is a serious problem with what I call “leverage.” You are making comparisons outside of the calibrated range. That is verboten when using instruments to make direct measurements, yet so-called climate science has made a whole field out of doing the “verboten” thing when using proxies (i.e. indirect measurements) for historical reconstructions. I think the further outside the calibrated range you go, the higher the uncertainty. For proxy reconstructions that can only be calibrated in the modern era, that means that your uncertainty increases the further back the reconstruction goes. The other problem is that BEST itself is a reconstruction – it is not “data.” So, you are comparing one reconstruction to another. Nevertheless, whatever the caveats, I don’t see any issue with comparing one to many. My biggest concern is understanding the uncertainty properly when you do that.
” I think they understate the uncertainty in the trend. ”
fortunately we tested that.
uncertainty is correct
“uncertainty is correct”
So sayeth the person likely to be embarrassed if it were shown that the uncertainty is understated. How about an unbiased third-party opinion? It would be helpful if that person also knew how to capitalize and punctuate properly — you know, someone like an English major.
Henry
I have previously written extensively on CET. Many weather organisations and scientists believe it to be a good if not perfect proxy, if not for global temperatures then for NH ones. That includes the UK and Dutch met offices and Hubert Lamb. CET is taken at 3 separate locations in the centre of England, and with the UK being an island we are some sort of useful weather vane.
Tonyb
Actually, Tony, it’s a bit more complex than that. Over time, there have been no less than seventeen! changes in the sources used for the CET. As a result, I fear I don’t trust it much …
Thanks,
w.
This “Henry Pool” is not Henry Pool but a Troll impersonating him. Trolls on the Khmer Vert’s payroll do that here from time to time.
Phil Salmon
Thanks for noting and pointing it out. Just so we are all clear. I am very sceptical of any and all man made warming except the warming caused by man by aiding the greening of earth.
If one had no information other than the CO2 concentration, the Berkeley temperature series and the borehole data, one can come to the conclusion that CO2 is causing a slight cooling in the Alps and that this effect will continue as the CO2 increases.
Does anyone, for the slightest moment, think that the IPCC will not claim that any emerging cooling trend in the 21st Century is caused by human activities? All they need is a big "discovery" that reverses our impact after some imaginary "tipping point" has been passed. After that, the cure for the cooling will be that we have to reduce our CO2 emissions to warm the planet again.
Most people used to have a sense of both honour and shame but it is getting harder to find good examples in the science community these days. I compliment the authors for not suppressing the decline by truncating the data when it departed from the surface temperature record.
There are now two clear signals: the tree ring series (lots of them) that Briffa et al truncated at 1960, and this ice core, which is presented whole. Is it possible there is an emerging underground set of scientists in Europe who secretly suspect that their forebears were lying through their teeth in an effort to gain prominence and grant money? It would be interesting to know if there was any pressure applied to the authors to clip the tip when they submitted the article.
Yes, up or down, the only sure thing is that we caused it.
And it’s much worse than we thought.
… and the science is settled!
“one can come to the conclusion that CO2 is causing a slight cooling in the Alps and that this effect will continue as the CO2 increases.”
the ice core is not a thermometer. It doesn't "record" temperature.
it is a proxy
proxy for what is always the question.
proxy for LOCAL T?, regional T? hemispherical T?
annual T? seasonal T?
Among ice core drilling sites in the European Alps, Colle Gnifetti (CG) is the only non-temperate glacier to offer climate records dating back at least 1000 years. This unique long-term archive is the result of an exceptionally low net accumulation driven by wind erosion and rapid annual layer thinning. However, the full exploitation of the CG time series has been hampered by considerable dating uncertainties and the seasonal summer bias in snow preservation. Using a new core drilled in 2013 we extend annual layer counting, for the first time at CG, over the last 1000 years and add additional constraints to the resulting age scale from radiocarbon dating. Based on this improved age scale, and using a multi-core approach with a neighbouring ice core, we explore the time series of stable water isotopes and the mineral dust proxies Ca2+ and insoluble particles. Also in our latest ice core we face the already known limitation to the quantitative use of the stable isotope variability based on a high and potentially non-stationary isotope/temperature sensitivity at CG. Decadal trends in Ca2+ reveal substantial agreement with instrumental temperature and are explored here as a potential site-specific supplement to the isotope-based temperature reconstruction. The observed coupling between temperature and Ca2+ trends likely results from snow preservation effects and the advection of dust-rich air masses coinciding with warm temperatures. We find that if calibrated against instrumental data, the Ca2+-based temperature reconstruction is in robust agreement with the latest proxy-based summer temperature reconstruction, including a “Little Ice Age” cold period as well as a medieval climate anomaly. Part of the medieval climate period around AD 1100–1200 clearly stands out through an increased occurrence of dust events, potentially resulting from a relative increase in meridional flow and/or dry conditions over the Mediterranean.
Mosher
Virtually all measurements of temperature are proxies, whether it is the volumetric expansion of a liquid or the ohmic resistance of a thermistor. I think that the only actual measurement of temperature that would NOT be a proxy would be by comparison of an object of unknown temperature to temperature standards. One could theoretically bracket the temperature by observing the direction of heat flow. However, such an approach would be so awkward that no reasonable person would attempt to use it in practice. A thermometer is as much a proxy for temperature as oxygen isotopes in an ice core are. While thermometers typically give current temperatures, a Min-Max thermometer can also provide historically recent temperatures. As I have remarked before, the problem with self-educated people is that they don't realize what they don't know. To compensate for that, they often act as though they are experts in fields where there are no experts.
https://www.britannica.com/science/temperature
Clyde Spencer,
+10^42 for your comments "As I have remarked before, the problem with self-educated people is that they don't realize what they don't know. To compensate for that, they often act as though they are experts in fields where there are no experts."
I would have given you a googol of credits had you also included Richard Feynman's pertinent quote: "The first principle is that you must not fool yourself and you are the easiest person to fool."
Steven
You are ignoring the very close correlation of the Colle Gnifetti ice core reconstruction with the BEST NH temperature record.
If it is uncertain what the core is measuring then it is equally uncertain what BEST is measuring. What is BEST measuring? Is it LOCAL T?, regional T? hemispherical T?
annual T? seasonal T?
This is damning with faint praise, indeed. I cannot join in the compliment. NOT CHEATING and NOT LYING is expected 100% of the time in all science. The late Dr. Stephen Schneider is darning socks in hell for hinting to his teammates that there are appropriate times for cheating and lying.
Why are socks in hell needed unless the yarn is asbestos?
A spectral (frequency) analysis on Figs 4 and 5 might be interesting. 60-80 years might have power.
Hey, we’re a full-service website … here’s a CEEMD analysis of the question. Discussion of CEEMD here.
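And for anyone who'd rather start with a plain periodogram before diving into CEEMD, here's a minimal sketch; it assumes an evenly spaced series, which the biennial core data roughly is.

```python
# Simple FFT periodogram of an evenly spaced, detrended, Hann-windowed series.
# Look for peaks around 60-80 year periods in the returned arrays.
import numpy as np

def periodogram(years, temps):
    """Return (periods_in_years, power); the zero-frequency (mean) bin is dropped."""
    years = np.asarray(years, dtype=float)
    temps = np.asarray(temps, dtype=float)
    temps = temps - np.polyval(np.polyfit(years, temps, 1), years)   # remove linear trend
    dt = years[1] - years[0]                                         # 2 yr for the biennial core
    spectrum = np.abs(np.fft.rfft(temps * np.hanning(len(temps)))) ** 2
    freqs = np.fft.rfftfreq(len(temps), d=dt)
    return 1.0 / freqs[1:], spectrum[1:]
```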
w.
Willis
Love the link to the Chronological list of weather events
The first few pages are grim, famine after famine.
Sure is nice to live in a modern world fueled by hydrocarbons where the only famines are human caused.
John,
Intuitively, a lead story like that could be challenged when you estimate tons of Pb mined, % of this that went airborne and how well we can measure ppb levels in ice and soils today. Does your reference go into these matters?
Is proxy "data" with no validation really data at all? Good clean fun, but possibly meaningless. Certainly it diverges from the thermometer record in those last few years: a new "Divergence Problem."
“Certainly diverges from the thermometer record in those last few years, a new “Divergence Problem.””
Diverges quite a few times, if you look closely.
“Now, this is a most fascinating temperature dataset. ”
Except it’s not a temperature dataset. It’s a proxy that someone believes matches temperature in some way. But even in your figure 3, there are a fair number of departures from the BEST data (another averaging no-no).
I’d say this proxy is no better than tree rings, which anyone can make match some dataset some of the time.
I’m afraid proxies really aren’t that reliable.
Jeff Alberts July 25, 2020 at 12:20 am
Dang, bro, way to zoom in and hammer on a totally trivial point.
By chance I’ve just been looking at the various temperature datasets—Berkeley Earth, HadCRUT, RSS, MSU, CERES, JMA … and guess what?
Pick any one of them, and you’ll find that there are a “fair number of departures” from the other datasets, just like with Berkeley Earth and Colle Gnifetti. And those are temperature datasets, not proxies! It’s the nature of climate datasets.
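If you want to check that for yourself, here is a hedged sketch of that kind of cross-dataset comparison. The dataset names are real, but the values below are random placeholders, not the actual anomaly series, and the column layout is just an assumption for illustration.

```python
# Sketch: pairwise correlations and typical departures between several
# monthly anomaly series on a common time axis. Values are placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
months = pd.date_range("1979-01", "2019-12", freq="MS")
base = np.cumsum(rng.normal(0, 0.05, len(months)))        # shared underlying signal
datasets = pd.DataFrame(
    {name: base + rng.normal(0, 0.1, len(months))          # each with its own noise
     for name in ["BerkeleyEarth", "HadCRUT", "RSS", "JMA"]},
    index=months,
)

print(datasets.corr().round(2))                            # pairwise correlations

for a in datasets.columns:                                 # spread of the differences
    for b in datasets.columns:
        if a < b:
            diff = datasets[a] - datasets[b]
            print(f"{a} vs {b}: std of monthly differences = {diff.std():.2f} °C")
```

Run the same comparison on the real series and you will find the same thing: every pair correlates well, and every pair shows a fair number of departures.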
If you truly believe that, I’d say you don’t understand either the limitations of tree rings or the nature of ice core data.
Depends on what you are calling “that reliable” … and in any case, condemning the entire universe of proxies is a joke. Here’re some of the proxies:
To no one’s surprise, some are more reliable, some less. In this study they’ve used two proxies to refine their results.
w.
Silly me, I thought you wanted to be accurate. It’s not a temperature dataset.
Willis
You left out the volumetric expansion of a liquid used in a thermometer as a proxy for the temperature of the liquid. 🙂 The real issue is that not all proxies are equal because they may be affected by more than one variable.
You also didn’t reply to Mosher when he said essentially the same thing above: “the ice core is not a thermometer. It doesnt “record” temperature. it is a proxy”
Psst, we don’t average.
Whatever you want to call it. You’re presenting a single line supposedly representing “global temperature” when there is no such thing.
Willis you say “The only real anomaly is recent when the two diverge. No idea what that’s about.”
Did you consider applying the Mike’s Nature Trick fix to what is obviously erroneous data?
Interesting that the divergence of the modern temperatures from the ice core in the first graph happens in the last 30 years, while the warming rate of the last 30 years in the second graph matches the satellite datasets. It suggests data treatment or measurement techniques might be behind the divergence. Perhaps the satellite and BEST datasets need reconciliation.
Willis
Said it before, you are a treasure.
Mosh
When are you going to finally admit that CO2 may not be the driver you keep saying it is?
Since volcanoes and recent ‘unexplained’ increases in temperature have been mentioned already, I have absolutely nothing of any use to add.
Thank you, Willis – I found the article a fascinating read. We need all the accuracy of correlation we can get, before we even consider causation.
I have said this before: HadCRUT and BEST etc. are doing the assessment of the ‘world temperature’ all wrong.
Your sampling of stations must be
1) equal number, NH and SH
2) together, your stations must balance out to ca. zero latitude
3) 30% of stations must be inland, 70% must be near or at sea.
If you do it right you might actually get the right answer, like I did (in 2015):
https://documentcloud.adobe.com/link/track?uri=urn:aaid:scds:US:b94c721a-35ca-4b89-b44c-cc648b2d89b4
It is easy enough, so why do they keep on doing it wrong?
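As an illustration only, here is a rough sketch of those balancing rules applied to a made-up station table. The column names, station counts, and selection logic are assumptions for the sake of the example, not anyone’s published method.

```python
# Sketch: pick a sample with equal NH/SH counts and roughly 70% coastal /
# 30% inland stations, then check that the mean latitude comes out near zero.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
stations = pd.DataFrame({
    "lat": rng.uniform(-70, 70, 1000),          # invented station latitudes
    "coastal": rng.random(1000) < 0.5,          # True if near or at the sea
})

n_per_hemisphere = 50
sample = []
for in_hemisphere in (stations.lat > 0, stations.lat < 0):
    pool = stations[in_hemisphere]
    sample.append(pool[pool.coastal].sample(int(0.7 * n_per_hemisphere), random_state=1))
    sample.append(pool[~pool.coastal].sample(int(0.3 * n_per_hemisphere), random_state=1))
sample = pd.concat(sample)

print(len(sample), "stations, mean latitude =", round(sample.lat.mean(), 1))
```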
It does not even have to be that complex.
They need only select, say, the 100 best-sited stations, free from urbanisation and from any change over the last 150 years, with the best and most complete historic records. Then retrofit each station with the same LIG thermometers as were used there in the past, calibrate those to the standard used in the relevant country at the time, and take measurements at the same time of observation (TOB) as was historically used at that station.
Then you would have 100 sets of data, each of which could be compared directly with itself. No attempt to make hemisphere composites: just compare each station with its own history, and then draw up a list of how many stations show warming (and its magnitude), cooling (and its magnitude), or no significant warming.
We would quickly know whether there is any significant change.
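For what it is worth, here is a bare-bones sketch of that per-station bookkeeping, using invented station records. The linear trend test, the 0.05 significance threshold, and the data are all assumptions for illustration, not a prescription.

```python
# Sketch: fit a linear trend to each station's own record and tally how many
# warm, cool, or show no significant change. Station records are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
years = np.arange(1900, 2021)
tallies = {"warming": 0, "cooling": 0, "no significant change": 0}

for _ in range(100):                            # 100 hypothetical stations
    record = rng.normal(0, 0.5, len(years)) + 0.005 * (years - years[0])
    slope, _, _, p_value, _ = stats.linregress(years, record)
    if p_value > 0.05:
        tallies["no significant change"] += 1
    elif slope > 0:
        tallies["warming"] += 1
    else:
        tallies["cooling"] += 1

print(tallies)
```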
richard
The UHI is real and contributes to warming, even downwind from the urban area. Therefore, it needs to be factored in. However, giving equal weight to urban temperatures and rural temperatures obscures the relationship between anthropogenic warming and natural changes.
So are you always this confident that you are right and the rest wrong? Analysing different types of data using different methods and correlations, as the posts clearly indicate, makes no one person correct. It means it is still debatable and extremely difficult to determine any exact truths. Henry is convinced he did it right, and even sceptical climate scientists do not agree, but Henry is right?
True. I must be wrong because Katie knows it best.
Funny. But it seems the cold is always following me….
https://breadonthewater.co.za/2020/07/07/brrr-it-is-getting-colder/
….and I hate cold weather….
Henry Pool:
Temperatures are cooler now because of the two VEI4 eruptions in June 2019. (It typically takes about a year for the maximum cooling from a VEI4 eruption to occur, and about 18 months for the cooling aerosols to settle out and for temperatures to return to pre-eruption levels, or a bit higher.)
So, hang in there; warmer temperatures are on the horizon!
@ Clyde Spencer & others
The comment made here
https://wattsupwiththat.com/2020/07/24/ancient-temperatures/#comment-3039232
is not from me.
I don’t know how it is possible that someone decided to use my name on the same blog.
If you click on my name you can check if it is in fact me that is making the comment.
Henry, why can’t you reply directly to a comment?
Instead you put your text, whatever it means, completely out of context, which makes it harder to follow.
Just an idea: use the “Reply” link 😀
I am not seeing my last comment about somebody else using my name to make comments on this blog?
“Stop projecting the future and start reflecting on the past.” To do this you need to remove the political tumor from the body of Climate Science.
Here is another study using an alternate data set, suggesting the Roman Empire period, from AD 1 to AD 500, was the hottest in the last 2000 years.
https://www.nature.com/articles/s41598-020-67281-2
It’s good news that advancing technology will improve knowledge of past climates. In this case it is last ablation inductively coupled plasma mass spectroscopy (LA-ICP-MS). These are highly credible reconstructions.
Figure 4 challenges the simple notion of a Medieval Warm Period and a Little Ice Age. There are multiple warm and cold intervals. For instance, at about 1066 and 1233 there are cold minima which are colder than any during the later “LIA”, although they occur during the time that people would loosely call the “MWP”. The only trend possibly discernible is a reduction in the amplitude of temperature fluctuations over the last millennium.
Correction: “last ablation” should read “laser ablation”.
In both figures 3 and 4 there is a trend of decreasing amplitude in reconstructed temperature fluctuation. Before about 1400 there are wide swings of temperature and after that they get narrower. This might be an example of the phenomenon of “amplitude death”, a smoothing of oscillation that sometimes affects coupled chaotic oscillators. Here’s a paper on the subject:
https://www.researchgate.net/publication/7387084_Amplitude_death_in_coupled_chaotic_oscillators
The author A Prasad mentions an important role of delayed coupling or delayed feedback in amplitude death. We know that with the oceans especially, there is always delayed coupling in the atmosphere-ocean climate system.
Also, amplitude death can be a precursor to a phase-flip bifurcation (PFB)
https://www.nature.com/articles/s41598-018-30026-3
Interesting times ahead.
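One simple way to put a number on that narrowing of swings is a rolling standard deviation of the reconstruction. Here is a sketch on a placeholder series whose variance shrinks over time; the window length and the invented data are assumptions, not the Colle Gnifetti values.

```python
# Sketch: rolling standard deviation as a crude amplitude measure.
# The series is invented, with variance that shrinks toward the present.
import numpy as np
import pandas as pd

years = np.arange(800, 2007, 2)                            # 2-year steps
scale = np.linspace(1.0, 0.4, len(years))                  # invented shrinking amplitude
series = pd.Series(np.random.default_rng(3).normal(0, 1, len(years)) * scale,
                   index=years)

rolling_sd = series.rolling(window=50, center=True).std()  # 50 samples = ~100 years
print(rolling_sd.dropna().iloc[[0, -1]])                   # amplitude early vs late
```

On the real reconstruction, a downward drift in that rolling value would be the quantitative version of the narrowing visible by eye in figures 3 and 4.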
@ Willis
excellent work!
@ Phil Salmon & others who might be interested:
Proof that man-made greening leads to man-made warming….
https://journals.ametsoc.org/doi/pdf/10.1175/JCLI3627.1
(read the conclusion, if you get a chance)
BW
Henry
Henry
I think overall greening leads to cooling.
Albedo is not the whole story.
Yes, greening darkens the surface, lowering its albedo.
But it also increases plant transpiration of water vapour into the atmosphere, and enhances the trapping of water in soils by plants 🌱. This is how plants changed much of the earth from arid to vegetated. On balance it’s a cooling effect, a huge negative feedback to any warming effect CO2 may have.
Phil
Also, some of the light that is not reflected is used for photosynthesis, and not heating. Therefore, the simple albedo of vegetation is not a good measure of the contribution to warming. Similarly, it is not a good indicator of the emissivity.
“The only real anomaly is recent when the two diverge. ”
I can see some real divergence around 1775, 1870, and 1965 in figure 3.
Comparing weather history to even the two-year-resolution ice core data won’t prove much, because of the large year-by-year variability of seasonal temperature anomalies. And which season has the largest effect on the ice core? Also, the records in the James A. Marusek chronology are sparse; the 880s, for example, only mention rain and cold, nothing to confirm high summer heat.
“Life is short, be sure to take the time to take the time.” This is why we love you. Keep it up. I’ve taught my youngins more about this issue from your work than from almost any other. Armed with your analysis, even their “teachers” couldn’t argue the science any longer, but fell back on the “consensus” argument. Quickly dispelled. You’re an inspiration and I hope to be inspired for many more years.
Thanks for the kind words, Tom, always glad to hear that folks enjoy my work. If you haven’t been over to my website, it’s called “Skating Under The Ice“, and it features all kinds of posts, both about current events as well as autobiographical tales of my outré adventures like “Life In The Psychedelicatessen” and “A Pacific Penance” …
Finally, there’s an index to most of my work here …
Regards,
w.
Willis Eschenbach
July 27, 12:37 pm
“I’m using science, and you are using handwaving”
A strange characterization of actual data.
Your use of “math and science” has led you to the WRONG conclusion, that there is NO sign of any temperature drop following eruptions.
There is ALWAYS a drop in temperatures following an explosive VEI4 or larger eruption, because of their injection of SO2 into the stratosphere.
Consider Pinatubo, which lowered temperatures by approx. 0.5 deg. C
And Tambora, in 1815, which cooled the northern hemisphere by 0.53 deg. C in 1816, causing “the year without a summer”.
To insist otherwise is a poor reflection on your competence.
You also have a strange view of volcanic eruptions, saying that we should be freezing by now, if the LIA temperature decreases were due to eruptions.
The climatic effect of a given VEI4 eruption lasts for only about 3 years,
and for perhaps 10-15 years for a VEI7 eruption. There is NO carry-over to the present.
Burl, please point to the sign of the drop in temperatures after the eruptions you’ve listed … here are the eruptions in question:
And here are the temperatures stacked on the date of the eruption. If on average there was a temperature drop after the eruptions, it would be visible in the average line.
Where is the drop in temperatures?
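For reference, here is a bare-bones sketch of that stacking (superposed-epoch) procedure on a placeholder series. The eruption dates, the 36-month window, and the temperature values below are illustrative only, not the data behind the graph.

```python
# Sketch: align a monthly temperature series on each eruption date, express
# each segment relative to its pre-eruption mean, and average the segments.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
months = pd.date_range("1880-01", "2019-12", freq="MS")
temps = pd.Series(np.cumsum(rng.normal(0, 0.03, len(months))), index=months)

eruption_dates = pd.to_datetime(["1902-10-01", "1963-03-01",
                                 "1982-04-01", "1991-06-01"])   # illustrative only

window = 36                                     # months before and after each eruption
segments = []
for d in eruption_dates:
    i = months.get_loc(d)
    seg = temps.iloc[i - window:i + window + 1].to_numpy()
    segments.append(seg - seg[:window].mean())  # anomaly relative to pre-eruption mean

stacked = np.mean(segments, axis=0)
lags = np.arange(-window, window + 1)
print("mean anomaly 0-24 months after eruption:",
      round(stacked[(lags >= 0) & (lags <= 24)].mean(), 3), "°C")
```

If the eruptions produced a consistent global cooling, the post-eruption part of the stacked average would sit clearly below the pre-eruption part.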
You also say:
Or not … see “When Eruptions Don’t”
You know the exact cooling to a hundredth of a degree for an event over a century ago? Really?
In any case, see “Missing The Missing Summer”
Finally, you say:
That would be true IF there were no compensatory response by the climate system … but there is such a response. As soon as the globe starts to cool for any reason, all across the tropical ocean the daily cumulus cloud fields emerge a bit later. This allows more sunlight in, which warms the globe back up and counteracts the cooling.
So what happens from a volcano is a short-term local cooling downwind from the volcano … but, as shown in the stacked volcano graph above, this does NOT translate to a global cooling. The months immediately after an eruption are no cooler on average than the months immediately before an eruption.
Here are 18 or so previous posts of mine analyzing the rather small effects of volcanic eruptions on global temperatures.
Best regards,
w.
Concerning the comment by Ian W on the importance of using the (moist) enthalpy as a metric for warming/cooling (something Prof. Roger Pielke always insisted upon): you may look here for a live plot of moist enthalpy (and sensible heat).
Just to be sure that I did not goof with the html code: https://meteo.lcd.lu/today_01.html
There is also a link in a comment on how to calculate these from the usual meteorological parameters (click on the word “enthalpy”).
This whole endeavor started with a discussion at Prof. Roger Pielke’s (Sr.) web site a long time ago….
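For anyone who wants to try the arithmetic, here is a back-of-the-envelope sketch of moist enthalpy computed from ordinary readings. The constants and the Magnus formula are standard, but the function name and the example numbers are purely illustrative.

```python
# Sketch: moist enthalpy h = cp*T + Lv*q from temperature (°C), relative
# humidity (%), and station pressure (hPa). Example values are invented.
import math

def moist_enthalpy(t_celsius, rh_percent, pressure_hpa):
    cp = 1005.0      # J/(kg·K), specific heat of dry air
    lv = 2.501e6     # J/kg, latent heat of vaporisation
    # Magnus approximation to the saturation vapour pressure (hPa)
    e_sat = 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))
    e = rh_percent / 100.0 * e_sat
    q = 0.622 * e / (pressure_hpa - 0.378 * e)   # specific humidity (kg/kg)
    return (cp * t_celsius + lv * q) / 1000.0    # kJ/kg, relative to 0 °C

# Same 20 °C air, very different heat content depending on humidity:
print(round(moist_enthalpy(20.0, 30.0, 1013.0), 1), "kJ/kg at 30% RH")
print(round(moist_enthalpy(20.0, 90.0, 1013.0), 1), "kJ/kg at 90% RH")
```

That humidity dependence is the point behind using enthalpy rather than temperature alone as the metric for warming and cooling.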