Past and Present Warming – A Temporal Resolution Issue

By Renee Hannon

Christian Freuer has translated this post into German here.

This post examines how present global surface temperatures compare to the past 12,000 years during the Holocene interglacial. The AR6 IPCC climate assessment report, Climate Change 2021: The Physical Science Basis, by Working Group 1 states in their Summary for Policymakers section A.2.2:

“Global surface temperature has increased faster since 1970 than in any other 50-year period over at least the last 2000 years (high confidence). Temperatures during the most recent decade (2011–2020) exceed those of the most recent multi-century warm period, around 6500 years ago [0.2°C to 1°C relative to 1850–1900] (medium confidence). Prior to that, the next most recent warm period was about 125,000 years ago, when the multi-century temperature [0.5°C to 1.5°C relative to 1850–1900] overlaps the observations of the most recent decade (medium confidence).”

— AR6, Working Group 1 Summary for Policymakers

Paleoclimate proxy data records have low temporal resolution.

Comparing present instrumental data to the past is no small task. Temperature data for the Holocene and older periods are indirect measurements based on proxies. Scientists have compiled and extensively analyzed these proxy data covering the past 10,000 years. The datasets contain hundreds of records and include terrestrial, marine, lake, and glacial ice proxy data, to name a few.

Unfortunately, lake and marine proxy data are smoothed due to sediment mixing and uncertain age control. Smoothing of paleoclimate proxy data also occurs when multiple data types are averaged together, which destroys higher-frequency decadal information (Kaufman and McKay, 2022). Hence, Holocene proxy data are multi-century at best, representing an average temperature smoothed over a couple of hundred years.

The IPCC statement above is correct but can be misleading: it compares decadal average temperatures to multi-century average proxy data. To better understand how modern temperatures compare to the past, one can either deconvolve past proxy data or smooth present instrumental temperatures, so that the two are compared at a similar temporal resolution.
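
A minimal sketch of this smoothing argument, using synthetic numbers chosen only to mimic the shape of the instrumental record (not real HadCRUT data): a series that is flat until 1950 and then ramps to about +1.2 deg C reproduces the contrast between a roughly 1 deg C recent decade and a much cooler multi-century mean.

```python
import numpy as np

# Illustrative synthetic annual anomalies, 1850-2020 (NOT real data):
# flat near the pre-industrial baseline until 1950, then a ramp to ~ +1.2 C.
years = np.arange(1850, 2021)
anoms = np.where(years < 1950, 0.0, (years - 1950) * 1.2 / 70)
anoms += np.random.default_rng(0).normal(0, 0.1, years.size)  # observation noise

decadal = anoms[-10:].mean()  # a recent-decade mean, like the IPCC's 2011-2020
century = anoms.mean()        # one multi-century-style average over 1850-2020

print(f"last-decade mean: {decadal:+.2f} C")  # ~ +1.1 C
print(f"170-year mean:    {century:+.2f} C")  # ~ +0.25 C
```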

Kaufman and McKay, 2022, wrote a technical note comparing multi-century present and future temperatures to the past. They used the average of instrumental data plus AR6 model projections to show a global mean temperature of about 1 deg C over the 200-year period from 1900 to 2100, as shown in Figure 1. This average includes 120 years of present instrumental data and 80 years of future modeled projections. The pre-industrial baseline is defined by the IPCC as the average global temperature during 1850-1900.

Figure 1. Global surface temperature from the instrumental data used by the IPCC, consisting of four datasets (HadCRUT, NOAA, Berkeley Earth, Kadow), shown in black, from Trewin, 2022. Global temperature projections for three emissions scenarios (low, intermediate, and high) from AR6 are also shown. The length of each dashed line indicates the period over which data were averaged. All datasets are calibrated to an 1850-1900 pre-industrial baseline. After Kaufman and McKay, 2022.

Instrumental temperature data have been around since 1850, about 170 years, so they are closely approaching a bicentennial timescale. Note that pre-1950 HadCRUT instrumental data are considered lower quality due to sparser coverage and increased noise (McLean, 2018). Since IPCC scientists use simple averages for comparison to the past, averaging the instrumental data should also be considered as a present-day base case. Using the IPCC’s instrumental dataset, a simple average over the last 170 years shows a global temperature anomaly of a whopping 0.3 deg C, with an uncertainty range of 0.1, above the pre-industrial baseline, as shown in Figure 1.

Smoothing instrumental temperatures over the last century and a half allows a truer comparison to smoothed multi-century proxy data. This smoothed instrumental value shows 70% less warming than the 1 deg C represented by the present-plus-future temperature mean over a 200-year period. Annual global instrumental temperatures have been at or slightly above 1 deg C for only about a decade. That’s not even close to a multi-century comparison with the past.

A More Valid Comparison of the Present to the Past

Using instrumental data without adding in uncertain modeled future projections seems a better way to compare present temperatures to the past. Nobody knows how accurate model projections are, especially considering the debates about their track record of not matching observed temperatures and past proxy data. A smoothed instrumental average for comparison to the past is absent from the AR6 report and is never established, mentioned, or recognized by the IPCC. Figure 2 shows the 170-year instrumental temperature average (small black square) compared to past proxy data during the Holocene.

Figure 2: Millennial global surface temperature ranges from proxy data over the Holocene. Temperature 12k data from Kaufman, 2020. Data are calibrated to an 1850-1900 pre-industrial baseline. The instrumental data represent less than 1.5% of the past 12,000 years.

The Holocene climatic optimum (HCO) occurred 6000-7000 years ago, with the warmest 200-year-long interval estimated at 0.7 deg C, with an uncertainty range of 0.3 to 1.8 deg C, according to extensive proxy data compiled by Kaufman, 2020. An earlier proxy study by Marcott, 2013, shows an HCO temperature mean of 0.8 deg C, with a two-standard-deviation uncertainty of 0.3, above the pre-industrial period. Marcott also confirms that proxy records completely remove centennial variability: no variability is preserved at periods shorter than 300 years in his reconstruction. Andy May also performed a Holocene global reconstruction using proxy data here. His reconstruction shows an HCO of 0.85 deg C above the pre-industrial baseline and over 1 deg C warmer than the coldest time of the Little Ice Age. Figure 3 shows the 170-year instrumental temperature average compared to the HCO temperatures of these reconstructions.

Chemical, biological, and physical data support a warmer Holocene past. A mid-Holocene climatic optimum is supported by pollen records, which show expanded grass and shrub vegetation in the African Sahara, increased temperate forest cover in Northern Hemisphere mid-latitudes, and boreal forest instead of tundra in the Arctic (Thompson, 2022). Glacier and ice-cap fluctuations recorded in Arctic lake studies were smaller than present, or absent, during the early and mid-Holocene (Larocca, 2022). Both Javier Vinos, 2022, and Kaufman, 2023, provide a thorough discussion of the empirical evidence at different latitudes supporting a warmer mid-Holocene.

Figure 3: Histogram of proxy temperature reconstructions, in gray, showing the warmest temperature of the Holocene compared to the multi-century average of instrumental data, in red. Error bars are shown by the black lines. All temperature deltas are relative to the 1850-1900 pre-industrial period. No climate model data are included.

Even the IPCC states that around 6500 years ago temperatures ranged from 0.2°C to 1°C warmer relative to the 1850–1900 pre-industrial period. Therefore, the present 170-year average global temperature is most likely cooler than the past Holocene climatic optimum 6500 years ago. As a matter of fact, the present average temperature barely reaches the 5% minimum error bar on one of the reconstructions and is just over the bottom of the IPCC’s range.

In the IPCC technical justification note, Kaufman and McKay, 2022, conclude that recent global warming plus the modeled upcoming warming reaches a level unprecedented in more than 100,000 years. My emphasis on the word plus. Without including future modeled temperatures, the present instrumental temperature, averaged over 170 years, does not exceed the warmest multi-century period of the Holocene based on proxy data. And it’s not even close to the last interglacial period, when multi-century temperatures were almost 1.5 deg C warmer than the pre-industrial period. If, big IF, the climate models are considered reliable, then perhaps 80 to 100 years from now, global temperatures might be as warm as the past Holocene Climatic Optimum.

Download the bibliography here.

sherro01
March 29, 2023 10:25 pm

Thank you, Renee.
Can you provide any reasons why the IPCC inclusion of modeled, projected data out to 2100 cannot be regarded as deliberate scientific misconduct?
Like many other scientists, I think it is way past time that the IPCC faced a Spanish Inquisition. You just do not try to pull shifty math like that in scientific circles that I have moved in. Geoff S

DavsS
Reply to  sherro01
March 30, 2023 5:27 am

There can be no doubt that it is deliberate; the objective is to generate the scariest numbers that they can get away with. In the same way, they define their temperature anomaly metric in terms of a 30-year average, but then give a headline comparison between current temperatures and the 1850-1900 baseline using a 10-year average.

DavsS
Reply to  DavsS
March 31, 2023 4:50 am

As recorded here, the IPCC has already tried to defend what they did, but that they did it multiple times demonstrates beyond any reasonable doubt that it was deliberate.

https://m.youtube.com/watch?v=-c86QFtrP9A

The previous video from the same channel introduces the issue.

Mr David Guy-Johnson
March 29, 2023 11:09 pm

Extremely interesting. Thank you

Nick Stokes
March 29, 2023 11:33 pm

Renee,
You didn’t say how you did the smoothing, so I presume just a moving average. The problem is that the result is then lagged, with the central point 85 years ago. So you are comparing, in effect, the temperature of the Holocene with the temperature of 1938 – ie before most of the AGW.
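
For readers checking Nick’s lag arithmetic: the temporal midpoint of a 170-year window ending in 2020 does sit in the mid-1930s. A quick sketch (this assumes a plain trailing mean; the post does not state which smoothing was used):

```python
import numpy as np

years = np.arange(1850, 2021)      # the instrumental period
window = 170
midpoint = years[-window:].mean()  # temporal centre of a trailing 170-yr mean

print(f"a {window}-yr mean ending in 2020 is centred on {midpoint:.0f}")  # ~1936
```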

Chris Hanley
Reply to  Nick Stokes
March 30, 2023 12:05 am

AGW is irrelevant; the article is comparing the supposed instrumental-period average to the supposed Holocene Optimum average, irrespective of climate factors.

Nick Stokes
Reply to  Chris Hanley
March 30, 2023 2:12 am

OK, so we may not know so much about the Holocene Optimum period (so why “Optimum”?). But we know what is happening now.

mickeyreno
Reply to  Nick Stokes
March 30, 2023 2:36 am

Nick Stokes wrote: “But we know what is happening now.”

Ha ha ha ha ha ha ha… Thanks for the great laugh, Nick. You’re ready for open mic night at the local comedy club.

Joseph Zorzin
Reply to  mickeyreno
March 30, 2023 4:49 am

better yet, he should do a Netflix Special 🙂

Editor
Reply to  Nick Stokes
March 30, 2023 3:35 am

Nick, because we need to know if “now” is unusual. When the global average surface temperature varies almost four degrees every single year, and local temperatures often vary more than that in any 24-hour period, it seems likely that one degree of warming in 170 years is pretty normal.

ZenoMorphic
Reply to  Nick Stokes
March 30, 2023 3:40 am

Optimum because the Sahara was a grassland, and what is tundra now was forest then. As a species that evolved in equatorial Africa, a warmer climate is much more optimal for us, for food growth and life in general, than a colder one.

Besides – you know that the term “Holocene thermal optimum” was coined before your cult took over the institutions. It was not until the cult that people started pretending 1850 was some kind of utopian climate age.

Joseph Zorzin
Reply to  ZenoMorphic
March 30, 2023 4:52 am

“a warmer climate is much more optimal for us”

Here in frigid, damp Woke-achusetts, I’ve been desperate for a day over 60 F for 5 months! And since all my ancestors for centuries were in Italy, my genetics desperately need WARMER temperatures.

michael hart
Reply to  ZenoMorphic
March 30, 2023 5:04 am

Yup. Stokes fell into his own elephant trap with that one.

Tony_G
Reply to  Nick Stokes
March 30, 2023 10:21 am

Then how can we know that now is worse?

Richard M
Reply to  Nick Stokes
March 30, 2023 12:36 pm

But we know what is happening now.

We? You are right that many of us do know that humans are not affecting the climate. You on the other hand keep claiming the opposite.

ThinkingScientist
Reply to  Nick Stokes
March 30, 2023 12:40 am

Rubbish Nick, she is correctly comparing measures at similar resolution. You will need another 100 years of temperature DATA to do what you suggest.

So let’s put all the climate crisis policy nonsense on hold until we actually have sufficient data to make a valid comparison. And call out bogus comparisons of rates of warming at resolutions differing by at least 2 orders of magnitude.

Nick Stokes
Reply to  ThinkingScientist
March 30, 2023 2:10 am

“Rubbish Nick, she is correctly comparing measures at similar resolution.”

Yes, if you go back far enough you’ll always find data of lower resolution that you can’t compare with. So? We know what is happening now.

ThinkingScientist
Reply to  Nick Stokes
March 30, 2023 2:33 am

We know it’s warming now.

We know it’s warmed (and cooled) before.

To show current warming is “unprecedented” or somehow exceptional you have to be able to compare to previous historical warming at the same resolution.

You can’t unless you reduce the resolution of the modern measurements to that of the paleo data. And that’s a one way street.

So claims of “unprecedented” or “climate crisis” are therefore unproven. Any other claim about modern versus paleo warming is without scientific merit, and anyone persisting in comparing rates of warming at differing resolutions (at least 2 orders of magnitude apart) should be called out. It’s not science, it’s recklessly misleading.

wilpost
Reply to  ThinkingScientist
March 30, 2023 4:54 am

Thank you, Renee, for a fresh look at data

It is recklessly misleading, but the IPCC gets away with such lying, because there are plenty of brainwashed sheep among the masses.

It takes just one dog to control such a flock. The elites know this, so they collaborate with their lapdog media to rule as they like.

The IPCC has been getting away with its 100-plus bogus computer models since well before 1979, but the increasing gap between “prediction” and satellite measurements has become more and more obscene over the past 44 years.

The IPCC knows about it, but nevertheless spouts its high temperature nonsense, supported with fantasy graphs.

The IPCC knows it can count on the central command/control Biden posse fanatics, who are willing to wipe out the whale population to build dysfunctional, 900-ft-tall wind turbines.

stinkerp
Reply to  Nick Stokes
March 30, 2023 3:46 am

But due to the low temporal resolution of paleoclimate proxies from ice cores, sediment, etc., especially as you go further back in time, we have no way to know how quickly it warmed so any statements proclaiming the supposedly “unprecedented” pace of warming in the last 50 or 100 or 150 years are complete nonsense. If it warms another 1 or 2 °C in the next hundred years (far from certain based on our limited data), we’ll have enough data to say “unprecedented warming” with a high degree of certainty when compared to the low temporal resolution data from thousands of years ago, yet we still won’t be able to say that when compared to data over the last million years where we see repeated cycles of natural warming and cooling by up to 14 °C. It still remains unknown if the ~1 °C warming since the mid-1800s is natural, mostly natural, or mostly caused by human greenhouse gas emissions.

Rich Davis
Reply to  Nick Stokes
March 30, 2023 4:00 am

Yes indeed. We can see that the climate is getting milder, deaths from extreme weather events are down 90%, food production is booming. It’s a catastrophe!

Joseph Zorzin
Reply to  Rich Davis
March 30, 2023 4:54 am

doomsayers will always be with us

TheFinalNail
Reply to  ThinkingScientist
March 30, 2023 2:14 am

…she is correctly comparing measures at similar resolution.

Yes, but the big difference is that we have thermometer observations for the recent record. We know that the current global warming trend didn’t really start in the thermometer record until the 1950s, so, as Nick says, the entire period of observed warming is masked when it is averaged alongside the earlier part of the record.

You might argue that the same could have been true of earlier periods and that might be right; but we have no way of knowing at present. We do know about the past ~170 years though.

ThinkingScientist
Reply to  TheFinalNail
March 30, 2023 2:38 am

It’s irrelevant. Resolution is resolution. You want two data points of modern temperature data on a graph with a support (resolution) of 200 years? You need 400 years of temperature measurements that you can divide into two samples, averaging 200 years of data into each. Then measure the rate of warming between the first and second blocks of data. That’s the rate to compare.

And Marcott notes resolution of 300 years, so you would need 600 years of modern temperature data to compare a warming rate.

Changes of variance and rates of change in time series under change of scale are very significant, and the conversion can only be done in one direction – high resolution to low resolution. Renee’s point here is very valid, and anyone claiming otherwise does not understand resolution and its impact on rates of change and variance in time series.
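
The block arithmetic described above, sketched with made-up numbers (a hypothetical 400-year record split into two 200-year averages):

```python
import numpy as np

rng = np.random.default_rng(1)
# 400 hypothetical years of annual anomalies with a slow 0.5 C underlying drift
anoms = np.linspace(0.0, 0.5, 400) + rng.normal(0, 0.2, 400)

blocks = anoms.reshape(2, 200).mean(axis=1)  # two non-overlapping 200-yr means
rate = (blocks[1] - blocks[0]) / 200         # deg C per year at 200-yr support

print(f"block means: {blocks[0]:+.3f} C, {blocks[1]:+.3f} C")
print(f"rate resolvable at 200-yr support: {rate * 100:+.2f} C per century")
```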

Tim Gorman
Reply to  ThinkingScientist
March 30, 2023 4:09 am

Far too many in the climate alarmist clique do not understand resolution at all. They think you can increase resolution by averaging multiple data points – i.e., that you can measure the diameter of a crankshaft journal with a yardstick if you just take enough measurements and average them together.

And variance? Forget it. Do you *ever* see any variance quoted with the global average temperature?

ThinkingScientist
Reply to  Tim Gorman
March 30, 2023 4:26 am

Worse than that, and not just restricted to climate scientists, there are those who think you can “downscale” from low resolution measurements to high resolution.

Just where they think the unmeasured additional bandwidth comes from is beyond me…

Tim Gorman
Reply to  ThinkingScientist
March 30, 2023 5:00 am

They’ve never heard of the rules for significant digits.

Joseph Zorzin
Reply to  Tim Gorman
March 30, 2023 5:00 am

heck, even I know that and I only took a Mickey Mouse one credit course in statistics 52 years ago- I only wish now that I had taken a better statistics course- it’s a very powerful tool- without which you don’t have real science

Tim Gorman
Reply to  Joseph Zorzin
March 30, 2023 5:14 am

Never forget, statistics is only a descriptive tool for collected data. It is the data that is the science, not the statistics.

For instance, multiple measurements of the same thing using the same device under repeatability conditions, and multiple single measurements of different things using different devices lacking repeatability conditions, can *both* be described by the same statistical descriptors, such as average and variance.

Those statistical descriptors can be useful in one situation and not so useful in the other. And even then it depends on the calibration status of the measurement devices as to whether the statistical descriptors give *accurate* information.

Joseph Zorzin
Reply to  Tim Gorman
March 30, 2023 5:19 am

“It is the data that is the science, not the statistics.”

I think it’s both. The data is like the pixels in an image and statistics helps us see the image- though people will debate what they see in the image. Then comes theory building, testing, etc. I’m no scientist but I get the basics.

Jim Gorman
Reply to  Joseph Zorzin
March 30, 2023 7:03 am

The data is the science. Science is more about making a hypothesis, creating a mathematical description, and designing experiments capable of obtaining the data necessary to prove or disprove the hypothesis. The experiments must capture the data at a resolution that is appropriate for making a decision.

The largest part of current temperature data does not meet the resolution required to adequately calculate the small changes that are occurring. Few climate scientists appear to have the experimental-science education needed to deal with actual, measurable physical quantities. Few experimental scientists – or, for that matter, mechanics, machinists, and engineers – would have the chutzpah to average disparate measurements and, through “arithmetic averages”, extend resolution by simple mathematical calculations.

One can use statistics as a tool to make inferences about the data, but careful attention must be given to the assumptions behind the tool. Climate science in general does not do this, and therefore the inferences they arrive at are questionable!

Joseph Zorzin
Reply to  Jim Gorman
March 30, 2023 8:37 am

I certainly doubt tree rings as proxies for temperature. As a forester for 50 years- I know that tree ring widths have more to do with rain than temperature – along with age of tree, competition with other vegetation, injuries and diseases, etc. Sides of the tree with more sun will be larger. Lots of information there but little about temperature. Of course there are other “temperature proxies” of which I know nothing- but I have little confidence in them. I’m waiting for the aliens to land and tell us precisely what’s happened over the past few million years as they’ve likely been monitoring the Earth. I say this last part half in jest and half with the hope they exist and will enlighten us. Having seen a UFO once I have more confidence in aliens than I do in climate alarmists.

Tony_G
Reply to  ThinkingScientist
March 30, 2023 10:25 am

I propose an experiment around the idea of resolution:

Measure the temperature once daily at 7am for a month (the historical record), then one day measure it once per hour from 7am-2pm. Chart the results. I bet you see changes at an “unprecedented” rate in the hourly data.

Given the resolution differences, I don’t see how any comparison of the modern record vs. the historical can make any claim about the speed of any change.
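
Tony’s thought experiment is easy to simulate with an idealised diurnal cycle (all numbers made up). The once-daily record shows no change at all, while the hourly record shows a rapid “warming” that is purely a sampling artifact:

```python
import numpy as np

def temp(hour):
    # Idealised diurnal cycle: coolest near 4 am, warmest near 4 pm (made up).
    return 15 + 8 * np.sin(2 * np.pi * (hour - 10) / 24)

daily_7am = np.array([temp(7) for _ in range(30)])  # a month, sampled at 7 am
hourly = np.array([temp(h) for h in range(7, 15)])  # one day, 7 am to 2 pm

print("day-to-day spread in the 7 am record:", np.ptp(daily_7am))  # 0.0
print("apparent warming from 7 am to 2 pm:  ", hourly[-1] - hourly[0])  # ~12.6
```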

Renee
Reply to  Tony_G
March 30, 2023 12:19 pm

Tony_G,

You are missing the point as you are still comparing instrumental temperature to instrumental temperature in your experiment. A better experiment would be to measure the air temperature once monthly (the present), and then measure the temperature of groundwater once a decade (the historical proxy data).

Deep (10-20 m) groundwater or subsurface sediment temperatures have been successfully related to mean annual air temperatures [Todd, 1980]. In groundwater, seasonal variations in heat fluxes are dampened out and subsurface sediment temperatures are constant throughout a year. With time and during burial, the annual sediment layers are mixed by bioturbation and storms and become averaged over multiple annual layers.

Tony_G
Reply to  Renee
March 30, 2023 5:18 pm

Renee, I don’t think I missed your point, but the only thing I was addressing was sampling rate (i.e. “resolution”). If you have more frequent samples, you can see finer detail. Annual or even decadal samples, whatever the means, cannot be reasonably compared to centennial or longer samples. I was ONLY addressing that, with a ridiculously simple thought experiment.

Proxy vs. instrument makes the comparisons, IMO, even less valid, which I believe was what you were saying.

Joseph Zorzin
Reply to  TheFinalNail
March 30, 2023 4:56 am

the only way I’d be convinced about the past 170 years would be if we had a million thermometers across the planet then and now – we have more than that now, but not uniformly everywhere

Jim Gorman
Reply to  Joseph Zorzin
March 30, 2023 7:16 am

Even a million thermometers wouldn’t help if they didn’t have the resolution to detect the very small changes that climate science says are occurring.

You simply cannot add resolution by averaging. Too many mathematicians like to quote the Central Limit Theorem as allowing one to achieve the information that provides additional resolution. This is so far off base it is pathetic.

The CLT only tells one how closely a series of sample means can predict the actual true mean. It describes an interval surrounding the estimated mean (calculated from samples), within which the true mean may lie. It is known as the Standard Error of the sample Means (SEM). That interval has nothing to do with the resolution of the data, nor the resolution of the calculated mean. Those resolutions are determined by using Significant Digit Rules.

To summarize, those who claim that the SEM is a measure of the applicable resolution of the true mean are sadly mistaken.
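
The SD/SEM distinction described above can be shown numerically (a sketch with synthetic data; whether the SEM is the right uncertainty measure for temperature means is exactly what is disputed in this thread):

```python
import numpy as np

rng = np.random.default_rng(2)
population = rng.normal(12.0, 5.0, 1_000_000)  # hypothetical temps, true SD = 5

for n in (10, 100, 10_000):
    sample = rng.choice(population, n)
    sd = sample.std(ddof=1)   # spread of the data: stays near 5 regardless of n
    sem = sd / np.sqrt(n)     # SEM = SD / sqrt(n): shrinks as n grows
    print(f"n={n:>6}:  SD = {sd:5.2f}   SEM = {sem:6.3f}")
```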

Joseph Zorzin
Reply to  Jim Gorman
March 30, 2023 8:39 am

and here I thought all those PhD climate scientists had to take advanced statistics- is that not true? so why don’t they use it correctly and why isn’t that failure caught in “peer review”? Other than the fact that the peer review system is in failure mode?

Tim Gorman
Reply to  Joseph Zorzin
March 30, 2023 11:04 am

Most of the statistics taught today simply denies the fact that uncertainty exists. I have five introductory college statistics textbooks I have purchased over the years. Not a single one has even one example where the data is given as “stated value +/- uncertainty”. None of the several physics textbooks I have do either. I learned about uncertainty and significant digits in my EE and Chemistry labs many, many moons ago – when nothing ever came out exactly matching what I calculated from the “rules”.

One anecdote. My youngest son, when starting in microbiology, was told to not worry about taking anything more than basic math requirements and that included not taking any statistics courses. He was told “if you need a statistical analysis done go find a math major!”. Thankfully he listened to me and took several stat courses so he *knows* how to analyze data. He’s also a very hands on experimenter and knows the difference between data and statistical descriptions.

I suspect that is the problem in so many fields today. The PhD understands the data but knows very little in how to analyze it using statistics – so they have no base to judge the efficacy of the statistical analysis. The stat majors doing the statistical analysis know how to analyze the data but understand nothing about it and so have no base to judge the efficacy of the statistical analysis. You wind up with the blind leading the blind (pardon my lack of wokeness) down the road to perdition.

Joseph Zorzin
Reply to  Tim Gorman
March 30, 2023 11:17 am

though I claim to know almost nothing about statistics – all this time I thought it was ALL about uncertainty. I should think the lack of appreciation of the topic might be fine in a world dominated by dogma, but not in one which wants to advance science and engineering and public policy

regarding the climate thing – it makes perfect sense to me that, given the size of the Earth, its complexity, and the added complexity of 8B people and what they’ve done, the claim that it’s settled that we have a climate emergency is 1% science and 99% hard leftist politics – it just doesn’t smell right. Then, when I heard Gore saying the oceans are boiling, I realized they aren’t just wrong – they’re totally crazy

Tim Gorman
Reply to  Joseph Zorzin
March 30, 2023 11:32 am

“they’re totally crazy”

True believers.

Editor
Reply to  Tim Gorman
March 30, 2023 1:27 pm

For those with a little bit of training in statistics, I recommend Statistics Done Wrong, by Alex Reinhart, who admits that “I still take obsessive pleasure in finding ways to do statistics wrong.”

If you are a scientist, I defy you to read the book without finding at least one mistake you’ve made in your career!

Don’t ask me which one(s) I’ve made, I will deny it!

Jim Gorman
Reply to  Andy May
March 30, 2023 3:32 pm

The first thing people, including climate scientists, need to ask themselves is “are we dealing with samples or with the entire population”. No one has ever done that where I could find it.

What they end up doing is dividing the standard deviation they find by the √n, where they claim that the number of stations = “n”. In so doing, they are claiming that they have the entire population of temperatures making up the Global Average Temperature.

If that is true, then the Standard Deviation of the population is what should be quoted and not divided by anything.

I’ve had folks claim individual stations are samples. If that is true, then “n” is the size of each sample, i.e., 12 (months), and still not the number of stations. In order to calculate the population Standard Deviation from a sample-means distribution, you MULTIPLY the standard deviation of the sample-means distribution by the square root of the sample size. The equation is all over the internet; there is no excuse for anyone to not know it. SEM = SD/√n. If you know the SEM, i.e., the standard deviation of the sample means, then it becomes SD = SEM * √n.

It is enlightening that no one ever discusses how they calculate the variance of an anomaly. That should be Var(X−Y) = Var(X) + Var(Y). In other words, the variance of a month’s average plus the variance of the baseline. Wanna bet whether it is calculated that way?
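
A quick numerical check of the identity cited above, with made-up numbers; note that Var(X−Y) = Var(X) + Var(Y) holds for independent X and Y, and picks up a covariance term otherwise:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0.0, 0.4, 1_000_000)  # e.g. monthly-mean anomalies (hypothetical)
y = rng.normal(0.0, 0.3, 1_000_000)  # e.g. baseline means, independent of x here

print(np.var(x - y))                 # ~ 0.4**2 + 0.3**2 = 0.25 when independent
print(np.var(x) + np.var(y) - 2 * np.cov(x, y)[0, 1])  # general form, any case
```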

Clyde Spencer
Reply to  Joseph Zorzin
March 30, 2023 11:58 am

and here I thought all those PhD climate scientists had to take advanced statistics

Michael Mann, and most of his cohorts, are self-described ‘climatologists.’ His academic qualifications are:
A.B. applied mathematics and physics (1989),
MS physics (1991), MPhil physics (1991),
MPhil geology (1993),
PhD geology & geophysics (1998)

Climatology is noticeable by its absence.
His academic background is not too different from my own, or others commenting here.
What is different, is that I don’t represent myself as a climatologist. I make it a point to try to let the facts and logic speak for themselves.

sherro01
Reply to  Clyde Spencer
March 31, 2023 4:59 am

Do you mean the Mann of Mann, Bradley and Hughes, who made a graph nicknamed hockey stick, which pinned instrumental T data on the end of a long period of proxy data? Two very different resolutions on one graph?
That ended well, didn’t it? Sceptics descended on that verboten technique in force, drawing attention to how bad science was part of the global warming story. That was additional to “Hide the decline”.

bobclose
Reply to  Joseph Zorzin
April 1, 2023 5:59 am

Joseph, that’s why we believe the satellite data, which regularly make millions of observations of the atmosphere, so they are comparable and cover the whole planet, not just a few thousand convenient stations on land.

Joseph Zorzin
Reply to  bobclose
April 1, 2023 6:08 am

OK- too bad we didn’t have them centuries ago so we could have better “climate science” now.

Tom Abbott
Reply to  TheFinalNail
March 30, 2023 7:23 am

“Yes, but the big difference is that we have thermometer observations for the recent record. We know that the current global warming trend didn’t really start in the thermometer record until the 1950s,”

My thermometer record shows warming starting long before the 1950’s. We had warming from the 1910’s to the 1940’s and then cooling from the 1940’s to the 1970’s, and then warming from the 1980’s into the 2000’s, where the temperatures have peaked, as they did in the 1930’s, and a cooling trend has appeared.

Here’s the U.S. chart (Hansen 1999) showing warming and cooling before 1979, and then the UAH satellite chart showing the warming and cooling from 1979 to present.

As you can see, there is no unprecedented warming in North America comparing today with the past. You don’t have to go back hundreds or thousands of years to find a period that was just as warm as today. The Early Twentieth Century was just as warm as today as recorded in numerous written temperature charts from all over the world.

Hansen 1999:

[chart image]

UAH:

[chart image]

Combined, these two charts represent the real temperature profile of the Earth, where it was just as warm in the recent past as it is today, and CO2 has had no visible effect on temperatures: much more CO2 is in the air today than was in the air in the Early Twentieth Century, yet it is no warmer today than it was then.

This is the BIG LIE the climate change alarmists tell, and it is refuted by the written temperature record, which is available to just about anyone who cares to look. So one has to wonder why all the alarmist experts, and some of those on the skeptic side, continue to ignore the Early Twentieth Century and the bastardization that has taken place to erase it from memory.

The written temperature record and the Early Twentieth Century temperatures repudiate the Catastrophic Anthropogenic Global Warming (CAGW) claims of the Alarmists. That’s all you need as proof that CO2 has no discernible effect on the Earth’s atmosphere. Almost 100 years of increased CO2 going into the atmosphere, yet it is no warmer today than it was then. What’s left to say?

Clyde Spencer
Reply to  TheFinalNail
March 30, 2023 11:48 am

… the current global warming trend didn’t really start in the thermometer record until the 1950s …

Actually, more like after the 1970s.

Richard M
Reply to  Clyde Spencer
March 30, 2023 12:44 pm

It started right about the time humans started dumping plastic pollution into the oceans. Almost every alarmist argument is based on correlation. They would apply just as well to plastic pollution.

Dave Fair
Reply to  TheFinalNail
March 30, 2023 12:57 pm

TFN, you seem to forget (ignore?) the approximately 1910 to 1945 warming trend that is greater than the post-1950 trend. It is also essentially the same as the post-1975 warming trend that got the Leftists’ panties in a wad. The globe is now cooling, doancha know? As much as the CliSciFi practitioners like to play around with the surface temperature record, weather balloons and satellites tell the true story in the atmosphere, where the greenhouse effect occurs.

aaron
Reply to  TheFinalNail
April 1, 2023 6:35 am

We have a temperature record showing similar warming (actually more, because ice melt was greater), despite low coverage in the Arctic (so lacking representation of Arctic amplification), from before greenhouse gas warming was substantial. So we know the modern greenhouse warming isn’t a lot different from decadal variability.

TheFinalNail
Reply to  Nick Stokes
March 30, 2023 2:03 am

It’s interesting to note that over its first 80 years (1850-1930), HadCRUT has no warming trend. The current warming trend doesn’t really start until ~1950s in any of the global temperature data sets.

Editor
Reply to  TheFinalNail
March 30, 2023 4:07 am

And, the 1950s just happened to be when the modern solar maximum peaked.

Mark BLR
Reply to  TheFinalNail
March 30, 2023 10:33 am

It’s interesting to note that over its first 80 years (1850-1930), HadCRUT has no warming trend.

The current warming trend doesn’t really start until ~1950s in any of the global temperature data sets.

I’ve actually been looking at the HadCRUT5 (Analysis / Infilled) dataset for something else, trying to find “trend channels” more than 30 years long (so they count as “climate”).

If I squint a bit I can get a “current warming trend” starting in 1964, but not in the 1950s.

[attached image: TCRE_2.png]
Clyde Spencer
Reply to  Mark BLR
March 30, 2023 12:03 pm

Hansen’s data suggest a start about 1964, but other data sets suggest a start after the concern about another Ice Age waned.

Richard M
Reply to  Clyde Spencer
March 30, 2023 12:46 pm

The warming started in 1976-77 when the PDO moved into its warm phase.

Mark BLR
Reply to  Richard M
March 31, 2023 4:26 am

The warming started in 1976-77

My first selection of trend channels had a “clean break” between “1937 to 1977” and “1978 to 2022” (/ 2030).

This had the advantage that the separate “Cumulative CO2 emissions” sections were all “anchored” to the start-date of that channel’s trend line (1978 for the last one initially).

Extending the last channel from 1978 to 1964 had me changing the “anchor” to the end-date (2022) instead for “personal preference / aesthetic” reasons … to have overlaps everywhere without “breaks”, a case of “beauty is in the eye of the beholder” and all that …

You may well not be the only person who prefers the attached graph instead of the version given in my OP.

[attached image: TCRE_2bis.png]
Tom Abbott
Reply to  TheFinalNail
March 30, 2023 2:00 pm

Phil Jones says three periods are equal in warming magnitude.

[chart image]

Note the dates.

ZenoMorphic
Reply to  Nick Stokes
March 30, 2023 3:35 am

lol – the 1930s were hotter than now, by quite a lot, in the place in the world with the densest temperature records (the USHCN). There were more 90-degree days in the US in the 1930s than there are now. I get that you pretend the entire Southern Hemisphere can be averaged from 2 stations, but that the USHCN is not reliable because it tells a story you don’t like – but what makes you think the 1930s, during the Dust Bowl, were cooler than now?

Tom Johnson
Reply to  ZenoMorphic
March 30, 2023 6:04 am

There is a significant difference between the thermometer readings in the USHCN and the so-called “global temperature” data sets. One can certainly argue that if the Earth is experiencing “Catastrophic Global Warming”, it should show up in thermometer readings in the US. I’m drawing a clear distinction between what the thermometer readings are and what the “homogenized global temperature” data sets are. The USHCN thermometer records also show that there were warmer decades than the last decade, and colder decades.

Willis has shown here, in The US Blows Hot And Cold | Watts Up With That?, that the daily high thermometer readings in the US have increased by only about 1.26 degrees F over the last century (based on the USHCN actual thermometer readings). I doubt that any sane person could call that “catastrophic”.

There are roughly a million thermometer readings in the data sets, so if anyone doesn’t like his analysis, he has included links to the data for anyone who wants to analyze it differently. In the meantime, I’ll defer to his always insightful analyses.

ThinkingScientist
Reply to  Nick Stokes
March 30, 2023 4:33 am

Your claim that Renee is comparing “the temperature of the Holocene with the temperature of 1938” is patently nonsense. She is doing no such thing. She is comparing the mean temperature of a period that includes ALL the claimed AGW to date to the Holocene.

Where you choose to report or plot the date of the sample is irrelevant in this context; the averaging into a single value at the correct resolution is the same. She could plot the point as a trailing mean from 2022 or centred on 1938. Either way, it’s the same value at the same resolution. And moving the point back or forth by 85 years on a graph representing 12,000 years at a resolution of 200 years makes no difference at all to the comparison. Her point still stands.

Your claim is without merit and should be ignored.

Clyde Spencer
Reply to  Nick Stokes
March 30, 2023 11:45 am

As usual, you are being disingenuous. The average temperature represents the set of the entire 170-year collection of instrumental temperatures, not the temperature for 1938.

If she had used a shorter, more recent time period, I would expect you to complain that she didn’t use all the instrumental data available. You are looking for anything to complain about if it doesn’t agree with your catechism.

Capt Jeff
Reply to  Nick Stokes
March 30, 2023 8:29 pm

The most rapid 30 year increase in the US Heat index occurred during the period 1925 to 1954, almost twice the rate of the last 30 years. 48% of US states current high temperature records were set in the 1930s. If not AGW, then what?

Chris Hanley
March 30, 2023 12:26 am

‘Temperatures during the most recent decade (2011–2020) exceed those of the most recent multi-century warm period, around 6500 years ago’

The IPCC statement above is correct but can be misleading

Comparing the supposed proxy-derived global average temperature over a 4,000-year period (9,500 – 5,500 yrs BP) to the thermometer average over the past decade being described as merely “misleading” is something of an understatement, IMO.

Thomas
Reply to  Chris Hanley
March 30, 2023 7:11 am

I did a similar analysis some years ago, but using a 50-year average. Even with that, current temperatures are not shockingly higher than past temperatures. Attaching the thermometer record to the proxy record is lying with statistics, unless the thermometer values are averaged over a time period similar to the proxies.

ThinkingScientist
March 30, 2023 12:35 am

Very good Renee. Correctly comparing estimates at the same resolution as far as possible.

The claims of other scientists, based on comparing modern high-resolution temperature data to proxies smoothed to 200-300 year resolution, are very close to scientific fraud; at the least they are recklessly misleading.

This one simple analysis demonstrates how unremarkable modern temperature changes are.

Nick Stokes
Reply to  ThinkingScientist
March 30, 2023 2:15 am

“This one simple analysis demonstrates how unremarkable modern temperature changes are.”

No, it doesn’t demonstrate that. It can’t. It demonstrates that time resolution is such that it is possible that a temperature spike occurred way back that we can’t now detect. But it doesn’t show that one occurred. We do know that one is happening now.

Nelson
Reply to  Nick Stokes
March 30, 2023 2:33 am

And exactly what is happening now? The Antarctic has been cooling or unchanged for 70+ years. The Japanese Met office shows cooling around Tokyo and offshore islands for 40 years. The SE US has experienced cooling for more than 50 years. Yes. There are places that have warmed.

The world temperature graphs shown in the article are adjusted or just plain made up. Anyone who tells you the world was warmer in the 70s than say 1921 isn’t telling the truth. We need a complete audit grid cell by grid cell of the data being presented as the world temperature data.

TheFinalNail
Reply to  Nelson
March 30, 2023 4:59 am

We need a complete audit grid cell by grid cell of the data being presented as the world temperature data.

Didn’t Berkeley Earth do that a few years back and come up with more or less the same results as everybody else? (Including the Japanese Met Office, since you mentioned them).

Tim Gorman
Reply to  TheFinalNail
March 30, 2023 5:03 am

You mean that Berkeley Earth that shows the uncertainty in temperature records from the 1800’s in the tenths digit?

TheFinalNail
Reply to  Tim Gorman
March 30, 2023 6:16 am

Not sure why that would be an issue? The precision is the result of the processing required (averaging, standard deviations, etc).

Jim Gorman
Reply to  TheFinalNail
March 30, 2023 7:49 am

Ha, ha, ha. You don’t deal in physical measurements, do you?

Averaging, standard deviation, etc., DO NOT determine the information available in a measurement! Only the resolution of the original measurements determines the ultimate resolution of calculations. Anyone who has taken higher-level lab classes would know this.

From Johns Hopkins University:

https://www2.chem21labs.com/labfiles/jhu_significant_figures.pdf

“9. When determining the mean and standard deviation based on repeated measurements:

o The mean cannot be more accurate than the original measurements. For example, when averaging measurements with 3 digits after the decimal point, the mean should have a maximum of 3 digits after the decimal point.

o The standard deviation provides a measurement of experimental uncertainty and should almost always be rounded to one significant figure.”

If you need additional references from other university lab courses, let me know. If you find a reference that disputes, please post here so all can see it.
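
A concrete instance of the quoted rule, with hypothetical readings:

```python
# Hypothetical thermometer readings recorded to 0.1 C resolution.
readings = [21.3, 21.4, 21.2, 21.5, 21.3, 21.5]

mean = sum(readings) / len(readings)  # 21.3666... as a raw float
print(f"raw mean {mean:.4f} -> reported {mean:.1f} C")  # keep the data's 1 decimal
```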

Tim Gorman
Reply to  TheFinalNail
March 30, 2023 8:26 am

You cannot add precision by averaging. That violates significant-digit rules in metrology. A repeating decimal as an average is *NOT* infinitely precise.

If you have a standard deviation then how do you know what the true value is? For any specific measurand the true value is considered to be somewhere in the standard deviation interval. How does that increase precision?

If you have measured the same thing multiple times with the same device under repeatability conditions and all measurement error is totally random and Gaussian (or at least symmetric) then you can assume the average value as the true value – BUT you cannot extend the precision of that true value beyond the precision of the measuring device.

If you want to know something out to the hundredths digit then your measuring device better have a resolution in the hundredths digit at least and in the thousandths digit would be better. The temperature record simply doesn’t meet that requirement – not even the newest measuring devices (please note, the resolution of the sensor does not determine the resolution of the total device).

The temperature record simply isn’t fit for the purpose to which it is being used. You simply cannot know temperature differences to the hundredths digit. It is far more likely that *all* temperatures should have the units digit as the last significant digit. And the use of anomalies fixes nothing; their last significant digit would still be in the units digit.

ThinkingScientist
Reply to  Nick Stokes
March 30, 2023 2:42 am

And?

Null hypothesis rules.

Extraordinary claims of “climate crisis” can never be demonstrated by comparing modern high resolution temperature to paleo data. Even at 200 year resolution you would need 400 years of modern temperature data to get two points on a graph to compare warming rates.

Let me know when you get to 400 years of modern temperature observations and I’ll get worried. Meanwhile any claims to the contrary are recklessly misleading and not science.

wilpost
Reply to  ThinkingScientist
March 30, 2023 5:06 am

But, the “science” is settled!

Tim Gorman
Reply to  Nick Stokes
March 30, 2023 4:15 am

“It demonstrates that time resolution is such that it is possible that a temperature spike occurred way back that we can’t now detect”

But we are not being beaten about the face over a “temperature spike” but over “the earth is going to turn into a cinder”. Temperature growing forever till we all die – and the tipping point is nigh!

Thomas
Reply to  Nick Stokes
March 30, 2023 7:26 am

Nick,

It demonstrates that time resolution is such that it is possible that a temperature spike occurred way back that we can’t now detect. But it doesn’t show that one occurred. We do know that one is happening now.

Yes, but we don’t know that the current temperature spike is unusual. This analysis shows that it probably is not. More precisely it shows that the current spike cannot be compared to the past proxies to support the conclusion that the current spike is unusual. In other words, it shows that all those papers that compared the recent thermometer record to past proxies to conclude that the current spike is unusual cannot be trusted because they are based on an incorrect statistical analysis.

ThinkingScientist
Reply to  Nick Stokes
March 30, 2023 8:11 am

Nick Stokes reverses the null hypothesis. Probably not for the first time either.

Clyde Spencer
Reply to  Nick Stokes
March 30, 2023 12:11 pm

It demonstrates that time resolution is such that it is possible that a temperature spike occurred way back that we can’t now detect.

OK, I’ll give you this one. But, the point that is further implied is that there is no justification for claiming that the current warming is “unprecedented for the last 100,000 years,” or some similar statement. We simply don’t know. However, I’d suggest that there is high probability, knowing what the current variability looks like, that there were similar spikes in the pre-instrumental days.

Tom Abbott
Reply to  Nick Stokes
March 30, 2023 2:05 pm

“We do know that one is happening now.”

Nope. No temperature spike in the United States. It’s cooler now than in the past. No unprecedented warming here. CO2 warming is missing in action.

Tony_G
Reply to  Tom Abbott
March 30, 2023 5:26 pm

If we don’t have sufficient resolution in the historical data to identify if there is or is not a “spike”, then we don’t have sufficient grounds to assume any current “spike” is in any way unusual.

Ben Vorlich
Reply to  ThinkingScientist
March 30, 2023 2:18 am

It’s on a par with comparing modern GPS navigation with pre-Harrison navigation, or navigating by place names. Cape Wrath is from an Old Norse word meaning turning point, where you turned to get to the Western Isles, or Suðreyar, the southern islands, known to the Scots as Na h-Innse Gall, “islands of the strangers”; going a bit further south they got to Earra-Ghàidheal, the border region of the Gaels, or Argyll. We still have a Bishop of Sodor and Man, Sodor being Suðreyar.
Sorry, I got sidetracked, but modern navigation gets you to within a few feet of a front door; old methods get you to an island or county.
We’re now comparing high precision to rough guidance.

ThinkingScientist
March 30, 2023 12:46 am

Renee,

There is a method in geostatistics that uses the change (reduction) in variance with decreasing resolution to correct measures and their variance to different resolution. The method is usually referred to as change of support and there is a large body of literature on it.

The change of variance due to averaging was in fact an original observation in Danie Krige’s thesis.

Might be worth finding out about the relation between the smoothing and the change of variance, as described under change of support in the geostatistics literature.
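
A minimal numerical illustration of the change-of-support effect (uncorrelated values assumed, for which the variance of block means falls as 1/support; autocorrelated series decline more slowly, which is what the variogram quantifies):

```python
import numpy as np

rng = np.random.default_rng(4)
point = rng.normal(0.0, 0.5, 12_000)  # hypothetical annual values, point SD 0.5

for support in (1, 50, 200):          # averaging window ("support") in years
    blocks = point.reshape(-1, support).mean(axis=1)
    print(f"support {support:>3} yr: variance of block means = {blocks.var():.4f}")
```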

Geoff Sherrington
Reply to  ThinkingScientist
March 30, 2023 3:11 am

TS,
You might find that there are too many large exogenous variables in the instrumental T record, like site moves, changes to shelters and thermometer types, and UHI. They will dominate the variance. The chemical analyses used by Krige et al. have less variance from other causes, so the variance approach has a better chance of success.
For weeks now I have been struggling with 50 Australian stations from “pristine” locations that should have less UHI than usual. The month-to-month variance is so large that it is hard to derive useful systematics. Geoff S

ThinkingScientist
Reply to  Geoff Sherrington
March 30, 2023 3:21 am

Perhaps. But the general principles of change of support are true, notwithstanding its origin in gold assay and averaging. I have studied and applied it comparing seismic attributes to well data.

Also interesting would be to compare different proxy measures of temperature and their variance and compare to the supposed resolution.

Don’t forget that, notwithstanding the variance change caused by change of support (ie scale of measurement), the fact that a measure is a proxy (and therefore not correlated with R=1 to the actual measurement) also reduces the variance. So the proxy-to-modern temperature variance comparison suffers from (at least) a double whammy of variance reduction (smoothing) – once from resolution and once from being a proxy (ie R<1).

Renee
Reply to  ThinkingScientist
March 30, 2023 5:32 am

There is a method in geostatistics that uses the change (reduction) in variance with decreasing resolution to correct measures and their variance to different resolution. The method is usually referred to as change of support and there is a large body of literature on it.

Thanks for the information. It’s surprising that climate scientists have not tried such an option or at least used a smoothing algorithm on instrumental data for comparison to the past.

ThinkingScientist
Reply to  Renee
March 30, 2023 5:39 am

Hi Renee,

There is a simple introduction in An Introduction to Applied Geostatistics by Isaaks and Srivastava. I have some nice training notes on it too, including how change of scale can be related to the variogram. If the moderators can pass my email to you and you email me, I can send some material. Andy May also has my email somewhere, I suspect, so that might be another route.

Regards,

TS

sherro01
Reply to  ThinkingScientist
March 30, 2023 4:31 pm

As one whose colleagues did a deep dive into geostatistics starting about 1974, then applied it with success to ore resource calculations on some major new mines, I thought it was a major advance of scientific method. IIRC, I was advocating a geostatistics approach to temperature analysis to people like the Phil Jones group way before Climategate, but there were no takers. A decade ago on WUWT I was asking geoscientists then using geostatistics to come forward and help, but none did. Geoff S

ThinkingScientist
Reply to  sherro01
March 31, 2023 1:30 am

I have been involved in geostatistics for 30 years, in petroleum geoscience. I have run many projects, designed and developed software and present numerous training courses. I taught geostatistics to MSc students at Imperial for 14 years (sadly wokeism has destroyed that course now).

I have been on WUWT for around 15 years, possibly longer.

Jim Gorman
Reply to  ThinkingScientist
March 31, 2023 4:18 am

I am an absolute novice at geostatistics. What I have read appears to be useful for relatively static and localized objects. I guess I’m not sure how a globally dynamic atmosphere with multivariable influences such as sun, rain, wind, humidity, lapse rate, etc. that vary with time frames of minutes or hours would be able to be analyzed using geostatistics methods. Do I have a misconception here?

Steve Case
March 30, 2023 2:28 am

Just for fun here’s a comparison of figure 2 above with the IPCC FAR figure 7.1

[attached image: Hannon 2023 vs IPCC FAR.png]
wilpost
Reply to  Steve Case
March 30, 2023 5:33 am

That graph is hugely different from IPCC graphs.
It would be great to compare the data sets.

Remember, a few years back, the IPCC claimed the Little Ice Age was just a European event. But folks from around the world said their areas had also recorded a cold period.
That claim was buried.

Geoff Sherrington
Reply to  Steve Case
March 30, 2023 4:56 pm

Steve Case,
One needs to take care with the depth of research behind that Holocene temperature sketch, attributed to Lamb, used in FAR.
Steve McIntyre wrote several articles about it on Climate Audit. This link will get you to others.
https://climateaudit.org/2012/10/09/the-afterlife-of-ipcc-1990-figure-7-1/
…….
The entire Climate Audit article is relevant to Renee’s article here and is recommended reading, a reminder that not all is new under the sun.
Geoff S

LT3
March 30, 2023 5:45 am

Very interesting. I think the exact same analysis could be done with CO2, considering the Antarctica data is smoothed with a multi-centennial-length averaging function, guaranteeing that if one went a few hundred years into the future and collected and processed the Antarctica data with the same process, 400 PPM would not be visible.

Renee
Reply to  LT3
March 30, 2023 6:22 am

Joos, 2008, models how atmospheric greenhouse gas variations are attenuated during the enclosure of air into firn and ice, using a firn diffusion model. The current CO2 atmospheric spike would be portrayed as a broad 30-40 ppm excursion in the ice record. However, it is not smoothed a second time to match proxy sample spacing.

https://www.pnas.org/action/downloadSupplement?doi=10.1073%2Fpnas.0707386105&file=07386Fig8.pdf
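
The attenuation Renee describes can be sketched by convolving a modern-style CO2 spike with a wide boxcar standing in for firn diffusion. The 400-year kernel width and the spike’s instant return to 280 ppm are assumptions for illustration only; Joos, 2008, uses a physical firn model:

```python
import numpy as np

years = np.arange(2000)
co2 = np.full(years.size, 280.0)
co2[900:1070] += np.linspace(0.0, 130.0, 170)  # 170-yr ramp to ~410 ppm, then an
                                               # (unrealistic) snap back to 280

width = 400                                    # assumed effective smoothing, yrs
smoothed = 280 + np.convolve(co2 - 280, np.ones(width) / width, mode="same")

print(f"true peak {co2.max():.0f} ppm -> smoothed peak {smoothed.max():.0f} ppm")
# -> only a roughly 30 ppm excursion survives, consistent with the figure above
```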

LT3
Reply to  Renee
March 30, 2023 6:31 am

Wow, thanks.

LT3
Reply to  Renee
March 30, 2023 6:41 am

I wonder if it would be possible to derive an anomaly time series from the proxy data, apply spectral balancing to adjust the anomalies to account for the smoothing, and then convert back, to derive a proxy that is a more realistic comparison to the instrumental data?

Renee
Reply to  LT3
March 30, 2023 7:10 am

To deconvolve past proxy data, especially global averages, would be very difficult. A simple start would be evaluating individual proxy data with higher resolution instead of a global average. Andy has a good post on using proxy data at different latitudes. https://andymaypetrophysicist.com/2021/06/23/how-to-compare-today-to-the-past/

LT3
Reply to  Renee
March 30, 2023 7:34 am

Thanks. I am curious because in geophysical analysis of seismic data we have a similar problem when trying to match seismic data to borehole data, because the seismic data lack the high and low frequencies present in the borehole well-log data. We use deconvolution and match filters to bring the seismic wavelets closer to the bandwidth of the well-log data, so that volumetric seismic reflectivity estimation has a chance of working. This seems like a situation where that workflow could be applicable.
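For anyone curious, here is a minimal Python sketch of the generic idea, regularized spectral division with a water level, under the strong assumption that the smoothing kernel is known; for real proxy data it is not, which is part of why Renee calls deconvolution difficult:

```python
import numpy as np

# Sketch of water-level (regularized) spectral deconvolution on synthetic data.
rng = np.random.default_rng(0)
n = 1024
signal = np.cumsum(rng.normal(0.0, 0.1, n))     # synthetic high-resolution series

width = 40                                      # assumed smoothing width, samples
kernel = np.zeros(n)
kernel[:width] = 1.0 / width                    # boxcar smoother, zero-padded
K = np.fft.rfft(kernel)
smoothed = np.fft.irfft(np.fft.rfft(signal) * K, n)   # circular convolution

# Divide spectra, flooring near-zero kernel amplitudes to stabilize the inverse.
water = 0.05 * np.abs(K).max()
K_stab = np.where(np.abs(K) < water, water * np.exp(1j * np.angle(K)), K)
recovered = np.fft.irfft(np.fft.rfft(smoothed) / K_stab, n)

print(f"RMS error before deconvolution: {np.std(smoothed - signal):.3f}")
print(f"RMS error after deconvolution:  {np.std(recovered - signal):.3f}")  # smaller
```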

mydrrin
March 30, 2023 6:43 am

There is no chance that the world today is warmer than 6,500 years ago, or warmer than anywhere near the oceans 12,000 years ago. The Northern Hemisphere changes were abrupt and big. 12,000 years ago warm oceans invaded the poles, warming the planet. Marine proxies show a peak about 12,000 years ago, declining from there. Most glaciers near the oceans disappeared from about 12,000 years ago, and it took about 5,000 years to melt the giant ice cubes that were the ice sheets covering the continents. Antartica's minimum, like the Ross Ice Shelf, was about 6,000 years ago. Permafrost was less, glaciers were less, alpine and arctic tree lines were higher; it all shows the same thing: warmer. There is zero chance.

Reply to  mydrrin
March 30, 2023 4:43 pm

Spelling: Antarctica
A common error in speech and text. Geoff S

BurlHenry
March 30, 2023 7:15 am

Renee:

You say that instrumental temperature data has been around since 1850, about 170 years.

The Central England Temperature (HadCET) instrumental dataset began in 1659 and has continued to the present, about 360 years.

Would its inclusion change any of your conclusions?

Renee
Reply to  BurlHenry
March 30, 2023 9:07 am

The HadCET data are more representative of the Northern Hemisphere. For a 200-year duration, it appears there would be 30 more years of cooler data included within the mean, and the recent warming would still be smoothed to a lower average than its current decadal mean. Therefore, I do not believe it would change any of my observations.
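A minimal sketch with invented numbers (not actual HadCET values) shows the dilution effect:

```python
import numpy as np

# Illustrative only: flat pre-industrial anomalies with noise, then an assumed
# ~1.1 C warming ramp over the last ~50 years, averaged two different ways.
rng = np.random.default_rng(1)
years = np.arange(1824, 2024)                    # last 200 years
anom = rng.normal(0.0, 0.15, years.size)         # assumed interannual noise
ramp = years >= 1974
anom[ramp] += 1.1 * (years[ramp] - 1974) / 49    # assumed recent warming ramp

print(f"Mean of last 200 years: {anom.mean():+.2f} C")        # a few tenths of a degree
print(f"Mean of last decade:    {anom[-10:].mean():+.2f} C")  # close to 1 C
```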

Beta Blocker
March 30, 2023 9:14 am

In a reply to ThinkingScientist above, Tim Gorman says, “They’ve never heard of the rules for significant digits.”

Tim makes a very astute observation here. Let’s expand upon his remarks and take a hands-on approach to working with significant digits, a.k.a. significant figures.

My right hand has five fingers. The finger count for my right hand, which is 5, has only one significant digit.

My left hand also has five fingers. The finger count for my left hand, which is also 5, also has only one significant digit.

Taken together, my two hands have a total of ten fingers. But the total finger count, which is 10, has only one significant digit.

Among all these various counts, only figures containing one significant digit appear.

And so a question naturally arises: which of the digits among my ten fingers is the most significant digit?

I am right handed, so the digit which is the most likely candidate for being ‘most significant’ is probably associated with that hand.

Is it No. 1, my thumb? Is it No. 2, my index finger? Is it No. 3, my middle finger? Is it No. 4, my ring finger? Is it No. 5, my pinky?

Here is a list of arguments for each candidate competing for the title of Most Significant Digit.

— Digit No. 1, the thumb, because we as humans could not use tools without it.
— Digit No. 2, the index finger, because it is exceptionally useful for pushing buttons and for pointing fingers at guilty parties.
— Digit No. 3, the middle finger, because it is very effective for expressing anger and/or disdain at those whom we disagree with.
— Digit No. 4, the ring finger, because it indicates our marital status to those who might be interested, for whatever reason.
— Digit No. 5, the pinky, because it complements the use of the other four fingers in a variety of different circumstances. 

In answering this vitally important question, a complication arises; i.e., which digit is most significant at any given point in time is contextually and situationally dependent.

For myself, I can only say that Digit No. 2, the index finger is most often my most significant digit.

Why is that?

Because I spend a lot of time pointing fingers at those in the nuclear industry who refuse to take responsibility for bringing their projects in on budget and on schedule.

Clyde Spencer
Reply to  Beta Blocker
March 30, 2023 12:26 pm

More properly, the number of fingers that a person has is an exact integer, with as many significant figures as necessary.
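A minimal Python sketch of the mechanics for measured quantities (exact counts need no such treatment):

```python
import math

def round_sig(x: float, n: int) -> float:
    """Round a measured value x to n significant figures."""
    if x == 0:
        return 0.0
    return round(x, n - 1 - math.floor(math.log10(abs(x))))

print(round_sig(0.314159, 2))  # 0.31
print(round_sig(1234.5, 2))    # 1200.0
print(round_sig(0.04567, 3))   # 0.0457
```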

Dave Fair
Reply to  Beta Blocker
March 30, 2023 1:51 pm

In general it is only the trigger finger that is significant.

SteveZ56
March 30, 2023 11:28 am

Historical records show that the Medieval Warm Period (ca. 1000–1300) was warmer than now (raising sheep in Greenland, etc.), while the Little Ice Age (ca. 1600–1750) was colder than now. There aren't many instrumental records going back that far, but could the current slow warming be part of a natural cycle, the same cycle that caused the cooling from the Medieval Warm Period to the Little Ice Age, and that has nothing to do with human CO2 emissions?

Besides, every average temperature includes a few extremely high and a few extremely low local temperatures, and the average for even this month includes blizzards in southern California, which are considered rare by people who have lived there for decades.

ferdberple
March 30, 2023 4:22 pm

I don't have the reference, but I recall reading that some proxies (ocean cores?) showed large temperature shifts over periods as short as 20 years at the boundaries between glacials and interglacials. These shifts were both positive and negative. Before then, it was assumed such shifts would be gradual.

As such, there is nothing unusual about current temperature changes, which could instead be a warning sign that our interglacial is approaching its end.

Perhaps someone has a reference, as I have seen it cited at least twice.

ferdberple
March 30, 2023 5:13 pm

What seems insane to me is that we do not have a value for Natural Climate Variability.

We have the historical data and a ton of people getting paid for climate work. It seems to me that statistical methods should be able to solve for historical variance and Natural Climate Variability.

Looking at paleoclimate at different time scales, it sure looks to me like some sort of self-similar fractal distribution. It is only when we look at the more recent reconstructions (5-10k years) that this distribution appears to break down.

This suggests to me that climate science may have improperly accounted for natural variability. However, I am not on the list of people funded to find the problems. If I were, for a mere $500 million I would almost guarantee a result.
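
One concrete way to start, as a minimal sketch on synthetic data (my own illustration, not a published method): check how the spread of temperature changes scales with time lag. A self-similar series follows a power law; a break in that law in the proxy reconstructions would be the breakdown described above.

```python
import numpy as np

# Random walk as a crude stand-in for an unforced, self-similar climate series.
rng = np.random.default_rng(2)
series = np.cumsum(rng.normal(0.0, 1.0, 2 ** 16))

for lag in (4, 16, 64, 256, 1024):
    diffs = series[lag:] - series[:-lag]
    print(f"lag={lag:5d}  std of changes = {np.std(diffs):7.1f}")
# Here the std grows roughly as lag**0.5 (Hurst exponent ~0.5); fitting the
# same statistic to reconstructions would put a number on natural variability.
```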

“It is hard to get someone to find something when their job depends on not finding it”

Tim Gorman
Reply to  ferdberple
March 31, 2023 2:42 pm

Historical variance? Climate Science won't even properly handle the variance of daily temperatures, monthly averages, or annual averages, whether they be absolute temps or anomalies!

Capt Jeff
March 30, 2023 8:23 pm

Lamb assessed more recent temperature fluctuations by looking at agricultural records, particularly viticulture. British Isles viticulture started in Roman times, disappeared in the subsequent cooling, then returned in the MWP before disappearing again in the LIA.
Viticulture is back but has not extended as far as the two previous northern limits.
Expansion and then retraction of glaciers, with carbon dating of ancient forests buried in moraines, are frequently cited by glaciologists and are good indicators of more recent climate fluctuations.

Jack
April 1, 2023 9:37 am

The IPCC states that around 6,500 years ago temperatures ranged from 0.2°C to 1°C warmer relative to the 1850–1900 pre-industrial period.
In my opinion the IPCC is far from the truth. Recent studies (published 2020 and later) point towards much warmer periods during the early Holocene. The IPCC would be well advised to update its data:

  • Allan et al., 2021 Greenland 5-7°C warmer (4-5°C vs. 10-12°C) than today from 7,500 to 5,500 years ago.
  • Myers et al., 2021 Antarctica “5 °C warmer than modern conditions” during the Early Holocene
  • Campbell-Heaton et al., 2021 Arctic Canada 6-8°C warmer than today during Early Holocene
  • Cheli et al., 2021 Adriatic Sea (Italy) 4°C warmer than today 9,000 to 5,000 years ago
  • Sjögren, 2021 Sub-Arctic Norway, Sweden, Finland, Russia 2.5°C to 7.0°C warmer than today during Early Holocene
  • Lv et al., 2021 Central China ~3-5°C warmer than modern from ~11,000 to 6,000 years ago
  • Morán et al., 2021 Southernmost South America “warmer than today” (2-3°C?) during the Late Holocene
  • Astakhov and Semionova, 2021 Russian Arctic had dense forests (tundra today) and sea floor temperatures were “4–8°C higher” when CO2 was 280 ppm (last interglacial)
  • Conti et al., 2020  Scotland lake summer temps 6.8°C warmer (~20°C vs. 13.2°C) 5600-5900 yrs ago
  • Liu et al., 2020  Central China 3-4°C warmer (Mid-Holocene)
ChrisG
April 1, 2023 9:53 am

I have taken a look at the 50 highest-resolution proxies behind PAGES12k. The standard deviation across these proxies for a typical decade is around 0.7°C. Obviously this does not take into account the smoothing effect of using linear interpolation between temperature data points, the accuracy of each method (reported as +/-1°C), or the fact that all the other proxies behind major reconstructions have lower resolution.

This means that the actual standard deviation behind any claim based on the assumption of a stable past is >1°C for any decade. Consequently, we simply do not have the accuracy or resolution to claim either that anything unusual is happening or that the past was actually stable.

The IPCC's currently claimed human-caused warming is 1.1°C, so we are within one standard deviation of the long-term trend.
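
For what it's worth, here is a minimal Python sketch of the calculation described above, using synthetic stand-ins for the proxy records (the 0.7°C scatter is the only number carried over; real records add irregular sampling and dating error on top):

```python
import numpy as np

# Synthetic stand-in for ~50 high-resolution proxy records in decadal bins.
# The 0.7 C across-proxy scatter is assumed from the estimate quoted above.
rng = np.random.default_rng(3)
n_proxies, n_decades = 50, 1200                  # ~12,000 years of decades
common = rng.normal(0.0, 0.3, n_decades)         # assumed shared climate signal
records = common + rng.normal(0.0, 0.7, (n_proxies, n_decades))

sd_per_decade = records.std(axis=0)              # spread across proxies, per decade
print(f"Median across-proxy SD per decade: {np.median(sd_per_decade):.2f} C")  # ~0.70 C
```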
