Global Mean Temperature Flattens the Past


Guest post by Renee Hannon

Introduction

There have been recent discussions about ‘flattening the curve,’ and some curves are easier to flatten than others. The Pages 2K Consortium calculates global mean temperature in a manner that flattens the long-term trend and makes present-day temperatures appear warmer relative to past temperatures. Across the globe, temperature reconstructions show cooling millennial temperature trends, with one exception: the Pages 2K global mean.


Millennial Temperature Trends Show Global Cooling

Global mean surface temperature anomalies were recently calculated by the Pages 2K Consortium, led by Neukom (2019). Their statistical means are a conglomeration of seven different averaging methods applied to the proxy database, yielding a 7000-member ensemble over the past 2000 years. The median across all global mean methods is plotted as a dashed line in Figure 1 and compared to Pages 2K’s published regional reconstructions. All means demonstrate trends similar to the median and will simply be referred to as the global mean(s).

Regional temperature reconstructions were chosen that utilize proxy datasets similar to those used in the global mean calculation. The Arctic reconstruction by McKay incorporates a balance of proxy records consisting of ice cores, tree rings, and lake and marine sediments north of 60 deg N. The Northern Hemisphere (NH) European reconstruction by Luterbacher is based on tree-ring proxies, and Stenni’s Antarctic reconstruction uses predominantly ice core isotopes.

The Pages 2K global mean appears reasonable compared to regional reconstructions from the Present back through the Little Ice Age (LIA), until about 1250 AD. It is difficult, however, to see how the mean compares to the regional reconstructions during the Present when using a 1961-1990 baseline, since all reconstructions converge there, creating the “hockey stick” effect. Pre-1250 AD, the global mean appears to parallel NH Europe temperatures, largely ignoring the Antarctic.

Figure 1: Top graph shows surface temperature reconstructions with a 50-year loess filter, plotted with the Pages 2K global mean of the 7000-member ensemble across all methods. Bottom graph shows linear trends over the past 2000 years.

Linear regression analysis of the temperature reconstructions in Figure 1, bottom graph, shows cooling trends over the past 2000 years. Surprisingly, both the Arctic and the Antarctic show a similar long-term cooling trend of -0.4 deg C per 1000 years. In fact, all regional reconstructions show a negative slope, or cooling trend, in temperature anomalies over the past 2000 years, as shown in Table 1.

Interestingly, all the global means are nearly flat or show only a subtle cooling trend, and the global mean cooling trend is more aligned with the NH Europe temperature reconstruction. Note that the 97.5% global mean cooling of -0.2 deg C per 1000 years is still flatter than the Arctic and Antarctic mean cooling trends of -0.4 deg C per 1000 years. Also, the global mean low 2.5% range of -0.04 deg C per 1000 years is much flatter than the low range of any regional reconstruction.
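The millennial trends in Table 1 are simply least-squares slopes of the anomaly series, rescaled to deg C per 1000 years. A minimal sketch of the calculation, using a synthetic series (a prescribed -0.4 deg C per 1000 years trend plus noise) rather than any actual reconstruction:

```python
import numpy as np

# Synthetic anomaly series standing in for a reconstruction: a prescribed
# -0.4 deg C per 1000 years cooling trend plus annual-scale noise.
rng = np.random.default_rng(0)
years = np.arange(0, 2000)                       # 0 to 2000 AD
anoms = -0.4e-3 * years + rng.normal(0, 0.2, years.size)

slope_per_year = np.polyfit(years, anoms, 1)[0]  # least-squares slope, deg C/yr
trend_per_kyr = slope_per_year * 1000            # rescale to deg C per 1000 yr
print(f"Millennial trend: {trend_per_kyr:+.2f} deg C per 1000 years")
```

With 2000 annual points, the fitted slope recovers the prescribed -0.4 deg C per 1000 years to within a few hundredths of a degree, which is why the reported trends are quoted to two decimal places.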

Table 1: Millennial trends of regional temperature reconstructions over the past 2000 years compared to Pages 2K global means. Means and ranges are from McKay for the Arctic, Stenni for the Antarctic, Luterbacher for Europe, and Neukom for the global means.

Of the regional temperature reconstructions, the NH Europe mean millennial trend shows the least cooling during the past 2000 years, only -0.20 deg C per 1000 years. This temperature reconstruction consists entirely of tree-ring proxy data, and there is a notable shift in the quantity and quality of tree-ring datasets around 1000 AD: the number of tree-ring records drops significantly, from 400 records post-1600 AD to fewer than 30 records pre-1000 AD (Luterbacher, 2016).

McKay reports an Arctic cooling trend of -0.47 deg C per 1000 years for the period 0 to 1900 AD. The cooling trend reported here is for the period 0 to 2000 AD and includes the Present; including the Present slightly reduces the Arctic millennial cooling trend, from -0.47 to -0.40 deg C per 1000 years. As expected, the Arctic reconstruction, with its large centennial temperature swings, shows the widest spread of millennial trends, ranging from -0.10 to -0.70 deg C per 1000 years.

Stenni (2017) shows cooling trends ranging from -0.30 deg C per 1000 years for the East Antarctic Plateau to -0.52 deg C per 1000 years for the Antarctic Peninsula during 0-1900 AD. She breaks out the last 100 years separately, which shows the higher-frequency, shorter-term centennial warming of the Present; including the Present slightly increases the range of the Antarctic millennial cooling trend. East Antarctica is the last place on Earth to register the Present centennial warming, and this delayed warming is not captured by climate models, which tend to overestimate Antarctic warming (Stenni, 2017).


Global Mean Falls Outside the Arctic-Antarctic Envelope

As discussed in my previous post, I prefer using the LIA, 1600-1700 AD, as a baseline rather than 1961-1990 for extended temperature reconstructions. The LIA baseline maintains the convergence of temperatures during the cold LIA and the divergence between the Arctic and Antarctic during warmer periods. It allows the Medieval Warm Period (MWP), Roman Warm Period (RWP), and LIA climate events to be prominently visible in the Arctic data shown in Figure 2. Additionally, the polar regions are placed in a proper climate context, with the Antarctic showing colder temperature anomalies than the Arctic.

General observations show the MWP and RWP to have peak Arctic temperatures similar to the 1940 Present peak; all three peaks are approximately 1.3 deg C warmer than the LIA baseline. In contrast, Antarctic temperatures are 0.25 to 0.50 deg C warmer during the MWP and RWP than during the Present.

The global mean is essentially flattened backwards in time by not incorporating the underlying millennial cooling trends. When the reconstructions are referenced to the LIA baseline, the global mean falls outside the Arctic-Antarctic envelope pre-1250 AD; from 0 to 1250 AD, the mean shows colder global temperatures than even the Antarctic. A simple difference between each reconstruction's 0-1000 AD average and its LIA average is revealing: the global mean is only 0.25 deg C warmer prior to the LIA, in contrast to the Antarctic and Arctic, which are 0.5 and 0.8 deg C warmer, respectively. The global mean appears reasonable during the Present.
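The re-baselining and period-averaging used here can be sketched as follows. The series is a synthetic stand-in (a pure -0.4 deg C per 1000 years cooling), not an actual reconstruction; only the 1600-1700 AD baseline window and the 0-1000 AD comparison period are taken from the text:

```python
import numpy as np

def rebaseline(years, temps, base=(1600, 1700)):
    """Shift a series so its mean over the LIA baseline window is zero."""
    in_base = (years >= base[0]) & (years <= base[1])
    return temps - temps[in_base].mean()

# Synthetic stand-in for a reconstruction: a pure -0.4 deg C/1000 yr cooling.
years = np.arange(0, 2000)
temps = -0.4e-3 * years

anoms = rebaseline(years, temps)
early = anoms[(years >= 0) & (years <= 1000)].mean()
print(f"0-1000 AD mean relative to LIA: {early:+.2f} deg C")  # → +0.46
```

A reconstruction that preserves its millennial cooling trend necessarily shows the early Common Era warmer than the LIA datum; the flatter the trend, the smaller this difference, which is the comparison made in the paragraph above.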

Figure 2: Antarctic and Arctic temperature reconstructions plotted with the Pages 2K global mean relative to a 1600-1700 AD baseline; reconstructions are filtered with a 50-year loess. The bottom graph is a zoomed-in view showing linear trends from the MWP to the LIA, with reconstructions filtered with a 30-year loess.

As expected, the underlying millennial trend varies with time. For example, the MWP cooling descent into the LIA is faster than the average cooling over the past 2000 years (bottom graph in Figure 2): the Arctic cools at -1.1 deg C per 1000 years and the Antarctic at -0.6 deg C per 1000 years. Remarkably, the Pages 2K global mean shows a LIA cooling rate of only -0.2 deg C per 1000 years, slower even than the Antarctic. The global mean flattening effect reduces the temperature anomaly of the warm periods prior to the LIA and does not properly preserve the LIA cooling trend.


Global Mean is Biased by Tree Ring Proxies

The Pages 2K global means are calculated from a database of nearly 700 proxy records. However, the majority (59%) of the records are tree rings, which are located primarily in the Northern Hemisphere (Pages 2K, 2017). Neukom (2019) acknowledges that tree-ring records are detrended and therefore do not capture centennial and multi-centennial trends; they also do not retain longer-term millennial trends. Furthermore, he confirms this problem compounds backwards in time and results in underestimation of low-frequency variability, especially during the first millennium of the Common Era, the era of warm-period analogs such as the MWP and RWP. The Pages 2K global mean calculations are driven by NH proxy data with an overemphasis on tree-ring proxies, which is the primary reason for the flattening in the past.

Christiansen and Ljungqvist (2017) provide an excellent analysis and discussion of the lack of preservation of low-frequency, longer-term variability in proxy records and large-scale temperature reconstructions. They note that tree-ring records have absolute annual dating control and can be cross-dated with other chronologies, but confirm that tree-ring data have problems preserving the very low frequencies and longer-term trends. Additionally, they state that averaging proxies acts as a low-pass filter, resulting in the signal being “flattened out,” thus preventing the true magnitude of cold and warm periods from being captured in temperature reconstructions. Both issues apply to the global mean across the RWP and MWP, which therefore should not be directly compared to the Present centennial warming in absolute temperature terms.
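Christiansen and Ljungqvist's low-pass-filter point is easy to demonstrate numerically. In the sketch below, each synthetic "proxy" records the same 1000-year sinusoidal climate signal but with a random dating error, and averaging the proxies visibly damps the signal's amplitude. The specific numbers (50 proxies, ~150-year dating uncertainty) are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(2000)
signal = np.cos(2 * np.pi * t / 1000)      # common 1000-yr cycle, amplitude 1.0

# Each synthetic proxy sees the same signal, shifted by a random dating error
# (std dev ~150 years, an illustrative assumption).
proxies = [np.roll(signal, int(rng.normal(0, 150))) for _ in range(50)]
stack = np.mean(proxies, axis=0)           # averaging acts as a low-pass filter

print(f"true amplitude:    {np.ptp(signal) / 2:.2f}")
print(f"stacked amplitude: {np.ptp(stack) / 2:.2f}")  # noticeably damped
```

The averaged stack keeps the timing of the warm and cold swings but understates their magnitude, which is exactly the "flattening" behavior described above.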

Additionally, pre-1000 AD tree-ring records are reduced in both quantity and quality. For an objective review of the Pages 2K tree-ring proxies, I recommend reading Steve McIntyre’s articles, which discuss the accuracy of tree-ring data, the divergence problem, and cherry picking of data.


Conclusions

The Pages 2K global mean published in 2019 does not capture the millennial cooling trend observed in the Arctic and Antarctic regional temperature reconstructions. The global mean relies on a database biased toward Northern Hemisphere tree-ring proxies, which do not preserve the long-term temperature trends of the polar regions.

The overall effect of the Pages 2K dataset and mean is to flatten temperature trends backwards in time, especially during the RWP and MWP, which are key present-day analogs. The cooling descent into the LIA is largely removed, and the warmer Arctic and Antarctic temperatures during the RWP and MWP are minimized and not represented by the global mean temperature.

Thus, the Pages 2K Consortium has flattened the global mean temperature profile in the past.

Acknowledgements: Special thanks to Donald Ince and Andy May for reviewing and editing this article.

References Cited:
Christiansen, B. & Ljungqvist, F. C. Challenges and perspectives for large-scale temperature reconstructions of the past two millennia. Rev. Geophys. 55, 40–96 (2017). https://agupubs.onlinelibrary.wiley.com/doi/full/10.1002/2016RG000521

Luterbacher J et al. European summer temperatures since Roman times. Environmental Research Letters 11, 024001, DOI: 10.1088/1748-9326/11/2/024001, 2016.

McKay, N. P. and Kaufman, D. S.: An extended Arctic proxy temperature database for the past 2,000 years, Scientific Data 1:140026, doi:10.1038/sdata.2014.26, 2014 Dataset: https://www1.ncdc.noaa.gov/pub/data/paleo/pages2k/arctic2014temperature-v1.1.txt

McIntyre, S. Climate Audit blog. https://climateaudit.org/?s=Pages

PAGES 2k Consortium: Continental-scale temperature variability during the past two millennia, Nat. Geosci., 6, 339–346, published online 21 April 2013, https://doi.org/10.1038/NGEO1797, 2013. Paywalled; dataset available, see above.

PAGES 2k Consortium- Neukom, R., Barboza, L.A., Erb, M.P. et al. Consistent multidecadal variability in global temperature reconstructions and simulations over the Common Era. Nat. Geosci. 12, 643–649 (2019). https://doi.org/10.1038/s41561-019-0400-0. Paywalled, but shared by the author at the following link. http://pastglobalchanges.org/science/wg/2k-network/nature-geosc-2k-july-19

Stenni, B., Curran, M. A. J., Abram, N. J., Orsi, A., Goursaud, S., Masson-Delmotte, V., Neukom, R., Goosse, H., Divine, D., van Ommen, T., Steig, E. J., Dixon, D. A., Thomas, E. R., Bertler, N. A. N., Isaksson, E., Ekaykin, A., Werner, M., and Frezzotti, M.: Antarctic climate variability on regional and continental scales over the last 2000 years, Clim. Past, 13, 1609–1634, https://doi.org/10.5194/cp-13-1609-2017, 2017.

Temperature Reconstruction Datasets

Arctic McKay, 2014. https://www1.ncdc.noaa.gov/pub/data/paleo/pages2k/arctic2014temperature-v1.1.txt
Antarctic Stenni, 2017. https://www1.ncdc.noaa.gov/pub/data/paleo/pages2k/stenni2017antarctica/CPSrecons/All_regions_recons_CPS.csv
Europe Luterbacher, 2016. https://www1.ncdc.noaa.gov/pub/data/paleo/pages2k/EuroMed2k/eujja_2krecon_nested_cps.txt
SH Neukom, 2014. https://www1.ncdc.noaa.gov/pub/data/paleo/contributions_by_author/neukom2014/SH_Fig2_recons_Ens-means_wrt1000-2000.txt
Pages 2K 2013 dataset. www.ncdc.noaa.gov/paleo/pages2k/pages-2k-network.html
Pages 2K-Neukom Ensemble Means 2019. https://www.ncdc.noaa.gov/paleo-search/study/26872.

145 thoughts on “Global Mean Temperature Flattens the Past”

  1. This entire project is invalidated by one thing:

    Temperature since 2001 is not rising. It is cooling. Not a “pause” in warming – a natural decline.

    Any claim for abnormal warming must answer to this: “Why does it not show up in USHCN and GHCN?” [50 million and 500 million recordings, respectively]

    Direct Measurement Rules.

    • How can you claim that the temperature since 2001 is declining? Perhaps it is not significantly different from that in 1998.

      • Direct measurement of surface air temp, both United States (USHCN) and the few global long-term stations (GHCN) show TMAX dropping from 2000 to now. You can download the data yourself, for instance as linked on this page for USHCN:

        https://theearthintime.com

        1999 was four degrees Fahrenheit higher, but back down four by 1974, then back up during the 1930s, etc. Welcome to the Holocene.

          • “Dates per windlord.”

            Very funny. I targeted 2000 because that is where the absurd hockey stick on the posted study starts to go postal. That is what must be trashed.

            I only make claims with the full context of 120-150 years of direct measurement — even when I examine one tiny segment of the organic sine curve. You know that, Jack.

            Nice try.

          • “I only make claims with the full context of 120-150 years”

            You said, very explicitly
            “Temperature since 2001 is not rising. It is cooling.”

            Jack Dale simply pointed out that this is quite wrong. On any measure, temperature since 2001 is rising. It is not cooling.

          • a) I made a claim about a tiny segment of the overall TMAX. You will notice I said “with the full context of 120-150 years,” so don’t go twisting. I can say something about a segment of the organic natural sine curve which displays no abnormal warming or cooling over 150 years … with the full context of that reality entrenched as the baseline of temperature.

            b) Millions of TMAX measurements since 2000 reveal clear downslope at this time. Measurement is king. Proxies, tropo satellites, etc, have to account for any variance with measurement.

          • windlord-sun

            b) Millions of TMAX measurements since 2000 reveal clear downslope at this time.

            According to NOAA’s US TMAX data there has been a very slight warming trend in TMAX between Jan 2000 and Dec 2019 (0.06 F/dec): https://www.ncdc.noaa.gov/cag/national/time-series/110/tmax/12/12/2000-2019?base_prd=true&begbaseyear=1901&endbaseyear=2000&trend=true&trend_base=10&begtrendyear=2000&endtrendyear=2019

            NOAA don’t supply TMAX data for global at that site, but the trend in global ‘average’ temperatures between Jan 2000 and Dec 2019 is +0.21 C per decade (note, Celsius), according to them. That NOAA global average trend is confirmed as +0.208 ±0.116 °C/decade (2σ) – statistically significant warming, according to The University of York Trend Calculator: http://www.ysbl.york.ac.uk/~cowtan/applets/trend/trend.html

            Where are you getting your information from?

          • I downloaded both USCHN and GCHN and plotted the data. 50 Million TMAX for USCHN and 500 Million for GCHN. Did you look at my website and vet my sourcing?

            That is “data.” I do not get my information from applets that drive anomaly analysis.

          • “I am using dates provided by windlord-sun.”

            So two wrongs make a right?

          • windlord-sun

            I downloaded both USCHN and GCHN and plotted the data. 50 Million TMAX for USCHN and 500 Million for GCHN. Did you look at my website and vet my sourcing?

            Yes, I looked at your website. As I have mentioned before, the only way I can reconcile your GCHN plot with that of NOAA is by taking ‘only’ the maximum daily temperatures recorded in any particular month.

            In other words, you (it seems to me, and I stand to be corrected) have singled out the warmest individual days in each month per year and used those to construct a chart purporting to show a trend in TMAX.

            That’s such an obviously flawed way of calculating trends that it barely needs an explanation; but here goes: say I support a sports team and use as my only metric for their success their biggest win in any particular series.

            They might have won 10-0 (soccer, say) in one particular game; but they might have lost or drawn every other game in the series. Would only counting that 10-0 game give an accurate assessment of the overall performance in that series?

            That’s what you are doing, if all you are doing is counting the highest daily TMAX figure per month and ignoring the rest of the data. NOAA include all the data. Do You?

          • You are way off. I don’t even understand how you came to the conclusion that I am “taking ‘only’ the maximum daily temperatures recorded in any particular month.” Please give the basis for your claim/suspicion.

            I parsed right down to the day. Seriously. To the day. That information is on files with the extension .dly for each weather station.

            This is serious data crunching.

            Download file: ghcnd_all.tar

            NOTE: these are not actually pure raw data. NOAA does not allow us to see that. The data have been adjusted to some extent.

            There are 115,074 separate files in the download folder, one for each weather station. It contains the entire history of that station, in one file. Besides TMAX, you find TMIN, SNOW, and PERC.

            Example of one line from one daily file for a GHCN station
            USC00201299.dly
            as follows:

            USC00201299192805TMAX 206 6 256 6 311 6 311 6 139 6 111 6 161 6 217 6 267 6 283 6 194 6 156 6 211 6 211 6 272 6 167 6 272 6 261 6 228 6 261 6 250 6 261 6 161 6 161 6 206 6 200 6 133 6 156 6 183 6 206 6 244 6

            Parsed out,
            Station: USC00201299
            Year: 1928
            Month: 05 (May)
            Type: TMAX (maximum temperature for the day)
            Note: the “6” found between each daily is a flag designation, quality control, etc.

            This is the same for USHCN, you can parse out fully.

            So, I run a program that parses out every daily TMAX for a station and posts it to a StationDays table. The graph of that table is a “strand of spaghetti.” Build up thousands of strands and you get the sawtooth that underlies the graph on my website. Then run a 5-year mean, or just draw the sine curve by sight.

            The result is a normal organic sine curve with an amplitude of 4 degrees Fahrenheit over 120-150 years, with no sign of abnormal warming or cooling.

            Why don’t you do this yourself? You could catch me in a big lie if I’m doing what you accuse me of.
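            [For readers who want to try this parsing themselves, here is a sketch of reading one fixed-width .dly record according to NOAA's published GHCN-Daily format: an 11-character station ID, year, month, element code, then 31 day-slots of a 5-character value in tenths of a unit plus three flag characters each, with -9999 meaning missing. The sample record is hypothetical, padded to full width; the per-day flags are skipped. - Ed.]

            ```python
            # Parse one fixed-width GHCN-Daily (.dly) record per NOAA's readme:
            # station (cols 1-11), year (12-15), month (16-17), element (18-21),
            # then 31 day-slots: value (5 chars, tenths; -9999 = missing) + 3 flags.
            def parse_dly_line(line):
                record = {
                    "station": line[0:11],
                    "year": int(line[11:15]),
                    "month": int(line[15:17]),
                    "element": line[17:21],
                    "values": [],
                }
                for day in range(31):
                    start = 21 + day * 8          # 8 chars per day-slot
                    value = int(line[start:start + 5])
                    record["values"].append(None if value == -9999 else value / 10.0)
                return record

            # Hypothetical TMAX record (values in tenths of deg C), padded to 31 days:
            sample = "USC00201299192805TMAX" + "  206 6 " + "  256 6 " + "-9999   " * 29
            rec = parse_dly_line(sample)
            print(rec["station"], rec["year"], rec["month"], rec["element"])
            print(rec["values"][:2])   # [20.6, 25.6]
            ```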

          • @TheFinalNail

            Do you not have even a modicum of human integrity to withdraw your unwarranted serious accusation against me? It deserves an apology, but at least a withdrawal?

            Not that your silence after being presented with reality is disappointing. It is actually rather dramatic. Your unrequested soccer lecture on how to analyze data falls flat when suddenly you see the depth and detail in the sine curve at TheEarthInTime, along with the response I posted to your accusatory “NOAA shows all data, do you?”

            What’s far more important than my satisfaction in your silence is this: you can’t touch the logic in the conclusion revealed by NOAA’s own (adjusted/homogenized/redacted) dataset. That’s what I care about:

            Per NOAA, there is no sign of abnormal warming in the world.

        • @ TheFinalNail

          “NOAA include all the data. Do You?”

          Okay, I am officially insulted.

          Here’s my comeback. “O, thank goodness, there’s something I need from NOAA and you assure me they include all the data. Please link me to the raw data for the 400+ USA weather stations that NOAA began blacklisting, starting in the YearOfMann, 1989. I am so looking forward to getting that raw data.”

      • So, how do they explain their error, Jack?

        Direct measurement of surface TMAX by NOAA shows decline since 2000, with 2019 being the 2nd lowest in recorded history. We are in a trough. It will start rising again in 5-10 years.

      • That’s MEAN temperature, not MAX. If you dig into the ground station data, you do find Cooling of Max Temperatures (daytime) but Warming in the MIN Temperatures (nighttime) …

        • If you dig into the ground station data, you do find Cooling of Max Temperatures.

          From and to what dates? I can’t find any cooling in the TMAX between 2000 and the present in any NOAA US data. Indeed, they show a fractional warming trend.

          The global average data much more so – statistically significant, in fact.

          What is your source for your claim that there has been a cooling trend please?

      • If you have a look at, e.g., the Berkeley data, download March 2019 vs. May 2020

        2018 1 1,234 - 1,176
        2018 2 1,172 - 1,121
        2018 3 1,565 - 1,508
        2018 4 1,472 - 1,422
        2018 5 1,215 - 1,150
        2018 6 1,151 - 1,102
        2018 7 1,225 - 1,177
        2018 8 1,094 - 1,094
        2018 9 0,804 - 0,834
        2018 10 1,206 - 1,191
        2018 11 0,681 - 0,651
        2018 12 1,136 - 1,067

        Only one year as an example of the ongoing manipulation; the data start in 1850
        http://berkeleyearth.lbl.gov/auto/Global/

        • In the Berkeley “TMAX complete data” I found:

          with 15982673 data points

          Estimated Jan 1951-Dec 1980 absolute temperature (C): 14,40 +/- 0,09

          As Earth’s land is not distributed symmetrically about the equator, there
          exists a mean seasonality to the global land-average,

          Estimated Jan 1951-Dec 1980 monthly absolute temperature:
          Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
          8,15 8,99 11,26 14,30 17,23 19,29 20,12 19,70 17,97 15,06 11,71 9,11
          +/- 0,11 0,09 0,09 0,09 0,09 0,10 0,10 0,10 0,09 0,09 0,09 0,10

          • I have to add, that at the first view, I didn’t realise the following in my 2020 data download (above it was from 3/2019):

            with 17710644 data points
            Estimated Jan 1951-Dec 1980 absolute temperature (C): 14.42 +/- 0.11
            As Earth’s land is not distributed symmetrically about the equator, there
            exists a mean seasonality to the global land-average.
            Estimated Jan 1951-Dec 1980 monthly absolute temperature:
            Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
            8.17 9.01 11.28 14.30 17.23 19.30 20.14 19.71 17.98 15.08 11.73 9.13
            +/- 0.13 0.12 0.11 0.11 0.12 0.11 0.12 0.12 0.11 0.13 0.12 0.12

    • Yes, direct measurement rules:

      http://temperature.global/?fbclid=IwAR1mhZfsFG7WnZYOjTznx_Yvy-_MguXETmvV-cioDlJGGsEqNoWppwAMrUo

      “Temperature.Global calculates the current global temperature of the Earth. It uses unadjusted surface temperatures. The current temperature is the 12M average mean surface temperature over the last 12 months compared against the 30 year mean. New observations are entered each minute and the site is updated accordingly. This site was created by professional meteorologists and climatologists with over 25 years experience in surface weather observations.
      Data Sources

      NOAA Global METARs
      NOAA One-Minute Observations (OMOs)
      NBDC Global Buoy Reports
      MADIS Mesonet Data ”

      “Stations processed last hour: 68110”

      “The recorded global temperature for previous years:
      2015 average: 0.98 °F (0.54 °C) below normal
      2016 average: 0.48 °F (0.27 °C) below normal
      2017 average: 0.47 °F (0.26 °C) below normal
      2018 average: 1.33 °F (0.74 °C) below normal
      2019 average: 0.65 °F (0.36 °C) below normal”

  2. Climate in northern Europe reconstructed for the past 2,000 years: Cooling trend calculated precisely for the first time

    09.07.2012
    An international team including scientists from Johannes Gutenberg University Mainz (JGU) has published a reconstruction of the climate in northern Europe over the last 2,000 years based on the information provided by tree-rings. Professor Dr. Jan Esper’s group at the Institute of Geography at JGU used tree-ring density measurements from sub-fossil pine trees originating from Finnish Lapland to produce a reconstruction reaching back to 138 BC. In so doing, the researchers have been able for the first time to precisely demonstrate that the long-term trend over the past two millennia has been towards climatic cooling. “We found that previous estimates of historical temperatures during the Roman era and the Middle Ages were too low,” says Esper. “Such findings are also significant with regard to climate policy, as they will influence the way today’s climate changes are seen in context of historical warm periods.” The new study has been published in the journal Nature Climate Change.

    https://www.uni-mainz.de/eng/bilder_presse/09_geo_tree_ring_northern_europe_climate.jpg

    • Krishna Gans:

      If the graph shown in your reference were to be extended from 2000 to 2020, the alleged cooling trend would probably disappear

      • Why? There was no statistically significant warming from 1998 to 2016 super El Ninos, and cooling since then.

          • OK Jack, no warming since the Holocene Climate Optimum. Is that more to your liking?

          • The Holocene Optimum was followed by 6000 years of cooling which abruptly reversed in the past 2.5 centuries. We should be much cooler.

          • Jack complaining about cherry picking. Now that’s funny.
            Regardless, how is comparing one El Nino to another an El Nino cherry pick?

          • Not a cherry pick to compare two super El Ninos.

            There has been no statistically significant warming in this century, and we’re still headed down, after over four years since the last super El Nino, without a major La Nina, if any at all.

          • Jack,

            No we should not be much cooler, although the post-Minoan Warm Period, 3000-year downtrend is still intact.

            The Holocene Optimum ended about 5200 years ago. The Egyptian WP, ~4 Ka, was about as warm as the HCO, as was the Minoan WP, ~3 Ka. Then the downtrend started in earnest. The Roman WP peak, ~2 Ka, was lower than the Minoan. The Medieval WP, ~1 Ka, was lower than the Roman. So far the Modern WP is still cooler than the Medieval.

            So natural cycles yet rule. Should the Modern WP ever get hotter than the Medieval and stay there for a couple of 60-year normal fluctuations, then you can start talking about rejecting the null hypothesis of nothing out of the climatic ordinary happening.

            We’re still in the 3000-year secular cooling trend, with millennial scale counter trend warming cycles, but each cooler than the next. The Little Ice Age Cool Period was also probably colder than the prior, Dark Age CP between the Roman and Medieval WPs.

          • PS:

            The massive East Antarctic Ice Sheet quit retreating about 3000 years ago. It still hasn’t moved during the Modern WP, any more than it did during the Roman and Medieval WPs.

            No surprise, since there has been no warming at the South Pole since continuous observations began in 1958. Yet that is the spot on Earth which should show the most warming from a fourth molecule of CO2 per 10,000 dry air molecules, the air there being so dry.

          • Jack “Arrhenius predicted that Antarctic would warm less quickly”

            We don’t do earth science like Arrhenius did, he “knew” nothing, and if we did the science like he did, we’d largely still know nothing.

            Sheesh, Arrhenius said the atmosphere would be full of carbonic acid ffs 😀

        • John Tillman:

          Temperatures were abnormally high in 1998 because of the 1997-1998 El Nino (avg. anom. J-D temp. +0.62 deg. C), and using that date will naturally give the impression that temperatures since then have been decreasing.

          In 2000, they were +0.40 deg. C, and for 2019 they were +0.87 deg. C. The 0.47 deg. C increase since 2000 would suggest that the downtrend may not actually exist.

        • Jack,

          It hasn’t warmed at all, yet enjoys the same level of plant food, with few plants to make good use of it.

          Where water vapor is at its lowest, whatever GHE CO2 may have should be highest. It isn’t.

          Arrhenius rightly considered any AGW beneficial, as did Callendar.

          • BTW – Lots of substances are beneficial; however they have optimal levels. Selenium is beneficial in the human diet. At 400 PPM by weight, it is toxic.

            It is warming, regardless of what your TMAX says.

            If you think CO2 is plant food at all levels, I suggest you increase your intake of quarter pounders with cheese; your body will react with increased growth; it is human food.

          • Jack.

            You missed my point, which was about the relative amount of H2O vs. CO2 above the South Pole. The low level of water vapor there means that CO2 should have its greatest relative effect there, but it has not done so despite continuous observation for over 60 years.

            I don’t need your lesson. I’m fully aware of the relative GHE of water vs. carbon dioxide, due both to water’s much greater abundance and more potent effect, and to their overlapping absorption bands.

          • John – you really should have read the lesson

            “It’s true that water vapor is the largest contributor to the Earth’s greenhouse effect. On average, it probably accounts for about 60% of the warming effect. However, water vapor does not control the Earth’s temperature, but is instead controlled by the temperature. This is because the temperature of the surrounding atmosphere limits the maximum amount of water vapor the atmosphere can contain.”

            Since the RH of the Antarctic is low, the 60% contribution of H2O as a GHG would be missing.

            CO2 is a driver, H2O is an amplifier. No H2O, no amplification, would mean cooler temperatures.

          • John – Slight correction to my previous post. The Antarctic has a low absolute humidity (not RH), i.e, less H2O vapor. Still no amplification effect.

  3. Further:

    Solar insolation changes, resulting from long-term oscillations of orbital configurations, are an important driver of Holocene climate. The forcing is substantial over the past 2,000 years, up to four times as large as the 1.6 W m-2 net anthropogenic forcing since 1750 (ref. A), but the trend varies considerably over time, space and with season. Using numerous high-latitude proxy records, slow orbital changes have recently been shown to gradually force boreal summer temperature cooling over the Common Era. Here, we present new evidence based on maximum latewood density data from northern Scandinavia, indicating that this cooling trend was stronger (-0.31 °C per 1,000 years, ±0.03 °C) than previously reported, and demonstrate that this signature is missing in published tree-ring proxy records. The long-term trend now revealed in maximum latewood density data is in line with coupled general circulation models indicating albedo-driven feedback mechanisms and substantial summer cooling over the past two millennia in northern boreal and Arctic latitudes. These findings, together with the missing orbital signature in published dendrochronological records, suggest that large-scale near-surface air-temperature reconstructions relying on tree-ring data may underestimate pre-instrumental temperatures including warmth during Medieval and Roman times.

    Orbital forcing of tree-ring data
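    The "-0.31 °C per 1,000 years" figure quoted above is an ordinary least-squares slope rescaled to millennia. A minimal sketch with synthetic numbers (not the study's data; the function name is mine):

```python
# Fit an ordinary least-squares slope to (year, anomaly) pairs and
# express it as deg C per 1,000 years, the usual unit for millennial trends.

def trend_per_millennium(years, temps):
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps))
             / sum((y - my) ** 2 for y in years))  # deg C per year
    return slope * 1000                            # deg C per 1,000 years

# Toy series cooling 0.3 deg C over one millennium:
years = [0, 250, 500, 750, 1000]
temps = [0.15, 0.075, 0.0, -0.075, -0.15]
trend = trend_per_millennium(years, temps)  # about -0.3
```

    Real reconstructions weight by proxy uncertainty and use far more samples, but the arithmetic of the trend itself is no more than this.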

  4. “……some curves are easier to flatten than others”; not so with Kardashian curves.

  5. Funny thing is, I know of a plastic surgeon (with time on his hands right now because of social distancing) who offers to flatten naturally acquired curves in bellies and butts for any desired effect with his liposuction machine, also for a fee. And I have little doubt about the ordinary appeal of his alterations.

  6. From Judith Curry’s Week in Review – climate science edition: https://judithcurry.com/2020/05/02/week-in-review-climate-science-edition-2/

    “A new study affirms Northern Eurasia (Sweden, Yamal) has warmed 3 to 6 times SLOWER in the 20th century than during the 4th, 15th and 19th centuries. 1900s-2000s warming: 0.37°C to 0.85°C/100 yrs Roman, Medieval, 1800s warming: 1.37°C to 3.31°C/100 yrs” https://link.springer.com/article/10.1007/s00382-020-05179-5

  7. Look, I’m always impressed by the concentration discipline of researchers like Renee who can dig around in silos of numbers and find inconsistencies in historic temperature reconstructions and conclusions, but I still remain skeptical about any real-world accuracy or practical application of purported average planetary temperatures of hundredths of one degree C over decades, centuries or millennia.

    At the current stage of our knowledge about Earth’s atmospheric, oceanic and terrestrial interactions, we can’t produce a weather forecast that has more than a 50/50 chance of accuracy even 4 days out.

    • Isn’t this the main point? If you were to average the current global temperatures over the same century or millennial periods as the proxy data, you would completely flatten the current curve and we would end up worrying about – nothing at all?

  8. “Global Mean is Biased by Tree Ring Proxies”

    You mentioned Steve McIntyre and Climate Audit, but didn’t mention one of the most important things he uncovered: The extensive ex-post selection of proxies which only show desired results. That’s the root of every single one of these paleo reconstructions, whether tree rings, lake sediments, etc. Almost all of them are polluted in such a way.

    • Still trying to drive clicks to your site, eh? Why not submit one of your posts as a guest post here?

      • Jeff Alberts: “Why not submit one of your posts as a guest post here?”

        WR: Could be interesting

      • Thank you, Jack.

        OK, I downloaded it. I’m looking at North America tree proxies. All I see are anomalies; where’s the original?

        For example, tree proxy ca528. How do I work out the original temperature? For example in 1987.

        If I can’t figure that out I can’t compare it to what was recorded by thermometers in the area.

        See? No data!

      • Thank you, Renee.

        The data is still in anomalies, not absolute temperatures.

        For all I know, the actual temperatures could be 2.3 degrees below thermometer temperatures.

        For all I know, they could have used different baseline years for different trees.

        No data!

      • What global mean?

        Meridional Distributions of Historical Zonal Averages and Their Use to Quantify the Global and Spheroidal Mean Near-Surface Temperature of the Terrestrial Atmosphere by Gerhard Kramm, Martina Berger, Ralph Dlugi, Nicole Mölders

        The zonal averages of temperature (the so-called normal temperatures) for numerous parallels of latitude published between 1852 and 1913 by Dove, Forbes, Ferrel, Spitaler, Batchelder, Arrhenius, von Bezold, Hopfner, von Hann, and Börnstein were used to quantify the global (spherical) and spheroidal mean near-surface temperature of the terrestrial atmosphere. Only the datasets of Dove and Forbes published in the 1850s provided global averages below ⟨T⟩ = 14 °C, mainly due to the poor coverage of the Southern Hemisphere by observations during that time. The global averages derived from the distributions of normal temperatures published between 1877 and 1913 ranged from ⟨T⟩ = 14.0 °C (Batchelder) to ⟨T⟩ = 15.1 °C (Ferrel). The differences between the global and the spheroidal mean near-surface air temperature are marginal. To examine the uncertainty due to interannual variability and different years considered in the historic zonal mean temperature distributions, the historical normal temperatures were perturbed within ±2σ to obtain ensembles of 50 realizations for each dataset. Numerical integrations of the perturbed distributions indicate uncertainties in the global averages in the range of ±0.3 °C to ±0.6 °C, depending on the number of available normal temperatures. Compared to our results, the global mean temperature of ⟨T⟩ = 15.0 °C published by von Hann in 1897 and von Bezold in 1901 and 1906 is notably too high, while ⟨T⟩ = 14.4 °C published by von Hann in 1908 seems more adequate within the range of uncertainty. The HadCRUT4 record provided ⟨T⟩ ≅ 13.7 °C for 1851-1880 and ⟨T⟩ = 13.6 °C for 1881-1910. The Berkeley record provided ⟨T⟩ = 13.6 °C and ⟨T⟩ ≅ 13.5 °C for these periods, respectively. The NASA GISS record yielded ⟨T⟩ = 13.6 °C for 1881-1910 as well. These results are notably lower than those based on the historic zonal means. For 1991-2018, the HadCRUT4, Berkeley, and NASA GISS records provided ⟨T⟩ = 14.4 °C, ⟨T⟩ = 14.5 °C, and ⟨T⟩ = 14.5 °C, respectively. The comparison of the 1991-2018 globally averaged near-surface temperature with those derived from distributions of zonal temperature averages for numerous parallels of latitude suggests no change for the past 100 years.

        You have the choice, what global mean you mean 😀

        • Krishna,
          “What global mean?”

          The global mean evaluated in this post is from the 2019 Pages 2K global temperature mean at the following reference.
          PAGES 2k Consortium: Neukom, R., Barboza, L.A., Erb, M.P. et al. Consistent multidecadal variability in global temperature reconstructions and simulations over the Common Era. Nat. Geosci. 12, 643–649 (2019). https://doi.org/10.1038/s41561-019-0400-0. Paywalled, but shared by the author at the following link. http://pastglobalchanges.org/science/wg/2k-network/nature-geosc-2k-july-19

          • You misunderstood, or I was not precise enough.
            What I wanted to show is how many different “global means” there are; the issue is not that you provided “global means”, but that you gave anomalies without the base temperature…

          • Krishna,
            This evaluation covers the past 2000 years and uses regional reconstructions, and the global mean of these reconstructions, based on proxy data. In order to compare proxy data, the authors provided temperature anomalies. The instrumental data used in your example only covers 150 years at best. You are more than welcome to explore the proxy data and plot up temperatures instead of the temperature anomalies presented here.

        • Krishna,
          “The problem with these provided data is that there is no indication of the base from which the anomalies have been calculated”

          The anomalies in Figure 1 are relative to a 1961-1990 base and in Figure 2 relative to a 1600-1700 base. This is clearly stated on the figures.
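          Mechanically, switching baselines only shifts the zero line; the shape of the curve is unchanged. A toy sketch (invented numbers, not the Pages 2K code):

```python
# Rebase a temperature anomaly series onto a new reference period by
# subtracting the series' own mean over that window. Only the zero
# line moves; every point shifts by the same constant.

def rebase(years, anomalies, new_start, new_end):
    window = [a for y, a in zip(years, anomalies) if new_start <= y <= new_end]
    offset = sum(window) / len(window)
    return [a - offset for a in anomalies]

# Anomalies quoted against a 1961-1990 style base (invented values),
# re-expressed so the 1600-1700 mean becomes zero:
years = [1600, 1650, 1700, 1950, 2000]
anoms = [-0.6, -0.7, -0.5, 0.0, 0.4]
rebased = rebase(years, anoms, 1600, 1700)  # the 1600-1700 mean is -0.6
```

          With an LIA baseline the modern values simply read higher by the same constant everywhere; no point gains or loses relative to any other.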

          • Renée, do you really believe I will brachiate through all the links to find out what global mean temperature they used to calculate the anomalies?
            You said you would present global mean temperatures, but you didn’t; instead you came around the corner with anomalies on an unknown base temperature. The one suspect date range I found myself: 1961-1990 CE. And if you are going to show me figures, then please do.

          • Go Krishna Gans!

            My line is this: “If you show me a chart driven by anomalies, I then have to ask for all the underlying raw data to validate how you got your mean as well as your object data. Please just give me the raw data.”

          • You ask me to do your work? Really?
            You tell us you are providing global mean temperatures but show us anomalies, and you wonder that we ask for the global mean base / absolute temperature you claimed to provide?
            And last but not least, you link again to what you call temperature data, but those are still anomalies?
            Yes, I opened the file and found 692 links.
            I followed one and another, and imagine what I found:
            anomalies, or whatever that may be. Thank you very much for nothing!

          • Spaghetti vs Duct Tape

            Isn’t it futile to duct-tape vastly differing proxy reconstructions in a graph? Even worse, an attempt to meld them together into one reconstruction? None of them have precision, ice cores worst of all.

            It would be useful to see each type of proxy clearly in itself, its own start/stop, its own sine wave. Spaghetti.

            Frankly, if curving trends do not then display congruence of some kind, the conclusion must be that no claims of trend reconstruction from the proxies mean much.

          • Krishna,
            Not sure what you’re talking about, but this file contains all the temperature and isotope data. Here’s an example; it looks like temperature data and not anomalies to me.
            https://www1.ncdc.noaa.gov/pub/data/paleo/pages2k/pages2k-temperature-v2-2017/data-current-version/Afr-LakeTanganyika.Tierney.2010.txt

            The authors statistically calculated 7 different means of 7000 proxy data records. Each of the 7000 proxy datasets has its own temperature mean, which is why they use temperature anomalies. It is time and computer intensive. Sorry if the files are a bit more overwhelming than the hundred instrumental station datasets that you’re used to seeing. Here’s a link to a recent article that describes the process of normalizing and data QC of all these proxy records.
            https://www.nature.com/articles/s41597-020-0445-3.pdf
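            The reason records with very different absolute levels are reduced to anomalies before compositing can be shown in a few lines. A toy sketch (invented numbers, not the actual 7,000-record workflow):

```python
# Two proxies can record the same variation at very different absolute
# levels (say a tropical lake vs a polar ice core). Centering each on
# its own mean over a common reference window removes the absolute
# offsets so the records can be stacked into one composite.

def to_anomaly(series, ref):
    base = sum(series[ref]) / len(series[ref])
    return [v - base for v in series]

proxy_a = [24.1, 24.3, 24.2, 24.6]      # warm-site record, deg C (made up)
proxy_b = [-30.2, -30.0, -30.1, -29.7]  # cold-site record, deg C (made up)

ref = slice(0, 3)  # common reference window: first three samples
anom_a = to_anomaly(proxy_a, ref)
anom_b = to_anomaly(proxy_b, ref)

# Element-wise mean of the anomaly series: a simple composite.
composite = [(a + b) / 2 for a, b in zip(anom_a, anom_b)]
```

            Averaging the raw values instead would give numbers near -3 °C that reflect site geography rather than climate variation, which is why composites are built from anomalies.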

  9. In 1998, the climate wroughters were dismayed that even with this super El Nino, late 1930s temperatures were still the high point. When T didn’t rise above 1998, Hansen, before retirement, pushed 1930s-40s temperatures down about 0.5C (without changing the 100+ yr T rise of 0.6C). In one fell swoop, he not only got rid of the dirty 30s high, but removed the deep 35-yr cooling (the “Ice Age Cometh” fear) from 1945 to 1979. (BEST even uses an algorithm that fixes Ice Age Cometh-type dips by assuming a sort of ‘discontinuity’ and sliding the low up to match T at the high that follows.)

    The rationale of the changes from the wroughters becomes clear when considering their catastrophic warming theory. You can’t have alarm when 0.5C of the temperature rise from 1850 to the present occurred before 1940, and temperatures then declined 0.3C to 1979 with CO2 galloping upwards unabated. What kind of ‘control knob’ is that?

    • Climate Dowsing – is a type of divination employed in attempts to produce a desired regional temperature reconstruction that affirms pre-conceptions, aka confirmation bias. The divining rods in the case of Pages 2K are tree ring data sets, ex-post selected to produce the desired result.

  10. Renee Hannon: “As discussed in my previous post, I prefer using the LIA 1600-1700 AD as a baseline rather than the 1961-1990 baseline for extended temperature reconstructions.”

    WR: Figure 2 of that post is really interesting. The Arctic amplification shows a three century up trend and an eight century down ward trend. Comparable to the upward trend into interglacials and the downward trend into the following glacials: https://andymaypetrophysicist.files.wordpress.com/2017/07/071717_0114_theroleofoc1.png?w=700

    Both trends are indicating that a change in oceanic behavior plays an important role in the creation of periods of higher temperatures. Surfaces of oceans can warm fast (a period of less wind will cause ocean warming because of less deep (cold) oceanic upwelling and because of less mixing of the upper layer with colder layers below ) and warmer oceans can cause deep oceans to warm as well. Cooling the deep oceans however takes more time.

    And the main changes seem to happen in the Arctic. Conform the warming that in recent warming period is noticed: mainly Northern Hemisphere and within the NH mainly the Arctic area. That is exactly where warmer oceans can create most changes.

    • Wim,
      The centennial trends in the Arctic are quite interesting and I agree the oceans play a key role. These Arctic centennial trends which occur every 600-800 years show an abrupt warming and slower cooling over time with decadal fluctuations. It does seem the 1940 warm peak ended the abrupt warming increase coming out of the LIA and now we are experiencing decadal temperature swings in the Arctic. Yes, the centennial trends seem to mimic the interglacial-glacial cycles, just on a smaller scale, rapid warming and gradual cooling.

      These centennial warming events are suppressed in the Antarctic data, which shows mostly the underlying longer-term millennial cooling trend.

      • Renee: “These centennial warming events are suppressed in the Antarctic data, which shows mostly the underlying longer-term millennial cooling trend.”

        WR: The Antarctic is not susceptible to subsurface warm water inflows like the Arctic. For this reason no Southern Hemisphere decadal and centennial scale fluctuations happen on the same scale as in the Arctic.

        A second reason is that the climate system of the Antarctic is rather independent. Unlike the Arctic, the high and cold Antarctic surface sits under a continuous high-pressure (HP) area over the South Pole. This HP area produces a continuous downward flow of ice-cold air that streams northward over the southern oceans. This flow is very stable and suppresses climate influences from elsewhere.

        Antarctic sea ice is much more stable than Arctic sea ice for the reasons above. A large recent Antarctic sea ice loss in the second half of the last decade was quickly ‘repaired’ by the more stable HP area over the Antarctic. The huge ice masses of the Antarctic create their own weather and climate system.

          • Phil: “Was this all real, or has anyone fessed up to some kind of equipment issue?”

            WR: Following the 2015/2016 El Nino, weather patterns changed, also in the SH. It has been said that uncommon storms were able to disrupt the mass of sea ice. Normally those low-pressure areas circle widely around the Antarctic. Now they seem to have mixed more of the upper layers closer to Antarctica, disrupting sea ice for some years.

            But soon the high-pressure system became strong enough to keep the low-pressure areas at a distance again. The present situation regarding pressure systems, one and a half months before South Pole winter and about two and a half months before the coldest period, is visible here: https://earth.nullschool.net/#current/wind/surface/level/overlay=mean_sea_level_pressure/orthographic=-220.32,-86.16,325/loc=87.579,-89.988

            Thanks for the response Wim. I’m still a bit skeptical though, as the really big highs came right before that big, long El Nino. I use a lot of equipment to measure stuff in my lab, and that all seemed just too far above normal, followed by too far below normal, and now back to bang on the 1981-2010 average.

            Are you aware of anyone tasked with analyzing the satellite data having some discussion about this – real or some equipment issue?

            Phil, as long as satellite data are observed in the same way as in other periods, I have no strong doubts about the rising or falling trends observed. As far as sea ice data from the Antarctic are concerned, I cannot remember having seen a serious doubt/discussion about the quality of the data.

            The rise in the quantity of Antarctic sea ice before 2015 takes away some of the possible doubt: this rise was a big problem for the doom predictors who predicted ice loss while CO2 was rising. Nevertheless the rise was noted. The same for the heavy loss in the years after 2015: inconvenient for other people.

            A temporary fall in strength of the Antarctic high pressure area makes LP areas come closer to the continent during longer periods of time. Those LP areas (heavy storms) are mixing a lot of water and bringing in warmer water from the North, changing temperatures and salinity and causing temporary changes in vertical and horizontal water movements. I can imagine that a short time of warmer water around Antarctica diminished sea ice for some years until the influence of the HP area was fully back. There is no reason why this change in weather pattern could not happen before the El Nino became visible. All weather systems influence each other, no sequence is prescribed.

        • Wim,

          “A second reason is that the climate system of the Antarctic is rather independent. “

          Additionally, the Antarctic follows an inverse solar insolation pattern from the Arctic.

        • Thanks again Wim for the response. That was just a weird 5-year oscillation, seen for the first time in the satellite record from 1979.

    • Michael,
      Yes, Steve’s link is included in my post. This post compares regional reconstructions to the 2019 updated Pages 2K database, which is not much better.

  11. G.M MacDonald, K.V Kremenetski and D.W Beilman (2007).
    Climate change and the northern Russian treeline zone:
    “… Dendroecological studies indicate enhanced conifer recruitment during the twentieth century. However, conifers have not yet recolonized many areas where trees were present during the Medieval Warm period (ca AD 800–1300) or the Holocene Thermal Maximum (HTM; ca 10 000–3000 years ago) …”.
    https://royalsocietypublishing.org/doi/10.1098/rstb.2007.2200

    • Not to mention the bristlecone pines that were alive in the Roman Warm Period at the wonderful Ancient Bristlecone Pine Forest near Mammoth Lakes, California:

      https://www.fs.usda.gov/detail/inyo/specialplaces/?cid=stelprdb5129900

      You will have to walk up the hill if you want to be close to a RWP one, but you might get inducted into the National Academy of Sciences, and even get a paper in Nature Climate Change, if you report that you had to walk downhill.

  12. When are these people, similar to Dilbert’s boss, going to acknowledge that non-random data (all the data here are non-random averages of one sort or another) full of short-term trends will not give any information?
    Before it is even sampled, it has been averaged in several ways, depending on what it is. Ice cores do not form from solid layers of ice, but from snow that compacts into ice over the course of years. That blurs any date or temperature from one year into roughly 100 years (another averaging).

    I have never seen any sound argument that averaging temperature over 100 years “increases” the amount of info we get.
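    The blurring described above is easy to demonstrate: a centennial-scale average flattens short excursions roughly in proportion to their duration within the window. A toy sketch (synthetic series, not real firn physics):

```python
# A ~100-year moving average damps a decade-long excursion by about a
# factor of ten: the spike occupies only 10 of the 100 years in each
# window, so a +1.0 deg C decadal spike survives as roughly 0.1 deg C.

def moving_average(series, window):
    half = window // 2
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - half):i + half]
        out.append(sum(chunk) / len(chunk))
    return out

annual = [0.0] * 500
for yr in range(245, 255):  # ten-year warm spike of +1.0 deg C
    annual[yr] = 1.0

smoothed = moving_average(annual, 100)
peak_after = max(smoothed)  # about 0.1, vs 1.0 in the annual series
```

    Whether that smoothing destroys information or merely trades resolution for noise suppression is exactly the point at issue in the comment above.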

  13. There is no way known to (scientific) man that temperatures over the past 2000 years can be reconstructed by any known method to an accuracy better than +/- 2 degrees C (2 sigma, if you use customary conventions to express this.)
    I have made this assertion after decades of study of numbers and methods used in climate research.
    There are some simple arguments: for example, that the accuracy of proxy reconstructions cannot be better than the accuracy of direct instrumental thermometry. Another example: as Pat Frank noted long ago, many authors seem not to know the important difference between accuracy and precision. Another example: it is scientifically invalid to use subjective rejection of inconvenient data; see Steve McIntyre on ex-post selection in PAGES 2K.
    I have made this assertion. Prove it wrong if you can. Sadly, nobody wants to, because so many incomes depend on continuing the farce. Geoff S

    • Geoff,
      “that the accuracy of proxy reconstructions cannot be better than the accuracy of direct instrumental thermometry.”

      No argument here. Unfortunately instrumental data has only recorded the past 150 years of temperatures. And yes, the data over the past 2000 years is imperfect data with a higher error bar. The proxy reconstructions and their input raw data still contain valuable information.

      • Renee,
        Please define “valuable data” and give 3 examples.
        Simply, you might BELIEVE that there are valuable data, but the proof has to come from observation, replication of such, knowledge of absolute error, confounding variables and so on in ways that used to be routine.
        We did not describe data as valuable. That is subjective. We classed it as valid or not.
        Why, just today I found very strong circumstantial evidence that NOAA is inserting made-up numbers for missing data in their daily ML CO2 public data. We used to sack people caught making up data. Geoff S

        • You are exactly on target. “DATA” is actual physical measurements of real physical phenomena. Much of the information used to calculate so-called Global Average Temperature is actually interpolated and homogenized calculated RESULTS, and therefore it is not data.

          Mathematicians and statisticians consider the numbers as data because that is the focus of their work, i.e., manipulating the numbers. The physical value contained in the DATA is secondary to their primary purpose.

          They ignore measurement uncertainty, significant digits, etc. in their search for better numbers. Many believe you can decrease uncertainty and increase precision through averaging. I don’t think 1 out of 10 have any metrology or trending education at all.
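          The point in dispute here can be separated neutrally with a toy simulation: averaging does shrink the random component of error (the standard error of the mean falls as sigma over the square root of N), but it cannot touch a systematic bias shared by every reading. A sketch with made-up numbers:

```python
import random

# Simulate N readings of a true value contaminated by per-reading
# Gaussian noise plus a constant calibration offset. The mean of many
# readings converges to truth + offset, never to the truth itself:
# averaging removes random error, not systematic bias.

random.seed(1)

TRUE_VALUE = 15.0  # deg C (invented)
BIAS = 0.5         # calibration error shared by every reading
SIGMA = 1.0        # random per-reading noise

def mean_of_readings(n):
    readings = [TRUE_VALUE + BIAS + random.gauss(0, SIGMA) for _ in range(n)]
    return sum(readings) / n

big_mean = mean_of_readings(100_000)
residual_bias = big_mean - TRUE_VALUE  # stays near 0.5, not 0.0
```

          So both sides of the exchange have a piece of it: precision of the mean improves with N, but accuracy is capped by whatever bias the instruments share.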

          • Jim, yes.

            50 million reported recordings of TMAX by 1200 weather stations in the United States (USHCN) for 120 years are data. Also any global stations that have a 100-year-plus history. Are some “inaccurate?” “Skewed?” “Poorly calibrated?” Yes … plus siting issues like urban heat island creep.

            However, these problems have a random over/under*. With such a massive sample, things even out. The data can be plotted (not ‘averaged’) and the curving trend becomes visible.

            If, instead, the datapoints are filtered, gridded, estimated, homogenized, repressed … you no longer have data. You have a “climate model.”

            *note: UHI is not random over/under. It skews “up.” This would favor warming alarms, if anything. In the overall, however, UHI is not enough to destroy the truth of the sine curve of measured TMAX.

          • Jim,

            “They ignore measurement uncertainty, significant digits, etc. in their search for better numbers. Many believe you can decrease uncertainty and increase precision through averaging. I don’t think 1 out of 10 have any metrology or trending education at all.”

            It is so wonderful to hear someone mention metrology. I’m in aerospace manufacturing, and you don’t keep from having AOG (aircraft on ground) by averaging! I wish the anomaly-obsessed would get that.
            Also to hear your precise (irony) observation that you can’t decrease uncertainty by averaging.

            You have to plot the data, then observe the curve.

        • Geoff,
          In order to study past climates, it is necessary to use proxy data, which cannot measure temperatures directly and have a higher error bar than instrumental data. You are free to limit your definition of data to instrumental records only. But I prefer using all available data, even data with less than perfect precision, rather than cherry-picking by using only instrument data.

          I agree with Jim Gorman that the Global Mean is homogenized and inaccurate particularly backwards in time.

          • Renee,
            Nowhere have I said to use only instrumental data.
            I have drawn attention to errors in all measurements being larger than wishful thinking can fabricate.
            If you read the comments of others after I raised accuracy as a subject, including your own, you might start to realise that a common cause of complaint is trying to draw inferences from data that are high in real error but dressed up with confusingly low, artificial error. In a few words, there is far too much wading through noisy weeds.
            Example – the CET numbers can confirm that horses could once cross the frozen Thames, or vice versa, but that allows a guess at temperatures no better than +/- 2 deg C. They can in no way estimate to a tenth of a degree for any day, month or year. Expressing T to a tenth from proxies is akin to writing a fictional historical novel and labelling it a documentary because it uses sciencey terms.
            Geoff S.

  14. As I posted elsewhere, a similar tactic seems to be in use regarding Covid-19 deaths – but for the opposite effect.
    Essentially, deaths in care homes from the early days of Covid are being counted as Covid deaths even if the death was from ‘old age’. This has the effect of inflating the numbers in the early days which makes the later numbers seem good by comparison: the precautions to tackle Covid are seen to be working because the numbers have come down steeply.

  15. Sorry Renee, it is not, cannot be, DATA. Merely yet another kind of modelling. Calling it data is just a lie in scientific terms. Brett Keane

  16. MOD

    When I tried to open my WUWT email today it wanted me to sign into ‘no tricks zone’

    This hasn’t happened before and I didn’t sign in, in fact I unsubscribed and came to this post a different way.

    Is this something new?

  17. MOD

    I tried to click through posts to get to the most up to date. The weekly roundup post triggered the sign in request for ‘notrickszone’.

    Can someone please confirm if this is something new? Apart from replies to this post I am not getting any other new posts.

  18. Seems to be sorted MOD

    The new blog just came through with no problems. It’s just the weekly roundup that doesn’t want to play. I’m sure it was excellent.

  19. Why no mention of the Early Twentieth Century Warming (ETCW)?

    We don’t need proxies to understand our current temperature/CO2 situation.

    It was just as warm in the early 20th century as it is today according to actual temperature readings from all around the world.

    There is much more CO2 in the Earth’s atmosphere today than there was in the early 20th century, yet it is no warmer today than it was then.

    This means that CO2 is not the control knob of the Earth’s climate and does not control the Earth’s temperature and we don’t have to bankrupt ourselves and give away our freedoms, and destroy our environment to fix something that doesn’t need to be fixed.

    We don’t need no stinkin’ proxies to demonstrate this. All we need is the unmodified surface temperature record, which we have.

    Not acknowledging the ETCW period means one buys into the Data Manipulator’s version of computer-generated temperature history, and must ignore actual temperature readings, because they are so different from the computer-generated false reality, which is the bogus, bastardized Hockey Stick chart.

  20. Tom
    “Why no mention of the Early Twentieth Century Warming (ETCW)?”

    The ETCW is referred to as the Present and is shown on the top graph in both Figures 1 and 2. The Present compared to the MWP and RWP is also discussed in the post and was included in the linear trend analysis.

    • I see what you mean, Renee. The Present is shown but I can’t tell if that temperature highpoint is 1934, or 1998, or 2016. It makes a difference (to me), and that’s what I’m complaining about, although not too vehemently since you are talking proxies and I’m not. I would like to see RWP, MWP, LIA, and ETCW labels across that graph, but that’s just me. 🙂

      • Tom,
        “The Present is shown but I can’t tell if that temperature highpoint is 1934, or 1998, or 2016.”

        The first temperature high point for the Present (ETCW), of course, varies with latitude. The Arctic and NH Europe peaks are around 1945-1950 AD. This peak is not well developed in the Southern Hemisphere or Antarctic. I did put your proposed labels on Figure 2, top graph. I didn’t use those labels on Figure 1, because the events are so hard to see when using the 1961-1990 baseline, but probably should have.

        I would use the higher-resolution instrumental data to evaluate high points for the ETCW. The inter-hemispheric differences show the 1940 high point quite nicely, followed by a temperature low around 1980. They also show that 2020+ is still not at a temperature high point.
        https://imgur.com/a/oEvIhTM

        • “The inter hemispheric differences show the 1940 high point quite nicely followed by a temperature low around 1980.”

          You are describing reality! Alright! I just became a fan!

          • Tom,
            Did you also notice the difference between GISS and HadCRUT’s 1940 bump? It’s more pronounced in the GISS data. The reason is that GISS interpolates the data at the polar regions, which have sparse data, and HadCRUT does not. Interpolation of the data tells a different story.

            Cowtan, K. & Way, R. G. Coverage in the HadCRUT4 temperature series and its impact on recent temperature trends. Q. J. R. Meteorol. Soc. 140, 1935–1944 (2014). https://rmets.onlinelibrary.wiley.com/doi/full/10.1002/qj.2297

        • I reread your article, Renee, and apparently read over the quote below without it registering on my brain:

          “General observations show the MWP and RWP to have a peak Arctic temperature like the 1940 Present peak. All three peaks are approximately 1.3 deg C warmer than the LIA baseline.”

          It was right there all the time, wasn’t it. 🙂

          So, another confirmation that it was just as warm in the 20th century as it is today.

          That means we are not experiencing unprecedented warming today, which is denied by those promoting human-caused climate change, who claim we are now experiencing the warmest temperatures in human history. That’s not so. There is more CO2 in the atmosphere today, but the temperatures are not any warmer than in the recent past. Therefore, CO2 is not the control knob of the Earth’s atmosphere.

          The bogus, bastardized modern-era Hockey Stick chart is the only thing showing that we are experiencing unprecedented warming today. This is a false reality created in a computer lab that bears no resemblance to reality. Every unmodified, regional temperature chart from around the world says the Hockey Stick chart misrepresents reality. None of them resemble the “hotter and hotter” temperature profile of the Hockey Stick. Instead, they all show the decade of the 1930s to be just as warm as it is today. Warmer, in the case of the U.S., where Hansen said 1934 was 0.5C warmer than 1998, which would make 1934 0.4C warmer than 2016, the so-called (by the Data Manipulators) “Hottest Year Ever!”

          Here’s a link (below) to a NASA webpage showing the U.S. surface temperature chart (Hansen 1999) alongside a bogus, bastardized, modern-era Hockey Stick chart.

          As you can see, the U.S. surface temperature chart shows the very warm 1930’s and the very cold 1970’s, just as Renee describes above, in her study. You can see that 1934 was warmer than 1998, and that makes it warmer than 2016, too. All unmodified, regional surface temperature charts from around the world resemble, more or less, the U.S. surface temperature chart, as does the chart of the AMO.

          On the right, you can see the Hockey Stick chart. Looking at this chart would make you think that the temperatures have been getting hotter and hotter and hotter, for decade after decade, and we are now at the hottest temperatures in human history.

          This is what the Promoters of Human-Caused Climate Change want you to think, and this is why they rigged their computer models to generate this particular temperature profile. But it is all a Gigantic Lie. None of the unmodified, regional surface temperature charts from around the world resemble this Hockey Stick temperature profile. The Hockey Stick is all alone in proclaiming this the hottest period in human history. And it is the only “evidence” the advocates of Human-caused Climate Change have to offer. Their position is so weak it is amazing it has gotten this far and gone on this long.

          http://www.giss.nasa.gov/research/briefs/hansen_07/

    • That’s the temperature for England, Anthony. And you are claiming that represents the world, too.

      You usually criticize people who claim the U.S. surface temperature record represents the global temperature record, yet here you are claiming England’s temperature record represents the global temperature.

      I can produce a number of unmodified, regional temperature chart profiles from around the world that resemble the U.S. surface temperature profile (i.e., it was just as warm in the 1930’s as it is today), which lends supporting evidence that the U.S. chart represents the world. You’ve seen them in the past, yet you say they don’t represent the world. I say they do.

      • Tom Abbott:

        Anthony is absolutely correct.

        The Central England instrumental data set DOES represent the world.

      • “That’s the temperature for England, Anthony. And you are claiming that represents the world, too.”

        Actually no.
        Just responding to the plainly incorrect assertion that ….

        “It was just as warm in the early 20th century as it is today according to actual temperature readings from all around the world.”

        Note.
        Not “the world” but “all around the world”.
        Full comprehension of the sentence please.
        BTW: I don’t claim that the CET is representative of the world.
        It is just a small subset of it, being dominated by the SSTs of the North Atlantic Ocean and the strength of the NAO.

        But to baldly say that early 20th century temps were higher than today’s is borne out “by actual temperature readings all around the world” is ignorantly wrong.
        It just is, and I keep seeing it said on here.

    • Beng135;

      By the end of the ~300 year MWP, the Arctic had to be essentially ice-free: farming in Greenland, etc.

  21. Tree rings are not thermometers. I’ve been a forester for 47 years and I’ve looked at thousands of tree stumps. Many things affect the size/density of the rings. The temperature might be one of the least significant.

  22. John Tilman:

    “AD 2000 was a La Nina year. Starting a trend line there is a cherry pick.”

    O.K. 2001 J-D temp. was +0.53 (not a La Niña year; the 2000 La Niña ended in March). So the INCREASE is +0.34 deg. C.

    “Comparing two super El Nino years isn’t”

    Except that both of the super El Niños were temporary man-made events and have no more import than a temporary volcanic-induced El Niño.

    ALL El Niños are caused by temporary reductions in atmospheric SO2 aerosol levels, and their effects need to be completely excluded from any calculations of actual global temperatures.

  23. “ALL El Niños are caused by temporary reductions in atmospheric SO2 aerosol levels”

    I see where the ENSO meter has made a move back towards La Niña overnight.

    • Tom Abbott:

      That can only mean that atmospheric SO2 aerosol levels have increased. I’ll have to see what is happening re China.

Comments are closed.