170 Years of Earth Surface Temperature Data Show No Evidence of Significant Warming

Author: Thomas K. Bjorklund, University of Houston, Dept. of Earth and Atmospheric Sciences

October 16, 2019

Key Points

1. From 1850 to the present, the noise-corrected, average warming of the surface of the earth is less than 0.07 degrees C per decade.

2. The rate of warming of the surface of the earth does not correlate with the rate of increase of fossil fuel emissions of CO2 into the atmosphere.

3. Recent increases in surface temperatures reflect 40 years of increasing intensities of the El Nino Southern Oscillation climate pattern.

Abstract

This study investigates the relationships between surface temperatures from 1850 to the present and reported long-range temperature predictions of global warming. A crucial component of this analysis is the calculation of an estimate of the warming curve of the surface of the earth. The calculation removes errors in temperature measurements and fluctuations due to short-duration weather events from the recorded data. The results show the average rate of warming of the surface of the earth for the past 170 years is less than 0.07 degrees C per decade. The rate of warming of the surface of the earth does not correlate with the rate of increase of CO2 in the atmosphere. The perceived threat of excessive future global temperatures may stem from misinterpretation of 40 years of increasing intensities of the El Nino Southern Oscillation (ENSO) climate pattern in the eastern Pacific Ocean. ENSO activity culminated in 2016 with the highest surface temperature anomaly ever recorded. The rate of warming of the earth’s surface has dropped 41 percent since 2006.

Text

Section 1

Introduction

The results of this study suggest the present movement to curtail global warming may be premature. Both the strongest warming currents ever recorded in the Pacific Ocean and technologically advanced methods of collecting ocean temperature data from earth-orbiting satellites coincidentally emerged in the late 1970s. This study describes how newly acquired high-resolution temperature data and Pacific Ocean transient warming events may have convolved to produce long-range temperature predictions that are too high.

HadCRUT4 Monthly Temperature Anomalies

This analysis uses the HadCRUT.4.6.0.0 version of monthly medians of the global time series of temperature anomalies, Column 2, 1850/01 to 2019/08 (Morice, C. P., et al. 2012). The NASA Goddard Institute for Space Studies data set of global-mean annual land and sea surface temperature anomalies, 1880 to 2018, was also analyzed using the methodology described in this report. The results are essentially the same as those from the HadCRUT4 data set analyses. The HadCRUT4 data set was used for this report because the time series is longer and the monthly global temperature anomalies are easier to import into Excel.

Only in recent years have high-resolution satellites provided simultaneously observed data on properties of the land, ocean and atmosphere (Palmer, P.I., 2018). NOAA-6 was launched in December 1979 and NOAA-7 was launched in 1981. Both were equipped with microwave radiometry devices (Microwave Sounding Unit-MSU) to precisely monitor sea-surface temperature anomalies over the eastern Pacific Ocean and the areas of ENSO activity (Spencer, et al., 1990). These satellites were among the first to use this technology.

The initial analyses of the high-resolution satellite data yielded a remarkable result. Spencer, et al. (1990), concluded the following: “The period of analysis (1979–84) reveals that Northern and Southern hemispheric tropospheric temperature anomalies (from the six-year mean) are positively correlated on multi-seasonal time scales but negatively correlated on shorter time scales. The 1983 ENSO dominates the record, with early 1983 zonally averaged tropical temperatures up to 0.6 degrees C warmer than the average of the remaining years. These natural variations are much larger than that expected of greenhouse enhancements and so it is likely that a considerably longer period of satellite record must accumulate for any longer-term trends to be revealed”.

Karl, et al. (2015) claim that the apparent pause in global warming over the past 18 years is an artifact of biased ocean buoy-based data. Karl, et al. state that a “bias correction involved calculating the average difference between collocated buoy and ship SSTs. The average difference globally was −0.12°C, a correction that is applied to the buoy SSTs at every grid cell in ERSST version 4.” This analysis is not consistent with the interpretation of the past 18-year pause in global warming. The discussion below of the first derivative of a temperature anomaly trendline shows that the rate of increase of relatively stable and nearly noise-free temperatures peaked in 2006 and has declined since.

The following is a summary of conclusions by Karl, et al. (2015) (called K15 below) by McKitrick (2015): “All the underlying data (NMAT, ship, buoy, etc.) have inherent problems and many teams have struggled with how to work with them over the years. The HadNMAT2 data are sparse and incomplete. K15 take the position that forcing the ship data to line up with this dataset makes them more reliable. This is not a position other teams have adopted, including the group that developed the HadNMAT2 data itself. It is very odd that a cooling adjustment to SST records in 1998-2000 should have such a big effect on the global trend, namely wiping out a hiatus that is seen in so many other data sets, especially since other teams have not found reason to make such an adjustment. The outlier results in the K15 data might mean everyone else is missing something, or it might simply mean that the new K15 adjustments are invalid.”

Mears and Wentz (2016) discuss adjustments to satellite data and their new dataset, which “shows substantially increased global-scale warming relative to the previous version of the dataset, particularly after 1998. The new dataset shows more warming than most other middle tropospheric data records constructed from the same set of satellites.” The discussion below shows that the warming curve of the earth has been increasing at a decreasing rate since July 1988; that is, the curve has been concave downward. Based on this observation alone, their new dataset should not show “substantially increased global-scale warming.”

Analysis of Temperature Anomalies

All temperature measurements used in this study are calculated temperature anomalies, not absolute temperatures. A temperature anomaly is the difference between the absolute measured temperature and a baseline average temperature; in this case, the average annual mean temperature from 1961 to 1990. This conversion is intended to minimize location-related effects on measured temperatures (e.g., whether a station sits in a valley or on a mountain top) and to allow better recognition of regional temperature trends.
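The baseline-subtraction step described above can be sketched in a few lines. This is a minimal illustration with synthetic station data (the seasonal cycle, noise level, and random seed are assumptions for the example, not values from the study):

```python
import numpy as np

# Hypothetical monthly mean temperatures (deg C) for one station,
# 1850-2019; the values are illustrative only, not real observations.
rng = np.random.default_rng(0)
years = np.repeat(np.arange(1850, 2020), 12)
months = np.tile(np.arange(1, 13), 2020 - 1850)
temps = 14.0 + 5.0 * np.sin(2 * np.pi * (months - 1) / 12) \
        + rng.normal(0.0, 0.5, years.size)

# Baseline: mean temperature for each calendar month over 1961-1990,
# so the seasonal cycle is removed along with the station's absolute level.
base = (years >= 1961) & (years <= 1990)
baseline = np.array([temps[base & (months == m)].mean()
                     for m in range(1, 13)])

# Anomaly = measured temperature minus the baseline mean for that month.
anomalies = temps - baseline[months - 1]
```

By construction, the anomalies average to zero over the 1961-1990 baseline window, which is why anomaly plots show departures from that reference period rather than absolute temperatures.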

In Figure 1, the black curve is a plot of monthly mean surface temperature anomalies. The jagged character of the black temperature anomaly curve is data noise (inaccuracies in measurements and random, short-term weather events). The red curve is an Excel sixth-degree polynomial best-fit trendline of the temperature anomalies. The curve-fitting process removes high-frequency noise. The green curve, the first derivative of the trendline, is the single most important curve derived from the global monthly mean temperature anomalies. It is a time series of the month-to-month differences in mean surface temperatures in units of degrees Celsius change per month. These very small numbers are multiplied by 120 to convert the units to degrees per decade (left vertical axis of the graph). Degrees per decade is a measure of the rate at which the earth’s surface is cooling or warming; it is sometimes referred to as the warming (or cooling) curve of the surface of the earth. The green curve temperature values are similar in magnitude to noise-free earth surface temperature estimates determined by the University of Alabama in Huntsville for single points (Christy, J. R. May 8, 2019). The green curve has not previously been reported and is critical to analyzing long-term temperature trends.
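The fit-then-differentiate procedure described above (sixth-degree polynomial trendline, first derivative, times 120) can be sketched as follows. The anomaly series here is a synthetic stand-in for the HadCRUT4 data, with an assumed slope and noise level chosen only for illustration:

```python
import numpy as np
from numpy.polynomial import Polynomial

# Synthetic stand-in for the HadCRUT4 monthly anomaly series (deg C):
# a slow warming ramp plus noise -- illustrative only, not the real data.
n_months = 2032                    # monthly values from 1850 onward
t = np.arange(n_months)            # time in months
rng = np.random.default_rng(1)
anoms = -0.4 + 8e-4 * t + rng.normal(0.0, 0.15, n_months)

# Sixth-degree polynomial best fit, the analogue of Excel's trendline.
# Polynomial.fit rescales the time axis internally, which keeps the
# high-degree fit numerically well behaved.
trendline = Polynomial.fit(t, anoms, deg=6)
trend = trendline(t)

# The first derivative of the trendline is in deg C per month;
# multiplying by 120 converts it to the deg C per decade "warming curve".
deriv = trendline.deriv()(t) * 120.0

print(f"mean rate: {deriv.mean():.3f} deg C per decade")
```

With real data, `anoms` would be replaced by the downloaded HadCRUT4 monthly medians; the rest of the pipeline is the same.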

image

Figure 1. The black curve is the HadCRUT4 time series of the mean monthly global land and sea surface temperature anomalies, 1850 to the present. Anomalies are deviations from the 1961-1990 annual mean temperatures in degrees Celsius. The red curve is the trendline of the HadCRUT4 data set, an Excel sixth-degree polynomial best fit of the temperature anomalies. The green curve is the first derivative of the trendline converted from units of degrees C per month to degrees C per decade; that is, the slope of the trendline curve.

In a recent talk, John Christy (May 8, 2019), director of the Earth System Science Center at the University of Alabama in Huntsville, reported estimates of noise-free earth warming in 1994 and 2017 of 0.09 and 0.095 degrees C per decade, respectively. The 2017 average value for the green curve is 0.154: this value is 0.059 degrees per decade higher than the UAH estimate. The latest value in August 2019 for the green curve is 0.125 degrees C per decade. The average degrees C per decade value of earth warming based on the green curve over 2,032 months since 1850 is 0.068 degrees C per decade. The average rate of warming from 1850 through 1979, to the beginning of the most recent El Nino Southern Oscillation (ENSO), is 0.038 degrees C per decade.

A warming rate of 0.038 degrees C per decade would need to increase or decrease significantly to support a prediction of a long-term change in the earth’s surface temperature. If the earth’s surface temperature increased continuously starting today at a rate of 0.038 degrees C per decade, in 100 years the increase in the earth’s temperature would be only about 0.4 degrees C, which is not indicative of a global warming threat to humankind.

The 0.038 degrees C per decade estimate is likely beyond the accuracy of the temperature measurements from 1850 to 1979. Recent statistical analyses conclude that 95% uncertainties of global annual mean surface temperatures range from 0.05 to 0.15 degrees C over the past 140 years; that is, 95 measurements out of 100 are expected to be within the range of uncertainty estimates (Lenssen, N. J. L., et al. 2019). Very little measurable warming of the surface of the earth has occurred from 1850 to 1979.

In Figure 2, the green curve is the warming curve; that is, a time series of the rate of change of the temperature of the surface of the earth in degrees per decade. The blue curve is a time series of the concentration of fossil fuel emissions of CO2 in parts per million in the atmosphere. The green curve is generally level from 1900 to 1979 and then rises slightly due to lower-frequency noise remaining in the temperature anomalies from 40 years of ENSO activity. The warming curve has declined from the early 2000s to the present. The concentration of CO2 increased steadily from 1943 to 2019. There is no correlation between the rising CO2 concentration in the atmosphere and the relatively stable, low rate of warming of the surface of the earth from 1943 to 2019.
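One way to quantify a comparison of the kind made in Figure 2 is a Pearson correlation coefficient between the two series. The sketch below uses synthetic stand-ins (a monotonically rising CO2-like series against a roughly level warming-rate series; the slopes, noise, and seed are assumptions), not the article's data:

```python
import numpy as np

# Illustrative series only: a steadily rising CO2-like curve and a
# roughly level, noisy rate-of-warming curve.
rng = np.random.default_rng(2)
years = np.arange(1943, 2020)
co2 = 310.0 + 1.5 * (years - 1943)               # monotonic rise
rate = 0.05 + rng.normal(0.0, 0.02, years.size)  # flat, noisy rate

# Pearson correlation coefficient between the two series.
r = np.corrcoef(co2, rate)[0, 1]
print(f"Pearson r = {r:.2f}")
```

A trending series against a flat-but-noisy one gives a correlation near zero in expectation; whether the real Figure 2 series behave this way is exactly what such a calculation would test.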

image

Figure 2. The green curve is the first derivative of the trendline converted to units of degrees C per decade, that is, the rate of change of the surface temperature of the earth. See Figure 1 for the same curve along with the temperature anomalies curve and the trendline curve. The blue dotted curve showing total carbon emissions from fossil fuels in the atmosphere is modified from Boden, T. A., et al. (2017); the time frame shows only emissions since 1900. There is no correlation between the increase in the concentration of carbon in the atmosphere and the surface temperature of the earth. [Caption updated 11/17/19.]

In Figure 3, the December 1979 temperature spike (Point A) is associated with a weak El Nino event. During the following 39 years, five strong to very strong intensity El Nino events were recorded; the last one, in 2015-2016, was the highest-intensity El Nino ever recorded (Golden Gate Weather Services, 2019). The highest ever mean global monthly temperature anomaly of 1.111 degrees C was recorded in February 2016. Since then, monthly global temperature anomalies have declined 35 percent, to 0.724 degrees C in August 2019, as the El Nino decreased in intensity.

image

Figure 3. An enlarged portion of Figure 1 from 1963 to 2019 with modified vertical scales to emphasize important changes in the shape of the green curve.

Points A, B and C mark very significant changes in the shape of the green warming curve (values on left vertical axis).

1. The green curve values increased each month from 0.085 degrees C per decade in December 1979 (Point A) to 0.136 degrees C per decade in July 1988 (Point B); this is a 60 percent increase in rate of warming in nearly 9 years. The warming curve is concave upward. Point A marks a weak El Nino and the beginning of increasing ENSO intensity.

2. From July 1988 to September 2006, the rate of warming increased from 0.136 degrees C per decade to 0.211 degrees C per decade (Point C); this is a 55 percent increase in 18 years, but about one-half the rate of the previous 9 years, because the rate of increase declined each month. The July 1988 point on the x-axis is an inflexion point at which the warming curve becomes concave downward.

3. September 2006 (Point C) marks a very strong El Nino and the peak of the nearly 40-year ENSO transient warming trend, imparting a lazy S shape to the green curve. The rate of warming has declined every month since peaking at 0.211 degrees C per decade in September 2006, to 0.125 degrees C per decade in August 2019; this is a 41 percent decrease in 13 years.

Section 2

Truth and Consequences

The “hockey stick graph”, which had been cited by the media frequently as evidence for out-of-control global warming over the past 20 years, is not supported by the current temperature record (Mann, M., Bradley, R. and Hughes, M. 1998). The graph is no longer seen in the print media.

None of 102 climate models of the mid-troposphere mean temperature comes close enough to predicting future temperatures to warrant changes in environmental policies. The models start in the 1970s, at the beginning of a time period that culminated in the strongest ENSO ever recorded, and by 2015, less than 40 years later, the average predicted temperature of all the models is nearly 2.4 times greater than the observed global tropospheric temperature anomaly for 2015 (Christy, J. R. May 8, 2019). The true story of global climate change has yet to be written.

The peak surface warming during the ENSO was 0.211 degrees C per decade in September 2006. The highest global mean surface temperature ever recorded was 1.111 degrees C in February 2016; these occurrences are possibly related to the increased quality and density of ocean temperature data from the two earth-orbiting MSU satellites described previously. Earlier high-intensity ENSO events may not have been recognized due to the absence of advanced satellite coverage over the oceans.

The use of a temperature trendline to remove high-frequency noise did not eliminate the transient effects of the longer-wavelength components of ENSO warming over the past 40 years, so estimates of the rate of warming for that period in this study still include background noise from the ENSO. A noise-free signal for the past 40 years probably lies closer to 0.038 degrees C per decade, the average rate of warming from 1850 to the beginning of the ENSO in 1979, than to the average rate from 1979 to the present, 0.168 degrees C per decade. The higher number includes uncorrected residual ENSO effects.

Foster and Rahmstorf (2011) used average annual temperatures from five data sets to estimate average earth warming rates from 1979 to 2010. Noise removed from the raw mean annual temperature data is attributed to ENSO activity, volcanic eruptions and solar variations. The result is said to be a noise-adjusted temperature anomaly curve. The average warming rate of the five data sets over 32 years is 0.16 degrees C per decade, compared to 0.17 degrees C per decade determined by this study from 384 monthly points derived from the derivative of the temperature trendline. Foster and Rahmstorf (2011) assume the warming trend is linear based on one averaged estimate, and their data cover only 32 years. Thirty years is generally considered to be the minimum period needed to define one point on a trend; a 32-year window that includes the highest-intensity ENSO ever recorded is not long enough to define a trend. The warming curve in this study is curvilinear over nearly 170 years (green curve in Figures 1 and 3) and is defined by 2,032 monthly points derived from the temperature trendline derivative. From 1979 to 2010, the rate of warming ranges from 0.08 to 0.20 degrees C per decade. The warming trend is not linear.
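The contrast drawn above, between a single linear trend over a window and a pointwise derivative of a curved fit, can be illustrated with a deliberately curvilinear synthetic series (the coefficients below are assumptions chosen only to make the series concave down, not fitted to any real data):

```python
import numpy as np
from numpy.polynomial import Polynomial

# Synthetic monthly anomalies for a 384-month (32-year) window,
# built to be concave down -- illustrative only, noise-free for clarity.
n = 384
t = np.arange(n)
anoms = 0.2 + 2.0e-3 * t - 2.0e-6 * t**2

# (a) One linear fit over the whole window, converted to deg C/decade:
lin_slope = np.polyfit(t, anoms, 1)[0] * 120.0

# (b) Pointwise derivative of a sixth-degree fit, in deg C/decade:
deriv = Polynomial.fit(t, anoms, deg=6).deriv()(t) * 120.0

# A single slope is one number; the derivative spans a range of rates.
print(f"linear: {lin_slope:.3f} deg C/decade; "
      f"derivative range: {deriv.min():.3f} to {deriv.max():.3f}")
```

For a curved series the single linear slope lands somewhere inside the range of pointwise rates, which is why a one-number trend can hide a changing rate of warming.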

The perceived threat of excessive future temperatures may stem from an underestimation of the unusually large effects of the recent ENSO on natural global temperature increases. Nearly 40 years of natural, transient warming from the largest ENSO ever recorded may have been misinterpreted to include warming due to anthropogenic activities. There is no evidence of a significant anthropogenic contribution to surface temperatures measured over the last 40 years.

Caltech recently announced the start of a 5-year project with several other research centers to build a new climate model “from the ground up” (Perkins, R. 2018). During these five years, the world’s understanding of the causes of climate change should be greatly improved.

The scientific goal must be to narrow the range of uncertainty of predictions with better data and better models until human intervention makes sense. We have the time to get it right. A rational environmental protection program and a vibrant economy can co-exist. The challenge is to allow scientists the time and freedom to work without interference from special interests.

Acknowledgments and Data

All the raw data used in this study can be downloaded from the HadCRUT4 and NOAA websites. http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/series_format.html

https://research.noaa.gov/article/ArtMID/587/ArticleID/2461/Carbon-dioxide-levels-hit-record-peak-in-May

References

1. Boden, T.A., Marland, G., and Andres, R.J. (2017). National CO2 Emissions from Fossil-Fuel Burning, Cement Manufacture, and Gas Flaring: 1751-2014, Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, doi:10.3334/CDIAC/00001_V2017.

2. Christy, J. R., May 8, 2019. The Tropical Skies Falsifying Climate Alarm. Press Release, Global Warming Policy Foundation. https://www.thegwpf.org/content/uploads/2019/05/JohnChristy-Parliament.pdf

3. Foster, G. and Rahmstorf, S. (2011). Global temperature evolution 1979–2010. Environ. Res. Lett., 6, 044022.

4. Golden Gate Weather Services, Apr-May-Jun 2019. El Niño and La Niña Years and Intensities. https://ggweather.com/enso/oni.htm

5. HadCrut4 dataset. http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/series_format.html

6. Karl, T. R., Arguez, A., Huang, B., Lawrimore, J. H., McMahon, J. R., Menne, M. J., et al. (2015). Possible artifacts of data biases in the recent global surface warming hiatus. Science, 348(6242), 1469-1472. http://www.sciencemag.org/content/348/6242/1469.full

7. Mann, M., Bradley, R. and Hughes, M. (1998). Global-scale temperature patterns and climate forcing over the past six centuries. Nature, Volume 392, Issue 6678, pp. 779-787.

8. McKitrick, R. (2015). A First Look at ‘Possible artifacts of data biases in the recent global surface warming hiatus’ by Karl et al., Science 4 June 2015. Department of Economics, University of Guelph. http://www.rossmckitrick.com/uploads/4/8/0/8/4808045/mckitrick_comms_on_karl2015_r1.pdf

9. Mears, C. and Wentz, F. (2016). Sensitivity of satellite-derived tropospheric temperature trends to the diurnal cycle adjustment. J. Climate. doi:10.1175/JCLI-D-15-0744.1. http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-15-0744.1?af=R

10. Morice, C. P., Kennedy, J. J., Rayner, N. A., Jones, P. D., (2012). Quantifying uncertainties in global and regional temperature change using an ensemble of observational estimates: The HadCRUT4 dataset. Journal of Geophysical Research, 117, D08101, doi:10.1029/2011JD017187.

11. Lenssen, N. J. L., Schmidt, G. A., Hansen, J. E., Menne, M. J., Persin, A., Ruedy, R., et al. (2019). Improvements in the GISTEMP Uncertainty Model. Journal of Geophysical Research: Atmospheres, 124, 6307–6326. https://doi.org/10.1029/2018JD029522

12. NOAA Research News (June 4, 2019). Carbon dioxide levels hit record peak in May. https://research.noaa.gov/article/ArtMID/587/ArticleID/2461/Carbon-dioxide-levels-hit-record-peak-in-May

13. Palmer, P. I. (2018). The role of satellite observations in understanding the impact of El Nino on the carbon cycle: current capabilities and future opportunities. Phil. Trans. R. Soc. B 373: 20170407. https://royalsocietypublishing.org/doi/10.1098/rstb.2017.0407.

14. Perkins, R. (2018). https://www.caltech.edu/about/news/new-climate-model-be-built-ground-84636

15. Spencer, R. W., Christy, J. R. and Grody, N. C. (1990). Global Atmospheric Temperature Monitoring with Satellite Microwave Measurements: Method and Results 1979–84. Journal of Climate, Vol. 3, No. 10 (October) pp. 1111-1128. Published by American Meteorological Society.


263 thoughts on “170 Years of Earth Surface Temperature Data Show No Evidence of Significant Warming”

  1. All I can say at the moment is…”The warming can’t come fast enough! I am freezing my *ss off and I live in Texas. Good Gawd I feel sorry for you northerners.”

    • Sun will decide what happens next and when, but it does appear that there is still a bit of warming in front of us, mainly due to oceans’ stored energy oscillating along absorption/release cycles.
      According to a number of solar scientists, the sun is heading for a prolonged minimum. Working out the solar deep minimum’s effect on global cooling isn’t an easy task. About a couple of years ago I did an exercise analysing the Maunder minimum type effect on the N. Hemisphere’s climate trends.
      The ‘conclusion’ was that a ‘short’ solar minimum effect would be negligible while a 50 year Maunder type minimum would result in up to 0.75 degree C temperature fall, or about 20 years of cooling in excess of 0.5 degree C.
      The analysis is based on the past ‘reality’, which suggests there are long-term natural variability cycles, not necessarily directly related to the deep solar minima.
      The intensity of cooling that may occur depends where the Maunder type minimum falls in relation to the multi-decadal & multi-centenary cycles.
      In this exercise I looked at possibility that the MM start coincides with any of the three future cycles: SC25 or SC26 or SC27.
      In this link I show graphic representation of my analysis
      http://www.vukcevic.co.uk/NH-GM.htm
      From the above, accounting for the Atlantic’s multi-decadal and global multi-centenary trends, initial cooling was a slow process while the subsequent warming appears to have been more rapid.

      • Many years ago I stated that the climate effects of solar variability would be greatly modulated by the combined effect of separate cycles in all the ocean basins sometimes supplementing and sometimes offsetting the solar effect on jet stream tracks (which affects global cloudiness).

      • A cooling of 0.75 C average would be “peanuts” compared to the temperature day/night/seasonal swings. I would think more important would be some kind of NH/SH cyclical cooling over land caused by jet stream swings. The average air temp, as measured by satellite, would not show this since the ocean is ~70% of the data and, with the equatorial belt, even more. Average world air temp data could vary little while the USA and Northern Europe freeze into an ice age.

        • Hi BFL
          Yes, I agree; even individual years’ annual temperatures at most of the N.H.’s locations vary as much and often more. My analysis is based on the estimated drop of about 1 degree C in the CET (Central England Temperature) during the Maunder minimum.

        • Imo, the downside to a cooling trend/period is not limited just to a change in temperature. The weather this spring across much of the NH was lousy for farmers. Next spring should be similar or worse for getting crops in the ground. The question is will early winter/delayed spring become a pattern in the years ahead, perhaps for several decades. The weather over the last several months has been very similar to last year, so far. Now to see if the storms start coming in off of the Pacific around late Dec/early January to ease the fire danger.

      • clothing doesn’t help when you have to drive in it. Not one bit.
        Unless you are driving an EV and have to walk when your battery dies

  2. This is consistent with my hypothesis that a more active sun reduces global cloudiness by making jet stream tracks more zonal and causes them to shift poleward so that more energy enters the oceans and El Ninos become more dominant relative to La Ninas.

    • That sounds very plausible. During the next decades, with a less active sun, there should be more La Ninas then and much less temperature increase (or even a decrease). Unfortunately, the policy makers will not be patient enough to wait for that to happen. Luck is with the dumb ….

    • Shows what you can do with headlines. It could just as easily be “ Recent Global warming .154 degrees per decade, Up from historic .038”…..which puts a totally different spin on the article.

          • If your temperature readings are only accurate to 0.1°C, then your warming/cooling calculations are only significant to 0.1°C per decade. I strongly doubt that any readings from multiple sources are accurate to better than 1°C on a global scale. Extrapolation of air temperature over vast ocean grids doesn’t cut it. Why is this point so conveniently ignored?

          • They also only recorded the daily high and low.
            Anyone who believes you can get an accurate daily average temperature from just a high and low temperature has never been outside.

        • The problem with error propagation in these series is that taking 27,353 temperature samples from stations all over the earth is not the same as measuring the length of a meter stick 27,353 times. However, the same Law of Large Numbers, and Central Limit Theorem, and standard deviation and uncertainty in the mean calculations can be made, but they mean very different things.

          This is hardly ever made clear in discussions on these topics, but it works like this:

          If I measure a meter stick with a meter-and-a-half-stick, marked in 1 mm graduations, the measurement error in my observations will be ±0.5mm. Let’s say I take 100 measurements, and my standard deviation is 1.2 mm. I employ the LLN and divide the standard deviation by the square root of 100 to get ±0.12mm. The measurement error pretty much falls out, because it gets added in quadrature and then divided by the number of measurements.

          sqrt(100*0.5^2)/100 = ±0.05mm.

          I’m not sure how to handle the two kinds of uncertainties here; are they added? If so, the final calculation would be something like 1000.2 ±0.2mm.

          If you have 27,353 temperature measurements from all over Earth, they can be averaged, and the LLN be applied, but it has a completely different meaning. In the case of measuring the meter stick 100 times, our measurement increased in precision from 0.5mm to 0.2mm. The average measurement didn’t get more accurate, the mean was more precise, meaning closer to the “true” value.

          If our 27,353 measurements have a standard deviation of 7.8 and the calculated mean is 14.3C, then the uncertainty in the mean is

          7.8/sqrt(27353) = 7.8/165.4 = ±0.05C

          In this case, though, the uncertainty in the mean is not a measure of the precision of the mean in relation to the “true” value, but instead is a prediction. It says that if the entire temperature collection process were repeated with all new values, the mean calculated from those measurements would have a probability of about 68% of being within that ±0.05C of the first calculation of the mean.

          It’s the same equations used in both cases, but the answer has a completely different meaning, because of the different kind of measurements being taken each time.
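The commenter's "uncertainty in the mean as a prediction" point can be checked numerically. This is a quick Monte Carlo sketch using the numbers quoted in the comment (n = 27,353, σ = 7.8, mean = 14.3 C); the Gaussian noise model and seed are assumptions for the illustration:

```python
import numpy as np

# Repeat the whole "collection" many times, compute each repeat's mean,
# and compare the spread of those means to sigma / sqrt(n).
rng = np.random.default_rng(3)
n, sigma, true_mean, trials = 27_353, 7.8, 14.3, 500

means = np.array([rng.normal(true_mean, sigma, n).mean()
                  for _ in range(trials)])

predicted_sem = sigma / np.sqrt(n)   # the commenter's ~0.05 C
observed_sem = means.std(ddof=1)

print(f"predicted SEM {predicted_sem:.3f} C, observed {observed_sem:.3f} C")
```

For independent, identically distributed draws the observed spread of the repeat means matches σ/√n; the follow-up comment below is about why that assumption fails for spatially correlated measurements.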

          • James,

            the LLN with its 1/sqrt(n)-argument for the composite error works if the n observations are i.i.d. (independent identically distributed). For non-i.i.d. observations we would have in the case of Gaussian noise a covariance matrix with “lots of” non-zero off-diagonal elements. Thus, computing the error (=standard deviation) of the average of n correlated observations leaves us with a finite non-vanishing error.

            The same is true with any experiment: if we measure the gravitational constant of Newton’s law of gravity, say, in London, Paris, New York, Tokyo and Buenos Aires, we may treat these 5 measurements as independent, with zero correlations between them. However, if we repeat our experiment in 10000 other places, including in addition places like Baltimore and Kyoto, we cannot ignore the correlations in those measurements: the temperature and other physical variables which might have an impact on our measurement might be highly correlated for New York and Baltimore, and likewise for Tokyo and Kyoto. Therefore the correlation matrix for the aggregation of our measurement results will have non-zero off-diagonal elements, leading to a natural lower bound on the error of the combined value for the gravitational constant.

            Problem is: how do we estimate possible covariance structures for all our measurement points?
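The "natural lower bound" in that comment can be made concrete for the simplest covariance structure. Assuming a common variance σ² and a constant pairwise correlation ρ between all observations (a simplification, not an estimate of real station covariances), the standard error of the mean of n values is σ·√((1 + (n − 1)ρ)/n), which tends to σ·√ρ rather than to zero:

```python
import numpy as np

# Standard error of the mean of n equally correlated observations.
# As n grows this tends to sigma * sqrt(rho) -- the error floor that the
# independent-observations 1/sqrt(n) argument does not have.
def sem_correlated(sigma, n, rho):
    return sigma * np.sqrt((1.0 + (n - 1) * rho) / n)

sigma = 0.5
for n in (10, 1_000, 100_000):
    print(f"n={n:>6}: independent {sem_correlated(sigma, n, 0.0):.4f}, "
          f"rho=0.1 {sem_correlated(sigma, n, 0.1):.4f}")
```

Even a modest ρ of 0.1 pins the error near 0.16σ no matter how many observations are added, which is the commenter's point about aggregating correlated stations.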

    • I took notice that some people think that the average temp of the whole planet, land and sea, was known at every point in time back to 1850 to within a hundredth of a degree.
      Hard to get past that observation right off the bat.
      How seriously can/should one take anything based on such obvious ridiculousness?

        • Real scientists will focus on the discussion of the technical work. Anyone criticising the writer is not a real scientist. Real scientists seek out “what is right”. Everyone else seeks out “who is right”.

  3. “The rate of warming of the surface of the earth does not correlate with the rate of increase of fossil fuel emissions of CO2 into the atmosphere.”

    A correlation between temperature change and CO2 emissions may or may not happen – it’s irrelevant. What matters is the correlation between temperature change and forcing change. Ideally this “forcing” would be a sum of all the known forcings. Insert here the usual but necessary caveats: that we only have accurate CO2 measurements since 1958, that aerosol forcing is uncertain, etc.

    And yes, there is pretty good correlation between a year’s temperature anomaly and its forcing anomaly. It depends on the exact datasets you use but it’s easily above 0.7. See figures 2 and 3 in this article:
    https://judithcurry.com/2016/10/26/taminos-adjusted-temperature-records-and-the-tcr/

    You may argue that this correlation is spurious, that it’s caused by something else, etc. You cannot argue it does not exist.

    Also, the article’s figure 2 is supposed to show the lack of correlation between emissions and temperatures, but I have no idea where the “emissions” data comes from. (The text below says emissions, but the axis clearly shows concentrations).

    Figure 2 shows an increase of 400 ppm since 1900, which is about four times the actual increase in CO2 concentrations since then. Even if one talked about CO2 emissions, these are roughly twice the increase in concentrations (it takes about 2 ppm of emissions to increase concentrations by 1 ppm), so it’s not clear how the chart gets to 400 ppm.

    Figure 2 would also be wrong if it charted CO2 forcing, because it compares a *rate* of increase (in temperature’s case) with an absolute increase (for emissions, or concentrations, or whatever the other line on the chart represents). You can compare rates of increase, you can compare absolute increases, and more – but you have to be consistent and do it for both variables.
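    The consistency point can be sketched numerically. This is a toy example with invented coefficients, showing one way to put both variables on the same footing: differentiate each series before comparing, so a rate is compared with a rate.

```python
import numpy as np

# Toy series only (invented shapes and coefficients), to illustrate
# comparing rate-of-change with rate-of-change rather than a rate
# with an absolute level.
years = np.arange(1960, 2020)
t = years - years[0]
temp = 0.0002 * t**2               # toy accelerating temperature (deg C)
co2 = 315 + 0.8 * t + 0.01 * t**2  # toy CO2 concentration (ppm)

temp_rate = np.gradient(temp, years)  # deg C per year
co2_rate = np.gradient(co2, years)    # ppm per year

# Both are now rates, so a correlation between them is meaningful.
r = np.corrcoef(temp_rate, co2_rate)[0, 1]
```

    The same discipline applies if one prefers absolute increases: difference both series over the same interval, but never mix the two conventions in one chart.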

    • The increase in CO2 whether in absolute terms or in concentration clearly has no relation to the rate of temperature change so Fig 2 is useful.
      There is no denying that the rate of temperature rise has decreased of late yet CO2 emissions continue to accelerate.
      As the effects of the last El Nino fade away the ‘pause’ is returning.

      • However, the caption to Figure 2 states:

        “The blue dotted curve showing total parts per million CO2 emissions from fossil fuels in the atmosphere …”

        No, I believe that the curve is for CO2 concentration in the atmosphere, from all sources. And it definitely isn’t just “… from fossil fuels …”

        • It cannot be CO2 concentration in the atmosphere.
          Look at where it starts out…way below 100ppm.
          That (the blue dots in figure 2) has got to be the most poorly labelled and described graph I have ever seen in a serious work.
          On the graph, it is labelled as if it is concentration of CO2 in the atmosphere, but in the description it seems to describe the dots as representing how much CO2 is being emitted from human sources at each point in time.

          But it is a terrible description, mangled grammar and very badly stated.
          See here:
          “and the total reported million metric tons of carbon are converted to parts per million CO2 for the graph. ”
          Total reported million metric tons?
          I think it is meant to say total reported AMOUNT, converted to PPM of CO2, which presumably is a comparison of emitted CO2 from human sources to the total mass of the atmosphere.
          The graph itself seems to say something very different from the text description below the graph.

          • Global CO2 Emissions from Fossil-Fuel Burning, Cement Manufacture, and Gas Flaring: 1751-2014 (March 3, 2017).
            Source: Tom Boden and Bob Andres, Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6290, USA;
            Gregg Marland, Research Institute for Environment, Energy and Economics, Appalachian State University, Boone, North Carolina 28608-2131, USA.

          Mt C divided by 24.4 equals ppm CO2

          Thanks for the comment. If you think I am still wrong, let me know.

      • In Figure 2 of the above article, the blue curve y axis is clearly labeled as “Parts per million CO2 in the atmosphere” and the whole graph is clearly labeled as “Comparison of Atmospheric CO2 Concentration and Warming Rate of Surface Temperature of the Earth”.

        Something is definitely wrong with the blue curve in this graph because it begins around year 1900 with an asserted atmospheric CO2 concentration of about 30 ppm and there is NO scientific data to support this. Alternatively, if the blue curve and entire graph were mislabeled and the blue curve was to represent only the anthropogenic amount of CO2 in the Earth’s atmosphere (as is asserted in the body text underlying the Figure 2 graph), then this too is not credible because the value around year 2012 exceeds 400 ppm. That is, the entire amount of atmospheric CO2 in 2012 was attributed to burning fossil fuels, with no natural contributions, which is also NOT supported by any scientific data.

        Very sloppy for what otherwise appears to be a solid, credible article.

        • I agree with you Gordon.
          Just what those blue dots represent is as clear as mud.
          It cannot be what it says within the box, and what it says below the box makes no sense, and is so poorly written that it is impossible to discern what the author is meaning to assert.
          Is it supposed to be cumulative emissions, or is each dot meant to represent annual emissions of CO2?

    • “I have no idea where the “emissions” data comes from”
      As referenced under Figure 2, the data comes from the Carbon Dioxide Information Analysis Center

      • Loydo is such a hypocrite, using technology derived from petroleum to attack petroleum.
        Just some friendly info, hypocrisy is not a virtue regardless of how you wield it.

        Start writing your comments on the cave wall where you should be living.

        • The post is junk, the method is junk and the conclusion is junk. Concocting a declining rate with an Excel sixth-degree polynomial best fit? Then using that to cast doubt and, here is the rub, to prevent “changes in environmental policies”.

          I know, I know, the type of industry he is employed by is a complete coincidence. So-called sceptics.

      • Oh look, the conspiracy theory from the troll … so let’s go with our own conspiracy that Loydo is a paid troll. Same argument really, as you clearly have links to environment groups.

        • I sometimes suspect he’s actually a skeptic trying to make doomsters look stupid.
          If so, he is succeeding very well.

      • Loydo,
        I was fortunate to have lived in an era when the main, everyday scientific advances were from industrial rather than academic research. It is childish to declare that one of these is tainted for some reason. Geoff S

        • Geoff, speaking as an analytical chemist, do you believe it’s at all possible that the, “95% uncertainties of global annual mean surface temperatures range between 0.05 degrees C to 0.15 degrees C over the past 140 years,” when the lower limit of resolution of the thermometers was ±0.25 C?

          It seems to me they’re magicking data out of thin air.

          As done in the rest of AGW so-called science.

          • I agree completely Pat.
            No way anyone knows average temp within anything close to that amount of uncertainty.
            Of the whole planet no less?
            It is preposterous.

          • Pat,
            Answer is “No.”
            Here is a relevant email from the Australian BOM –
            11 April 2019

            Dear Mr Sherrington,
            Thank you for your correspondence dated 1 April 2019 and apologies for delays in responding.
            Dr Rea has asked me to respond to your query on his behalf, as he is away from the office at this time.
            The answer to your question regarding uncertainty is not trivial. As such, our response needs to consider the context of “values of X dissected into components like adjustment uncertainty, representative error, or values used in area-averaged mapping” to address your question.
            Measurement uncertainty is the outcome of the application of a measurement model to a specific problem or process. The mathematical model then defines the expected range within which the measured quantity is expected to fall, at a defined level of confidence. The value derived from this process is dependent on the information being sought from the measurement data. The Bureau is drafting a report that describes the models for temperature measurement, the scope of application and the contributing sources and magnitudes to the estimates of uncertainty. This report will be available in due course.
            While the report is in development, the most relevant figure we can supply to meet your request for a “T +/- X degrees C” is our specified inspection threshold. This is not an estimate of the uncertainty of the “full uncertainty numbers for historic temperature measurements for all stations in the ACORN_SAT group”. The inspection threshold is the value used during verification of sensor performance in the field to determine if there is an issue with the measurement chain, be it the sensor or the measurement electronics. The inspection involves comparison of the fielded sensor against a transfer standard, in the screen and in thermal contact with the fielded sensor. If the difference in the temperature measured by the two instruments is greater than +/- 0.3°C, then the sensor is replaced. The test is conducted both as an “on arrival” and “on departure/replacement” test.
            In 2016, an analysis of these records was presented at the WMO TECO16 meeting in Madrid. This presentation demonstrated that for comparisons from 1990 to 2013 at all sites, the bias was 0.02 +/- 0.01°C and that 5.6% of the before tests and 3.7% of the after tests registered inspection differences greater than +/- 0.3°C. The same analysis on only the ACORN-SAT sites demonstrated that only 2.1% of the inspection differences were greater than +/- 0.3°C. The results provide confidence that the temperatures measured at ACORN-SAT sites in the field are conservatively within +/- 0.3°C. However, it needs to be stressed that this value is not the uncertainty of the ACORN-SAT network’s temperature measurements in the field.
            Pending further analysis, it is expected that the uncertainty of a single observation at a single location will be less than the inspection threshold provided in this letter. It is important to note that the inspection threshold and the pending (single instrument, single measurement) field uncertainty are not the same as the uncertainty for temperature products created from network averages of measurements spread out over a wide area and covering a long-time series. Such statistical measurement products fall under the science of homogenisation.
            Regarding historical temperature measurements, you might be aware that in 1992 the International Organization for Standardization (ISO) released their Guide to the Expression of Uncertainty in Measurement (GUM). This document provided a rigorous, uniform and internationally consistent approach to the assessment of uncertainty in any measurement. After its release, the Bureau adopted the approach recommended in the GUM for calibration uncertainty of its surface measurements. Alignment of uncertainty estimates before the 1990s with the GUM requires the evaluation of primary source material. It will, therefore, take time to provide you with compatible “T +/- X degrees C” for older records.
            Finally, as mentioned in Dr Rea’s earlier correspondence to you, dated 28 November 2018, we are continuing to prepare a number of publications relevant to this topic, all of which will be released in due course.
            Yours sincerely,
            Dr Boris Kelly-Gerreyn
            Manager, Data Requirements and Quality

          • Regardless of the resolution of the thermometers, what they wrote down was rounded to the nearest degree.
            Which makes the resolution of the records themselves 1.0 C.
            Assuming the station attendant took the time to accurately read the thermometer.
            People forget that reading old mercury/alcohol thermometers was not as easy as it is today, where you just copy down the numbers on the display.
            Among other problems, unless your mark I eyeball was exactly at the same level as the top of the mercury column, you would add an error to your reading.
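            A small simulation makes the rounding point concrete. This is a toy model only, and it assumes the rounding errors are independent across readings, which is itself one of the contested points in this thread:

```python
import numpy as np

# Toy model of readings rounded to the nearest whole degree.
# Not a claim about any real station network; it only illustrates
# how rounding error behaves when many readings are averaged.
rng = np.random.default_rng(1)
true_temps = rng.uniform(10.0, 20.0, size=10_000)  # "true" temperatures
recorded = np.round(true_temps)                    # what got written down

per_reading_err = recorded - true_temps            # up to +/-0.5 C each
mean_err = recorded.mean() - true_temps.mean()     # error of the average
```

            The per-reading error can be as large as 0.5 C, while the error of the average over many independent readings is far smaller; whether the independence assumption holds for real station records (shared observers, shared instruments, shared siting) is exactly what is in dispute here.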

          • Geoff, Boris Kelly-Gerreyn’s answer is 100% baffle-gab.

            He didn’t answer your question at all.

            If they’re not using aspirated-shield sensors to calibrate their field sensors, they’ve got nothing. And Boris didn’t mention aspirated standards at all.

      • I was going to make a similar sarcastic comment, Loydo. (You forgot the tag by the way.)

        The data is hard to refute.

      • Translation: I can’t refute the science, so I’ll attack the scientist.
        Typical Loydo.
        Secondly, his link doesn’t support his claim that Dr. Bjorklund is in the pocket of petro-chemical interests.
        I’d call on Loydo to apologize, but she isn’t man enough.

      • As opposed to alarmists being in the pockets of governments, with infinitely bigger……pockets than any oil company? Thought so.

      • Just seems like another wash of unicorn tears from, yes, a reality-denier hiding in the fetal position under his desk in his mom’s basement.

      • I see no doubt mongering here. It is a direct refutation and falsification of global warming/climate change, not a misdirection.

        • Of course he doesn’t. If he did he would have used those, instead of attacking the messenger because he can’t refute the message.

          (Loydo, and others who are proven trolls never do this:

          Two intellectually-honest tactics

          There are only two intellectually-honest debate tactics:

          1. pointing out errors or omissions in your opponent’s facts
          2. pointing out errors or omissions in your opponent’s logic

          That’s it. Simple! The dishonest list is much longer.

          LINK

          If everyone can do this, Moderators can go on vacation……) SUNMOD

          • SUNMOD
            Thanks for the link to the interesting article. While I might quibble over some minor points, overall, I thought he did a good job in covering how people avoid honest debate. I particularly liked the statement:

            High school graduate movie stars with no training or expertise in government policy pontificating about government policy are another example of persons using their success in one realm to imply high expertise in an unrelated field.

      • MODERATOR WARNING!

        (No more ad hominem/funding fallacies comments accepted, you MUST post a constructive critical comment against the article, or you will get snipped) SUNMOD

        Fair enough. “a constructive critical comment”.

        However, there are thousands of comments on this site that come nowhere near to meeting that requirement, tens of thousands. Would you like me to start pointing them out?

        Yes, I disagree with a lot of what gets posted here, but the majority of my comments *are* accompanied by links to research papers, quotes, graphs, etc. The credibility of those sources is routinely derided with unsupported ad hominem.

        I expressed my opinion, which was to question Bjorklund’s credibility:
        1. Not his area of research
        2. A consultant to “major oil and gas companies”.
        3. His argument seems to rest on a plunging decadal temperature increase which is highly questionable.

        Surely casting doubt is acceptable around here.

        I think you are holding me to a higher standard.

        (This is my only reply to you here, further comments from you over my moderation will be deleted)

        (I do not have the time to moderate every comment; there are so many spam comments I have to view, then look in the trash bin, and then read through the comment threads. Your comment was too far off the ideal comment standard for this blog, so I had to warn you to stop attacking the PERSON and instead attack what he WRITES. Here is that forum policy:

        “Trolls, flame-bait, personal attacks, thread-jacking, sockpuppetry, name-calling such as “denialist,” “denier,” and other detritus that add nothing to further the discussion may get deleted…”. This is what YOU wrote, which is normally considered an attack on the person and a useless fallacy, since you didn’t address his research at all (the research you make clear you don’t like):

        Just seems like another pile of doubt-mongering from, yes, a Petroleum Geologist in the pocket of ” major oil and gas companies”

        There will be no more attacks on the PERSON, it doesn’t help you and irritates others here who are tired of your funding/education/authority fallacies.) SUNMOD

        • Loydo
          If someone has an apparent conflict of interest, then it warrants a more rigorous examination of their claims. That responsibility falls on the shoulders of the reader(s) who feel that the author has reason to be biased. However, ultimately, the claims made should be evaluated based on the legitimacy of the facts and the logic used to reach a conclusion. Anything less, and you are becoming the very thing you are complaining about — a biased participant.

        • MM, where to start. Loydo, your “senior scientist” at the IPCC is a former Exxon executive, fact check all you can. The previous ‘senior scientist’ at the IPCC was a senior executive at India Oil and a railway engineer. I would suggest that a geologist is FAR more qualified at climate than a railwayman. Your Death Cult God AlBoar is tied at the lips with Occidental, again fact check yourself. He is also tied at the market rigging game with his mates at Goldman Sachs, GIM.

          bye

      • Well said Sunmod .
        Loydo is a troll flying past throwing manure .
        I have seen nothing that he has written that adds anything to the debate .
        I would say that the geologists that I know and have met are very much down to earth .Pun intended They have studied the earth and the climate and most have concluded that CO2 is a very minor player in the earths atmosphere and temperature .
        They all know that the planet has been much colder and warmer than present .

    • there is pretty good correlation between a year’s temperature anomaly and its forcing anomaly.

      Mainly because the aerosol fudge factor was defined to provide that correlation.

      I have no idea where the “emissions” data comes from. (The text below says emissions, but the axis clearly shows concentrations).

      It is explained in the text: it is the accepted emissions data from Boden et al., with the units converted from gigatons into ppm. As you remember, we have emitted about double the amount by which atmospheric levels have increased.
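      The unit conversion being described can be sketched as follows. The 2.13 GtC-per-ppm factor is the commonly cited conversion value, and the cumulative total used here is a round illustrative number, not the Boden et al. dataset itself:

```python
# Sketch of expressing cumulative fossil-fuel emissions, given in
# gigatonnes of carbon (GtC), as ppm-equivalents of atmospheric CO2.
# GTC_PER_PPM is the commonly cited conversion; the 430 GtC figure
# below is a round illustrative number, not the Boden et al. data.
GTC_PER_PPM = 2.13  # ~2.13 GtC corresponds to 1 ppm of atmospheric CO2

def emissions_to_ppm(gigatonnes_carbon):
    """Convert a cumulative emission total in GtC to ppm-equivalent CO2."""
    return gigatonnes_carbon / GTC_PER_PPM

cumulative_gtc = 430.0
ppm_equivalent = emissions_to_ppm(cumulative_gtc)

# Only part of emitted CO2 stays airborne; ~45% is a commonly cited
# rough "airborne fraction", used here purely for illustration.
airborne_ppm = 0.45 * ppm_equivalent
```

      On these round numbers, cumulative emissions come out near 200 ppm-equivalent, roughly double the observed rise in concentrations, which is the point being made above.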

      • Javier
        You said, “we have emitted about double of what the levels have increased in the atmosphere.” Which is a small fraction of the total CO2 involved in the Carbon Cycle!

        • To say nothing about how much, if any, has come out of or gone into solution in the oceans, no?
          I am also rather dubious that the value of total human emissions or additions to the air is known with any accuracy or precision.
          Concrete manufacturing, for example, releases CO2, but much of this is absorbed back into the concrete over time, eventually amounting to all that was released.
          And what about land use changes, forests removed, wood used in construction and manufacturing, etc.?
          I doubt anyone can do better than make some WAG on these numbers. Likewise, between things like coal fires, nat gas flaring, oil field fires, and the lack of precise data on mined coal or pumped oil for many places and times, how accurately is CO2 from fossil fuels known?
          Some of this may be nit-picking and the amounts insignificant compared to what is known, but with no discussion of these and other sources, how should anyone know that?

          • Nicholas
            That is really my point. It may just be coincidence that the annual increase in CO2 is approximately half of human emissions. One would expect that as the Earth warms, CO2 will come out of solution in the oceans and lakes. Also, as I wrote in my first article for WUWT, we don’t even have a good handle on the total anthropogenic contributions. We have a lower bound with fossil fuels and concrete, but the other sources are poorly characterized.
            http://wattsupwiththat.com/2015/05/05/anthropogenic-global-warming-and-its-causes/

        • How is the contribution to atmospheric CO2 from the El Nino warming measured, or more accurately, segregated from the CO2 measurements?

    • if there were an “increase” of 400 ppm, that would mean we started with 0 ppm CO2, which is obviously impossible

    • Alberto, thanks for citing my article at Judy’s. There is a clear link between the increase in the anthropogenic forcing (in this case ex volcano, ex solar, because the GMST series used also didn’t include the impact of these agents, plus ENSO, which is natural variability) and the warming.
      It’s a pity that so many “sceptics” don’t want to see this. The much more interesting question is: what is the impact of this forcing (namely the forcing due to a doubling of CO2)? The result of my article gives 1.35 °C for TCR. This is remarkably below the models’ mean estimate of 1.85 for the CMIP5s. The upcoming CMIP6 is expected to enhance this value, one speculates. The most interesting question is why the models run hot, not whether they respond to the forcing in some way, which of course they do.

  4. It was very clear from day 1 that Karl et al.’s (K15) Pause Buster SST adjustments were bogus, and merely a desperate attempt by climate fraudsters to erase the inconvenient Pause, probably on order from the WH OSTP (Mr. Holdren) ahead of the upcoming Paris COP. But to my understanding, after Tom Karl retired, the newest NCDC ERSST adjustment backed out most of that K15 SST data fraud because it was an outright embarrassment for NCDC.

    The Figure 2, as pointed out by Alberto above, looks very wrong. And with Fig 2 very wrong, the whole thing looks very bad.

    So bad in fact, I’d have to guess this presentation is either an attempt to make skeptics look bad, or some kind of fake presentation to discredit some group who might use it.

      • I don’t think you can dismiss Fig 2 by describing it as “nit-picking”, Stephen. It hit this layman between the eyes as an error immediately. The axis shows CO2 concentrations apparently increasing from 0 to 400 (414 with the outlying dot) since the start of last century.

        So either the graph is wrong or the scale on the axis is wrong. Careless either way and in this perfervid (!) climate a gift to the “opposition”!

        • Agreed. The shape of the curve matches emissions not concentrations so the left labels should be changed. Possibly divided by 100. The growth of concentrations is pretty smooth (Keeling curve) but it is also not the result of fossil fuel emissions as the anthropogenic contribution annually is only about 3% of the flux.

        • Read the Figure 2 description, newminster!
          “The blue dotted curve showing total parts per million CO2 emissions from fossil fuels in the atmosphere is modified from Boden, T. A., et al. (2017); the time frame shows only emissions since 1900, and the total reported million metric tons of carbon are converted to parts per million CO2 for the graph.”

          • It is very poorly worded, and the description is inadequate even apart from the poor diction.
            I think it is meant to represent total cumulative emissions from human sources only.
            Instead of ” total reported million metric tons of carbon” it ought to say (if I am interpreting it correctly) total emissions of CO2.
            Why mention million metric tons, if it just means total amount (which is reported in millions of metric tons, although it is billions these days), and why say carbon if it means CO2?
            It is very unclear.
            What is the point of taking a lot of trouble to do such an essay and not describe what is being represented in a way which is clear and concise?
            Bad descriptions and imprecise language makes what is being said hard to discern.
            The purpose is to communicate, and we have ways to do that effectively.
            Among the ways is to use clear language and concise descriptions.

      • There is nothing sound about adjusting the high quality buoy data so that it better matches the much lower quality ship borne sensor data.

    • Is it so difficult to see that figure 2 shows our emissions transformed into ppm units?

      Not the usual way to present this data, but it is not wrong.

      • It is always more difficult to see what you don’t want to see.
        Whole industries have been created out of not seeing what is obvious if you take the time and effort to “look then think, then look again, then think again”.
        It is much easier to say it “doesn’t look right, because it challenges my preconceived notions, so it must be wrong”. And then tag it as a “fake presentation”.
        Analysis done by “how it makes me feel” 101.

        • Correct. Whenever I see something that doesn’t look right, my first thought is, what am I not understanding, because I know that the problem most likely is with me.

          That happened with Figure 2, and a close reading of the paragraph below it clearly explains it, as Javier did.

        • I do not interpret the criticisms to mean that people do not WANT to see what is meant.
          How about just taking such criticisms literally, IOW why not assume the person asking for a better explanation actually wants a better and more clear explanation?
          It does not say it is cumulative emissions, although that seems to be what is meant.
          It says carbon, but the source listed refers to CO2, which is not the same as carbon.
          Did the author only count the part of CO2 which is carbon, or did he just say carbon when he meant CO2?
          Why do that?
          Why talk about millions of metric tons when what is meant is simply the amount of emissions, which may be given in units of millions of metric tons, but also may be given in kg, or in billions of regular tonnes, or in pounds, or whatever?
          When vague and sloppy language is used, what is said becomes a matter of interpretation, and we can see just within a dozen or two comment how much of a problem that can be.
          I for one do not read and comment to waste time or nit pick, but to obtain and disseminate accurate information, and to refute or clarify bad info.
          It is probably not helpful to dismiss people with questions with a blanket assertion that they do not really want to know.

          • You are thinking of WANT as a conscious decision. It is a subconscious reflex that is NOT a decision. It is a subconscious act that filters your understanding of what the data shows.
            If you don’t like the presentation, DO IT YOURSELF!!!
            The references are included.
            Demands for spoon-fed knowledge only show that you are unwilling to do any thinking for yourself, and that you require an analysis that is exactly how you would do it.
            It is about the data, and whether or not it correlates to a measurable effect that requires action. It is not about the style of presentation.

      • Javier,
        So are you claiming anthro-CO2 emissions raised the molar ratio of CO2 by 400 ppm? Because that IS what Figure 2 is implying.

        Forcing doesn’t care about anthro emissions.
        Forcing doesn’t care whether the CO2 molecule came from petroleum combustion or natural organic matter decomposition.

        And then a 6th-order polynomial fit? Seriously. Look at the HadCRUT data: clearly a linear downward trend (negative slope) from 1945 to 1976, but the polynomial slope stays positive, with the derivative staying above zero. Clearly wrong.

        Figure 2 is complete junk.

        Then the author wrote this junk
        “The peak surface warming during the ENSO was 0.211 degrees C per decade in September 2006. The highest global mean surface temperature ever recorded was 1.111 degrees C in February 2016; these occurrences are possibly related to the increased quality and density of ocean temperature data from the two, earth orbiting MSU satellites described previously. Earlier large intensity ENSO events may not have been recognized due to the absence of advanced satellite coverage over oceans.”

        Well, if he is now claiming the peak warming was in 2006, that would put it in the middle of the Pause (hiatus). And the highest global mean surface temperature given to 3 decimal places? And what previous discussion of “earth orbiting MSU satellites described previously”? Where previously? HadCRUT is the data presented.

        Seriously, this whole manuscript is junk.
        It’s all complete junk-gibberish.

        • I agree much of what is written here is very unclear.
          Very badly written.
          I had not even got to this part Joel, because so much of it at the very beginning was dubious and hard to understand.
          Does he mean to say highest ever recorded ANOMALY?
          He must mean that.
          And so he should say that.

          Re “The peak surface warming during the ENSO was 0.211 degrees C per decade in September 2006”, does this mean that a monthly rate of warming was extrapolated out to a decadal rate of change?
          By ENSO does he mean during the (or an?) el nino?
          ENSO is the name for the entire oscillation, which is continuous and ongoing.

        • Joel I am not defending all the article, just explaining that the emissions part of figure 2 is correct, when several people commented it was not. Emissions can be expressed in several units, normally Gigatons of carbon or CO2, but they can also be expressed as ppm equivalents which allows a comparison with CO2 levels in the atmosphere. That doesn’t mean that the figure implies anything about atmospheric levels as you say.

          • Javier,
            The CO2 part of Fig 2 is just junk. Pure junk.
            And the polynomial curve of the temperature fit is also worse than junk.

            Admit it. Move on.

  5. This is a bombshell moment for all those Extinction Rebellion protesters who demand that we trust the scientists. Well, here are the scientists telling us that the earth has not warmed since 1850 and, in any event, CO2 is not the control knob for catastrophic global warming. Game over.

    Can’t wait to see the BBC’s coverage of this research. Oh….er…hang on….

    • John, that is not what the article above states. It clearly shows an ongoing increase in temperature, but at a moderate rate, and at a currently declining rate of increase.

      • Agreed. It is warming. That is good. There is no acceleration of warming, even though CO2 in the atmosphere is clearly rising. The models used to justify the climate change hysteria are broken.

        Either CO2 is in of itself a weaker forcer of atmosphere temperature than the propaganda claims, or the Earth’s negative feedback systems are powerful enough to counter most of that forcing from CO2 that does exist.

        Either way, the case for catastrophic change fails.

  6. Another point of error is Figure 3, where the author wrote, “3. September 2006 (Point C) marks a very strong El Nino …”

    Hardly: the El Nino that year (2006) was pretty weak, as ONI was only at/above +0.5 for five 3-month periods, with a max of +0.9.
    ref: https://origin.cpc.ncep.noaa.gov/products/analysis_monitoring/ensostuff/ONI_v5.php

    All in all, very bad. This does not appear to me to be a manuscript that any knowledgeable skeptic should reference or cite. It is so bad, it looks intentionally bogus.

      • If I make a gallon of chocolate ice cream, and you saw me mix in a spoonful of fresh dog crap with it, would you still buy that ice cream from me?

        The entire credibility of this manuscript is shot.
        Just like dog crap flavored ice cream…. don’t buy it. And most certainly do not ingest it.

        • Well I would say that the ice cream is the ENSO effect and the dog crap is human emissions of CO2.
          That remains true regardless of the quality or otherwise of certain details of the head post.
          There is no doubt that the rate of increase in global temperatures is out of sync with human emissions, and one doesn’t need the details of the head post to appreciate that simple fact.

          • It appears to me that the author of the head post must have spent some time and put some effort into this essay and the included graphs.
            So it makes no sense not to take the time to make the language clear and the descriptions concise.
            I was just looking up one phrase from the article using Cortana, and the second item returned was an article by Tamino trashing WUWT, using this article as an example of the so-called shoddy work published here.
            So I think it does matter, when warmistas are so quick to jump on any example to paint skeptics in a bad light.

            BTW, I agree that one does not need the details of this essay to know that temps and human CO2 emissions are not correlated, regardless of whether one uses anomalies or not, rate of change of temp or not, annual or cumulative emissions, or total atmospheric concentration, or whatever metric one wants.

    • I’m inclined to agree with Joel. To me this paper looks like it might be a decoy: a paper so deliberately spiked with errors of methodology that it will attract copious criticism from alarmists, whereupon it will be “retracted” with a huge fanfare in the press. An early attack might be on the use of a high-order polynomial fit, given that such a fit is numerically unstable at the ends of the fit range. Even if it is not a deliberate decoy and is simply wrong, the errors will be jumped on and advertised as evidence that skeptics can’t do science, even if its conclusions turn out to be correct but for the wrong reasons. Treat with a long pair of tongs and put it in quarantine in a fume chamber with the hood down!

        • No, not desperation. I was about to compose and post my objections to use of a 6th order polynomial, for reasons such as the instability near the ends of the data set. You can try fits of order 5, 6 and 7 to see this effect. The mathematical derivation becomes uncoupled from the physical data. Geoff S
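The order-5/6/7 experiment Geoff suggests can be sketched in a few lines of Python. This uses a synthetic bounded series (not the HadCRUT data): a 6th-order least-squares fit tracks the data inside the range but blows up just beyond the endpoints, which is exactly why readings taken near the ends of such a fit are untrustworthy.

```python
import numpy as np

# Fit a 6th-order polynomial to a bounded oscillation (|y| <= 1 everywhere),
# then evaluate it inside and beyond the fitting range.
x = np.linspace(0.0, 10.0, 200)
y = np.sin(x)

poly = np.poly1d(np.polyfit(x, y, 6))   # 6th-order least-squares fit

inside = poly(5.0)    # mid-range: stays near the data values
outside = poly(15.0)  # 50% past the range: the x**6 term takes over and the
                      # value bears no relation to the bounded data
print(inside, outside)
```

The same behaviour, in milder form, already distorts the fit over the last few data points of the range, which is where the head post reads off its recent warming rate.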

  7. I have been looking at a website called temperature.global.
    It claims real world, unadjusted temperatures, collated in real time.
    There is not much info on the website. It claims to use mostly NOAA data.
    The information that is presented does seem to gel with this op-ed.
    However, the current global temperature, and temperature trend, is certainly at odds with the current ‘wisdom’!
    I would appreciate it if anyone familiar with this body of work and website could say whether the info contained therein can be trusted. It looks legit.
    http://temperature.global/

    • Its Bogus.

      Sorry. They use METAR data and what they do is a simple average of the values.
      A simple average is guaranteed to give you the wrong answer.

      Look folks it is warming.

      There was an LIA

      • Yes the English Lit adjusted data gives that nice comfy feeling. You have to slow baste your data while applying the Colonel’s 11 secret adjustments and then rest it for that perfect time and then you get that perfect finger lickin good data.

      • Steven,
        In a few lines, what are the main objections to use of Metar for temperatures? I can surmise a few, but surely their trend over time since Metar started should agree with other measurements. If not, why not? Geoff S

      • I assume the METAR data you mention is the same data I get from the airport before I land. I sure as heck hope it is accurate or I might have an issue with density altitude or crosswinds.

        All the new flight software gives you the “weather” at the airports along the way, and the multiple airports around a city all are slightly different depending on the local conditions. So just what is the temperature of the area at any given moment? I can tell you the exact temperature at the measuring spot and can give you a general temperature of the area, but there is no way I can give you an exact answer to hundredths of a degree.

        Then throw in the areas I fly over with no airports so I can estimate it from other airport data but again, I don’t know exactly. And from experience, there are a lot of areas that may be exposed or snow covered on winter nights that can be 10C deg colder than surrounding areas.

        Sounds a lot like trying to measure the Earth. You can’t do it accurately so why don’t we all just stop trying to do it?

      • “Steven Mosher November 15, 2019 at 2:08 am

        Its Bogus.

        A simple average is guaranteed to give you the wrong answer.”

        Thank you!

      • Steven,

        How would you account for stations with 30-year cooling trends thoroughly mixed in with stations with 30-year warming trends? The only “global warming” that appears to be going on is in the average, because there are more stations warming than cooling.
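Mosher’s claim above that a simple average of stations “is guaranteed to give you the wrong answer” can be illustrated with a toy network (the numbers are hypothetical, chosen only to show the effect): when stations cluster at some latitudes, an unweighted mean over stations differs from an area-weighted mean.

```python
import numpy as np

# Hypothetical network: 8 stations at 60N, 2 at the equator. The "true"
# temperature is the same function of latitude in both averages.
lats = np.array([60.0] * 8 + [0.0] * 2)
temps = 30.0 - 0.4 * lats                 # colder toward the pole

simple_mean = temps.mean()                # plain average over stations

# Weight each station by cos(latitude), a proxy for the area of the
# latitude band it represents.
weights = np.cos(np.radians(lats))
weighted_mean = np.average(temps, weights=weights)

print(simple_mean, weighted_mean)
```

The unweighted mean is biased toward whatever the crowded latitudes are doing; the same issue arises with METAR stations clustered at airports in populated regions.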

  8. How much of the surface temperature, I assume global, was recorded 170, 150, 130 or 110 years ago given ~70% of it is water?

    • This is precisely my objection to the whole idea of a “global temperature”. Even the coverage of terrestrial temperatures is sparse and unbalanced, especially from a century or more ago. Add to that the inherent uncertainties in the calibration of thermometers, time of measurements and so forth, and the data become deeply suspect. Of course, one cannot average an intensive property like temperature, so the whole exercise is futile.

      • I can measure 3 different temperatures in my back yard, with the same instrument. Global average is pseudoscience

        • I have 3 different brand thermometers in my yard, none in the sun and just now (2pm Pacific Standard time) they read on average 51.3 degrees F + or – 0.9, where 0.9 is the average deviation.
          When trying to measure very small temperature differences, i.e., 1 degree C over 100 years, the uncertainties of the instruments (and any other uncertainties) will overwhelm establishing a true difference.

    • Most of the data are made up. There simply were not enough measurement points for at least half of the temperature record to be able to get an accurate determination. Scroll down to the interactive globe for an eye-opening view of how few temperature gauges there were in the world for most of recorded history.

      https://data.giss.nasa.gov/gistemp/station_data_v3/

  9. What on earth is the point of this? The results are actually not different from what is well known. HadCRUT has a rather lower trend than it should, because it doesn’t do the Arctic properly, as Cowtan and Way showed. A trend of about 0.15°C/decade for HadCRUT in recent times is what is generally found, and a trend of 0.07°C/decade since 1850 is also standard. To deal with the key points:
    1. Yes, about 0.07 average trend since 1850. Most of that period has little GHG effect.
    2. Yes, the rates don’t correlate. They shouldn’t. The rate of warming should correlate, if anything, with total GHG. That provides the flux.
    3. Yes, there have been some large ENSO oscillations recently; more down than up. So?

    Fitting a sixth order polynomial is not a sensible way of calculating trend. It has to be about right in the longer term, but direct regression is better. In the short term it shifts variation around, and so the dip at the end is probably spurious.

    • The point is that the interpretation of the evidence differs from the alarmist meme.
      For many other reasons the author’s interpretation looks more likely to be correct.
      Quibbling about extraneous detail does not invalidate the central point.

    • Nick almost said something sensible and he hasn’t even realized it 🙂

      Let’s correct a few things and see if Nick can get his head around it. The rate won’t equate directly to the total GHG either, because you are dealing with radiative transfer and QM heat baths.

      Let’s go out on a limb and assume you actually want to learn something: Nature magazine has a reasonable write-up on the quantum thermodynamic laws. It really isn’t hard, try reading
      https://www.nature.com/articles/srep35568

      The sections you need to pay attention to:
      – Exchange of heat and work between two interacting systems
      – Second law of thermodynamics

      What will directly correspond is the thing they called “pseudo-temperature”, their term not mine. I am not sure anyone has calculated it via direct methods for climate science, but let’s say the system isn’t in equilibrium (it’s always chasing): the value will be 2/3 the average energy. Now if you are puzzling over why 2/3, it’s because you have a Maxwell–Boltzmann distribution with three degrees of freedom: E = (3/2)·kB·T.
      Google “Maxwell–Boltzmann distribution”
      Google “Maxwell–Boltzmann distribution”

      So QM actually predicts your classical answer will be wrong and by a rather large amount as all that other energy is in the QM domain 🙂

      Now if you run the calculations with proper physics I suspect your graphs might line up at least for a while. Why they won’t remain locked is a story for another day.

    • We would expect to see greater rates of warming in lock-step with greater rates of forcing.
      Positive and negative feedbacks complicate this over the short term, but on a multi-decade time scale each of those changes in forcing or feedbacks is either deterministic for changes in temperature or offset by opposing feedbacks, which makes it non-deterministic for changes in temperature.
      If increased CO2 due to our emissions created greater rates of warming, then there would be a correlation between the “cause” and the “effect”.
      So increased emissions of CO2 are either not deterministic or sufficiently offset by opposing negative feedbacks.
      The current rates of warming are similar to those that we know of in the past. It is reasonable to assume we are not on a “dangerous trajectory” that requires immediate action.
      The intelligent course of action would be to monitor the situation for changes to the current understanding, but making substantial changes based on dubious claims is not warranted.

  10. The difficulty with a presentation which involves extensive adjustments is that the non-expert like myself is left wondering whether the adjustments in any way reflected the bias of the author. I speak only for my ignorant self here, but I would be unhappy with that noise elimination if it produced a warmist result, and I think I should try to be consistent.
    Doubtless those of you who understand such things can explain.

  11. I would love this to be correct and to be spread around but I fear the XR mob will just say he is an oil company person.

    It took two minutes to find this article on the University of Houston website:

    https://www.uh.edu/nsm/earth-atmospheric/people/faculty/tom-bjorklund/an-analysis-of-the-mean-global-temperature-in-2031_january-2019_revised.pdf

    He admits his expertise is in the oil industry and only has a general knowledge of climate science. They will jump on that.

    The data of course ought to speak for itself – and the analysis is based on publicly available data. It needs a few hitherto Michael Mann supporters to convert.
    ——
    Incidentally, the BBC recently ran a program on Climategate – of course showing that the data fiddlers were innocent.

    • “Incidentally, the BBC recently ran a program on Climategate – of course showing that the data fiddlers were innocent.”

      That was last night UK time.
      And they were.
      As was the Berkeley team 2 years later that found the exact same “hockey-stick” (despite being funded by the Kochs) …. and many others around the world since.
      And all they fiddled was replacing tree-ring proxy data since 1960 with real-world instrumental global temps for a cover graph – NOT for a paper.
      Look up the tree-ring divergence problem.

      • Well I looked it up and saw some academic papers. The net outcome of these seems to me, a lay person, to indicate that tree rings are not necessarily an accurate proxy for temperature and are influenced by many other factors. In short, there seem to be more questions than answers.

    • Great programme, haven’t laughed so much in years! Steve McIntyre was the only “sceptic” on the programme; Steve Mosher was more or less neutral. However, this doesn’t prove anything when such programmes are made by people with known biases & prejudices towards warmism! The conclusion was inevitable, suggesting to me that the conclusion was “concluded” prior to production; the warmista experts were merely there to cry over how hard done by they were by evil denialists, who hurt their feelings! The bias was palpable beyond anything: St Paul Jones of UEA was almost in tears, as were one or two others allegedly “interviewed” from UEA! Closing ranks were the buzz-words!!!

      • I felt so sad when Michael Mann opened the programme with an air of wounded vulnerability describing how he had received an envelope containing some kind of powder in the post. I don’t hold with sending anybody mysterious or dangerous packets, but he was so out of character.

        • Richard, not sure if you are a UK person – if not, it is critical here to establish the fact that you are a victim. So the ‘I was sent anthrax in the post’ statement upfront gets the audience on your side and means that other people cannot attack you.

    • David Tallboys

      “It needs a few hitherto Michael Mann supporters to convert.”
      I am no Michael Mann supporter at all (nothing against his work, though).

      Maybe you like his analysis because it shows no warming? Most WUWT commenters do, after all.
      I only like analyses when they look sound and free of trivial bugs.

      I always get troubled by people who speak only about ‘El Nino warming’.
      Are there no La Ninas anymore in his mind?

      What about this??
      https://www.esrl.noaa.gov/psd/enso/mei/

      *
      Santer et al. did a lot of work to extract ENSO and volcano signals out of the RSS 3.3 LT time series from 1979 till 2013.

      The residual warming they obtained was 0.086 °C / decade compared with 0.124 °C for the original time series at that time.

      https://dspace.mit.edu/bitstream/handle/1721.1/89054/solomon%206%20Santer_etal_NatGeo_Article_File_22jan2014.pdf?sequence=1&isAllowed=y

      That is Science with a capital S!

    • I live at about 49 N and I’m pretty sure we can’t grow olives here in the middle of North America. Go straight west from here about 1000 miles and you are at the West Coast, still at 49 N. Pretty sure you can grow olives there. Unless you are at altitude of course, ’cause that changes climate, too!

    • Yes, it has warmed about a degree in the last century. And tree trunks are showing at the ice front of melting Alaskan glaciers….Otzi melted out of a receding Swiss glacier…It’s been warm before….but Olive groves will not be agricultural produce in Oregon for the foreseeable future.

  12. As well as talking confusingly about emissions and concentrations of CO2 since 1900, the chart in Fig 2 misleadingly shows concentrations of only around 20ppm in 1900. This article needs to go back and be fully error checked. It does not appear to be of sufficient quality.

    • Fig 2 does not show concentrations. It shows rate of human emissions. The side bar needs to be amended to make that clearer but the underlying point that there is no correlation between the rate of temperature change and the rate of human emissions remains valid.

      • The scale is labelled “parts per million CO2 in the atmosphere.” Sounds like concentration to me. Emissions would be expressed simply in units of mass.

        • Yes, but read below the Figure:
          “The blue dotted curve showing total parts per million CO2 emissions from fossil fuels in the atmosphere is modified from Boden, T. A., et al. (2017); the time frame shows only emissions since 1900, and the total reported million metric tons of carbon are converted to parts per million CO2 for the graph. ”

          Bad labeling.

        • Of CO2 in the atmosphere due to human emissions. That is what we are currently discussing. What part of that eludes you?

    • That low CO2 at the start is an estimate of what mankind was emitting at the time. Add natural emissions and you get to about 180 ppm. But, that 180 ppm is also questionable, because of the thousands of chemical analyses of ambient levels in those early days that often give higher results. Of course, some were sampled in places locally high, but there remains doubt about how useful that 180 ppm is. Geoff S

  13. Is the 6th-degree polynomial curve fitting to be taken seriously? With 6 degrees both ends must turn up or turn down, and for the 5th-degree slopes one end will go up and one end go down. The behaviour at the ends is controlled by the highest-power term: even though the coefficient is small, the 6th power dominates, especially when large numbers like dates are used. A problem I found with using actual dates in Excel is that the derived coefficients may not be given accurately enough, so a good practice is to subtract, say, 1800 off the dates. With the large date numbers being used, the plots in Excel are OK, but using the equation printed on the graph outside Excel may be inaccurate. And DON’T extrapolate.

    • You don’t need such mathematical contortions to see the obvious fact that the rate of warming shows no correlation with our rate of emissions.

      • Plot the 6th-order regression equation for a few decades before and after the data points and watch what happens.

    • “Is the 6th degree polynomial curve fitting to be taken serious? With 6 degrees both ends must turn up or turn down”
      Yes, that is a problem. There is no point in interpolating the fitting of a sixth order polynomial to get a derivative, which is basically fitting a first order polynomial. You might as well calculate the first order direct from the data.

      Not that it makes very much difference. The slopes quoted are really not much different from what is usually found, except for spurious diversions at the ends.

      • Stokes
        The 1st-derivative of a 6th-order polynomial is a 5th-order polynomial. The process is called “differentiation,” not “interpolating.” That gives the ‘instantaneous’ change in the function at any particular point. One only gets a first-order polynomial from differentiating a 2nd-order polynomial.

        The slopes should be good for points WITHIN the range of data, assuming that there is a high R^2 value. It is OUTSIDE the range of data where the polynomial fit and the 1st-derivative get dodgy!
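Clyde’s correction is easy to verify mechanically (using an arbitrary 6th-order polynomial, not the one fitted in the head post):

```python
import numpy as np

# Differentiating a 6th-order polynomial yields a 5th-order polynomial.
p6 = np.poly1d([1.0, -2.0, 0.5, 0.0, 3.0, -1.0, 2.0])  # arbitrary coefficients
d1 = np.polyder(p6)

print(p6.order, d1.order)  # 6 5
print(d1(1.0))             # 3.0, the instantaneous slope of p6 at x = 1
```

Inside the data range, and with a good fit, that derivative is a usable local slope; at and beyond the endpoints it inherits the instability of the fit itself.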

    • Correct, and the problem with the ends of higher-order polynomial regression (it is still linear regression, even if the fitted curve is not a line) can be neatly illustrated by plotting the regression equation at x-values (time) beyond the two endpoints. The curve will rapidly deviate to y-values far away from the input data values — the curvature at the ends of the green plots in Figs. 1 & 2 is the start of these deviations. Thus, trying to extract any information from the 1st derivative at the ends is fruitless (or even misleading).

    • An obvious solution to the Excel problem is to not use Excel. Use Python or R, preferably in a Jupyter notebook, to let others see the code and critique it. It also allows others to see the data set you have used and what manipulations have been made to it.
      The polynomial is used to fit a better predictive curve to the data you have; a simple linear regression can’t do this. The 6th-degree polynomial can’t and shouldn’t be used to predict values outside of the data set.
      I have never agreed with using an average of X years as a baseline to plot ‘anomalies’. I have never seen an analysis of the distribution used to arrive at such an average by anyone. I’m not saying it hasn’t been done; I just haven’t seen one.
      Oddly enough, I came up with results similar to Thomas’s from the Met Office data for Oxford UK. It is written in Python and is available to criticise.
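The “subtract 1800 from the dates” advice in the parent comment is about coefficient precision, and it can be made concrete without Excel. Here a deliberately simple quadratic (hypothetical, not the article’s fit) is written once in raw calendar years and once in offset years: the raw-year version has large, delicately cancelling coefficients, so rounding them to the ~5 significant figures a chart label displays corrupts the curve, while the offset version survives rounding exactly.

```python
import numpy as np

def round_sig(c, sig=5):
    """Round a coefficient to `sig` significant figures, as a chart label might."""
    return float(f"%.{sig - 1}e" % c)

years = np.arange(1850.0, 2020.0)

# The same parabola written two ways:
p_raw = np.poly1d([1.0, -2.0 * 1935.0, 1935.0 ** 2])  # (x - 1935)^2, raw years
p_off = np.poly1d([1.0, -2.0 * 85.0, 85.0 ** 2])      # (t - 85)^2, t = years - 1850

p_raw_r = np.poly1d([round_sig(c) for c in p_raw.coeffs])
p_off_r = np.poly1d([round_sig(c) for c in p_off.coeffs])

err_raw = np.max(np.abs(p_raw_r(years) - p_raw(years)))
err_off = np.max(np.abs(p_off_r(years - 1850.0) - p_off(years - 1850.0)))
print(err_raw, err_off)  # rounding costs 25.0 in raw years, 0.0 with the offset
```

With a 6th-order fit in raw years the cancellation is astronomically worse, which is why recomputing from the trendline equation printed on an Excel chart can give nonsense.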

  14. I took some data.
    I threw some out. I called it noise.
    I won’t show you the before and after.
    I won’t show my code.
    But I just proved there was no LIA. !!!

    OMG where’s my Nobel!

    I’m a skeptic, published at WUWT

    • If you are a true believer you publish in a “friendly science mag” .. and if you are Mike Mann you claim you have a Nobel 🙂

      I have asked you a number of times to give us a prediction, but you decline. We just have to take it on faith that your numbers are the Best 🙂

      • According to Steven, every number they publish is a prediction, not an average. This does not stop his site from proclaiming “2018: Tied for 4th Warmest Year Ever”

    • Steve’s desire to be taken seriously has caused him to embarrass himself again.

      What is it about you alarmists and your pathological need to attack arguments that were never made?

        • Your comment is completely inaccurate, and idiotic terms like Greenie, green head, true believer etc. indicate you haven’t been following the debate and are biased.

        • Notice your bias, because I just echoed Mosher’s stupid comment 🙂

          This is climate science it is as toxic as hell and everyone is biased.

          • Moderator, thousands pass through the gates without comment and you pick my post? What are the odds?

            consistent
            /kənˈsɪst(ə)nt/
            adjective
            1. acting or done in the same way over time, especially so as to be fair or accurate.

            (You have posted THREE times now in this thread, NONE of them on topic, and two of them in reply to a moderator. You have a difficult time realizing that after 737 comments you posted, have not learned to stay on topic CONSISTENTLY, even when you have been asked several times to do so by moderators. You are wasting my time having to coddle you, drop it or you will be placed in moderation, requiring a moderator to approve the rare on topic comment you might be able to create) SUNMOD

    • Steven, some of us come here with a skeptical bent, but interested in hearing from both sides. Geoff Sherrington and Nick Stokes made thoughtful comments about the article. You launch ad hominem attacks and make caustic, cryptic comments that suggest you don’t fully understand what is being debated. If you have something helpful to say, by all means say it. Otherwise, don’t clutter up the comments with your bile.

    • Moshpit says:
      Another Oil head

      Oh give it a rest. Warmunists want skeptics to think “oil money” is bad, but academic/taxpayer/NGO money isn’t? The amount of academic/taxpayer/NGO (the climate-change industrial complex) money spent on “climate change” is orders of magnitude more than oil money. It’s the “look at the squirrel!!!” Alinsky technique, and it doesn’t fool any reasonable person.

  15. I am puzzled about the interpretation of the green curve, the first derivative. Both figures 1 and 3 show that the green curve does not become negative, i.e. the slope of the red temperature curve is always positive, which means that temperatures have been rising over the investigated period, albeit at different rates. So how can the green curve be used as representing real temperature or temperature anomalies? It is not a “warming” curve, as stated in the contribution, but only indicates changes in warming. I would be grateful for some supplementary explanation.

    • The derivative is the rate of change of the temperature anomalies per month, converted to degrees per decade by multiplying by 120. The average warming per decade over 170 years is 0.07 degrees C per decade, including negative values; that is, the warming rate for the surface of the earth for that time period. The number of monthly data points is over 2000. Does this make sense to you?

      I would like to respond to all comments, well, almost all comments, but I do not have the time. Thanks for the question.

      Tom
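For anyone checking the arithmetic in the reply above, the months-to-decades conversion is just a factor of 120 (the slope value here is hypothetical, chosen to land on the article’s headline figure):

```python
# A slope in degrees C per month converts to degrees C per decade via
# 12 months/year * 10 years/decade = 120.
slope_per_month = 0.07 / 120       # hypothetical monthly slope, deg C / month
slope_per_decade = slope_per_month * 120
print(round(slope_per_decade, 4))  # 0.07
```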

  16. Surely the most stupid thing is taking the 1st derivative of a 6th-order polynomial trend line to show rate of change. As others have stated, a polynomial is not a good curve fit, especially at the ends – and to use it to then generate a rate is pure madness.

  17. I just love this statement:
    “and the monthly global temperature anomalies are easier to import to Excel”

    Can the author not be bothered to even import other data because the columns are not lined up? He’s obviously not a mathematical genius!

  19. If we use HRs/month as our metric for determining the HR production anomaly for baseball players, then every general manager in the league must go into a panic when their stars’ numbers fall from +6 in July & Aug to -6 in Dec & Jan. Maybe HR/yr is a more appropriate measure…I rather doubt that the gene frequency in the pre-Columbian bison herd changed much from decade to decade or even from century to century. A millennial time scale would be more appropriate to measure evolution compared to climate…And why have the MasterMinds determined that a 30-yr average defines “climate” when there’s a rather obvious 60-yr undulation in the extant temp records?…What’s “normal” for temps & climate? Shouldn’t we be using a 100 or 500 yr average at a minimum for determining a meaningful anomaly?

  20. Sitting here in the UK midlands, can I make a request for more global warming to be delivered as soon as possible. We don’t have snow cover just yet, but all the indications are that it is coming. We have all the cold and all the water we are going to need for the next year at least. The oft-claimed global warming just never seems to show up in the weather patterns we are getting.
    Thankfully the evidence is that we are about 0.8 deg C warmer than we were back in the middle of the 1800s, just before the American Civil War. I hope there is no connection between those cold climate times and the anger of man…
    Where is all the warmth we need?

  21. The following graph shows the flux optical depth anomaly (expressed as a percentage) for the Earth’s atmosphere between 1948 and 2007. It turns out that this anomaly is a rough measure of the total column density of water vapor in the Earth’s atmosphere from year to year over this time period.

    https://2.bp.blogspot.com/_tG8JCC_Tnp0/S62NJ0YL3fI/AAAAAAAAAD4/Bl4PKJrsatQ/s1600/optical-depth.JPG

    The following graph is a comparison between the polar Fast Fourier Transform (FFT) of the flux optical depth anomaly between 1964 and 2001 and a periodogram of the ENSO/SOI over the same time period.

    https://1.bp.blogspot.com/-n6otyph1rBM/Xc6bpa8QFcI/AAAAAAAAB6U/xvGv5zbWeBQtxY3yCFHoLBl9ioC6n9RnQCLcBGAsYHQ/s1600/wonderful.JPG

    [N. Sidorenkov, Astronomy Reports, Vol. 44, No. 6, 2000, pp 414 – 419, translated from Astronomischeskii Zhurnal, Vol. 77, No. 6, 2000, pp 474 – 480]

    Remarkably, the 6.2 (& 6.0), 4.8, 3.6, 2.4, and 2.1-year periodicities in the ENSO/SOI periodogram of Sidorenkov (2000) are also clearly evident in the FFT of the flux optical depth anomaly data.

    Four of these six long-term periodicities (i.e. 2.4, 3.6, 4.8, and 6.0 years) are sub-harmonics of the 1.2 year period of the Earth’s free nutation i.e. the Chandler Wobble. In addition, all six of the long-term periodicities are very close to the super-harmonics of the 18.6 year period of the Earth’s forced nutation (i.e. 6.2, 4.7, 3.7, 2.3, and 2.1 years) i.e. the periodic precession of the line-of-nodes of the Lunar orbit.

    This data tells us that the ENSO must play a major role in setting the overall column density of water vapor in the Earth’s atmosphere. In addition, it indicates that the ENSO must also be an important factor in setting the world’s mean temperature, since water vapor is the dominant greenhouse gas in the Earth’s atmosphere. The El Nino/La Nina may also play an important role in changing the Earth’s albedo via its effects upon the overall amounts of regional low- and high-level cloud.

    What is even more remarkable is the fact that the common frequencies seen in the two data sets are simply those that would be expected if the ENSO phenomenon were the resonant response of the Earth’s (atmospheric/oceanic) climate system brought about by a coupling between the Earth’s forced (18.6-year Nodical Lunar Cycle) and unforced (1.2-year Chandler Wobble) nutations.

    Support for this hypothesis is provided by the following paper:

    Wilson, I.R.G. and Sidorenkov, N.S., 2019, A Luni-Solar Connection to Weather and Climate III: Sub-Centennial Time Scales, The General Science Journal, 7927

    https://www.gsjournal.net/Science-Journals/Research%20Papers-Astrophysics/Download/7927

    This paper shows that the variations in the rate of change of the smoothed HadCRUT4 temperature anomalies closely follow a “forcing” curve that is formed by the simple sum of two sinusoids, one with a 9.1-year period, which matches that of the lunar tidal cycle, and the other with a 10.1469-year period, which matches that of half the Perigean New/Full moon cycle. This is precisely what you would expect if the natural periodicities associated with the Perigean New/Full moon tidal cycles were driving the observed changes in the world mean temperature anomaly on decadal time scales [between 1850 and 2017].

    https://1.bp.blogspot.com/-hGFQX7E8Oro/XZyEWZ7KtHI/AAAAAAAABzs/mpSn4NgZA8EJAImelda3zgHfieNkNkfUgCLcBGAsYHQ/s1600/Delta_Temp_model.jpg

    • “This data tells us that the ENSO must play a major role in setting the overall column density of water vapor in the Earth’s atmosphere.”

      That is not surprising. El Niño implies a huge transfer of energy from the ocean to the atmosphere and the vector is water vapor.

      “What is even more remarkable, is the fact that common frequencies seen in the two data sets are simply those that would be expected if ENSO phenomenon was the resonant response of the Earth’s (atmospheric/oceanic) climate system brought about by a coupling between the Earth’s forced (18.6-year Nodical Lunar Cycle) and unforced (1.2-year Chandler Wobble) nutations.”

      The problem with that theory is that ENSO frequency has been changing during the Holocene, with El Niño being almost absent during the Holocene Climate Optimum, while the Moon and the Earth’s axis kept doing their thing just the same. While it is not impossible that ENSO responds to changes in Earth movements, it is clear that other factors play an even more important role.

      • Javier,
        Your objections to the Lunar influence on the initiation of El Ninos are no longer valid. I cannot fault you, since you are not fully aware of all the recent findings. However, you will be seeing a series of papers coming out over the coming year that will place this connection on a more solid foundation.

        The lunar Perigean New/Full tidal cycles naturally produce Gleissberg (~88-year) and de Vries (208-year) cycles. These cycles are evident in some recent climate records, such as the South American Monsoon.

        In addition, they are clearly evident in geological stratigraphy data that is over 90 million years old (after making some allowance for the slow drift of the Moon away from the Earth due to tidal torquing).
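The periodogram method underlying the comparison above can be sketched as follows. This synthesizes a monthly series containing two of the periods under discussion (2.4 and 6.0 years, with made-up amplitudes, not the Sidorenkov data) and recovers them from the FFT power spectrum:

```python
import numpy as np

# 120 years of monthly samples; time in years.
t = np.arange(120 * 12) / 12.0
signal = np.sin(2 * np.pi * t / 2.4) + 0.7 * np.sin(2 * np.pi * t / 6.0)

# Power spectrum; frequencies in cycles per year.
power = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, d=1.0 / 12.0)

# Skip the zero-frequency bin, take the two strongest peaks, convert to periods.
peaks = np.argsort(power[1:])[-2:] + 1
periods = sorted(1.0 / freqs[peaks])
print(periods)
```

Whether the peaks found in the real optical-depth and SOI series imply a lunar driver is a separate physical argument; the FFT only establishes that the periodicities are present.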

  22. Figure 2 is a bit of a mystery. Unless someone else does this beforehand, I will get to Boden (2017) when I reach my office later this morning, and try to determine how the blue, dotted curve in Figure 2 was constructed.

    There are all sorts of things I might suggest looking at other than the high-order polynomial. A Kalman filter, for instance, or some adaptive filter.

  23. Curiosity got the best of me and so I had a look at Boden’s data while sitting at breakfast. The full mystery of Figure 2 will not be cleared up by examination of this data (for example, if Figure 2 shows additions to global CO2, then why plot the final data point, which is obviously the current level and another thing entirely?). However, Boden’s data is CO2 emissions per country, all of which start reporting at different times, and some of which, like nations in the old Soviet orbit, do not begin to report until 1992 and start with large values. Thus, totals built from that data have a low bias in the past. Another instance of an estimate that should include some error bars.

    • The Boden et al., 2017 dataset has three Excel documents: one global, one regional and one national. The global one contains their estimate of carbon emissions from 1751 to 2014. I have plotted that data in GtCO2 from 1900, updated to 2018, and it looks exactly like the Figure 2 blue curve, except for 2014–18 not being included and that last point, which does not belong to the dataset.

      And you can’t blame the author of this article for the Boden et al. estimates. It is, after all, the most accepted and widely used dataset for emissions.

      • I don’t blame the author for anything, but it is his problem that he mislabeled the graph, title and vertical axis both, against the text of the caption, and mixed data within the graph itself; i.e., the last data point in the set should have been clearly identified with a different symbol. Perhaps a paper with such a remarkable claim should have taken a little more care in presenting its central evidence.

  24. None of the 102 climate models of mid-troposphere mean temperature comes close enough to predicting future temperatures to warrant changes in environmental policies.

    Unfortunately the tipping point was passed some time ago, and it is this: “Climate Change” has achieved “Too Big To Fail” status.

  25. From the article: “The highest global mean surface temperature ever recorded was 1.111 degrees C in February 2016”

    So the Earth reached a temperature 1.111C above the estimated average global temperature since 1850, in February 2016.

    The UN IPCC and associated alarmists used to say humanity had to limit the increase in global temperatures to 2C above the global average if we are to avoid catastrophic global warming effects.

    In the intervening years after the 2C limit claim was made, the global temperatures did not rise the way the IPCC thought they would, so in order to continue the climate change crisis, they have now lowered the point at which humanity will face a global warming crisis to 1.5C above the estimated global average temperature since 1850.

    But since Feb. 2016, the temperatures have cooled by about 0.4C, so the Earth is heading away from the 1.5C crisis point, not towards it. And that is while CO2 continues to increase. Good news for Greta! She can relax. The Earth is not heading into a disaster.

    Btw, 2016, the “hottest year evah!” was only 0.1C warmer than 1998, a statistical tie, and in the United States, 1934 was 0.4C warmer than 2016, according to Hansen 1999, and the UAH satellite chart.

    Hansen 1999:

    https://climateaudit.files.wordpress.com/2007/02/uhcnh2.gif

    UAH satellite chart:

    http://www.drroyspencer.com/wp-content/uploads/UAH_LT_1979_thru_October_2019_v6.jpg

  26. Interesting stuff. I’m a geophysicist in the energy biz, and not convinced by the catastrophic warming narrative out there. These kinds of studies are critical in countering that false narrative, but it would be better for them to appear in peer-reviewed publications, because otherwise, warming zealots have an excuse to discount them. Is this study published in a peer-reviewed journal somewhere? If not, why not, or are there plans to publish? Thanks

    Randall

  27. I agree with several others here about the inadvisability of fitting a sixth-order polynomial to such data. It’s frequently a mistake that new grad students make, which their advisors have to correct; in particular, taking the derivative of the fitted curve is particularly egregious. Check the following for information on this:
    http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.569.2504&rep=rep1&type=pdf
    E.g. “Results based on high order polynomial regressions are sensitive to the order of the polynomial. Moreover, we do not have good methods for choosing that order in a way that is optimal for the objective of a good estimator for the causal effect of interest. Often researchers choose the order by optimizing some global goodness of fit measure, but that is not closely related to the research objective of causal inference”.
    “Based on these arguments we recommend that researchers not use such methods, and instead control for local linear or quadratic polynomials or other smooth functions.”

    • The data points are well-behaved. I assert that I could fit the data by hand with a crayon and realize essentially the same results. I have not taken the time to do this and will not.

      • Your assertion is incorrect. I suggest you plot your sixth-order curve including the error bars and then plot the derivative curve with its error bars; you’ll find significant deviations, particularly near the ends of the curve.
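
The overfitting objection above can be made concrete with a small numerical sketch (synthetic, made-up data standing in for an anomaly series; not the article's actual fit):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a noisy anomaly series: a gentle quadratic trend
# plus weather-like noise (made-up numbers, not HadCRUT data).
t = np.linspace(0.0, 1.0, 200)
y = 0.5 * t**2 + rng.normal(0.0, 0.1, t.size)

def r_squared(order):
    """R^2 of a least-squares polynomial fit of the given order."""
    resid = y - np.polyval(np.polyfit(t, y, order), t)
    return 1.0 - resid.var() / y.var()

r2 = {k: r_squared(k) for k in (1, 2, 6)}

# Beyond 2nd order, the extra terms mostly chase noise: R^2 barely moves.
print({k: round(v, 3) for k, v in r2.items()})

# Just outside the data range the two fits can disagree sharply,
# which is why extrapolating a high-order fit is so hazardous.
c2, c6 = np.polyfit(t, y, 2), np.polyfit(t, y, 6)
print(round(np.polyval(c2, 1.5), 2), round(np.polyval(c6, 1.5), 2))
```

Differentiating compounds the problem: the derivative of an over-fit polynomial swings hardest near the endpoints, which is exactly where a fitted curve's "rate of warming" is usually read off.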

  28. Trends are relevant to the future until they aren’t. This study seems robust enough to question alarmist predictions, regardless of its own predictive value.

  29. Can you imagine how happy people would be if they moved the arbitrary 0.0-degree anomaly line up just 0.6 degrees? They would be happily looking back at the poor buggers who had to endure all those freezing conditions in just “wolfskins, and no internet” back in the 1940s, safe in the knowledge that they had “cured” the planet of hypothermia.

  30. I hate to keep bringing up uncertainty and significant digits as a problem, but it is scary to see people misrepresent data. For at least a hundred years, temperatures were recorded to the nearest integer temperature. That means the uncertainty is +/- 0.5 at a minimum. This doesn’t include any assessment of accuracy or other systemic uncertainty in the recorded data. One should also remember that each recorded piece of data contains this uncertainty. The uncertainty must be taken into account when doing each calculation to reach the end result.

    You cannot erase this uncertainty through determining anomalies or averaging. The standard use of significant digits means you can only have data that is stated to the nearest integer for the period when data was recorded as integers.

    The following comment is from Washington Univ. at St. Louis. “Significant Figures: The number of digits used to express a measured or calculated quantity. By using significant figures, we can show how precise a number is. If we express a number beyond the place to which we have actually measured (and are therefore certain of), we compromise the integrity of what this number is representing. It is important after learning and understanding significant figures to use them properly throughout your scientific career.” The link is at: http://www.chemistry.wustl.edu/~coursedev/Online%20tutorials/SigFigs.htm .

    This kind of erases the conclusion of the paper since the uncertainty is wider than the calculated numbers.

    • Jim:
      And then throw in the uncertainty in bio-indicator data like tree rings. Stat analyses cannot undo that.
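
The propagation point made in comment 30 can be sketched numerically (a minimal example using the ±0.5 °C quantization bound from the comment and standard uncertainty-propagation rules; not anything computed in the article):

```python
import math

# A temperature recorded to the nearest whole degree carries a quantization
# uncertainty of +/-0.5 C before any instrument error is considered.
u_single = 0.5

# An anomaly is the difference of two such values, so the uncertainties
# combine: worst case they add, and even treating them as independent
# random errors the root-sum-square is still about 0.71 C.
u_worst = u_single + u_single
u_rss = math.hypot(u_single, u_single)

print(u_worst, round(u_rss, 3))
```

Either way, the uncertainty of a single anomaly is far larger than the hundredth-of-a-degree resolution at which results are often quoted.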

  31. “There is no correlation between a rising CO2 concentration in the atmosphere and a relatively stable, low rate of warming of the surface of the earth from 1943 to 2019.” – Figure 2

    Wrong! You must compare the rate of warming with the rate of CO2 change, i.e., the derivative of the blue curve.

    • unka, you said, “You must compare the rate of warming with the rate of CO2 change, …” I agree.

      However, a plot of surface temperature versus cumulative anthropogenic CO2 emissions should provide an understanding of how humans might be contributing to increasing global temperatures. A first or second-order regression would provide a coefficient of determination (R^2) that would tell us how useful CO2 emissions are for explaining or predicting future temperatures — assuming that the correlation is not spurious.

      • Compare the red curve from Fig. 1 and the blue curve from Fig. 2 and you see correlation.

        Comparing the green curve from Fig. 2 with the blue curve from Fig. 2 is methodologically incorrect.
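
The like-with-like point in comment 31 can be illustrated with two made-up series (a sketch, not the article's data): levels that correlate strongly can have rates that do not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the curves under discussion: cumulative CO2
# (accelerating, like the blue curve) and a temperature anomaly with a
# steady trend plus weather noise (like the red curve). Made-up numbers.
years = np.arange(1900, 2020, dtype=float)
co2_cum = 0.02 * (years - 1900) ** 2
temp = 0.007 * (years - 1900) + rng.normal(0.0, 0.05, years.size)

# Like-with-like means comparing rate with rate: differentiate both series.
co2_rate = np.gradient(co2_cum, years)
temp_rate = np.gradient(temp, years)

r_level = np.corrcoef(co2_cum, temp)[0, 1]       # levels correlate strongly...
r_rate = np.corrcoef(co2_rate, temp_rate)[0, 1]  # ...while the rates need not.

print(round(r_level, 2), round(r_rate, 2))
```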

  32. Recent statistical analyses conclude that 95% uncertainties of global annual mean surface temperatures range between 0.05 degrees C to 0.15 degrees C over the past 140 years;

    Those uncertainties are 3 to 10 times smaller than the lower limit of resolution of the historical instruments. Utter incompetence; standard for that field.

    • Who is reviewing these so-called studies that ignore basic scientific accuracy and precision requirements?

      It is hard for me to understand: basic fundamental metrology isn’t just ignored, it doesn’t even seem to be on these people’s radar. It points to ignorance of the physical world. It reminds me of kids in school 50+ years ago who had no idea how hot a soldering iron gets or the difference in wattage among resistors. These folks are computer programmers who don’t even know how to handle rounding and precision in Excel spreadsheets. How sad!

    • Jim, it literally is not on their radar, just as you suggest.

      Reading their papers, one gets the impression that none of these folks has ever made a measurement or struggled with an instrument.

      They assume all the error away, the Central Limit Theorem is their magic talisman, and their lives become very easy.

      I published a paper earlier this year on the radiolysis of the amino acid cysteine in a synchrotron X-ray beam. I had to track the resolution and accuracy of the beam line right down to whether the pre-amp was calibrated and the nitrogen ionization chamber voltage was above its saturation knee.

      I kill myself to make sure my data are good and so does everyone else I know. The bland carelessness of these climate people, and their hostility toward rigor sticks in my craw more than anything else.

      • Pat
        I suspect that part of the problem is that many of the scientists who have not yet retired come from a generation when academic standards were declining. Consequently, they don’t know what they don’t know.

        I taught at Foothill College (Los Altos Hills, CA) from 1971 through 1982. Many of our students came from Palo Alto High School, considered one of the better ones in the Bay Area (probably only Bellarmine had a better reputation). During the decade I taught, I saw a noticeable decline in the quality of students. There were many who didn’t know that one could multiply by 10 by moving the decimal point! Yet, Foothill awarded A’s and B’s to 50% of the students.

        • Clyde, it’s all so awful.

          But somehow I think the climate scientists are a special case. Most of the modelers are mathematicians who know nothing of science or physical reasoning as a matter of course.

          But I also think that climate scientists plainly are not well educated. Look at Phil Jones, of a prior generation, who nevertheless knows nothing of validating data quality and assumes all the error away. His younger follow-on colleagues are the same.

          And, I’m sorry to say, Roy Spencer showed no understanding of how to assess error or even to reproduce the linear emulations of my paper. He’s also of a prior generation.

          These problems seem endemic in the field.

          So, while I fully agree with you that rigor has been nearly eviscerated from instruction, it seems like there hasn’t been any in climate science for 40 years or more.

          Charles Brooks did heroic work in 1926 testing ship intake SST validity. And in 1963 J. Saur did a careful analytical study of ship intake temperatures.

          But their analytical rigor seems to have been lost in consensus climate science, and the whole field seems to lack competence.

          • >>
            These problems seem endemic in the field.
            <<

            It’s already been mentioned by many on this post and in other posts: averaging intensive properties (it’s nonsense to do so), starting with a list of numbers with unit precision and averaging out to the thousandth place (utter nonsense and violates the rules of significant figures), and averaging averages (not mathematically valid in general).

            Jim
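
The Central Limit Theorem point raised in this thread is easy to demonstrate: averaging shrinks random scatter but leaves any systematic bias untouched. A minimal sketch with a hypothetical biased instrument:

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical instrument: true value 20.0, random scatter 0.5 C,
# plus a constant calibration bias of +0.3 C that averaging cannot remove.
true_value, bias = 20.0, 0.3
readings = true_value + bias + rng.normal(0.0, 0.5, 10_000)

mean = readings.mean()
sem = readings.std(ddof=1) / np.sqrt(readings.size)

# The standard error of the mean shrinks with N (the CLT part)...
print(round(sem, 4))
# ...but the mean still sits ~0.3 C above the true value (the bias part).
print(round(mean - true_value, 2))
```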

  33. but we don’t have 170 years of legit data! (I read that on this blog so it must be true)

    I guess we’ll never know what is causing all that ice melting, fires, increased frequency of record highs (and an order of magnitude fewer record lows), and so on.

    Thank the author so much! I’m going to buy some Florida ocean side real estate before he gets it all!

    • No one is saying the original recorded data is not legit (although some of it may be made up at various times). What is being said is that the use of that data is not legit.

      You simply cannot take data that is recorded as integers and, through mathematical calculations like subtraction (anomalies) or finding averages, extend the precision of the recorded temperatures out to tenths, hundredths, or thousandths of a degree. That is simply not legitimate science. If the original precision was units, that is as precise as you can ever be, like it or not.

      I can’t tell you how many students I have tutored for hours about significant digits who will invariably write down the maximum number of decimal digits from their calculators when dealing with measurements. Climate scientists make the same mistake, but I suspect they don’t even know better.

      Do you ever wonder why these graphs and conclusions never ever show any error/uncertainty bars or state the conclusion along with an uncertainty value?

      • Two nights ago I was watching a UK forensics drama in which the highly trained pathologists and forensic scientists were puzzling over a pair of burn marks on a body, trying to determine their origin and hypothesizing they were from a taser or stun gun. One of the scientists took a small plastic 10 cm ruler, placed it against the body to measure the separation between the burns, and announced the distance as “three point one seven five centimeters”. From this they went on to conclude: “Ah-ha! This is one and a quarter inches, so this was an illegal stun gun made in the USA!”

        I cringed at this bit of theatrics, knowing the ruler was only graduated in millimeters, yet somehow the separation was determined with a resolution of 50 microns. Obviously the writer took 1.25 inches, multiplied by 2.54 in a cell-phone calculator, which presented the answer as 3.1750 cm, and put the number into the script. The actors were none the wiser, and this condition apparently extends to climate scientists.

  34. The challenge is to allow scientists the time and freedom to work without interference from special interests

    Most of the “scientists” working in the climate field ARE the “special interests”.

    Or at least one of three big special-interest groups – the other two being the “renewable energy” industry and the politicians who make policies and direct public money in the currently fashionable direction.

    The jobs of climate scientists and their research grants depend on there being a human-caused warming trend that is having an adverse effect on both the natural environment and human civilization, and that this trend will accelerate and have even more and even worse adverse effects in the future.

    How else to explain that there is not one study among the many thousands published every year that can find ANY beneficial effects – local or global – of past, present or (inferred) future warming?

    (unrelated complaint) I don’t like polynomial-fitting in general, and especially I don’t like it when it’s being applied to a parameter that may vary in a cyclic way, and even more especially if there’s less than one complete cycle in the data, and yet more especially if the data may contain more than one set of superimposed cycles. And using the last inflection of a polynomial-fitted curve, or any other curve fitted to inherently noisy data, to extrapolate outside the data range (i.e. – in this case, into the future) should be a criminal offence.

  35. 3. Recent increases in surface temperatures reflect 40 years of increasing intensities of the El Nino Southern Oscillation climate pattern.

    Did something cause the increasing intensities of ENSO? What was it? How do we know that?

    ENSO is part of the global net of mechanisms that transfer heat through the ocean and atmosphere. Of course any net warming will entail net warming effects of at least some of the mechanisms, no matter what is driving the net warming.

    This analysis does not really help decide whether sun, CO2, urbanization, or something else has been driving the warming, imo.

    • There may be no need for any energy input rise/fall for the atmospheric global warming/cooling to take place.
      Ocean currents take warm water from the equatorial region towards the poles, and cold water in the opposite direction, while the processes of up- and down-welling provide for the bidirectional energy exchange between the oceans and the atmosphere. The time scale of ocean-current travel between various critical locations sets the time constants at the base of a number of natural warming/cooling cycles. Just having three such cycles (e.g. Atlantic, Pacific and Indian oceans), each with even small individual amplitude and periodicity variability, would make them next to impossible to resolve within the very narrow window of global temperature data available.

    • Matthew, how do we know the El Nino response is not time-lagged?

      What’s the ocean-response time constant?

      What if today’s El Ninos are a response to the Medieval Warm Period? Does anyone know?

  36. A question – I assume that UAH measures average temperatures. I believe that HadCRUT and the other surface thermometer-based temperature series take the highest and lowest daily temperatures and divide their sum by two. I understand from Anthony Watts’s presentation that most of the warming has taken place at night, that is, the lowest temperatures increasing much faster than the highest (perhaps as a result of all the UHI interference and the poor siting of the measuring apparatus). Would this explain why HadCRUT, etc. show a higher decadal increase in average global temperature?
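
The (Tmax+Tmin)/2 convention asked about above is a midrange, not a true daily mean, and the two differ whenever the diurnal cycle is asymmetric. A sketch with an idealized, made-up diurnal curve:

```python
import numpy as np

# A hypothetical diurnal cycle: a Gaussian warm pulse peaking near 2 pm,
# so the day spends far more time near its minimum than its maximum.
h = np.linspace(0.0, 24.0, 1441)                      # one-minute steps
temp = 15.0 + 8.0 * np.exp(-((h - 14.0) / 5.0) ** 2)  # degrees C

midrange = (temp.max() + temp.min()) / 2.0  # the (Tmax+Tmin)/2 convention
true_mean = temp.mean()                     # integrated daily average

# For this asymmetric day the midrange overstates the mean by about 1 C.
print(round(midrange, 2), round(true_mean, 2))
```

How the asymmetry of real diurnal cycles changes over time is one reason min/max-based and satellite-based averages can trend differently.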

  37. Facts, facts, nothing but facts. But what about the 97% certainty?

    Mr. and Mrs. Average come home from a tiring day at work and settle down to a meal and relaxation. So who do they believe: the facts, pages and pages of them, or just a simple statement like the 97%?

    MJE VK5ELL

  38. Figure 2 is wrong & makes no sense.

    In Figure 2, “The blue curve is a time series of the concentration of fossil fuel emissions of CO2 in parts per million in the atmosphere.” (Starting at ~ 30ppm)
    BUT
    the top of the curve shows TOTAL (Anthropogenic & Natural) atmospheric CO2 (414.7 ppm)

    So what are we supposed to be seeing ? Total atmospheric CO2 or fossil fuel emissions of CO2 ???

    If it’s Total atmospheric CO2 then starting point should be ~ 295

    If it’s fossil fuel emissions the end should be ~ 200ish

    You can’t have both on the same curve

    • The side bar of Fig 2 needs correcting.
      The intent is to show the rate of increase in our emissions over time compared to the rate in change of temperature.
      It is obvious that the rate of change in temperature does not correlate with the rate of change in our emissions.
      The head post goes on to say that the rate of change in temperature does correlate with the level of El Niño activity.
      Both assertions are clearly correct.
      All the objections posted thus far simply distract from that underlying truth.
      That is partly the fault of the author but adverse responders are also at fault for missing the essential point.

  39. Are we looking at an ENSO event? As the scientists tell us, bottom water from the Poles can take up to 800 years to reach its final upwelling.

    So what was happening back in the 12th century? Why, that was still the MWP. So is today’s situation caused by the accumulated warmth from back then?

    A very interesting article.

    MJE VK5ELL

  40. ‘170 Years of Earth Surface Temperature Data Show No Evidence of Significant Warming’

    170 years ago would be 1849. Sorry, but we didn’t have much data in 1849. We didn’t have much more in 1900. Adequate data started in 1979. Analyses to 1849 may be fun, but they are more of a parlor game than science.

  41. This seems an exercise in curve-fitting. And not a very good one.

    1) The r-squared value for the 6th order polynomial from the author is 0.75. For a 2nd order fit, I got r-squared = 0.73. Adding extra terms does basically nothing to improve the fit. (The r-squared value for a linear fit drops dramatically to 0.60, so that is a bit too simple.) A 6th order polynomial is highly over-fit.

    2) If you project this 6th order curve fit 50 years back, it is -17C! If you project it forward 50 years, it is -8C. While the fit is OK for the years it is aiming at, its complete inability to project forward or backwards shows that it is not at all predictive. A 6th order polynomial is highly over-fit.

    3) 2nd and 3rd order fits give much better predictions forward and backward 50 years. This also strongly suggests they are better fits. A 6th order polynomial is highly over-fit.

    So a 2nd or 3rd order (quadratic or cubic) polynomial is probably a better function to use for the fit. These give a straight line and a parabola respectively when taking the slope (ie when generating the “green curve” for the graphs above). A parabola is an excellent fit for the CO2 curve — the “green curve” matches the shape of the “blue curve” quite well. Even the straight line for the “green curve” would be a pretty good fit.

    “A 6th order polynomial is highly over-fit.” – John von Neumann

    • Oops … the quote should have been
      “With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.” — von Neumann.

      • Tim
        I don’t think that von Neumann meant a 5th order polynomial with ONE parameter when he talks about 5 parameters.

        However, I do agree that a 6th-order polynomial is probably overfitting the time series.

        With respect to your second point, extrapolations or projections are always fraught with risk if you are dealing with a system that is not well behaved and well understood. The order of the polynomial is not all that important except that the end-points may head off to infinity faster than physically possible with high order polynomials.

        • Clyde, he could have fit the T series with a cosine plus a line.

          I did that in a post at Jeff id’s tAV site here, 8 years ago(!).

          When the cosine is removed from the GISS or HadCRU temp series, one is left with a linear trend showing virtually no increase in the rate of warming since 1880.

          The cosine 58 year phase is a pretty close match to AMO+PDO.

          • Pat

            I just read your analysis at the link you provided. It is very interesting. It is a shame that it only had 8 votes after 8 years.

            Might I suggest that you re-do the analysis with current data and re-publish it here, where it will almost certainly get more attention. It would be interesting to see if there are any substantive changes, and how well your predictions fared during that time.

            Would that those receiving grant money for climatology research were as creative as you.

          • Pat, your analysis suffers from the same sort of extrapolation problems I described for the original post here.

            * Going back before 1880, your ‘model’ predicts anomalies around -0.6 C in the 1840’s which was definitely not the case.
            * Going forward past 2010, your model predicted slowly dropping anomalies. Instead, actual temperatures were already above your predictions in 2010 and have gone up, not down, since then.

            It was an excellent empirical fit for the data you used, but fails going forward or backwards.

          • Clyde, the fits included a fitted scaling constant and a couple of necessary offsets, but the business end of the fit included three fitted parameters (c, d, and e below).

            The whole thing was ‘a+b*cos(c*temp+d)’ and the linear part was ‘e*temp+f,’ where a and f are offsets, b is a scale factor, and e the fitted linear slope.

            It’s kind of you to suggest reworking and publication, but it seems overwhelmingly likely it would never see the light of day.

            I’ll be writing up a paper on the air temperature record after the holidays. Without giving anything away, there’s something that’s really going to blow those folks right out of the water.

          • Tim, the fit is not a model of the climate. It was never meant to be predictive.

            It was only meant to show the structure that could be extracted from the known temperature trend.

            Once the cosine part was removed, the remaining linear trend showed about zero evidence of accelerated warming across the 20th century.

        • Clyde,

          A 6th order polynomial like the one used here has 7 adjustable parameters:
          y = (a1)x^6 + (a2)x^5 + (a3)x^4 +(a4)x^3 + (a5)x^2 + (a6)x + (a7)
          This is pretty much exactly what von Neumann was talking about. It is pretty clear that a 6th order polynomial was used here not for any mathematical reason, but because that is as far as Excel will go.

          Yes, extrapolations are always fraught with difficulties. That is a very good reason to keep fits simpler.

          • Tim
            I would call them coefficients of the same parameter, raised to various powers. If one wanted to go to higher dimensions, then additional parameters would be necessary.

            Yes, fits should be kept simple, except when they shouldn’t be. There is no demonstrated physical reason why a 6th-order polynomial is justified in this instance. However, that’s not to say that there might not be a case where it would be justified. Air friction varies with the cube of the velocity, even for streamlined objects. I suspect that turbulent behavior might add at least another order to the modeled behavior.

    • That quote should have been:
      “With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.”
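
The cosine-plus-line decomposition discussed in this thread can be sketched with a linearized variant: fixing the ~58-year period and fitting a + b*cos(c*t + d) + e*t via a cos/sin pair in ordinary least squares (synthetic data with made-up amplitudes; the original analysis fitted the frequency as well):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for a long temperature series: linear trend plus a
# ~58-year oscillation and noise (made-up amplitudes, not GISS/HadCRU data).
t = np.arange(1880, 2011, dtype=float)
x = t - 1880.0
y = 0.005 * x + 0.1 * np.cos(2 * np.pi * x / 58.0) + rng.normal(0.0, 0.05, x.size)

# With the period fixed, b*cos(wx + d) = p*cos(wx) + q*sin(wx), so the
# whole model y ~ a + p*cos(wx) + q*sin(wx) + e*x is linear in (a, p, q, e).
w = 2 * np.pi / 58.0
A = np.column_stack([np.ones_like(x), np.cos(w * x), np.sin(w * x), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
a, p, q, e = coef

amplitude = np.hypot(p, q)  # recovers the oscillation amplitude (~0.1)
slope = e                   # residual linear trend (~0.005 per year)

print(round(amplitude, 3), round(slope, 4))
```

Removing the fitted cosine leaves the residual linear trend, which is the quantity the commenter examined for evidence of accelerated warming.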

  42. The math doesn’t look right, or at least it’s discordant with the claim.

    The first-order derivative shows the rate of change over time. You clearly have a significantly increasing rate of change. You address this in Section 2, claiming it’s ENSO, but I don’t see the work proving it to be ENSO.

  43. Bjorklund’s paper is clearly consistent with my 2017 paper “The coming cooling: usefully accurate climate forecasting for policy makers,” which said:
    ” This paper argued that the methods used by the establishment climate science community are not fit for purpose and that a new forecasting paradigm should be adopted.”
    The reality is that Earth’s climate is the result of resonances and beats between various quasi-cyclic processes of varying wavelengths.
    It is not possible to forecast the future unless we have a good understanding of where the earth is in relation to the current phases of these different interacting natural quasi-periodicities which fall into two main categories.
    a) The orbital long wave Milankovitch eccentricity, obliquity and precessional cycles which are modulated by
    b) Solar “activity” cycles with possibly multi-millennial, millennial, centennial and decadal time scales.
    When analyzing complex systems with multiple interacting variables it is useful to note the advice of Enrico Fermi, who reportedly said “never make something more accurate than absolutely necessary”. The 2017 paper proposed a simple heuristic approach to climate science which plausibly proposes that a Millennial Turning Point (MTP) and peak in solar activity was reached in 1991, that this turning point correlates with a temperature turning point in 2003/4, and that a general cooling trend will now follow until approximately 2650.
    The empirical temperature data is clear. The previous millennial cycle temperature peak was at about 990 (see Fig 3 in the link below). The recent temperature Millennial Turning Point was about 2003/4 (Fig 4 in link below), which correlates with the solar millennial activity peak at 1991+/-. The cycle is asymmetric, with a 650+/- year down-leg and a 350+/- year up-leg. The sun’s magnetic field strength, as reflected in its TSI, will generally decline (modulated by other shorter-term superimposed solar activity cycles) until about 2650.
    The temperature increase since about 1650 is clearly chiefly due to the up-leg in the natural solar activity millennial cycle, as shown by Lean 2018, “Estimating Solar Irradiance Since 850 AD,” Fig 5:
    https://4.bp.blogspot.com/-XXHivbXcr7U/W9taWv4M2LI/AAAAAAAAAq8/mSUrIxZqDmkf-9eNsZPhbIXLEf1LcdtzACEwYBhgL/s1600/Leannew.png
    Fig1
    Lean 2018 Fig 5.
    This Lean figure shows an increase in TSI of about 2 W/m2 from the Maunder minimum to the 1991 activity peak. This TSI and solar magnetic field variation modulates the earth’s albedo via the GCR flux and cloud cover. From the difference between the upper and lower quintiles of Fig 4 (in the link below), a handy rule of thumb a la Fermi would conveniently equate this to a Northern Hemisphere temperature millennial cycle amplitude of about 2 degrees C, with that amount of cooling probable by 2650+/-.
    https://1.bp.blogspot.com/-8lZ4DCHLT00/W9y9yozeuFI/AAAAAAAAArk/OY8SjWNg4R8DxBeAqoElxF1QbTuJJO3KACLcBGAs/s1600/Tropical%2Bcoud%2Bcover.jpg
    Fig 2
    The MTP in cloud cover was at about 2000.
    https://1.bp.blogspot.com/-oIYE3I2Yf-Y/XYjzkje_faI/AAAAAAAAAt4/avAFWd57XMUeuN2oM-6OPkCxL7fCgjjPwCNcBGAsYHQ/s1600/NeutronCount2018mar.gif
    Fig 3
    The decline in solar activity (increase in neutron count ) since the 1991 solar activity MTP is seen in the Oulu neutron count.
    The establishment’s dangerous global warming meme, the associated IPCC series of reports, the entire UNFCCC circus, the recent hysterical IPCC SR1.5 proposals and Nordhaus’ recent Nobel prize are founded on two basic errors in scientific judgement. First, the sample size is too small. Most IPCC model studies retrofit from the present back for only 100–150 years, when the currently most important climate-controlling, largest-amplitude solar activity cycle is millennial. This means that all climate model temperature outcomes are too hot and likely fall outside of the real future world (see Kahneman, Thinking, Fast and Slow, p. 118). Second, the models make the fundamental scientific error of forecasting straight ahead beyond the Millennial Turning Point (MTP) and peak in solar activity, which was reached in 1991. These errors are compounded by confirmation bias and academic consensus groupthink.
    See the Energy and Environment paper “The coming cooling: usefully accurate climate forecasting for policy makers.” http://journals.sagepub.com/doi/full/10.1177/0958305X16686488
    and an earlier accessible blog version at http://climatesense-norpag.blogspot.com/2017/02/the-coming-cooling-usefully-accurate_17.html See also https://climatesense-norpag.blogspot.com/2018/10/the-millennial-turning-point-solar.html
    and the discussion with Professor William Happer at http://climatesense-norpag.blogspot.com/2018/02/exchange-with-professor-happer-princeton.ht

    • Pat,

      I agree that analyses often under-value cycles; under-value natural ebbs and flows that can influence climate patterns.

      OTOH, I think you may be over-valuing the cycles. Even you fudge with phrases like “quasi-cyclic processes” and “possibly multi-millennial, millennial, centennial and decadal time scales”. Other than Milankovitch cycles, which can be predicted from celestial mechanics, cycles like sunspots, El Nino, AMO, and PDO are not predictable enough to support definitive predictions about the future.

      Your own admonition “the sample size is too small” could be applied to your own analysis. As far as I can tell, you have data for *one* 990 year “cycle” and from this you assume a similar pattern in the next 990 years. Without 3+ clear, repeating cycles (or a strong theoretical basis to predict a cycle of specific amplitude and period), there is no reason to expect the cycle to repeat (or to expect that it is a cycle to begin with).

      Also, any prediction of future behavior is predicated on conditions remaining similar so that similar cycles will persist. The changes in CO2 in the past ~ 100 years are not natural and will have some warming effect. So at a minimum, your cooling theory will compete with CO2 warming. You do not have sufficient data nor sufficient theory nor sufficient knowledge of future CO2 levels to definitively state whether the warming effect or the cooling effect will have a greater influence.

      • Tim/Pat? You obviously didn’t check the data in the links. See Fig. 2 at
        http://climatesense-norpag.blogspot.com/2017/02/the-coming-cooling-usefully-accurate_17.html
        “Fig. 2 shows that Earth is past the warm peak of the current Milankovitch interglacial and has been generally cooling for the last 3,500 years.
        https://1.bp.blogspot.com/-XYWXyMlGqzg/WKMvovC9K0I/AAAAAAAAAio/8Q0DP7pXqqgRqLGaU0HGTXyY1LiikEgFACLcB/s1600/GISP2%252520TemperatureSince10700%252520BP%252520with%252520CO2%252520from%252520EPICA%252520DomeC.GIF
        Fig. 2 Greenland Ice core derived temperatures and CO2 from Humlum 2016 (8)
        The millennial cycle peaks are obvious at about 10,000, 9,000, 8,000, 7,000, 2,000, and 1,000 years before now, as seen in Fig. 2 (8), and at about 990 AD in Fig. 3 (9). Those who believe that CO2 is the main driver should note that Fig. 2 would then indicate that from 8,000 years ago to the Little Ice Age, CO2 must have been acting as a coolant.”

        • Dr Norman,

          Figure 2 shows a variety of peaks of various sizes and periods. Some of the major recent peaks are:
          1000 years ago (1000 year period)
          1600 years ago (600 year period)
          2000 years ago (400 year period)
          3300 years ago (1300 year period)
          There is no clear pattern I see. A power spectrum like in figure 6 would be very illuminating. Have you done such an analysis?
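For anyone who wants to attempt the power-spectrum analysis requested above, the basic computation is only a few lines of NumPy. The sketch below is a generic illustration on synthetic data with a built-in 1000-year cycle, not the GISP2 series or either paper's actual analysis; the sampling step, noise level and series length are all assumptions made for the demo.

```python
import numpy as np

# Synthetic proxy series: 10,000 years sampled every 10 years,
# with a 1000-year cycle buried in noise (illustrative only).
rng = np.random.default_rng(0)
dt = 10.0                                  # years per sample (assumed)
t = np.arange(0, 10000, dt)
signal = np.sin(2 * np.pi * t / 1000.0) + 0.5 * rng.standard_normal(t.size)

# Periodogram via the real FFT: power at each frequency bin.
power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)      # cycles per year

# Dominant period (skip the zero-frequency bin).
peak = 1 + np.argmax(power[1:])
print(f"dominant period ~ {1.0 / freqs[peak]:.0f} years")
```

Run on a real, evenly resampled ice-core series, the interesting question is whether any spectral peak stands far enough above the noise background to count as a cycle at all.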

          • I think the Holocene peaks mentioned above are clearly there.
            The Fig. 6A spectral analysis covers the entire Holocene, and the same spectral peak is also seen in the Miocene in Fig. 6B. As the paper says: “Kern 2012 (19) presents strong evidence for the influence of solar cycles during the Holocene and in a Late Miocene lake system. It is noteworthy that the Millennial periodicity is persistent and identifiable throughout the Holocene (Figs. 2 and 6) and in the Miocene, 10.5 million years ago (Fig. 6). The prominent unnamed Millennial peak in Fig. 6a above is also seen in Scafetta’s Fig. 10 in the C-14 data (20) and is correlated with the Eddy cycle, with a suggested period of 900 to 1050 years.” (Ref 20, Fig. 10 spectral analysis)

            Scafetta N, Milani F, Bianchini A. On the astronomical origin of the Hallstatt oscillation found in radiocarbon and climate records throughout the Holocene. Earth Sci Rev 2016; 162: 24–43.
            Looks reasonably solid to me, especially when correlated with the 1991 Millennial solar activity peak (with a 12-year +/- delay because of the thermal inertia of the oceans).

          • Dr Norman,

            1) The ~1000-year periodicity in Figure 6 is for *sunspots*. This is only vaguely related to temperature.

            2) The data for the past ~ 10,000 years (Figure 2) shows no predictable pattern for temperature spikes. There are some peaks about 1000 years apart; there are peaks closer together; there are peaks farther apart. There simply is not a strong enough pattern that can be used to predict what should happen next.

            3) Many of the peaks are much larger than the current rise seen in Figure 2. By that reasoning, there should be a large uptick yet before cooling sets in.

            4) CO2 is a wildcard. CO2 levels have dramatically and artificially risen in the past century. This alone could be enough to disrupt subtle 1000 year cycles. If nothing else, it could cause warming to offset the cooling that might have happened.

        • 1. Yes, the periodicity is related to solar activity cycles, just as I am saying. There is about a 12/13-year delay between the solar activity peak in 1991 (Fig. 3 above) and the temperature trend millennial turning point at 2003/4 (Fig. 4 in the paper link).
          2. When you have multiple variables of different wavelengths, their effects are sometimes additive, sometimes subtractive. The millennial peaks appear often enough to strongly show their existence. The 990 and 2003 millennial peaks and turning points are well marked.
          3. Yes. You didn’t notice that the ice core data end at 1813. Since then the earth has indeed warmed and is now very close to the millennial peak at 1000 +/-. See Figs. 7a and 7d in the 2017 paper.
          4. There is no indication that CO2 has much effect on temperature, but an increase in CO2 certainly “greens” the earth.
          From the 2017 paper.
          The IPCC AR4 SPM report section 8.6 deals with forcing, feedbacks and climate sensitivity. It recognizes the shortcomings of the models. Section 8.6.4 concludes in paragraph 4 (4): “Moreover it is not yet clear which tests are critical for constraining the future projections, consequently a set of model metrics that might be used to narrow the range of plausible climate change feedbacks and climate sensitivity has yet to be developed”
          What could be clearer? The IPCC itself said in 2007 that it doesn’t even know what metrics to put into the models to test their reliability. That is, it doesn’t know what future temperatures will be and therefore can’t calculate the climate sensitivity to CO2. This raises the further question of what erroneous assumptions (e.g., that CO2 is the main climate driver) went into the “plausible” models to be tested anyway. The IPCC itself has now recognized this uncertainty in estimating CS; the AR5 SPM says in Footnote 16, page 16 (5): “No best estimate for equilibrium climate sensitivity can now be given because of a lack of agreement on values across assessed lines of evidence and studies.” Paradoxically, the claim is still made that the UNFCCC Agenda 21 actions can dial up a desired temperature by controlling CO2 levels. This is cognitive dissonance so extreme as to be irrational. There is no empirical evidence which requires that anthropogenic CO2 has any significant effect on global temperatures.

  44. One more voice showing evidence of a flaw in Mr Björklund’s argument.

    Grant Foster, aka Tamino, confirms below, among others, Tim Folkerts’ opinion. You may appreciate Tamino or not, but his ability to accurately expose statistical flaws is undeniable.

    Let me quote him:

    I’ve plotted the rate according to the 6th-degree-polynomial model as a blue (rather than green) line, and again I’ve added light blue shading to indicate the uncertainty range (95% confidence interval). And since he mentioned the HadCRUT4 data go back to 1850, I’ll show all of it:

    https://tamino.files.wordpress.com/2019/11/cru_vpoly6.jpg

    Look at the uncertainty range before the year 1870 and after the year 2000. According to the 6th-degree polynomial model, the warming rate right now might be as high as 0.044 °C/yr — that’s 4.4 °C/century. Do we really have evidence that the warming rate declined since 2006?

    Of course not. The uncertainty range explodes near the ends of the time span. As I’ve mentioned in the past, this is one of the characteristic drawbacks of using high-order polynomials to approximate a long term trend. Far from the endpoints they do a great job, but their endpoint behavior is a disaster. The higher the order, the worse the disaster. Especially when estimating rates, not just values.

    We see here

    https://tamino.files.wordpress.com/2019/11/cru_v.jpg

    that there are more correct ways to show what happens.
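The endpoint behavior Tamino describes is easy to reproduce for oneself. The sketch below is my own Python illustration, not Tamino's or Bjorklund's actual computation: it generates many noisy realizations of a series with a known, constant linear trend, fits a 6th-degree polynomial to each, and compares the spread of the implied warming rate at the middle of the series with the spread at its end.

```python
import numpy as np

# Monte Carlo: fit a 6th-degree polynomial to noisy linear-trend data many
# times, then compare the spread of the fitted rate (the derivative of the
# polynomial) at the series center versus at its endpoint.
rng = np.random.default_rng(1)
years = np.arange(1850, 2020)
x = years - years.mean()      # center the years to keep the fit well-conditioned
true_rate = 0.008             # degrees C/yr, the trend built into the fake data

center_rates, end_rates = [], []
for _ in range(500):
    temps = true_rate * (years - 1850) + 0.1 * rng.standard_normal(years.size)
    deriv = np.polyder(np.poly1d(np.polyfit(x, temps, 6)))
    center_rates.append(deriv(0.0))     # rate estimate mid-series
    end_rates.append(deriv(x[-1]))      # rate estimate at the last data point

print(f"std of fitted rate at center:   {np.std(center_rates):.4f} degrees C/yr")
print(f"std of fitted rate at endpoint: {np.std(end_rates):.4f} degrees C/yr")
```

The endpoint spread comes out far larger than the mid-series spread, which is the quantitative content of "their endpoint behavior is a disaster": the data's true rate never changed, only the noise realization did.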

    *
    But beyond this inaccurate use of polynomials, I am also disturbed by this eternal global cooling claim ‘since 2016’.

    Apart from the fact that the time span since 2016 is by far too short to allow for such claims, they ignore even the recent past.

    If we look at a comparison of HadCRUT4 (surface) with UAH6.0 LT (lower troposphere), we see, of course, at the plots’ right end a clear hint of a temperature decrease:

    https://drive.google.com/file/d/16kkyvMfCvWmAJXE9U0GdGdAt8_G6atC8/view

    No doubt! But a closer look at the plots shows that the same situation is visible after the El Nino event in 1997/98.

    This becomes even more visible when we extract, for both 1997/98 and 2015/16, identical periods out of the time series and superpose them, letting both begin at zero, thereby removing the warming difference between the two events:

    1. HadCRUT:
    https://drive.google.com/file/d/1O4p-M9wTvaweGzVxauecZyRura652zkh/view

    2. UAH:
    https://drive.google.com/file/d/1y1zmzMt_1gD5jxCOH13UVYvbocYulbNz/view

    We see that within both time series, the relative anomalies increase after the decrease that usually follows a strong El Nino.

    The similarity in behavior between these two time series is striking, and should motivate us to wait a couple of years before deciding whether or not the globe is moving toward the claimed cooling.

    *
    Last but not least, these relative anomaly comparisons moreover show us that there is, as the MEI index shows anyway

    https://www.esrl.noaa.gov/psd/enso/mei/

    no reason to claim that the 2015/16 El Nino was stronger than that of 1997/98. This claim is probably due to the fact that many people confuse UAH with El Nino.

    • The numbers from the derivative are probably smaller than the possible error range. TSTM. The main take-home is not the numbers but the lack of correlation with an increasing CO2 concentration.

  45. It’s hard to imagine a more flawed analysis of a climatic time series than curve-fitting a sixth-degree polynomial to the HadCRUT monthly anomalies, followed by first-differencing and rescaling to obtain a decadal rate of change. Contrary to naive assumptions, such curve-fitting doesn’t actually “remove” any noise from the data; it indiscriminately eliminates high-frequency components in an inconsistent, nondescript way. Nor does the subsequent first-differencing produce a rationally smoothed version of the “first derivative” of actual temperature variations; this quaint metric is largely an artifact of the arbitrary curve fitting.

    The recently announced aim to make WUWT more scientifically respectable is not served well by providing, once again, a platform for sheer analytic ineptitude. Even Tamino is more credible: https://tamino.wordpress.com/2019/11/15/back-to-basic-climate-denial/#more-10979

    • I could hand-draw the trendline with a crayon and get the same results. The data are well-behaved. If you do not like the end effects, knock off 30 years on each end. I think you are picking nits. By your logic, we would forever be a few years away from an answer.

  46. Look, facts don’t matter when it comes to science. Repeat: facts don’t matter!!! Look at those Global Warmers who cite science as their authority. No, facts just don’t matter when it comes to science. “My facts trump all other facts,” and the perpetrators fully believe that. They are just as sincere as the “deniers”, but terribly misguided. Unfortunately, it will take a long time to unravel the science and produce a “consensus”, just as it did with the Piltdown Man hoax.

  47. On the question of whether the atmosphere is increasing in temperature: I give this a “possibly”.

    On the question of whether CO2 is influencing temperature: I give this a “not from the available measured data and information from the atmosphere”. If carbon dioxide is responsible, I am unable to explain how an increasing atmospheric CO2 content can 1) decrease atmospheric temperature, 2) maintain a constant atmospheric temperature, and 3) increase atmospheric temperature.

    Years         Temperature    CO2 content
                  (% change)     (% change)
    1850 – 1876   zero           +1.6
    1876 – 1878   +0.15          +0.2
    1878 – 1911   -0.20          +3.4
    1911 – 1929   +0.07          +2.0
    1929 – 1938   +0.12          +0.8
    1938 – 1976   -0.08          +7.8
    1976 – 1997   +0.22          +9.3
    1997 – 2013   zero           +9.0
    2013 – 2016   +0.10          +1.7
    2016 – 2018   -0.10          +1.0

    Observations on Carbon Dioxide
    > increases during all time periods
    > increases are not linear
    > the changes are an order of magnitude greater than the changes in Parameter A (temperature)

    Observations on Parameter A (temperature)
    > 3 time periods totalling 73 years (42% of the time) have a negative change
    > 2 time periods totalling 42 years (25% of the time) have no change
    > 5 time periods totalling 53 years (33% of the time) have a positive change
    > the various change, or no change, periods are randomly distributed over the 168 years
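As a numerical restatement of the observations above, the per-period changes in the table can be dropped into a short script to compute their linear correlation. This is my own illustration, with the numbers copied from the table; note that with only ten unevenly sized periods, a Pearson r carries little statistical weight either way, and it says nothing about lagged or cumulative effects.

```python
import numpy as np

# Per-period % changes copied from the table above (1850-2018).
temp_change = np.array([0.0, 0.15, -0.20, 0.07, 0.12, -0.08, 0.22, 0.0, 0.10, -0.10])
co2_change  = np.array([1.6, 0.2, 3.4, 2.0, 0.8, 7.8, 9.3, 9.0, 1.7, 1.0])

# Pearson correlation between the two sets of period-by-period changes.
r = np.corrcoef(temp_change, co2_change)[0, 1]
print(f"Pearson r = {r:+.3f}")
```

On these ten pairs the correlation comes out weak, which mirrors the qualitative observation that the sign of the temperature change appears unrelated to the size of the CO2 change.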

    Mean Global Annual Average Temperature from UK Meteorological Office (Hadley Centre): http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/time_series/HadCRUT.4.6.0.0.annual_ns_avg.txt

    Annual Average CO2 content from USA’s EPA: http://www.epa.gov/sites/production/files/2016-08/ghg-concentrations_fig-1.csv

    Significant corroborating evidence exists to support the observed changes in temperature. A couple of examples:
    Mt. Baker glaciers in NW Washington State, USA, reached their maximum extent of the past century in 1915, which is consistent with the cooling period from 1878 to 1911. They then retreated during the subsequent warming period. The following cooling period saw strong advancement of the glaciers. Warming after 1976 resulted in retreat. The extent of the glaciers now is about the same as in 1950.
    In the other direction, the increases during 1911 to 1938 were caused by anomalous tropical sea surface temperatures which extended warm waters into the far north Atlantic and cooler waters into the eastern Pacific Ocean. See Schubert et al. (2004), NASA. 30 of the 50 US states and 9 of the 10 Canadian provinces set record high temperatures during the 1911 to 1938 period which have never since been exceeded.

    In conclusion, atmospheric CO2 has no influence on atmospheric temperature.

    • B Sanders

      Like so many, you forget the action of the oceans, which store and release heat, CO2, or both at a time, in ways that seem arbitrary to our minds.

      That the results of GCM models that do account for this do not fit what we think they should: that, imho, is our problem.

      It makes absolutely no sense to focus solely on surface temperatures and to compare them with atmospheric concentrations of CO2.

      The first thing you should add is, for example, ocean heat content:
      http://www.data.jma.go.jp/gmd/kaiyou/english/ohc/ohc_global_en.html

        • B Sanders

          “So the oceans influence the atmosphere’s temperature, and not CO2.”

          If you say so!

          But of course that is not what should be inferred from what I wrote 🙂

  48. We know now. We always somehow knew, but it is starting to become overwhelming. I am waiting to see decision-makers in the dock, charges pressed against them for having exposed the developed parts of the world to economic doom and calamity on the basis of a scam. Politicians will not want to bear the brunt, so they will start to look for scapegoats, as they always do. Have fun watching the hyenas fighting among themselves.

  49. This figure might help to explain some of the problems with using Excel trendline equations, as well as showing six polynomials for the HadCRUT data.
    https://drive.google.com/file/d/1S84N92lhRaUezwD-IxQ-96M49IzNm8DN/view?usp=sharing
    Microsoft limits the number of digits printed in the trendline equations, just to save space, I think!
    The coefficients can be made more accurate as explained on the figure, but if the size of the text on the graph is changed it seems not to work, so increase the accuracy before enlarging the equation. The problem with using the equation is exacerbated when using large numbers, like dates, so I tend to reduce the dates by subtracting, say, 1800, as most data is post-1800.
