170 Years of Earth Surface Temperature Data Show No Evidence of Significant Warming

Author: Thomas K. Bjorklund, University of Houston, Dept. of Earth and Atmospheric Sciences

[Notice: This November 2019 post is now updated with multiple changes on 5/3/2020 to address many of the issues noted since the original posting]

Key Points

  1. From 1850 to the present, the noise-corrected average warming of the surface of the earth is less than 0.07 degrees C per decade, possibly as low as 0.038 degrees C per decade.
  2. The rate of warming of the surface of the earth does not correlate with the rate of increase of fossil fuel emissions of CO2 into the atmosphere.
  3. Recent increases in surface temperatures reflect 40 years of increasing intensities of the El Niño Southern Oscillation climate pattern.

Abstract

This study investigates relationships between surface temperatures from 1850 to the present and reported long-range temperature predictions of global warming. A crucial component of this analysis is the calculation of an estimate of the warming curve of the surface of the earth. The calculation removes errors in temperature measurements and fluctuations due to short-duration weather events from the recorded data. The results show the average rate of warming of the surface of the earth for the past 170 years is less than 0.07 degrees C per decade, possibly as low as 0.038 degrees C per decade. The rate of warming of the surface of the earth does not correlate with the rate of increase of CO2 in the atmosphere. The perceived threat of excessive future global temperatures may stem from misinterpretation of 40 years of increasing intensities of the El Niño Southern Oscillation (ENSO) climate pattern in the eastern Pacific Ocean. ENSO activity culminated in 2016 with the highest surface temperature anomaly ever recorded. The rate of warming of the earth's surface has declined 45 percent since 2006.

Introduction

The results of this study suggest the present movement to curtail global warming may be premature. Both the highest warming currents ever recorded in the Pacific Ocean and technologically advanced methods to collect ocean temperature data from earth-orbiting satellites coincidentally began in the late 1970s. This study describes how the newly acquired high-resolution temperature data and Pacific Ocean transient warming events may have convolved to produce long-range temperature predictions that are too high.

HadCRUT4 Monthly Temperature Anomalies

The HadCRUT.4.6.0.0 monthly medians of the global time series of temperature anomalies, Column 2, 1850/01 to 2019/08 (Morice et al., 2012), were used for this report, together with later monthly data to 2020/02. Only since 1979 have high-resolution satellites provided simultaneously observed data on properties of the land, ocean and atmosphere (Palmer, P. I., 2018). NOAA-6 was launched in December 1979 and NOAA-7 in 1981. Both were equipped with microwave radiometry devices (the Microwave Sounding Unit, MSU) to precisely monitor sea-surface temperature anomalies over the eastern Pacific Ocean and the areas of ENSO activity (Spencer et al., 1990). These satellites were among the first to use this technology.

The initial analyses of the high-resolution satellite data yielded a remarkable result. Spencer et al. (1990) concluded the following: "The period of analysis (1979–84) reveals that Northern and Southern hemispheric tropospheric temperature anomalies (from the six-year mean) are positively correlated on multi-seasonal time scales but negatively correlated on shorter time scales. The 1983 ENSO dominates the record, with early 1983 zonally averaged tropical temperatures up to 0.6 degrees C warmer than the average of the remaining years. These natural variations are much larger than that expected of greenhouse enhancements and so it is likely that a considerably longer period of satellite record must accumulate for any longer-term trends to be revealed."

Karl et al. (2015) claim that the previous 18 years of stable global temperatures reflect the use of biased ocean buoy-based data. Karl et al. state that a "bias correction involved calculating the average difference between collocated buoy and ship SSTs. The average difference globally was −0.12°C, a correction that is applied to the buoy SSTs at every grid cell in ERSST version 4." This analysis is not consistent with the interpretation of an 18-year pause in global warming. The discussion below of the first derivative of a temperature anomaly trendline shows that the rate of increase of relatively stable, nearly noise-free temperatures peaked in 2006 and has declined to the present.

The following is a summary by McKitrick (2015) of the conclusions of Karl et al. (2015) (called K15 below): "All the underlying data (NMAT, ship, buoy, etc.) have inherent problems and many teams have struggled with how to work with them over the years. The HadNMAT2 data are sparse and incomplete. K15 take the position that forcing the ship data to line up with this dataset makes them more reliable. This is not a position other teams have adopted, including the group that developed the HadNMAT2 data itself. It is very odd that a cooling adjustment to SST records in 1998-2000 should have such a big effect on the global trend, namely wiping out a hiatus that is seen in so many other data sets, especially since other teams have not found reason to make such an adjustment. The outlier results in the K15 data might mean everyone else is missing something, or it might simply mean that the new K15 adjustments are invalid."

Mears and Wentz (2016) discuss adjustments to satellite data and their new dataset, which "shows substantially increased global-scale warming relative to the previous version of the dataset, particularly after 1998. The new dataset shows more warming than most other middle tropospheric data records constructed from the same set of satellites." The discussion below shows that the rate of increase of the warming curve of the earth has been slowing since July 1988; that is, the curve is concave downward. Based on this observation alone, their new dataset should not show "substantially increased global-scale warming."

Analysis of Temperature Anomalies-Case 1

All temperature measurements used in this study are calculated temperature anomalies, not absolute temperatures. A temperature anomaly is the difference between the measured absolute temperature and a baseline average temperature; in this case, the average annual mean temperature from 1961 to 1990. This conversion is intended to minimize effects on temperatures related to the location of the measurement station (e.g., in a valley or on a mountain top) and to allow better recognition of regional temperature trends.
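For readers who want to reproduce this step, a minimal sketch of the anomaly calculation in Python (the station record and function name are hypothetical; real datasets such as HadCRUT4 compute a separate baseline for each calendar month to remove the seasonal cycle):

```python
import numpy as np

def anomalies(years, temps_c, base_start=1961, base_end=1990):
    """Anomaly = measured temperature minus the baseline-period mean."""
    years = np.asarray(years)
    temps_c = np.asarray(temps_c, dtype=float)
    in_base = (years >= base_start) & (years <= base_end)
    return temps_c - temps_c[in_base].mean()

# Hypothetical station record: absolute annual means in degrees C.
yrs = np.arange(1950, 2020)
temps = 14.0 + 0.01 * (yrs - 1950)
print(anomalies(yrs, temps)[:3])
```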

In Figure 1, the black curve is a plot of monthly mean surface temperature anomalies. The jagged character of the black temperature anomaly curve is data noise (inaccuracies in measurements and random, short-term weather events). The red curve is an Excel sixth-degree polynomial best-fit trendline of the temperature anomalies. The curve-fitting process removes high-frequency noise. The green curve, the first derivative of the trendline, is the single most important curve derived from the global monthly mean temperature anomalies. It is a time series of the month-to-month differences in mean surface temperatures in units of degrees Celsius change per month. These very small numbers are multiplied by 120 to convert the units to degrees per decade (left vertical axis of the graph). Degrees per decade is a measure of the rate at which the earth's surface is cooling or warming; it is sometimes referred to as the warming (or cooling) curve of the surface of the earth. The green curve temperature values are close to the values of noise-free troposphere temperature estimates determined at the University of Alabama in Huntsville for single points (Christy, J. R., May 8, 2019). The green curve has not previously been reported and adds a new perspective to analyses of long-term temperature trends.
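A minimal sketch of this trendline-and-derivative construction, with numpy standing in for Excel's polynomial fit (the anomaly series below is synthetic; substitute the 2,032 HadCRUT4 monthly medians to reproduce the curves):

```python
import numpy as np
from numpy.polynomial import Polynomial

# Synthetic stand-in for the 2,032 monthly anomaly values (deg C).
rng = np.random.default_rng(0)
months = np.arange(2032.0)                  # month index, 1850/01 onward
anoms = 0.0006 * months + 0.1 * rng.standard_normal(months.size)

fit = Polynomial.fit(months, anoms, deg=6)  # sixth-degree trendline (red curve)
trend = fit(months)                         # smoothed anomalies
slope = fit.deriv()(months)                 # trendline slope, deg C per month
warming = slope * 120.0                     # green curve: deg C per decade
print(f"average rate: {warming.mean():.3f} deg C/decade")
```

numpy's Polynomial.fit rescales the x-axis internally, which avoids the numerical conditioning problems a raw sixth-degree fit over 2,032 points would otherwise have.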

Figure 1. The black curve is the HadCRUT4 time series of the mean monthly global land and sea surface temperature anomalies, 1850-present. Anomalies are deviations from the 1961-1990 annual mean temperatures in degrees Celsius. The red curve is the trendline of the HadCRUT4 data set, an Excel sixth-degree polynomial best fit of the temperature anomalies. The green curve is the first derivative of the trendline converted from units of degrees C per month to degrees C per decade, that is, the slope of the trendline curve. The green curve is the warming curve of the surface of the earth. The two solid blue circles are warming values of the troposphere calculated at the University of Alabama in Huntsville from global energy balance studies.

In a recent talk, John Christy, director of the Earth System Science Center at the University of Alabama in Huntsville, reported estimates of noise-free warming of the troposphere in 1994 and 2017 of 0.09 and 0.095 degrees C per decade, respectively (Christy, J. R., May 8, 2019). These values were estimated from global energy balance studies of the troposphere by Christy and McNider, using 15 years of newly acquired global satellite data in 1994 (Christy, J. R., and R. T. McNider, 1994) and a repeat of the 1994 study in 2017 with nearly 40 years of satellite data (Christy, J. R., 2017). From this work, using the two points derived from the 1994 and 2017 data, they concluded that warming of the troposphere over the last 40 years approximates a straight line with a slope of 0.095 degrees C per decade. They call this curve the "tropospheric transient climate response," that is, "how much temperature actually changes due to extra greenhouse gas forcing."

The green curve in Figures 1 and 3 could be called the earth surface transient climate response, after Christy, although some longer-wavelength noise from 40 years of intense ENSO activity remains in the data. The 2017 average value for the green curve is 0.154 degrees C per decade; this value is 0.059 degrees per decade higher than the UAH estimate for the troposphere. The latest value, in February 2020, for the green curve is 0.117 degrees C per decade. The average value of earth warming based on the green curve over the 2,032 months since 1850 is 0.068 degrees C per decade. The average from 1850 through 1979, the beginning of the most recent ENSO, is 0.038 degrees C per decade, a value too small to measure.

A warming rate of 0.038 degrees C per decade would need to increase or decrease significantly to support a prediction of a long-term change in the earth's surface temperature. If the earth's surface temperature increased continuously starting today at a rate of 0.038 degrees C per decade, in 100 years the increase in the earth's temperature would be only about 0.4 degrees C, which is not indicative of a global warming threat to humankind.
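The arithmetic behind that 100-year projection is simply

$$0.038\ \tfrac{^{\circ}\mathrm{C}}{\text{decade}} \times 10\ \text{decades} = 0.38\ ^{\circ}\mathrm{C} \approx 0.4\ ^{\circ}\mathrm{C}.$$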

The 0.038 degrees C per decade estimate is likely beyond the accuracy of the temperature measurements. Recent statistical analyses conclude that 95% uncertainties of global annual mean surface temperatures range from 0.05 to 0.15 degrees C over the past 140 years; that is, 95 measurements out of 100 are expected to fall within the range of the uncertainty estimates (Lenssen, N. J. L., et al., 2019). Very little measurable warming of the surface of the earth occurred from 1850 to 1979.

In Figure 2, the green curve is the warming curve; that is, a time series of the rate of change of the temperature of the surface of the earth in degrees per decade. The blue curve is a time series of fossil fuel emissions of CO2 in units of million metric tons of carbon. The green curve is generally level from 1900 to 1979 and then rises slightly, owing to lower-frequency noise remaining in the temperature anomalies from 40 years of ENSO activity. The warming curve has declined from the early 2000s to the present. Carbon emissions increased steadily from 1943 to 2019. There is no correlation between rising CO2 emissions and the relatively stable, low rate of warming of the surface of the earth from 1943 to 2019.

Figure 2. The green curve is the first derivative of the trendline converted to units of degrees C per decade, that is, the rate of change of the surface temperature of the earth. See Figure 1 for the same curve along with the temperature anomalies curve and the trendline curve. The blue dotted curve, showing total carbon emissions from fossil fuels, is modified from Boden, T. A., et al. (2017); the time frame shows only emissions since 1900. There is no correlation between the increase in carbon emissions and the surface temperature of the earth.

In Figure 3, the December 1979 temperature spike (Point A) is associated with a weak El Niño event. During the following 39 years, several strong to very strong El Niño events (single temperature spikes in the curve) were recorded; the last, in February 2016, produced the highest mean global monthly temperature anomaly ever recorded, 1.111 degrees C (Golden Gate Weather Services, 2019). Since then, monthly global temperature anomalies declined over 23 percent, to 0.990 degrees C in February 2020, as the ENSO decreased in intensity.

Figure 3. An enlarged portion of Figure 1 from 1963 to 2019 with vertical scale enlarged more than the horizontal scale to emphasize important changes in the shape of the green curve.

Points A, B and C mark very significant changes in the shape of the green warming curve (left vertical axis).

  1. The green curve values increased each month from 0.088 degrees C per decade in December 1979 (Point A) to 0.136 degrees C per decade in July 1988 (Point B); this is an increase of roughly 55 percent in the rate of warming in nearly 9 years. The warming curve is concave upward. Point A marks a weak El Niño and the beginning of increasing ENSO intensities.
  2. From July 1988 to September 2006, the rate of warming increased from 0.136 degrees C per decade to 0.211 degrees C per decade (Point C); this is a 55 percent increase in 18 years, but about half the annualized rate of the previous 9 years, because the month-to-month rate of increase was falling. The July 1988 point on the x-axis is an inflection point at which the warming curve becomes concave downward.
  3. September 2006 (Point C) marks a very strong El Niño and the peak of the nearly 40-year ENSO transient warming trend, imparting a lazy-S shape to the green curve. The rate of warming has declined every month since peaking at 0.211 degrees per decade in September 2006, reaching 0.117 in February 2020; this is nearly a 45 percent decrease in 14 years. When the green curve reaches zero on the left vertical axis, the absolute temperature of the surface of the earth will begin to decline, and the derivative of the red curve will turn negative. The earth will be cooling. That point could be reached within the next decade.

The premise of this analysis is that the rate of increase of surface temperatures over most of the past 40 years reflects the effects of the largest ENSO ever recorded, a transient climate event. Since September 2006 (Figure 3), the rate of increase in surface temperatures has slowly decreased as the intensity of the ENSO has decreased. The derivative of the red temperature trendline, that is, the green curve, does not remove all transient noise from the past 40 years of ENSO activity; the curve shows a slight increase and decrease in rates of change of temperatures during that period. Nevertheless, the continuous slowing of the rate of increase in surface temperatures since September 2006 is highly significant and should be accounted for in long-term earth temperature forecasts.

Analysis of Temperature Anomalies-Case 2

Scientists at NASA’s Goddard Institute for Space Studies (GISS) updated their Surface Temperature Analysis (GISTEMP v4) on January 14, 2020 (https://www.giss.nasa.gov/). To help validate the methodology of the HadCRUT4 earth temperature data analysis described above, a similar analysis was carried out using NASA earth temperature data and NASA’s derivation of the temperature trendline.

Figure 4 is comparable to Figure 1 but is derived from a different data set. The NASA land-ocean temperature anomaly data extend from 1880 to the present, with a 30-year base period of 1951-1980. The solid black line in Figure 4 is the global annual mean temperature, and the solid red line (the trendline through the temperature data) is the five-year Lowess smooth. Lowess smoothing (https://www.statisticshowto.datasciencecentral.com/lowess-smoothing/) creates a smooth line through a scatter plot to determine a trend, as does the Excel sixth-degree polynomial best-fit method.
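A minimal sketch of a comparable LOWESS trendline using statsmodels (the annual series is a synthetic stand-in for GISTEMP v4; frac is chosen so the local window spans roughly five years, matching NASA's five-year smooth):

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Synthetic stand-in for the annual GISTEMP v4 anomalies (deg C).
rng = np.random.default_rng(1)
years = np.arange(1880, 2020)
anoms = 0.007 * (years - 1880) + 0.1 * rng.standard_normal(years.size)

frac = 5.0 / years.size                  # ~5-year smoothing window
trend = lowess(anoms, years, frac=frac, return_sorted=False)

# Decadal warming rate from finite differences of the smoothed curve,
# analogous to the green curve's derivative in Figure 4.
rate = np.gradient(trend, years) * 10.0  # deg C per decade
```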

Figure 4. The black curve is the NASA GISTEMP v4 time series of the mean annual global land and sea surface temperature anomalies, 1880-present. Anomalies are deviations from the 1951-1980 annual mean temperatures in degrees C. The red curve is the five-year Lowess smooth, the best fit of the temperature anomalies. The green curve is the first derivative of the trendline converted from units of degrees C per year to degrees C per decade, that is, the slope of the trendline curve. The green curve is the warming curve of the surface of the earth. The two solid blue circles locate single-point warming values of the troposphere calculated at the University of Alabama in Huntsville from global energy balance studies.

The director of the Earth System Science Center at the University of Alabama in Huntsville reported estimates of noise-free warming of the troposphere in 1994 and 2017 of 0.09 and 0.095 degrees C per decade, respectively, located by the solid blue circles on Figure 1 (Christy, J. R., May 8, 2019). The rates of warming of the earth's surface estimated from the derivatives of the red temperature trendlines shown on Figure 1 and Figure 4 are 0.078 (the average of 170 years of data) and 0.068 (the average of 138 years of data) degrees C per decade, respectively. These estimates are probably too high, because not all of the noise from several decades of strong ENSO activity has been removed from the raw temperature data. Before the beginning of the current ENSO in 1979, the average rate of warming from 1850 through 1979 estimated from the derivatives of the red temperature trendlines is 0.038 degrees C per decade, possibly the best estimate of the long-term rate of warming of the earth.

Truth and Consequences

The "hockey stick graph," frequently cited by the media over the past 20 years as evidence of out-of-control global warming, is not supported by the current temperature record (Mann, M., Bradley, R. and Hughes, M., 1998). The graph is no longer seen in the print media.

None of the 102 climate models of the mid-troposphere mean temperature comes close enough to predicting future temperatures to warrant drastic changes in environmental policies. The models start in the 1970s, at the beginning of a period that culminated in the strongest ENSO ever recorded, and by 2015, less than 40 years later, the average temperature predicted by the models is nearly 2.4 times the observed global tropospheric temperature anomaly for 2015 (Christy, J. R., May 8, 2019). The true story of global climate change has yet to be written.

The peak surface warming during the ENSO was 0.211 degrees C per decade in September 2006. The highest global mean surface temperature anomaly ever recorded was 1.111 degrees C in February 2016. These occurrences are possibly related to the increased quality and density of ocean temperature data from the two earth-orbiting MSU satellites described previously, rather than indicative of a significant long-term increase in the warming of the earth. Earlier high-intensity ENSO events may not have been recognized because of the absence of advanced satellite coverage over the oceans.

The use of a temperature trendline to remove high-frequency noise did not eliminate the transient effects of the longer-wavelength components of ENSO warming over the past 40 years, so estimates of the rate of warming for that period in this study still include background noise from the ENSO. A noise-free signal for the past 40 years probably lies closer to 0.038 degrees C per decade, the average rate of warming from 1850 to the beginning of the ENSO in 1979, than to the average rate from 1979 to the present, 0.168 degrees C per decade. The higher number includes uncorrected residual ENSO effects.

Foster and Rahmstorf (2011) used average annual temperatures from five data sets to estimate average earth warming rates from 1979 to 2010. The noise removed from the raw mean annual temperature data is attributed to ENSO activity, volcanic eruptions and solar variations; the result is said to be a noise-adjusted temperature anomaly curve. The average warming rate of the five data sets over 32 years is 0.16 degrees C per decade, compared with 0.17 degrees C per decade determined by this study from 384 monthly points derived from the derivative of the temperature trendline. Foster and Rahmstorf (2011) assume the warming trend is linear based on one averaged estimate, and their data cover only 32 years. Thirty years is generally considered the minimum period needed to define one point on a trend; this 32-year period includes the highest-intensity ENSO ever recorded and is not long enough to define a trend. The warming curve in this study is curvilinear over nearly 170 years (green curve in Figures 1 and 3) and is defined by 2,032 monthly points derived from the temperature trendline derivative.

From 1979 to 2010, the rate of warming ranges from 0.08 to 0.20 degrees C per decade. That trend is not linear.

Conclusions

The perceived threat of excessive future temperatures may stem from an underestimation of the unusually large effects of the recent ENSO on natural global temperature increases. Nearly 40 years of natural, transient warming from the largest ENSO ever recorded may have been misinterpreted to include significant warming due to anthropogenic activities. All warming estimates are theoretical and too small to measure. These facts are indisputable evidence that global warming of the planet is not a future threat to humankind.

The scientific goal must be to narrow the range of uncertainty of predictions with better data and better models before prematurely embarking on massive infrastructure projects. A rational environmental protection program and a vibrant economy can co-exist. The challenge is to allow scientists the time and freedom to work without interference from special interests. We have the time to get the science of climate change right. This is not the time to embark on grandiose projects to save humankind, when no credible threat to humankind has yet been identified.

Acknowledgments and Data

All the raw data used in this study can be downloaded from the HadCRUT4 and NOAA websites: http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/series_format.html and https://research.noaa.gov/article/ArtMID/587/ArticleID/2461/Carbon-dioxide-levels-hit-record-peak-in-May

References

  1. Boden, T. A., Marland, G., and Andres, R. J. (2017). National CO2 Emissions from Fossil-Fuel Burning, Cement Manufacture, and Gas Flaring: 1751-2014. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy. doi:10.3334/CDIAC/00001_V2017.
  2. Christy, J. R., and McNider, R. T. (1994). Satellite greenhouse signal. Nature, 367, 325.
  3. Christy, J. R. (2017). Lower and mid-tropospheric temperature [in State of the Climate in 2016]. Bull. Amer. Meteor. Soc., 98, 16. doi:10.1175/2017BAMSStateoftheClimate.1.
  4. Christy, J. R. (May 8, 2019). The Tropical Skies: Falsifying Climate Alarm. Press release, Global Warming Policy Foundation. https://www.thegwpf.org/content/uploads/2019/05/JohnChristy-Parliament.pdf
  5. Foster, G., and Rahmstorf, S. (2011). Global temperature evolution 1979-2010. Environ. Res. Lett., 6, 044022.
  6. Goddard Institute for Space Studies. https://www.giss.nasa.gov/
  7. Golden Gate Weather Services (Apr-May-Jun 2019). El Niño and La Niña Years and Intensities. https://ggweather.com/enso/oni.htm
  8. HadCRUT4 dataset. http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/series_format.html
  9. Karl, T. R., Arguez, A., Huang, B., Lawrimore, J. H., McMahon, J. R., Menne, M. J., et al. (2015). Possible artifacts of data biases in the recent global surface warming hiatus. Science, 348(6242), 1469-1472. http://www.sciencemag.org/content/348/6242/1469.full
  10. Lenssen, N. J. L., Schmidt, G. A., Hansen, J. E., Menne, M. J., Persin, A., Ruedy, R., et al. (2019). Improvements in the GISTEMP uncertainty model. Journal of Geophysical Research: Atmospheres, 124, 6307-6326. https://doi.org/10.1029/2018JD029522
  11. Lowess smoothing. https://www.statisticshowto.datasciencecentral.com/lowess-smoothing/
  12. Mann, M., Bradley, R., and Hughes, M. (1998). Global-scale temperature patterns and climate forcing over the past six centuries. Nature, 392(6678), 779-787.
  13. McKitrick, R. (2015). A First Look at 'Possible artifacts of data biases in the recent global surface warming hiatus' by Karl et al., Science, 4 June 2015. Department of Economics, University of Guelph. http://www.rossmckitrick.com/uploads/4/8/0/8/4808045/mckitrick_comms_on_karl2015_r1.pdf
  14. Mears, C., and Wentz, F. (2016). Sensitivity of satellite-derived tropospheric temperature trends to the diurnal cycle adjustment. J. Climate. doi:10.1175/JCLI-D-15-0744.1. http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-15-0744.1?af=R
  15. Morice, C. P., Kennedy, J. J., Rayner, N. A., and Jones, P. D. (2012). Quantifying uncertainties in global and regional temperature change using an ensemble of observational estimates: The HadCRUT4 dataset. Journal of Geophysical Research, 117, D08101. doi:10.1029/2011JD017187.
  16. NOAA Research News (June 4, 2019). Carbon dioxide levels hit record peak in May. https://research.noaa.gov/article/ArtMID/587/ArticleID/2461/Carbon-dioxide-levels-hit-record-peak-in-May
  17. Palmer, P. I. (2018). The role of satellite observations in understanding the impact of El Niño on the carbon cycle: current capabilities and future opportunities. Phil. Trans. R. Soc. B, 373, 20170407. https://royalsocietypublishing.org/doi/10.1098/rstb.2017.0407
  18. Perkins, R. (2018). New climate model to be built from the ground up. Caltech News. https://www.caltech.edu/about/news/new-climate-model-be-built-ground-84636
  19. Spencer, R. W., Christy, J. R., and Grody, N. C. (1990). Global Atmospheric Temperature Monitoring with Satellite Microwave Measurements: Method and Results 1979-84. Journal of Climate, 3(10), 1111-1128.


Comments
November 15, 2019 4:16 am

That low CO2 at the start is an estimate of what mankind was emitting at the time. Add natural emissions and you get to about 180 ppm. But that 180 ppm is also questionable, because thousands of chemical analyses of ambient levels in those early days often gave higher results. Of course, some were sampled in places with locally high CO2, but there remains doubt about how useful that 180 ppm is. Geoff S

gudoLamoto
November 15, 2019 4:31 am

If we use HRs/month as our metric for determining the HR production anomaly for baseball players, then every general manager in the league must go into a panic when their stars' numbers fall from +6 in July & Aug to -6 in Dec & Jan. Maybe HRs/yr is a more appropriate measure…..I rather doubt that the gene frequency in the pre-Columbian bison herd changed much from decade to decade or even from century to century. A millennial time scale would be more appropriate to measure evolution compared to climate….And why have the MasterMinds determined that a 30 yr average defines "climate" when there's a rather obvious 60 yr undulation in the extant temp records?….What's "normal" for temps & climate? Shouldn't we be using a 100 or 500 yr average at a minimum for determining a meaningful anomaly?

Rod Evans
November 15, 2019 4:38 am

Sitting here in the UK midlands, can I make a request for more global warming, to be delivered as soon as possible. We don't have snow cover just yet, but all the indications are that it is coming. We have all the cold and all the water we are going to need for the next year at least. The oft-claimed global warming just never seems to show up in the weather patterns we are getting.
Thankfully the evidence is, we are about 0.8 deg C warmer than we were back in the middle of the 1800s, just before the American Civil War. I hope there is no connection between those cold climate times and the anger of man…
Where is all the warmth we need?

November 15, 2019 4:46 am

The decrease in the rate of warming since about 1994 is in the data. It is not an artifact. It just turned negative (i.e. cooling) after 2016.

https://wattsupwiththat.com/2019/02/06/the-planet-is-no-longer-warming/

Since this is a 26-year trend in the decrease in the rate of warming, it will become very obvious to all in a few years, once we get away from the 2015 El Niño that temporarily affected it.

Ian Wilson
November 15, 2019 4:48 am

The following graph shows the flux optical depth anomaly (expressed as a percentage) for the Earth’s atmosphere between 1948 and 2007. It turns out that this anomaly is a rough measure of the total column density of water vapor in the Earth’s atmosphere from year to year over this time period.

[image]

The following graph is a comparison between the polar Fast Fourier Transform (FFT) of the flux optical depth anomaly between 1964 and 2001 and a periodogram of the ENSO/SOI over the same time period.
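For readers unfamiliar with the technique, a minimal sketch of how such a periodogram is computed from a monthly series (the data here are synthetic, with a 3.6-year cycle planted to show how a peak appears; not the SOI or optical-depth record itself):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 456                                  # 38 years of monthly values, 1964-2001
t = np.arange(n) / 12.0                  # time in years
x = np.sin(2 * np.pi * t / 3.6) + 0.5 * rng.standard_normal(n)

x = x - x.mean()                         # remove the mean before transforming
power = np.abs(np.fft.rfft(x)) ** 2      # periodogram (power spectrum)
freq = np.fft.rfftfreq(n, d=1.0 / 12.0)  # frequency axis in cycles per year
peak = freq[1:][np.argmax(power[1:])]    # skip the zero-frequency bin
print(f"dominant period: {1.0 / peak:.1f} years")
```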

[image]

[N. Sidorenkov, Astronomy Reports, Vol. 44, No. 6, 2000, pp 414 – 419, translated from Astronomischeskii Zhurnal, Vol. 77, No. 6, 2000, pp 474 – 480]

Remarkably, the 6.2 (& 6.0), 4.8, 3.6, 2.4, and 2.1-year periodicities in the ENSO/SOI periodogram of Sidorenkov (2000) are also clearly evident in the FFT of the flux optical depth anomaly data.

Four of these six long-term periodicities (i.e. 2.4, 3.6, 4.8, and 6.0 years) are sub-harmonics of the 1.2 year period of the Earth’s free nutation i.e. the Chandler Wobble. In addition, all six of the long-term periodicities are very close to the super-harmonics of the 18.6 year period of the Earth’s forced nutation (i.e. 6.2, 4.7, 3.7, 2.3, and 2.1 years) i.e. the periodic precession of the line-of-nodes of the Lunar orbit.

This data tells us that the ENSO must play a major role in setting the overall column density of water vapor in the Earth's atmosphere. In addition, it indicates that the ENSO must also be an important factor in setting the world's mean temperature, since water vapor is the dominant greenhouse gas in the Earth's atmosphere. El Niño/La Niña may also play an important role in changing the Earth's albedo via its effects upon the overall amounts of regional low- and high-level cloud.

What is even more remarkable is the fact that the common frequencies seen in the two data sets are simply those that would be expected if the ENSO phenomenon were the resonant response of the Earth's (atmospheric/oceanic) climate system brought about by a coupling between the Earth's forced (18.6-year Nodical Lunar Cycle) and unforced (1.2-year Chandler Wobble) nutations.

Support for this hypothesis is provided by the following paper:

Wilson, I.R.G. and Sidorenkov, N.S., 2019, A Luni-Solar Connection to Weather and Climate III: Sub-Centennial Time Scales, The General Science Journal, 7927

https://www.gsjournal.net/Science-Journals/Research%20Papers-Astrophysics/Download/7927

This paper shows that the variations in the rate of change of the smoothed HadCRUT4 temperature anomalies closely follow a "forcing" curve that is formed by the simple sum of two sinusoids, one with a 9.1-year period, which matches that of the lunar tidal cycle, and the other with a 10.1469-year period, which matches that of half the Perigean New/Full moon cycle. This is precisely what you would expect if the natural periodicities associated with the Perigean New/Full moon tidal cycles were driving the observed changes in the world mean temperature anomaly on decadal time scales [between 1850 and 2017].
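A minimal sketch of the two-sinusoid "forcing" curve described above (only the two periods, 9.1 and 10.1469 years, come from the comment; the amplitudes and phases here are placeholders):

```python
import numpy as np

def lunar_forcing(years, a1=1.0, a2=1.0, phi1=0.0, phi2=0.0):
    """Sum of two sinusoids with periods of 9.1 and 10.1469 years."""
    return (a1 * np.sin(2.0 * np.pi * years / 9.1 + phi1)
            + a2 * np.sin(2.0 * np.pi * years / 10.1469 + phi2))

t = np.arange(1850.0, 2018.0, 1.0 / 12.0)  # monthly steps, 1850-2017
forcing = lunar_forcing(t)                 # compare against dT/dt of HadCRUT4
```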

[image]

Reply to  Ian Wilson
November 15, 2019 5:21 am

This data tells us that the ENSO must play a major role in setting the overall column density of water vapor in the Earth’s atmosphere.

That is not surprising. El Niño implies a huge transfer of energy from the ocean to the atmosphere and the vector is water vapor.

What is even more remarkable, is the fact that common frequencies seen in the two data sets are simply those that would be expected if ENSO phenomenon was the resonant response of the Earth’s (atmospheric/oceanic) climate system brought about by a coupling between the Earth’s forced (18.6-year Nodical Lunar Cycle) and unforced (1.2-year Chandler Wobble) nutations.

The problem with that theory is that ENSO frequency has been changing during the Holocene, with El Niño being almost absent during the Holocene Climate Optimum, while the Moon and the Earth's axis were doing the same thing they do now. While it is not impossible that ENSO responds to changes in Earth's movements, it is clear that other factors play an even more important role.

Ian Wilson
Reply to  Javier
November 15, 2019 7:51 am

Javier,
Your objections to the lunar influence on the initiation of El Niño events are no longer valid. I cannot fault you, since you are not fully aware of all the recent findings. However, you will be seeing a series of papers coming out over the coming year that will place this connection on a more solid foundation.

The lunar Perigean New/Full tidal cycles naturally produce Gleissberg (~88 years) and de Vries (208 years) cycles. These cycles are evident in some recent climate records such as the South American Monsoon.

In addition, they are clearly evident in geological stratigraphy data that is over 90 million years old (after making some allowance for the slow drift of the Moon away from the Earth due to tidal torquing).

Kevin Kilty
November 15, 2019 5:02 am

Figure 2 is a bit of a mystery. Unless someone else does this beforehand, I will get to Boden (2017) when I reach my office later this morning, and try to determine how the blue, dotted curve in Figure 2 was constructed.

There are all sorts of things I might suggest looking at other than the high-order polynomial. A Kalman filter, for instance, or some adaptive filter.

Reply to  Kevin Kilty
November 15, 2019 5:31 am

As explained above, figure 2 is no mystery at all. It is just emissions expressed as ppm.

Kevin Kilty
Reply to  Javier
November 15, 2019 6:30 am

The title and caption do not match and the data are apparently some mixed items. See my post below.

Kevin Kilty
November 15, 2019 5:20 am

Curiosity got the best of me and so I had a look at Boden's data while sitting at breakfast. The full mystery of Figure 2 will not be cleared up by examination of this data (for example, if Figure 2 shows additions to global CO2, then why plot the final data point, which is obviously the current level and another thing entirely?). However, Boden's data is CO2 emissions per country, all of which start reporting at different times; some, like nations in the old Soviet orbit, do not begin to report until 1992 and start with large values. Thus, totals built from that data have a low bias in the past. Another instance of an estimate that should include some error bars.

Reply to  Kevin Kilty
November 15, 2019 9:57 am

The Boden et al., 2017 dataset has three Excel documents: one global, one regional and one national. The global one contains their estimate of carbon emissions from 1751 to 2014. I have plotted that data in GtCO2 from 1900, updated to 2018, and it looks exactly like the figure 2 blue curve, except for 2014-18, which is not included, and that last point, which does not belong to the dataset.

And you can't blame the author of this article for the Boden et al. estimates. It is, after all, the most accepted and used dataset for emissions.

Kevin Kilty
Reply to  Javier
November 16, 2019 2:08 pm

I don't blame the author for anything, but it is his problem that he mislabeled the graph (both the title and the vertical axis) against the text of the caption, and mixed data within the graph itself; i.e., the last data point in the set should have been clearly identified with a different symbol. Perhaps a paper with such a remarkable claim should have taken a little more care in presenting its central evidence.

Tom Bjorklund
Reply to  Kevin Kilty
November 17, 2019 10:56 pm

I agree entirely. Carelessness. It has been fixed.


November 15, 2019 5:29 am

None of 102 climate models of the mid-troposphere mean temperature comes close enough to predicting future temperatures to warrant changes in environmental policies.

Unfortunately the tipping point was passed some time ago, and it is this: “Climate Change” has achieved “Too Big To Fail” status.

John Endicott
Reply to  steve case
November 15, 2019 10:32 am

and fail the climate cult must if civilization is to survive.

Reply to  steve case
November 16, 2019 8:49 am

Steve case — yes, I’m afraid our future is more & more “orange vest” events, eventually becoming world-wide. The elites/bureaucrats against the common people.

Tom Abbott
November 15, 2019 7:07 am

From the article: “The highest global mean surface temperature ever recorded was 1.111 degrees C in February 2016”

So the Earth reached a temperature 1.111C above the estimated average global temperature since 1850, in February 2016.

The UN IPCC and associated alarmists used to say humanity had to limit the increase in global temperatures to 2C above the global average if we are to avoid catastrophic global warming effects.

In the intervening years after the 2C limit claim was made, the global temperatures did not rise the way the IPCC thought they would, so in order to continue the climate change crisis, they have now lowered the point at which humanity will face a global warming crisis to 1.5C above the estimated global average temperature since 1850.

But since Feb. 2016, the temperatures have cooled by about 0.4C, so the Earth is heading away from the 1.5C crisis point, not towards it. And that is while CO2 continues to increase. Good news for Greta! She can relax. The Earth is not heading into a disaster.

Btw, 2016, the “hottest year evah!” was only 0.1C warmer than 1998, a statistical tie, and in the United States, 1934 was 0.4C warmer than 2016, according to Hansen 1999, and the UAH satellite chart.

Hansen 1999:

[image]

UAH satellite chart:

http://www.drroyspencer.com/wp-content/uploads/UAH_LT_1979_thru_October_2019_v6.jpg

Randall L Hunt
November 15, 2019 7:15 am

Interesting stuff. I’m a geophysicist in the energy biz, and not convinced by the catastrophic warming narrative out there. These kinds of studies are critical in countering that false narrative, but it would be better for them to appear in peer-reviewed publications, because otherwise, warming zealots have an excuse to discount them. Is this study published in a peer-reviewed journal somewhere? If not, why not, or are there plans to publish? Thanks

Randall

November 15, 2019 8:09 am

I agree with several others here about the inadvisability of fitting a sixth-order polynomial to such data. It's a mistake that new grad students frequently make and that their advisors have to correct; taking the derivative of the fitted curve is particularly egregious. Check the following for information on this:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.569.2504&rep=rep1&type=pdf
E.g. “Results based on high order polynomial regressions are sensitive to the order of the polynomial. Moreover, we do not have good methods for choosing that order in a way that is optimal for the objective of a good estimator for the causal effect of interest. Often researchers choose the order by optimizing some global goodness of fit measure, but that is not closely related to the research objective of causal inference”.
“Based on these arguments we recommend that researchers not use such methods, and instead control for local linear or quadratic polynomials or other smooth functions.”
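The endpoint sensitivity at issue is easy to demonstrate numerically. A minimal sketch with synthetic data (not the article's actual series): fit the same noisy record with polynomials of neighboring degrees and compare the implied rates at the final month.

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(3)
t = np.arange(2032.0)                      # month index
y = 0.0006 * t + 0.1 * rng.standard_normal(t.size)

for deg in (5, 6, 7):
    fit = Polynomial.fit(t, y, deg=deg)
    end_rate = fit.deriv()(t[-1]) * 120.0  # deg C per decade at the last month
    print(f"degree {deg}: end-of-series rate = {end_rate:+.3f} deg C/decade")
```

The spread in the printed rates across degrees illustrates the objection: the derivative of a high-order polynomial fit is least constrained near the ends of the record.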

Tom Bjorklund
Reply to  Phil.
November 16, 2019 10:28 am

The data points are well-behaved. I assert that I could fit the data by hand with a crayon and realize essentially the same results. I have not taken the time to do this and will not.

Reply to  Tom Bjorklund
November 19, 2019 11:56 am

Your assertion is incorrect. I suggest you plot your sixth-order curve including the error bars, and then plot the derivative curve with its error bars; you'll find significant deviations, particularly near the ends of the curve.

Coach Springer
November 15, 2019 8:11 am

Trends are relevant to the future until they aren’t. This study seems robust enough to question alarmist predictions, regardless of its own predictive value.

Ian Sloan
November 15, 2019 9:14 am

Can you imagine how happy people would be if they moved the arbitrary 0.0 degrees anomaly line up just 0.6 degrees? They would be happily looking back at the poor buggers who had to endure all those freezing conditions in just "wolfskins, and no internet" back in the 1940's, safe in the knowledge that they had "cured" the planet of hypothermia.

November 15, 2019 9:23 am

I hate to keep bringing up uncertainty and significant digits as a problem, but it is scary to see people misrepresent data. For at least a hundred years, temperatures were recorded to the nearest integer degree. That means the uncertainty is +/- 0.5 at a minimum. This doesn't include any assessment of accuracy or other systematic uncertainty in the recorded data. One should also remember that each recorded piece of data contains this uncertainty. The uncertainty must be taken into account when doing each calculation to reach the end result.

You cannot erase this uncertainty by computing anomalies or averaging. The standard use of significant digits means you can only have data stated to the nearest integer for the period when the data were recorded as integers.

The following comment is from Washington Univ. at St. Louis. “Significant Figures: The number of digits used to express a measured or calculated quantity. By using significant figures, we can show how precise a number is. If we express a number beyond the place to which we have actually measured (and are therefore certain of), we compromise the integrity of what this number is representing. It is important after learning and understanding significant figures to use them properly throughout your scientific career.” The link is at: http://www.chemistry.wustl.edu/~coursedev/Online%20tutorials/SigFigs.htm .

This kind of erases the conclusion of the paper since the uncertainty is wider than the calculated numbers.

JRF in Pensacola
Reply to  Jim Gorman
November 15, 2019 10:41 am

Jim:
And then throw in the uncertainty in bio-indicator data like tree rings. Stat analyses cannot undo that.

unka
November 15, 2019 9:38 am

“There is no correlation between a rising CO2 concentration in the atmosphere and a relatively stable, low rate of warming of the surface of the earth from 1943 to 2019.” – Figure 2

Wrong! You must compare the rate of warming with the rate of CO2 change, i.e., the derivative of the blue curve.

Clyde Spencer
Reply to  unka
November 15, 2019 10:33 am

unka, you said, “You must compare the rate of warming with the rate of CO2 change, …” I agree.

However, a plot of surface temperature versus cumulative anthropogenic CO2 emissions should provide an understanding of how humans might be contributing to increasing global temperatures. A first or second-order regression would provide a coefficient of determination (R^2) that would tell us how useful CO2 emissions are for explaining or predicting future temperatures — assuming that the correlation is not spurious.
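A minimal sketch of the regression Clyde describes (both series are synthetic stand-ins, not the actual records; R^2 is the coefficient of determination from a first-order fit):

```python
import numpy as np

# Hypothetical stand-ins: cumulative emissions (GtC) and temperature anomaly.
rng = np.random.default_rng(4)
cum_emissions = np.cumsum(np.linspace(0.5, 10.0, 170))  # summed annual GtC
temps = 0.0006 * cum_emissions + 0.1 * rng.standard_normal(170)

coeffs = np.polyfit(cum_emissions, temps, deg=1)        # first-order regression
pred = np.polyval(coeffs, cum_emissions)
ss_res = np.sum((temps - pred) ** 2)
ss_tot = np.sum((temps - temps.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot                       # coefficient of determination
print(f"R^2 = {r_squared:.3f}")
```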

unka
Reply to  Clyde Spencer
November 15, 2019 11:16 am

Compare the red curve from Fig. 1 with the blue curve from Fig. 2 and you see correlation.

Comparing the green curve from Fig. 2 with the blue curve from Fig. 2 is methodologically incorrect.

November 15, 2019 9:54 am

Recent statistical analyses conclude that 95% uncertainties of global annual mean surface temperatures range between 0.05 degrees C to 0.15 degrees C over the past 140 years;

Those uncertainties are 3 to 10 times smaller than the lower limit of resolution of the historical instruments. Utter incompetence; standard for that field.

Reply to  Pat Frank
November 15, 2019 7:03 pm

Who is reviewing these so-called studies that ignore basic scientific accuracy and precision requirements?

It is hard for me to understand: basic fundamental metrology isn't just ignored, it doesn't even seem to be on these people's radar. It points to ignorance of the physical world. It reminds me of kids in school 50+ years ago who had no idea how hot a soldering iron gets or the difference in wattage between resistors. These folks are computer programmers who don't even know how to handle rounding and precision in Excel spreadsheets. How sad!

Reply to  Pat Frank
November 15, 2019 11:54 pm

Jim, it literally is not on their radar, just as you suggest.

Reading their papers, one gets the impression that none of these folks has ever made a measurement or struggled with an instrument.

They assume all the error away, the Central Limit Theorem is their magic talisman, and their lives become very easy.

I published a paper earlier this year on the radiolysis of the amino acid cysteine in a synchrotron X-ray beam. I had to track the resolution and accuracy of the beam line right down to whether the pre-amp was calibrated and the nitrogen ionization chamber voltage was above its saturation knee.

I kill myself to make sure my data are good and so does everyone else I know. The bland carelessness of these climate people, and their hostility toward rigor sticks in my craw more than anything else.

Clyde Spencer
Reply to  Pat Frank
November 16, 2019 9:41 am

Pat
I suspect that part of the problem is that many of the scientists who have not yet retired come from a generation when academic standards were declining. Consequently, they don’t know what they don’t know.

I taught at Foothill College (Los Altos Hills, CA) from 1971 through 1982. Many of our students came from Palo Alto High School, considered one of the better schools in the Bay Area (probably only Bellarmine had a better reputation). During the decade I taught, I saw a noticeable decline in the quality of students. There were many who didn't know that one could multiply by 10 by moving the decimal point! Yet Foothill awarded A's and B's to 50% of the students.

Reply to  Clyde Spencer
November 16, 2019 11:09 am

Clyde, it’s all so awful.

But somehow I think the climate scientists are a special case. Most of the modelers are mathematicians who know nothing of science or physical reasoning as a matter of course.

But I also think that climate scientists plain are not well educated. Look at Phil Jones, of a prior generation, who nevertheless knows nothing of validating data quality and assumes all the error away. His younger follow-on colleagues are the same.

And, I’m sorry to say, Roy Spencer showed no understanding of how to assess error or even to reproduce the linear emulations of my paper. He’s also of a prior generation.

These problems seem endemic in the field.

So, while I fully agree with you that rigor has been nearly eviscerated from instruction, it seems like there hasn’t been any in climate science for 40 years or more.

Charles Brooks did heroic work in 1926 testing ship intake SST validity. And in 1963 J. Saur did a careful analytical study of ship intake temperatures.

But their analytical rigor seems to have been lost in consensus climate science, and the whole field seems to lack competence.

Jim Masterson
Reply to  Pat Frank
November 16, 2019 1:20 pm

>>
These problems seem endemic in the field.
<<

It’s already been mentioned by many on this post and in other posts: averaging intensive properties (it’s nonsense to do so), starting with a list of numbers with unit precision and averaging out to the thousandth place (utter nonsense and violates the rules of significant figures), and averaging averages (not mathematically valid in general).

Jim

chris
November 15, 2019 11:38 am

but we don't have 170 years of legit data! (I read that on this blog so it must be true)

I guess we'll never know what is causing all that ice melting, the fires, the increased frequency of record highs (an order of magnitude more than record lows), and so on.

Thank the author so much! I’m going to buy some Florida ocean side real estate before he gets it all!

MarkW
Reply to  chris
November 15, 2019 5:32 pm

Care to actually address the arguments, or like steve, is snark the limits of your intellectual abilities?

Reply to  chris
November 16, 2019 6:02 am

No one is saying the original recorded data is not legit (although some of it may be made up at various times). What is being said is that the use of that data is not legit.

You simply cannot take data that is recorded as integers and, through mathematical calculations like subtraction (anomalies) or averaging, extend the precision of the recorded temperatures out to tenths, hundredths, or thousandths of a degree. That is simply not legitimate science. If the original precision was whole units, that is as precise as you can ever be, like it or not.

I can't tell you how many students I have tutored for hours about significant digits who will invariably write down the maximum number of decimal digits on their calculators when dealing with measurements. Climate scientists make the same mistake, but I suspect they don't even know better.

Do you ever wonder why these graphs and conclusions never ever show any error/uncertainty bars or state the conclusion along with an uncertainty value?

Carlo, Monte
Reply to  Jim Gorman
November 16, 2019 7:45 am

Two nights ago I was watching a UK forensics drama in which the highly trained pathologists and forensic scientists were puzzling over a pair of burn marks on a body, trying to determine their origin and hypothesizing they were from a taser or stun gun. One of the scientists took a small plastic 10 cm ruler, placed it against the body to measure the separation between the burns, and announced the distance as "three point one seven five centimeters". From this they went on to conclude: "Ah-ha! This is one and a quarter inches, so this was an illegal stun gun made in the USA!"

I cringed at this bit of theatrics, knowing the ruler was only graduated in millimeters, yet somehow the separation was determined with a resolution of 50 microns. Obviously the writer took 1.25 inches and multiplied by 2.54 in a cell-phone calculator, which presented the answer as 3.1750 cm, and put the number into the script. The actors were none the wiser, and this condition apparently extends to climate scientists.

Reply to  Carlo, Monte
November 16, 2019 4:35 pm

Great example! I love it. It describes climate scientists exactly.

November 15, 2019 11:55 am

The challenge is to allow scientists the time and freedom to work without interference from special interests

Most of the “scientists” working in the climate field ARE the “special interests”.

Or at least one of three big special-interest groups – the other two being the “renewable energy” industry and the politicians who make policies and direct public money in the currently fashionable direction.

The jobs of climate scientists and their research grants depend on there being a human-caused warming trend that is having an adverse effect on both the natural environment and human civilization, and that this trend will accelerate and have even more and even worse adverse effects in the future.

How else to explain that there is not one study among the many thousands published every year that can find ANY beneficial effects – local or global – of past, present or (inferred) future warming?

(unrelated complaint) I don’t like polynomial-fitting in general, and especially I don’t like it when it’s being applied to a parameter that may vary in a cyclic way, and even more especially if there’s less than one complete cycle in the data, and yet more especially if the data may contain more than one set of superimposed cycles. And using the last inflection of a polynomial-fitted curve, or any other curve fitted to inherently noisy data, to extrapolate outside the data range (i.e. – in this case, into the future) should be a criminal offence.

Matthew R Marler
November 15, 2019 12:11 pm

3. Recent increases in surface temperatures reflect 40 years of increasing intensities of the El Nino Southern Oscillation climate pattern.

Did something cause the increasing intensities of ENSO? What was it? How do we know that?

ENSO is part of the global net of mechanisms that transfer heat through the ocean and atmosphere. Of course any net warming will entail net warming effects of at least some of the mechanisms, no matter what is driving the net warming.

This analysis does not really help decide whether sun, CO2, urbanization, or something else has been driving the warming, imo.

Vuk
Reply to  Matthew R Marler
November 15, 2019 1:50 pm

There may be no need for any energy input rise/fall for the atmospheric global warming/cooling to take place.
Ocean currents take warm water from the equatorial region towards the poles, and cold water in the opposite direction, while the processes of up- and down-welling provide for the bidirectional energy exchange between the oceans and atmosphere. The time scales of ocean-current travel between various critical locations define the time constants at the base of a number of natural warming/cooling cycles. Just having three cycles (e.g., Atlantic, Pacific and Indian oceans), even with small individual amplitude and periodicity variability, would make it next to impossible to resolve them within the very narrow window of the global temperature data available.

Reply to  Matthew R Marler
November 15, 2019 11:58 pm

Matthew, how do we know the El Nino response is not time-lagged?

What’s the ocean-response time constant?

What if today’s El Ninos are a response to the Medieval Warm Period? Does anyone know?

Pat Smith
November 15, 2019 12:20 pm

A question: I assume that UAH measures average temperatures. I believe that HadCRUT and the other surface thermometer-based temperature series take the daily highest and lowest temperatures and average them. I understand from Anthony Watts's presentation that most of the warming has taken place at night, that is, the lowest temperatures increasing much faster than the highest (perhaps as a result of all the UHI interference and the poor siting of the measuring apparatus). Would this explain why HadCRUT, etc. show a higher decadal increase in average global temperature?

November 15, 2019 1:53 pm

Facts, facts, nothing but facts. But what about the 97% certainty?

Mr. and Mrs. Average come home from a tiring day at work and settle down to a meal and relaxation. So who do they believe: the facts, pages and pages of them, or just a simple statement like the 97%?

MJE VK5ELL

saveenergy
November 15, 2019 2:07 pm

Figure 2 is wrong & makes no sense.

In Figure 2, “The blue curve is a time series of the concentration of fossil fuel emissions of CO2 in parts per million in the atmosphere.” (Starting at ~ 30ppm)
BUT
the top of the curve shows TOTAL (Anthropogenic & Natural) atmospheric CO2 (414.7 ppm)

So what are we supposed to be seeing ? Total atmospheric CO2 or fossil fuel emissions of CO2 ???

If it’s Total atmospheric CO2 then starting point should be ~ 295

If it’s fossil fuel emissions the end should be ~ 200ish

You can’t have both on the same curve

Stephen Wilde
Reply to  saveenergy
November 15, 2019 2:20 pm

The side bar of Fig 2 needs correcting.
The intent is to show the rate of increase in our emissions over time compared with the rate of change in temperature.
It is obvious that the rate of change in temperature does not correlate with the rate of change in our emissions.
The head post goes on to say that the rate of change in temperature does correlate with the level of El Niño activity.
Both assertions are clearly correct.
All the objections posted thus far simply distract from that underlying truth.
That is partly the fault of the author, but adverse responders are also at fault for missing the essential point.

November 15, 2019 2:18 pm

Are we looking at an ENSO event? The scientists tell us that bottom water can take up to 800 years to travel from the Poles to its final upwelling.

So what was happening back in the 12th century? Why, that was still the MWP. So is today's situation caused by the accumulated warmth from back then?

A very interesting article.

MJE VK5ELL

November 15, 2019 2:44 pm

Over at Open Mind, I found this…and I never even look at that site.
It popped up on a Cortana search.

https://tamino.wordpress.com/2019/11/15/back-to-basic-climate-denial/

Reply to  Nicholas McGinley
November 16, 2019 10:21 am

Now that’s impressive.

Reply to  Nicholas McGinley
November 18, 2019 6:01 pm

Using rate of change is a good approach; Tamino uses it frequently. A key parameter that has a non-linear correlation with rate of change is duration: high rates of change last for short periods of time.
Bjorklund's 0.7 deg C per century (blue X) and Tamino's rates of change (red x's) are shown on this plot: https://imgur.com/a/RqKTdba

Gamecock
November 15, 2019 2:50 pm

‘170 Years of Earth Surface Temperature Data Show No Evidence of Significant Warming’

170 years ago would be 1849. Sorry, but we didn’t have much data in 1849. We didn’t have much more in 1900. Adequate data started in 1979. Analyses to 1849 may be fun, but they are more of a parlor game than science.