An analysis of BEST data for the question: Is Earth Warming or Cooling?

Guest essay by Clyde Spencer

The answer to the question is, “Yes!” Those who believe that Earth is going to Hell in a handbasket because of anthropogenic carbon dioxide go to extraordinary lengths to convince the public that uninterrupted warming is occurring at an unprecedented rate. One commonly reads something to the effect that the most recent year was the xth-warmest year in the last n years (use your personal preferences for x and n), or that the last n years have been the warmest in the last m years. It is common for NOAA to claim that current temperatures are higher than those of some previous year by an amount that is of the same order of magnitude as the uncertainty in the temperature of the year being compared. [For an extended discussion and analysis of the veracity of these kinds of claims, go to this link: http://www.factcheck.org/2015/04/obama-and-the-warmest-year-on-record/ ] I’d like to start by examining the logical fallacy that these pronouncements support continued warming. They only provide evidence for it currently being warm!

Let’s conduct a simple thought experiment that most can relate to. Imagine that you have a pot of water on the stove at room temperature. You place a thermometer in the water, take a reading, and turn on the heat. We’ll monitor the increase in temperature by taking frequent readings at fixed intervals. Assume that the thermometer is calibrated in tenths of a degree, and we’ll try to read it to the nearest ½ of a tenth. Therefore, we can expect that there will be some random errors in the reported temperature because of observation errors. If the pot is not well stirred, some stratification may occur that will further obscure the true average temperature. We can expect to see a steady, approximately linear increase in temperature until the water is nearly at the boiling point. The pot is removed from the heat, and readings are continued as before. We can expect that the water in the pot will cool more slowly than it heated, the rate depending on such factors as the surface-to-volume ratio, the room temperature, and the material of which the pot is constructed. In any event, we can expect that the temperature readings will not change much, if any, for the first couple of readings. Subsequent readings may or may not be lower because of the random errors mentioned above. Eventually, we will get a reading that is obviously lower than when we removed the pot from the heat. A subsequent one could be slightly higher because of a reading error. If we were to stop at that point, we could make such statements as, “The last n number of readings are higher than the average of all previous temperatures, which proves that the water is still heating.” Or, “The last n readings are the highest ever recorded;” another classic, for one of the last readings which had a random error, “The probability that the last reading is higher than all other temperatures is 38%.” We know very well that the pot is no longer heating, and it is just sophistry to try to make it appear that it is.
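The pot experiment is easy to simulate. The sketch below uses invented numbers (a 30-reading heating phase, slow Newtonian cooling, a ±0.05° reading error) to show how the "last n readings are the highest" claim can remain true well after the heating has stopped:

```python
import random

random.seed(42)

HEATING_STEPS = 30   # readings taken while the burner is on
COOLING_STEPS = 10   # readings taken after the pot is removed

readings = []
temp = 20.0                                    # room-temperature water
for _ in range(HEATING_STEPS):                 # roughly linear heating
    temp += 2.5
    readings.append(temp + random.uniform(-0.05, 0.05))
for _ in range(COOLING_STEPS):                 # slow Newtonian cooling
    temp = 20.0 + (temp - 20.0) * 0.99
    readings.append(temp + random.uniform(-0.05, 0.05))

# The sophistry: every one of the last five readings exceeds the average
# of all readings before it -- yet the pot has been cooling for ten steps.
last_five = readings[-5:]
prior_average = sum(readings[:-5]) / len(readings[:-5])
claim_holds = all(r > prior_average for r in last_five)
actually_cooling = readings[-1] < max(readings[:HEATING_STEPS])
```

With these (made-up) parameters, `claim_holds` and `actually_cooling` are both true at once, which is exactly the fallacy described above: "warmer than the average of the past" is not evidence of "still warming."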

Something I find peculiar about modern climatology is the use of so-called temperature anomalies. Anomalies are not unheard of in other disciplines, but there are usually good reasons for them, such as simplifying a Fourier analysis of a time series. One issue with using anomalies is that if a published graph is reproduced and separated from the metadata in the text of the article, then one is at a loss to know what the anomalies mean; they lose their context. Another issue is that the authors are free to choose whatever base period they want, which may not be the same as others have chosen, making it difficult to compare similar analyses. The psychological impression conveyed is that (recent) data points above the baseline are extraordinary. Lastly, the use of anomalies tends to influence the subjective impression of the magnitude of changes, because very small changes are scaled over the full vertical range of the graph. See Figure 2 below, which shows actual temperatures, for a comparison to the anomalies that you are used to seeing in the literature.
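The baseline-dependence is easy to demonstrate. In the sketch below (the annual means are invented), the same series is expressed against two different base periods: every anomaly value changes, but the year-to-year differences, and hence any trend, are untouched.

```python
# Hypothetical annual mean temperatures in deg C (invented for illustration).
temps = [14.0, 14.1, 14.3, 14.2, 14.5, 14.6]

def anomalies(series, base_start, base_end):
    """Anomalies relative to the mean over series[base_start:base_end]."""
    base = sum(series[base_start:base_end]) / (base_end - base_start)
    return [t - base for t in series]

early_base = anomalies(temps, 0, 3)   # first three years as the base period
late_base = anomalies(temps, 3, 6)    # last three years as the base period

# The two versions differ by a constant offset (here 0.3 deg C), so the
# zero line moves but the shape of the curve does not.
offsets = [e - l for e, l in zip(early_base, late_base)]
```

The choice of base period thus controls how many points sit "above zero" without changing the physics at all, which is the psychological effect noted above.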

In the recent NOAA paper by Karl et al. (2015), the authors decided to adjust modern ocean-buoy temperatures upward to agree with older, problematic, engine-room water-intake temperatures. The decision to adjust high-quality data to agree with lower-quality data is, at best, unorthodox, and the authors did not give a good reason for it. As one defender remarked, whether one adds temperatures to the anomalies on the right or subtracts them on the left, the slope stays the same. True, but the result is a higher ending temperature than if the more orthodox approach had been taken. Supporters of anthropogenic global warming are then ‘justified’ in claiming an uninterrupted increase in recent temperatures, and any claimed pause in warming becomes an illusion.

I take exception to the practice of conflating Sea Surface Temperatures (SST) with land air temperatures. There are several issues with this practice. A weak excuse is that there is a strong correlation between SST and nighttime air temperatures, but that hardly justifies the practice with modern instrumentation. The biggest problem is that the heat capacity of water is so high that water exhibits strong thermal inertia. That means water warmed by contact with the air will always lag behind cooling air temperatures. Thus, even if Earth were to enter a cooling phase, water would be the last to provide evidence for it. Because the theory behind so-called ‘greenhouse warming’ predicts that the air should heat first (or, more properly, cool more slowly), the most sensitive indicator of changes will be found in air temperatures. Using ocean temperatures is analogous to averaging subsurface land temperatures with land air temperatures. At relatively shallow depths in the soil, the diurnal temperature changes are smoothed out and, at greater depths, even the seasonal effects are eliminated. Yet we don’t average subsurface ground temperatures with land air temperatures! Why should we average SSTs with land air temperatures? It is a classic example of comparing apples and oranges. SSTs are of interest and provide climate insights, but they should not be averaged with air temperatures!

Lastly, global averages of all temperature readings are typically reported instead of the separate high and low temperatures. This is important because the highs and lows behave differently, and the lows should be a better indicator of the impact of the so-called ‘greenhouse’ effect.

clyde-spencer-fig1

Fig 1.

Figure 1, above, which shows the differences between the high and low temperatures from the Berkeley Earth Surface Temperature (BEST) data, appears to reflect some abrupt transitions in the behaviors of the two temperatures. My interpretation of Figure 1 is that between about 1870 and 1900, neither the high nor the low temperatures were changing systematically. Then, between about 1900 and 1983, the low temperatures were increasing more rapidly than the high temperatures, causing a decline in the differences. This is what I would expect for a ‘greenhouse’ signal. However, since 1983, it appears that the high temperatures have been increasing more rapidly than the lows, resulting in a steep increase in the difference in the temperatures. I don’t believe this has been reported before, and it begs for an explanation, since it isn’t something I would expect from carbon dioxide and water vapor alone.

This brings us to the point of my expanded analysis of the BEST temperature data set. Figure 2, below, shows the high and low temperatures for the period of 1870 to mid-2014. The data set starts earlier than 1870, but the uncertainty is so great in the early data that I didn’t feel it contributed much. [Should the reader be interested, there is a graph of land temperature data starting about 1750 at this link: http://berkeleyearth.lbl.gov/regions/global-land ] The main thing worth noting is that the high temperatures were increasing rapidly in the two decades before my graphs start and the lows were coming down from a high in about 1865. The pastel shading reflects the 95% uncertainty range, which becomes imperceptible by the present day. The green, smooth line is a 6th-order polynomial fit of monthly temperature data that have been smoothed. Rather than attempt any further smoothing of the once-smoothed data, I chose to model the low-frequency response with a polynomial least-squares fit trend-line. This approach to characterizing recent temperature changes is more sophisticated than drawing straight lines through the data, where one is free to choose the start and stop times subjectively; subjective time-periods allow for conscious or unconscious mischief.

clyde-spencer-fig2

Fig. 2.

The 6th-order fit captures nearly 80% of the variance in the high-temperature data. It notably doesn’t do an optimal job of capturing the transient warming events around 1878 and 1902, or the broader warming event of the 1940s. Visually, the 6th-order fit seems to do a good job of characterizing the data from about 1950 to the present day, which is important for the question at hand: whether we are still experiencing warming. Similarly, the 6th-order fit captures more than 89% of the variance in the low-temperature data; visually, the fit appears superior to that for the high-temperature data. By comparison with a graph generated with the BEST long-term smoothed data, these regression curves are smoother than the 20-year moving average; however, they are similarly shaped. The point of this exercise, though, isn’t to smooth the data.
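For readers who want to reproduce the approach, here is a minimal sketch in Python. The series is synthetic (a smooth curve plus noise standing in for the BEST monthly data, not the real record); the fit and the R² computation mirror the description above. Note the centering of the time axis, which keeps the 6th-order least-squares problem well conditioned:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the BEST monthly series (invented, not real data).
years = np.linspace(1870.0, 2014.5, 1734)
signal = (8.5 + 0.5 * np.sin((years - 1870.0) / 20.0)
              + 5.0e-5 * (years - 1870.0) ** 2)
temps = signal + rng.normal(0.0, 0.1, years.size)

# Center the time axis before fitting: raising years like 2014 to the 6th
# power otherwise produces a badly conditioned least-squares problem.
x = years - years.mean()
coeffs = np.polyfit(x, temps, 6)       # 6th-order polynomial, 7 coefficients
fitted = np.polyval(coeffs, x)

# Coefficient of determination: fraction of variance captured by the fit.
ss_res = np.sum((temps - fitted) ** 2)
ss_tot = np.sum((temps - temps.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
```

With the real TMAX/TMIN series substituted for `temps`, the same few lines yield the roughly 80% and 89% figures quoted above.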

It is easy to take the first-derivative of a polynomial function and obtain quantitative values for the slope (tangent) of the temperature-curve versus time. That is, one can obtain annual values of the warming rate for every month for both the high and low-temperature global averages.
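As a sketch (with invented coefficients, not the actual regression), differentiating the fitted polynomial reduces to a single library call:

```python
import numpy as np

# Hypothetical 6th-order fit, coefficients in descending powers of
# x = years since 1870 (invented for illustration only).
coeffs = np.array([1.0e-12, -4.0e-10, 5.0e-8, -2.0e-6, 1.0e-5, 0.01, 8.5])

slope_coeffs = np.polyder(coeffs)   # 5th-order polynomial giving dT/dt

def warming_rate(year):
    """Slope of the fitted temperature curve, in deg C per year."""
    return np.polyval(slope_coeffs, year - 1870.0)

# At x = 0 (the year 1870) the derivative is just the linear coefficient.
rate_1870 = warming_rate(1870)
```

Evaluating `warming_rate` month by month gives exactly the kind of slope series summarized in the tables below.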

In order to pick up the last six months of 2014, which are missing from the 12-month smoothed data, I repeated the above analysis with the un-smoothed monthly data. The only surprise was that the extrapolated slopes for the last six months of 2014 from the smoothed data were nearly identical to the slopes from the un-smoothed monthly data; the differences are trivial. That the results are so similar for the last six months is surprising because, all too often, when one tries to extrapolate a polynomial fit beyond the actual data, the curve diverges abruptly! The polynomial coefficients are very similar for both the smoothed and un-smoothed data. The only advantage to showing the un-smoothed monthly data would be to emphasize how much noisier it is than the smoothed data. For brevity, I have omitted the additional graph. Polynomial regressions of lower order gave lower coefficients of determination (R2) and, subjectively, are poorer fits visually.

Let me summarize what the slopes tell us about the temperature records with the tables below. I’ve listed the approximate years when the highs and lows had zero slope (no warming), maximum slope (maximum warming/cooling, point of inflection on the curve), and what has been happening most recently. The slopes are in degrees Celsius change per year. Examine Figure 2 to verify what I’m saying.

High Temperatures

Year | Slope (°C/yr) | Character        | Temperature
1870 |  0.020        | -                | Increasing
1875 |  0            | Peak             | High
1883 | -0.008        | Inflection Point | Changing
1896 |  0            | Trough           | Low
1917 |  0.015        | Inflection Point | Changing
1943 |  0            | Peak             | High
1956 | -0.005        | Inflection Point | Changing
1968 |  0            | Trough           | Low
1998 |  0.039        | Inflection Point | Changing
2013 |  0            | Peak             | High
2014 | -0.012        | -                | Decreasing

Low Temperatures

Year | Slope (°C/yr) | Character        | Temperature
1870 | -0.006        | -                | Decreasing
1876 | -0.011        | Inflection Point | Changing
1890 |  0            | Trough           | Low
1916 |  0.019        | Inflection Point | Changing
1954 |  0.004        | Inflection Point | Changing
1994 |  0.033        | Inflection Point | Changing
2010 |  0            | Peak             | High
2014 | -0.037        | -                | Decreasing

For the high-temperature series, the slopes start at a rate of about 0.020°C per year in 1870, decline to 0 about 1875 (temperature-high), become negative and reach -0.008°C per year by 1883. The slopes then change direction, become zero about 1896 (temperature-low), increase to 0.015°C per year by 1917, then again decline, reaching zero about 1943 (temperature-high). The slopes continue to decline until about 1956 (point of inflection), reverse direction and reach zero again about 1968 (temperature-low). This now is the beginning of the much heralded ‘modern warming,’ reaching a maximum of about 0.039°C per year in 1998, and then declining to zero (temperature-high) in 2013. That is to say, the rate of warming started to decline about 1998. The time series closes out 2014 at a rate of -0.012°C per year. The average warming rate for the period 1870 through 2014 was 0.9°C per century.

The low-temperature slopes follow a similar pattern: Starting in 1870 with a rate of -0.006°C per year, reaching a minimum of about -0.011°C per year in 1876, reversing direction (point of inflection), reaching a slope of zero in 1890 (temperature-low), and then increasing to a maximum (point of inflection) of over 0.019°C per year in 1916. The slopes then decline to about 0.004°C per year in 1954; they then climb to a maximum of almost 0.033°C per year in 1994 (point of inflection). The slope then decreases to zero (temperature-high) in late-2009, and becomes negative for the remainder of the record, ending 2014 with a slope of -0.037°C per year! That is to say, the rate of warming started to decline about 1994. The average warming rate for the period 1870 through 2014 was 1.1°C per century.
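The milestone years in the tables can be located mechanically: zero-crossings of the first derivative mark temperature peaks and troughs, and zero-crossings of the second derivative mark the inflection points (maximum warming or cooling rate). A sketch with an invented demo curve follows; the zero-slope years below are made up for illustration, not the BEST values:

```python
import numpy as np

# Build a demo 6th-order curve whose slope vanishes at chosen offsets
# (x = years since 1870). These milestone years are invented.
zero_slope_x = [5.0, 26.0, 73.0, 98.0, 143.0]
d1 = np.poly(zero_slope_x)    # 5th-order slope polynomial with those roots
coeffs = np.polyint(d1)       # integrate once: the temperature curve itself

def real_roots_in(poly, lo, hi):
    """Real roots of `poly` lying within [lo, hi]."""
    r = np.roots(poly)
    real = r[np.abs(r.imag) < 1e-6].real
    return sorted(v for v in real if lo <= v <= hi)

turning_points = real_roots_in(np.polyder(coeffs), 0.0, 144.0)   # peaks/troughs
inflections = real_roots_in(np.polyder(coeffs, 2), 0.0, 144.0)   # max/min slope
```

Applied to the actual regression coefficients, this recovers the "Year/Character" columns of the tables without any visual curve-reading.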

Thus, the low-temperature averages have been increasing slightly more than the highs, which is what I would expect from reduced radiative cooling at night and in winter. However, there isn’t a big difference between the two, and there is a need to explain the recent anomalous increase in the high temperatures (post-1983) shown in Figure 1. To put this into perspective, the 144-year temperature increases have been less than the 95% uncertainties of the monthly temperatures in 1870! There is an abrupt change in temperature differences around 1950, as shown in Figure 1; a close examination of Figure 2 suggests that there is also an abrupt change in the temperature uncertainties at about the same time. This needs explanation. Something to consider is whether the predictions of future heat waves are reliable, given that it is the low temperatures that have shown the greatest and most consistent increase over the last 125 years. Furthermore, it has been claimed that warming in the Arctic is at least twice the rate of the rest of the globe (Screen et al., 2012), biasing the global averages upward. Is it reasonable, then, to expect heat waves at mid-latitudes from extrapolating global averages?

In summary, Fig. 2 does not support a claim that 2014 had the highest high or low temperatures in modern times, and the analysis suggests we are currently in a cooling phase, not just a plateau.


 

References

Estimated Global Land-Surface TMAX based on the Complete Berkeley Dataset: http://berkeleyearth.lbl.gov/auto/Global/Complete_TMAX_complete.txt

Estimated Global Land-Surface TMIN based on the Complete Berkeley Dataset: http://berkeleyearth.lbl.gov/auto/Global/Complete_TMIN_complete.txt

Karl, Thomas R., Anthony Arguez, Boyin Huang, Jay H. Lawrimore, James R. McMahon, Matthew J. Menne, Thomas C. Peterson, Russell S. Vose, and Huai-Min Zhang (2015); Possible artifacts of data biases in the recent global surface warming hiatus; Science, 26 June 2015, Vol. 348, no. 6242, pp. 1469-1472: https://www.ncdc.noaa.gov/news/recent-global-surface-warming-hiatus

Screen, J. A., C. Deser, and I. Simmonds (2012); Local and remote controls on observed Arctic warming; Geophys. Res. Lett., Vol. 39, L10709, doi:10.1029/2012GL051598: http://onlinelibrary.wiley.com/doi/10.1029/2012GL051598/pdf

Comments
Science or Fiction
August 11, 2015 4:50 pm

There is an enormous number of claims in this article, but your summary has only two main points. Without addressing the first point, the second is:
“the analysis suggests we are currently in a cooling phase, not just a plateau”
I cannot see that your analysis suggests that.
And if it is based on trends alone, whether by first-order or 6th-order curves, I would like to object that trends can be deceiving. What is the mechanism behind the trend? What is your hypothesis? Which predictions have you deduced from your hypothesis? Which tests has your hypothesis been exposed to? May I remind you of the scientific method:
1 A hypothesis is proposed. This is not justified and is tentative.
2 Testable predictions are deduced from the hypothesis and previously accepted statements.
3 We observe whether the predictions are true.
4 If the predictions are false, we conclude the theory is false.
5 If the predictions are true, that doesn’t show the theory is true, or even probably true. All we can say is that the theory has so far passed the tests of it.
(Courtesy of Patrick Maher)
I cannot immediately see that your summary is supported by the information you provide.

Reply to  Science or Fiction
August 12, 2015 9:46 am

+1 Truth is generally a transient thing – More of a currently not proven to be wrong, until the next paradigm shift 🙂

David
August 11, 2015 4:54 pm

I’m a proud denier (denier that the world will end: it will not; climate always changes and people will adapt, some easier than others), based on all the fiddling with data that is occurring, but I’d like to see the sample size for each year of this data set. Really, how many accurate (even calibrated) temperature sources were there in the 1860s? Also, you are using two and three significant digits. Were the stations recording temps in the late 1800s able to record to 1/1000ths of a degree C accurately? Based on the below, the answer is no.
“The 20th century also saw the refinement of the temperature scale. Temperatures can now be measured to within about 0.001°C over a wide range, although it is not a simple task.”
http://www.capgo.com/Resources/InterestStories/TempHistory/TempHistory.html

RD
Reply to  David
August 12, 2015 1:03 pm

David, I get what you are saying but I will not ever accept a term of derision that is inexorably linked to the Holocaust. I encourage others to resist this dishonest libel, too.

Reply to  David
August 13, 2015 12:29 am

I am not a denier, I am a refuter.

August 11, 2015 5:47 pm

The science and math is interesting to a great many. I enjoy reading the opinion. But in the end, we will adapt or die.

Seems to be like counting angels on the head of a pin.
But thanks to everyone for the philosophy.

RD
Reply to  Wayne Delbeke
August 12, 2015 1:06 pm

ROFL! Thanks for posting.

August 11, 2015 7:23 pm

There is an inherent assumption made in the land temperature data (GISS, CRUTEM4, BEST). The assumption is that any global warming signal can be quantified by first subtracting out seasonal normal temperatures individually for each station. Weather stations are located at very different altitudes, ranging from sea level to 4000 m above sea level. Consequently, absolute temperatures are wildly different, whereas anomalies are assumed to be similar. Is this always true? Well, no, it probably isn’t, since, for example, the temperature response to the same forcing is temperature dependent (from ΔT = ΔF/(4σT³)):
ΔT1 = (T2³ / T1³) ΔT2
This is one reason why the arctic warms faster than the tropics.
In order to make a global (anomaly) average, stations are divided according to a geographic grid, typically 5×5 degrees (CRUTEM4). Even within a single grid cell, stations can differ in altitude by 2000 m, yet their anomalies are simply averaged together. Likewise, the global average is a simple weighted average of the grid anomalies.
The normalisation period (e.g. 1961-1990) pivots together the zero values for all stations. This assumes that all stations warm and cool in unison, like synchronised swimming. Is this true? Do some locations warm earlier than others? Just one reason why they might is the UHI.
Most people simply assume that UHI makes cities warmer than the surrounding countryside. This is true, but once a city has developed, its temperature anomaly remains the same as the countryside’s. It is only during a period of rapid growth that the anomaly changes. The choice of normalisation period necessarily makes most cities appear cooler in the past. The overall UHI effect in CRUTEM4 is mostly to cool the past. The BEST result showed that the UHI apparently had no effect on global warming. This is only true after 1940. Many of the early stations were situated in towns which saw rapid urban growth during the early 20th century.
http://clivebest.com/blog/wp-content/uploads/2015/06/C4-Xclude-cities.png
A good example of this effect is Sao Paulo, which grew from a small village into one of the world’s largest cities in under 100 years. By 1961 it was already ‘warm’. The normalisation subtracts this warmth off the anomaly, thereby making Sao Paulo appear colder than its surroundings in the early 20th century. Sao Paulo doesn’t measure global warming at all; it measures the effect of rapid urbanisation. There are many other examples: Shanghai, Beijing, Moscow, etc.
http://clivebest.com/world/pics/station-837810.png
The red curves are NCDC after their automatic corrections. Most of these corrections, due to siting changes and instrument changes, have been generated automatically by detecting temperature ‘shifts’. In general, these also have the effect of cooling the past, which anyway has the largest uncertainties. I estimate that 0.2-0.3C of the observed warming from 1850-1940 is probably spurious and unrelated to CO2.

Reply to  clivebest
August 11, 2015 9:57 pm

“There is an inherent assumption made in the land temperature data (GISS, CRUTEM4, BEST). The assumption is that any global warming signal can be quantified by first subtracting out seasonal normal temperatures individually for each station.”
Except we don’t do that. We don’t work with anomalies.
We don’t average temperatures.
We don’t subtract out seasonal norms.
We actually did what many skeptics here and at CA suggested.
Go figure.
On UHI, here is the clue:
Use only rural stations. You get the same answer.
Here is the next clue: the land is 30% of the total.

Reply to  Steven Mosher
August 12, 2015 1:52 am

Steve,
Of course you work with anomalies. It’s just that you derive them in a different (more sophisticated?) way. You minimise the weighted sum of squares for each station offset to nearby stations, for those years that the station is present, minus a global ‘anomaly’. The fit will move the global anomaly result to be relative to a time period which statistically has the most stations. The station anomalies are derived relative to the mean regional temperature, and the global anomaly relative to the time period which has the maximum coverage and stations. This period is always somewhere between 1951-1990. That is why the BEST results can be compared directly to HadCRUT4 and GISS, offset a bit. That is why they agree so well with each other. Most recent stations overlap and are based on GHCN V3.
Of course you average temperatures – that is exactly what your fit ends up doing with the regional mean.
You also subtract out seasonal norms because the fitted regional average is essentially subtracted out each month. The yearly average then of course removes any remaining monthly ‘deltas’.
Regarding UHI: I may be wrong, but I think you used MODIS to determine rural stations and then got no change in the fitted global anomaly. Well, you could also do a similar exercise and drop all stations at least 500 m higher in altitude than the surrounding stations. You would also get no change. What matters is not the altitude or the size of a city but the rate of urbanisation. Only this can change the global (land) temperature anomaly. Most urbanisation occurred before 1990.
Yes it is true that land is only 30% of the surface so that in that sense the UHI is small on the global level. However the majority of the population live near cities.

Reply to  clivebest
August 12, 2015 6:27 am

I agree with Clive Best that at least 0.2C to 0.3C of the warming trend is caused by spurious warming adjustments.
But that was before Tom Karl’s latest manipulation which added 0.12C to the sea surface temperatures (or about 0.1C to the global average including land).
So, now the spurious/unjustified adjustments are 0.3C to 0.4C, and this number is increasing every day that the NCDC is running their adjustment algorithms behind closed doors, with Tom Karl leaning over the shoulder of some analyst running the code and individual station fudges.
Best and Mosher just take the station records around the world and remove ALL cooling periods thereby creating an artificial set of numbers that cannot be described as a temperature record but is more of a straight line going up simulation. 20 warming adjustments/station breaks for every 1 cooling adjustment/station break.

Reply to  Bill Illis
August 12, 2015 9:33 am

So, now the spurious/unjustified adjustments are 0.3C to 0.4C a

Back up your assertions with data. You are lacking units and time periods as well as code and data sources.
Here’s a result of the changes in trend (degC/decade) from every GISS release I and others have been able to obtain back to 2005. What’s shown is degC/decade from 1880 to 2005.
So it’s a 0.62degC rise for 1880-2005 in the 2005 release, and a 0.75degC rise for 1880-2005 in the 2015 release. A change of 0.13degC over those 12.5 decades. Which is far less than your number. But still significant.
If you look at the change since 1950, the supposed* start of the ACO2 signal, it looks like this:
So that’s a 0.55degC rise over 1950-2005 in the 2005 release and a 0.71degC rise over 1950-2005 in the 2015.6 release. That’s a 0.16degC difference. If you are projecting trends, however, it’s a pretty significant increase of 0.3degC/century.
So yes, they are manipulating the data. But the changes from 2005 to 2015, which I can find original data for, are less than what you are saying, though I’m just guessing, as you didn’t provide units or time periods.
I keep looking for the source of the hockey-stick data-manipulation trends, but nobody will provide it in a consumable manner. In the age of hyperlinks that’s just silly. Please provide a link if you have it.
Peter
Source: https://www.dropbox.com/sh/qi9h70otb2p9j9h/AABPE2Uf-s8xe8iGGr1BhQULa?dl=0
* Hard to find a definitive source for the start of signal. Then again how much signal is A has no definitive source either, it’s likely not accurately measurable…


1sky1
Reply to  Bill Illis
August 12, 2015 2:20 pm

Peter Sable:
It’s proof positive of an anthropogenic effect upon global climate data.

Reply to  Bill Illis
August 12, 2015 4:08 pm

Peter,
Here is the data which shows how the various corrections made to GHCN stations over the years have moved temperature anomalies to show ever steeper warming. The green points are GHCN V1 from 1990. The blue points are GHCN V3U (raw values). The red points are GHCN V3C (corrected values). All station data from each set has been processed in exactly the same way, using a normalisation period of 1961-1990. GISS used to apply its own corrections but now simply uses the V3C corrections.
http://clivebest.com/blog/wp-content/uploads/2015/05/Compare-V1-V3-raw-corr.png
As you can clearly see, the effect of corrections has been to increase net warming since the 19th century by at least 0.3C.

Reply to  Bill Illis
August 14, 2015 7:48 pm

Here is the data which shows how the various corrections made to GHCN stations over the years have moved temperature anomalies to show ever steeper warming. The green points are GHCN V1 from 1990
I see a graph. Where’s the data? Where’s the source code for generating the graph?

Reply to  Peter Sable
August 15, 2015 12:52 am

Data is here https://www.ncdc.noaa.gov/ghcnm/v3.php
Source code is in Perl and is a modified version of the CRU station analysis. In the case of V1/V3, it first calculates the monthly averages between 1961-1990 for each station. Then it uses these to calculate the monthly ‘anomalies’ since 1850 within a 5×5 grid. Finally, it makes a weighted global average and a yearly global average. If you want it, I can send it to you. The same software applied to the 6000 stations used by CRU exactly reproduces CRUTEM4.
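[For readers without the Perl code, the pipeline described above can be sketched in a few lines of Python on toy data: per-station 1961-1990 monthly normals, then anomalies, then a cell average, then a cos(latitude)-weighted global average. All numbers below are invented for illustration.]

```python
import numpy as np

def station_anomalies(series, years):
    """Subtract the station's own 1961-1990 monthly means ('normals')."""
    series = np.asarray(series, dtype=float)        # shape (n_years, 12)
    base = (years >= 1961) & (years <= 1990)
    return series - series[base].mean(axis=0)

def global_anomaly(cells, years, lats):
    """Average station anomalies within each grid cell, then weight the
    cells by cos(latitude) to form the yearly global anomaly series."""
    cell_series, weights = [], []
    for lat, stations in zip(lats, cells):
        anoms = np.mean([station_anomalies(s, years) for s in stations], axis=0)
        cell_series.append(anoms.mean(axis=1))      # yearly cell anomaly
        weights.append(np.cos(np.radians(lat)))
    return np.average(cell_series, axis=0, weights=weights)

# Two stations in one cell, 12 degrees apart in absolute temperature
# (think sea level vs. 2000 m altitude) but sharing the same trend:
years = np.arange(1850, 2001)
trend = 0.005 * (years - 1850)                      # +0.5 deg C per century
valley = 10.0 + np.repeat(trend[:, None], 12, axis=1)
mountain = -2.0 + np.repeat(trend[:, None], 12, axis=1)
g = global_anomaly([[valley, mountain]], years, [45.0])
```

Note how the 12-degree altitude offset vanishes entirely in the anomalies, which is exactly the assumption being debated in this thread: the method works only insofar as stations in a cell really do share one trend.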

Jeff Alberts
August 11, 2015 7:33 pm

“Is Earth Warming or Cooling?”
You certainly won’t find out by averaging temperatures. No physical basis, junk science.

August 11, 2015 10:12 pm

If temperatures fall
And it gets colder
There’s a much bigger problem
We’ll have to shoulder
But the CAGW alarmists,
They really are clowns,
Are more concerned about
When mankind drowns!
http://rhymeafterrhyme.net/what-if-it-got-colder/

ren
August 11, 2015 10:54 pm
August 12, 2015 12:52 am

I’ve mentioned Casino and technical stock market analysis before, but it’s a good time to bring it up again. The issue with all this trend obsession is that no one trying to assess the trends has any reason to expect that minor variances have any meaning whatsoever. Like a roulette player looking back over past results of red or black, all your trend lines only have significance in the past. They mean nothing, they tell you nothing. The more complex the analysis, the bigger the bs artist conducting it. It’s just another fraud. Just another way to artificially create an “expert” who hopes to get financial or sociological reward for his efforts.
There is no Greenhouse Effect. This has been empirically and definitively shown many times. So anyone speculating over “trends” is just wasting people’s time with trivia or light entertainment.

Neo
August 12, 2015 10:01 am

What Karl et al. (2015) showed, beyond a shadow of a doubt, is that “climate scientists” can torture the data to make it say almost anything.

1sky1
August 12, 2015 2:35 pm

Even in the English-speaking world, where max/min thermometry is most prevalent, monthly average max/min data are rarely obtained by identical algorithms. Thus the global average diurnal range based upon a hodge-podge of non-uniform records, such as shown in Figure 1, needs to be interpreted with great caution. What has been very widely ignored in “climate science” is that urban growth usually narrows the diurnal range by increasing night-time temperatures much more strongly (especially in winter) than any credible attribution to GHGs, while the effect upon daytime temperatures (especially in summer) is much weaker.

robinedwards36
August 12, 2015 2:50 pm

I would imagine that anyone who has written multiple regression software and used it in a commercial or scientific environment will be aware of the dangers of extrapolation. Any extrapolation is presumably intended for some sort of forecasting purpose. The very sound conventional advice, if your model is a polynomial, is “don’t extrapolate.” The higher the order of the polynomial, the more dangerous it is to use the regression for future projections; 6th order is virtually unheard of, however wonderful the fit over the data range. It happens that with the TMax data, where the most recent observations end at the close of 2014, the short-term slope is very slight, so for a few years little harm will come. But after a few more, a precipice-like decrease in the projections occurs.
It is interesting to examine the residuals from the 6th order fit. Superficially “normal” in appearance, calculation of their skewness and kurtosis reveals that although they are very symmetrical, they are very much more peaked than the expected or hoped for normal distribution, k being 3.06. Something is odd about this fit. Again I think a warning about high order polynomials.
There is an alternative way of looking at “noisy” time series which does not rely on any sort of preliminary smoothing, which uses all the available data, and which does not impose a model on the series. It lets the data speak for themselves. It is useful in the identification of possible discontinuities, which helps in the development of piecewise fits to the data, normally using linear models over restricted ranges. These could be second order if the primary analysis indicates that a steadily increasing or decreasing trend in the original observations may be present. My research seems to suggest that much of climate change occurs because of very rapid or step changes, and indeed some can be found in the TMax series, for example at 1937.
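[Editor's illustration of the extrapolation warning above, on synthetic data rather than TMax: a 6th-order fit that tracks the data well in-range diverges wildly only a few units past it.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples of a smooth function on [0, 10].
x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + rng.normal(0.0, 0.05, x.size)

coeffs = np.polyfit(x, y, 6)

# In-range, the fit tracks sin(x) closely; a few units beyond the data
# the polynomial's leading terms take over and the 'projection' explodes.
in_range_error = np.max(np.abs(np.polyval(coeffs, x) - np.sin(x)))
extrap_error = abs(np.polyval(coeffs, 14.0) - np.sin(14.0))
```

The error at x = 14 is orders of magnitude larger than anywhere inside the fitted range, which is the "precipice-like" behavior described above.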