Impact of "Pause-Buster" Adjustment on GISS Monthly Data

Guest Post By Walter Dnes:


Image Credit and a special thanks to Josh

With GISS incorporating NOAA/NCEI “Pause-Buster” adjustments into their monthly anomalies as of the June 2015 data, our friend Daft made another appearance. I also noticed that my temperature-tracking spreadsheets, at home and on Google Docs, both needed their Y-axes extended, because the highest anomaly in the data (i.e., January 2007) had been raised. This prodded me to check the progress of the GISS anomalies over time.

I only started downloading GISS data in 2008, plus I picked up a few older uploads, back to 2005, from the “Wayback Machine”. This accounts for the limits on my comparisons. GISS data downloaded from here is given in hundredths of a Celsius degree; e.g. 15 ==> 0.15 C degree. This allows the numbers to be stored as integers.

For those of you who wish to do your own analyses, the downloads are available in a zip file saved to WUWT here. The files are in two formats. The files named gissYYYYMM.txt (“YYYY” = year and “MM” = month) are in the original tabular download format, with 12 months of data per row. This is human-readable, but very difficult to import into a spreadsheet. For each such file, I’ve generated a file named gYYYYMM.txt, which is suitable for importing into a spreadsheet. The generated files contain the date in decimal format, a comma, and the anomaly. As noted above, the anomaly is an integer equal to 100 times the actual anomaly. All files are in DOS/Windows format; Linux/Unix/BSD users, please run dos2unix on the files for use in a POSIX environment. Note that this data set uses the corrected data issued by GISS on July 19th. For details, see the “News” section of GISS’s website, where they acknowledge Nick Stokes for noticing a recent bug in the GISS data.

First, let’s look at the difference between GISS anomaly data from May 2015 and June 2015.

Walter Dnes – Click the pic to view at source
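A difference series like the one plotted above can be computed from any two of the g*.txt files once they are loaded as (decimal_year, anomaly) pairs. A minimal sketch, assuming both downloads are represented that way:

```python
def anomaly_diff(old_records, new_records):
    """Difference (new minus old) in degrees C, for months present
    in both downloads. Each argument is a list of
    (decimal_year, anomaly_in_C) tuples."""
    old = dict(old_records)
    return [(year, round(anomaly - old[year], 2))
            for year, anomaly in new_records
            if year in old]
```

Months present in only one download (GISS occasionally extends or trims the series) are simply skipped rather than guessed at.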

There were additional adjustments going back to 2005. Here is the graph of combined adjustments between August 2005 and June 2015.

Walter Dnes – Click the pic to view at source

As mentioned earlier, I had to extend the Y-axis in my graph because the temperatures were adjusted upward. A quick analysis showed that the highest anomaly in almost every download (starting from 2007, obviously) was the January 2007 anomaly. The only exception was the September 2012 download, which showed the March 2002 anomaly 1/100th of a Celsius degree higher than the January 2007 anomaly. The following graph shows the inexorable upward march of the March 2002 and January 2007 anomalies. Seven years ago, in mid-2008, GISS told us that the January 2007 anomaly was +0.85. Today, they’re telling us that the January 2007 anomaly was +0.97. I wonder what they’ll be telling us seven years from now.

Walter Dnes – Click the pic to view at source

This encouraged me to look at the lowest anomalies for each download. From my earliest available download, August 2005, through May 2015, the lowest anomaly was always for January 1893. But in the June 2015 download, the January 1893 anomaly jumped up by 0.17 of a Celsius degree, giving up the lowest-anomaly ranking to December 1916. Ten years ago, back in mid-2005, GISS was telling us that December 1916 was -0.56. Today they’re saying December 1916 was -0.77. Again, what will it be ten years from now?

Walter Dnes – Click the pic to view at source
Chris Hanley
July 24, 2015 3:23 pm

Now apparently it is specifically the sea surface temperature anomaly that is the indicator of human-induced global warming, and the Met Office Hadley Centre claims to have SST data (for 70% of the Earth’s surface) back to 1845 to an accuracy of one tenth of a degree C.
Even a layman (or is it only a layman?) can see this is utterly absurd.

D.I.
July 24, 2015 4:48 pm

Anomaly BS dies by its own sword-
http://data.giss.nasa.gov/gistemp/abs_temp.html
The last Q/A says it all.

Khwarizmi
Reply to  D.I.
July 24, 2015 10:09 pm

That’s a fascinating and refreshingly honest Q&A. I thought it was the pretense of precision rather than the use of “anomalies” that died in the exchange:
================
The Elusive Absolute Surface Air Temperature (SAT)
Q. What do I do if I need absolute SATs, not anomalies ?
A. In 99.9% of the cases you’ll find that anomalies are exactly what you need, not absolute temperatures. In the remaining cases, you have to pick one of the available climatologies and add the anomalies (with respect to the proper base period) to it. For the global mean, the most trusted models produce a value of roughly 14°C, i.e. 57.2°F, but it may easily be anywhere between 56 and 58°F and regionally, let alone locally, the situation is even worse.
====================

July 24, 2015 5:00 pm

Why not just ask each IPCC “scientist” to draw their own data set picture from what they think is happening? Just give them a blunt crayon each and let them go freehand. Sounds just as scientific to me, and a lot cheaper.

Charles Hendrix
July 24, 2015 6:34 pm

In statistics, an anomaly is data that’s well beyond the bounds of random variation and is therefore worthy of investigation: for instance, key-entry errors, or erroneous labeling of samples in a lab. Deviations are simply departures from the expected, whether large or small. Somehow, somewhere along the way, all deviations now seem to be anomalies.
Just saying. (Curmudgeon – retired statistician)

Claude Harvey
July 24, 2015 6:59 pm

The penguins and the polar bears are pretty much gone, the Arctic is ice free in summer, the ski industry is destitute and our younger children don’t know what snow looks like; all as AGW theorists predicted in the 1990’s. Oh, and the Stature of Liberty is up to her ankles in sea water. I admit the Pope might not be a Catholic and they didn’t see that one coming, but it seems to me their overall record of prognostication is outstanding. Why are you guys picking these nits?

Claude Harvey
Reply to  Claude Harvey
July 24, 2015 7:01 pm

Make that “Statue”.

Reply to  Claude Harvey
July 25, 2015 2:35 am

Actually “Stature” fits pretty well. Except it is up to her hips.

David A
July 24, 2015 8:22 pm

The surface record is diverging from the satellites at two degrees per century…
http://realclimatescience.com/wp-content/uploads/2015/07/ScreenHunter_243-Jul.-24-19.38.jpg

Reply to  David A
July 25, 2015 11:02 am

The surface record is diverging from the satellites at two degrees per century…
Since 2002. How about since the start of the satellite record, 1979?

Reply to  Peter Sable
July 25, 2015 11:05 am

Not surprising since satellites don’t measure surface temps.

David A
Reply to  Peter Sable
July 25, 2015 6:12 pm

Joel, you repeat without learning. The troposphere, per CAGW theory, is supposed to warm more than the surface.
Peter, the divergence is increasing throughout as the adjustments continue; however, for a time there was a much stronger phase harmony between the surface and the satellites, with El Nino consistently showing greater amplitude in the satellite data sets. The methodology for the satellites is almost identical over the entire record, and their verification against weather balloons continues. The surface data sets are contrary to known physics.

David A
Reply to  Peter Sable
July 25, 2015 7:03 pm

Peter, also, the number of USHCN stations that are part of their database but not being used is increasing dramatically. Up to 40% of the station values are now filled in from other stations each month. It is not, IMV, a coincidence that the rural stations are the ones going missing, being filled in from urban areas.
https://stevengoddard.wordpress.com/2014/12/13/ushcn-replacing-rural-temperatures-with-urban-ones/
In addition to the obvious increase in the anomaly from spreading UHI, this continues the trend of reducing the number of stations, thereby mathematically increasing the anomaly average.

charles nelson
July 24, 2015 9:28 pm

A few years ago some of my esteemed colleagues calculated that 600 Angels could dance on the head of a pin. That has been adjusted recently and we now believe that 603 Angels can dance on the head of a pin.

Reply to  charles nelson
July 25, 2015 2:37 am

And here I thought it was 603.0025

Jeremy Shiers
July 25, 2015 2:38 am

A few thoughts following comments by rgb and Pat Frank
The HadCRU people seem to be under the impression that they can take a single value for the measurement error of a thermometer, by which they mean every thermometer used at a weather station anywhere in the world from 1850 to now.
HadCRU claims (based on what, I don’t know) that homogenisation will remove the effect of systematic errors (yeah, right), so all that is left is measurement error AND, as they take a monthly average, the error on this average will be reduced a la standard error.
Paul Homewood and Tony Heller have shown that some temperature ‘measurements’ are actually estimates; what is the error on an estimate, I wonder?
Anthony and the Surface Stations project have shown there are significant changes to weather stations and their surroundings over time. HadCRU makes adjustments of a few tenths of a degree to cater for these. It is not clear how adjustments of tenths of a degree can adjust for effects which are greater than 1 degree.
EM Smith has shown that in the US, whatever temperatures are measured as, they are at times recorded to 1F, and there may be times when measurements are rounded to even degrees F.
USCRN is a network of around 115 ‘pristine’ weather stations designed to avoid the issues found by the Surface Stations project. The data is freely available, and the trend from 2004 to 2015 is falling at 0.05F/year, i.e. 5F/century. The climate change/AGW panic is based on a rise of 0.8C over 130 years, about 1.1F/century.

Ashby Lynch
Reply to  Jeremy Shiers
July 25, 2015 4:43 am

Mr. Shiers, could you present more information on this? A plot, a map of station locations, etc. Thanks for pointing out this data. How does it match with the satellites?

Jeremy Shiers
Reply to  Ashby Lynch
July 25, 2015 5:14 am

Hi Ashby
the data is available here
http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2004&endyear=2015&month=12
Below the graph is a small Excel icon which lets you download the data as CSV, then import it into Excel (or Python, R, …) to create a plot and calculate a trend line.
cheers
Jeremy
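The trend-line step Jeremy describes can be done in plain Python once the CSV is downloaded. The function below is a generic ordinary-least-squares slope, nothing specific to the NCDC file (whose column layout you should check against the actual download):

```python
def linear_trend(years, values):
    """Ordinary least-squares slope of values against years,
    in units of the values per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den
```

Multiply the result by 100 to express it per century, as in the comparison above.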

David A
July 25, 2015 3:57 am

Jeremy says, “USCRN is a network of around 115 ‘pristine’ weather stations designed to avoid the issues found by surface station project. The data is freely available and the the trend from 2004 to 2015 is falling at 0.05F/year ie 5F/century. The climate change/agw panic is based on a rise of 0.8C over 130 years, about 1.1F/century”
———————————–
Exactly so! Curious, is it not, that the USCRN network is damn near a match for UAH and RSS.

Ashby Lynch
Reply to  David A
July 25, 2015 4:49 am

Perhaps the next billion dollars of global warming research money should be spent on extending this network over the globe.

David A
Reply to  Ashby Lynch
July 25, 2015 10:30 pm

It would be great, and of course not expensive. However, the satellite data sets are, in effect, just that.

July 25, 2015 9:01 am

GISS, HADCRUT4, and UAH data series differ slightly, but the only one having been “massaged” over time is GISS, with a slight tendency toward increased warming and toward transforming the so-called pause into a rising pattern (why might that be?).
In any case, no useful correlation can be drawn from the data since the beginning of the pause (approx. 1998).
Graphic representations of these data sets, as-is and with various smoothing techniques:
see http://bit.ly/1TZiNGJ

David A
Reply to  Michel
July 25, 2015 7:07 pm

It is not slight. UAH and RSS both show 1998 as the warmest year by a margin a thousand percent larger than the margin by which GISS shows 2014 or 2015 as the warmest year.

NucEngineer
July 25, 2015 12:06 pm

And Winston looked at the sheet handed him:
“Adjustments prior to 1972 shall be -0.2 degrees and after 1998 shall be +0.3 degrees.”
Winston wondered at the adjustment to the data. At this point, no one even knows if the data, prior to his adjustments, was raw data or already adjusted one or more times previously.
It didn’t matter. All Winston was sure of was that one of the lead climatologists needed more slope to match his computer model outputs. He punched out the new Fortran cards and then dropped the old cards into the Memory Hole, where they were burned.
“There!” Winston exclaimed to himself. “Now the temperature data record is correct again; all is double-plus good.”