NASA GISS Tweaks the Short-Term Global Temperature Trend Upwards

SEE UPDATE 2 AT END OF POST

Guest Post by Bob Tisdale

GISS released its August 2014 global surface temperature data today. As I was preparing the graphs for the August 2014 surface and lower troposphere temperature update, I noticed a sizeable jump in the short-term trend in the GISS data. (I’ll try to post the full update this evening.) The August GISS LOTI value is higher than July’s, but it should not have had that much of an effect on the trend for the period of January 1998 to present. Not too surprisingly, much of the increase in trend was caused by adjustments to data from 2000 to 2013.

Figure 1 compares the short-term annual trend of two recent versions of the GISS global surface temperature data, from 1998 to 2013. The version as of August 7, 2014 (through June 2014) is available through the Wayback Machine here, and the August 2014 update is available through the GISS website here.

Figure 1

Now keep in mind that we’re not looking at the 2014 data, so any variations this year do not affect these trends. In June 2014, the 1998-2013 trend was 0.062 deg C/decade; a few months later, it jumped up to 0.066 deg C/decade.
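For readers who want to check trends like these themselves: each figure is just an ordinary least-squares slope over the annual values, rescaled to deg C/decade. The sketch below uses a synthetic placeholder series, not the actual GISS LOTI values, to show the calculation:

```python
# Sketch: a least-squares trend in deg C/decade from annual anomalies.
# The anomaly values below are illustrative placeholders with a built-in
# slope, NOT the actual GISS LOTI series.
import numpy as np

def trend_per_decade(years, anomalies):
    """Ordinary least-squares slope, converted from deg C/yr to deg C/decade."""
    slope, _intercept = np.polyfit(years, anomalies, 1)
    return slope * 10.0

years = np.arange(1998, 2014)                 # 1998..2013 inclusive
anomalies = 0.0062 * (years - 1998) + 0.4     # synthetic series with a known slope
print(round(trend_per_decade(years, anomalies), 3))
```

Running the same calculation against two different vintages of the same dataset is all it takes to reproduce a comparison like Figure 1.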

The old short-term trend must not have been high enough. GISS must not like it that the UKMO’s HADCRUT4 data is catching up with them during this period. Can’t have that.

It has been said before. It will be said again. The adjustments always seem to add to global warming.

PS: Yes, I realize we’re discussing trends that differ by thousandths of a deg C/decade. But these small changes keep coming, and they add up.

UPDATE (September 15, 2014): Sorry, I should’ve included a graph with the year-to-date (January to June) 2014 data to also show the impacts of the tweaks on this year. See Figure 2.

Figure 2

With the adjustments, 2014 has a better chance of matching or breaking records.

That explains it.

# # #

UPDATE 2 (September 16, 2014): Animation 1 compares GISS trend maps. One was downloaded on June 12, 2014. The second was downloaded yesterday, September 15, 2014. To complicate the comparison, GISS recently revised their Robinson projection maps. On their Updates to Analysis webpage, they write:

September 15, 2014: Color maps using the Robinson projection are now presented without contour smoothing, since that process occasionally results in skipping some color bands.

It appears, however, that there is new data in the Arctic, north of eastern Siberia. GISS must’ve found an island in the Arctic Ocean with some data so they could infill that region, which had previously been missing data.

GISS Trends Before and After

Animation 1

September 15, 2014 11:37 am

No surprise. This is how MatheMannics and AlGoreithms work.

Resourceguy
September 15, 2014 11:45 am

Ka-ching

Frank K.
September 15, 2014 12:06 pm

Folks – please remember that this analysis by GISS is their **interpretation** of the NCDC data. Their methods are based on a lot of questionable assumptions (to say the least) and the numbers mean very little in terms of actual trends (and are certainly thermodynamically meaningless). In the modern era, the satellite data is, I believe, the most reliable indicator of the “average earth temperature”.
In other words, just ignore the GISS analysis, much like you would ignore the Weather Channel’s cute naming of low pressure systems and cold fronts… [heh!]

September 15, 2014 12:11 pm

Thanks, Bob, I think.
I’m afraid that the new way of doing science might take hold. That would be catastrophic, man-made, and even global. But warm? It sends shivers down my spine.

Mary Brown
September 15, 2014 12:21 pm

What will happen when the trend spread between satellite temps and land-based adjusted temps becomes too huge?
In other words, how long can they get away with this ?

Frank K.
Reply to  Mary Brown
September 15, 2014 12:30 pm

“What will happen when the trend spread between satellite temps and land-based adjusted temps becomes too huge?”
Remember – the GISS trends are **meaningless** in terms of actual, thermodynamic temperatures. For example, if their temperature anomaly “index” (to put it charitably) increases by 10% over one year, does that mean that the average temperature of the earth increased by that much? Of course not! The true average may in fact have increased, but probably not nearly as much. And given all of the “corrections” and “adjustments” it is all but impossible to compare their index values with those derived from historical data.
Bottom line – GISS anomalies have little to do with actual temperatures. It is mostly for entertainment purposes only.

Rud Istvan
Reply to  Mary Brown
September 15, 2014 1:12 pm

UAH and RSS constrain them a bit after 1979. Nothing constrains them before unless we vote the rascals out.

GaryW
September 15, 2014 12:27 pm

Truly the saddest part of the adjustment scheme we are seeing is that, from an instrumentation perspective, it is completely invalid. Sure, someone with a strong academic standing can write peer reviewed papers explaining just how correct and wonderful the scheme is, but that does not make it hold water in the real world of instrumentation engineering. It simply mistakes real-world measurement system errors for theoretical random-noise problems, and treats differences between nearby (?) stations as an indication of calibration error.
To justify shifting a set of observations up or down when a step change is assumed, calibration reference readings from both the beginning and the end of the period to be shifted must be available, along with a calibration reference for just after that period. Even that is technically shaky, as it assumes a linear error drift from the beginning to the end of the shifted period. In the case of a whitewashed wooden structure continuously exposed to weather, issues such as dust/dirt build-up and rain wash are just some of the simplest things that can temporarily shift observed temperature values by fractions of a degree.
Next, homogenization algorithms, if they are to be used at all, must always be run against raw values from neighboring sites. That does not appear to be the case. Homogenizing a site’s records against previously homogenized sites destroys actual correlation comparisons between those sites. It is no longer possible to know whether the ups and downs in a site’s records are real or errors when they are compared against a previously, artificially smoothed site’s records.
What is needed is to stop trying to polish good historic temperature data into laboratory-grade data. The information needed to correct real or theoretical deviations from perfect does not exist. No amount of second-guessing the real world will improve the accuracy of that data. No matter what algorithms are used, the accuracy will still remain plus or minus one degree Celsius at best – for every single temperature observation value and every value derived from it.
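The commenter’s last point about homogenizing against already-homogenized neighbors can be illustrated with synthetic data: two nearby stations sharing real month-to-month variability correlate strongly in their raw records, but once one of them has been smoothed (standing in here as a crude proxy for prior homogenization), that shared variability is averaged away and the comparison loses most of its diagnostic value:

```python
# Sketch: two nearby stations share real month-to-month weather variability.
# Smoothing one of them (a stand-in for prior homogenization) removes the
# high-frequency agreement that validates both records. All data synthetic.
import numpy as np

rng = np.random.default_rng(0)
weather = rng.normal(0, 0.5, 600)             # variability both neighbors share
site_a = weather + rng.normal(0, 0.1, 600)    # raw record, station A
site_b = weather + rng.normal(0, 0.1, 600)    # raw record, station B

def smooth(x, w=24):
    """Crude moving-average proxy for a previously homogenized series."""
    return np.convolve(x, np.ones(w) / w, mode="same")

raw_corr = np.corrcoef(site_a, site_b)[0, 1]
smoothed_corr = np.corrcoef(site_a, smooth(site_b))[0, 1]
# The correlation against the smoothed neighbor collapses: month-to-month
# agreement between the raw records has been averaged away.
print(raw_corr, smoothed_corr)
```

The numbers are synthetic, but the effect is generic: any comparison of a raw record against a pre-smoothed one understates how well (or badly) the two stations actually track each other.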

Dodgy Geezer
September 15, 2014 12:37 pm

Is there a reason why no-one can ask for the adjustments to be explained?

TonyL
Reply to  Dodgy Geezer
September 15, 2014 12:55 pm

You are not allowed to ask that question. We know the reason, and that is enough.

Reply to  Dodgy Geezer
September 15, 2014 1:01 pm

You are assuming that adjustments are the reason.
GISS only does one adjustment: UHI.

David A
Reply to  Steven Mosher
September 16, 2014 3:47 am

Steven M, is all the data GISS receives raw? (You know it is not.) Also, there is much logical evidence that the UHI adjustment is wrong.
Steven M, please explain the Iceland adjustments. http://stevengoddard.wordpress.com/2014/09/14/occupy-iceland/
Seriously, Mr. Mosher. Just justify this ONE adjustment. You berate some for not understanding how it is so complicated. Well, Sir, simplify it!! Explain just this one adjustment. (That should not be too hard.)

nutso fasst
September 15, 2014 12:46 pm

Western region Coop station data, as it appears on NCDC websites, has changes to, and deletions from, data logged by observers. Whoever is doing the changing/deleting does not know what they are doing, which is apparent in the data from stations where data is logged in the morning or logged twice per day. The high temperature logged at stations that log data in the morning will be Tmax from the day before, but NCDC data show it as being the high temperature for the day on which it was logged. In some cases this resulted in record low high temperatures announced by the NWS in August being discarded by NCDC.
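The time-of-observation problem described here is mechanical: a max/min thermometer read in the morning holds the previous afternoon’s maximum, so attributing that value to the day it was logged shifts the Tmax record one day late. A minimal sketch of the re-attribution, with hypothetical station values:

```python
# Sketch: a Coop observer reading at 7 a.m. logs a Tmax that almost always
# occurred the previous afternoon. Attributing it to the logging date shifts
# the record one day late; the fix is to credit the prior calendar day.
# Dates and temperatures here are hypothetical.
from datetime import date, timedelta

readings = [
    # (date logged at 7 a.m., Tmax held by the instrument)
    ("2014-08-02", 31.5),   # actually the high of 2014-08-01
    ("2014-08-03", 29.0),   # actually the high of 2014-08-02
    ("2014-08-04", 33.2),   # actually the high of 2014-08-03
]

def shift_to_prior_day(rows):
    """Re-attribute each morning-logged Tmax to the previous calendar day."""
    out = []
    for logged, tmax in rows:
        d = date.fromisoformat(logged) - timedelta(days=1)
        out.append((d.isoformat(), tmax))
    return out

print(shift_to_prior_day(readings))
```

If the archiving agency skips this step, a record low high temperature set on, say, August 1 would appear in the archive under August 2 and could fail to match the NWS announcement, which is consistent with the discrepancies the commenter describes.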

SRJ
September 15, 2014 12:51 pm

From the FAQ section at GISTEMP’s homepage:
“Q. How can we combine the data of the two stations above in a meaningful way?
A. What may be done before combining those data is to increase the new data or lower the old ones until the two series seem consistent. How much we have to adjust these data may be estimated by comparing the time period with reports from both stations: After the offset, the averages over the common period should be equal. (This is the basis for the GISS method). As new data become available, the offset determined using that method may change. This explains why additional recent data can impact also much earlier data in any regional or global time series.”
Source: http://data.giss.nasa.gov/gistemp/FAQ.html
It is a consequence of the way GISTEMP combines data from different stations as explained in the quote.
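The offset step described in that FAQ answer can be sketched in a few lines. The station records below are hypothetical, and the real GISTEMP algorithm works on monthly data across many stations, but the mechanics are the same: shift the shorter record so the common-period means agree, then average the two:

```python
# Sketch of the offset step the GISTEMP FAQ describes: before combining two
# overlapping station records, shift the newer one so that both have the
# same mean over their common period, then average. Values are hypothetical;
# the real algorithm operates on monthly data and many stations.
import numpy as np

def combine_with_offset(old, new, common):
    """Shift `new` so its mean over the `common` years matches `old`, then average."""
    offset = np.mean([old[y] for y in common]) - np.mean([new[y] for y in common])
    shifted = {y: t + offset for y, t in new.items()}
    merged = {}
    for y in set(old) | set(shifted):
        vals = [s[y] for s in (old, shifted) if y in s]
        merged[y] = sum(vals) / len(vals)
    return merged

valley = {1898: 10.0, 1899: 10.2, 1900: 10.1, 1901: 10.3}   # long record
summit = {1900: 4.1, 1901: 4.3, 1902: 4.2}                  # starts later, colder
combined = combine_with_offset(valley, summit, common=[1900, 1901])
```

Because the offset is recomputed whenever new overlapping data arrive, a change in recent values can shift the combined series well before the overlap period, which is exactly the behavior the FAQ answer describes.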

AJB
Reply to  SRJ
September 15, 2014 6:16 pm

You left out the example of two stations referred to above …
“Q. Can you illustrate the above with a simple example?
A. Assume, e.g., that a station at the bottom of a mountain sent in reports continuously starting in 1880 and assume that a station was built near the top of that mountain and started reporting in 1900. Since those new temperatures are much lower than the temperatures from the station in the valley, averaging the two temperature series would create a substantial temperature drop starting in 1900.”
… and the important bit following …
“Another approach is to replace both series by their anomalies with respect to a fixed base period. This is the method used by the University of East Anglia’s Climatic Research Unit (CRU) in the UK. The disadvantage is that stations that did not report during that whole base period cannot be used.
More mathematically complex methods are used by NOAA National Climatic Data Center (NOAA/NCDC) and the Berkeley Earth Project, but the resulting differences are small.”
This implies that homogenisation of the entire raw dataset is reworked from scratch (right back to 1880) as time passes and new values are appended. That in turn means the whole thing is floating about, and the increment or decrement each month has no meaning when compared to values in previous versions. Then we have this at the bottom of both versions of the dataset used to produce the graph below.
“Best estimate for absolute global mean for 1951-1980 is 14.0 deg-C or 57.2 deg-F, so add that to the temperature change if you want to use an absolute scale (this note applies to global annual means only, J-D and D-N !)”
Is it any wonder folk are confused? How many megawatt hours of CO2e have been wasted on blog arguments worldwide due to this misunderstanding of crackpot numerology whose credibility disappears up its own rounding/truncation errors? You cannot create precision where there isn’t any.
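The anomaly-baseline alternative mentioned in the quoted FAQ text can also be sketched briefly. Each station is expressed relative to its own mean over a fixed base period (1951-1980 in GISS’s convention), so stations at very different absolute temperatures carry directly comparable signals, at the cost of dropping any station that does not cover the base period. The station values below are hypothetical:

```python
# Sketch of the anomaly approach the FAQ attributes to CRU: subtract each
# station's own mean over a fixed base period, so a cold summit and a warm
# valley can be averaged directly. Station values are hypothetical.
import numpy as np

BASE = range(1951, 1981)   # the 1951-1980 base period GISS also uses

def to_anomalies(series):
    """Subtract the station's own 1951-1980 mean from every value."""
    base_vals = [series[y] for y in BASE if y in series]
    if len(base_vals) < len(BASE):
        raise ValueError("station does not cover the full base period")
    base_mean = np.mean(base_vals)
    return {y: t - base_mean for y, t in series.items()}

# Two stations with identical warming but very different absolute levels:
valley = {y: 10.0 + 0.01 * (y - 1951) for y in range(1951, 1990)}
summit = {y: 4.0 + 0.01 * (y - 1951) for y in range(1951, 1990)}
a_valley, a_summit = to_anomalies(valley), to_anomalies(summit)
# After conversion, both stations carry the same anomaly signal.
```

Unlike the offset method, nothing here is recomputed when new data arrive, but the `ValueError` branch shows the disadvantage the quote mentions: a station missing part of the base period cannot be used at all.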

AJB
Reply to  AJB
September 15, 2014 8:44 pm

I have no idea how this code was put together but unwitting penny millionaires are everywhere.
If you look at the J-D annual averages compared to monthly values it looks as though they are all truncated rather than rounded. But maybe they’re derived independently of monthly averages and that’s just coincidence. The output format also suggests integer arithmetic (based on hundredths of a degree) may have been used throughout, which would naturally truncate intermediate and output values by default. Usual line of mod and conditional increment required. It could of course all be floating point in some language with output formatting that does the same. IIRC some older varieties of FORTRAN used to do that.
Even if it’s using traditional 5 up 4 down rounding it’ll still bias older values down and new ones up with masses of individual small values increasing over time centred around zero. 5 up 4 down has a natural bias. Tax man’s rounding (6 up 4 down, 5 to the nearest even higher order digit) would likely make it a bit more stable.
Academic as far as actual warming is concerned (who’s interested in hundredths of a degree FFS). But at least it might reduce the waste of coal on these endless silly arguments over something that has little basis in physical reality in the first place.
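The rounding-bias claim above is easy to check numerically. A minimal sketch (synthetic values, not GISS output) comparing truncation, round-half-up, and round-half-to-even (the “tax man’s rounding” the commenter mentions, and Python’s `decimal` default) when three-decimal values are quantized to hundredths:

```python
# Sketch: mean error introduced when quantizing three-decimal values to
# hundredths under three rounding modes. Truncation biases positive values
# downward; round-half-up has a small upward bias on exact .xx5 ties;
# round-half-to-even splits the ties and is unbiased. Synthetic values.
from decimal import Decimal, ROUND_DOWN, ROUND_HALF_UP, ROUND_HALF_EVEN

# every three-decimal value from 0.000 to 0.999
vals = [Decimal(n) / 1000 for n in range(1000)]

def mean_error(mode):
    """Average (rounded - true) over all values, for one rounding mode."""
    errs = [v.quantize(Decimal("0.01"), rounding=mode) - v for v in vals]
    return sum(errs) / len(errs)

print(mean_error(ROUND_DOWN))       # negative: truncation loses about 0.0045 on average
print(mean_error(ROUND_HALF_UP))    # slightly positive, from the .xx5 ties
print(mean_error(ROUND_HALF_EVEN))  # zero: ties split evenly up and down
```

This is only the bias per quantization step; whether GISTEMP’s actual code truncates, rounds, or uses integer arithmetic is, as the commenter says, a guess from the output format.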

September 15, 2014 12:54 pm

The 1999 to 2014 trend – no context of course – is 1.3 C/100 years. Even taken out of context like that, there is nothing to support the CAGW narrative.

steverichards1984
September 15, 2014 1:08 pm

SRJ: I am glad they said that their method combined data in a meaningful way.
It is a shame that the result is meaningless!

September 15, 2014 1:44 pm

I think GISTEMP must update its time series if it believes that is necessary. You can compare their data with those of other organizations (NOAA, HADCRUT, Berkeley, etc.). I hope that these organizations are working independently. For the GISTEMP LOTI data through Aug 2014, I found the following trends of running annual means: 15 yr: 0.090+-0.01 °C/Dec; 30 yr: 0.170+-0.05 °C/Dec; 60 yr: 0.135+-0.02 °C/Dec. Obviously, the trends strongly depend on the time interval used.

Reply to  Paul Berberich
September 15, 2014 1:55 pm

You all look at the raw data a lot more than me… what happens when you plot the temperature against the global wind patterns – they change as the jet streams move?

george e. smith
September 15, 2014 1:57 pm

It is my understanding that GISS, UAH, RSS, HADCRUT, whatever, ALL report at some periodicity (daily / weekly / monthly / yearly / decadally / whatever) a single number, that is some computed “Temperature” measured on some arbitrary NON-SI temperature scale, commonly referred to as an anomaly (it surely is).
So is there in existence, a peer reviewed paper, or series of such papers, that gives a rigorous physical proof, that the earth weather / climate / whatever , can indeed be completely and unambiguously represented by such a single Temperature (anomaly) number reported periodically, without any reference to any other physical parameter of the earth climate system.
What is it about Temperature, that enables it to survive alone, as the ONLY index of earth climate ?? Why do all other variables cancel out as quite irrelevant to earth climate and climate history ??
Just asking of course.
How many of the 57 approved climate models give a proof, of the total cancellation of all other physical variables besides Temperature ??

Reply to  george e. smith
September 15, 2014 2:34 pm

When I attended university in the early 60s, a peer review request required all data, methods, formulas, math, and all pertinent information for the reviewers to attempt to recreate the hypothesis. None of the climate papers are really peer reviewed, as the law now permits the authors to keep datasets, methods, and math proprietary [aka undisclosed] – so how can one review political OPINION?
Surely not per the scientific method. I would have been flunked out of many classes if I had submitted that kind of paper for a grade, much less for peer review. The e-greens were not getting enough papers through to apply for more grant money, so they started their own journals and selected their own peer reviewers.
Humm, sounds like a money scam to me.

David A
Reply to  george e. smith
September 16, 2014 3:54 am

big fat 0

Bruce Sanson
September 15, 2014 1:57 pm

It now seems to me that a large segment of the so-called climate science community (N.Z. scientists included) is quite prepared to produce “fantasy” data in support of their own personal beliefs. Do they really think they are fooling anyone? I suspect only the already deluded will place any value on the above tortured data set.

Reply to  Bruce Sanson
September 15, 2014 2:37 pm

Check the Australians – they have bailed on the false science of CAGW.

John Peter
September 15, 2014 2:40 pm

So maybe Steven Goddard is on to something after all. Looks like more analysts are discovering what he has been promoting for some time now.

September 15, 2014 2:42 pm

Every little bit they can do for the upcoming “Summit” in NY. WMO and Nasa made their contribution with the news on the overwhelming amount of CO2 with no mention of NO warming. I guess they are adding what warming they can in time. As in a billy pail of bog water, the bugs are swimming faster and faster as the end approaches.

otsar
September 15, 2014 2:48 pm

The warming adjustments will make the descent appear much steeper when a cooling cycle is realized.
It will be interesting to watch the acrobatics.

September 15, 2014 2:55 pm

Don’t forget the impact of the Zombie Weather Stations
Want some more warming? Close down some more of the cooler stations and “manufacture” readings more friendly to “The Cause.”

James Abbott
September 15, 2014 3:10 pm

Back to the figure for August:
NASA GISS LOTI is running at +0.64 C for the year to date including August. So 2014 could be a top 3 year on their data set.
Because recent years have been consistently warm, with no cooler years such as 2008 to drag down the average, the 5 year running mean could go marginally record warm when the 2014 data is added in.

David A
Reply to  James Abbott
September 16, 2014 3:55 am

They have a goal.

SIGINT EX
September 15, 2014 3:27 pm

Just Gavined up Schmidty data.
Ha ha

Editor
September 15, 2014 3:36 pm

Bob, in my post http://wattsupwiththat.com/2014/07/03/giss-hockey-stick-adjustments/ I included a zip file https://wattsupwiththat.files.wordpress.com/2014/03/work.zip with 94 monthly GISS downloads between August 2005 and May 2014, but there are some gaps in the 2006 and 2007 downloads. Read the readme.txt file after unzipping, for instructions.
Feel free to download it and use it in your analysis. I also have June/July/August on my hard drive. Let me know if you need any of them.

AJB
September 15, 2014 3:38 pm

There must be a trend and a hockey stick in there somewhere Bob. Every little yelps …
http://s28.postimg.org/icmtt7g6l/jestco.png
[Rather “every little yelp hurts” ? .mod]

Bob Diaz
September 15, 2014 4:28 pm

So increased CO2 results in increased adjustments in data in the upward direction. Can I have a few million in grant funds to prove this alarming trend?
;-))

Bill H
September 15, 2014 5:00 pm

How does GISS square this with the CRN data that shows a definite cooling trend? Something stinks to high heaven.

wayne
September 15, 2014 7:46 pm

On average, over the period since 1940, the adjustments to all global datasets have kept up their upward trek at a mere 0.000625°C per month. That seems rather insignificant, and you would almost never be able to detect it, but over the 1200 months in a century it adjusts the datasets by 0.75°C/century. The surprising result of this – cooling the past while always assuming today’s readings are the correct ones – creates this graph when you go back and merely remove that 0.000625°C from each month since 1940. That rate of 0.75°C/century comes from published adjustments from two datasets; a third had the slope at 0.81°C/century, so this is approximate. Kind of like the ‘thousand cuts’.
http://i44.tinypic.com/29axhua.png
(and with more smoothing)
http://i39.tinypic.com/1118rnl.png
This is basically my attempt to “un-adjust” the HadCRUT4 adjustments. You get basically the same if you use GISS or USHCN datasets and they have the adjustments published, in chart form at least.
I always find this telltale of what has happened… it has all been in the adjustments!
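As a rough illustration of the arithmetic in this comment (using a synthetic series, not the actual HadCRUT4 data), the “un-adjustment” amounts to subtracting a linear drift of 0.000625 deg C/month, anchored so that the most recent month is left unchanged and earlier months are raised:

```python
# Sketch of the commenter's "un-adjustment": remove an assumed linear
# adjustment drift of 0.000625 deg C/month since 1940, keeping the latest
# month fixed (today's readings assumed correct, the past un-cooled).
# The flat series below is synthetic, NOT actual HadCRUT4 anomalies.
import numpy as np

RATE = 0.000625          # deg C per month, as stated in the comment

def unadjust(months_since_1940, anomalies):
    """Subtract the assumed adjustment drift, anchored at the latest month."""
    months = np.asarray(months_since_1940, dtype=float)
    drift = RATE * (months - months.max())   # 0 at the latest month, negative earlier
    return np.asarray(anomalies, dtype=float) - drift

months = np.arange(0, 897)        # Jan 1940 through roughly Sep 2014, monthly
synthetic = np.zeros(897)         # flat series, to show the drift removal alone
restored = unadjust(months, synthetic)
# restored[0] is raised by 896 * 0.000625 ~ 0.56 deg C; restored[-1] is unchanged
```

On a flat input the output is just the removed drift itself, which makes the cumulative size of 0.000625 deg C/month over seven decades easy to see.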