Why the NOAA Global Temperature Product Doesn’t Comply With WMO Standards

The opening paragraph of NOAA’s press release NCDC Releases June 2013 Global Climate Report begins with alarmist statistics and an error (my boldface):

According to NOAA scientists, the globally averaged temperature for June 2013 tied with 2006 as the fifth warmest June since record keeping began in 1880. It also marked the 37th consecutive June and 340th consecutive month (more than 28 years) with a global temperature above the 20th century average. The last below-average June temperature was June 1976 and the last below-average temperature for any month was February 1985.

First, the error: According to the NOAA Monthly Global (land and ocean combined into an anomaly) Index (°C), the “last below-average temperature for any month” was in reality December 1984, not February 1985. Makes one wonder: if they can’t read a list of temperature anomalies, should we believe they can read thermometers?

Second, it’s very obvious that NOAA press releases have degraded to nothing but alarmist babble. More than two years ago, NOAA revised the base years they use for anomalies for most of their climate metrics. The CPC Update to Climatologies Notice webpage includes the following statement (my boldface):

Beginning with the January 2011 monthly data, all climatologies, anomalies, and indices presented within and related to the monthly Climate Diagnostics Bulletin will be updated according to current WMO standards. For datasets that span at least the past 30 years (such as atmospheric winds and pressure), the new anomalies will be based on the most recent 30-year climatology period 1981-2010.

Apparently, the NCDC didn’t get the same memo as the CPC. The Japanese Meteorological Agency (JMA) got the memo.

The following graph compares the NCDC global surface temperature product from January 1979 to June 2013, with the base years of 1901-2000 used by the NCDC and the base years of 1981-2010 recommended by the WMO.

Figure 1
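The rebasing shown in the figure amounts to nothing more than a constant shift: subtract the mean anomaly over the new base period from every value. A minimal sketch of that arithmetic, using invented numbers rather than the actual NCDC series (the `rebase` helper is hypothetical):

```python
# Rebasing a monthly anomaly series to a new reference period is a constant
# shift: subtract the mean anomaly over the new base period. All values here
# are invented for illustration, not NCDC data.

def rebase(anomalies, years, new_base):
    """Re-express anomalies relative to a new base period (start, end)."""
    start, end = new_base
    base_vals = [a for a, y in zip(anomalies, years) if start <= y <= end]
    offset = sum(base_vals) / len(base_vals)
    return [round(a - offset, 3) for a in anomalies]

years = list(range(1996, 2014))
anoms_1901_2000 = [0.30 + 0.02 * i for i in range(len(years))]  # toy warming trend

anoms_1981_2010 = rebase(anoms_1901_2000, years, (1981, 2010))

# The shift changes every value by the same amount, so the ranking of
# months is untouched -- only the numbers relative to zero change:
assert sorted(range(len(years)), key=lambda i: anoms_1901_2000[i]) == \
       sorted(range(len(years)), key=lambda i: anoms_1981_2010[i])
```

Because the warmer 1981-2010 climatology sits above the 1901-2000 one, every anomaly shrinks by the same offset; what changes is how many months land above or below the zero line.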

If the NCDC had revised their base years to comply with WMO recommendations, the press release wouldn’t have the same alarm-bell ring to it:

According to NOAA scientists, the globally averaged temperature for June 2013 tied with 2006 as the fifth warmest June since record keeping began in 1880. It also marked the 17th consecutive June and 16th consecutive month (less than two years) with a global temperature above the 1981-2010 average. The last below-average June temperature was June 1996 and the last below-average temperature for any month was February 2012, though December 2012 was basically zero.
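The streak counts quoted in both versions of the press release follow mechanically from where the zero line sits. A minimal sketch of the counting, again with invented anomaly values rather than the NCDC series:

```python
def consecutive_above(anomalies):
    """Count how many of the most recent monthly anomalies are above zero."""
    count = 0
    for a in reversed(anomalies):
        if a > 0:
            count += 1
        else:
            break
    return count

# Toy series: rebasing to a warmer 1981-2010 climatology lowers every value
# by the same offset, pushing some months below zero and shortening the streak.
old_base = [0.15, 0.40, 0.55, 0.48, 0.62, 0.58]   # vs. 1901-2000 (invented)
new_base = [a - 0.42 for a in old_base]            # same data, warmer baseline

print(consecutive_above(old_base))  # 6 -- every month "above average"
print(consecutive_above(new_base))  # 4 -- the streak breaks where anomaly < 0.42
```

Same data, same trend, but a much less dramatic "consecutive months above average" headline.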

The monthly global surface temperature stats would be pretty boring if NOAA complied with WMO standards. Pretty boring indeed.


62 Comments
izen
July 22, 2013 5:23 am

Errr…
The reference period chosen makes absolutely NO difference to the magnitude of the anomalies or the ranking of which year or month is hottest or fourth warmest or whatever.
Only the mathematically illiterate are going to think that a 0.22 °C anomaly on a 1981-2010 baseline is somehow ‘better’ than a 0.64 °C anomaly on the 1901-2000 reference period.

John West
July 22, 2013 5:25 am

So, it’s the warmest decade in a century (after adjusting away the dust bowl). It’s also the coolest millennium in this epoch. Reporting in this one-sided manner shows these agencies’ press releases are no longer controlled by objective observers but are indeed controlled by advocates of a political agenda.
How would the observations differ if the warming were natural and cyclical over longer timescales than our measly record?

July 22, 2013 5:40 am

Thanks, Bob. A very good point indeed.
It seems like NOAA is willing to go down in warmista flames to support a dying agenda that causes poverty.

July 22, 2013 5:58 am

Who cares what the baseline is? Trends show up regardless, and anyone intelligent and/or interested enough to be reading articles like this can easily understand the shifting anomalies. UAH shifted its baseline the other way. Comparisons to average 20th century temperatures are useful, as are comparisons to the 1981-2010 averages. The major temp keepers have always used different baselines, and there are numerous sources that plot them together. Much ado about nothing.

Carbon500
July 22, 2013 6:03 am

The Central England Temperature record (CET) doesn’t suggest that June 2013’s temperature differs in any notable way from those of preceding centuries. This June it was 13.6 degrees Centigrade; in 1659 it was 13.0 degrees.
A simplistic observation? Of course, but a look at all the June temperatures over the years indicates that recent June temperatures are typical for the month.
At no time, from 1659 to the present day, has the average yearly temperature quoted in the CET gone above 11 degrees Centigrade.
Yes, averages lose detailed information about what happened month by month in any given year (the average temperatures in 1956, 2010, 1902, 1754, and 1659 are all given as 8.83 degrees Centigrade), but isn’t it interesting that 2010 is in that list?

Matt
July 22, 2013 6:09 am

I’m still waiting for someone to explain that new quartile map to me. Why are areas with negative temperature anomalies consistently shown as above normal on the quartile map? Why do areas with a slight positive anomaly show as much above normal on the quartile map? I understand it depends on the area and anomaly, but I just can’t wrap my head around why the quartile map looks so much warmer than the raw anomaly map.
These are the two maps I am talking about:
http://www.ncdc.noaa.gov/sotc/service/global/map-blended-mntp/201306.gif
http://www.ncdc.noaa.gov/sotc/service/global/map-percentile-mntp/201306.gif

Bill Illis
July 22, 2013 6:29 am

The global temp record is quoted as an anomaly versus that month’s average global temperature, so every monthly anomaly is comparable to every other monthly anomaly (and there is a seasonal cycle in global temperatures, with July being the warmest month).
It makes no sense to say this June versus that June.
And then the NCDC is adjusting its historical temperature record every month, and it is a systematic change: cooling the past (particularly around 1900) and warming the recent temperatures (particularly in the mid-1990s). They have added 0.15 °C to the trend in just the last five years.
http://s7.postimg.org/3y7l79bpn/NCDC_Changes_since_Dec_2008.png

Mikeyj
July 22, 2013 6:40 am

I’m appalled. If I believed I was responsible for global warming I would kill myself to save the planet. I’m not and I won’t. CAGW has been, and will always be, about the money and control.

Catcracking
July 22, 2013 6:50 am

Buzz B says:
July 22, 2013 at 5:58 am
“Who cares what the baseline is? Trends show up regardless, and anyone intelligent and/or interested enough to be reading articles like this can easily understand the shifting anomalies. UAH shifted its baseline the other way. Comparisons to average 20th century temperatures are useful, as are comparisons to the 1981-2010 averages. The major temp keepers have always used different baselines, and there are numerous sources that plot them together. Much ado about nothing.”
“Who cares what the baseline is?”
Apparently they do, since they pick one that lets them hawk global warming, which has not been happening for about 15 years despite every effort to manipulate the raw data.
Clearly their methodology is to provide red meat for the progressive agenda, which is to mislead the people, nothing more. Do you see the acceleration in warming mentioned by the Administration?

July 22, 2013 6:50 am

Anthony: Off topic, but you’d probably be interested in this (if you haven’t already seen it): http://cliffmass.blogspot.ca/2013/07/are-nighttime-heat-waves-increasing-in.html

Keith
July 22, 2013 6:58 am

Hi Bob,
Sorry for the off-topic comment, but are you aware of anywhere I could find the latest El Nino Modoki index weekly values please? The most recent I can find online only go to October 2012. To judge by trade wind anomalies I’d expect that the current value may be in seriously negative territory, but I can’t find the data that would show whether this is the case or not.

Greg
July 22, 2013 7:25 am

Bob, it seems to me that you are confusing the base temperature for the anomaly calculation with the _average_. Before sounding off you need to be more careful.
Whatever base period is used does not affect which years are above or below average.
The key criticism is the one Bill Illis makes: most of the reason temperatures are “warmer” is that they keep cooling the past. They are rigging the data to fit what they want to say.

dp
July 22, 2013 7:31 am

In the intervening time since the end of the LIA, anything but a steady increase in temperature, and changes to those things affected by increasing temperature, would be alarming. This temperature drift is not alarming. The governmental response to it gives me goose bumps.

Tom
July 22, 2013 7:35 am

izen says:
July 22, 2013 at 5:23 am:
“…Only the mathematically illiterate are going to think that a 0.22 °C anomaly on a 1981-2010 baseline is somehow ‘better’ than a 0.64 °C anomaly on the 1901-2000 reference period.”
_____________________________________________________________________________
This is a press release, and therefore goes out to the dyscalculic press and then to the even more dyscalculic public.

July 22, 2013 7:37 am

The monthly Global Temperature Report – UAHuntsville is easy to read, at http://nsstc.uah.edu/climate/
Also very clear, the Latest Global Average Tropospheric Temperatures at http://www.drroyspencer.com/latest-global-temperatures/
When I read Matt’s “Why are areas with negative temperature anomalies consistently shown as above normal on the quartile map?” above, I thought it was because the normal temperature for that area is lower than the negative anomaly. But even then, given the background of alarmist rhetoric from NOAA, I stick with UAH.

knr
July 22, 2013 7:41 am

“the press release wouldn’t have the same alarm-bell ring to it,” and therefore would have failed to achieve its objective.

Richard M
July 22, 2013 7:48 am

Izen: “Only the mathematically illiterate are going to think that a 0.22 °C anomaly on a 1981-2010 baseline is somehow ‘better’ than a 0.64 °C anomaly on the 1901-2000 reference period.”
That’s the whole reason why they do it. Most of the voting public is “mathematically illiterate.”

John F. Hultquist
July 22, 2013 7:56 am

Thanks Bob.
——————
Several years ago I, and others, made note of the problem the gatekeepers of the data would have after 2010, when the updated 30-year climatology would appear. Having adjusted the temperature record to be high during the 2001-2010 years, the properties of a calculated ‘mean’ or simple average (strongly affected by outliers) began to stare them in the face. Only if the CO2 hypothesis worked could the numbers continue to show what they seemed to want. With strong natural variation at work, there is little wonder that NCDC failed “to get the memo.”
On the other hand, one ought to understand why the 30-year period became “climatology” in the first place, and then reflect on whether or not it has meaning in the science of climate. I don’t think it does. It is just a convenience for human memory. Why not use all the record one has for the science? That would have NCDC using, say, 1901 through 2012 and not stopping at the year 2000. They are conflicted.

Chuck L
July 22, 2013 8:14 am

I always thought Dr. Uccellini was a class act; I can’t believe he is now a part of NOAA’s global temperature manipulation for political ends. I am hopeful (though increasingly doubtful) that he has not yet had a chance to evaluate this situation.

July 22, 2013 8:18 am

Thanks, John F. Hultquist.
I made a mental note of your words then, now I’m beginning to see the problems floating up from the bottom of the deception.
I see more extreme acrobatics in the scam, epicycles all the way down.
