December UAH global temperature anomaly – down by almost half

December 2009 UAH Global Temperature Update +0.28 Deg. C

by Roy W. Spencer, Ph. D.

[Figure: UAH_LT_1979_thru_Dec_09 – monthly global lower-tropospheric temperature anomalies, 1979 through December 2009, with 25-month running average]

The global-average lower tropospheric temperature anomaly fell back to the October level of +0.28 deg. C in December.

The tropics continue warm from El Nino conditions there, while the NH and SH extratropics anomalies cooled from last month. While the large amount of year-to-year variability in global temperatures seen in the above plot makes it difficult to provide meaningful statements about long-term temperature trends in the context of global warming, the running 25-month average suggests there has been no net warming in the last 11 years or so.

[NOTE: These satellite measurements are not calibrated to surface thermometer data in any way, but instead use on-board redundant precision platinum resistance thermometers carried on the satellite radiometers.]

YR   MON  GLOBE    NH      SH     TROPICS
2009  1  +0.304  +0.443  +0.165  -0.036
2009  2  +0.347  +0.678  +0.016  +0.051
2009  3  +0.206  +0.310  +0.103  -0.149
2009  4  +0.090  +0.124  +0.056  -0.014
2009  5  +0.045  +0.046  +0.044  -0.166
2009  6  +0.003  +0.031  -0.025  -0.003
2009  7  +0.411  +0.212  +0.610  +0.427
2009  8  +0.229  +0.282  +0.177  +0.456
2009  9  +0.422  +0.549  +0.294  +0.511
2009 10  +0.286  +0.274  +0.297  +0.326
2009 11  +0.497  +0.422  +0.572  +0.495
2009 12  +0.280  +0.318  +0.242  +0.503
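As a quick check on the table, the twelve GLOBE values can be averaged to get the 2009 annual-mean anomaly. A minimal sketch (values copied directly from the table above):

```python
# 2009 monthly global lower-tropospheric anomalies (deg C), from the table above
globe = [+0.304, +0.347, +0.206, +0.090, +0.045, +0.003,
         +0.411, +0.229, +0.422, +0.286, +0.497, +0.280]

annual_mean = sum(globe) / len(globe)
print(f"2009 annual-mean global anomaly: {annual_mean:+.3f} deg C")
```

This gives +0.260 deg C for the year, consistent with the month-to-month swings visible in the table.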

George E. Smith
January 6, 2010 9:40 am

“”” pft (17:20:15) :
“NOTE: These satellite measurements are not calibrated to surface thermometer data in any way, but instead use on-board redundant precision platinum resistance thermometers carried on the satellite radiometers:”
This is somewhat evasive. Thermometers on board cannot directly measure surface temperature. There is obviously an algorithm being used to convert satellite measurements to surface temperature, and what some folks worry about is that the algorithm can be tweaked to give more “accurate” temperatures by those who deem them too low. “””
Well it is not evasive at all. Maybe a little short on details; and yes it would be nice to know those details. I’m sure somewhere there exists a complete and full technical description of these satellites and every instrument they carry on board.
But back to the lack of ground reference:-
Let’s say I have an incoming signal (s) which is detected by a detector (A) that converts that signal to an output (t), which is alleged to represent a temperature that (s) is supposedly a proxy for. But over here I have a real thermometer (B), for example a Platinum Resistance thermometer, which eons of use have convinced scientists is a believable measure of what we call temperature. Also on board, I have a signal generator that can generate controlled signals that, when fed to the PR thermometer, will change its temperature by a highly repeatable amount; and I can simultaneously apply that same signal to my detector (A) to get its output (t).
So now I can directly compare the real temperature (T) read by the PR thermometer with the ersatz temperature (t) registered by my onboard satellite sensor, both of which are fed the exact same stimulus.
This process has allowed me to directly substitute an accurately manufactured “signal” from my signal generator for the incoming detected signal (s), knowing that my manufactured signal stimulates the same response in my sensor (A) as it does in a real (PR) thermometer.
Well this sure is a crummy explanation of the process, but gazillions of scientific instruments function to measure various physical variables, by either “balancing” or “substituting” precisely known manufactured signals to obtain the same sensor response.
Things like Atomic Force Microscopes, measure tiny forces, by exactly balancing them (in a feedback loop) against very precise and accurately knowable forces, that are generated by well understood physical laws. An example would be the precisely calculable force between two coils carrying a known current.
Something along those lines is what Roy means when he says the onboard satellite sensors are calibrated against Platinum Resistance thermometers (while on board); and it is known that those onboard PRs still read real temperature when on the ground, so no ground reference is needed.
Heck, the most common example of this process is the centuries-old chemical balance, which can’t weigh a thing on its own; but it can very accurately tell when two different things have exactly the same weight. So it matches the weight of the unknown sample against some slugs of metal that have been manufactured to a certain degree of accuracy as to their total mass, and ultimately weight, subject only to the vagaries of earth’s gravitational field variations.
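The substitution idea described above can be sketched in code. This is a hypothetical two-point calibration, not the actual UAH processing: the radiometer views two reference targets whose true temperatures are known from the onboard platinum resistance thermometers, and unknown scene counts are then mapped to temperature by interpolation between those references (all numbers below are made up for illustration):

```python
def calibrate(counts_cold, temp_cold, counts_warm, temp_warm):
    """Return a function mapping raw sensor counts to temperature (K),
    anchored at two reference points read by the onboard PRT."""
    gain = (temp_warm - temp_cold) / (counts_warm - counts_cold)
    return lambda counts: temp_cold + gain * (counts - counts_cold)

# Hypothetical reference readings: a cold-space view and a warm onboard target
counts_to_temp = calibrate(counts_cold=1000, temp_cold=2.7,
                           counts_warm=9000, temp_warm=300.0)

# Any scene reading is now expressed as a temperature via the PRT-anchored scale
print(counts_to_temp(5000))
```

The sensor itself never needs a ground thermometer; the PRT references carried alongside it define the temperature scale, which is the point George is making.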
I wish I (and the general public) had more detailed information about how some of these satellite instruments work. Magazines like Scientific American would seem to be perfect vehicles for describing the instrumentation of some of these things, so people knew how they work; but unfortunately, SA has become more of a political rag, and too many of its authors seem bent on pushing an agenda rather than on publishing useful information.

January 6, 2010 9:48 am

Sean Ogilvie (09:27:45) :
OK it’s official. Now I’m happy…
Interestingly, the table also shows a 12-month running mean. Apparently listed as the mean of the current [latest] and previous 11 months…

George E. Smith
January 6, 2010 9:59 am

“”” Bart (18:53:05) :
I’m not getting all of the complaints about a 25 month running average. The monthly data are plotted along with it, so nothing is being “hidden”. What’s the beef? “””
Well my beef, Bart, is that it is well known that the integral of a sine or cosine function over any integral number of cycles is precisely zero. It is also known that the integral of a sine or cosine function over any NON-integral number of complete cycles is generally not zero; and furthermore it has a value that depends on the phase of the end points of the integral. It will only be zero for an interval that is symmetric about a zero or 180-degree phase point.
Since it is known that the earth takes part in a cycle of exactly 12 months, during which time it goes through cyclic physical variable changes, some of which are approximately sinusoidal, it seems obvious that averaging over an integral number of years would remove the effect of that cyclic variable.
The same thing could be said for the daily min/max readings from the Stevenson Screens and the like. What if they were simply read at 13- or 25-hour intervals instead? The result would be a cyclic variation over a number of days, even if the daily cycle exactly replicated every single day.
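The integral-number-of-cycles point is easy to demonstrate numerically. A minimal sketch using a pure sine with an arbitrary phase, sampled monthly:

```python
import math

def mean_of_sine(n_samples, period=12, phase=0.3):
    """Mean of a monthly-sampled sine wave over n_samples points."""
    return sum(math.sin(2 * math.pi * (k + phase) / period)
               for k in range(n_samples)) / n_samples

print(mean_of_sine(24))  # two full annual cycles: essentially zero
print(mean_of_sine(25))  # 25 months: a nonzero, phase-dependent residual
</```

The 24-month mean cancels the annual cycle exactly (up to floating-point noise), while the 25-month mean leaves a residual whose size depends on where in the cycle the window starts, which is precisely the objection to a 25-month running average.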

Pamela Gray
January 6, 2010 10:03 am

I am still waiting for some bright up-and-coming Ph.D. candidate to redo satellite data and a selection of unadjusted surface anomalies (without splicing the two together – that would be bad form) by calculating 3-month temperature anomalies in order to compare with 3-month SST values directly. In fact, all such variables, including atmospheric CO2, cosmic rays, etc., should be quantified using the same 3-month average system.

January 6, 2010 11:54 am

Roy Spencer has just now posted on his blog at http://www.drroyspencer.com/2010/01/how-the-uah-global-temperatures-are-produced/ an explanation of how the satellites get their temperature measurements.

Michel Lafontaine
January 6, 2010 12:05 pm

A fellow by the name of Gavin thinks the past IPCC projections are doing OK compared to reality. Has anyone looked at this?
http://www.realclimate.org/index.php/archives/2009/12/updates-to-model-data-comparisons/
Happy New Year to all

DirkH
January 6, 2010 12:32 pm

“scienceofdoom (02:12:48) :
[…]
“The top 3.2m of the ocean has the same heat capacity as the entire atmosphere, and the total ocean heat content is about 1,000 times that of the atmosphere” (p.9)”
Great. Thanks for this gem.

Paul Vaughan
January 6, 2010 1:39 pm

Pamela Gray (10:03:38) “[…] 3-month average system.”
All spatiotemporal bandwidths need to be considered (not just 3mo) — that is the lesson hammered in intro-level Physical Geography grad-courses – it’s not a new idea (dates back 100s of years), but due to whatever inconsistencies & deficiencies in the mainstream education system, it has not become “common knowledge” yet. It could easily be covered in intro-level college courses – or even in the highschool system – but physical geographers, landscape ecologists, etc. have built a “publication mill” around the concept of spatiotemporal-pattern depending on aggregation-criteria. It’s a simple concept – and yet it’s paradoxical to many, it seems, probably simply because they never encountered it at a young age. For those interested in digging, google “modifiable areal unit problem” or “MAUP” – the same concepts apply temporally (as spatially). Although a simple concept, it results in burdensome software-programming & analysis-labor since it adds so many dimensions.
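The aggregation-dependence point above can be illustrated with a toy monthly series (hypothetical data; the point is only that the choice of averaging window changes what structure survives):

```python
import math

# Toy monthly series: slow trend + annual cycle + a 3-month wiggle
series = [0.01 * t
          + math.sin(2 * math.pi * t / 12)
          + 0.5 * math.sin(2 * math.pi * t / 3)
          for t in range(120)]

def block_means(x, window):
    """Non-overlapping block averages of length `window`."""
    return [sum(x[i:i + window]) / window
            for i in range(0, len(x) - window + 1, window)]

quarterly = block_means(series, 3)   # keeps much of the annual cycle
annual = block_means(series, 12)     # annual cycle and wiggle cancel exactly

print(quarterly[0], annual[0])
```

With 12-month blocks, both periodic components average out and only the trend remains; with 3-month blocks, the annual cycle still dominates the result. Same data, different apparent pattern: the temporal analogue of the MAUP.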

yonason
January 6, 2010 2:24 pm

scienceofdoom (02:12:48) :
Thanks for the reference.

Gary Hladik
January 6, 2010 3:53 pm

Joe Born (11:54:02) : “Roy Spencer has just now posted on his blog at http://www.drroyspencer.com/2010/01/how-the-uah-global-temperatures-are-produced/ an explanation of how the satellites get their temperature measurements.”
Comments are turned off on the article at Dr. Spencer’s site, so I’ll just thank him here for the info, and thank Joe Born for the link.

Richard Heg
January 6, 2010 4:10 pm

In Ireland it has been the coldest December in almost 30 years.
http://www.met.ie/news/display.asp?ID=44
The way January has been going, there will be more records broken.

robr
January 6, 2010 6:14 pm

Scott B. (10:20:11)
I’ve had the same problem, so I went through the raw data. It’s all in anomalies, but if you incrementally average the values, you will eventually find an average anomaly close to zero. That tells you the baseline period being used to calculate the individual anomalies.
So now you can pick some arbitrary average temperature, say 270 K or 0 C, and plot the anomaly as temperature, but the scale is important. If you pick a Y scale equal to what one could expect, say plus or minus 30 C from average, your plot will be a nearly straight horizontal line.
So now you know why they use anomalies: everyone is arguing about, and drawing trends on, a scale less than 1/60th of either the positive or negative fluctuation of actual temperature one might record in a year.
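The baseline-hunting trick robr describes can be sketched like this (hypothetical numbers; the idea is simply that anomalies computed against a baseline period average to zero over that same period):

```python
# Hypothetical absolute temperatures (K); suppose the first 6 are the baseline period
temps = [287.1, 287.4, 286.9, 287.6, 287.3, 287.8, 288.0, 287.5]
baseline = sum(temps[:6]) / 6

anomalies = [t - baseline for t in temps]

# Averaging the anomalies over the baseline period recovers ~zero,
# which is how the baseline can be identified from the anomalies alone
print(sum(anomalies[:6]) / 6)
```

Averaging over any other window generally gives a nonzero result, so sliding the averaging window until the mean anomaly vanishes reveals the baseline period.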

Baa Humbug
January 6, 2010 6:19 pm

Tom P (06:21:39) :
“A comprehensive and longer-term
perspective on IPCC predictions, such as
this, suggests that more recent predictions are
not obviously superior in capturing climate
evolution.”
Precisely

David Space
January 6, 2010 8:17 pm

This doesn’t seem to make sense. Record cold weather was reported all over the world in December, yet it’s still .28 degrees hotter than the average for the last 30 years – which themselves are among the warmest on record.
Where exactly has this warm weather occurred?
Looking at the figures in more detail is even more baffling. I’ve copied in the data row for December from http://vortex.nsstc.uah.edu/public/msu/t2lt/tltglhmam_5.2, as follows:
2009 12 0.280 0.318 0.242 0.503 30.
The first figure is global, the second Northern Hemisphere, then SH, then tropics.
Given what we’ve all experienced, how can it be true that the Northern Hemisphere was a whole 0.318 degrees hotter than the average for the last 30 years, when those years in turn have been the hottest on record?!

Bart
January 6, 2010 10:17 pm

Leif Svalgaard (19:54:18) :
Paul Vaughan (00:34:09) :
“As you increase the bandwidth, you cut more centred-averages off the ends of a series.”
Well, yeah. Obviously. But that’s just the red line. The gray unfiltered line with the blue data points is there too, so why do you care? You can just eyeball the unfiltered data and see for yourself where it is going.

Bart
January 6, 2010 10:27 pm

George E. Smith (09:59:18) :
You, too. Yeah, it would bother me if all he presented were the red line. Am I somehow looking at a different graph than anyone else, the one that has a red line and red text which says “running 25 month average” and a gray line with blue dots which clearly shows substantial interannual variability?

savethesharks
January 6, 2010 10:30 pm

David Space (20:17:44) : “This doesn’t seem to make sense. Record cold weather was reported all over the world in December, yet it’s still .28 degrees hotter than the average for the last 30 years – which themselves are among the warmest on record.
Where exactly has this warm weather occurred?”

And even then… amidst all the cold… leave it to Yahoo and the AP to spin it along the following lines:
http://news.yahoo.com/s/ap/20100107/ap_on_sc/sci_big_chill
The headline?? EXPERTS: COLD SNAP DOES NOT DISPROVE GLOBAL WARMING.
I’m sorry….but this is pathetic.
[SORRY GUYS… IT DOESN’T PROVE IT, EITHER!!]
Hahaha……how ridiculous the AP has become on the subject…..
Another reason not to give them the time of day.
Chris
Norfolk, VA, USA

Bart
January 6, 2010 10:40 pm

Paul Vaughan (00:34:09) :
“As you increase the bandwidth, you cut more centred-averages off the ends of a series.”
And, you must have meant “decrease”. When you increase the length of an FIR low pass filter, you most often decrease the frequency bandwidth. Or, if you design it for that purpose, you can keep the bandwidth but increase the attenuation of higher frequencies.
MATLAB has a useful routine in its Signal Processing Toolbox called “filtfilt”, which processes the data both forwards and backwards, matching end points, so that you get an estimate of what is going on there.
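The forward-backward idea behind filtfilt can be sketched without any toolbox. This is a minimal pure-Python version using a simple causal moving average; it illustrates the zero-phase property only, not the actual routine’s end-point padding scheme:

```python
def causal_ma(x, n):
    """Causal moving average: each point averages the last n samples
    (or fewer near the start of the series)."""
    return [sum(x[max(0, i - n + 1):i + 1]) / (i - max(0, i - n + 1) + 1)
            for i in range(len(x))]

def filtfilt_like(x, n):
    """Filter forward, then backward, so the phase delays cancel
    and features in the output line up with features in the input."""
    forward = causal_ma(x, n)
    backward = causal_ma(forward[::-1], n)
    return backward[::-1]

# A constant signal passes through unchanged, and a smoothed series
# has no lag relative to the original
smoothed = filtfilt_like([1.0] * 50, n=5)
print(smoothed[:3])
```

Running a causal filter once delays every feature by roughly half the window; running it again in reverse applies an equal and opposite delay, which is why the two-pass result is zero-phase.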

Tom P
January 7, 2010 2:54 am

Baa Humbug (18:19:27) :
“A comprehensive and longer-term
perspective on IPCC predictions, such as
this, suggests that more recent predictions are
not obviously superior in capturing climate
evolution.”
The temperature observations to date do not support this statement. The 1990 IPCC projection obviously overstates the warming whereas the more recent projections are much closer to the observed trend:
http://img13.imageshack.us/img13/811/ipccprojections.png

phlogiston
January 7, 2010 7:16 am

Paul Vaughan
Thanks for the useful feedback. In the Wikipedia article on the ACW, a view is expressed that the ACW is linked to ENSO. If so, this would make more meaningful a role for deep-to-surface ocean heat exchange in driving global air temperature. At each new El Nino, patches of warmer surface water appear in the Pacific to thrill the AGW-ers. Where does this heat come from? Focussed rays of sunlight? (Calm down, Leif, merely a rhetorical device!) No, it is deep-to-surface ocean heat exchange. This exchange is clearly emergently oscillatory under the influence of numerous forcings and internal time constants.

January 9, 2010 9:02 am

I have a question I need some help with. I was on a site not long ago that had great info on the sensors taking the satellite measurements and on what exactly the lower troposphere is, but unfortunately I have so many links saved I can’t find it again.
How do the lower troposphere readings compare to the surface instruments – should they be trending higher or lower than the so-called warming at the surface?
I recall reading that part of the AGW model was that upper atmosphere temps would go down.
We have a decadal trend on the UAH data of 0.28 and a much larger trend on surface “adjusted” data. I just want to get more background on this and whether that indicates a possible fault with the surface data. Thanks in advance for any advice.
