RSS global temp drops, version change adjusts cooler post 1998

Remote Sensing Systems of Santa Rosa, CA has published the January 2011 global temperature anomaly. It is not far from zero, and dropped quickly, much like Dr. Roy Spencer’s UAH data this month. But there’s a surprise: RSS has changed from Version 3.2 to 3.3 of their dataset, and adjusted it a bit cooler in the recent years. Here’s a comparison plot:

Sources:

V3.2 ftp://ftp.ssmi.com/msu/monthly_time_series/rss_monthly_msu_amsu_channel_tlt_anomalies_land_and_ocean_v03_2.txt

V3.3 ftp://ftp.ssmi.com/msu/monthly_time_series/rss_monthly_msu_amsu_channel_tlt_anomalies_land_and_ocean_v03_3.txt

Curiously, there’s no mention of this new v3.3 data set on their web page describing the MSU data products they produce:

http://www.remss.com/msu/msu_data_description.html

Perhaps they just haven’t gotten around to updating it yet.

Chris Wright
February 4, 2011 6:06 am

WUWT has an interesting explanation about satellite calibration etc by Dr Roy Spencer:
http://wattsupwiththat.com/2010/01/12/how-the-uah-global-temperatures-are-produced/
He answers my question right at the end:
“Fortunately, John Christy has spent a lot of time comparing our datasets to radiosonde (weather balloon) datasets, and finds very good long-term agreement.”
I think I’m satisfied. Of course, it does mean we have to worry about whether the radiosonde data has been ‘adjusted’ to match the surface data.
Overall, I’m reasonably satisfied that the satellite record is reliable. By the way, doesn’t AGW theory predict that the satellite (atmospheric) record should show more warming than the surface?
Chris

February 4, 2011 6:10 am

Alan the Brit says:
February 4, 2011 at 2:39 am
___________________________
A practical optimist wouldn’t worry too much about the next ice age. It seems that ice ages develop fully when reduced solar heating is exacerbated by the loss of solar heat through reflection from a larger cover of ice and snow.
Just as we could likely thaw the polar regions at present by dispersing soot on them, so could we manage the cryosphere when our solar orbit would otherwise cause continental glaciation.

Pamela Gray
February 4, 2011 6:19 am

My bet: Readjusted satellite orbital drift due to a quiet Sun. The adjustment then forced a recalculation of readings.

kramer
February 4, 2011 6:34 am

Where does RSS take the temp readings from, the entire earth? If so, that would include UHI temp, right?

February 4, 2011 6:46 am

I’m just wondering if these adjustments are only the typical effect of showing anomalies on a graph. Since the data points are deviations from “normal”, they must change if you change “normal”. And if (as UAH did recently) you shift from a 20-year, to a 30-year baseline, “normal” will change and so will many of the data points.
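The point above is easy to demonstrate: an anomaly is just a temperature minus a baseline mean, so switching baseline periods shifts every anomaly by one constant. A minimal sketch, using invented numbers rather than actual RSS or UAH data:

```python
# Illustrative only: hypothetical annual mean temperatures in °C,
# not real satellite data.
temps = [14.1, 14.3, 14.0, 14.5, 14.6, 14.8]

# Pretend the "old" normal used the first 4 years and the "new"
# normal uses all 6 (a warmer period), mimicking a baseline change.
baseline_old = sum(temps[:4]) / 4
baseline_new = sum(temps) / 6

anoms_old = [t - baseline_old for t in temps]
anoms_new = [t - baseline_new for t in temps]

# Every anomaly moves by the same constant offset:
offset = baseline_new - baseline_old
assert all(abs((o - n) - offset) < 1e-9
           for o, n in zip(anoms_old, anoms_new))
print(f"uniform shift of {offset:.3f} °C")
```

Note the consequence: a pure baseline change shifts the whole series uniformly, so if the v3.2-to-v3.3 differences vary from month to month (as the plot suggests), more than a rebaselining must be involved.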

Scott Covert
February 4, 2011 6:55 am

I won’t speculate why the temps are adjusted (possibly recalibration of the on-board RTDs? Not likely unless cosmic rays are messing with them, since RTDs are pretty stable)… wait, I just speculated…
It’s refreshing to see some adjustments on the other side of positive (I guess, but all adjustments need to be clearly documented and justified).

MackemX
February 4, 2011 7:22 am

A good reason to adjust the 2010 figures down (after everyone’s been told it was the warmest year ever) is that it increases the likelihood of 2011 being warmer than 2010 (according to the record), or at least not as much cooler as is likely to be the case.
Is that too cynical?

February 4, 2011 7:24 am

I just sent them an email asking about it.

roger
February 4, 2011 7:35 am

David W says:
February 4, 2011 at 4:56 am
“Nah! He spent it all on a bridge to live under.”

pyromancer76
February 4, 2011 7:37 am

A change without mention of the new data version? Why should they; they’re in charge.
@Lucy Skywalker. Beautiful update. Excellent analysis from “experimental” data — the real thing. I am sending it on. Thanks.

Tom T
February 4, 2011 7:38 am

Lucy Skywalker: Although I don’t think he is connected with RSS, Dr Spencer has stated often that the satellite data is not calibrated with ground based instruments. The point of the satellites is to have an independent record of temperature.

February 4, 2011 8:16 am

I was about ready to comment that we should demand that the Adjustment be plotted on the chart along with the anomaly. Then I realized I’d fallen into the mental trap…. Adjustment to the Anomaly???
Just like the ice curves, where we see not only the anomaly but also the total ice cover, we need to see the total average temperatures against the averages for the prior years. “But if we do that, we would not be able to see the anomaly.” Precisely.
We are being brainwashed into looking at a temperature divergence from some hidden-from-the-reader mean, itself corrupted with unknown manual adjustments that account for unknown instrument drift and unknown UHI bias. As bad as that is, we are spoon-fed plots that completely ignore the standard deviation of the sample population, which is a function of the calendar date, and the standard error of the original temperature mean.
The statistical sins we are committing are enormous. Yet, through sheer repetition, the sins become accepted and the brainwashing succeeds.
“There’s a flea on the wing on the fly on the frog on the bump on the log in the hole in the middle of the sea.” — Children’s Campfire Song
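The standard-error point above can be made concrete with a back-of-envelope sketch. The station anomalies below are invented for illustration; the idea is simply that a quoted "global mean anomaly" carries an uncertainty that often dwarfs the few-hundredths-of-a-degree differences used to rank years:

```python
import math

# Invented station anomalies in °C -- NOT real data, just a sketch
# of how a mean should be reported with its standard error.
station_anoms = [0.12, 0.45, -0.08, 0.31, 0.52, 0.05,
                 0.27, -0.15, 0.38, 0.22, 0.49, 0.11]

n = len(station_anoms)
mean = sum(station_anoms) / n
# Sample variance (n-1 denominator), then standard error of the mean.
var = sum((x - mean) ** 2 for x in station_anoms) / (n - 1)
sem = math.sqrt(var / n)

print(f"mean anomaly = {mean:.3f} °C ± {sem:.3f} °C (1 s.e.)")
```

With numbers like these, the one-standard-error band is several hundredths of a degree wide, so ranking years separated by a few hundredths is statistically meaningless.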

Daniel H
February 4, 2011 8:27 am

The v3.3 dataset is mentioned briefly in a presentation that was given to NOAA by RSS scientist Carl Mears and RSS founder/scientist Frank Wentz back in September of 2009. On page 22 of the 24 page presentation, they state:

Schedule
Next few months:
1. Release Version 3.3, which includes data from AMSU on NOAA-18, AQUA, and MetOP-A
2. Submit paper on error analysis
Next 18 months:
1. Streamline and modularize processing system – port as many components as possible to python. (HDF-EOS4??)
2. Get ready for ATMS on NPP – Jan 2011 launch?
3. Improve monitoring tool/automatic report generation

So they are a bit more than a year late in releasing the v3.3 dataset (if they are indeed referring to the same thing). The PDF for the presentation can be downloaded here [792KB]:
ftp://ftp.ncdc.noaa.gov/pub/data/sds/SDS_AMSU.Mears.Sept09.pdf

j.pickens
February 4, 2011 8:54 am

Looking closely, I see that while the highs were lowered, the lows were “highed” as well. Look at month 250: the low valleys are lower than they were pre-adjustment, but the majority of the more recent low valleys were adjusted upwards. What’s up with that?

j.pickens
February 4, 2011 8:58 am

Ooops, I read it wrong, the red is the older set.
So, they did raise the lows around month 250, and have lowered the lows since.
This adjustment around month 250 goes counter to the rest of the adjustments.
What does this mean?

February 4, 2011 10:00 am

Stephen Rasey says:
“As bad as that is we are spoon-fed plots that completely ignore the standard deviation of the sample population, which is a function of the calendar data, and the mean standard error of the original temperature mean.”
Bang on. And then we get years ranked as the warmest by differences of a few hundredths of a degree F.
“Global climate statistics are like sausages: You really don’t want to know what goes into them.”

Duster
February 4, 2011 11:04 am

Alan the Brit says:
February 4, 2011 at 2:39 am
Ice core data shows a distinct “saw toothed” shape, with the descent into glacial epochs marked by a long, gradual cooling followed by an abrupt warming. Additional texture seems to be supplied by shorter-term Dansgaard-Oeschger events and other noise. If you restrict the data to the terminal Pleistocene and the Holocene, we appear to be on the downslope into the next glacial epoch, but still near the upper edge of the shoulder in the curve, even with the recent “warming.”

Robuk
February 4, 2011 11:18 am

You don’t need thousands of weather stations, you just need a couple of hundred pristine rural stations scattered across the planet; the New Zealand set is a good start.
When will someone make the commitment, everything is linked to these bloody dodgy temperatures.

February 4, 2011 11:47 am

racookpe1978 says:
Now, the question becomes: Has the 400 year long-term climate cycle peaked between 2000-2010, and we begin the Modern Ice Age?
Or do we continue the long climb up from the Little Ice Age towards a Modern Warm Period peaking in 2060-2070? (Then begin the 450 year decline into the Modern Ice Age?)
Both and neither – see http://www.agwnot.blogspot.com/

richard verney
February 4, 2011 11:50 am

Robuk says:
February 4, 2011 at 11:18 am
You don’t need thousands of weather stations, you just need a couple of hundred pristine rural stations scattered across the planet; the New Zealand set is a good start.
When will someone make the commitment, everything is linked to these bloody dodgy temperatures.
/////////////////////////////////
Agreed. There is absolutely no point in trying to make a so-called global average temperature/temperature anomaly. This is especially so since for the most part climate change is a local issue and many places have their own localised climate. Further, it is important to know where in the world there is warming (e.g., the poles, the NH, equatorial regions, or SH, etc.) since the impact on man will be significantly different in each area. It would also be interesting to know the pattern of warming during the course of the day and the seasonal pattern.
Each country should compile its own record based upon good quality rural data that does not need adjustment. If these as a whole do not show a warming trend, then global warming would appear to be a myth.

February 4, 2011 12:33 pm

Keep in mind that by adjusting recent temps down slightly, it makes next years lows look less drastic… It’s all about trying to keep the trend alive.

Edim
February 4, 2011 12:48 pm

Robuk,
Absolutely agree!!! In fact, I think even 10 – 20 very good stations covering all continents would be enough and much more scientific than those fake official “average global anomalies”. Something like a global temperature index, or even more of them (10, 20, 50, 100 stations). Like top 10, top 40, top 100 “global temperature index”. Even a trend over the last 30 – 60 or more years would be very telling.
No adjustments! Not even UHI! Every station has some positive UHI trend, but for some very rural stations it is probably negligible.

February 4, 2011 2:31 pm

I re-read Roy Spencer’s explanation of the satellite temperature derivation.
He does say that accuracy to within around 1 deg C is as good as it gets, but that you can use it for climatology because the measurement methods are very stable.
Then mentions that the instrument drifts due to body temperature, which it should not do.
I don’t like the calibration method either. Any temperature above 290 or so K is an extrapolation and using 2.7K as the low end is well below any lowest temperature likely to exist in the atmosphere which in reality most of the time will be 200K or warmer.
The RTDs and associated electronics are never recovered. There are huge ASSUMPTIONS about stability and the effects of ionising radiation.
I’m unimpressed. Satellites are a great tool for forecasting, as it’s like having a very dense grid of radiosonde data. For climatology – meh.
And where are the error bars on that graph?

Greg Meurer
February 4, 2011 2:37 pm

To those who can do math:
The change represented by v3.3 appears to be enough to change the slope of the trend line.
The trend line reported for v3.2 through Dec. ’10 is 0.163 K/decade. The new trend line looks like it has dropped to less than 0.15 K/decade. Could someone check this? The last time I calculated a trend line was as an undergrad, over 4 decades ago.
Thanks.
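For anyone wanting to run the check Greg asks for, the trend RSS quotes is an ordinary least-squares slope over the monthly anomalies, scaled to K per decade. A minimal sketch, run here against a synthetic series with a known trend rather than the actual RSS file (which would be read from the v3.3 text file linked above):

```python
def trend_per_decade(anoms):
    """OLS slope of a monthly anomaly series, in K per decade."""
    n = len(anoms)
    xs = range(n)  # time index in months
    xbar = sum(xs) / n
    ybar = sum(anoms) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, anoms))
    den = sum((x - xbar) ** 2 for x in xs)
    slope_per_month = num / den
    return slope_per_month * 120.0  # 120 months per decade

# Sanity check with synthetic data: a series built with exactly
# 0.15 K/decade should give that slope back.
series = [0.15 / 120 * m for m in range(384)]  # 32 years of months
print(round(trend_per_decade(series), 4))
```

Feeding the anomaly column of the v3.2 and v3.3 files through this function would show directly whether the slope has dropped below 0.15 K/decade.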

February 4, 2011 2:50 pm

Strange. RSS v3.3 average annual anomaly for 2010 is 0.476°C, significantly lower than the 0.55°C in 1998. And here comes the trillion dollar question.
Is it worse than we thought or better?