Both Lucia and Steve McIntyre beat me on this story, so I’ll defer to them. That’s what I get for going to dinner with relatives last night and sleeping in.
Below is a plot from McIntyre comparing the RSS data with UAH MSU. Both are down significantly in June 2009, with UAH MSU at 0.001°C.
RSS is down from 0.090°C in May 2009 to 0.075°C in June 2009.
Steve McIntyre writes a little parody of the issue: RSS June – “Worse Than We Thought”
Lucia actually expected RSS to climb and has an analysis here
Even NCDC’s director Tom Karl has something to say about satellite data, read on.
Both of the datasets are available in raw form if you want to plot them yourself.
RSS (Remote Sensing Systems, Santa Rosa)
RSS data here (RSS Data Version 3.2)
UAH (University of Alabama, Huntsville)
Reference: UAH lower troposphere data
There had been some comments in the UAH thread earlier that May and June seem to have cycled lower in the UAH data set in recent years. It seems that RSS is following also.
I expect we’ll hear an announcement from NOAA/NCDC soon about it being the nth warmest June on record. They will of course cite surface data from stations like this one at the Atmospheric Sciences Department, University of Arizona at Tucson:
Here is a testimony in March 2009 before congress from NCDC’s director Tom Karl, where he complains about satellite data and the “adjustments” required:
It is important to note raw satellite data and rapidly produced weather products derived from satellite sensors are rarely useful for climate change studies. Rather, an ordered series of sophisticated technical processes, developed through decades of scientific achievement, are required to convert raw satellite sensor data into Climate Data Records (CDRs).
You mean “sophisticated technical processes” like these performed on raw surface temperature data at NCDC?
Source: http://cdiac.ornl.gov/epubs/ndp/ushcn/ndp019.html
Robert Wood (05:28:21) :
“The CO2 cause is disproved by the current temperatures”
Please expand on that.
“The International Air Transport Association, which represents the world’s airline industry, is urging the government to abandon a rise in air passenger duty, which would mean people paying more for long-haul flights. ”
http://news.bbc.co.uk/1/hi/business/8145699.stm
The cost of AGW taxes
From here on the Left Coast of the USA this doesn’t surprise me. It’s 8am and 55F on the patio. Wunderground shows the temps right now to be about the same as far south as Santa Rosa and as far north as Ukiah (though it could be further each way, I just did a quick scan of the coastal sites).
In mid July one expects to be waking up dreading the heat and having just barely gotten that house cooled down to tolerable during the night. One does not expect to have clutched their blanket close and wished for a second… In mid July, one expects to be focused on round two of heat tolerant vegetables for the garden (common bean, cow peas (blackeyed peas), corn, tomato) not planning to start some peas, kale, spinach …
It is important to note raw satellite data and rapidly produced weather products derived from satellite sensors are rarely useful for climate change studies. Rather, an ordered series of sophisticated technical processes, developed through decades of scientific achievement, are required to convert raw satellite sensor data into Climate Data Records (CDRs).
You mean “sophisticated technical processes” like these performed on raw surface temperature data at NCDC?
One needs to remember that the opposite of “raw” is “cooked” …
Someone is cooking the books. My garden tells me so…
Plants are my friends. Plants do not lie. And while you can cook plants, they continue to accurately report temperature effects in the process.
“In 1991 did you all start drooling at the thought that GW had reversed?”
I can’t speak for others here, but in 1991, I believed in “global warming” wholeheartedly. A few years later, I learned how to think critically. Now I recognize “global warming” for the hoax that it is.
As far as “drooling” goes, I admit that I enjoy each data point which comes in against global warming. I do agree with your implied point that a couple months of (relatively) low surface temperatures doesn’t falsify the hypothesis. But it doesn’t help your side.
What really damages “global warming” is, IMO, the Argo data. If that keeps up, I can do some serious drooling.
VG (15:12:57) :
Looks like big mainstream media has had enough
http://www.spectator.co.uk/the-magazine/features/3755623/meet-the-man-who-has-exposed-the-great-climate-change-con-trick.thtml
watch the thing tumble. Even K Rudd admitted so this morning re Copenhagen
This statement, by Professor Ian Plimer, sums up the current plight of
all collected climate data.
“And that’s why I’m so sceptical of these models, which have nothing to do with science or empiricism but are about torturing the data till it finally confesses.”
Oh deary, deary me.
“Sophisticated technical processes”, oh dear, how unfortunate.
Back to basics I think. Sophisticated comes from the word Sophistry. This means, according to my trusty 1925 Pocket Oxford Dictionary, to “spoil the simplicity or purity or naturalness of” ORRRRR……………………wait for it…………………..to “corrupt or adulterate or tamper with”! Take your pick! I dare say modern dictionaries define it as “the necessary &/or essential adjustment of something or to someone to demonstrate a desired effect”! I suppose that could mean “worldly, knowledgeable, expert”?
That’s the trouble with modern language usage & the way it is changed through casual & or incorrect use, & modern dictionaries no longer provide the correct origins of the words within, just what they mean in modern parlance. A pity in many ways.
I seem to remember many moons ago, when at college in my “yoof” during engineering exams, programmable calculators were forbidden as they could be used to provide the right answers without a candidate having to demonstrate that he/she understood the question, or the principles involved in deriving the answers & solutions.
Perhaps you could add a small box that defines what things like “TOBS adjustment” and “FILNET” are / do?
For example, I don’t know what a SHAP adjustment is, or why it ought to cause a steady uplift over so many decades…
I guess I’m basically asking for a road map to the buggering process (or perhaps a recipe for the cooking…)
Flanagan (11:42:05) :
Always the same analysis of small wiggles… Don’t you see that the very fact that a zero anomaly is hailed like that (for one month only) is a proof that it has become increasingly rare? How many times in the last 10 years (that is last 120 months) did we get close to, or equal to zero? Now compare that to the 90ies…
Your point has some validity, in that maybe we should continue to conduct unbiased scientific research of our climate. However, given that it is a zero anomaly, shouldn’t that at least give us pause in passing global legislation that most economists estimate will lead to significant decreases in the standard of living for a large percentage of people worldwide? In addition, climate policies will result in unprecedented growth in government control of people and increase the likelihood of governments abusing the rights of individual people and groups across the globe. I for one do not want to give my American government any more power based on a ZERO anomaly.
evanmjones (13:07:19) : GISS, if you can believe it, does not adjust raw HCN data. They “unadjust” HCN adjusted data (by algorithm), then readjust.
I’ve read the code and it does do exactly that. But you left out HOW they “unadjust” it… They take an offset between GHCN and USHCN for a few recent years, then apply that offset TO ALL PRIOR YEARS TOO! So if a site had an MMTS change in the last decade, that change would be “unadjusted” out of the 1890 to 1980 data as well, even though it was never in them…
So look again at those adjustment graphs up top. The rapid rise in the tail of them is what is being subtracted from all prior history… That’s where GIStemp gets its temperature anomaly rises from. Read it and weep.
See:
http://chiefio.wordpress.com/2009/03/01/gistemp-step0-the-process/
From my technical notes:
The questionable things I have found in this STEP0 are:
1) Temperatures in F.xx from NOAA that carry 2 digits of false precision (the original daily data are reported in whole degrees F) are converted to C.x with one digit of false precision.
2) There are duplicate entries in the USHCN and GHCN data sets. The GHCN data from years prior to the first USHCN data are thrown out; GHCN duplicates are thrown out along with anything prior to 1880. It is not clear to me that 1880 is somehow special (though it is passed as an argument, so ‘tuning’ is possible), nor do I know if the criteria used for selecting between GHCN and USHCN are valid. From the do_comb_step0.sh script: echo “replacing USHCN station data in $1 by USHCN_noFIL data (Tobs+maxmin adj+SHAPadj+noFIL)”, with the variable $1 being the GHCN data set. During this process, if both GHCN and USHCN data exist for a given year, that year’s data are replaced with the USHCN data after an ‘adjustment’ is subtracted from USHCN so that the ‘offset’ induced is removed. Why?
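Item 1) is easy to demonstrate. Below is a minimal sketch (my own illustration, not GIStemp code) of why one decimal place of Celsius is false precision when the underlying reports are whole degrees F:

```python
# Sketch of the precision issue described above: a whole-degree
# Fahrenheit reading carries roughly 0.56 C of real resolution, so
# reporting the converted value to 0.1 C implies precision that was
# never measured.
def f_to_c(temp_f):
    """Convert Fahrenheit to Celsius, rounded to one decimal place
    (one digit of false precision, per the F.xx-style input)."""
    return round((temp_f - 32.0) * 5.0 / 9.0, 1)

# A 1 F step in the raw data becomes ~0.6 C in the output, so the
# 0.1 C digit is an artifact of the conversion, not the instrument.
print(f_to_c(72.00))  # 22.2
print(f_to_c(73.00))  # 22.8
```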
So what makes GHCN less valid than USHCN?
From: http://www.ncdc.noaa.gov/oa/climate/ghcn-daily/index.php?name=bias
Methods – Bias Adjustment
At present, GHCN-Daily does not contain adjustments for biases resulting from historical changes in instrumentation and observing practices. However, there is ongoing work at NCDC to develop adjustments that can be applied to daily maximum and minimum temperatures, and a GHCN daily derived product containing adjusted daily temperatures may become available in the future.
End web quote.
So it looks to me like the USHCN data have valid corrections (time of day, equipment bias, etc.) that GHCN do not have: why ‘adjust’ for the ‘offset’ induced at all?
From: gistemp.txt
Replacing USHCN-unmodified by USHCN-corrected data:
The reports were converted from F to C and reformatted; data marked as being filled in using interpolation methods were removed. USHCN-IDs were replaced by the corresponding GHCN-ID. The latest common 10 years for each station were used to compare corrected and uncorrected data. The offset obtained in this way was subtracted from the corrected USHCN reports to match any new incoming GHCN reports for that station (GHCN reports are updated monthly; in the past, USHCN data used to lag by 1-5 years).
End gistemp.txt quote
This makes it sound like a simple swap of corrected data for uncorrected, with the only adjustment being to match the tail end of the data a bit better to any very recent (months time frame) data that might be in GHCN, but not yet in USHCN. While it is true that for stations not in GHCN or for a year where GHCN has no data for a station the USHCN data are used: when there are data for a given station for a given year in both USHCN and GHCN, the reality is much different. Over time, I would expect ever more of the data to be in both GHCN and USHCN, no? What happens then:
The ‘USHCN-unmodified’ data are NOT replaced by ‘USHCN-corrected’; they are replaced by:
USHCN_corrected(year,month) – (average of roughly 10 of: [ USHCN_corrected(semi_randomyear,month) – USHCN_unmodified(semi_randomyear,month) ] )
How does that make any sense? How are these created data any better than valid data from USHCN? Why are these changes applied over a large number of years?
If the goal is simply to get USHCN_corrected data into the GHCN data set, then they are failing. Any ‘offset’ or ‘trend’ is an artifact of GHCN having unadjusted data with no better record available and ought to be ignored (or adjust the GHCN data to match the better data). This just seems like a very clumsy way to do a ‘fix’ for a problem that is not there: making the trend ‘look better’ rather than making the data overall ‘be better’.
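To make the substitution concrete, here is a minimal sketch of the formula above (the names and data layout are mine, purely for illustration; this is not GIStemp code):

```python
# Hypothetical sketch of the substitution described above (names are
# mine, not GIStemp's): the corrected USHCN report is shifted by a
# single per-month offset constant before it replaces the GHCN report.
def substituted_value(ushcn_corrected, monthly_offset, year, month):
    """Return the value stored for (year, month): the corrected USHCN
    report minus the fixed offset computed from ~10 recent years of
    corrected-minus-unmodified deltas for that month."""
    return ushcn_corrected[(year, month)] - monthly_offset[month]

# If recent corrections average +0.3 C, every historical June is
# pulled down by 0.3 C, whether or not that correction applied then.
ushcn = {(1900, 6): 21.0, (2005, 6): 23.0}
offsets = {6: 0.3}
print(substituted_value(ushcn, offsets, 1900, 6))  # 20.7
```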
[…]
The relevant bit of code is this. (I have left out the data declarations, file reads, & cleanup steps):
C**** compare data
   40 id=id0
      do m=1,12
        nok=0 ; di=0
        do n=last_yr-1979,1,-1
          if(mo0(n,m).gt.-9000.and.mo(n,m).gt.-9000) then
            nok=nok+1
            di=di+(mo0(n,m)-mo(n,m))
          end if
          if(nok.eq.10) go to 50
        end do
   50   dif(m)=di/nok
        if(nok.lt.1) dif(m)=-9999
      end do
      write(10,'(i10,12f7.0)') id,dif
[ ‘mo0’ is the USHCN temperature for a given year/month and ‘mo’ is the GHCN temperature; -9999 is the missing-data flag. We count backwards from the last year USHCN data are available (this gets longer each year) and, for every month in a year where both values are present, add one to the counter ‘nok’ and keep a running total ‘di’ of the differences found by subtracting the GHCN temperature from the USHCN temperature for that month. (I do not know the typical sign of this result. Do the biases produce a typically positive or negative ‘correction’? To the extent that TOB, equipment, et al. are upward biases, and removing them shrinks USHCN, di will be negative.) When we have 10 years of differences for a given month, or reach 1980 (n runs from last_yr-1979 down to 1, and n=1 corresponds to 1980), we go to line 50 and set the average difference for that month to di/nok (the total of all differences divided by the number of data points). A ‘sanity check’ sets the adjustment value to the error flag -9999 when no valid difference was computed; then we proceed to the next month. ]
At the end, we write out the station ID and the difference values (a single constant for each month) to be used to ‘adjust’ the USHCN data for the ‘offset’ adjustment in the following steps.
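For readers who don’t read fixed-form Fortran, here is a rough Python rendering of the loop above (a sketch under my reading of the code; the array layout and names are my own, and indices are 0-based where the Fortran is 1-based):

```python
MISSING = -9999.0  # missing-data flag, as in the Fortran

def monthly_offsets(mo0, mo, last_yr):
    """Rough Python rendering of the GIStemp loop quoted above.
    mo0[n][m] is the USHCN temperature and mo[n][m] the GHCN
    temperature for year-index n and month m. Returns one offset per
    month: the mean USHCN-minus-GHCN difference over (at most) the 10
    most recent years where both records have data for that month."""
    dif = []
    for m in range(12):
        nok, di = 0, 0.0
        # walk backwards from the most recent year, as the Fortran does
        for n in range(last_yr - 1980, -1, -1):
            if mo0[n][m] > -9000 and mo[n][m] > -9000:
                nok += 1
                di += mo0[n][m] - mo[n][m]
            if nok == 10:  # stop after 10 usable years
                break
        dif.append(di / nok if nok >= 1 else MISSING)
    return dif

# Toy data: 6 years ending in 1985, with USHCN running 0.5 warmer than
# GHCN in every month, so each monthly offset comes out to 0.5.
mo = [[10.0] * 12 for _ in range(6)]
mo0 = [[10.5] * 12 for _ in range(6)]
print(monthly_offsets(mo0, mo, 1985)[0])  # 0.5
```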
Issues I see:
You get varying numbers of years and varying range of years used to ‘adjust’ the monthly data (varying both ‘in different months for one location’ and ‘in different locations’). Each location/month adjusted by a different metric on a different base. Maybe OK, maybe not? Hard to detect after the fact (irregular impacts).
This is based on the assumption that the methodology is valid. What makes it valid to subtract an average of (sort of) 10 years (sort of monthly) deltas as a correction for (whatever) in individual months? If the USHCN data are already better (corrected) why are we removing some of that correction by a strange method?
Up to 10 years of variance are flattened into a single adjustment number that is then used for all instances of a month in the ‘adjusted’ data over all years. What if the basis years are not representative? What if something, like an ENSO event, colors or biases this adjustment factor in these particular years? Since the years chosen are dependent on coverage in the data collection, and change with the last collected data, this seems subject to error. In particular, TOB bias varies positive or negative depending on morning vs evening TOB. So what if the 10 year span has one TOB and the data ‘corrected’ were collected with another? The implied assumption that all the data were collected at the same TOB seems to invalidate this method.
END note quotes.
I’m going to cut off the quote here because this is already a bit long. I thought I’d put a cleaned-up summary in my blog pages, but it seems to be gone. I don’t know if it’s bit rot, if wordpress deletes things that are too “old and unread”, or if I’m just not remembering it right. I’ll dig through my archival set at some point and figure it out. For right now, though, you get the idea…
GIStemp subtracts a hashed up version of recent changes from all past history. The graph at the top shows that this recent change data is excessively high, so will excessively bias the past low. That will give a bogus slope of rising temperatures over time.
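The mechanism is easy to demonstrate with a toy series (entirely synthetic numbers, just to show the effect claimed above):

```python
# Toy demonstration of the mechanism described above: subtracting an
# offset derived from recent, heavily-adjusted years from ALL prior
# history manufactures a warming step. Numbers are synthetic.
raw = [15.0] * 100 + [15.0] * 10          # flat "unmodified" record
corrected = [15.0] * 100 + [15.3] * 10    # recent years adjusted up 0.3

# Offset = mean(corrected - raw) over the last 10 years, i.e. 0.3
offset = sum(c - r for c, r in zip(corrected[-10:], raw[-10:])) / 10

# Apply that single constant to the entire corrected record:
adjusted = [c - offset for c in corrected]

# The early century is now 0.3 cooler than it was measured, while the
# recent decade is restored to the raw level: a trend out of nothing.
print(round(adjusted[0], 2))    # 14.7
print(round(adjusted[-1], 2))   # 15.0
```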
IMHO, these lines of code from this specific program are one of the major sources of error and bias in GIStemp.
It would be enlightening to run GIStemp and remove, one at a time, the various buggeries in it, starting with this one, and watch the globe “cool”…
What does this have to do with a satellite article? Well, it looks like they are using adjustments in that data to cook it as well… My guess would be that they are “calibrating” against the “historical record” via something like GIStemp. A very bad idea…
The fundamental problem with Holdren is that he doesn’t understand our system of government. In his writings is this gem:
He believes that unless the Constitution specifically conveys a right to the people, then they don’t have that right but government does. His fundamental understanding of how our government works is backwards. Unless it is specifically in the constitution, the GOVERNMENT doesn’t have a right. All other rights are owned by the people. First sentence of the 10th amendment.
While he is writing about reproduction, this thinking would extend to anything. Our right to anything. He could just as easily be saying, “where in the Constitution is your right to electricity, or a heater, or a car”.
The answer is, “In the 10th amendment”
“The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.”
There is your right to lightbulbs, cars, heaters, etc.
Jeff Id (21:26:06) :
Here is an unscaled plot comparing TSI and CO2. Which do you think has more possibility of changing temperature?!!! It’s a tricky one!
http://img528.imageshack.us/img528/7777/tsico2realscale.jpg
And sure enough, Accuweather was right on schedule with the headline “Satellites Indicate June 2009 was NORMAL Globally”.
“when at college in my “yoof” during engineering exams, programmable calculators were forbidden”
In my “yout” (in NY) we used slide rules – one Physics prof wanted answers
to 3 decimal places. He would have loved calculators with answers to 17
decimal places. The difference between Engineers and Scientists.
Pofarmer (08:10:54) : If you do a minimal bit of Googling, you will find some lows of late that have not been seen for decades in the Northern US and Canada. It’s hot down South, but colder than normal up North. Like someone said, it ain’t all the US – that’s why I like UAH GLOBAL data. There is 30 years worth of data there and in some climatological circles, 30 years is long enough to be considered climate rather than weather.
BBC Bloom
For the regulars here, this post
Shanta (03:14:54) :
Did Al Gore deserve the Nobel ‘Peace’ Prize? Discuss this on BBC Bloom:
http://www.bbc.co.uk/blogs/climatechange/2009/07
This must be Shanta Barley, who is currently “running” the BBC blog Climate Change The Blog of Bloom. It is about the only source of scepticism on the BBC, and the fact that Shanta has posted here must say something.
The blog attracts plenty of sceptical comment, although there is a very irritating troll who rants and rants.
Cheers
Paul
E.M.Smith (09:30:43) : Looks like many surface temp data sets are “adjusted” out of all proportion. Your analysis highlights a boat load of that statistical fairy dust. Love it!
For those who need a quick background on the University of Alabama Huntsville’s (UAH) microwave sounding unit (MSU) on board a satellite, here is the following synopsis:
http://www.uah.edu/News/climatebackground.php
-snip- Houston, Tx and tell me if you think this June is cooler than any other we’ve had. We’ve got drought conditions and have been in 100º+ weather for weeks.
Tell you what… I’ll even let you ride in my car that has the AC blown out in it.
MMMMmmm but it’s still 48 degrees centigrade over here…………….can ya help?
Flanagan (11:42:05) :
“Always the same analysis of small wiggles… Don’t you see that the very fact that a zero anomaly is hailed like that (for one month only) is a proof that it has become increasingly rare? How many times in the last 10 years (that is last 120 months) did we get close to, or equal to zero?”
I am not sure whether you are being unintentionally or deliberately misleading. GMT is not characterized as being stable, and therefore zero is not a normal anomaly. From 1993 to 2002 we were on the positive side of multiple oscillations, in addition to recovery from volcanic impact. To get back to zero now is not at all surprising for those who do not expect CO2 to dominate. Add on an emergence from the LIA (which a reasonable person could say is not dependent on CO2), and one would not expect zero very often in non-satellite data.
Chris V. (18:50:12) : I have read it and I stand by my statement. If he meant that such analyses have to be done to get climate data and have been in certain cases, he should have noted as such. As it stands, it sounds like he is saying that the RSS and UAH data still suffer from those problems, even though they don’t. (In RSS’s case there is actually evidence their corrections are overzealous, but since no one on the warm side of this seems to care about evidence suggesting that’s the case (UAH MUST be wrong – we just KNOW), I won’t bother.)
The only thing I will change is to say that he is definitely not ignorant. But he is disingenuous or sloppy.
“I’ll even let you ride in my car that has the AC blown out in it.”
A/C repairmen are in high demand in San Antonio right now.
My better half sets the thermostat at 88°F to save energy ($$).
It’s not comfortable, but it’s tolerable.
Tom in Texas (11:01:34) :
“when at college in my “yoof” during engineering exams, programmable calculators were forbidden”
In my “yout” (in NY) we used slide rules – one Physics prof wanted answers
to 3 decimal places. He would have loved calculators with answers to 17
decimal places. The difference between Engineers and Scientists.
My old dad just gave me one of his old slide rules. A danish 130mm job in a leather slip case. Now I’ll be able to work out the effect of a big solar flare after the next Carrington event hits us.
My uncle told me engineers used to design things 10x stronger than needed, because an order of magnitude error on a slide rule the wrong way was bad news.
timetochooseagain (12:18:19) :
I have read it and I stand by my statement. If he meant that such analyses have to be done to get climate data and have been in certain cases, he should have noted as such.
Are you sure you’ve read it??? He gives several examples where it has been done (page 3, second paragraph). One example he gives specifically deals with UAH and RSS (the report cited in his footnote #3).
So it’s pretty clear that UAH and RSS meet his criteria for climate data records.
jules (12:02:10) :
See …
http://www.climatedepot.com/a/1226/Brrrrr-Too-cold-for-ice-cream-Parts-of-US-forecast-to-have-a-year-without-a-summer
http://news.yahoo.com/s/nm/20090710/us_nm/us_blight_usa
http://wattsupwiththat.com/2009/07/09/chicago-coolest-july-8th-in-118-years/
http://forecast.weather.gov/product.php?site=NWS&issuedby=OKX&product=PNS&format=CI&version=4&glossary=0