CRU's shifting sands of global surface temperature

Excerpt from The Inconvenient Skeptic by John Kehr


The longer I am involved in the global warming debate, the more frustrated I get with the CRU temperature data. This is one of the most commonly cited sources of global temperature data, but the numbers just don’t stay put. Each and every month the past monthly temperatures are revised. Since I enter the data into a spreadsheet each month, I am constantly seeing the shifts in the data. If it were the third significant digit it wouldn’t bother me (very much), but it is much more than that.

For example, since September 2010 I have recorded two very different values for January 2010. Here are the values for January based on the date I gathered them.

Sep 10th, 2010:  January 2010 anomaly was  0.707 °C

Jan 30th, 2011:  January 2010 anomaly is now 0.675 °C

That is a shift of 0.032 °C, close to 5% of the value for last January, and it has taken place in just the past four months. All of the initial months of the year show a fairly significant shift in temperature.
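A minimal sketch of that arithmetic, using the two figures quoted above (the exact percentage works out to roughly 4.5%):

first = 0.707   # anomaly as downloaded 10 Sep 2010 (deg C)
latest = 0.675  # anomaly as downloaded 30 Jan 2011 (deg C)
shift = (first - latest) / first
print(f"{first - latest:.3f} deg C, {shift:.1%} of the original value")  # 0.032 deg C, 4.5%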

Monthly temperature values for global temperature change on a regular basis.

Read the entire post here

=============================================================

Some of this may be related to late reporting of GHCN stations, a problem we’ve also seen with GISS. Both GISS and CRU use GHCN station data, which are received via CLIMAT reports. Some countries report better than others; some update quickly, some take weeks or months to report in to NOAA/NCDC, which manages GHCN.

The data trickle-in problem can affect the determination of global temperature and the pronouncements made about it. What looks like a record month at the time of the announcement may not be one a few months later, when all of the surface data is in. It might be valuable to go back and look at such claims later to see how much the monthly values quoted in past news reports have since shifted.
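A minimal sketch of that kind of check, assuming you have saved dated snapshots of a monthly anomaly series and parsed each into a {(year, month): value} mapping (the parsing itself depends on which product you download; the values used below are just the two January 2010 figures from the excerpt):

def report_shifts(older, newer, threshold=0.005):
    """older/newer: dicts mapping (year, month) -> anomaly in deg C."""
    for key in sorted(older.keys() & newer.keys()):
        delta = newer[key] - older[key]
        if abs(delta) >= threshold:
            print(f"{key[0]}-{key[1]:02d}: {older[key]:+.3f} -> {newer[key]:+.3f} ({delta:+.3f})")

report_shifts({(2010, 1): 0.707}, {(2010, 1): 0.675})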

More on CLIMAT reports here in the WMO technical manual.

UPDATE: For more on the continually shifting data issue, see this WUWT post from John Goetz on what he sees in the GISS surface temperature record:

http://wattsupwiththat.com/2008/04/08/rewriting-history-time-and-time-again/



123 Comments
Alexej Buergin
January 31, 2011 10:30 am

In spite of all the adjustments, the trend during the first decade of the millennium is (slightly) negative. See
http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt

Sarah
January 31, 2011 10:44 am

As far as I am concerned, the lack of respect accorded to data that is being used to justify economic shifts in the trillions around the world is one of the biggest problems with climate data today. Regardless of concerns about the accuracy of the initial data, there is somehow a belief that it is perfectly OK to modify historical data at any time without any justification or audit trail. That would not wash in practically any other field, where historical adjustments need to be carefully recorded and justified to ensure that the data is not being updated to match the theory rather than theories being updated to match the data.

January 31, 2011 11:01 am

P Wilson says:
January 31, 2011 at 5:30 am
Don says:
January 31, 2011 at 5:00 am
God forbid if the data for the efficacy and safety of a new drug was collected in this manner.
Good analogy. Where facts and data are vital, such as medicine and drugs, such latitude isn’t permitted, neither is propaganda. With climate *science*, such abrogation is not unsafe, since there’s nothing at stake either way, being mere voodoo as a consensus.
==================
And I see another poster above has weighed in with an account of how well regulated medical research is compared with climate science….
Yet there is a LONG list of drugs, procedures and devices that have been withdrawn after passing the four stages of clinical trials and regulatory approval.
This despite the method of reporting adverse reactions after release being woefully patchy and inadequate.
There are also several notorious cases where big Pharma DID manipulate the data, mainly by omission of negative results, to gain approval.
Recently individual clinicians have been threatened with libel suits when they raised the issue of problems with medical products at scientific meetings.
Climate science is a model of ethical probity compared with some of the shenanigans that have occurred in medical research. That is one of the reasons FOR the extensive and close regulation that is imposed.
Anyone who thinks that medical research is fraud free while climate science is riddled with it has it backwards, and would be rapidly disillusioned by a modicum of investigation.

Jean Parisot
January 31, 2011 11:02 am

I wonder if there is any value in applying some of our Government’s favorite performance measurement/earned value techniques to the “accuracy” of the various institutions’ measurements and models?

Bob (Sceptical Redcoat)
January 31, 2011 11:03 am

Where do they get atmospheric temperature readings to 3 decimal places of accuracy – in the laboratory or computer, perhaps?

Rob R
January 31, 2011 11:20 am

“The future is certain and the past is always near” from the song “Roadhouse Blues” by Jim Morrison and the Doors via John Lee Hooker.
For the Team it seems to be that the past is always here.

January 31, 2011 11:20 am

John:
Last year I attended a “Global Warming” lecture by a “retired climatologist” from the University of Wisconsin, Madison. During the lecture, we were free to ask questions. The audience was a combination of the Minnesota Futurists and the South Metro (Minneapolis) Critical Thinking Club. About 60 people in attendance.
I asked 6 questions during the talk. None of them “agenda” oriented, all of them technical. The first question I asked was, “Dr. X, I’m going to ask you to do a thought experiment with me.” He agreed, if it didn’t involve too much thought. I then outlined the idea of taking ten 16 oz styrofoam cups and randomly putting various amounts of hot, warm, and cold water in each cup. I said, “Now let’s measure the temperature of the water in each cup, then put all the contents into ONE insulated container. If I ‘average’ the ABSOLUTE temperature measured in each cup, will it match the ‘temperature’ of the aggregate?”
Fortunately, the retired professor was sharp enough and honest enough to say “No..”
There were two or three intervening questions, then I asked this: “An 86 F day in Minnesota at 60% RH has a total enthalpy in a cubic foot of air of 38 BTU. A 105 F day in Phoenix at 15% RH has 33 BTU per cubic foot. Which volume has more ENERGY in it?”
Again, fortunately, he answered, “The Minnesota air.”
Then I asked which environment was “hotter”. “Obviously the AZ air!” he answered.
This, of course, led to my FINAL QUESTION: “Since the ‘Global Warming’ claim is predicated upon the atmospheric energy balance, and since “average temperature” does not include the effect of relative humidity, isn’t “average temperature” a completely fictitious number, and something without merit on a technical basis?”
AFTER the presentation, a group of 3 engineers (myself included) and two programming types were outside the banquet hall, on a balcony, having some “free pop” from the event. One of the other engineers asked me, “Did you get any bad dents from this?” I said, “What do you mean by dents?” He said, “When you RAN OVER Dr. X, after setting him up with the atmospheric energy questions… and he gave you the ‘deer in the headlights’ look when you asked your final question.”
We all laughed! Totally true. However, the good Dr. did try to salvage something: “Well, actually, RH IS included in the calculation of average temperature.” Now I must give a hat tip to one of the programming types. He took my card, and a few days later emailed me a host of information indicating that “average temperature” is indeed just that: numerically averaging temperature values over spatial distribution and time. BUT there is NO consideration of “atmospheric energy”.
SO dear fellow Engineer (Chemical too, I might add!) John K., can we get down to brass tacks (if not brass knuckles!) and start to exposit that the “King has NO NEW CLOTHES” and that “average temperature” is a meaningless contrivance?
I know you are heading this way as it is, but I thought I’d give you a nudge in that direction.
Max
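A minimal check of the enthalpy figures quoted in the comment above, using the standard psychrometric approximation. Note that the results come out in BTU per pound of dry air (the usual convention), which is what the quoted 38 and 33 BTU figures match; the sea-level pressure and the Magnus saturation formula are assumptions of this sketch:

import math

P_ATM_KPA = 101.325  # sea-level pressure, assumed

def sat_vapor_pressure_kpa(t_c):
    # Magnus approximation for saturation vapour pressure over water (kPa)
    return 0.6112 * math.exp(17.62 * t_c / (243.12 + t_c))

def moist_air_enthalpy_btu_per_lb(t_f, rh):
    # Moist-air enthalpy in BTU per pound of dry air
    t_c = (t_f - 32.0) * 5.0 / 9.0
    p_v = rh * sat_vapor_pressure_kpa(t_c)   # partial pressure of water vapour
    w = 0.622 * p_v / (P_ATM_KPA - p_v)      # humidity ratio (lb water per lb dry air)
    return 0.240 * t_f + w * (1061.0 + 0.444 * t_f)

print(moist_air_enthalpy_btu_per_lb(86.0, 0.60))   # ~38 BTU/lb, the Minnesota case
print(moist_air_enthalpy_btu_per_lb(105.0, 0.15))  # ~33 BTU/lb, the Phoenix case

The cooler, more humid Minnesota air carries the higher enthalpy, which is the point of the question: an average of dry-bulb temperatures says nothing about energy content.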

Al Gored
January 31, 2011 11:21 am

Juraj V. says:
January 31, 2011 at 5:59 am
“Those who read George Orwell’s “1984” will understand.”
Indeed. The January 2010 anomaly has always been 0.707 °C.
This data is as reliable and credible as government economic statistics. So, while it may have been the hottest year at least there was no inflation, even with ‘green shoots’ popping up everywhere as the economic boom got going. And U.S. unemployment isn’t all that bad if you don’t count everybody.
Off topic, sort of, here’s a great condensed summary of why it was so generous of the ‘little people’ to have bailed out the poor Banksters:
http://www.zerohedge.com/article/step-aside-bernank-here-comes-timothy-jeethner-bears-explain-banker-bailouts-and-screwing-am#comments

bubbagyro
January 31, 2011 11:24 am

Sarah says:
January 31, 2011 at 10:44 am
Thank you, Sarah, for picking up on this; I was afraid I had gotten off-topic.
But yes, almost all other industries have accountability. What I was getting around to was proposing changes, like Sarbanes-Oxley rules, so that accountability would be brought to bear on these climate change artists. We could start with criminal penalties, as now exist for Pharmas, Wall Street, etc., if data does not have a validation trail. We could adopt ISO international protocols for data gathering, storage, and transmittal. I am loath to suggest another bureaucracy, but something is needed to provide quality assurance.
For FDA or ISO standards, for example, the manufacturers have to provide levels of validation: ensuring that equipment is accurate and serviced, that software is validated so that what is measured is what is stored, providing protocols for rounding figures and for significant-figure standards in calculations, specifying how averages are calculated (do we average nine stations in three groups of three and then average the three group results, or average all nine at once? It makes a difference; see the sketch after this comment.), how data stations are extrapolated and interpolated, how error remediation is performed, how Stevenson Screens are brought into conformity, and how, or whether, the data are used until they are.
It seems to me that we also need more monitoring stations, not fewer, in remote places, under strict SOP protocols. And this would cost orders of magnitude less than what we are spending now on frivolous polar bear studies and the like. Not that land stations are the ultimate solution to measuring the globe’s energy balance: Anthony and others have shown the perils in interpreting these results, no matter how accurate the method that gathers the data. At least it would eliminate some variables.
I can’t touch on all the weaknesses, like fixing peer review, activism vs. science, and the like, but I think we should all become cAGW activists! (citizens Against Government Waste, that is)
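A minimal sketch of the averaging-order point raised in the comment above: with complete, equal-sized groups the two approaches agree, but as soon as one reading is missing (or the groups differ in size) they diverge:

# Mean of group means vs. mean of all stations at once.
groups = [
    [10.0, 11.0, 12.0],
    [20.0, 21.0],        # one station missing this month
    [30.0, 31.0, 32.0],
]

all_values = [v for g in groups for v in g]
mean_of_all = sum(all_values) / len(all_values)
mean_of_group_means = sum(sum(g) / len(g) for g in groups) / len(groups)

print(mean_of_all)          # 20.875
print(mean_of_group_means)  # 20.833...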

Jeff
January 31, 2011 11:24 am

Maybe they should bring back slide rules so folks could learn about precision…
Speaking of tweaked figures and panic, here’s more from the sky is falling department:
http://content.usatoday.com/communities/greenhouse/post/2011/01/arctic-waters-warmest-2000-years/1?csp=34
(hope I got the link right)
it’s getting to be more than the polars can bear….

bubbagyro
January 31, 2011 11:29 am

Site validation and approval
Right now, all Pharmas have to also validate their R&D, clinical, and manufacturing sites. They undergo constant inspection, and if they fail, they are shut down, with disastrous consequences.
Same with manufacturing sites, in general. In order to pass ISO standards, they have to be open to outside auditing and inspection by disinterested third parties, both in the US and abroad. This way, each country operates under the same rules, and quality is assured across countries.
REPLY: And I’ll point out that you won’t find a single climate center in the world with an ISO-9000 or ISO-8000 certification – Anthony

R. de Haan
January 31, 2011 11:33 am

There is a saying that covers it. “Blowing smoke up your ass”.
Because that’s what they’re doing.
Creating chaos in data land keeps us busy, and in the meantime…..
Watch the news.

sky
January 31, 2011 12:12 pm

richard verney says:
January 31, 2011 at 5:33 am
If you think that land-station records are ALL so bad that they should be scrapped in favor of oceanic records, then you must be unaware that there are NO century-long oceanic records made at any FIXED location. The oceanic time series that everyone uses are SYNTHESIZED from scattered observations made UNDERWAY by ships of opportunity. It’s a nightmare from the standpoint of data integrity and reliability. At least with land stations we can select records that are little affected by UHI and land-use changes and reject all those that have been arbitrarily “adjusted” or “homogenized.”

latitude
January 31, 2011 12:35 pm

Sarah, it’s a lot easier to adjust historical temps down, than present day temps up, to show artificial warming.
That way, no matter what the actual temp is today, it will always be warmer.

Sunspot
January 31, 2011 12:44 pm

I have downloaded a number of GISS temperature anomaly data sets from “Wood For Trees” over the past 7 years and I can tell you that very little remains constant. Comparing the graphs, most temperatures as far back as 1880 are now adjusted lower, whereas temperatures after 1998 are adjusted upwards. The later graphs show much more of an incline than the older data sets. While this is all happening, temperature data is also being adjusted at the local weather recording station here in OZ before it gets up to Dr. Jim.

Mark T
January 31, 2011 12:52 pm

Max, Max, Max…

AFTER the presentation, a group of 3 engineers (myself included) and two programming types, were outside the banquet hall, on a balcony, having some “free pop” from the event.

I’m not sure I buy the free pop statement. 🙂
Mark

onion2
January 31, 2011 1:00 pm

If anyone is really bothered about the data handling used to produce the CRU and GISTEMP surface records and wants to verify them I recommend producing your own surface temperature record maintained by someone you trust.
Take the station input and demonstrate how scientists should be handling it – use whatever backup system, adjustments (or lack of), change logs, maintenance, you think is best. Whatever you think the CRU and GISTEMP scientists are doing wrong in terms of data handling, do it yourselves properly. And by doing so demonstrate to everyone that this works better.
I mean you might discover that the subject of this article – past data changing – is an unavoidable feature of maintaining a surface temperature record from month to month. Or you might not.
But until you try you are just leaving issues like this up in the air for speculation.
REPLY: Just curious since you are in the UK, do you work for CRU or the government there? – Anthony

Dallas Tisdale
January 31, 2011 1:01 pm

It is a shame they can’t keep a log of adjustments and corrections like UAH does.

melinspain
January 31, 2011 1:02 pm

ISO standards – good, bad, or both? But always a costly nightmare to implement.
A thread on this matter would be interesting.

George E. Smith
January 31, 2011 1:03 pm

“”””” Ric Werme says:
January 31, 2011 at 6:06 am
Tony Hansen says:
January 31, 2011 at 5:02 am
John,
> I too worry about the ‘adjustments’, but I also worry about your ’5%’.
5% of what?
Of the anomaly.
$ python
>>> .707 * .95
0.67164999999999997 “””””
Actually, my stick in the sand calculator gives 0.67165 exactly for 95% of 0.707.
You need a new supercomputer; or maybe just a new stick.
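The long string of 9s in the quoted session is just binary floating point at work: 0.707 and 0.95 have no exact binary representation, so the product picks up a tiny representation error (the exact digits displayed depend on the Python version). A minimal sketch showing decimal arithmetic agreeing with the stick in the sand:

from decimal import Decimal

print(0.707 * 0.95)                         # binary float; printed digits depend on the Python version
print(Decimal("0.707") * Decimal("0.95"))   # 0.67165 exactly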

onion2
January 31, 2011 1:07 pm

Sarah says:
January 31, 2011 at 10:44 am
“Regardless of concerns regarding accuracy of the initial data there is somehow a belief that it is perfectly ok to modify historical data at any time without any justification or audit trail.”
You don’t need an audit trail. Take the initial data yourself and check that the result follows from it. Having detailed source code, change logs and audit trails is useful for tracking down a problem, but you don’t need any of those to determine whether or not there is one, which is the first port of call.
In this case take the initial GHCN data and see whether 2010 really is the 2nd warmest year in that record, or find out what rank it does fall at. If you find a very different result (e.g. 2010 is the 10th warmest year) then there’s an issue to explore. But if you find it’s the 1st, 2nd or 3rd then there is no indication the result is wrong.
In my mind the different temperature products all check each other in this way. Different source code on the same initial data producing results that can be compared.
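A minimal sketch of that kind of check, assuming you have already parsed a downloaded anomaly product into (year, month, anomaly) tuples; "anomalies" below is a stand-in for that parsed data, and the all-twelve-months rule is an assumption of this sketch:

from collections import defaultdict

def rank_years(monthly):
    # monthly: iterable of (year, month, anomaly) tuples
    by_year = defaultdict(list)
    for year, _month, anom in monthly:
        by_year[year].append(anom)
    # only rank years with all twelve months reported
    annual = {y: sum(v) / len(v) for y, v in by_year.items() if len(v) == 12}
    return sorted(annual, key=annual.get, reverse=True)  # warmest year first

# ranking = rank_years(anomalies)
# print(ranking.index(2010) + 1)   # 1 would mean 2010 is the warmest year in the record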

onion2
January 31, 2011 1:12 pm

Berényi Péter says:
January 31, 2011 at 9:05 am
“No, it must be something else. I make local backups of GHCN V2 from NOAA. I have a copy of v2.mean (raw monthly temperatures by station) from 2010-06-28 22:48:37 UTC. Now I have downloaded it again at 2011-01-31 15:33:26 UTC.
There are only three differences between the two copies for January 2010, they are as follows:
13167341000 LOURENCO MARQUES/COUNTINHO (MOZAMBIQUE) -25.90 32.60 (27.0°C)
30489056000 CENTRO MET.AN (CHILE) -62.42 -58.88 (0.2°C)
42572597000 MEDFORD/MEDFO (U.S.A.) 42.37 -122.87 (7.9°C)
In the June 2010 version there’s no January 2010 data for these stations while in the current version they have one (in parentheses at end of line).
John Kehr says that between 10 September 2010 and 30 January 2011 the CRU global anomaly for January 2010 changed by almost 5%. The June 2010 version contains 1447 valid temperatures for January 2010 from all over the globe; the addition of just 3 data points cannot possibly cause such a shift.
Do they keep changing the adjustment algorithm? Is it documented anywhere?”
Don’t tell me everyone here has missed this:
http://hadobs.metoffice.com/crutem3/jan_2010_update.html
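A minimal sketch of the snapshot comparison described in the quoted comment, assuming the GHCN v2.mean fixed-width layout (a 12-character station/duplicate id, a 4-digit year, then twelve 5-character monthly values in tenths of a degree, with -9999 marking a missing month); the filenames are placeholders for locally saved copies:

def january_values(path, year="2010"):
    values = {}
    with open(path) as f:
        for line in f:
            station, yr = line[:12], line[12:16]
            if yr != year:
                continue
            jan = int(line[16:21])             # first of the twelve monthly fields
            if jan != -9999:                   # -9999 means no January value reported
                values[station] = jan / 10.0   # values are stored in tenths of a degree C
    return values

old = january_values("v2.mean.2010-06-28")   # placeholder snapshot filenames
new = january_values("v2.mean.2011-01-31")
for station in sorted(old.keys() | new.keys()):
    if old.get(station) != new.get(station):
        print(station, old.get(station), "->", new.get(station))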

onion2
January 31, 2011 1:25 pm

no

onion2
January 31, 2011 1:27 pm

Re Dallas Tisdale (No relation to Bob, er that Bob) says:
January 31, 2011 at 1:01 pm
“It is a shame they can’t keep a log of adjustment and corrections like UAH does.”
Yes, it would be nice if there was more information published whenever anything changed each month. In this case a quick summary would do. It sounds like something has changed with the processing in the last few months (which the Jan 2010 update I linked to obviously doesn’t cover).
I know they were planning to change the algorithm at some point (I thought it was at the new year). If they have done that then there’s nothing on the site indicating it.

January 31, 2011 1:43 pm

@-Bob(Sceptical Redcoat) says:
January 31, 2011 at 11:03 am
“Where do they get atmospheric temperature readings to 3 decimal places of accuracy – in the laboratory or computer, perhaps?”
The number of decimal places is a reflection of the sample size rather than the accuracy of the readings.
Try thinking of it like this…
If a large number of thermometers can only be read to the nearest whole degree and are measuring a temperature of about 0.5 deg then the data you get will be a long list of zeros and ones.
Like a coin toss, where heads and tails are given values of 1 and 0: if you average just a few data points the answer will be somewhere between 0 and 1, but with great uncertainty.
But if you average thousands of coin tosses/thermometer readings the average will narrow down to 0.500 and the number of decimal places that you can give will be related to the number of data points.
If it doesn’t narrow down to 0.5, but to some other value, then you can conclude with certainty that the coin is weighted, or that the temperature is not 0.5 deg but closer to what your average turns out to be.
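A minimal simulation of that argument, with thermometers that can only record whole degrees all measuring a true value of 0.5 (the size of the read-out noise here is an assumption; some spread across the rounding threshold is what makes the averaging work):

import random

def mean_of_rounded_readings(n, true_value=0.5, noise=0.5):
    readings = []
    for _ in range(n):
        measured = true_value + random.uniform(-noise, noise)  # what the instrument sees
        readings.append(round(measured))                       # what gets recorded: whole degrees
    return sum(readings) / n

for n in (10, 1_000, 100_000):
    print(n, mean_of_rounded_readings(n))   # tightens around 0.5 as n grows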
