Excerpt from The Inconvenient Skeptic by John Kehr
The longer I am involved in the global warming debate, the more frustrated I am getting with the CRU temperature data. This is one of the most commonly cited sources of global temperature data, but the numbers just don’t stay put. Each and every month the past monthly temperatures are revised. Since I enter the data into a spreadsheet each month, I am constantly seeing the shift in the data. If it were the third significant digit it wouldn’t bother me (very much), but it is much more than that.
For example, since September 2010 I have recorded two very different values for January 2010. Here are the values for that month, based on the date I gathered them.
Sep 10th, 2010: January 2010 anomaly was 0.707 °C
Jan 30th, 2011: January 2010 anomaly is now 0.675 °C
That is nearly a 5% shift in the value for last January, and it has taken place in just the past 4 months. All of the initial months of the year show a fairly significant shift in temperature.
[Figure: Monthly temperature values for global temperature change on a regular basis.]
Read the entire post here
=============================================================
Some of this may be related to late reporting of GHCN stations, a problem we’ve also seen with GISS. Both GISS and CRU use GHCN station data, which are received via CLIMAT reports. Some countries report better than others; some update quickly, some take weeks or months to report in to NOAA/NCDC, who manage GHCN.
The data trickle-in problem can have effects on determining the global temperature and making pronouncements about it. What might be a record month at the time of the announcement may not be a few months later, when all of the surface data is in. It might be valuable to go back and look at such claims later to see how much the monthly temperature values quoted in past news reports have shifted in the present.
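One way to do that is to keep dated snapshots of the published series and diff them. Here is a minimal sketch of the bookkeeping; the file names and the “YYYY-MM value” layout are hypothetical, chosen only for illustration, since the real CRU files have their own plain-text formats:

```python
# Compare two archived snapshots of a monthly anomaly series.
# File names and column layout are hypothetical placeholders.

def load_anomalies(path):
    """Read 'YYYY-MM value' lines into a dict keyed by month."""
    series = {}
    with open(path) as f:
        for line in f:
            month, value = line.split()
            series[month] = float(value)
    return series

old = load_anomalies("crutem_snapshot_2010-09-10.txt")
new = load_anomalies("crutem_snapshot_2011-01-30.txt")

for month in sorted(old.keys() & new.keys()):
    if old[month] != new[month]:
        shift = 100 * (new[month] - old[month]) / old[month]
        print(f"{month}: {old[month]:.3f} -> {new[month]:.3f} ({shift:+.1f}%)")

# For the January 2010 example above: 0.707 -> 0.675 is about -4.5%.
```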
More on CLIMAT reports here in the WMO technical manual.
UPDATE: For more on the continually shifting data issue, see this WUWT post from John Goetz on what he sees in the GISS surface temperature record:
http://wattsupwiththat.com/2008/04/08/rewriting-history-time-and-time-again/

Grainger:
‘Surely, it is ridiculous to produce temperature figures with an implied accuracy to three decimal places? Does any one reporting these temperatures read the instrument to that level of precision?’
No! It is not ridiculous at all. Equipment can easily measure the temperature anywhere to +/-0.001 degree (satellite data) or +/-0.01 for ground and sea data at any instant of time, though of course a few seconds later that temperature will have changed.
Thus you’ll collect a large cluster of raw temperature measurements together with their error bars. Average them, do the statistics and see if the average value differs significantly from that of another data set. I think we can assume that in the figures quoted the statistics justify the thousandth place data.
As someone who has spent time measuring water density to 1 part per million, I can say that water sitting anywhere other than in a proportionally-controlled water bath designed to keep temperatures to +/-0.001 degree varies in temperature by several hundredths of a degree almost by the second, which is, I guess, why the quoted averages are less precise than those for the satellite data. Unnecessary and expensive acquisition of data!
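A minimal sketch of that standard-error argument; the per-reading precision and sample sizes below are made-up numbers, chosen only to show how the standard error shrinks:

```python
import math

# Each reading may only be good to +/- 0.01 deg C, but the standard
# error of the mean of N independent readings shrinks as 1/sqrt(N).
per_reading_error = 0.01  # deg C, an assumed instrument precision

for n in (1, 100, 10_000):
    sem = per_reading_error / math.sqrt(n)
    print(f"N = {n:>6}: standard error of the mean ~ {sem:.4f} deg C")

# With ~10,000 independent readings the mean is, in principle,
# defensible to the thousandth place -- IF the errors really are
# independent and unbiased, which is the contested assumption.
```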
Onion2,
Interesting that they acknowledge they did not get things right, such as not updating the recent ‘normals’; but now that they have made the modifications, the major effect seems to be a lowering of the temperature in the early part of the record.
Wow, who would have thought that changes to recent temp records could have such an effect!
I thoroughly dislike the anonymity of “global” temperatures. There are so many c**p issues involved, station loss (urban vs rural, esp. Arctic), UHI, instrument/station sensitivity changes (several issues), station location (lots of issues), for starters, then all the “necessary” adjustments. And when all the c**p has done its work, I fear the satellites themselves might risk being adjusted to phony calibrations that do NOT allow properly for orbit decay etc.
The answer is very simple. Under our noses in fact. IMHO.
Why try to get a totally sci-fi “global average” at all? Why not be content with some local stories: just use enough rural stations of reasonable trustworthiness to provide a statistically significant signal. Like John Daly did. Like I suspect my article does, if it could be statistically quantified – certainly it is compelling visually.
As Anthony pointed out, none of the world’s climate centres hold any of the ISO certifications, yet our governments are keen to make policy based on this junk data. Goes to show you what people will do when there is a lot of money at stake.
Monthly raw is not raw. The data is collected daily or more often, e.g. max/min. The “monthly” value already has a missing-days treatment. Then the missing months are estimated. But the “missing months” were in general not empty, they just had fewer days than some limit. The more normal approach to missing daily records is to go back to the daily record and fit from the measurements you have. If you know there is no trend, you can use later measurements in the fitting mix; however, warmers can NOT assume no trend. If you want to find any trends in the data, you have to privilege the data from around the same period. If you have 25 days in May 2007, 5 days in June, and 20 days in July 2007, the normal method to estimate June would be to use the days you have, and not throw away the 5 days in June. Common methods in other fields would be to infill the gaps with the mean of the two end dates or, better, fit a polynomial through the known points.
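A minimal sketch of that last approach, using the hypothetical May/June/July 2007 example; the numbers are synthetic, standing in for a real station record:

```python
import numpy as np

# Synthetic daily mean temperatures for May-July 2007 (92 days),
# standing in for a real station record.
rng = np.random.default_rng(0)
day = np.arange(92)  # May 1 = day 0
truth = 15 + 5 * np.sin(2 * np.pi * day / 365) + rng.normal(0, 1, 92)

# Suppose we kept 25 days of May, 5 days of June, 20 days of July.
have = np.r_[0:25, 35:40, 61:81]
june = np.arange(31, 61)  # June = days 31..60

# Fit a low-order polynomial through the days we DO have -- keeping
# the 5 June readings in the mix rather than discarding them.
coeffs = np.polyfit(day[have], truth[have], deg=2)
june_estimate = np.polyval(coeffs, june).mean()

print(f"estimated June mean: {june_estimate:.2f} deg C")
print(f"'true' June mean:    {truth[june].mean():.2f} deg C")
```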
You also need to investigate, as far as possible, why you have missing data, in order to estimate systematic errors – e.g. you may have less missing data in February as it is a shorter month. Plus, a lot of the “missing data” only appears to be missing if you do not look for it.
You should be working with the daily data even if you want to graph month results; but people are not doing this as the raw daily just is not publicly available. I wonder if it is even available to the magic circle.
“”””” alleagra says:
January 31, 2011 at 1:51 pm
Grainger:
‘Surely, it is ridiculous to produce temperature figures with an implied accuracy to three decimal places? Does any one reporting these temperatures read the instrument to that level of precision?’
No! It is not ridiculous at all. Equipment can easily measure the temperature anywhere to +/-0.001 degree (satellite data) or +/-0.01 for ground and sea data at any instant of time though of course a few seconds later that temperature will have changed. “””””
Now, that satellite Temperature measurement to 1/1000th of a degree: just where exactly in, on, or around the planet was that measurement made? Would not the Temperature change that much between two points, say 1 cm apart, on the ground somewhere in a typical shopping mall (outside the buildings)? Can you really separate points that close from a satellite?
“”””” izen says:
January 31, 2011 at 1:43 pm
@-Bob(Sceptical Redcoat) says:
January 31, 2011 at 11:03 am
“Where do they get atmospheric temperature readings to 3 decimal places of accuracy – in the laboratory or computer, perhaps?”
The number of decimal places is a reflection of the sample size rather than the accuracy of the readings. “””””
Well you see, that IS the whole point. The “sample size” is precisely one reading, of which they record two a day: the maximum of all of the readings, and the minimum of all of the readings.
That is insufficient sampling density to even correctly determine the daily average Temperature (which is NOT a sinusoidally varying (single frequency) function of time).
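A minimal sketch of that sampling point; the diurnal profile below is invented, but any asymmetric one shows the same effect:

```python
import numpy as np

# A made-up, asymmetric (non-sinusoidal) diurnal profile:
# fast warming toward a 14:00 peak, slow decay elsewhere.
t = np.linspace(0, 24, 1441)                 # one sample per minute
temp = 10 + 8 * np.exp(-((t - 14) / 5) ** 2)

true_mean = temp.mean()                      # densely sampled average
minmax_mean = (temp.max() + temp.min()) / 2  # the two-reading estimate

print(f"true daily mean:   {true_mean:.2f} deg C")
print(f"(Tmax + Tmin) / 2: {minmax_mean:.2f} deg C")
# The two differ by about 1 deg C here: two readings per day only
# recover the mean of a purely sinusoidal cycle.
```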
The data set should report the number of stations reporting for each particular number presented… that way, if the number of stations changes, you’d know why! Also, the entire history and all calculations leading up to each and every number MUST specify its source and a full audit trail of computations for any of this “climate” “data” to be trusted.
onion2 says:
January 31, 2011 at 1:00 pm
If anyone is really bothered about the data handling used to produce the CRU and GISTEMP surface records and wants to verify them, I recommend producing your own surface temperature record, maintained by someone you trust.
That has always missed the point. So why don’t you instead figure out why what you recommend is not relevant? Hint: the necessary scepticism of the scientific method, as applied to the “materials and methods” of the CRU and GISS reconstructions, does not mean it also has to determine a “correct” GMT from surface stations. It doesn’t even involve having to accept that the GMT means anything at all, apart from the way it is constructed.
Is quoting Orwell cliche and hackneyed?
“All history was a palimpsest, scraped clean and reinscribed exactly as often as was necessary. Even the written instructions which Winston received … never stated or implied that an act of forgery was to be committed; always the reference was to slips, errors, misprints, or misquotations which it was necessary to put right in the interests of accuracy … there did not exist, and never again could exist, any standard against which it could be tested.”
izen says:
January 31, 2011 at 11:01 am
Your comparison proves the point. Drugs are withdrawn. Litigation commences.
In climate science, fraud on every scale is promoted, and the aftermath is that, when questioned, this *ethical probity* is whitewashed away and swept under the carpet (or into the broom cupboard), and they continue on their complacent and ideological path.
I don’t see the probity in this procedure.
Murray Grainger says:
January 31, 2011 at 5:14 am
Surely, it is ridiculous to produce temperature figures with an implied accuracy to three decimal places?
——–
No it’s not because the numbers are given explicit error bars.
It’s also useful to keep the decimal places to help pick up variations due to processing errors.
And to act as guard digits if the values are propagated into other calculations.
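A minimal sketch of the guard-digit point, with made-up anomaly values:

```python
# Guard digits in action: rounding before averaging is not the
# same as averaging and then rounding. Values are made up.
anomalies = [0.707, 0.674, 0.712, 0.668, 0.701]

full_mean = sum(anomalies) / len(anomalies)
rounded_first = sum(round(a, 2) for a in anomalies) / len(anomalies)

print(f"mean, then round:    {round(full_mean, 2)}")  # 0.69
print(f"round, then mean:    {rounded_first:.4f}")    # 0.6920
print(f"full-precision mean: {full_mean:.4f}")        # 0.6924
```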
——–
Does any one reporting these temperatures read the instrument to that level of precision?
——–
No they don’t. The values are not raw.
John Marshall says:
January 31, 2011 at 5:35 am
Do CRU still use surface-measured data? If so, then this shows how antiquated the system is. Since the surface data system does not have complete coverage, with that 2/3 loss of stations in 1990, these people should shift to satellite data sets. These show a slight global cooling, which is not what these people need to show.
——–
Err, no.
Having multiple data sets with different measurement techniques and data processing methods allows cross-comparisons and sanity checks to be performed, not to mention a bit of competition to encourage people to do their best.
The satellites were put up to solve known problems with the surface data sets. But it is my understanding that the satellite data had calibration problems for a number of years. Ask Roy Spencer about that.
Max Hugoson says
——–
“Now let’s measure the temperature of the water in each cup, then put all the contents into ONE insulated container. If I “average” the ABSOLUTE temperature measured in each cup, will it match the “temperature” of the aggregate?”
———
Let me guess. You have not actually done the experiment!
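For what it’s worth, the experiment is easy to do on paper; a minimal sketch with made-up masses and temperatures, assuming constant specific heat and no losses:

```python
# The mixing 'experiment' on paper: the final temperature of mixed
# water is the mass-weighted mean of the cup temperatures, which
# equals the simple average only when the masses are equal.
cups = [(0.25, 20.0), (0.25, 30.0), (0.50, 40.0)]  # (kg, deg C)

simple_average = sum(t for _, t in cups) / len(cups)
mixture = sum(m * t for m, t in cups) / sum(m for m, _ in cups)

print(f"simple average of cup temps: {simple_average:.1f} deg C")  # 30.0
print(f"temperature of the mixture:  {mixture:.1f} deg C")         # 32.5
```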
Max Hugoson says
——–
“Since the ‘Global Warming’ claim is predicated upon the atmospheric energy balance,
——
No, it’s not.
Ed Scott asks
——-
Why is it that, in discussing the greenhouse “effect,” heat transfer by convection is never mentioned?
——-
Because there is no air in outer space.
Please Sir? Can we call it fraud now?
REPLY: And I’ll point out that you won’t find a single climate center in the world with an ISO-9000 or ISO-8000 certification – Anthony
That is an excellent point…
Jimmy Haigh says:
February 1, 2011 at 12:43 am
REPLY: And I’ll point out that you won’t find a single climate center in the world with an ISO-9000 or ISO-8000 certification – Anthony
Actually, the UK Met Office DOES have ISO-9001-2000 certification.
http://www.metoffice.gov.uk/corporate/contracts/
“The Met Office holds ISO 9001-2000 accreditation and, wherever practical, requires that its suppliers also hold such accreditation.
The Met Office also holds ISO 14001-2004 and will increasingly look for environmentally aware suppliers.”
“”””” LazyTeenager says:
January 31, 2011 at 11:11 pm
Ed Scott asks
——-
Why is it that, in discussing the greenhouse “effect,” heat transfer by convection is never mentioned?
——-
Because there is no air in outer space. “””””
Very simple, Ed: if you look at Dr Kevin Trenberth’s cartoon diagram purporting to show the energy budget of the earth, you will see all manner of energy- and especially heat-transporting mechanisms – too many of them for me to list here – and “heat transfer by convection” is one of the ones he mentions.
Of that long list of mechanisms, ONLY ONE, the 390 W/m^2 surface emitted Long Wave Infra-red Radiation, is affected in ANY way by GREENHOUSE GASES, and in particular by CARBON DIOXIDE.
NO OTHER thermal process on earth is in any way impacted by the amount of CO2 in the atmosphere. There has never ever been enough CO2 in the earth’s atmosphere to affect atmospheric convection one iota. Thermals, Evapo-transpiration, and Surface radiation, which together amount to 492 W/m^2 as depicted by Trenberth, are not functions of atmospheric CO2 abundance, or of any other non-condensing greenhouse gas.
So the only effect of NCGHGs is partial absorption of the 390 W/m^2 Surface Radiation.
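For reference, the 492 W/m^2 figure is just the sum of the three surface-loss terms as depicted in the Kiehl-Trenberth (1997) budget diagram:

```python
# Surface energy-loss terms from the Kiehl-Trenberth (1997) diagram.
thermals = 24            # W/m^2, sensible heat carried by convection
evapotranspiration = 78  # W/m^2, latent heat
surface_radiation = 390  # W/m^2, longwave emitted by the surface

print(thermals + evapotranspiration + surface_radiation)  # 492 W/m^2
```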
Murphy’s Law corollary:
“Constants aren’t; variables won’t.”
Arguments about decimal places overlook the impact of adjustments made to the station records that totally alter the century-long trend. For example, USHCN Ver2 adds 0.97 K to the trend at San Antonio, TX relative to the Ver1 record for the period 1895-2005, changing its POLARITY! That’s how the AGW gorilla forages in the database.
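A minimal sketch of how an adjustment of that size flips the sign of a century-scale trend; the series below is synthetic, not the actual USHCN San Antonio data:

```python
import numpy as np

# Synthetic illustration of how adding ~0.97 K across 1895-2005
# can flip a trend's sign. NOT the real San Antonio record.
years = np.arange(1895, 2006)
rng = np.random.default_rng(1)

# Raw series with a slight cooling trend of -0.4 K/century plus noise.
raw = 20.0 - 0.004 * (years - 1895) + rng.normal(0, 0.3, years.size)

# Hypothetical Ver2-style adjustment: add 0.97 K linearly over the period.
adjusted = raw + (0.97 / (2005 - 1895)) * (years - 1895)

for name, series in (("raw", raw), ("adjusted", adjusted)):
    slope = np.polyfit(years, series, 1)[0] * 100  # K per century
    print(f"{name:>8}: {slope:+.2f} K/century")
# The raw cooling trend becomes a warming trend after adjustment.
```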
@-George E. Smith says:
“NO OTHER thermal process on earth is in any way impacted by the amount of CO2 in the atmosphere. There has never ever been enough CO2 in the earth’s atmosphere to affect atmospheric convection one iota. ….
So the only effect of NCGHGs is partial absorption of the 390 W/m^2 Surface Radiation.”
You might want to reconsider this assertion. The partial absorption of IR surface radiation is followed rather rapidly by conversion to thermal energy of the bulk atmosphere. This means the whole of the atmosphere is warmed, and that shared energy is then re-radiated by the GHGs. This establishes a temperature gradient from the surface upwards, because most IR is absorbed in the first few metres, and that gradient drives the subsequent convection.
The warming of the bottom of the atmosphere by thermalisation of absorbed surface IR is an important component of the process that provides energy to drive convection.
If you removed all GHGs, including water vapour, from the atmosphere, the only way the surface could warm the atmosphere would be through direct conduction at the surface, so very little of the atmosphere would be warmed and convection would be MUCH less.
Put back the water vapour, and its absorption of IR from the surface will increase convection. The phase change as it condenses into clouds also moves thermal energy higher – not as fast as the IR would have travelled, since c is the maximum velocity. However, when the temperature drops below freezing, the effect from water vapour is effectively lost. It is unclear whether water vapour alone could sustain temperatures above freezing, which could lead to a snowball Earth.
The role of NCGHGs in providing some of the thermal energy that drives convection should be obvious. All these factors interact; to imagine that changing one component of an energy flow will not change other aspects is to underestimate the interrelated factors governing the thermodynamics.