Excerpt from The Inconvenient Skeptic by John Kehr
The longer I am involved in the global warming debate, the more frustrated I get with the CRU temperature data. It is one of the most commonly cited sources of global temperature data, but the numbers just don’t stay put: each and every month, past monthly temperatures are revised. Since I enter the data into a spreadsheet each month, I am constantly seeing the data shift. If it were only the third significant digit it wouldn’t bother me (very much), but it is much more than that.
For example, I now have two very different values for January 2010, depending on when I gathered the data. Here are the January figures by retrieval date.
Sep 10th, 2010: January 2010 anomaly was 0.707 °C
Jan 30th, 2011: January 2010 anomaly is now 0.675 °C
That is a shift of nearly 5% in the value for last January, and it has taken place in just the past four months. All of the early months of the year show a similarly significant shift in temperature.
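[Note: a quick check of the arithmetic, as a rough Python sketch using the two values quoted above; the variable names are ours, for illustration only.]

old, new = 0.707, 0.675                 # the two published January 2010 anomalies, degC
shift_pct = (old - new) / old * 100.0   # change relative to the first value
print(round(shift_pct, 1))              # 4.5 -- a shift of nearly 5%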
Monthly global temperature values change on a regular basis.
Read the entire post here
=============================================================
Some of this may be related to late reporting of GHCN stations, a problem we’ve also seen with GISS. Both GISS and CRU use GHCN station data, which are received via CLIMAT reports. Some countries report better than others: some update quickly, some take weeks or months to report in to NOAA/NCDC, which manages GHCN.
The data trickle-in problem can affect the determination of the global temperature and any pronouncements made about it. What looks like a record month at the time of the announcement may no longer be one a few months later, when all of the surface data is in. It might be valuable to go back later and check such claims, to see how much the monthly temperature values quoted in past news reports have since shifted.
More on CLIMAT reports here in the WMO technical manual.
UPDATE: For more on the continually shifting data issue, see this WUWT post from John Goetz on what he sees in the GISS surface temperature record:
http://wattsupwiththat.com/2008/04/08/rewriting-history-time-and-time-again/

>Brian H says:
>January 31, 2011 at 5:14 am
>“The future is certain; only the past is subject to revision.”
>Polish Soviet adage.
That’s got a nice ring to it. Maybe CRU or the IPCC could pick that up as their motto?
richard verney says:
January 31, 2011 at 5:33 am
“Where facts and data are vital, such as medicine and drugs, such latitude isn’t permitted, neither is propaganda. With climate *science*, such abrogation is not unsafe, since there’s nothing at stake either way, being mere voodoo as a consensus.”
… except when people needlessly die on icy roads or in floods.
This has been going on for decades. I’ve monitored the published NASA/NOAA surface station records, and yes, old data changes each month, often significantly. The temperature record for Manhattan, KS used to show a negative trend, but old data (from the late 1800s and early 1900s) has been continuously “corrected” downward, in some cases by multiple degrees, and the record now shows a warming trend here.
Of course, scientific-sounding assumptions are given in the sciency literature, but these assumptions are arbitrary, untested, and statistically unchallenged. It’s unbelievable to me that so-called academics are allowing this to happen.
They should really report the way they do on election night, and say: “With 800 out of 1000 stations reporting, the global temperature anomaly is modelled (or calculated) to be 0.707. The stations not yet reporting have a history of pulling this value down, so we’ll wait for more reports before we call this temperature a record.”
Don says:
January 31, 2011 at 5:00 am
>God forbid if the data for the efficacy and safety of a new drug was collected in this manner.
Oh come on! Do you really imagine that the “scientists” employed by pharma labs are totally honest and objective? They’re whoring their PhDs on a full time basis.
The questions raised by Climategate are not limited to climate science. They have shattered the illusion the public had about how objective and innocent the whole of the scientific community is.
A damn good thing too. Blind faith is rarely a good thing.
“Does it really take as long as 23 months to adjust GHCN?”
In one respect it never stops “adjusting”. I believe the way it works is something like this:
If a value is missing and a fill value must be calculated, it is calculated as an average of the surrounding stations, and that average includes the current month’s values. Next month the average might change, and the month after that it might change yet again. In this way the fill values calculated for missing data are constantly changing.
An anomalously cold month can cause adjustments in many years, though 5% seems a bit much.
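Something like this, in rough Python (a sketch of the infilling as I understand it; the station values below are invented for illustration, not from GHCN):

# A missing monthly value is replaced by the mean of whatever neighbouring
# stations have reported so far, so the filled value shifts every time a
# late report arrives.
def fill_missing(neighbour_temps):
    """Mean of all neighbour values reported so far (None = not yet reported)."""
    reported = [t for t in neighbour_temps if t is not None]
    return sum(reported) / len(reported) if reported else None

neighbours = [1.2, 0.9, None, 1.5]   # one neighbour has not reported yet
print(fill_missing(neighbours))      # 1.2 (mean of three reports)

neighbours[2] = -0.4                 # the late station finally reports
print(fill_missing(neighbours))      # 0.8 -- the "filled" value has changed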
eadler says:
January 31, 2011 at 7:01 am
These differences [in GMT series] do not appear to be a real issue in the discussion of Global Warming.
Okay, eadler, since it’s your religion, what are you doing in your personal life to decrease your own contribution to fossil fuel CO2 production, in order to allegedly help save yourself or the World from CO2CAGW? Or isn’t it really that much of an issue?
NASA is using tax-payer money to indoctrinate and propagandize kids on global climate change.
From NASA we have the greenhouse “effect.”
Why is it that, in discussing the greenhouse “effect,” heat transfer by convection is never mentioned?
Control the language; control the debate.
—————————————————————————————
http://climate.nasa.gov/kids/index.cfm
What is a greenhouse?
A greenhouse is a house made of glass. It has glass walls and a glass roof. People grow tomatoes and flowers and other plants in them. A greenhouse stays warm inside, even during winter. Sunlight shines in and warms the plants and air inside. But the heat is trapped by the glass and can’t escape. So during the daylight hours, it gets warmer and warmer inside a greenhouse, and stays pretty warm at night too.
How is Earth like a greenhouse?
Earth’s atmosphere does the same thing as the greenhouse. Gases in the atmosphere such as carbon dioxide do what the roof of a greenhouse does. During the day, the Sun shines through the atmosphere. Earth’s surface warms up in the sunlight. At night, Earth’s surface cools, releasing the heat back into the air. But some of the heat is trapped by the greenhouse gases in the atmosphere. That’s what keeps our Earth a warm and cozy 59 degrees Fahrenheit, on average.
@Geoff Alder, January 31, 2011 at 6:02 am
“I think ‘panic mongers’ has a better ring to it!”
Nah… I still prefer: Warm Mongers.
Anthony:
You worry me when you say that some of the CLIMAT data are not in yet. I remember your article last year about how some CLIMAT stations (most of which are at “hot” airports, another problem) were reporting minus temperatures as plus temperatures; for example, some Arctic stations at -30°C were being reported as PLUS 30°C, and this led to huge overall errors in the averages. Does this still happen sometimes? Did they put in a quality-control step to prevent this after your excellent report?
One of the problems with automation: no “spell checking” going on to remediate reporting errors.
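A range test against each station’s climatology would be one way to catch it; a rough Python sketch (the station statistics below are invented for illustration):

# Flag reports far outside a station's climatology; a -30 degC Arctic
# January transmitted as +30 degC would fail this test.
def plausible(value_c, clim_mean_c, clim_sd_c, max_sigmas=4.0):
    """Accept a report only if it lies within max_sigmas standard
    deviations of the station's long-term mean for that month."""
    return abs(value_c - clim_mean_c) <= max_sigmas * clim_sd_c

# An Arctic station whose January climatology is -28 degC with a 4 degC spread:
print(plausible(-30.0, clim_mean_c=-28.0, clim_sd_c=4.0))  # True
print(plausible(+30.0, clim_mean_c=-28.0, clim_sd_c=4.0))  # False: flag it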
I also remember the “adjustment method” that came out in the Climategate emails, where seemingly arbitrary numbers were simply added to the algorithms, adjusting past years upward. Is my recollection correct?
Are the temperature figures presented the mean temperatures, or the mean temperatures per square km? The former is virtually meaningless.
And why do they never make a stab at an actual global average, instead of talking in “anomalies” all the time? Is it because an actual average would be disappointingly low, and not “hot” at all?
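[For context: an anomaly is the departure of a month’s value from a fixed baseline mean (CRU uses 1961–1990). A minimal sketch of the convention, with invented numbers:]

# Anomaly = observed temperature minus the baseline-period mean for that month.
jan_baseline_years = [2.1, 1.8, 2.4, 2.0]   # hypothetical January values, degC
baseline = sum(jan_baseline_years) / len(jan_baseline_years)

jan_2010 = 2.8                              # hypothetical observed January, degC
print(round(jan_2010 - baseline, 3))        # 0.725 degC anomaly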
These guys clearly do not understand, or intentionally ignore, the principle of causality.
Mark
@Michael says:
January 31, 2011 at 6:30 am
“The point is to get the data out as soon as possible…”
Why?
climatebeagle says: January 31, 2011 at 7:15 am
Interesting comment in the original article:
“Oddly enough the yearly average seems to stay the same, but the monthly values that create the average are in constant flux.”
That reminds me of a period, which must have been around 2007, when for something like six months the HADCRUT yearly average stayed within 0.01°C of a single value.
To say I was suspicious is a complete understatement; someone’s mucky hand was on the data. I suspected it was being done to keep a final-year figure “in line” with some directive from above.
Of course, as soon as I started posting, the discrepancy disappeared, and it is now impossible for me to prove my assertion.
…WHICH IS EVEN MORE WORRYING! There is just a statistical chance that the data might be within a gnat’s dick of some value each and every month… but then it would stay that way… it wouldn’t suddenly disappear!
Don’t worry about how this happens. It’s all clearly explained in HARRY_READ_ME.txt
Gendeau – kind of like the unemployment claim numbers. Every week they are lower than the week before, but never seem to get below 400k. They revise them upwards after the initial report so they can make the claim. Just like the temperature numbers.
“Some of this may be related to late reporting of GHCN stations, a problem we’ve also seen with GISS. Both GISS and CRU use GHCN station data, which are received via CLIMAT reports. Some countries report better than others: some update quickly, some take weeks or months to report in to NOAA/NCDC, which manages GHCN.
The data trickle-in problem can affect the determination of the global temperature and any pronouncements made about it. What looks like a record month at the time of the announcement may no longer be one a few months later, when all of the surface data is in. It might be valuable to go back later and check such claims, to see how much the monthly temperature values quoted in past news reports have since shifted.”
No, it must be something else. I make local backups of GHCN V2 from NOAA. I have a copy of v2.mean (raw monthly temperatures by station) from 2010-06-28 22:48:37 UTC. Now I have downloaded it again at 2011-01-31 15:33:26 UTC.
There are only three differences between the two copies for January 2010, they are as follows:
13167341000 LOURENCO MARQUES/COUNTINHO (MOZAMBIQUE) -25.90 32.60 (27.0°C)
30489056000 CENTRO MET.AN (CHILE) -62.42 -58.88 (0.2°C)
42572597000 MEDFORD/MEDFO (U.S.A.) 42.37 -122.87 (7.9°C)
In the June 2010 version there is no January 2010 data for these stations, while in the current version each has a value (shown in parentheses at the end of each line).
John Kehr says that between 10 September 2010 and 30 January 2011 the CRU global anomaly for January 2010 changed by almost 5%. The June 2010 version contains 1447 valid temperatures for January 2010 from all over the globe; the addition of just 3 data points cannot possibly cause such a shift.
Do they keep changing the adjustment algorithm? Is it documented anywhere?
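A rough Python sketch of that kind of snapshot comparison (the fixed-width offsets follow the v2.mean layout as I understand it: 12-character station/duplicate id, 4-character year, then twelve 5-character monthly values in tenths of a degree C, with -9999 meaning missing; check them against NOAA’s format notes, and the file names are placeholders):

# Diff the January values between two local snapshots of GHCN v2.mean.
def read_january(path, year="2010"):
    values = {}
    with open(path) as f:
        for line in f:
            if line[12:16] == year:
                jan = int(line[16:21])          # first of twelve monthly fields
                values[line[:12]] = None if jan == -9999 else jan / 10.0
    return values

old = read_january("v2.mean.2010-06-28")        # placeholder file names
new = read_january("v2.mean.2011-01-31")
for sid in sorted(set(old) | set(new)):
    if old.get(sid) != new.get(sid):
        print(sid, old.get(sid), "->", new.get(sid))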
P. Solar says:
January 31, 2011 at 8:34 am
“Oh come on! Do you really imagine that the “scientists” employed by pharma labs are totally honest and objective? They’re whoring their PhDs on a full time basis.”
Any good observer will notice that far more drugs are withdrawn from approval by big pharma due to problems in clinical trials than AGW claims are withdrawn by grant-dependent PhDs.
Re ‘panicists’ vs ‘panic mongers’, I think the more correct terminology is: panic merchants.
John
Did you try contacting the responsible group?
For all climate questions, please contact the National Climatic Data Center’s Climate Services and Monitoring Division:
Climate Services and Monitoring Division
NOAA/National Climatic Data Center
151 Patton Avenue
Asheville, NC 28801-5001
fax: +1-828-271-4876
phone: +1-828-271-4800
email: ncdc.info@noaa.gov
They might even be able to tell you when their reporting is complete.
P. Solar says:
January 31, 2011 at 8:34 am
At the scientist level, that does not happen in large pharmas. I am retired after 40 years’ employment at several large and small pharmas, and corrupting the data is almost impossible.
For pharmaceuticals under FDA investigation, there are protocols that cannot be corrupted. Each measurement has a witnessing protocol: another scientist has to legally testify, by literally looking over the shoulder of the originator, and sign and witness the entry in a bound notebook, which is electronically copied and placed in electronic storage, eventually in Iron Mountain along with the original notebooks.
Next, the data are checked and verified by QC (Quality Control). Before the data are reported in electronic format to the FDA, the QA (Quality Assurance) officer is called in to verify that the equipment has been serviced and validated (SOP verification). Then, if there is a transcription step (human entry), the original data are checked line by line against the reported data by the QA officer, who does not report to QC but to the CEO.
There is, in this chain, no vested interest in falsifying the data. Only big trouble, meaning jail time, is the outcome of any monkey business, and a chain of people would have to be in on the conspiracy.
Because of the Sarbanes-Oxley reporting rules passed by Congress, the CEO and CFO would also be ultimately culpable, even in the case of inadvertent error, and would face jail time as well.
These rules are the strictest in any business. On top of this, there are ISO rules (international protocols) that have to be followed for manufacturing at large scale.
Oh, that the climate scientists would have to follow similar rules! If so, a lot of this chicanery would never have happened.
Does this mean that 2010 is no longer tied for the hottest on record?
Clinical trial data are subjected to several layers of third-party independent audit…
…without FOIA requests!
don says:
January 31, 2011 at 9:09 am
Clinical trials are designed to do just that! The four stages of trials, Phases I, II, III, and IV, are designed to incrementally expose an increasing number of patients in order to gather the statistics needed to detect toxicity. Phase I is typically done in 30 patients; only a small percentage of drug candidates survive this step, because they don’t show a cure, or side effects emerge that the rat studies did not show. Phase II, usually in 100-200 patients, is often run several times (Phase IIA, IIB, and so on), accumulating statistics on efficacy and safety. Phase III, often called a pivotal study, is run in, say, a thousand people, and is usually not done in the clinical hospital but in a take-home scenario, duplicating actual usage. These are usually repeated if adequate statistics are not forthcoming. Phase IV is a post-marketing step, designed to detect vanishingly small anomalies or side effects that only huge numbers, in the millions of patients, would reveal. Any side effects must then be reported: each company has a “hot line” that doctors and patients can access long past the commercialization stage. Under “adverse event reporting”, each event, even if a person dies of a hangnail while being treated for an earlobe infection, must be reported immediately to the FDA.
That is why few drugs are approved each year: companies have to run this gauntlet, and few drugs survive it. I was in this field for 40+ years, have a couple of drugs on the market that I invented, and I do trust this system. I am retired and have no vested interest. By the way, the pharma industry is additionally protected by whistle-blower statutes. Believe me, I would have blown a large whistle if I had ever seen anything crooked happening, and I was in a position to see it all.
If global warming hypotheses, like declining polar bears, had to go through these rigorous steps before allowing publication, how many would survive?