UPDATE: See the first ever CONUS Tavg value for the year from NCDC’s state-of-the-art Climate Reference Network here and compare its value for July 2012. There’s another surprise.
Glaring inconsistencies found between State of the Climate (SOTC) reports sent to the press and public and the “official” climate database record for the United States. Using NCDC’s own data, July 2012 can no longer be claimed to be the “hottest month on record”. UPDATE: Click graph at right for a WSJ story on the record.
First, I should point out that I didn’t go looking for this problem; it was a serendipitous discovery that came from looking up the month-to-month average temperature for the CONtiguous United States (CONUS) for another project, which you’ll see a report on in a couple of days. What started as an oddity noted for a single month now seems clearly to be systemic over a two-year period. On the eve of what will likely be a pronouncement from NCDC on 2012 being the “hottest year ever”, and since what I found is systemic and highly influential on the press and the public, I thought I should make my findings widely known now. Everything I’ve found should be replicable independently using the links and examples I provide. I’m writing the article as a timeline of discovery.
At issue is the difference between temperature data claims in the NCDC State of the Climate reports issued monthly and at year-end and the official NCDC climate database made available to the public. Please read on for my full investigation.
You can see the most current SOTC for the USA here:
http://www.ncdc.noaa.gov/sotc/national/2012/11
In that SOTC report they state right at the top:
Highlighted in yellow is the CONUS average temperature, which is the data I was after. I simply worked backwards each month to get the CONUS Tavg value and copy/paste it into a spreadsheet.
In early 2011 and late 2010, I started to encounter problems. The CONUS Tavg wasn’t in the SOTC reports, and I started to look around for an alternate source. Thankfully, NCDC provided a link to that alternate source right in one of the SOTC reports, specifically the first one where I discovered the CONUS Tavg value was missing, February 2011:
http://www.ncdc.noaa.gov/sotc/national/2011/02
That highlighted in blue “United States” was a link for plotting the 3-month Dec-Feb average using the NCDC climate database. It was a simple matter to switch the plotter to a single month, and get the CONUS Tavg value for Feb 2011, as shown below. Note the CONUS Tavg value at bottom right in yellow:
All well and good, and I set off to continue to populate my spreadsheet by working backwards through time. Where SOTC didn’t have a value, I used the NCDC climate database plotter.
And then I discovered that prior to October 2010, there were no mentions of CONUS Tavg in the NCDC SOTC reports. Since I was recording the URLs to source each piece of data as well, I realized that it wouldn’t look all that good to have sources from two different URLs for the same data, so for the sake of consistency I decided to use only the CONUS Tavg value from the NCDC climate database plotter, since it seemed to be complete where the SOTC was not.
I set about the task of updating my spreadsheet with only the CONUS Tavg values from the NCDC climate database plotter, and that’s when I started noticing that temperatures between the SOTC and the NCDC climate database plotter didn’t match for the same month.
Compare for yourself:
NCDC’s SOTC July 2012:
http://www.ncdc.noaa.gov/sotc/national/2012/07
Screencap of the claim for CONUS Tavg temperature for July 2012 in the SOTC:
Note the 77.6°F highlighted in blue. That is a link to the NCDC climate database plotter which is:
Screencap of the output from the NCDC climate database, note the value in yellow in the bottom right:
Note the difference. In the July 2012 State of the Climate report, where NCDC makes the claim of “hottest month ever” and cites July 1936 as the benchmark record that was beaten, they say the CONUS Tavg for July 2012 is: 77.6°F
But in the NCDC climate database plotter output, the value is listed as 76.93°F, almost 0.7°F cooler! They don’t match.
I initially thought this was just some simple arithmetic error or reporting error, a one-off event, but then I began to find it in other months when I compared the output from the NCDC climate database plotter. Here is a table of the differences I found for the last two years between claims made in the SOTC report and the NCDC database output.

In almost every instance dating back to the inception of the CONUS Tavg value being reported in the SOTC report, there’s a difference. Some are quite significant. In most cases, the database value is cooler than the claim made in the SOTC report. Clearly, it is a systemic issue that spans over two years of reporting to the press and to the public.
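The comparison behind the table can be reproduced mechanically. Here is a minimal Python sketch: the function itself is generic, and only the July 2012 pair (SOTC 77.6°F vs. database 76.93°F) comes from this article; any other month/value pairs would be filled in from the sources linked above.

```python
# Sketch of the SOTC-vs-database comparison: for each month, subtract the
# NCDC climate database value from the SOTC claim.  Only the July 2012
# pair is taken from the article; the rest would come from the linked pages.

def sotc_vs_database(pairs):
    """pairs: {month: (sotc_claim_F, database_value_F)} -> {month: SOTC minus database, in F}"""
    return {month: round(sotc - db, 2) for month, (sotc, db) in pairs.items()}

diffs = sotc_vs_database({"2012-07": (77.6, 76.93)})
print(diffs)  # {'2012-07': 0.67}
```

A positive difference means the SOTC press claim ran warmer than the database value for that month, which is the pattern the table shows.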
It suggests that the claims NCDC makes when it sends out these SOTC reports aren’t credible, because the two data sources differ. Clearly, NCDC intends the plotter output it links to as an official representation to the public, so there can be no claim that I used some “not fit for purpose” method to get that data. Further, the issue reveals itself in the NCDC rankings report, which they also link to in SOTC reports:
Note the 76.93°F I highlighted in yellow. Since it appears in two separate web output products, it seems highly unlikely this is a “calculation on demand” error; more likely it is stored database output simply being displayed.
Note the claim made in the NCDC July 2012 SOTC for the July 1936 CONUS Tavg temperature which is:
The previous warmest July for the nation was July 1936, when the average U.S. temperature was 77.4°F.
But now, in two places, NCDC is reporting that the CONUS Tavg for July 2012 is 76.93°F, about 0.47°F cooler than the 77.4°F claimed as the previous monthly record in 1936, meaning that July 2012 by that comparison WAS NOT THE HOTTEST MONTH ON RECORD.
The question for now is: why do we appear to have two different sets of data for the past two years, one in the official database and one in the SOTC reports, and why has NCDC let this claim stand if the data does not support it?
There’s another curiosity.
The last two months in my table above, October and November 2012, have identical values between the database and the SOTC report.
What’s going on? Well, the explanation is quite simple: it’s a technology gap.
You see, despite what some people think, the nation’s climate monitoring network used for the SOTC reports is not some state-of-the-art system, but rather the old Cooperative Observer Network, which came into being in the 1890s after Congress formed the original US Weather Bureau. Back then, we didn’t have telephones, fax machines, radio, modems or the Internet. Everything was observed and measured manually, recorded by hand with pen and paper, and mailed to NCDC for transcription every month. That is still the case today for a good portion of the network. Here’s a handwritten B91 official reporting form from the observer at the station the New York Times claims is the “best in the nation”, the USHCN station in Mohonk, New York:
Source: http://www7.ncdc.noaa.gov/IPS/coop/coop.html
Note that in cases like this station, the observer sends the report in at the end of the month; NCDC then transcribes it into digital data and runs it through quality control to fix missing and incorrectly recorded data. All of that takes time, often a month or two for all the stations to report. Some stations in the climate network, such as airports, report via radio links and the Internet in near real time. They get there in time for the end-of-month report where the old paper forms do not; hence the technology gap tends to favor certain kinds of stations, such as airports, over traditional stations.
NCDC knows this, and reported about it. Note my bolding.
NOAA’s National Climatic Data Center (NCDC) is the world’s largest active archive of weather data. Each month, observers that are part of the National Weather Service Cooperative Observer Program (COOP) send their land-based meteorological surface observations of temperature and precipitation to NCDC to be added to the U.S. data archives. The COOP network is the country’s oldest surface weather network and consists of more than 11,000 observers. At the end of each month, the data are transmitted to NCDC via telephone, computer, or mail.
Typically by the 3rd day of the following month, NCDC has received enough data to run processes which are used to calculate divisional averages within each of the 48 contiguous states. These climate divisions represent areas with similar temperature and precipitation characteristics (see Guttman and Quayle, 1996 for additional details). State values are then derived from the area-weighted divisional values. Regions are derived from the statewide values in the same manner. These results are then used in numerous climate applications and publications, such as the monthly U.S. State of the Climate Report.
NCDC is making plans to transition its U.S. operational suite of products from the traditional divisional dataset to the Global Historical Climatological Network (GHCN) dataset during the summer of 2011. The GHCN dataset is the world’s largest collection of daily climatological data. The GHCN utilizes many of the same surface stations as the current divisional dataset, and the data are delivered to NCDC in the same fashion. Further details on the transition and how it will affect the customer will be made available in the near future.
See: http://www.ncdc.noaa.gov/sotc/national/2010/10
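The area-weighted aggregation the quoted passage describes can be sketched in a few lines of Python. The division temperatures and areas below are invented purely for illustration; the real computation uses NCDC’s divisional data and actual division areas.

```python
# Hedged sketch of the area-weighted aggregation NCDC describes:
# divisional mean temperatures are combined into a state value, weighted
# by each division's land area.  All numbers here are hypothetical.

def area_weighted_mean(divisions):
    """divisions: list of (mean_temp_F, area_sq_mi) -> area-weighted average in F"""
    total_area = sum(area for _, area in divisions)
    return sum(temp * area for temp, area in divisions) / total_area

# Two hypothetical climate divisions within one state:
state_avg = area_weighted_mean([(75.0, 10_000), (73.0, 30_000)])
print(state_avg)  # 73.5
```

The same weighting is then repeated a level up: state values roll into regional values, and regions into the CONUS figure.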
The State of the Climate reports typically are issued in the first week of the next month. They don’t actually bother to put a release date on those reports, so I can’t give a table of specific dates. The press usually follows suit immediately afterwards, and we see claims like “hottest month ever” or “3rd warmest spring ever” being bandied about worldwide in news reports and blogs by the next day.
So basically, NCDC is making public claims about the average temperature of the United States, its rank compared to other months and years, and its severity, based on incomplete data. As I have demonstrated, that data then tends to change about two months later, when all of the B91s come in and are transcribed and the data set becomes complete.
It typically cools the country when all the data is used.
But does NCDC go back and correct those early claims based on the new data? No.
While I’d like to think “never attribute to malice what can be explained by simple incompetence”, surely they know about this, and the fact that they never go back and correct SOTC claims (which drive all the news stories) suggests some possible malfeasance. If it happens like this in CONUS, it would seem it happens in Global Tavg also, though I don’t have supporting data at the moment.
Finally, here is where it gets really, really wonky. Remember earlier when I showed that, by the claims in the July 2012 SOTC report, the new data showed July 2012 was no longer hotter than July 1936? Here’s the SOTC again.
Note the July 1936 words are a link, and they go to the NCDC climate database plotter output again. Note the data for July 1936 I’ve highlighted in yellow:
The NCDC database says July 1936 was 76.43°F. It doesn’t even match the July 2012 SOTC’s claim of 77.4°F for July 1936. That can’t be explained by some B91 forms late in the mail.
So what IS the correct temperature for July 2012? What is the correct temperature for July 1936? I have absolutely no idea, and it appears that the federal agency charged with knowing the temperature of the USA to a high degree of certainty doesn’t quite know either. Either the SOTC is wrong, or the NCDC database available to the public is wrong. For all I know they both could be wrong. On their web page, NCDC bills themselves as:
How can they be a “trusted authority” when it appears none of their numbers match and they change depending on what part of NCDC you look at?
It is mind-boggling that this national average temperature and ranking is presented to the public and to the press as factual information and claims each month in the SOTC, when in fact the numbers change later. I’m betting we’ll see those identical numbers for October and November 2012 in Table 1 change too, as more B91 forms come in from climate observers around the country.
The law on such reporting:
Wikipedia has an entry on the Data Quality Act, to which NCDC is beholden. Here are parts of it:
=============================================================
The Data Quality Act (DQA) passed through the United States Congress in Section 515 of the Consolidated Appropriations Act, 2001 (Pub.L. 106-554). Because the Act was a two-sentence rider in a spending bill, it had no name given in the actual legislation. The Government Accountability Office calls it the Information Quality Act, while others call it the Data Quality Act.
The DQA directs the Office of Management and Budget (OMB) to issue government-wide guidelines that “provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies”.
…
Sec. 515 (a) In General — The Director of the Office of Management and Budget shall, by not later than September 30, 2001, and with public and Federal agency involvement, issue guidelines under sections 3504(d)(1) and 3516 of title 44, United States Code, that provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies in fulfillment of the purposes and provisions of chapter 35 of title 44, United States Code, commonly referred to as the Paperwork Reduction Act.
=============================================================
Here’s the final text of the DQA as reported in the Federal Register:
http://www.whitehouse.gov/sites/default/files/omb/fedreg/reproducible2.pdf
Based on my reading of it, with their SOTC reports that are based on preliminary data, and not corrected later, NCDC has violated these four key points:
In the guidelines, OMB defines ‘‘quality’’ as the encompassing term, of which ‘‘utility,’’ ‘‘objectivity,’’ and ‘‘integrity’’ are the constituents. ‘‘Utility’’ refers to the usefulness of the information to the intended users. ‘‘Objectivity’’ focuses on whether the disseminated information is being presented in an accurate, clear, complete, and unbiased manner, and as a matter of substance, is accurate, reliable, and unbiased. ‘‘Integrity’’ refers to security—the protection of information from unauthorized access or revision, to ensure that the information is not compromised through corruption or falsification. OMB modeled the definitions of ‘‘information,’’ ‘‘government information,’’ ‘‘information dissemination product,’’ and ‘‘dissemination’’ on the longstanding definitions of those terms in OMB Circular A–130, but tailored them to fit into the context of these guidelines.
I’ll leave it to Congress and other Federal watchdogs to determine if a DQA violation has in fact occurred on a systemic basis. For now, I’d like to see NCDC explain why two publicly available avenues for “official” temperature data don’t match. I’d also like to see them justify their claims in the next SOTC due out any day.
I’ll have much more in the next couple of days on this issue, be sure to watch for the second part.
UPDATE: 1/7/2013 10 AM PST
Jim Sefton writes on 2013/01/07 at 9:51 am
I just went to the Contiguous U.S. Temperature July 1895-2012 link you put up and now none of the temperatures are the same as either of your screen shots. Almost every year is different.
2012 is now 76.92 & 1936 is now 76.41 ?
Just in case it was some rounding / math issue with Javascript, I checked the source code & then checked the page in both IE & Chrome… the data for the comma-delimited data is distinct and matches those of the plot. So, in the 2 days since your post it has changed yet again… for all years apparently?
That’s verified, see screencap below made at the same time as the update:
This raises the question: how can the temperatures of the past be changing?
Here’s comma-delimited data for all months of July in the record:
1895,71.04
1896,73.43
1897,72.97
1898,72.93
1899,72.68
1900,72.82
1901,75.93
1902,71.81
1903,71.58
1904,71.06
1905,71.60
1906,72.03
1907,72.20
1908,72.80
1909,72.24
1910,73.66
1911,72.28
1912,71.90
1913,72.66
1914,73.68
1915,70.53
1916,73.92
1917,74.19
1918,72.00
1919,73.95
1920,72.31
1921,74.24
1922,72.61
1923,73.37
1924,71.49
1925,73.72
1926,73.01
1927,72.28
1928,72.98
1929,73.24
1930,74.63
1931,75.30
1932,73.75
1933,74.73
1934,75.98
1935,74.76
1936,76.41
1937,74.19
1938,73.36
1939,74.44
1940,73.72
1941,73.62
1942,73.55
1943,73.89
1944,72.39
1945,72.53
1946,73.43
1947,72.43
1948,72.90
1949,73.85
1950,70.85
1951,73.26
1952,73.69
1953,73.75
1954,75.13
1955,74.10
1956,72.73
1957,73.98
1958,72.29
1959,73.27
1960,73.56
1961,72.92
1962,71.77
1963,73.39
1964,74.40
1965,72.37
1966,74.79
1967,72.28
1968,72.64
1969,73.86
1970,73.73
1971,72.18
1972,71.97
1973,73.08
1974,73.95
1975,73.39
1976,72.77
1977,74.30
1978,73.68
1979,73.03
1980,75.63
1981,73.79
1982,73.08
1983,73.92
1984,73.07
1985,73.94
1986,73.51
1987,73.26
1988,74.75
1989,74.15
1990,73.27
1991,73.93
1992,71.28
1993,72.25
1994,73.53
1995,73.61
1996,73.56
1997,73.24
1998,75.49
1999,74.44
2000,73.90
2001,74.61
2002,75.90
2003,75.50
2004,72.98
2005,75.34
2006,76.53
2007,74.77
2008,74.21
2009,72.74
2010,74.83
2011,76.28
2012,76.92
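For anyone who wants to rank the record programmatically, here is a short Python sketch that parses rows in the same year,value format. Only a handful of the warmer Julys from the list above are included inline; the full list pastes in the same way.

```python
# Parse comma-delimited "year,TavgF" rows (a subset of the list above)
# and print the warmest Julys in descending order.

data = """\
1934,75.98
1936,76.41
2006,76.53
2011,76.28
2012,76.92"""

years = sorted(
    ((float(t), int(y)) for y, t in (line.split(",") for line in data.splitlines())),
    reverse=True,
)
for temp, year in years[:3]:
    print(year, temp)
# 2012 76.92
# 2006 76.53
# 1936 76.41
```

Note that any such ranking is only as stable as the underlying database values, which, as shown above, have shifted more than once.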
SUPPLEMENT:
For now, in case the SOTC reports should suddenly disappear or get changed without notice, I have all of those NCDC reports that form the basis of Table 1 archived below as PDF files.
State of the Climate _ National Overview _ October 2010
State of the Climate _ National Overview _ January 2011
State of the Climate _ National Overview _ February 2011
State of the Climate _ National Overview _ March 2011
State of the Climate _ National Overview _ April 2011
State of the Climate _ National Overview _ May 2011
State of the Climate _ National Overview _ June 2011
State of the Climate _ National Overview _ July 2011
State of the Climate _ National Overview _ August 2011
State of the Climate _ National Overview _ September 2011
State of the Climate _ National Overview _ October 2011
State of the Climate _ National Overview _ November 2011
State of the Climate _ National Overview _ December 2011
State of the Climate _ National Overview _ January 2012
State of the Climate _ National Overview _ February 2012
State of the Climate _ National Overview _ March 2012
State of the Climate _ National Overview _ April 2012
State of the Climate _ National Overview _ May 2012
State of the Climate _ National Overview _ June 2012
State of the Climate _ National Overview _ July 2012
State of the Climate _ National Overview _ August 2012
State of the Climate _ National Overview _ September 2012
State of the Climate _ National Overview _ October 2012
State of the Climate _ National Overview _ November 2012
@Philip Bradley
There is most definitely more than one way to calculate an average.
Arithmetic Mean
Subcontrary Mean (harmonic mean)
Geometric Mean
When calculating the average of metrics that are ratios, the harmonic mean should be used if the frame of reference is in the denominator. For example, if you average the fuel economy readings of a vehicle expressed in litres per 100 km, you will get the wrong answer. Calculating the harmonic mean gives the correct answer.
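To make the means discussion concrete, here is a small Python sketch using the classic fuel-economy case: over equal distances, miles-per-gallon averages harmonically, while litres-per-100 km averages arithmetically. The numbers are invented for illustration.

```python
# Two cars, each driven the same distance, rated in miles per gallon.
# The naive arithmetic mean overstates the combined economy; the harmonic
# mean matches total distance / total fuel.

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def harmonic_mean(xs):
    return len(xs) / sum(1 / x for x in xs)

mpg = [20.0, 60.0]
print(arithmetic_mean(mpg))         # 40.0 -- overstates combined economy
print(round(harmonic_mean(mpg), 2)) # 30.0 -- matches total distance / total fuel

# Cross-check: 120 miles each => fuel used = 6 + 2 = 8 gal, and 240/8 = 30 mpg
```

Which mean is “right” depends entirely on what is held fixed (distance or fuel), which is the crux of the exchange in this thread.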
Oddly enough, I would not ascribe this to malice. My guess is the press and the pols were pressuring them for numbers and so they started reporting when the bulk of the information was in, on the assumption that the rest of it would have the same or similar average. What I’m guessing is that the reports that come in by mail are from locations with no electronic access at all (otherwise why mail them?). That being the case, the early data comes from sites that are urbanized (read UHI inflated) while the late data comes from sites with no urbanization (hence no inflation). They didn’t go back to check their assumption as to averages not being changed much by the later data….. and ooooops! I bet that if you did a match between late data and category 1/2 sites, you’d get a strong correlation.
There was a thunder clap not too long ago, the sound of thousands of bureaucratic sphincter valves snapping shut. Wow Anthony, what a find!
It seems the Russians are coming up with the right answer-
http://www.slattsnews.observationdeck.org/?p=7008
and they more than any of us should know.
Averaging numbers without knowing their inherent distribution or the scope of their inherent error or precision is futile. Since virtually all land-based temperatures prior to the 1970s are based on non-random, non-replicated single daily observations, why would you expect to get anything other than anecdotal information (garbage) from it?
We do not even know if the recorded temperature of ANY one day during that period is the ACTUAL minimum or maximum, the variance, standard deviation, size or type of errors, or almost anything except we have this one worthless number for minimum and one for maximum of unknown utility.
So we use it. We adjust the number(s) to fit our agenda or to correct perceived errors. But don’t have any illusions about the value of the data thus obtained.
I find it especially concerning that the lag in data arriving by mail from the more rural areas artificially warms the early results. Good thinking to consider that. The urban heat island effect is real… and used to benefit alarmists with an agenda.
Great job! Well done. 🙂
Don’t remember the specifics but in one of the climategate emails “they” conspired to release an early statement so the latter official report would hopefully be overlooked by the media. Anyone recall this email?
Can we start calling you Dr. Wattson now? 😉
Why don’t you report the DQA violations for inflation and unemployment while you’re at it… it would simply be tilting at windmills. It’s not just the data that’s been corrupted.
And in 2008 (B91 from Mohonk) they are/were still only measuring temperature to whole degrees Fahrenheit!!! And how much guesswork (confirmation bias, anyone?) does that involve when the mercury is midway between gradations?
Another perfect example of why WUWT has become so popular. Unbiased facts, presented as they fall, with readers shown the evidence and encouraged to draw their own conclusions as that evidence suggests. Anthony Watts continues to embarrass those who would have us believe in the contrived fairy story that is AGW climate catastrophe.
Why is it, whilst reading the Data Quality Act, I couldn’t help but keep thinking of one James and one Gavin. Has anyone been cheeky enough to send the DQA to the aforementioned gents?
Good work Anthony.
Crispin in Waterloo says:
January 6, 2013 at 10:07 pm
I was referring to the arithmetic mean, and while you correctly point out there are other means, the statement that ‘there is only one way to calculate an average (that is, arithmetic or any other kind of mean)’ is still true. Although, of course, different calculations are required for different types of mean.
BTW, your example is wrong. If I were to average (arithmetic average) the fuel consumption of vehicles each of which travelled 100 km, I would get the average amount of fuel required to travel 100 km.
Amazing post, Mr. Watchdog. This is why mainstream media is dying and being replaced by blog reporting. The internet is only 20 years old, and already changing the world.
The GHCN has Zombie Thermometers that suddenly have data show up long after they seemed dead… and start walking the earth again. This means that any “average” gives slightly different values based on when the data were looked at and not just what span of time is chosen…
https://chiefio.wordpress.com/2010/02/15/thermometer-zombie-walk/
There is no standard average temperature. Then again, since we don’t know if temperatures have a standard normal distribution the “mean” may well be statistically undefined anyway:
https://chiefio.wordpress.com/2012/12/10/do-temperatures-have-a-mean/
Then again, since temperature is an intrinsic property, and the average of an intrinsic property has no physical meaning, any average of a temperature is kind of meaningless:
https://chiefio.wordpress.com/2011/07/01/intrinsic-extrinsic-intensive-extensive/
(One really needs mass and specific heat / enthalpy to have an extrinsic property – heat – to average and have meaning.)
But “it’s what every one does” even if it is meaningless…
One thing it does do, though, is (IMHO) offer pretty darned good evidence that the move to automated equipment “warmed” the series. The newer automatic MMTS are reporting “now” while the older slower are reporting later…
@Michael D. Smith:
Wow! That’s some chart!
@BioBob:
Glad to see someone else who “gets it”. BTW, at one time I found an online copy of the directions given to the folks reading the temperatures and put up a link to it. Shortly after, it was scrubbed/removed…. This was before I learned to screen capture EVERYTHING so the Data Langoliers don’t disappear it…. For years (decades? centuries?) the official guideline for how to read the thermometers said basically “If you don’t know, guess.”
It was encouraged to just make up what you thought the temperature was and write that down.
What kind of error bars does that put on the record?
@SimonJ:
At least it isn’t in whole degrees of C!
https://chiefio.wordpress.com/2012/01/21/degrees-of-degrees/
But yes. That’s why I keep trying to point out the absurdity of saying anything about temperatures to more than 1°F of precision. (Yes, you can remove RANDOM error via an average, but not SYSTEMATIC error. And what we have is lots of systematic error: UHI, wrong-way ‘adjustment’ for the MMTS rollout, Stevenson Screen paint aging, etc.)
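The random-versus-systematic point is easy to demonstrate with a toy simulation. Everything below is hypothetical: 100,000 readings of a true 70°F with 2°F random noise and an assumed constant +1.5°F siting bias. Averaging crushes the noise but leaves the bias untouched.

```python
# Averaging many readings shrinks *random* error (roughly as 1/sqrt(N)),
# but a constant *systematic* bias survives the average intact.
# All numbers here are invented for illustration.

import random

random.seed(0)
TRUE_TEMP, BIAS, NOISE_SD, N = 70.0, 1.5, 2.0, 100_000

readings = [TRUE_TEMP + BIAS + random.gauss(0, NOISE_SD) for _ in range(N)]
mean = sum(readings) / N

print(round(mean - TRUE_TEMP, 2))  # prints a value very close to 1.5: the bias remains
```

No amount of averaging recovers the true temperature here; only knowing and removing the bias would.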
@Philip Bradley:
Except in computer programming the exact order of processing can change the result due to various number limitations and underflow / overflow / bit precision artifacts. Do you add all the max-min then divide? Or do each one one at a time? It matters…
An example here that warms GIStemp:
http://chiefio.wordpress.com/2009/07/30/gistemp-f-to-c-convert-issues/
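A tiny Python demonstration of the order-of-operations point: the same three IEEE doubles sum to different results depending on the order, while math.fsum’s compensated summation recovers the small term either way.

```python
# Floating-point addition is not associative: adding a small term to a
# huge one first loses it to rounding, while cancelling the huge terms
# first preserves it.  math.fsum compensates for these rounding losses.

import math

a = [1e16, 1.0, -1e16]   # big + small first: the 1.0 is absorbed and lost
b = [1e16, -1e16, 1.0]   # cancel the big terms first: the 1.0 survives

print(sum(a))        # 0.0
print(sum(b))        # 1.0
print(math.fsum(a))  # 1.0 -- compensated summation recovers the small term
```

This is exactly why “add everything then divide” and “accumulate one at a time” can disagree in a long temperature average.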
100 billion dollars in US taxpayers’ money spent on climate science, and not a single climate scientist spotted this problem in the data? How many other data problems have they missed?
It is clearly a data lag problem, as per davidmhoffer (January 6, 2013 at 10:12 pm). The early data is what gets reported, and the later data adjusts the figures downward. Thus the most recent months show no problem.
This would not have gone unnoticed at NOAA – which does suggest malice in that they have not taken steps to correct the problem. Most likely out of fear of running afoul of the politically correct line and the threat to continued employment.
I understood that three sets of books were normally kept. The ones for the tax man, the ones for the accountant and the real ones. No soup for you until you find the real ones!
“Continuous” and “contiguous” are not interchangeable. Continuous is a time term. Contiguous is a space term. At present, I am writing a reply. “I am writing” is the present continuous tense of the verb “to write”. At present, Alaska and Hawaii continue to be part of the USA. They are part of the continuous USA. They are not part of the contiguous USA.
Time and Space are two different things.
Great work Anthony. Keep at them, and belated best wishes for the New Year from the old world.
So temps from the faster-reporting sites (airports, cities, etc) are higher than those from the slower sites (more rural). And when they’re all in, the temp goes down. Doesn’t that suggest UHI?
Typical of a large government bureaucracy. Probably nothing deliberately corrupt, although I won’t rule it out, just civil service incompetence and confusion.
Mr Watts, you are a machine!!!!
Definition of CONUS? I couldn’t give a continental.
But seriously… is there anywhere on the internet where I can get global historical temperature data in table form going back at least 20 years? Charts available on several sites but transcribing is laborious work and subject to error.
Anthony
This is an interesting story you have written, which strangely is related to a couple of articles I wrote. Firstly, I wrote about ‘Mohonk, best in the Nation.’ Well, if that’s the best, the US has got problems. The article starts:
“The Little Ice Age thermometers http://climatereason.com/LittleIceAgeThermometers/
which predate Giss and Hadley/Cru, provide an interesting insight into the longer term climatic cycles that the shorter records often seem to miss. This was demonstrated with Uppsala/Stockholm and Hohenpeissenberg in my article; http://noconsensus.wordpress.com/2009/11/05/invisible-elephants/
Today we examine another temperature triplet linked by the Hudson river, drop in to see James Hansen and Gavin, visit a shanty town and pay our respects to John Lennon. In other words my usual eclectic mix of history, trivia, science and serious investigation.
The first of our records comes from Mohonk, which in the world of climate science is a bit of a hero. It is a most interesting station, as this link demonstrates; ”
http://noconsensus.wordpress.com/2009/11/25/triplets-on-the-hudson-river/#comment-13064
Last year I followed this up with an article on the unreliability of the temperature record right from the days it became big business rather than a scientific endeavour; in the States that was around 1880, when many current stations were set up. The manner in which data was collected was roundly condemned at the time, including by a leading climatologist writing in 1900. That story was carried here:
http://wattsupwiththat.com/2011/05/23/little-ice-age-thermometers-%e2%80%93-history-and-reliability-2/
—- ——
After some 5 years of writing historically based climate articles, the two things I have learnt are that the temperature record is a moveable feast that depends on who is creating it and what purposes they want to use it for (there is usually an ‘agenda’, but not deliberate malpractice), combined with astonishment at the sheer unreliability of portions of the data.
I think a lot of the problem is that, since the advent of the computer, number crunchers like to play about with data and create models and scenarios, and sometimes dubious data is then further manipulated. Some of the basic material on which far-reaching studies are being based is frankly bizarre; there is no better example than SSTs, which we believe we have global knowledge of back to 1860 to fractions of a degree.
It will be interesting to see how your current investigation plays out, but Hubert Lamb, first director of CRU, had it sussed out by saying that when examining temperatures ‘we can know the tendency but not the precision.’
tonyb
If you want something done properly, ask a busy man. Well done, Anthony!
If it is the case that, as in most Govt. statistics (think employment numbers, GDP, etc.), there is a preliminary estimate (which is hawked around and moves the media and markets) and then a set of revisions in subsequent months (which are ignored), then NOAA is guilty of not making it clear that the headline figures are provisional, and of not making any announcement of the revisions. This is b-a-a-d.
It’s difficult to believe, after the billions spent on climate science, that the network still relies on pony express technology to communicate the figures. Even some domestic electric and gas meters can radio in their updates automatically.
Makes you wonder what sort of mess third-world monitoring must be in if the world’s high-tech leader can’t get it right.