There is a new paper out that investigates an aspect of the surface temperature record that has not previously been well dealt with (at least as far as the author knows): sensor measurement uncertainty. The author has defined a lower limit to the uncertainty in the instrumental surface temperature record.

UNCERTAINTY IN THE GLOBAL AVERAGE SURFACE AIR TEMPERATURE INDEX: A REPRESENTATIVE LOWER LIMIT
Patrick Frank, Palo Alto, CA 94301-2436, USA, Energy and Environment, Volume 21, Number 8 / December 2010 DOI: 10.1260/0958-305X.21.8.969
Abstract
Sensor measurement uncertainty has never been fully considered in prior appraisals of global average surface air temperature. The estimated average ±0.2 C station error has been incorrectly assessed as random, and the systematic error from uncontrolled variables has been invariably neglected. The systematic errors in measurements from three ideally sited and maintained temperature sensors are calculated herein. Combined with the ±0.2 C average station error, a representative lower-limit uncertainty of ±0.46 C was found for any global annual surface air temperature anomaly. This ±0.46 C reveals that the global surface air temperature anomaly trend from 1880 through 2000 is statistically indistinguishable from 0 C, and represents a lower limit of calibration uncertainty for climate models and for any prospective physically justifiable proxy reconstruction of paleo-temperature. The rate and magnitude of 20th century warming are thus unknowable, and suggestions of an unprecedented trend in 20th century global air temperature are unsustainable.
INTRODUCTION
The rate and magnitude of climate warming over the last century are of intense and
continuing international concern and research [1, 2]. Published assessments of the
sources of uncertainty in the global surface air temperature record have focused on
station moves, spatial inhomogeneity of surface stations, instrumental changes, and
land-use changes including urban growth.
However, reviews of surface station data quality and time series adjustments, used
to support an estimated uncertainty of about ±0.2 C in a centennial global average
surface air temperature anomaly of about +0.7 C, have not properly addressed
measurement noise and have never addressed the uncontrolled environmental
variables that impact sensor field resolution [3-11]. Field resolution refers to the ability
of a sensor to discriminate among similar temperatures, given environmental exposure
and the various sources of instrumental error.
In their recent estimate of global average surface air temperature and its uncertainties,
Brohan, et al. [11], hereinafter B06, evaluated measurement noise as discountable,
writing, “The random error in a single thermometer reading is about 0.2 C (1σ) [Folland, et al., 2001] ([12]); the monthly average will be based on at least two readings a day throughout the month, giving 60 or more values contributing to the mean. So the error in the monthly average will be at most 0.2/√60 = 0.03 C and this will be uncorrelated with the value for any other station or the value for any other month.”
Paragraph [29] of B06 rationalizes this statistical approach by describing monthly surface station temperature records as consisting of a constant mean plus weather noise, thus: “The station temperature in each month during the normal period can be considered as the sum of two components: a constant station normal value (C) and a random weather value (w, with standard deviation σi).” This description, plus the use of a 1/√60 reduction in measurement noise, together indicate a signal-averaging statistical approach to monthly temperature.
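The distinction at issue here can be illustrated with a short simulation (a sketch with invented numbers — the 0.3 C bias below is hypothetical, not a figure from the paper): random read error does shrink roughly as 1/√N when 60 readings are averaged, but a systematic offset from an uncontrolled variable passes through the average untouched.

```python
import random
import statistics

random.seed(42)
true_temp = 15.0   # hypothetical true monthly-mean temperature, C
sigma = 0.2        # random read error (1-sigma) per B06
bias = 0.3         # hypothetical systematic offset (uncontrolled variable)
n_readings = 60    # two readings per day over a month
n_trials = 10_000

mean_err_random = []   # monthly-mean error, random noise only
mean_err_biased = []   # monthly-mean error, noise plus constant bias
for _ in range(n_trials):
    readings = [true_temp + random.gauss(0.0, sigma) for _ in range(n_readings)]
    m = statistics.mean(readings)
    mean_err_random.append(m - true_temp)
    mean_err_biased.append(m + bias - true_temp)

# Averaging 60 readings shrinks the random error toward sigma/sqrt(60) ~ 0.026 C,
# but the systematic bias survives in the mean at full strength.
print("random-only spread:", round(statistics.stdev(mean_err_random), 3))
print("surviving bias:    ", round(statistics.mean(mean_err_biased), 3))
```

With these numbers the spread comes out near 0.026 C while the mean error stays near 0.3 C, which is the substance of the objection: the 1/√60 reduction is valid only for the uncorrelated random part of the error.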
…
I and the volunteers get some mention:
The quality of individual surface stations is perhaps best surveyed in the US by way of the commendably excellent independent evaluations carried out by Anthony Watts and his corps of volunteers, publicly archived at http://www.surfacestations.org/ and approaching in extent the entire USHCN surface station network. As of this writing, 69% of the USHCN stations were reported to merit a site rating of poor, and a further 20% only fair [26]. These and more limited published surveys of station deficits [24, 27-30] have indicated far from ideal conditions governing surface station measurements in the US. In Europe, a recent wide-area analysis of station series quality under the European Climate Assessment [31], did not cite any survey of individual sensor variance stationarity, and observed that, “it cannot yet be guaranteed that every temperature and precipitation series in the December 2001 version will be sufficiently homogeneous in terms of daily mean and variance for every application.”
…
Thus, there apparently has never been a survey of temperature sensor noise variance or stationarity for the stations entering measurements into a global instrumental average, and stations that have been independently surveyed have exhibited predominantly poor site quality. Finally, Lin and Hubbard have shown [35] that variable field conditions impose non-linear systematic effects on the response of sensor electronics, suggestive of likely non-stationary noise variances within the temperature time series of individual surface stations.
…
…
The ±0.46 C lower limit of uncertainty shows that between 1880 and 2000, the
trend in averaged global surface air temperature anomalies is statistically
indistinguishable from 0 C at the 1σ level. One cannot, therefore, avoid the conclusion
that it is presently impossible to quantify the warming trend in global climate since
1880.
The journal paper is available from Multi-Science Publishing here.
I ask anyone who values this work and wants to know more, to support this publisher by purchasing a copy of the article at the link above.
Congratulations to Mr. Frank for his hard work and successful publication. I know his work will most certainly be cited.
Jeff Id at the Air Vent has a technical discussion going on about this as well, and it is worth a visit.
Here is a link to a letter to the APS editorial page from this author that may be the genesis of his published study. http://www.aps.org/units/nes/newsletters/fall09.cfm
Bravo!
At last, some glimmer of reality. Now stand by for heavy flak.
Wow. Finally. Uncertainty officially acknowledged at this level. Combined with all the station problems you documented Anthony, and related adjustments, this should pretty much put all the quibbling about one-tenth degree changes, etc. to an end… but it won’t of course.
Well, it is good to see that this has been published. This paper is bound to get quite some reaction. I know many who have long considered that the claimed warming is indistinguishable from the noise. Given all the uncertainties, it is ridiculous to claim measurements to 1/100ths of a degree.
Now if they could only clean up the UHI problem and bias through station drop out, we may be better able to consider the significance of any observed temperature change and to put it in its proper context.
It won’t stop the alarmists because it isn’t about AGW. It never was. It has always been about crisis, any crisis, that can be used to motivate the people into whatever action the political higher-ups want.
More significantly, the study concludes that the trend in surface temperatures since 1880 is statistically indistinguishable from a 0-degree C trend. That is, although it has often been widely acknowledged, even among skeptics of AGW, that the globe has warmed over the last century, there is no statistically significant support at even a one-sigma level in the data for such warming. There may in short have been no warming pattern at all, simply shorter term noise. The fault may not be in our star but in our numbers.
Wow, this “settled science” is always good for a big surprise. There has never ever been a thorough scientific assessment of station errors?! Extraordinary! I have learned a lot since I first became aware of “WUWT”, and I knew very well that “climate science” is full of holes. But to read now that nobody ever even bothered to assess the quality of the temperature measurements properly is still somewhat shocking. The whole “climatism” (or “alarmism” or “warmism”) scam has to be brought to an end, the sooner the better!
I can only thank Anthony Watts for bringing all this information to the wider public. Here in Germany we still have a long way to go before the scam ends. There seems to be an “unholy alliance” of MSM, politics, NGOs, “climate scientists” and “green energy” profiteers which all work hard to promote ever more regulations, taxes, fees and research grants. Until now the people unwillingly bear the burden…
Just curious, now that the Hudson Bay has frozen over, have BBQ conditions set in for the UK?
Has Mount Romm erupted?
It’d be a big volcanic eruption, lots of sulfate aerosols dispersed up high, lots of dust from the disintegrating gray dense stony matter. Mount Romm will be venting steam for weeks, spewing spurts of fire and brimstone for months.
I feel a chill in the air, like the world has gotten colder…
I don’t believe man is responsible for the temperature rise, but we do see the following evidence that it has risen:
Less ice in the Arctic
Similar temperature measurements from satellites
Deviations with good explanations (Pinatubo, El Niño, La Niña)
probably more…
IMO the argument that we don’t know that the temperature has risen is not a good one. The evidence just doesn’t point that way.
These blow holes in CAGW though:
MWP
Cloud albedo feedback uncertainty
The way the “science” was hijacked
probably more…
Here is some relevant info about Metrology. I sent this to Mr McIntyre in 2008 but he never replied, and eventually I got around to sticking it on a blog for posterity…
http://pugshoes.blogspot.com/2010/10/metrology.html
[snip – sorry, your topic is way off topic; this sort of thing is better published here. I’m sure they’ll take it. -Anthony ]
It’s as bad as we thought.
What’s the betting that in about two weeks’ time, Dr. Spencer will be reporting a global temperature anomaly of about zero against the 30-year trend?
Anything about large-lake surface temperature uncertainty? NASA’s JPL gives figures way above Frank’s noise variance.
Good thoughtful article, however…
…may be hard to sell to journalists and the general public. In Fig. 3 above, it certainly looks as if the temperature has risen since 1960. The general public may think that the strongest signal is the mean, with the error bars indicating a gradient of uncertainty.
The full uncertainty (Type A + Type B) of ±0.46 C vs Type A of ±0.2 C is typical of the difference between a full uncertainty analysis versus the commonly reported “accuracy”.
NIST provides an introduction:
Essentials of expressing measurement uncertainty
For details see: B.N. Taylor and C.E. Kuyatt, Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results, NIST TN 1297
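For what it is worth, the root-sum-square (quadrature) combination that TN 1297 describes can be sketched as follows. Note that the systematic component here is back-solved from the abstract’s ±0.2 C and ±0.46 C figures purely for illustration; it is not a value quoted by the paper.

```python
import math

def combined_uncertainty(components):
    """Combined standard uncertainty by root-sum-square (NIST TN 1297 style)."""
    return math.sqrt(sum(u * u for u in components))

u_type_a = 0.2  # Type A: average random station error (1-sigma), per the paper
# Illustrative Type B term implied by a 0.46 C total: sqrt(0.46^2 - 0.2^2) ~ 0.41 C
u_type_b = math.sqrt(0.46**2 - 0.2**2)

total = combined_uncertainty([u_type_a, u_type_b])
print(round(total, 2))  # 0.46
```

The point of the quadrature rule is that independent error sources add in variance, not in magnitude, so the larger (systematic) term dominates the combined result.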
It’ll be interesting to see what Tamino’s response to this is (if any). Isn’t this supposed to be right up his alley?
-Scott
130 years of data, and a straight line is still within the noise…
==============================================
coaldust says:
January 20, 2011 at 3:19 pm
I don’t believe man is responsible for the temperature rise
=================================================
Man is not responsible at all coal…..
…..if man had anything to do with it, it would be a lot warmer
My main issue with “Average” (or mean), in this or many other contexts, is that most people who are not statistically literate equate that term with “Majority”. Not true of course, but even fewer people have a clue what mode or median is, much less how those and other terms (skew, kurtosis, etc.) are used to describe a distribution, which would be far more meaningful than a simple point estimate.
Thanks Anthony, and of course Patrick Frank,
I am quoting from this article at http://www.oarval.org/ClimateChange.htm and http://www.oarval.org/CambioClima.htm (Spanish).
To Anthony and the Surface Stations team… thank you, and congratulations.
By the way, I hear that Vegas bookies have your team leading the hockey team by (whatever is statistically more significant than) 6 Sigma!!!
So is there any science left in the CAGW concept? It’s “incontravertable”, eh?
However you spell it, it’s absolutely for sure the only answer.
Temperatures are now measured electronically to 0.1 degrees, but have historically been recorded to only ±0.5 degrees:
http://www.srh.noaa.gov/ohx/dad/coop/EQUIPMENT.pdf … page 11
This doesn’t seem to have been considered.
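The ±0.5 degree recording resolution noted above carries its own irreducible uncertainty. Treating the rounding error as a rectangular (uniform) distribution, the standard Type B evaluation gives half-width/√3 ≈ 0.29 degrees; a quick simulation (illustrative only, with made-up readings) agrees:

```python
import math
import random

random.seed(1)
# Rounding to the nearest whole degree leaves an error uniform on [-0.5, +0.5].
# Rectangular-distribution standard uncertainty: half-width / sqrt(3).
theoretical = 0.5 / math.sqrt(3)  # ~0.289 degrees

samples = [random.uniform(0.0, 30.0) for _ in range(100_000)]
errors = [round(t) - t for t in samples]
empirical = math.sqrt(sum(e * e for e in errors) / len(errors))

print(round(theoretical, 3), round(empirical, 3))
```

So even a perfectly functioning sensor, once its reading is written down to the nearest degree, contributes roughly 0.29 degrees of standard uncertainty per observation.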
Well, the paper is about sensor errors. How about the errors that occur over about 70% of the surface area, which happens to be water? I don’t see any attempt to correct the erroneous data gathered prior to 1980, which in January 2001 was found not to correlate with the temperatures measured on land, i.e. from the atmosphere rather than from the water temperatures, as were the older oceanic data readings.
These global Temperature anomaly proxies have more changes than any carillon on earth.