In my post on the Mohonk Weather Station, the question came up about “raw” temperature data. Tom in Texas complained that he’d looked at data from the observers’ B-91 forms and that it didn’t match what was posted in published data sets.
Neither NOAA nor NASA serves weather station data “raw”.

We’ve all seen examples posted here of how GISS adjusts data. But it is not only NASA’s GISS that engages in this practice; NOAA also adjusts temperature data, by its own admission. For example, here are NOAA-provided graphs showing the trend over time of all the adjustments they apply to the entire USHCN dataset.


As illustrated in the graphs above, in the simplest terms, NOAA’s own “adjustment” methodology adds a positive bias to the raw data reported by weather station observers.
It is important to note that the bottom graph shows a positive adjustment of 0.5°F spanning from 1940 to 1999. The agreed-upon “global warming signal” is said to be 1.3°F (0.74°C) over the last century.
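For scale, here is a back-of-the-envelope comparison using only the two figures just quoted (my arithmetic, not NOAA’s accounting):

```python
# Back-of-the-envelope comparison of the two figures quoted above.
adjustment_f = 0.5            # NOAA adjustment, deg F, per the bottom graph
signal_c = 0.74               # quoted century "global warming signal", deg C
signal_f = signal_c * 9 / 5   # standard Celsius-to-Fahrenheit scaling

print(f"signal: {signal_f:.2f} F")                                      # ~1.33 F
print(f"adjustment as share of signal: {adjustment_f / signal_f:.0%}")  # ~38%
```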
The NOAA source for these graphs is: http://cdiac.ornl.gov/epubs/ndp/ushcn/ndp019.html
…grotesque…
Apologies if the tags do not work correctly.
In case you have not seen the paper “Economists and Climate Science: A Critique” by David Henderson (“former Treasury official; and much later, as Head of what was then the Economics and Statistics Department in the OECD Secretariat”), it is available at
In this he mentions:
This is what is missing after “In this he mentions”:
Other aspects of the work for WGI have also been subject to expert challenge. One such aspect concerns the instrument-based series for global average surface temperature which the Working Group and other official sources have relied on: the estimated temperature anomalies appear as subject to doubt because of imperfections in coverage and reliability, questionable statistical procedures, and non-climatic influences for which full allowance may not have been made.
joshv (19:01:26) :
I would like somebody to point me to an example of any field in the hard sciences where a researcher can get away with adjusting raw data for unmeasurable biases. Even if you can find such an example, I’d be surprised if the adjustment was of the same order of magnitude as the measurement.
The second graph is incredibly damning. Don’t like the fact that your raw data shows only 0.2°C/century of warming? Adjust in three times the warming. This is not science. A scientist would find ways to determine whether data is contaminated and discard it if so.
In fact, on US defense contracts, people have gone to jail for this sort of thing. Falsifying test data to “demonstrate” something is considered fraud under Earned Value Management. The most famous example was the A-10 development program. I believe the fine was ~$1.5B (yes, billion), plus jail time for executives at the contractor, etc.
Pity the government doesn’t have to follow its own rules.
As the observed temperature goes up wouldn’t the absolute value of the adjustment amount increase? Isn’t that what these graphs are saying (if it is TOB)? I admit that it seems strange that summer and winter changes in TOB (given human nature) don’t come closer to canceling out.
Why not come up with a standardized climate reporting package with remote automatic sensing?
John Philip (01:25:12) :
You commented that UAH data trended lower than the GISS, RSS, and HADCRUT data sets. I believe the UAH data is a lower-troposphere, atmospheric measurement, whereas the others are surface measurements. If one assumes that all the adjustments are appropriate, this would seem consistent with more TSI radiation from the Sun passing through the atmosphere and reaching the surface of the Earth. This does not sound like a greenhouse effect, where the atmosphere warms first and then the Earth.
That second graph is extraordinary. What could generate a consistent positive trend lasting for decades?
.
Of course, we know only too well about the poor quality of many stations. Some changes, such as moving to areas where there’s more concrete or more air conditioners, could produce a positive trend. But I would expect many changes could be either positive or negative and so would tend to cancel out on average.
.
Time of Observation Bias has been mentioned. Could this lead to a consistent positive trend over decades? If TOB changes tend to be random (e.g. a new observer prefers different, more convenient times) then again the effect should randomly cancel out. On the other hand, if observation times have been consistently changing over decades to a time that tends to give higher temperatures, then maybe.
.
Of course, the big elephant in the room is UHI. If this were properly adjusted for, it should produce a large negative offset, precisely the opposite of this graph. Michaels & McKitrick showed that, if UHI were properly accounted for, the total warming in recent decades would be about half of the generally accepted figure. If they’re right, then the UHI adjustment should be about -0.3°C (-0.6°F).
.
To be honest, I simply don’t believe in the adjustments shown in this graph. I smell a huge rat. Clearly, the people who have done this work desperately want to prove AGW, particularly as opinion polls show steadily increasing scepticism. We’re asked to believe that these adjustments were done purely in the interests of science and the truth, and were not fabricated in order to frighten the politicians into throwing billions of dollars at climate scientists. Well, I’m not convinced. I don’t believe this is a direct conspiracy, but rather the result of vested interests and group think. Group think, by a process of rationalisation, can easily turn the worst motives, such as greed, into a noble cause, such as the wish to save the world. But, however noble it may sound, it’s still a scam. This graph could almost be described as the poster-child of this scam.
I try not to be too angry, but sometimes it’s difficult….
Chris
OT: For whoever is interested in the Redoubt eruption:
http://volcanism.wordpress.com/
As the solar minimum continues and the planet is in a natural cooling cycle, volcanic eruptions dumping emissions at stratospheric levels could accelerate the cooling process.
The plume has now reached a level of 15,500 meters.
Currently we have four ongoing volcanic eruptions reaching stratospheric levels from time to time.
None of the current eruptions, however, poses a climate risk at this moment.
During your visit to the NCDC last spring, Anthony, they gave you a presentation on the newest set of adjustments to US temperatures.
The adjustments are now stated in Celsius (rather than Fahrenheit), and they split the adjustments into the two different steps, showing how each affects Max and Min temperatures separately.
By my math, the TOBS adjustment added 0.2°C to the raw data trend.
And the homogenization adjustment added about 0.225°C to the raw data trend.
Together, the two adjustments added 0.425°C, or 0.765°F, to the raw data trend (versus the 0.55°F added up to 1999).
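Checking that arithmetic with the figures quoted above:

```python
# Summing the two stated adjustments and converting to deg F.
tobs_c, homog_c = 0.2, 0.225
total_c = tobs_c + homog_c          # 0.425 deg C
total_f = total_c * 9 / 5           # 0.765 deg F
print(f"{total_c} C -> {total_f:.3f} F")
```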
http://wattsupwiththat.com/2008/05/13/ushcn-version-2-prelims-expectations-and-tests/
Here is the PowerPoint although it doesn’t always load properly (you might have to retry it a few times).
http://wattsupwiththat.files.wordpress.com/2008/05/watts-visit.ppt
While the TOBs adjustment sounds reasonable enough, with the degree of urbanization that has occurred in the US, the homogenization adjustment should be negative to account for the UHI.
Not quite OT
More Greens and Marxists vowing to topple the free world
http://www.guardian.co.uk/business/2009/mar/22/g20-anti-globalisation-protests
At least they aren’t hiding behind as much greenwash as they normally do.
If there is a justifiable suspicion that the raw data is (a) being withheld and (b) manipulated to produce a politically-desired result, then it certainly would behoove people interested in the truth to file FOIA requests, and lawsuits if necessary, to obtain and make public the relevant data, on a continuing basis.
Mr. Watts here, and many others, as legitimate researchers in the field, certainly would have standing for legal action.
Two sites where you might get help with legal and FOIA issues:
http://www.eff.org/issues/bloggers/legal/journalists/foia
http://www.judicialwatch.org/open-records
Also, if it could be shown that the data were being massaged, this would be news that even the AGW sycophants in the mass media could not ignore.
Perhaps Roger Sowell, who is an attorney, could comment—and maybe help organize some formal action?
/Mr Lynn
The NOAA adjustment between 1955 and 1995 is approximately linear.
So for this adjustment to be an artifact of time-of-observation change, the time-of-observation would need to have changed monotonically.
This means a movement of the time-of-observation in each year of the 40-year period.
For example: in 1955, measure at all stations at noon; in 1956, measure at 11:59; in 1957, at 11:58; and so on, up to 1995, when they measure at 11:20.
Is there any evidence of a monotonic movement in time-of-observation over this period?
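Setting that question aside, here is a toy simulation of my own (not NOAA’s TOB algorithm; every number below is invented) showing why the reading hour matters at all: a min/max thermometer is read and reset once a day, so the observation hour decides which 24-hour window each recorded “day” covers, and afternoon resets tend to double count hot afternoons while morning resets double count cold mornings.

```python
# Toy model of time-of-observation bias. A min/max thermometer is read
# and reset once a day at obs_hour; each recorded "day" is the 24 h
# window ending at that hour. All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
DAYS = 3650

# Hourly temps: independent day-to-day anomalies plus a diurnal cycle
# peaking around 15:00 local time.
daily_anom = np.repeat(rng.normal(0.0, 3.0, DAYS), 24)
hour_of_day = np.arange(DAYS * 24) % 24
temps = 15.0 + daily_anom + 5.0 * np.sin(2 * np.pi * (hour_of_day - 9) / 24)

def mean_of_extremes(obs_hour):
    """Mean of (Tmax + Tmin)/2 over 24 h windows ending at obs_hour."""
    mids = []
    for day in range(1, DAYS):
        window = temps[(day - 1) * 24 + obs_hour : day * 24 + obs_hour]
        mids.append((window.max() + window.min()) / 2.0)
    return float(np.mean(mids))

# Afternoon reads should come out warmer than midnight reads (hot
# afternoons get counted in two consecutive windows); morning reads
# should come out cooler (cold mornings get double counted).
for obs_hour in (7, 17, 0):
    print(f"obs at {obs_hour:02d}:00 -> mean {mean_of_extremes(obs_hour):.2f} C")
```

None of this, of course, answers the question above of whether real observation times actually drifted monotonically over the 40-year period.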
Exactly why do they need to “adjust” the raw data?
And the good old Guardian also advocates we grow a very comrade-like beard to save the planet
http://www.guardian.co.uk/environment/ethicallivingblog/2009/mar/23/beard-environment-consumption-waste-carbon-frugal-money-saving
just like most guerrillas and terrorists do 😉
gives new meaning to the phrase “manmade climate change.”
Neo (05:28:59) : Exactly why do they need to “adjust” the raw data?
Depends upon what “raw” means. There are many reasons for adjustment.
Sensor data is rarely clean and thus needs filtering. Filtering is an adjustment. Then there’s the problem of calibration. If the temperature values take UHI effects into account, then that too is an adjustment.
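To make that concrete, here is a purely hypothetical plausibility filter, my own sketch of the kind of filtering adjustment just described (the thresholds are invented; this is not any agency’s actual QC code):

```python
# Hypothetical plausibility filter: reject readings outside physical
# bounds (deg C) or with an implausible jump from the last good value.
def qc_filter(readings, lo=-60.0, hi=60.0, max_jump=15.0):
    cleaned, prev = [], None
    for t in readings:
        if t is None or not (lo <= t <= hi):
            cleaned.append(None)        # physically impossible reading
        elif prev is not None and abs(t - prev) > max_jump:
            cleaned.append(None)        # spike relative to last good value
        else:
            cleaned.append(t)
            prev = t
    return cleaned

print(qc_filter([12.1, 12.4, 85.0, 12.9, 40.0]))
# [12.1, 12.4, None, 12.9, None]
```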
The real issue is modifying the data supporting a hypothesis (model) using the hypothesis itself as part of the adjustment criteria. That’s self-defeating.
Anthony or others,
As a recent reader of this forum, I do not know if this has been asked before, but is it possible to request the raw temperature data using the Freedom of Information Act? I do not think raw temperature data would qualify for a FOIA exemption.
http://www.corporateservices.noaa.gov/foia/
http://www.hq.nasa.gov/pao/FOIA/
Can someone help me here? Is there any relationship between the GISS data and the NOAA USHCN data, or are they independent? Also, are they used as confirmation of each other’s data if they are regarded as “independent”?
It’s worse than you think. Like something out of some SF-channel genetic horror movie.
GISS takes adjusted USHCN data. They apply an “unadjustment” algorithm. (No, really.) The resulting spawn is somewhat analogous to Lord Voldemort shortly before he returns to power.
Then they readjust it using their own procedures.
Metadata from the Black Lagoon.
Is there an uncorrupted data set?
“Mama, we all go to hell.”
While the TOBs adjustment sounds reasonable enough
Except when it turns out they are not using the actual TOBS as listed on the B-91 forms?
In computers, we work with two types of data: “raw” and “cooked”. Given that we’re looking at temperatures adjusted to warmer values, I think the term “cooked” applies quite well. It is a better antonym than “adjusted”, which should be paired with “unadjusted” or “original”. I think the “cooked” term would have the proper connotation as well.
Exactly why do they need to “adjust” the raw data ?
There are legitimate reasons.
But those reasons provide the opportunity for much mischief.
“”” Ed Reid (04:13:53) :
We know, with absolute certainty (more or less), that the global average temperature is 14.44 +/- 2(+) degrees C, based on the temperature series as reported and the information on the quality of the measuring sites at surfacestations.org.
Isn’t that “close enough for government work”? 🙂 “””
Actually we don’t know any such thing. Being pedantic, most of the earth is at a temperature of at least 400 K, and some may be as hot as 10,000 K; well, make that 5,773 K to 17,320 K, putting in the obligatory 3:1 climatology fudge factor.
Well, so maybe that 14.44 is really just a surface temperature, and not an average for the whole earth. Remember that the earth’s surface goes from a few hundred metres below sea level at the Dead Sea to well over 8,000 metres in the high mountains; I guess you can hardly call that lower troposphere either.
But even if you say it is for the earth’s surface, you still have a problem, in that 73% of the earth’s surface is ocean, and we don’t have reliable data for any temperature measurements over the oceans before around 1980, when buoys were set up to measure water and air temperatures simultaneously.
What they found over 20 years was that the two aren’t correlated; which means that ocean water temperatures are not, and never have been, a proxy for oceanic air temperatures.
So the proxy data is only believable since about 1980; before that it is simply wild guesswork for 73% of the earth’s surface.
Even since then, we don’t have a suitable earth-bound temperature sampling network to accurately measure the average earth surface temperature, even for a single instant, let alone averaged (properly) over, say, a full year’s orbit of the sun.
And GISS etc. do not report earth temperatures; they report temperature anomalies, which relate to some fictional anomaly average over some 30-year-or-so interval, which is also unknown as an actual temperature.
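For anyone unfamiliar with the term, here is a minimal sketch of what an anomaly is; the station values and the 1951-1980 base period are invented for illustration:

```python
# An "anomaly" is a reading minus the station's own average over a
# fixed base period. The June values below are made up.
june_means = [14.1, 13.8, 14.3, 14.0, 13.9, 14.2] * 5   # 30 "years", deg C
baseline = sum(june_means) / len(june_means)             # 14.05 C

def anomaly(reading_c):
    """Departure of one June reading from the 30-year June mean."""
    return reading_c - baseline

print(f"baseline {baseline:.2f} C; a 14.8 C reading is {anomaly(14.8):+.2f} C")
```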
Besides, anyone who says it is 14.44 +/- 2 is just blowing smoke.
And as for adjusting data: if it’s “adjusted”, it isn’t data; it’s “modelling”.
Why should temperature anomaly modelling, to correctly “adjust” the readings, be any more believable than the 3:1 fudge factor that goes into temperature trend modelling or ocean-level-rise modelling?
If the instruments don’t read right, get rid of them and replace them with instruments which read correctly; and if there are external influences that affect the accuracy of the measurements, get rid of them too.
A good way to get a fatal processing-plant explosion is to infer a relationship between that which you wish to control (or observe) and something else that you can control (or observe), which is “modelling”, and then control the inferred variable instead of directly measuring and controlling the variable whose value you wish to contain.
But perhaps this is what Congress had in mind when they earmarked $140 million of our tax dollars for “Climate Data Modelling”: not climate modelling but climate data modelling, which is fudging data in my book.
Yes, when the Japanese science advisers to their government described the climate modelling and recommendations of climatology’s UNIPCC as “ancient astrology”, they were surely being unfair to ancient astrology.
I don’t mind adjustments, but when the adjustments are as large as the signal, they need to be questioned.
IMHO, these corrections look so unreasonable it’s hard to express. They are basically saying that all our instruments’ raw measurements are drifting low at the same rate as the global warming signal. It makes no sense whatsoever.
I always read that the 4 global temperature metrics tracked fairly well. Didn’t I use to see graphs of this on WUWT? What’s getting lost in this discussion (except for John Philip at 01:25:12) is that regardless of the seemingly biased USHCN adjustments, the end result is a dataset that tracks fairly well with the other metrics. (Correct me if I’m wrong.) Perhaps the bias is nothing more than an attempt to make the dataset look reasonable compared to the others?
I would like somebody to point me to an example of any field in the hard sciences where a researcher can get away with adjusting raw data for unmeasurable biases.
Sociology. No, wait . . .