NOAA, it's 12AM, do you know where your metadata is?

For example, until surfacestations volunteer Juan Slayton photographed it, there was no metadata to record the fact that this official USHCN station of record is sited over a tombstone.

Image: NOAA USHCN COOP station at Hanksville, UT, sited over a grave. Photo by surfacestations volunteer Juan Slayton.

From Dr. Roger Pielke Senior:

Candid Admissions On Shortcomings In The Land Surface Temperature Data [GHCN and USHCN] At The September Exeter Meeting

At the meeting in Exeter, UK, September 7-9, 2010,

Surface temperature datasets for the 21st Century

there were several candid admissions with respect to the robustness of the global and USA surface temperature records that are being used for multidecadal surface temperature trend assessments (such as for the 2007 IPCC report).

These admissions were made despite the failure of the organizers to actually do what they claimed when they organized the meeting. In their announcement prior to the meeting [and this information has been removed in their update after the meeting] they wrote

“To be effective the meeting will have to be relatively small but, as stated above, stringent efforts will be made to entrain input from non-attendees in advance.”

In asking colleagues (such as my co-authors on our 2007 JGR paper)

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229

(a paper which raised serious issues with the USHCN and GHCN analyses), I found that none of us were “entrained” to provide input.

Nonetheless, despite the small number of individuals who were invited to be involved, there were still quite important admissions of shortcomings.

These include those from Tom Peterson, who stated in slide 8:

“We need to respond to a wide variety of concerns – Though not necessarily all of them”

[from Introductory remarks – Tom Peterson];

Matt Menne, Claude Williams and Jay Lawrimore, who reported that:

“[GHCN Monthly] Version 2 released in 1997… (but without station histories for stations outside the USA)”

“Undocumented changes [in the USHCN] can be as prevalent as documented changes even when extensive (digitized) metadata are available”

“Collectively station changes [in the USHCN] often have nearly random impacts, but even slight deviations from random matter greatly”

“Outside of the USA ~60% of the GHCN Version 3 average temperature trends are larger following homogenization”

“There is a need to identify gradual as well as abrupt changes in bias (but it is may (sic) be problematic to adjust for abrupt changes only)”

“Automation is the only realistic approach to deal with large datasets”

“More work is required to assess and quantify uncertainties in bias adjustments”

“Critiques of surface temperature data and processing methods are increasingly coming from non traditional scientific sources (non peer reviewed) and the issue raised may be too numerous and too frequent for a small group of traditional scientists to address”

“There is a growing interest in the nature of surface temperature data (reaching up to the highest levels of government)”

from Lessons learnt from US Historical Climate Network and Global Historical Climate Network most recent homogenisation cycle – Matt Menne;

and Peter Thorne, who wrote [from Agreed outcomes – Peter Thorne]:

“Usage restrictions

Realistically we are not suddenly going to have open unrestricted access to all withheld data. In some areas this is the majority of the data.”

There are very important admissions in these presentations. First, outside of the USA, there is inadequate (or no) publicly available information on station histories, yet these data are still used to create a “homogenized” global average surface temperature trend which reaches up to the “highest level of government”. Even in the USA, there are undocumented issues.

While the organizers of the Exeter meeting are seeking to retain their leadership role in national and international assessments of the observed magnitude of global warming, it is clear that serious problems exist in using these data for this purpose. We will post information on several new papers, when ready, to introduce readers of this weblog to quantification of additional systematic biases in the use of these data for long-term surface land temperature trend assessments.

There is a need, however, to accept that the primary metric for assessing global warming and cooling should be upper ocean heat content, since from 2004 onward the spatial coverage is clearly adequate for this purpose. While there is, of course, a need for regional land surface temperature measurements, including anomalies and long-term trends, for obtaining a global average climate system heat content change the oceans are clearly the more appropriate source of this information.
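For readers unfamiliar with the metric: upper ocean heat content is simply the depth integral of density times specific heat times temperature, so well-sampled profiles convert directly into joules. Below is a minimal sketch in Python; the density and specific heat are nominal textbook values and the depth/temperature profile is invented for illustration, not drawn from Argo or any dataset discussed above.

# Minimal sketch: upper ocean heat content per unit area.
# RHO and CP are nominal seawater values; the profile is hypothetical.
RHO = 1025.0  # kg/m^3, nominal seawater density
CP = 3985.0   # J/(kg K), nominal specific heat of seawater

def heat_content_per_area(depths_m, temps_c):
    """Trapezoidal depth integral of rho*cp*T, in J/m^2 (relative to 0 C).
    In practice one differences two such integrals to get an anomaly."""
    total = 0.0
    for i in range(len(depths_m) - 1):
        dz = depths_m[i + 1] - depths_m[i]
        t_mid = 0.5 * (temps_c[i] + temps_c[i + 1])
        total += RHO * CP * t_mid * dz
    return total

# Hypothetical 0-700 m profile: depths in metres, temperatures in deg C
depths = [0.0, 100.0, 300.0, 500.0, 700.0]
temps = [18.0, 15.0, 10.0, 7.0, 5.0]
print("%.3e J/m^2" % heat_content_per_area(depths, temps))

A uniform warming of just 0.1 C over that 700 m column adds roughly 2.9e8 J/m^2, which is why small but well-sampled ocean temperature changes translate into large, unambiguous heat content changes.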

87 Comments
Dave F
September 21, 2010 12:08 am

I’ll get this one out of the way.
It was a grave error.
There. Proceed with the discussion.
Ok, I’ll even join in here. REALLY?! This has got to be a hoax! No agency could be so incompetent! I feel like Joaquin Phoenix of the past two years was modeled after the guy who keeps the metadata. I can understand a few errors, but all the others, and now a gravesite?! This will go down in the dictionary as the definition of “facepalm”.

September 21, 2010 12:13 am

The real failing of these meetings is that they are full of idealistic “scientists” and not “ground in the dirt, knowledgeable of real conditions on the ground” engineers.
Tell a “scientist” that the global data shows a trend and they dream of a Nobel Prize.
Tell an engineer that the global data shows a trend and they will assume that there is another problem with the equipment and/or its use on the ground.

R.S.Brown
September 21, 2010 12:36 am

The antlers and the rake on top should compensate for
the flagstones at the base… right?

Chuck
September 21, 2010 12:46 am

Before one puts a thermometer in one’s mouth, one should ask, “Where has this been?”
Time to get the facts straight! The Conservative Head Hunters are coming.
Payback is a Mother!

LabMunkey
September 21, 2010 12:50 am

Literally speechless. Except to say that.
And that.

Malaga View
September 21, 2010 1:12 am

“Realistically we are not suddenly going to have open unrestricted access to all withheld data. In some areas this is the majority of the data.”
SCEPTIC TRANSLATION
Run around in circles making lots of noise, kick up plenty of dust and tell everyone we are not circling the wagons and that we are not hiding the bodies.
SUPER SCEPTIC TRANSLATION
The trick © is to hide the real data.

Evan Jones
Editor
September 21, 2010 1:16 am

Even in the USA, there are undocumented issues.
Not to mention lots of “documented” station moves (according to NCDC/MMS) that never happened! Blue Hill, for example, showed three major station moves. (It has been within 20′ of its present location since around 1895. A magnificent old Hazen Screen, one of the very few still operating.)

Evan Jones
Editor
September 21, 2010 1:18 am

RIP NOAA

Nick Stokes
September 21, 2010 1:19 am

This is a bizarre post. I can’t see what these new “admissions” are. For example
“Automation is the only realistic approach to deal with large datasets”
Is that an admission?? Or anything but the obvious?
But OK, this one was stated twice, and is in bold here:
There are very important admissions in these presentations. First, outside of the USA, there is inadequate (or no) publicly available information on station histories, yet these data are still used to create a “homogenized” global average surface temperature trend which reaches up to the “highest level of government”.
So is that it? Well here’s what Peterson said in the overview paper which accompanied the release of V2 in 1997, and has been prominently featured with that release ever since. It’s not hidden – it is the first para of sec 7 on Metadata:
“For long-term climate stations, there are two types of metadata. The first type is historical metadata that indicate changes with time. Many countries maintain detailed station history files that document relevant station attributes such as the type of thermometer used and when the instruments changed. Such metadata are very difficult if not impossible to acquire on a global basis. Therefore, historical metadata are not available for GHCN.”
It’s right there upfront – not an “admission” at a workshop in 2010.

REPLY:
Nick, I daresay there could be firepits under the stations not recorded in metadata and you’d give NOAA a free pass and offer some justification for ignoring the blatant sloppiness. If this were anything else but “save the planet science” it would be thrown out. – Anthony

John Marshall
September 21, 2010 1:22 am

It is on these flimsy and inadequate data sets that government policy is formulated, for which we pay. It is about time the whole so-called AGW theory was chucked away. If a theory is built on such data then the theory is wrong. It also violates the laws of thermodynamics. Enough said.

September 21, 2010 1:40 am

Through my training as an electronic technician I was shown the problems with instrument calibration and drift. My work as an amateur photographer has shown me how difficult it is to control all the variables; too bad climate scientists did not learn these lessons.
If they had, the answer to the question “is the world warming?” would be: to the best of our abilities we believe the answer is yes, but the range of warming is still within the instrumental error margins, so based on that we cannot say for certain that it is.
It now looks like, after thirty years of grand pronouncements, they finally may have to admit that there are problems with the data and there may be poor control of the variables. Something I figured from the beginning.

Mike Haseler
September 21, 2010 1:41 am

John Marshall says: “It is on these flimsy and inadequate data sets that government policy is formulated for which we pay. It is about time the whole so called AGW theory is chucked away.”
John, I’d have to disagree. Just because the data is lousy, it doesn’t mean the theory is wrong. The problem is that the known and obvious problems with the data gathering are totally ignored (probably more accurate to say hidden) when the “scientists” express the presumed uncertainty of the data.
What governments are being told is “the data shows”. What they are not being told is that “the data is so lousy that even if we invest billions into instrumentation capable of measuring the small change we are trying to detect, it will be several decades before we really know whether there is any meaningful trend”.

Michael Ozanne
September 21, 2010 1:50 am

As things roll on we seem to be confirming that far from “The science is settled” we are at “The science hasn’t been done”……

Alan the Brit
September 21, 2010 2:00 am

OT but there is a principle at stake here, not that NOAA are necessarily principled! The issue of poor surface station accounting etc will probably never go away, but the principles apply to other areas of science.
Recently back here in the motherland of dear old blighty (I know you colonials long to return), there was much ballyhoo & excitement about a discovery by a naturalist who managed to obtain video footage of a breeding pair of Asian Tigers above the valley jungle scape just below the snow line, in Butan I believe. This was indeed sensational news because experts have always insisted that these shy retiring creatures couldn’t possibley exist outside the jungle contraints. The claque & clique of other naturalists were terribly excited about this truly amazing discovery that gave one & all much encouragement about the survival abilities of these wonderful animals so on & so forth. I do not dissagree one jot! However, or BUT, as it is a big BUT. None of them seemed able to accept any humility about this dicovery by admitting that “we were wrong!”. I know it’s all rather petty in the grand scale of things but I thought it worth pointing out! I also note an article for BBC Online that the official list of 600,000 global species of plant life has been cut down to probably about 400,000 – because much has been counted twice!!!!!!! These guys are supposed to be experts & scientists for crying out loud!

Brownedoff
September 21, 2010 2:11 am

Nick Stokes
September 21, 2010 at 1:19 am
So, it has been operated by a bunch of cowboys from the outset then – no?

Roy
September 21, 2010 2:40 am

“Critiques of surface temperature data and processing methods are increasingly coming from non traditional scientific sources (non peer reviewed) […]”

It is interesting that in my own field (software development) the products of working professionals are expected, with ample justification, to be more nearly fit-for-purpose than similar work by (peer reviewed) academics. That is not to say that academics don’t do invaluable work in their field, but it certainly shows that one should actually expect valid critiques to come from non-traditional sources, and indeed it would be surprising if academics did do a better job of the vital if unexciting detail work.

simpleseekeraftertruth
September 21, 2010 2:48 am

From Oxburgh at the HoC evidence transcript:
In Q36:
A: “So it is a pretty difficult business. That is why on the serious publications massive uncertainty bands are associated with temperature reconstructions.”
In Q40:
A: “I, personally, think that in various publications for public consumption those who have used the CRU data and those who have used other climatic data have not helped their case by illuminating the very wide uncertainty band associated with it.”
Is he saying that there is massive uncertainty which should not be conveyed because it does not help the case for warming? It certainly reads like that.
And
In Q38:
Q: “Do you think that the CRU scientists are people of integrity but out of their depth when it comes to statistical analysis?”
A: “You are quite right. We were fortunate in having a very eminent statistician on our panel. …….. And he was really quite serious in saying that this was not the best way to do it. Having said that, even though they had adopted their own methods, he said, actually, had they used more sophisticated and state of the art methods, it wouldn’t have made a great deal of difference.”
Could it be that the panel statistician (Prof Hand) came to the conclusion that the error bands were so massive that state of the art methods wouldn’t have made a great deal of difference? I am sure that Prof Hand did not do a comparison of results from different statistical methods in his head, and he certainly did not do them conventionally, yet he came to this conclusion: “wouldn’t have made a great deal of difference.”
Is he saying that there are errors so massive so as to render statistical analysis methods moot? It certainly reads like that too.
The inscription on that tombstone: does it read ‘Here lies Robust. Victim of abuse but now at peace’?

Sam the Skeptic
September 21, 2010 3:20 am

Mike Haseler says:
September 21, 2010 at 1:41 am

What they are not being told is that “the data is so lousy that even if we invest billions into instrumentation capable of measuring the small change we are trying to detect, it will be several decades before we really know whether there is any meaningful trend”

Which surely is what John Marshall is saying:- We don’t damn well know! We are being asked to spend trillions of pounds, dollars, euros — not even on improving the instrumentation but on mitigation and/or rectification of the “problem” on the assumption that the data are sound. The theory may be fine, though personally I doubt it, but the data backing it up are not robust enough to warrant the level of certainty being claimed by the climate science community. Anthony keeps publishing photographs to back up that statement while all the climate science community can do in reply is lie, sulk, insult or waffle.
Nothing that we have experienced in the last forty years has fallen outside the normal range of climatic variation.

Ian W
September 21, 2010 3:22 am

From the final conclusion:
While there is, of course, a need for regional land surface temperature measurements, including anomalies and long-term trends, for obtaining a global average climate system heat content change the oceans are clearly the more appropriate source of this information.
Air temperature measurements cannot provide any indication of ‘heat content’ due to the huge variation of atmospheric enthalpy with humidity. Ocean temperatures, however, are equivalent to heat content, so they can be used. Why do climatologists in both warming and sceptic camps insist on using the incorrect metrics? Surely, it cannot be out of ignorance.
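Ian W’s point can be put in rough numbers using the standard approximation for specific moist enthalpy, h ≈ cp*T + Lv*q. A minimal sketch (the constants are textbook approximations and the two parcels are invented for illustration):

# Two air parcels at the same temperature but different humidity
# hold very different amounts of heat. Constants are approximate.
CP_AIR = 1005.0  # J/(kg K), specific heat of dry air
LV = 2.5e6       # J/kg, latent heat of vaporization of water

def moist_enthalpy(temp_k, mixing_ratio_kg_per_kg):
    """Approximate specific moist enthalpy, J per kg of air."""
    return CP_AIR * temp_k + LV * mixing_ratio_kg_per_kg

desert = moist_enthalpy(303.15, 0.005)   # 30 C, dry air (5 g/kg)
tropics = moist_enthalpy(303.15, 0.020)  # 30 C, humid air (20 g/kg)
print(tropics - desert)  # ~37500 J/kg at identical thermometer readings

Identical thermometer readings, yet the parcels differ by about 37.5 kJ/kg in heat content: that is the enthalpy variation the comment refers to.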

September 21, 2010 3:55 am

Sam the Skeptic says: “Which surely is what John Marshall is saying:- We don’t damn well know!” … all the climate science community can do in reply is lie, sulk, insult or waffle.
Nothing that we have experienced in the last forty years has fallen outside the normal range of climatic variation.

My point is that we shouldn’t use the same bogus “science” as the climate community and short-circuit all proper science to link (bogus) data with (bogus) theory.
The theory of CO2 warming is bogus because totally imaginary forces are used to suggest the massive warming they need to scare us all with, and the theory would clearly be at odds with the way real science works even if the temperature data were rock solid.
The shambolic temperature data doesn’t prove the theory wrong: you can’t prove a theory wrong with bad data any more than you can prove it right! What the shambolic temperature data does prove is that those creating and using the data as if it were rock solid are cowboy scientists (in the UK we call cheap builders who, for example, don’t put foundations under buildings “cowboy builders”).

Nick Stokes
September 21, 2010 3:57 am

Brownedoff says: September 21, 2010 at 2:11 am
….
No, the data is what it is. The NOAA weren’t responsible for the measurements; they have to do the best analysis they can with the data we have. Imperfections have been acknowledged all along. What do you think they should do?
REPLY: “The NOAA weren’t responsible for the measurements; ”
Nick, that’s BS, pure and simple.
NOAA is entirely responsible for the USHCN stations and the data they produce. They are wholly responsible for the equipment, the maintenance, the instrumentation, the calibration, the training of observers, and the collation/adjustment of the data.
http://www.nws.noaa.gov/om/coop/
In GHCN, they are responsible for the collation/adjustment of the data under one dataset at NCDC. NCDC’s Tom Peterson is “scientist zero” in making those adjustments and choosing what stations make it into the data.
To say NOAA has no responsibility illustrates just how ignorant you are to the problems associated with the network. – Anthony

September 21, 2010 4:26 am

Mr. Stokes at 3:57: You are right, NOAA is not responsible, and should be defunded.

Malaga View
September 21, 2010 4:42 am

Nick Stokes says:
September 21, 2010 at 3:57 am
What do you think they should do?

Now that is a hard question…
I will opt for D) from the list of available answers.
What do you think they should do?
A) Go get a proper job
B) Go for a long holiday
C) Go get a life
D) Go forth and multiply

September 21, 2010 4:47 am

Come on!!!! Hansen et al. are not in the least concerned with the quality of the data. They will adjust it until it fits their needs.
On the other hand, if there are questions to the quality of the data, that justifies them adjusting it !!!
A win/win for AGW !!!
or Global Disruption or whatever the heck they call it now

Editor
September 21, 2010 5:01 am

No time to post today, and even less time to think, but I can’t pass this up:

“Outside of the USA ~60% of the GHCN Version 3 average temperature trends are larger following homogenization”

So when milk is homogenized, the milkfat content goes up, right? Does that mean you can get more ice cream if you separate out the cream after homogenization? What if you separate out the cream first, and homogenize what’s left? At first blush that would seem ineffective, but let’s see what NOAA can come up with!
