For example, until surfacestations volunteer Juan Slayton photographed it, there was no metadata to record the fact that this official USHCN station of record is sited over a tombstone.
From Dr. Roger Pielke Senior:
At the meeting in Exeter, UK, September 7-9, 2010,
Surface temperature datasets for the 21st Century
there were several candid admissions with respect to the robustness of the global and USA surface temperature records that are being used for multidecadal surface temperature trend assessments (such as for the 2007 IPCC report).
These admissions were made despite the failure of the organizers to actually do what they claimed when they organized the meeting. In their announcement prior to the meeting [and this information has been removed in their update after the meeting] they wrote
“To be effective the meeting will have to be relatively small but, as stated above, stringent efforts will be made to entrain input from non-attendees in advance.”
In asking colleagues (such as my co-authors on our 2007 JGR paper)
Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229
which has raised serious issues with the USHCN and GHCN analyses, none of us were “entrained” to provide input.
Nonetheless, despite the small number of individuals who were invited to be involved, there still are quite important admissions of shortcomings.
These include those from Tom Peterson
who stated in slide 8
“We need to respond to a wide variety of concerns – Though not necessarily all of them”
[from Introductory remarks – Tom Peterson];
Matt Menne, Claude Williams and Jay Lawrimore who reported that
“[GHCN Monthly]Version 2 released in 1997….but without station histories for stations outside the USA)”
“Undocumented changes [in the USHCN] can be as prevalent as documented changes even when extensive (digitized) metadata are available”
“Collectively station changes [in the USHCN] often have nearly random impacts, but even slight deviations from random matter greatly”
“Outside of the USA ~60% of the GHCN Version 3 average temperature trends are larger following homogenization”
“There is a need to identify gradual as well as abrupt changes in bias (but it is may (sic) be problematic to adjust for abrupt changes only)”
“Automation is the only realistic approach to deal with large datasets”
“More work is required to assess and quantify uncertainties in bias adjustments”
“Critiques of surface temperature data and processing methods are increasingly coming from non traditional scientific sources (non peer reviewed) and the issue raised may be too numerous and too frequent for a small group of traditional scientists to address”
“There is a growing interest in the nature of surface temperature data (reaching up to the highest levels of government)”
and Peter Thorne, who wrote in [Agreed outcomes – Peter Thorne],
“Usage restrictions
Realistically we are not suddenly going to have open unrestricted access to all withheld data. In some areas this is the majority of the data.”
There are very important admissions in these presentations. First, outside of the USA, there is inadequate (or no) publicly available information on station histories, yet these data are still used to create a “homogenized” global average surface temperature trend which reaches up to the “highest level of government”. Even in the USA, there are undocumented issues.
While the organizers of the Exeter meeting are seeking to retain their leadership role in national and international assessments of the observed magnitude of global warming, it is clear that serious problems exist in using this data for this purpose. We will post information on several new papers when ready, to introduce readers of this weblog to quantification of additional systematic biases in the use of this data for long-term surface land temperature trend assessments.
There is a need, however, to accept that the primary metric for assessing global warming and cooling should be upper ocean heat content, since from 2004 onward the spatial coverage is clearly adequate for this purpose (e.g. see). While there, of course, is a need for regional land surface temperature measurements including anomalies and long-term trends, for obtaining a global average climate system heat content change the oceans are clearly the more appropriate source of this information.
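Menne, Williams and Lawrimore's point that station changes "often have nearly random impacts, but even slight deviations from random matter greatly" can be made concrete with a toy simulation. This is only an illustrative sketch — the network size, record length, and step-change statistics below are invented for the example, not taken from USHCN:

```python
import random

random.seed(0)

N_STATIONS = 2000   # hypothetical network size
N_YEARS = 100       # hypothetical record length
STEP_SD = 0.25      # deg C: the "nearly random" spread of step impacts
STEP_MEAN = 0.05    # deg C: a slight deviation from zero-mean randomness

def network_series(step_mean):
    """Yearly network-mean anomaly when the true climate is flat but
    each station suffers one undocumented step change at a random year."""
    mean = [0.0] * N_YEARS
    for _ in range(N_STATIONS):
        year = random.randrange(N_YEARS)          # when the change happens
        step = random.gauss(step_mean, STEP_SD)   # its impact on readings
        for t in range(year, N_YEARS):            # step persists afterwards
            mean[t] += step
    return [m / N_STATIONS for m in mean]

biased = network_series(STEP_MEAN)   # slight deviation from random
unbiased = network_series(0.0)       # truly random impacts

# Apparent century "trend" of the network mean in each case:
print(f"slight bias: {biased[-1] - biased[0]:+.3f} C")
print(f"zero bias:   {unbiased[-1] - unbiased[0]:+.3f} C")
```

With truly zero-mean steps the network average stays near flat; a bias of only a few hundredths of a degree per change, invisible at any single station, accumulates into a spurious network-wide warming of roughly that size.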
I’ll get this one out of the way.
It was a grave error.
There. Proceed with the discussion.
Ok, I’ll even join in here. REALLY?! This has got to be a hoax! No agency could be so incompetent! I feel like Joaquin Phoenix of the past two years was modeled after the guy who keeps the metadata. I can understand a few errors, but all the others, and now a gravesite?! This will go down in the dictionary as the definition of “facepalm”.
The real failing of these meetings is that it is full of idealistic “scientists” and not “ground in the dirt knowledgeable of real conditions on the ground” engineers.
Tell a “scientist” that the global data shows a trend and they dream of a nobel prize.
Tell an engineer that the global data shows a trend and they will assume that there is another problem with the equipment and/or use on the ground.
The antlers and the rake on top should compensate for
the flagstones at the base… right ?
Before one puts a thermometer in one’s mouth, one should ask,”Where has this been?”
Time to get the facts straight! The Conservative Head Hunters are coming.
Payback is a Mother!
Literally speechless. Except to say that.
And that.
“Realistically we are not suddenly going to have open unrestricted access to all withheld data. In some areas this is the majority of the data.”
SCEPTIC TRANSLATION
Run around in circles making lots of noise, kick up plenty of dust and tell everyone we are not circling the wagons and that we are not hiding the bodies.
SUPER SCEPTIC TRANSLATION
The trick © is to hide the real data.
Even in the USA, there are undocumented issues.
Not to mention lots of “documented” station moves (according to NCDC/MMS) that never happened! Blue Hill, for example, showed three major station moves. (It has been within 20′ of its present location since around 1895. A magnificent old Hazen Screen, one of the very few still operating.)
RIP NOAA
This is a bizarre post. I can’t see what these new “admissions” are. For example
““Automation is the only realistic approach to deal with large datasets””
Is that an admission?? Or anything but the obvious?
But OK, this one was stated twice, and is in bold here:
There are very important admissions in these presentations. First, outside of the USA, there is inadequate (or no) publicly available information on station histories, yet these data are still used to create a “homogenized” global average surface temperature trend which reaches up to the “highest level of government”.
So is that it? Well here’s what Peterson said in the overview paper which accompanied the release of V2 in 1997, and has been prominently featured with that release ever since. It’s not hidden – it is the first para of sec 7 on Metadata:
“For long-term climate stations, there are two types of metadata. The first type is historical metadata that indicate changes with time. Many countries maintain detailed station history files that document relevant station attributes such as the type of thermometer used and when the instruments changed. Such metadata are very difficult if not impossible to acquire on a global basis. Therefore, historical metadata are not available for GHCN.”
It’s right there upfront – not an “admission” at a workshop in 2010.
REPLY: Nick, I daresay there could be firepits under the stations not recorded in metadata and you’d give NOAA a free pass and offer some justification for ignoring the blatant sloppiness. If this were anything else but “save the planet science” it would be thrown out. – Anthony
It is on these flimsy and inadequate data sets that government policy is formulated for which we pay. It is about time the whole so called AGW theory is chucked away. If a theory is set on such data then the theory is wrong. It also violates the laws of thermodynamics. Enough said.
Through my training as an electronic technician I was shown the problems with instrument calibration and drift. My work as an amateur photographer has shown me how difficult it is to control all the variables; too bad climate scientists did not learn these lessons.
If they had, the answer to the question “is the world warming?” would be: to the best of our abilities we believe the answer is yes, but the range of warming is still within the instrumental error margins, so based on that we cannot say for certain that it is.
It now looks like, after thirty years of grand pronouncements, they finally may have to admit that there are problems with the data and there may be poor control of the variables. Something I figured from the beginning.
John Marshall says: “It is on these flimsy and inadequate data sets that government policy is formulated for which we pay. It is about time the whole so called AGW theory is chucked away.”
John, I’d have to disagree. Just because the data is lousy, it doesn’t mean the theory is wrong. The problem is that the known and obvious problems with the data gathering are totally ignored (probably more accurate to say hidden) when the “scientists” express the presumed uncertainty of the data.
What governments are being told is “the data shows”. What they are not being told is that “the data is so lousy that even if we invest billions into instrumentation capable of measuring the small change we are trying to detect, it will be several decades before we really know whether there is any meaningful trend”.
As things roll on we seem to be confirming that far from “The science is settled” we are at “The science hasn’t been done”……
OT but there is a principle at stake here, not that NOAA are necessarily principled! The issue of poor surface station accounting etc will probably never go away, but the principles apply to other areas of science.
Recently back here in the motherland of dear old blighty (I know you colonials long to return), there was much ballyhoo & excitement about a discovery by a naturalist who managed to obtain video footage of a breeding pair of Asian tigers above the valley jungle-scape just below the snow line, in Bhutan I believe. This was indeed sensational news because experts have always insisted that these shy retiring creatures couldn’t possibly exist outside the jungle constraints. The claque & clique of other naturalists were terribly excited about this truly amazing discovery that gave one & all much encouragement about the survival abilities of these wonderful animals, and so on & so forth. I do not disagree one jot! However, or BUT, as it is a big BUT: none of them seemed able to show any humility about this discovery by admitting that “we were wrong!” I know it’s all rather petty in the grand scale of things but I thought it worth pointing out! I also note an article on BBC Online that the official list of 600,000 global species of plant life has been cut down to probably about 400,000 – because much has been counted twice! These guys are supposed to be experts & scientists, for crying out loud!
Nick Stokes
September 21, 2010 at 1:19 am
So, it has been operated by a bunch of cowboys from the outset then – no?
It is interesting that in my own field (software development) the products of working professionals are expected, with ample justification, to be more nearly fit for purpose than similar work by (peer-reviewed) academics. That is not to say that academics don’t do invaluable work in their field, but it certainly shows that one should actually expect valid critiques to come from non-traditional sources, and indeed it would be surprising if academics did do a better job of the vital if unexciting detail work.
From Oxburgh’s HoC evidence transcript:
In Q36:
A: “So it is a pretty difficult business. That is why on the serious publications massive uncertainty bands are associated with temperature reconstructions.”
In Q40:
A: “I, personally, think that in various publications for public consumption those who have used the CRU data and those who have used other climatic data have not helped their case by illuminating the very wide uncertainty band associated with it.”
Is he saying that there is massive uncertainty which should not be conveyed as it does not help the case for a warming? It certainly reads like that.
And in Q38:
Q: “Do you think that the CRU scientists are people of integrity but out of their depth when it comes to statistical analysis?”
A: “You are quite right. We were fortunate in having a very eminent statistician on our panel. …….. And he was really quite serious in saying that this was not the best way to do it. Having said that, even though they had adopted their own methods, he said, actually, had they used more sophisticated and state of the art methods, it wouldn’t have made a great deal of difference.”
Could it be that the panel statistician (Prof Hand) came to the conclusion that the error bands were so massive that state-of-the-art methods wouldn’t have made a great deal of difference? I am sure that Prof Hand did not do a comparison of results from different statistical methods in his head; he certainly did not do them conventionally, yet came to this conclusion: “wouldn’t have made a great deal of difference.”
Is he saying that there are errors so massive as to render statistical analysis methods moot? It certainly reads like that too.
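The "massive uncertainty bands" point can be sketched with a toy ordinary-least-squares calculation. All numbers here are hypothetical: a small assumed trend buried in large year-to-year scatter, to show how the uncertainty band on the fitted trend can be several times wider than the trend itself:

```python
import math
import random

random.seed(1)

TRUE_SLOPE = 0.005   # deg C / year (assumed, for illustration only)
NOISE_SD = 0.5       # deg C of year-to-year scatter (assumed)

years = list(range(30))
temps = [TRUE_SLOPE * t + random.gauss(0.0, NOISE_SD) for t in years]

# Ordinary least squares slope and its standard error
n = len(years)
xbar = sum(years) / n
ybar = sum(temps) / n
sxx = sum((x - xbar) ** 2 for x in years)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, temps)) / sxx
resid = [y - (ybar + slope * (x - xbar)) for x, y in zip(years, temps)]
s2 = sum(r * r for r in resid) / (n - 2)     # residual variance
slope_se = math.sqrt(s2 / sxx)

half_width = 1.96 * slope_se  # ~95% uncertainty band on the trend
print(f"fitted trend {slope:+.4f} ± {half_width:.4f} C/yr (true {TRUE_SLOPE})")
```

When the band is this wide relative to the signal, swapping one fitting method for another indeed "wouldn't have made a great deal of difference": every reasonable estimator lands somewhere inside the same large interval.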
The inscription on that tombstone: does it read ‘Here lies Robust. Victim of abuse but now at peace.’
Mike Haseler says:
September 21, 2010 at 1:41 am
Which surely is what John Marshall is saying:- We don’t damn well know! We are being asked to spend trillions of pounds, dollars, euros — not even on improving the instrumentation but on mitigation and/or rectification of the “problem” on the assumption that the data are sound. The theory may be fine, though personally I doubt it, but the data backing it up are not robust enough to warrant the level of certainty being claimed by the climate science community. Anthony keeps publishing photographs to back up that statement while all the climate science community can do in reply is lie, sulk, insult or waffle.
Nothing that we have experienced in the last forty years has fallen outside the normal range of climatic variation.
From the final conclusion:
While there, of course, is a need for regional land surface temperature measurements including anomalies and long-term trends, for obtaining a global average climate system heat content change the oceans are clearly the more appropriate source of this information.
Air temperature measurements cannot provide any indication of ‘heat content’ due to the huge atmospheric enthalpy variations with humidity. Ocean temperatures, however, are equivalent to heat content, so they can be used. Why do climatologists in both warming and sceptic camps insist on using the incorrect metrics? Surely, it cannot be out of ignorance.
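Ian W's enthalpy point can be illustrated numerically. This is a rough sketch using the standard Magnus approximation for saturation vapour pressure and textbook psychrometric constants; the 30 °C / 20% vs. 90% relative-humidity comparison is an invented example, not measured data:

```python
import math

def moist_enthalpy(t_c, rh, p_hpa=1013.25):
    """Approximate specific enthalpy of moist air, kJ per kg of dry air.
    Magnus formula for saturation vapour pressure; standard constants."""
    es = 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))  # saturation, hPa
    e = rh * es                                          # actual vapour pressure
    w = 0.622 * e / (p_hpa - e)                          # mixing ratio, kg/kg
    # sensible heat of dry air + latent & sensible heat of the vapour
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

# Same thermometer reading, very different heat content:
h_dry = moist_enthalpy(30.0, 0.20)    # 30 C at 20% relative humidity
h_humid = moist_enthalpy(30.0, 0.90)  # 30 C at 90% relative humidity
print(f"{h_dry:.1f} kJ/kg vs {h_humid:.1f} kJ/kg at the same 30 C")
```

Two stations reporting an identical 30 °C can differ by tens of kJ/kg in atmospheric heat content purely from humidity, which is why a 2 m air temperature by itself is a poor proxy for energy.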
Sam the Skeptic says: “Which surely is what John Marshall is saying:- We don’t damn well know!” … all the climate science community can do in reply is lie, sulk, insult or waffle.
Nothing that we have experienced in the last forty years has fallen outside the normal range of climatic variation.
My point is that we shouldn’t use the same bogus “science” as the climate community and short-circuit all proper science to link (bogus) data with (bogus) theory.
The theory of CO2 warming is bogus because there are totally imaginary forces used to suggest the massive warming they need to scare us all with, and the theory would clearly be at odds with the way real science works even if the temperature data were rock solid.
The shambolic temperature data doesn’t prove the theory wrong: you can’t prove a theory wrong with bad data any more than you can prove it right! What the shambolic temperature data does prove is that those creating and using the data as if it were rock solid are cowboy scientists (in the UK we call cheap builders who, e.g., don’t put foundations under buildings “cowboy builders”).
Brownedoff says: September 21, 2010 at 2:11 am
….
No, the data is what it is. The NOAA weren’t responsible for the measurements; they have to do the best analysis they can with the data we have. Imperfections have been acknowledged all along. What do you think they should do?
REPLY: “The NOAA weren’t responsible for the measurements; ”
Nick, that’s BS, pure and simple.
NOAA is entirely responsible for the USHCN stations and the data they produce. They are wholly responsible for: the equipment, the maintenance, the instrumentation, the calibration, training of observers, and the collation/adjustment of the data.
http://www.nws.noaa.gov/om/coop/
In GHCN, they are responsible for the collation/adjustment of the data under one dataset at NCDC. NCDC’s Tom Peterson is “scientist zero” in making those adjustments and choosing what stations make it into the data.
To say NOAA has no responsibility illustrates just how ignorant you are to the problems associated with the network. – Anthony
Mr. Stokes at 3:57: You are right, NOAA is not responsible, and should be defunded.
Nick Stokes says:
September 21, 2010 at 3:57 am
What do you think they should do?
Now that is a hard question…
I will opt for D) from the list of available answers.
What do you think they should do?
A) Go get a proper job
B) Go for a long holiday
C) Go get a life
D) Go forth and multiply
Come on !!!! Hansen et al are not in the least concerned with the quality of the data. They will adjust it until it fits their needs.
On the other hand, if there are questions to the quality of the data, that justifies them adjusting it !!!
A win/win for AGW !!!
or Global Disruption or whatever the heck they call it now
No time to post today, and even less time to think, but I can’t pass this up:
So when milk is homogenized, the milkfat content goes up, right? Does that mean you can get more ice cream if you separate out the cream after homogenization? What if you separate out the cream first, and homogenize what’s left? At first blush that would seem ineffective, but let’s see what NOAA can come up with!
Nick Stokes says:
September 21, 2010 at 3:57 am
“No, the data is what it is. The NOAA weren’t responsible for the measurements; they have to do the best analysis they can with the data we have. Imperfections have been acknowledged all along. What do you think they should do?”
Ahhh, it’s the old “It’s not my dog” explanation.
I would still like to know why my tax dollars are going to this so-called research? Poor data is worse than no data.
And inaccessible data is the same as manipulated data.
A bit of fact-free denigration of NOAA here. Is anyone able to answer the question – what were these “admissions”? Prof P Sr is eventually emboldened to make a list. He gets to a first, which turns out to be a nonsense, which he really should have known. And there doesn’t seem to be a second.
Nick Stokes says:
September 21, 2010 at 3:57 am
Brownedoff says: September 21, 2010 at 2:11 am
….
“No, the data is what it is. The NOAA weren’t responsible for the measurements; they have to do the best analysis they can with the data we have. Imperfections have been acknowledged all along. What do you think they should do?”
Nick – the answer to this question was obvious to me from the first day I followed Anthony’s surface stations project at Roger Pielke Sr.’s original blog. NOAA should have been the agency to go out and investigate ALL of the USHCN sites!! The sad fact of the matter is that people like Karl and Peterson were the ones who pooh-poohed the whole surface stations idea (along with their like-minded buddies at NASA) and treated Anthony like crap. When pictures like the one that accompanies this post started showing up, they started panicking and waving their hands to try to make it go away (remember, UHI doesn’t matter anymore, right?).
By the way, has NOAA published their climate data analysis code? Has their code been validated? Do we have access to ALL of their adjustments, detailed numerical algorithms, and code? If not, what are they afraid of???
That tombstone is NOAA’s. It passed away on November 19th, 2009, the unfortunate day when the “Climate-Gate” emails were released.
John Q Public says: “Poor data is worse than no data.”
Ahh.. well, it all depends if it is poor-poor data or good-poor data.
Radiation decay is “poor” data because it is so random. However, if we can control everything except the one thing that makes it poor, it becomes “good-poor” data.
In theory, global temperature data is poor data which, in the right conditions and with the right frame of mind, could have had most of the variables that made it poor tightly controlled so that, like radiation decay rates, given enough time the good data showed through.
But when you have a bunch of cretins with an agenda manipulating the data to suit their political bias and wish for more grant money … even good data becomes poor!
I second the call for using upper ocean heat content as the primary metric for measuring overall change, whether it be warming or cooling or no trend at all. This is long overdue.
“It’s a trick. Get an axe!”
— Ash, from housewares
Nick Stokes says:
September 21, 2010 at 3:57 am
Brownedoff says: September 21, 2010 at 2:11 am
….
No, the data is what it is. The NOAA weren’t responsible for the measurements; they have to do the best analysis they can with the data we have. Imperfections have been acknowledged all along. What do you think they should do?
Start at the beginning like real scientists do. Otherwise it’s “gigo city”, baybeeee! And as proven.
What is it with government agencies and graves? First we’ve got the mismanagement of Arlington cemetery with old grave stones being used for erosion control (with the names of the dead still visible) and bodies buried who knows where, now we see a weather station built right on top of a grave. That’s got to be a joke!
Nick Stokes says: September 21, 2010 at 1:19 am
This is a bizarre post. I can’t see what these new “admissions” are.
It tells me the best dataset is probably that of the U.S., and it is crap. That admission is news to me, perhaps you already knew that though.
Ian W:
Why do climatologists in both warming and sceptic camps insist on using the incorrect metrics? Surely, it cannot be out of ignorance.
One quite valid sceptic tactic is to “beat them at their own game”: grant/ignore their [probably incorrect] hypothesis, then show that their method is defective in some way or other regardless. I think that’s also Steve McIntyre’s approach in doing his “audits”.
R.S.Brown says:
September 21, 2010 at 12:36 am
The antlers and the rake on top should compensate for
the flagstones at the base… right ?
Oh wow, I thought this was a joke but then I loaded the high resolution image and it’s true — there really are antlers and a rake on top! Talk about white trashy. Now we just need to find some bullet holes and a few discarded cans of Coors Light to make the picture complete.
As in the unfortunate Thompsons’ case, an “odour assessment” of NOAA surface stations would be advisable….LOL!
It’s hard to believe the comedy levels reached by global warmers/climate changers.
However, they stubbornly insist on it. Just yesterday, he who bears the mark on his forehead, Ban Ki-moon, spoke at the UN of the need to fight climate change.
I think as a minimum Adena and Jesse Baldwin deserve having their dignity restored. It’s too late to save Hanksville.
to Simpleseekeraftertruth:
In Q38;
Q: “Do you think that the CRU scientists are people of integrity but out of their depth when it comes to statistical analysis?”
A: “You are quite right. We were fortunate in having a very eminent statistician on our panel. …….. And he was really quite serious in saying that this was not the best way to do it. Having said that, even though they had adopted their own methods, he said, actually, had they used more sophisticated and state of the art methods, it wouldn’t have made a great deal of difference.”
There is that fantasy word again, “sophisticated”!
OE Pocket Dictionary, 1925: “sophisticated” … spoil the simplicity or purity or naturalness of; corrupt or adulterate or tamper with! Sounds like a pretty good definition to me, & if we get high-brows like Lord Oxburgh using these words without fully appreciating the origins of their definitions then what hope is there?
BTW, I have not been a particular fan of the honours system for many years; it has a tendency to reward those who have done the government’s bidding, at least since June 1997!
Check the paint on that station. Or is that dirt?
If a switch from whitewash to paint can change albedo enough to alter temps, what does the browning of that weather box do to its accuracy?
Mike Haseler says:
September 21, 2010 at 1:41 am
I have this theory that antlers on top of thermometer boxes attract grave stones. I have a data set that shows it is true (forget YAD06). We must prevent this sort of thing from happening so we need to institute a tax program to prevent the calamity. The world is pursuing a CO2 agenda as foolish as my speculation. I can see the motive for profit in AGW but I must admit I cannot see the profit in antlers on thermometer boxes.
Is there a way to group any stations that have a long historic record, say 50-100 years of constant use of old thermometers? This nonsense about calibration is a red herring; what we would look for is change, since change is the subject matter of climate change. Criteria would include: using the same thermometer and SOP for time of measurement, no movement of the station over time, and no construction during the period in question.
I am an analytical chemist and we use rate of change when calibration is not available. Non-parametric statistics gives the result.
If we choose those long-standing stations, say at temperate latitudes, one set from the NH, one from the SH, we would only need a small sampling to get a statistical result, perhaps n=7 or n=11. A first pass might look at decadal averages to see if one decade is different from another, then proper trend analysis could be done.
Is this quick-and-dirty look possible? Has it been done and I missed it?
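bubbagyro's rate-of-change idea maps naturally onto the Theil-Sen estimator, a standard non-parametric trend method that needs no absolute calibration, only a consistent instrument. The sketch below uses synthetic placeholder data (an assumed 0.01 °C/yr drift plus deterministic "weather" wiggle), not any real station record:

```python
import math
import statistics

def theil_sen_slope(xs, ys):
    """Median of all pairwise slopes: a robust, non-parametric trend
    estimate insensitive to calibration offsets and outliers."""
    slopes = [(ys[j] - ys[i]) / (xs[j] - xs[i])
              for i in range(len(xs)) for j in range(i + 1, len(xs))]
    return statistics.median(slopes)

# Hypothetical 100-year single-thermometer record: small drift plus
# a bounded deterministic wiggle so the example is reproducible.
years = list(range(100))
temps = [0.01 * t + 0.3 * math.sin(0.7 * t) for t in years]

print(f"estimated trend: {theil_sen_slope(years, temps):.4f} C/yr")
```

Because a constant calibration offset cancels out of every pairwise difference, the method answers exactly the "is there change?" question even when the thermometer's absolute accuracy is unknown — which is the point of restricting the sample to unmoved, same-instrument stations.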
I wonder if someone with a sense of humor wanted someone to walk on their grave everyday. Maybe they wanted to know if they’d still feel it.
☺
Anthony, who can argue this – except that with great coverage only since 2004, what good does it do, except as “weather”? With such a short period, what do we compare it against? When you come down to it, isn’t climate – and AGW – all about comparing?
And with past records sparser and sparser for ocean heat content, don’t we get into proxies more and more (and more quickly, no?) the further back we go, with their attendant uncertainties? Do we just change one set of badly measured data for another?
I don’t care about HOW many sediment cores and coral measurements they make, extrapolating local or global climate from them is still reading into it more than is really there to be read.
Nick,
The irony is that something like Exeter tries to address complaints by making raw data and adjustment processes more easily available and transparent, and the only message that gets promulgated on the skeptic blogs is that the GHCN metadata is often poor (surprise!).
I’d strongly encourage folks to read through all the Exeter presentations. People like John Christy, Ian Jolliffe, Peter Thorne, Matt Menne, and others are all trying to address many of the prior criticisms of the surface temperature record and data availability. You should at least make a modicum of effort to figure out and honestly appraise what their goals are.
Zeke the REAL irony is that the people who have been actively studying the metadata issue (Pielke, myself and others) were not invited to attend or present our findings
Typical shutout as we’ve seen before in climate science and once again Menne plagiarizes my work and that of surfacestations volunteers without so much as giving attribution
What a travesty
bubbagyro,
Nick Stokes took a stab at the approach you are suggesting awhile back: http://moyhu.blogspot.com/2010/05/just-60-stations.html
This really is a can of worms. Several directions this comment can go.
One is that they have KNOWN about undocumented station changes for a long time, probably forever.
Two is that stations known to have undocumented changes (I suppose they are discovered some time later?) should be excluded from the GHCN database.
Three, given that 80% of stations were dropped, what effort was made to assure that the retained ones were “high quality”? One would certainly think that would have been a top priority for retaining a station in the GHCN database. If not, they have no one to blame but themselves for the fine mess they’ve gotten themselves into. (My main reaction is, “And they call themselves scientists????”)
Four, the statement
is total crap. The “small group of” climatologists are not the ones who need to do any of the work to provide this. That should be done by the attendants at the met stations. That is not a monumental task for any of them. Collectively, yes, it is a lot of effort – individually, no, it is only a few minutes (or hours, at most) for someone to track this stuff down and email it in.
It seems like they are just beginning to see the fine mess they are in, and now it will be decades of get-togethers while they determine what to do about it.
And it never would have been addressed at all, if it weren’t for the
And BTW, THAT statement should be its own posting here!
In saying “non traditional scientific sources,” he was obviously NOT referring to the WWF, since he is talking of critiques. And he can’t be referring to CA, because SteveM is part of the peer-reviewed scientific community. Is he actually referring to WUWT? Is he calling WUWT a “scientific source”? (albeit “non traditional”)
If so, it is crossing a line I never thought I’d see.
You didn’t actually EXPECT them to credit you, did you?
It is completely typical of scientists: ONLY attribute other peer-reviewed folks.
In American archeology that has been rampant for over a century – if an amateur finds anything, it is ignored or dismissed (often as fraudulent), and the site itself is put on a sh__ list. (In Europe and Asia that is not nearly so true.)
If non-peer-reviewed work is to be used, they claim the right to “do it right,” – repeating the work themselves, now that someone has brought up the issue. Watch. There will be papers.
“If you point it out, they will come.”
But don’t expect them to thank you. They DO shoot messengers, don’t they?
Anthony,
I agree that they should have invited you and Pielke. However, it might be useful to ask John Christy to do a guest post on his experiences with the conference before dismissing it out of hand.
Questions about the photo:
1. What IS that black tube in the stand on the left?
2. What in the world is with the flower pot tied to the screen support?
Anybody?
If I show my ignorance, no problem here. I just want to know.
REPLY: 1- Rain Gauge 2- Thermometer bling
– Anthony
Anthony (8.50 am) is absolutely right.
The hypocrisy is stunning.
At the surfacetemperatures blog, run by Peter Thorne, various people asked who had been invited to the Exeter meeting. Thorne said he couldn’t release that information. I have just looked again now, and can’t find it, so even those questions and answers have been deleted!
Now, there is a new blog entry (“a few perspectives”), linking to various warmist blogs and their comments about the Exeter meeting. There is no mention of Pielke Sr’s comments. I posted a very brief comment mentioning that Pielke had a blog post about it, but it was censored.
And yet this is from a man, Peter Thorne, who wrote in his talk at the Exeter meeting,
“All voices and perspectives are important”
“A key principal is openness and transparency”.
Given the problems with the US, NZ, Australian and Canadian climate databases documented on this site, and my personal experience with the BC Environment surface water database, I have come to the conclusion that “it’s worse than we thought”. Fixing the problem with “highly automated / fool proof” data collection and storage systems only works if the human resources are provided to ensure that the required QA/QC is also carried out. Otherwise, you just acquire an automated garbage collection and storage system.
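To illustrate the kind of automated QA/QC screening the comment above refers to, here is a minimal sketch (invented readings, not any agency’s actual system) of two basic checks that still need a human to review the flags they raise:

```python
# Illustrative sketch of automated QA/QC screening: a gross range check
# and a stuck-sensor (repeated value) check. Thresholds and data are
# invented for illustration only.

def qc_flags(readings, lo=-60.0, hi=60.0, max_repeats=5):
    """Return a list of (index, reason) flags for suspect readings."""
    flags = []
    run = 1
    for i, t in enumerate(readings):
        if not (lo <= t <= hi):
            flags.append((i, "out of range"))
        if i > 0 and t == readings[i - 1]:
            run += 1
            if run == max_repeats:
                flags.append((i, "possible stuck sensor"))
        else:
            run = 1
    return flags

# Hypothetical daily temperatures with one spike and one flat-lined stretch
data = [12.1, 13.0, 99.9, 11.8, 11.8, 11.8, 11.8, 11.8, 12.4]
print(qc_flags(data))
```

The point of the sketch is that automation only produces the flags; deciding whether a flagged value is a real sensor fault or a genuine extreme still takes a human, which is exactly the staffing problem the comment describes.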
If it is bad in America, what about Africa? From the 1960s onwards, that continent has had a lot of turmoil. During Congo’s 30-odd years of wars, and Angola’s 25 or so, are we seriously to believe that good chaps daily trotted down to their Stevenson shelter, took the readings accurately, and posted them off to the national meteorological centre, where they’ve been safely stored for 50 years? Don’t make me laugh. And in how many other crisis-ridden areas have civil government and all its luxuries collapsed for significant periods? So how do we get temperature figures for much of Africa, or Eastern Europe 1990–2000, for example? Homogenize? Is that a scientific term for ‘make up’? If we really don’t know what’s going on in significant swathes of the world, because there is no meaningful data, how can we possibly say anything about global temperature?
This is a bizarre post. I can’t see what these new “admissions” are. For example
““Automation is the only realistic approach to deal with large datasets””
Is that an admission?? Or anything but the obvious?
Well, it’s obvious, alright.
Um, you do know they are still operating off mostly handwritten B-91 forms, don’t you?
What a wonderful picture to use on Halloween. All it needs is a couple of jack o’ lanterns. Maybe they could hang a ghost off of that black electronic thingy with the great big heat sink on the side of the Stevenson screen.
Zeke Hausfather says:
September 21, 2010 at 8:45 am
Thanks, Zeke. The Stokes approach does not seem to exclude two of the three criteria I listed, especially the stations either changing methodology or siting over time. I realize that in the nineties some old sites were “modernized”, and these sites could be used (alcohol or mercury-bulb thermometers only) up to the point where modernization occurred. Perhaps we can only use this method into the 70s or 80s, but it would give some basis to see past measurement. It would at least be an improvement over the very variable proxy methods we see all over the place.
I am thinking that even if only 7 sites fit the criteria, this would be an interesting exercise. Also, we would not look at the absolute measurement, but at the change. For example, at 1850, set all of the temperatures to zero to normalize the subsequent change, then look at the anomaly each decade from the group until 1950 to assess what climate is doing in the NH and then in the SH. It is likely that the sampling may be too small and the variation in change for these stations will swamp the trend, but maybe not.
Anyone could do this. How do we find “old” stations? Is there a dataset? I know there are several in UK, and Germany, and Switzerland, but don’t know how to find them or their records.
Caveat: One could not “cherry-pick” as the warm earthers have done countless times with proxies, and one needs to set up the criteria blind, before the data is crunched. In this way the true variance could be known.
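For what it’s worth, the normalize-then-average idea proposed above can be sketched in a few lines; the station names and decadal means below are invented purely for illustration:

```python
# Sketch of the commenter's proposal: zero each long-running station
# record at its starting decade (here 1850), then average the anomalies
# (not the absolute temperatures) across stations per decade, so
# differing station baselines drop out. All values are made up.

def decadal_anomalies(series, base_year=1850):
    """Return {decade: temp - temp(base_year)} for one station record."""
    base = series[base_year]
    return {year: round(temp - base, 2) for year, temp in series.items()}

# Hypothetical decadal mean temperatures (deg C) for two old stations
stations = {
    "Station A": {1850: 9.1, 1860: 9.0, 1870: 9.3, 1880: 9.2},
    "Station B": {1850: 7.4, 1860: 7.6, 1870: 7.5, 1880: 7.7},
}

anomalies = {name: decadal_anomalies(s) for name, s in stations.items()}

decades = sorted(next(iter(stations.values())))
mean_anomaly = {
    d: sum(a[d] for a in anomalies.values()) / len(anomalies)
    for d in decades
}
print(mean_anomaly)
```

Because only changes are compared, a warm valley station and a cold mountain station can be pooled without the absolute offset between them mattering, which is the point of the normalization step.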
“To be effective the meeting will have to be relatively small but, as stated above, stringent efforts will be made to entrain input from non-attendees in advance.”
Obviously, “entrain” is Newspeak for sweep promptly under the rug.
entrain, v. [Chemistry] To carry (suspended particles, for example) along in a current.
http://www.thefreedictionary.com/entrain
Why do climatologists in both warming and sceptic camps insist on using the incorrect metrics? Surely, it cannot be out of ignorance.
Three reasons:
— All news is local: We live on the land, not in the water.
— Land Surface Temperature is more volatile than Sea Surface temps. And everyone loves a headline.
— SST records are completely unreliable prior to the ARGO data (2004).
Nick Stokes says:
September 21, 2010 at 1:19 am and September 21, 2010 at 3:57 am
One of the problems of trying to defend the crap work is that in doing so you inadvertently let the cat out of the bag as to how awful it is. You gave a quote from 1997:
“Such metadata are very difficult if not impossible to acquire on a global basis. Therefore, historical metadata are not available for GHCN.”
If these people had any integrity at all, they would have said:
“Such metadata are very difficult if not impossible to acquire on a global basis.
Therefore, because whatever we do in processing the incomplete data, the output will be useless, consequently we shall not proceed with this project.”
In your second post, well, how pathetic;
“No, the data is what it is. The NOAA weren’t responsible for the measurements; they have to do the best analysis they can with the data we have. Imperfections have been acknowledged all along. What do you think they should do?”
“weren’t responsible for the measurements” – did they not do QA then?
“do the best analysis … Imperfections …” – I think you need help from Lord Oxburgh to look up “best” and ” imperfections” in his little dictionary to select the most agreeable definitions to support your absurd statements.
You seem to be getting a hard time above, so no more.
bubbagyro said
“Anyone could do this. How do we find “old” stations? Is there a dataset? I know there are several in UK, and Germany, and Switzerland, but don’t know how to find them or their records.”
There are numerous pre-1850 station records, plus articles on climate history, here at my site
http://climatereason.com/LittleIceAgeThermometers/
tonyb
Instead of using zombie data from the NOAA death spiral station network, we could just measure the rings from a single tree in Mongolia. Problem solved!
Anthony and Zeke Hausfather
Zeke said regarding the Exeter Climate Workshops
September 21, 2010 at 9:15 am
“Anthony,
I agree that they should have invited you and Pielke. However, it might be useful to ask John Christy to do a guest post on his experiences with the conference before dismissing it out of hand.”
Zeke, as has been pointed out, the stated principles of the meeting were:
* “Engendering broad input into the process design from expert communities outside of the traditional Climate Change Community.”
* “All voices and perspectives are important”
* “A key principal is openness and transparency”.
That these principles were not being upheld was the subject of various private emails between myself and Julia Slingo, and myself and Kate Willett, who was on the organising committee. It was quite evident from the tone of the replies (friendly but certainly dismissive) that the top brass at the Met Office did not want to engage in any sort of meaningful discussion with those who disagreed with them.
There were a good dozen people who could and should have been invited but ultimately they felt it was too technical for our poor little brains to cope with.
Incidentally I live 15 miles from the Met office, this is my site
http://climatereason.com/LittleIceAgeThermometers/
In addition there is a whole treasure trove of climate material that has never been uploaded to it that people have sent me from all over the world. Just the sort of data the Met office say they are interested in.
I’m afraid that there was never any intention by Julia Slingo or Peter Thorne to engage with anyone who had another point of view. I would judge Kate Willett to be more open minded though.
After the Exeter Climate workshop ended, I did a brief overview of the information contained in the slide presentations that I found especially interesting. For what it’s worth, here it is:
“Very interesting reading, as it gives a complete rundown on the manner in which many aspects of climate are approached. Well worth reading, as there is lots of data, all collected in one place, together with an explanation of techniques. All in all it is a very good rundown of the current state of the climate industry. I am particularly interested in the information contained in the first paper I have highlighted below these initial general comments.
http://www.surfacetemperatures.org/exeterworkshop2010
General comments;
1) Highly professional contributors who are specialists in their own fields but seem to get very caught up in their work and don’t necessarily see the wider picture
2) Some data very sparse
3) Much of the information is hugely manufactured/altered
4) They don’t seem to know anywhere near as much as they publicly claim; lots of caveats.
5) What were the latest ‘correct’ procedures one year fall out of fashion the next. It still has the appearance of a young science feeling its way.
6) Much of the information is hugely technical/theoretical, and what started out as simple observations are substantially changed by one process or another to end up with something that bears little relation to the original data;
7) Having read this presentation, the only temperature readings I now believe to be original, accurate and not amended for one reason or another are the ones I take from my garden thermometer!
Some highlights; These are taken from the main menu
Interesting to see slides 19, 20, 21, 22.
http://5676430411356704223-a-surfacetemperatures-org-s-sites.googlegroups.com/a/surfacetemperatures.org/home/exeterworkshop2010/7_1Wed_exeter-menne.pdf?attachauth=ANoY7cqdbO35n1iTWjuHeNi0GHr9ArYixYWIb6oDax7aWMpXw8y2JWViG201XpCpvw9dswtZ5SEy4zXmBVIshWzlgOaHqND29hdPW8pC9A4LzqVUtwQJW9BqgRY4Ufr60LzWPnqMuVAgtu6UnwtH6gssy7W-f_XGadZZw2a9JbGqtucQWYPGwL-n66r5sMSyKD2VuvsRxgxuRmboWbVgz3We2SC7fNP1RDIyCYFbcdbUe1x2BVdelzg%3D&attredirects=0
Also interesting to note the ICOADS presentation. Elsewhere it is mentioned there are approximately 2 billion daily observed records (temp, rain, snow, etc.). As slides 6 and 7 show, the number from the ocean is very much smaller and almost non-existent at the start date of HadCRUT. (CET has some 750,000 data points.)
http://5676430411356704223-a-surfacetemperatures-org-s-sites.googlegroups.com/a/surfacetemperatures.org/home/exeterworkshop2010/2_1Tues_ICOADS_Lessonsv4.pdf?attachauth=ANoY7cpnlEXx7r2OOD_A360nfNmcMpt097ViUHJh-bPyNqm7oPzE7MLwYW7eBgysjIs7ELt7kZzgXHVTQChj7AnSQPp666Dhma-VwkZBapffJr2DGyPVzrXFI33nkzfDLzYQ33Kh4pWhYQmQ8T7pzahPwSLPrvye1p6pLJPJsWNZZTSFRKmgWZganNMUidFgF63DZdt0KWwxxMh_gzqCrRJZcqMs8-nJRdfCWobcxmiem586ghxhyz1yF7Xn3USsRJuARoyb022U&attredirects=0
See slides 5 and 6 position of stations
http://5676430411356704223-a-surfacetemperatures-org-s-sites.googlegroups.com/a/surfacetemperatures.org/home/exeterworkshop2010/7_2WedHOMOGENISATION_EXETER.pdf?attachauth=ANoY7crU-LzepZTDMzD5Brb7SUqM1bmTO3lGu1AIMoH7LJZb8LSKfII8odI0fuwB-3ZbtODu92er3aWCFidJ5NWOXsQfHRZp08M9k9nq92yTZ3xdEyymHz37xy-SJzuS42S9YKZOwz6YOjCKFXivg5HvkSl3MDk-kltIi1OD7htIGChYMnJTmFXz_wh4ShyQtpdjkT3mRsKx67fQr81gCiR9d7SKL3hm40jGnMWMgOPUrG0D2MOM1efhzdkv62sWQU-KnEj3P379&attredirects=0
See text of slide 21
http://5676430411356704223-a-surfacetemperatures-org-s-sites.googlegroups.com/a/surfacetemperatures.org/home/exeterworkshop2010/7_4Wed_sst_homog_analysis_JK_NR.pdf?attachauth=ANoY7cq14NBnQVP6nSohu7dljsmvPnoYFk5pU9Q1PICqVicdTWGcG7owhgP2ver_B0oXx8YqAQSA-0KuiREGTLJIYtxWZHF8meJRyxrOI9gY7Tm3AXKcALBr4Ce6TuJ7kPDjJwK_5bhhbjZue4m7S9MI3anVH4Sz_Ra3vpcM2eVReZI3OmP6F4Zr0hZBH8W17TrstkGnctM9lG9wSskywYW-fEJkBFvQAuDyCcDD4_wExStWIfeGONWVzDEI1fWYrv9oquwflK-K&attredirects=0
Phil Jones helped draft this white paper
http://5676430411356704223-a-surfacetemperatures-org-s-sites.googlegroups.com/a/surfacetemperatures.org/home/exeterworkshop2010/11Wed_interp_v1.pdf?attachauth=ANoY7crk3cVOzi-ktoQyIS08Bj-x4EBgLUe55g33tRgmGdcLSL26MKIhKtqLA6ylWDiUn1zeABUGNOWOCLTXwhyaA82OJCUUoMq2LDxr7Fx_Kaj4YEhqxaFlutpo5RYT2aKEB6_NUtDVMIFXjFFOXUvnC6OcYmP5kLCy6aR3K7xCX0ESf0d5e_BLmS7fussBUyfCdKU_NybxzrsQXWS_TAhFaXuxL8Kb7t7DW5zb26WhQ6QC174JIog%3D&attredirects=0
Overall
There seems to be an admission in the recommendation papers that the current system is not as robust as claimed and that they need to do more to restore trust.
Tonyb
Anthony Watts says:
September 21, 2010 at 8:50 am
“Typical shutout as we’ve seen before in climate science and once again Menne plagiarizes my work and that of surfacestations volunteers without so much as giving attribution”
Anthony, these are demonstrably small-minded people. It’s a data collection and storage exercise, not rocket surgery. They are on track to eff it up again, leaving a well-documented trail of their folly: do you really want to be associated with that? IMHO, you stand head and shoulders above them. Bravo!
The ventilation fan attached to the side of the Stevenson screen has just gotta make this site eligible for some prize.
Sited over a tombstone, ahh, The Ghost in the Machine!
Koestler 1967. The human brain has been built upon earlier, more primitive brain structures, and these are the “ghost in the machine”. The author’s theory is that these structures can overcome cognitive logic.
So how many sites are affected? Is the info in the metadata?
“Typical shutout as we’ve seen before in climate science and once again Menne plagiarizes my work and that of surfacestations volunteers without so much as giving attribution.”
I’ve never understood this attitude by NOAA, and I’ve been observing this process since 2007, when the original surfacestations project was initiated. My only conclusion is that NOAA was embarrassed by the state of the USHCN and the fact that they were simply too lazy to do anything about it. So, like many bureaucratic organizations, they pretended the problems weren’t significant, and anyway they were launching a new climate network so the old one didn’t matter. And they really resented someone who was “not in the club” calling their methods into question. A real shame…
tonyb says:
September 21, 2010 at 10:47 am
Thanks, T.B.!
Yours is a great site – I recommend it.
The old temperature data, from 1800 to at least 1980, is very flat or falling for the stations I looked at in Europe. I will dig into the others.
The old stations’ data would seem to highlight the contrast with recent decades, which show a warming trend since the rules changed in 1980 or thereabouts, with urban stations dominating and rural stations being dropped.
In the half dozen “old” stations for 1800-1980 I glanced at, I see a cooling trend after 1940 that does not support the “CO2 causing warming” hypothesis for an industrially active period, however.
All of the “warming” of recent decades is explained, as Mr. Watts and others have repeatedly shown, by replacement of old stations with “new improved” methods and subtraction of irksome rural stations. At least from a cursory look for myself.
Of course, all of the records are replete with so much variance that it makes the AGW hypothesis, as “settled science”, quite absurd and unprovable.
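A cursory “eyeball” trend like the one described above can be made concrete with an ordinary least-squares slope; the decadal means below are invented purely for illustration, not real station data:

```python
# Minimal sketch: fit a least-squares line to a post-1940 segment of a
# station record and read off the slope. Values are hypothetical.

def ols_slope(years, temps):
    """Least-squares slope (deg C per year) of temps against years."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = [1940, 1950, 1960, 1970, 1980]
temps = [10.2, 10.0, 9.9, 9.9, 9.7]  # hypothetical decadal means
slope = ols_slope(years, temps)
print(round(slope * 10, 3), "deg C per decade")
```

A negative slope over the chosen window is what a “cooling trend” claim amounts to numerically, though with only a handful of points the variance caveat in the comment above applies with full force.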
Ian W says:
September 21, 2010 at 3:22 am
Don’t bother Ian. I’ve been trying to convey that message for at least a year now, no-one seems to care.
DaveE.
Someone has confused a weather station with a post-mortem measurement system designed to determine whether or not the occupant is burning in Hell.
Brownedoff says:
‘If these people had any integrity at all, they would have said:
“Such metadata are very difficult if not impossible to acquire on a global basis.
Therefore, because whatever we do in processing the incomplete data, the output will be useless, consequently we shall not proceed with this project.”’
The “project” is the compiling of past temperature readings. These were made by a great variety of authorities, mainly the Met offices of other countries. What you are saying seems to be that they should not be compiled. No-one should look at them, because of imperfections in associated records.
Well, the records will be looked at, and most people do want to know what they say. The NOAA is getting the best information from them that they can.
“weren’t responsible for the measurements” – did they not do QA then?
Yes, they did; that’s their main added value. But QA can’t create historical metadata.
bubbagyro says:
September 21, 2010 at 8:32 am
One problem with constantly using the same thermometers, liquid-in-glass (LIG), is the fact that glass is a supercooled liquid. Over time the glass actually deforms. Look at some really old glass windows and you will be able to actually see the way the glass has started to pool at the bottom.
DaveE.
That little tombstone will help keep the temperature sensors above toasty warm at night.
Why don’t station personnel just get it over with and paint the completely weathered Stevenson Screen black? Then next year, Dr. James “thumbs on the temperature scale” Hansen can truly justify upgrading the historical record without any loss of conscience (assuming he has one).
Bubbagyro
I was interested in the cooling trends evident over 50 years or more, and made this study with a colleague, which was carried here a few weeks ago:
http://wattsupwiththat.com/2010/09/04/in-search-of-cooling-trends/
tonyb
It’s a pedantic point, but I notice that the time stamp on the photos is obviously wrong. They’re all marked as being taken around 7:30 pm PST. That would make it 8:30 pm in Utah. There’s no way that it’s still daylight in late December at 8:30 pm, so either the date is wrong, or the time is wrong. Given the snow on the ground, I suspect the date is about right. The thing is that if people want to challenge the quality of the data in the surface stations project, they will pick on points like this. “If your camera can’t even show the correct time, how can we trust …” bla bla bla. You may want to head off that sort of response by noting in advance which elements of the data have errors, whether the errors are significant, etc.
Oh, now I get it. The station is no longer being used, so it’s a “dead” station and they were respectful enough to put up a marker!
pytlozvejk:
Good eyes, and point well taken. All my pictures that day were about 12 hrs off on the time stamp. Don’t know how that happened. The actual date and time for Hanksville was 8:30 AM PST (9:30 MTN) on December 28. (This was noted on the site survey form in the gallery.) Now if someone can tell me how to correct the time stamp….
John
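For what it’s worth, a 12-hour (AM/PM) camera offset like the one described above can be corrected mechanically. This is a minimal sketch operating on EXIF-style timestamp strings (the example values are illustrative); a tool such as exiftool can apply the equivalent shift to the photo files themselves.

```python
# Sketch: shift an EXIF-style timestamp string by a fixed number of
# hours, e.g. to undo a camera's AM/PM mix-up. Example values are
# illustrative, not the actual Hanksville photo metadata.
from datetime import datetime, timedelta

def shift_timestamp(stamp, hours=-12, fmt="%Y:%m:%d %H:%M:%S"):
    """Parse a timestamp string, shift it by `hours`, and re-format it."""
    corrected = datetime.strptime(stamp, fmt) + timedelta(hours=hours)
    return corrected.strftime(fmt)

# A photo stamped 7:30 PM that was actually taken in the morning:
print(shift_timestamp("2009:12:28 19:30:00"))
```

To rewrite the files in place, something like `exiftool "-AllDates-=12:00" *.jpg` should apply the same 12-hour shift, but verify the shift syntax against the exiftool documentation before running it on originals.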
How many would be willing to go and take photos of their nearest surface temp station? Perhaps a published database would assist.
Really? A Surface Station mounted atop a tombstone? What would that do to the body of data? A grave situation indeed.
Selling a piece of equipment that works is a powerful incentive to sell something that actually, really, works, because if it does, it will sell itself without the need for a 4-page article and a press conference staged by bespectacled white coats. If it doesn’t work, you will simply go out of business if that is all you are selling, or in the old days “git run outa town by the long end of a two-barrelled contraption”.
Saying something works in a well thought out (spinned?) article without having the actual piece of equipment scrutinized by the buying public is an entirely different incentive. This is when we start thinkin’ “snake oil” and “run-em outa town” when we “git ahold” of the physical product and discover it was made from “Gramma’s recipe”.
So to the “invited guests only” folks who attended that meeting, go find another gullible country, you had your chance and missed.
I know where my metadata (is) are, but I don’t know when 12AM is.
So the summary of their meeting was: our parachute is not working? What went so wrong? Every revelation shows that no science was used to produce the AGW product.
I am awed by the overreach and arrogant stupidity, but very much angered by the knowledge that I am paying for nitwits like that. No data means no science. The most damning statement was: we will never release it all. But trust me, I’m an expert.
Tim says:
September 22, 2010 at 1:10 am
How many would be willing to go and take photos of their nearest surface temp station? Perhaps a published database would assist.
It’s been done: surfacestations.org
Are there any Hazen thermometer shelters still standing on the east coast?
I am looking for one that has the ornamental roof on it. I already have the blueprints to build one, but would like to see one in person. There were two types. One had double doors hung vertically, with rectangular vent holes in the top.
The other version had a horizontally hung single door, and the vent holes in the top were three-leaf-clover shaped.
The one photographed over the headstone at the NOAA USHCN COOP station at Hanksville, UT, looks like the ornamental top has rotted away.
thanks
bobby
UPDATE
That shelter at Hanksville, UT is NOT a Hazen thermometer shelter. What was considered to be a rotted-down ornamental top is, upon closer examination, just some junk piled on top of the standard Stevenson screen/cotton region shelter.
bobby