My response to NCDC's op-ed in the New York Times

Andrew Revkin asked me to provide comments on this article of his, in which the National Climatic Data Center was asked to respond to Watts et al. 2012:

A Closer Look at Climate Studies Promoted Before Publication

Here is what I sent to him:

My comments on Thorne’s response are pretty simple.

They still refuse to get out of the office to examine firsthand the condition of the network and come up with hands-on approaches for dealing with station inhomogeneity, but instead focus on trying to spot patterns in data and massage it. In my view this is the wrong approach and the reason that we are in this polarization today. We are conducting a grand experiment, and like any scientific experiment, you have to carefully watch how the data is being measured in the experiment environment, or problems will invalidate the measurement. If Climate Science operated under the same rules as Forensic Science, the compromised data would be tossed out on its ear. Instead, we are told to accept it as fully factual in the court of public opinion.

Until I came along with Watts 2009, they really weren’t looking closely at the issue. The SurfaceStations photography forced them into reaction mode, to do two things.

1. Close the worst USHCN stations, such as Marysville, CA (the station that started it all), Tucson, AZ (the University Science Dept/Weather Service Office that had the USHCN weather station in the parking lot), and Ardmore, OK (the USHCN station on the street corner). There are many others that have been closed.

If they are able to correct the data gathering problems back in the office with algorithms, why do they need to close these stations? Additionally, if they think they can get good data out of these stations with the myriad of adjustments they perform, why did they need to spend millions of dollars on the new Climate Reference Network commissioned in 2008 that we never hear about?

According to communications I received from Texas State Climatologist John Nielsen-Gammon, the National Weather Service is developing plans to eliminate up to half of all COOP network stations (of which USHCN is a subset) as a potential cost-cutting measure.

Some possible reasons: (1) not central to the core mission of the NWS; (2) poor data quality; (3) too much of a public relations headache with people putting embarrassing photographs online.

I would argue not for removal of bad stations, but rather for the replacement of bad stations with well-sited stations, with simultaneous overlapping data collection so that biases can be both measured directly and permanently eliminated. I don’t see anything in the Thorne response that addresses this. To me, all they are doing is trying to put lipstick on a pig.
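To make the idea concrete, here is a minimal sketch in Python of how an overlap period lets you measure a station’s bias directly instead of inferring it statistically. The numbers are made up for illustration; the real exercise would use the actual side-by-side records.

```python
import numpy as np

# Hypothetical monthly means (deg C) from an overlap year in which the old,
# poorly sited station and its well-sited replacement run side by side.
old_station = np.array([12.1, 14.3, 18.9, 23.4, 27.8, 31.2,
                        33.5, 32.9, 29.1, 23.2, 16.8, 12.5])
new_station = np.array([11.4, 13.5, 18.0, 22.3, 26.6, 30.0,
                        32.2, 31.7, 27.9, 22.2, 16.0, 11.8])

# The difference series IS the siting bias, measured directly.
bias = old_station - new_station
print(f"mean bias: {bias.mean():+.2f} C, std dev: {bias.std(ddof=1):.2f} C")
```

Once the offset is measured this way, it is a physical fact about the site rather than a statistical guess.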

2. Attack me without first publishing an appropriate paper intended for peer review, such as the ghost-authored “Talking Points” memo issued by NCDC’s Dr. Thomas Peterson, who wouldn’t put his name on it, yet circulated it to every NOAA manager and the press. If the data from these stations is so strong, and the adjustments and corrections so valid, why the cloak-and-dagger approach?

Note that in the Thorne response, they carefully avoided saying anything about station siting, preferring instead to focus on data manipulations. From my viewpoint, until they start worrying about the measurement environment in which our grand global experiment is being conducted, all they are doing is rearranging data without looking at, and learning from, the environment and history that created it.

Perhaps they should follow the advice of the General Accounting Office report that backed up my work:

GAO-11-800, August 31, 2011, Climate Monitoring: NOAA Can Improve Management of the U.S. Historical Climatology Network – Highlights Page (PDF) | Full Report (PDF, 47 pages) | Accessible Text | Recommendations (HTML)

Finally, let’s spend a few moments looking at another U.S. dataset that doesn’t seem to suffer from issues of the same magnitude: the U.S. Population-Adjusted Temperature Dataset (PDAT) developed by Dr. Roy Spencer, which better handles UHI.

The following plot shows 12-month trailing average anomalies for the three different datasets (USHCN, CRUTem3, and ISH PDAT). Note the large differences in computed linear warming trends.
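For anyone who wants to reproduce that computation, here is a minimal sketch (the random series is only a stand-in for the real USHCN / CRUTem3 / ISH PDAT monthly anomalies):

```python
import numpy as np
import pandas as pd

# Stand-in monthly anomaly series (deg C); substitute the real dataset here.
rng = pd.date_range("1973-01-01", "2012-06-01", freq="MS")
anoms = pd.Series(np.random.default_rng(0).normal(0.0, 0.3, len(rng)), index=rng)

# 12-month trailing average, as plotted.
trailing = anoms.rolling(window=12).mean()
print(f"latest 12-month average: {trailing.iloc[-1]:+.2f} C")

# Least-squares linear trend, converted to deg C per decade.
x = np.arange(len(anoms))
slope, _ = np.polyfit(x, anoms.values, 1)
print(f"computed trend: {slope * 12 * 10:+.3f} C/decade")
```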

Where’s the warming?

Sean Peake

Revkin? Seriously? Why would you bother dealing with him?

Richdo

“Where’s the warming?”
Indeed…Where IS the warming?
Thanks Anthony for everything.

Otter

Who trusts Revkin to cover this impartially?

Ally E.

“Where’s the warming?”
*
Exactly! Well done! 🙂

KnR

Once again, it’s basic stuff taught to any undergraduate or even high school student: you need to use measurement tools in the correct manner or what they tell you loses value. That reality is widely accepted, which is why there are standards in the first place. But what seems to have happened is they think if they throw enough computing power at it the problem goes away. The simple fact is you cannot correct for an error in any meaningful way if you know neither its magnitude nor its direction. Really basic stuff.

More Soylent Green!

I presume Anthony is the author of this post? I don’t see a byline.

AndyG55

The main point is that until these people go out and look at the HISTORIC CHANGES, not only at EACH AND EVERY site, but also in the LOCAL NEIGHBOURHOOD of each site, then they CANNOT POSSIBLY make proper allowances for local factors on the temperature trends.
If you don’t account for these local trends first, the major cause probably being the large jump in urbanisation in the 1970-2000 period, then the whole calculation of so-called “global” land temperatures is just a crock of s*** !
Roy’s PDAT and this paper by Watts et al. go some small way to addressing this issue.
Is it far enough? Probably not.

Justthinkin

“Some possible reasons: (1) not central to the core mission of the NWS; (2) poor data quality; (3) too much of a public relations headache with people putting embarrassing photographs online.”
So my car (evil gas-powered IC) breaks down on the side of the road because I put the wrong size tires on, and instead of replacing them with the right ones, I just burn it there and tell my insurance company everything is hunky-dory, because the offending vehicle has been removed from the system.
Must have been a long day. Can’t seem to wrap my head around this logic.

NZ Groover

“They still refuse to get out of the office, to examine firsthand the condition of the network”
And there you have it in a nutshell. Crap in = crap out (albeit highly processed crap; those computers make it nice and smooth).

What is the yearly cost of a COOP station? Given computer and internet technology today, shouldn’t it be coming down?
Why with the price of technology getting cheaper
and information handling costing next to nothing
and so much at stake in the Climate Change debate,
is NOAA not ADDING to the COOP station network instead of shrinking it?
Sounds like enlarging COOP is a good use of “Stimulus” funds, to me.

foo1

Here’s my own take:
we’re all in the blue.

Resourceguy

Excellent response for reasonable people, but not for those with bias as the main agenda.

beesaman

NOAA’s attitude to data is akin to the Titanic’s crew rearranging the deckchairs; sadly, NOAA hasn’t realised it’s sinking yet…
Or it’s like cooking with garbage: no matter what recipe you try, the results are still going to taste of garbage…

Joachim Seifert

Fine….they cook the books and numbers…..but it still shows the
temp-plateau since 2000 we are on. And this plateau will continue…..
no more warming to come…..

Ian W

What has been exposed is a total lack of governance, quality control and configuration management. Instead of applying quick, easy ‘fixes’ to the data whenever one station looks like the odd man out and then running a standard algorithm over many thousands of observations, each station needs a full quality record.
The quality record would capture siting issues and local information. Any adjustment would be applied specifically to that one station, be justified with fully documented details of why and how the adjustment was being made and by whom, and be signed off by a QA and Configuration Manager as appropriate for that station. The signatories would then be responsible for that change. Any subsequent ‘adjustment’ to that one station would again have to be agreed and justified, and the reason the previous adjustments were not correct fully documented. This way, stations that are odd ones out because of their geographic position or whatever will not be homogenized away when their readings are accurate. The whole worship of the average that climatologists seem to like is not borne out in reality.
It’s not like there are a huge number of stations, even worldwide. It is perfectly possible for government-funded agencies to run a quality system without great expense. But they do not want to, as the repeated massaging of the data by compliant climate scientists plays to their confirmation bias.
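Something as simple as the following sketch would capture what I mean (the field names are mine, purely for illustration):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Adjustment:
    """One documented, signed-off change to a single station's record."""
    applied_on: date
    offset_c: float        # sign and magnitude of the change, deg C
    reason: str            # fully documented why and how
    made_by: str
    approved_by: str       # QA / Configuration Manager sign-off

@dataclass
class StationQualityRecord:
    station_id: str
    siting_notes: str                           # firsthand survey observations
    adjustments: list[Adjustment] = field(default_factory=list)

    def supersede(self, new: Adjustment, why_previous_was_wrong: str) -> None:
        # Any later adjustment must document why earlier ones were not correct.
        new.reason += f" (supersedes prior: {why_previous_was_wrong})"
        self.adjustments.append(new)
```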

theduke

It’s not Revkin so much as it’s the New York Times. They are “the paper of record.” Unfortunately.
I refuse to subscribe. I do subscribe to the WSJ.

cui bono

Puzzling. We can launch satellites to try to measure temperatures. We have the Argo network. We’re surrounded by ‘smart’ things, including ‘smart meters’ for loads of stuff.
Yet on the ground, where it should be easiest, we have a mess which requires constant ‘adjustment’ by sedentary statisticians.
Well done Anthony!

theduke

Apropos of nothing but accuracy, didn’t the General Accounting Office (GAO) change their name to Government Accounting Office?

Theo Goodwin

Plain, simple, and straightforward. In a word, brilliant. Keep their feet to the fire. They are not empiricists and will fight to the death rather than admit that temperature stations should be classified according to the physical characteristics of the heat sinks in which they exist.

Actually that’s wrong too. Government Accountability Office is closer.

Severian

Once in my career, I was principal investigator on an IR&D project at a major defense contractor. The project had already started when I was assigned, and the results from the lab were abysmal: a huge standard deviation across multiple experiments that should have been measuring the same thing. I got myself down to the lab full time, rolled up my sleeves, and set about finding out what the problems were. After a couple of weeks I had solved a lot of contamination and experimental-technique issues, and we got down to SDs of about 5-10% of the mean, with very repeatable results from experiment to experiment.
Who knew that I could have stayed in my office and just made up “adjustments” to the data instead of getting my hands dirty in the lab? What a dummy I was!

RockyRoad

Ah, yes…”manipulation”.
It is insightful to define what “manipulation” means, as in “data manipulation”.
First, to the root word “manipulate”:
ma·nip·u·late (mə-nĭp′yə-lāt)
tr.v. ma·nip·u·lat·ed, ma·nip·u·lat·ing, ma·nip·u·lates
1. To move, arrange, operate, or control by the hands or by mechanical means, especially in a skillful manner: She manipulated the lights to get just the effect she wanted.
2. To influence or manage shrewdly or deviously: He manipulated public opinion in his favor.
3. To tamper with or falsify for personal gain: tried to manipulate stock prices.
4. Medicine. To handle and move in an examination or for therapeutic purposes.
Definitions 2 and 3 are the ones that apply here (although the first one has potential in a devious way, too). How can anybody claim to be a scientist and be in favor of “data manipulation”?
If so, the definition “scientist” no longer applies.

Well said Mr Watts !!!

Exactly what I was saying 24 hours ago on another forum. Shift the measuring sensors into approved locations following the siting guidelines set out by your own country’s (Australia/UK/USA/Europe etc.) and international standards, and we won’t see this corrupted data being used to manipulate the entire world’s opinion on global warming.

Chris Nelli

I bet Borenstein is on the case like white on rice. HAHAHAHAHAHHAHAHHAHAHAHAHAHAHA!

pat

anthony –
a good response, but i’m afraid there’s a new CAGW PR push, starring the “Muller Conversion” because:
30 July: Reuters: UPDATE 1-U.N. carbon credits fall to new record low
CERs fall to new record low of 2.67 euros a tonne
Reporting by Nina Chestney; Editing by Anthony Barker
Benchmark United Nations’ carbon permits fell 7 percent to a fresh record low on Monday, taking their lead from lower prices for European Union emissions allowances and extending losses made last week…
Volume was low at 575 lots traded.
EU allowances (EUAs) for delivery in December 2013 fell by 3.76 percent to 6.66 euros a tonne, narrowing the spread between the two benchmark contracts to 3.99 euros.
The EUA contract dropped below a 6.85 euro support level earlier on Monday which prompted selling, he added…
Most of the demand for CERs comes from the EU ETS, the world’s biggest carbon market, which itself is oversupplied by over 1 billion carbon permits. Many analysts expect the EU scheme to be oversupplied at least through 2020.
http://www.reuters.com/article/2012/07/30/carbon-market-idUSL6E8IU8Q820120730

pat

31 July: SMH: Ben Cubby: Climate sceptics unmoved by scientist’s about-face
”I’m not convinced that [Professor Muller] was ever a sceptic although, of people I respect, there is a couple who do have a decent opinion of him,” said blogger Jo Nova in Perth, the author of The Skeptic’s Handbook…
”If [Professor Muller’s study] removed stations that are near concrete, car parks, airports and airconditioners, and used only the best data we have, I’d be open to accept their warming trend calculations,” she said. ”But even so, it’s another leap entirely to say that just because the world has warmed that it’s man-made.”
The ”urban heat island effect” is real – it has been documented by meteorologists for decades – but the Bureau of Meteorology removes suspect thermometer sites from its climate change measurements. The two groups of data – one for weather forecasting and one for climate – are available for scrutiny on its website…
http://www.smh.com.au/environment/climate-change/climate-sceptics-unmoved-by-scientists-aboutface-20120730-23a6s.html#ixzz229xt3NT6

Mike

I sense a disturbance in the farce.

Bill Illis

If we want to assess global warming then we should use Anthony’s best-site-only analysis.
If we want to assess UHI, then we should use the poorest sites.
If we want to exaggerate global warming, then we should adopt the Menne/NOAA/NCDC approach.
If we want to fix the temperature record, we need new statistical people at the NCDC (which probably requires voting Republicans into the Senate, House and Presidency – politics is part of this you know).

OssQss

Looks like it is going to be an interesting few weeks. They will be coming from all directions. I believe you will have much fun in the process and that many will learn much, inadvertently, from the MSM.
Now, more importantly, when can we see the global surface stations work? They would be much better than the US stations alone, no? It’s not hard to geolocate cell phone pictures nowadays for verification. Ya know, that metadata thing 🙂
Keep up the good work and stay strong.

I assume you hurried up the press release to counter Muller’s BS. Unfortunately, it would probably have been better to wait, as it turns out. He is getting all the MSM press anyway, and the real players know what a joke he is. Your fully vetted paper would have been a more credible response. Of course, like many, I specialize in hindsight….

Gary

How many things can you get wrong in one post?
– it’s not an ‘op-ed’, it’s a quote in a blog.
– Given that CRN was commissioned in 2008 – after a decade of reports describing the need for such a system – how can you claim that no-one was doing anything until Watts (2009)? Does NCDC have time travel too?
– Where’s the warming? I suppose the changes in plant hardiness zones are fictional, the melting glaciers in the Rockies are fictional, spring arriving earlier is fictional, adjacent ocean temperature rises are fictional, the satellite temperature rise over the US is fictional etc. etc.
– whatever happened to taking the science as it comes instead of this clinging to the wreckage of failed ideas?

The GAO is pretty good. They did a similar trashing of the big EPA secondhand smoke study of 1992 — with the same result: the media never mentioned it and 95% of the US population continues to believe wisps of tobacco smoke pose a deadly threat. Very excellent job Anthony and the rest of you! You’ve stuck with the battle, being what I always like to call “the sand in the gears,” and it looks like you’re finally knocking those treads off the enemy tanks! The big trick lies in getting the media behind you.
You probably thought, as we did in the smoking battle, that having the GAO come out for you would be the telling nail in Warmers’ coffin. How did they wriggle out of it? Simply by ignoring its existence? You need to be vocal enough to make sure that doesn’t happen with your current work. Grab every opportunity possible, no matter how difficult, to get your stuff out to the general public and force the opposition to do more than
A)ignore you, or
B) try to pass you off as denialists who disagree with “all the cognizant authorities” or “the scientific consensus,” or
C) try to paint you as simply being stooges/shills/dupes of Big Oil or the Republicans or whatever.
A suggestion: when you show that three-dataset graph, follow it up with a smoothed version (maybe a rolling 1- or 3-year average or some such) so that the casual passersby (who are really the people we are both trying to educate in our battles) will absorb the message at a quick glance. The problem we face is very similar: in both cases most people don’t have the time, energy, education, and motivation to study and evaluate the data carefully. They tend to simply agree with “established authorities” because “that’s what most folks seem to believe” and that’s what’s been set in their minds for the last 20 years; in order to shake up that comfy little thought-world, you need to make your images clear enough that they cut through the haze and motivate the viewer to take a little more time and thought over what you’re saying. If things look “complicated” (such as a very jaggy graph) they won’t take that time, so that’s why the simplifications are important (as long as you always back them up with the real data nearby for anyone wanting to look more closely).
Well done!
🙂
MJM

This is probably the most powerful argument I have heard from WUWT in years. Thank you Anthony.
Imagine any other science that relies on a million precise measurements yet refuses to calibrate its gauges… at all… and simply relies on fudging the data to match the gauges they assume are the good ones.
In any discipline that behaves like that all you can really expect to see from their results is data that trends in the direction the scientists thought the trend would go.

Lady in Red

I’m rather liking Andy Revkin these days. The Team gets annoyed whenever he swims in a pond not flooded with their dogma (I forget who it was who threatened to “cut off access” to all mainstream climate scientists if Revkin persisted in speaking with The Wrong People. It was funny. The poor idiot sent his email threat to a large list, some of whom were not sympathetic to the proposed intellectual knee capping…)
More and more, I think Revkin is an honest broker. He must take The Team seriously: they are the funded mainstream. But, gently, slowly, he is looking for truth. ….Lady in Red

Typical government-run agency. It’s not capable of doing any real work so it hires subcontractors to do everything. The FBI and the CIA are probably the only agencies that do real work.

Chuck Kraisinger

Great response, Anthony. Most of the commenters at Revkin’s disparaged you, but I didn’t see any disparaging what you actually said. I think it’s time I contribute to your tip jar. Thanks.

Max Hugoson

I know I’ll sound like a “broken record”, but I’d LOVE to see the “per decade” breakdown of the temp increase, one decade at a time, from (what?) 1885 or 1900 onward.
I’m sure it tells an interesting story with periods of rise, periods of decline…and (probably, really) decline or flat for the last…what 12, 15, 17 years?
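Here’s roughly the computation I have in mind, in sketch form (random numbers standing in for the real USHCN annual series):

```python
import numpy as np
import pandas as pd

# Invented annual anomalies, 1900-2011; the real exercise would use USHCN.
years = np.arange(1900, 2012)
vals = np.random.default_rng(1).normal(0.0, 0.15, len(years)).cumsum()
anoms = pd.Series(vals, index=years)

# Least-squares trend within each decade, in deg C per decade.
for start in range(1900, 2010, 10):
    chunk = anoms.loc[start:start + 9]
    slope = np.polyfit(chunk.index.values, chunk.values, 1)[0]
    print(f"{start}s: {slope * 10:+.3f} C/decade")
```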
One comes to that age-old question: what is the TIME CONSTANT for the system response? I.e., is a new equilibrium established in one year, two years, three years, decades, or hundreds of years?
Important considerations.
Max

Gunga Din

My opinion: before they dump $500,000,000 into another Solyndra, they should put together a number of mobile teams and set up around the various stations. Maybe 4 or 5 units to a team, equipped with weather instruments. (Anthony has nice units.) Surround the station being “checked” at a distance that would eliminate/minimize siting problems. Set up for a week or two. Come back during different seasons of the year. Move the teams from station to station year round. Compare the mobile data with the station data. That should give a good value to “adjust” each station’s data by, or leave the data alone if it’s accurate.
Those “Green Jobs” would be real and worth it.

Paul K2

Bill Illis says:
July 30, 2012 at 6:39 pm
If we want to assess global warming then we should use Anthony’s best-site-only analysis.

Actually, the Watts et al. draft paper ignored the most modern and best-equipped sites entirely. And that speaks volumes about the quality of the work in this paper.
p.s. The paper published by Menne et al. (2010) did not ignore the best sites.

Lady in Red

Separate from Anthony’s tip jar, poor Andy Revkin could use some Thoughtful, Reasoned Support at the NYTimes for writing about this in the first place.
All The Usual Suspects are trashing him for moving off The Team’s reservation. (It is interesting how thoughtless most of these comments are, without reason. Just: You Should Never Talk To Those People, Revkin!)
Help him.
Remember, folk: It’s the data — and repeatable science — this “profession” needs. If we focus on Feynman, the truth will out…. ….Lady in Red

Swing *THWACK* miles and miles and miles

The GHCN has a large percentage of airports (over 50% last time I looked), and airports figure strongly in the QA process (where readings outside the ‘acceptable’ range are replaced by an average of 5 or so ‘nearby’ ASOS stations – at airports…), so checking station quality ought to start with “Airport Heat Island” effects. Yet it isn’t done.
I did a quick check using Wunderground. It lists 30+ ‘nearby’ stations to the one you select, so you can get the large airport and the surrounding area in one snapshot. Most of the time the temperatures decrease with distance from the airport, sometimes by a few degrees C.
http://chiefio.wordpress.com/2012/07/27/more-airports-hotter-than-nearby-stations/
(It helps to take the ‘sample’ at night so any ‘slow to report’ stations report high instead of low. Even with that it is very clear that Airports are way hot as reported.)
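The check is simple enough that anyone can repeat it; in sketch form (the numbers are invented, for illustration only):

```python
import numpy as np

# Invented late-night readings: an airport ASOS plus surrounding personal
# weather stations, with distance from the runway in km.
distance_km = np.array([0.0, 1.2, 2.5, 4.0, 6.3, 8.8, 12.0])
temp_c      = np.array([24.8, 24.1, 23.3, 22.9, 22.4, 22.1, 21.9])

# A clearly negative slope of temperature vs. distance is consistent with
# an "Airport Heat Island" centered on the tarmac.
slope = np.polyfit(distance_km, temp_c, 1)[0]
print(f"{slope:+.2f} C per km away from the airport")
```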
Until Airports are purged from the USHCN / GHCN (or suitable AHI correction of a couple of degrees C is applied) all they are finding with their statistical manipulation is that tarmac and concrete are hotter than grass fields…
The new method applied by Anthony will be sensitive to that ocean of heat sink around the runway… (Where the thermometer ought to be placed for aviation use; so “density altitude” calculations for take off / landing are correct. The primary purpose IS to report that hot runway… it just makes them unfit for ‘climate science’ use; especially when compared to the grass fields that were there in 1920…)

John F. Hultquist

I’ve just looked at Roy Spencer’s site and the lower-tropospheric temperature from 1979 until June of 2012. There is a blue line for monthly values; it crosses the zero anomaly line so many times that counting them is difficult. Most recently it was at zero or below in 2011 and 2012. The running 13-month average crosses the zero line a dozen times or more. I am not convinced this is going to be catastrophic.
One of your items is “plant hardiness zones.” I’ve lived in a number of places over the last 60 years and have checked. The plants growing in those places are the same as before. Currently, my tomatoes struggle to set fruit and when they do, most of the fruit does not ripen. Still, I try. I’ll tack a hardiness zone update in their midst so they are sure to get the message.

Paul K2

Hilarious post. Great comedy. The lines couldn’t be sequenced any better to demonstrate the hilarity!
First Anthony Watts takes full credit:
Until I came along with Watts 2009, they really weren’t looking closely at the issue.
Then Anthony Watts complains how they spent millions:
… why did they need to spend millions of dollars on the new Climate Reference Network commissioned in 2008 that we never hear about?
Hmmm… didn’t 2008 come before 2009?
And of course, this new network of our most modern and best-equipped and best-sited stations was completely ignored in the Watts et al. draft paper being discussed.
This fact speaks volumes!

Theo Goodwin

Anthony, you have done a great job of framing the debate. You have put NCDC in the painful position of appealing to highly sophisticated magical statistics for the purpose of defending NCDC’s decision not to address siting issues with their temperature stations. You have created such a powerful meme that I fully expect some intelligent politicians to pick it up.

Eric Gamberg

So, is the SD, Cottonwood site straightened out wrt the MMTS location?

A. Scott

FWIW Revkin is IMO doing a decent job here … reporting and obtaining comments from all sides.
I’ll cross-post this from the other thread as it’s relevant here:
A. Scott said:

Revkin has a response – from NCDC at Dot Earth
http://dotearth.blogs.nytimes.com/2012/07/30/a-closer-look-at-climate-studies-promoted-before-publicatio/#more-45511
Fair amount of technical gibberish – I’m sure it means something to pros – but worthless to the public discussion. Mann’s “self-aggrandizement” comment came to mind 😉
I asked Revkin to ask NCDC a couple simple questions:

Andrew … please ask the folks at NOAA/NCDC to answer a couple of simple questions – without the overly technical rhetoric that means nothing to the public …
Have you done any review using the WMO endorsed Leroy 2010 siting specs and if not, why not?
If the answer is no – then a followup – will you do even a small sampling using this siting criteria and report back to the public quickly?
Why are you adjusting highest quality rural, non-airport sites UPWARD to match the poor quality sites?

The Leroy 2010 siting standard is simple common sense. It ADDS the thermal mass – heat sink and heat source – to the siting quality equation. Past siting formulas (Leroy 1999) used only distance with no consideration of the mass of the source or sink.
An example … a lit match, a campfire, a bonfire, a fully engulfed burning building, and a forest fire. All are heat sources. No one would argue the effect would be the same at a fixed distance away from each.
They seem relevant to ask here as well.
What possible reason or justification is there to adjust high quality stations to match poorly sited stations?
And using my example – how could anyone support a claim that the thermal mass – the SIZE – of the nearby heat sources and sinks is not critically important to the quality rating of a surface station?
I would actually think a third criterion is highly important to consider regarding siting quality, in addition to distance and mass: that of predominant winds. The effect/impact is quite different if the station is downwind of a large thermal mass, especially in areas with a strong predominant wind pattern.
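In sketch form, the difference between rating by distance alone (Leroy 1999) and rating by how much of the nearby AREA is heat source or sink (the spirit of Leroy 2010) might look like this; the thresholds below are invented for illustration, not the published values:

```python
def siting_class(heat_fraction_10m: float,
                 heat_fraction_30m: float,
                 nearest_source_m: float) -> int:
    """Illustrative classifier in the spirit of Leroy 2010: the rating turns
    on how much of the area near the sensor is heat source/sink, not just
    the distance to the nearest one. Thresholds are invented."""
    if heat_fraction_10m == 0.0 and nearest_source_m >= 100:
        return 1   # essentially undisturbed site
    if heat_fraction_10m < 0.05 and nearest_source_m >= 30:
        return 2
    if heat_fraction_10m < 0.10:
        return 3
    if heat_fraction_30m < 0.50:
        return 4
    return 5       # sensor effectively sits inside a heat sink

# A match and a bonfire at the same distance rate very differently, because
# the bonfire occupies a far larger fraction of the nearby area.
print(siting_class(0.01, 0.05, 40))   # small source well away -> class 2
print(siting_class(0.30, 0.60, 5))    # large mass close by    -> class 5
```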

Theo Goodwin

Anthony is up on Foxnews.com:
http://www.foxnews.com/scitech/2012/07/30/weather-station-temp-claims-are-overheated-report-claims/
[REPLY: Very nice. It links to the BEST website, Muller’s NYT op-ed and even Mikey Mann’s facebook page, but, oddly enough, NOT to WUWT or the paper. Real smooth. -REP]