Heh.

From Tom Nelson

Email 600, Sept 2007: Watts expose makes NOAA want to change entire USA method

Email 600

[Tom Karl, Director of the National Climatic Data Center] We are getting blogged all over for a cover-up of poor global station and US stations we use. They claim NCDC is in a scandal by not providing observer’s addresses. In any case Anthony Watts has photographed about 350 stations and finds using our criteria that about 15% are acceptable. I am trying to get some our folks to develop a method to switchover to using the CRN sites, at least in the USA.

Hat tip: AJ

===============================================================

Note this email, because it will be something I reference in the future. – Anthony

Steve C
February 5, 2012 12:29 am

Mentioned In Dispatches. That’s only just short of getting a medal! 🙂

MikeH
February 5, 2012 12:51 am

evanmjones says, on February 4, 2012 at 12:31 pm:
“As NOAA has refused to release its adjustment code, we cannot reproduce the adjusted data,…”
Evan, Anthony, etc..
Is there correspondence documenting this? Is there a public statement as to why they refuse to reveal their methodology? Last time I checked, NOAA was a publicly funded operation, U.S. tax dollar$, etc…
Is Senator Jim Inhofe from Oklahoma aware of this? Has he addressed this with NOAA? If he is not aware of this, maybe it should be brought to his attention; he is a strong ally in rationally reviewing the data and the claims of AGW. He is the ranking minority member of the U.S. Senate Committee on Environment and Public Works. Maybe during one of their public meetings he can request NOAA to answer a few questions? He seems to be a person who can get things done…. Maybe some of his constituents, and others, could write to him to see if there is an answer to this?
As to not releasing the data, I remember watching a documentary a few years ago on the mapping of the human genome. There was a government lab that was crunching away with mapping it out.. But it was going to take years..
But a private company started doing the same thing, from the other end of the genetic map. They didn’t want to copy the results the government was producing. Well, since the government lab was publicly funded, its results were public information, and it released its findings regularly. The private company made an early announcement that they had 90% of the human genome mapped, much sooner than expected. They were legally able to include the government data in with their data. So even if each of them had only mapped 45%, the government was not privy to the results from the private company, but the private company could legally use the government’s data. Whether someone views this as ‘right’ or ‘wrong’ is up to the individual, but it was legal. I wonder if NOAA is viewing this the same way: if they release all of their data and math, someone else will pick up the ball, leaving NOAA in the dust…

DR
February 5, 2012 1:18 am

@Ric Werme
Considering you spelled Anthony’s name wrong, I’d say that’s more than good enough!

Larry in Texas
February 5, 2012 1:21 am

Good work, Anthony! Once again, we are reminded about what went wrong at NOAA and NCDC and how you pointed that out. You should remember this – and remind your critics of it over, and over, and over again.

neil swallow
February 5, 2012 1:51 am

An intriguing post by Ed Caryl on the BEST temperature data. Surprised it’s not been mentioned here at WUWT: http://notrickszone.com/2012/01/30/best-fails-to-account-for-population-and-cold-winters/

John Marshall
February 5, 2012 2:21 am

Just as big a problem is the reduction of reporting stations from over 6000 to around 2300 in 1990. Most of the removed stations were in colder regions, and the global average temperature in 1990 leapt up, to be claimed as another doom scenario.

February 5, 2012 2:58 am

Phil Jones says in e-mail #600:
” IDAG is meeting Jan 28-30 in Boulder. You couldn’t make the
last one at Duke. Have told Ferris about IDAG, as I thought DAARWG
might be meeting in Boulder. Jan 31-Feb1 would be very convenient
for me – one transatlantic flight, I would feel good about my carbon
bootprint and I would save the planet!”
That final exclamation mark suggests to me that he knows it is all a load of crap.

Lars P.
February 5, 2012 3:31 am

LazyTeenager says:
February 4, 2012 at 3:57 pm
“I wish you would not just make stuff up. Without the context you can’t interpret this properly so you should not be making up interpretations just to suit your propaganda objectives. It’s fundamentally dishonest.”
LT, from what I see you are not lazy, posting post after post, and judging from your posts you are not a teen either. This is not a teen speaking.
On the web many people like to talk under a pseudonym, which I respect, even if we all know anonymity is just a fable; but I wonder why you would want to give the impression you are just an anonymous teen?

February 5, 2012 5:07 am

Just an amusing thought I had in relation to this…
“one transatlantic flight, I would feel good about my carbon
bootprint and I would save the planet!”

http://en.wikipedia.org/wiki/Nineteen_Eighty-Four
If you want a picture of the future, imagine a boot stamping on a human face—forever.
Not a bad analogy, I think.

Editor
February 5, 2012 5:12 am

DR says:
February 5, 2012 at 1:18 am
@Ric Werme
> Considering you spelled Anthony’s name wrong, I’d say that’s more than good enough!
Yeah, I noticed that right after I posted. In my foggy decision process, I debated posting an Oops or just hoping no one would notice. However, nothing escapes WUWT nation these days. At the very least it’s supporting evidence of the state of the Drambuie bottle.

EternalOptimist
February 5, 2012 5:26 am

To paraphrase Travesty T
Do you consult your blogger about your surface stations? In science, as in any area, reputations are based on knowledge and expertise in a field and on published, peer-reviewed work. If you need surgery, you want a highly experienced expert in the field who has done a large number of the proposed operations.
Wrong answer!!

Frank K.
February 5, 2012 5:35 am

Robert of Ottawa says:
February 4, 2012 at 4:18 pm
“Don’t feed the lazy trolls.”
Don’t worry – he’s off coding up his own adjustments, since everyone knows how to do that…even NOAA…heh! /sarc.
BTW – I do wonder why NOAA refuses to release their adjustment code. Even GISS (after a lot of prodding and public embarrassment) released their code. Of course, we realized WHY GISS were so ashamed of it after they released it…

JohnWho
February 5, 2012 5:37 am

Let me get this straight:
Surface station monitors, covering only the roughly 30% of the earth’s surface that is land, not representing even that 30% equally, and not sited or functioning properly, need to have their data adjusted or manipulated in some manner in order to provide what can never be considered a “global” temperature, since they do not monitor the other 70% of the planet.
Can anything be built upon this “rock”?

Mike Monce
February 5, 2012 6:32 am

Alan Blue said:
“Pretend you have only two thermometers, their accuracy is ±0.1C. They’re ‘CRN1’, meaning there aren’t any barbeques, overhanging trees, nor jet exhaust (a criteria that 85% of the existing stations fail). Assume they’re auto-adjusted for humidity and elevation. (Some of the adjustments make perfect sense.) Put them anywhere you like so long as they’re fifty miles apart.
Now: Pretend the two thermometers are two corners of a rectangular area (projected on a sphere) and provide the average temperature of the -entire- box to an accuracy of 0.01C under all weather conditions.
Making one’s own code to do exactly that is non-trivial.”
I hope I’m just missing the sarcasm here. Not only is it non-trivial, it’s impossible! If the instrumental uncertainty is ±0.1C, then it is literally impossible to obtain an average that has an uncertainty a factor of 10 better than what the instruments can actually read.
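For what it’s worth, a minimal back-of-the-envelope sketch of that point in Python (the numbers are purely illustrative, not from any station record, and the best-case assumption of independent random errors is mine): averaging two instruments only shrinks random error by a factor of the square root of two, nowhere near a factor of 10, and a shared siting bias is not reduced at all.

import math

sigma_instrument = 0.1          # stated instrument uncertainty, in degrees C
n_instruments = 2

# Best case: errors are independent and purely random, so the uncertainty of
# the mean shrinks only by the square root of the number of instruments.
sigma_mean = sigma_instrument / math.sqrt(n_instruments)
print(f"Uncertainty of the two-thermometer mean: {sigma_mean:.3f} C")  # about 0.071 C, not 0.01 C

# Worst case: a shared systematic bias (siting, shelter, calibration) is not
# reduced by averaging at all; it passes straight through to the mean.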

JPeden
February 5, 2012 6:40 am

LazyTeenager says:
February 4, 2012 at 3:14 pm
So we now have evidence that they acknowledge the existing meteorology less than ideal for climate change monitoring and are motivated to improve it.
Ooops. Seems to contradict notions of nefarious behavior. If they wanted to produce fake data to support some climate conspiracy they would not bother to try to improve the network now would they.

“You lazy teenager, when are you going to get a real job and move out of my basement.”
“Ok, Mom, but I’ve been trying really really hard!”
“Lordy…where’s my blood pressure medicine?”

DirkH
February 5, 2012 6:42 am

LazyTeenager says:
February 4, 2012 at 3:22 pm
“evanmjones says
As NOAA has refused to release its adjustment code, we cannot reproduce the adjusted data, and therefore, of course, any results are Scientifically Insignificant.)
———-
I was under the impression that it’s relatively easy to code your own adjustment code and that it has been done multiple times. And they all come much the same conclusions about the temperature trends.
So doesn’t that make access to the NOAA code kind of irrelevant since the actual principles involved are well known.”
This is proof that Lazy Teenager is in fact a lazy teenager without any experience. Lazy Teenager, even if it were obvious HOW to write such adjustment code, we would still not know whether NOAA managed to do it without introducing errors. Some time later in your young life you might get introduced to the craft of computer programming, and you will learn that even the most experienced programmers make mistakes all the time. Obviously you don’t know this yet, otherwise you wouldn’t have written what you wrote.
Of course, this total lack of experience also goes a long way to excuse your entirely unfounded trust in IPCC consensus climate science.

trbixler
February 5, 2012 7:06 am

Anthony as Morpheus, Karl as the agent.
NOAA wants us all to take their pill.

Editor
February 5, 2012 7:06 am

“So we now have evidence that they acknowledge the existing meteorology less than ideal for climate change monitoring and are motivated to improve it.
Ooops. Seems to contradict notions of nefarious behavior. If they wanted to produce fake data to support some climate conspiracy they would not bother to try to improve the network now would they.”

Yet they didn’t. They did not convert to CRN. They did not even add CRN stations to the mix. Using NOAA/NWS or Leroy (1999) standards, <10% of stations are acceptable, and using the new and (very much) improved Leroy (2010) standards, ~15% are acceptable.
All they did was to replace 53 stations with 50 other stations that show greater warming than those they replaced. In proportional terms, they replaced 2% of the USHCN1 stations and that resulted in a 20% warmer trend for USHCN2.
Any comment on nefarious behavior?

Rogelio
February 5, 2012 7:12 am

This posting at Real science is by far the most important or relevant posting concerning the whole AGW scam/science since it started… a must read
http://www.real-science.com/hadcrut-global-trend-garbage
It accounts for all that “cold area” before the ’80s. It’s THE graph that was used and still is, and it’s complete garbage, as I always suspected. This needs to be broadcast far and wide.

Victor Venema
February 5, 2012 7:26 am

LazyTeenager wrote: “I was under the impression that it’s relatively easy to code your own adjustment code and that it has been done multiple times. And they all come much the same conclusions about the temperature trends.”
evanmjones answered: “Your impression is absolutely incorrect. Furthermore, we are not interested in anyone else’s adjustments. We are interested in how NOAA makes the adjustments. In order to find out, we require the – exact – procedures/algorithm, working code (and operating manuals) so that NOAA’s adjustments can be replicated.”
You can download the code used to homogenize (compute the adjustments for) NOAA’s USHCN dataset here:
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/#phas
This page also lists all the articles that describe how the software works. Many other homogenization codes are also freely available.
Homogenization adjustments are computed by comparing a candidate station with its neighboring stations. Nearby stations will have about the same climate signal. If there is a clear jump (relocation, change in instrumentation or weather shelter, etc.) or a gradual trend (urban heat island, growing vegetation, etc.) in the difference time series of two nearby stations, this is unphysical and needs to be corrected.
Rather than going through the code line by line, LazyTeenager is of course right that it is much smarter to try to understand the principle, write your own code, and apply it to the data. If you get about the same result, NOAA and you did a good job; if you find differences, you try to understand why and which of the two codes makes an error. That is how you normally do science.
The NOAA homogenization software was just subjected to a blind test with artificial climate data with inserted inhomogeneities. This test showed that the USHCN homogenization software improves the homogeneity of the data and did not introduce any artificial (warming) trends. The test was blind in that only I knew where the inhomogeneities were inserted and the scientists performing the homogenization did not. More information on this blind test:
http://variable-variability.blogspot.com/2012/01/new-article-benchmarking-homogenization.html
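To make the pairwise principle described above concrete, here is a minimal toy sketch in Python (my own illustration, not NOAA’s code; the two station series and the 0.5 C relocation jump in 1980 are invented): build the difference series between a synthetic candidate and a synthetic neighbor, then scan it for the most likely single step change.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1950, 2011)

# Two nearby stations share the same regional climate signal plus local noise.
regional = 0.01 * (years - years[0]) + 0.3 * rng.standard_normal(years.size)
neighbor = regional + 0.1 * rng.standard_normal(years.size)
candidate = regional + 0.1 * rng.standard_normal(years.size)
candidate = candidate + np.where(years >= 1980, 0.5, 0.0)   # hypothetical relocation jump

# In the difference series the shared climate signal cancels; the jump remains.
diff = candidate - neighbor

def best_breakpoint(d):
    """Return (index, t-statistic) of the most likely single step change."""
    best_k, best_t = None, 0.0
    for k in range(5, d.size - 5):          # require a few years on each side
        a, b = d[:k], d[k:]
        se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
        t = abs(a.mean() - b.mean()) / se
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

k, t = best_breakpoint(diff)
print(f"Most likely break at {years[k]} (t = {t:.1f})")
print(f"Estimated jump size: {diff[k:].mean() - diff[:k].mean():+.2f} C")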

kramer
February 5, 2012 7:43 am

Hey Tom Nelson, I just want to say a BIG THANK YOU for all of your work in posting the climategate emails.
I also want to add that I read somewhere (I think on JoNova’s or Laframboise’s site) that Joe Romm is (I’m paraphrasing) upset over the climategate 2 emails. Nice to know that you’re contributing to this…

Victor Venema
February 5, 2012 7:46 am

JohnWho wrote:
“Let me get this straight: Surface station monitors on the about 30% of the earths surface that is land, but are not equally representing the entire 30%, and are not sited or functioning properly, need to have their data adjusted or manipulated in some manner in order to provide what can never be considered a “global” temperature since they do not monitor the other 70% of the planet. Can anything be built upon this “rock”?”
It is only you guys that focus so much on the surface network, and in most cases even only on the surface network in the USA. Which is not very smart: even if you could show that your national weather service is in a big conspiracy, it would hardly change the global warming signal. America is not that large. If you would like to contribute to science and find a reason why the global warming signal is too strong, you’d better think of reasons that apply globally.
The oceans are lately covered by satellites and, way back into the past, by ocean weather ships and voluntary observing ships: the International Comprehensive Ocean-Atmosphere Data Set (ICOADS).
http://icoads.noaa.gov/
The vertical dimension is covered by the radiosonde network, many of them on islands to cover the atmosphere above the oceans. Keywords: GUAN: GCOS Upper-Air Network and GCOS Reference Upper-Air Network:
http://www.gruan.org

neill
February 5, 2012 7:49 am

Jessie says:
February 4, 2012 at 8:58 pm
After viewing that video, I now know what causes global warming — Jessica Simpson.

Evan Jones
Editor
February 5, 2012 8:23 am

“You can download the code used to homogenize (compute the adjustments for) NOAA’s USHCN dataset here:”
But homogenization is just one step in a long process. There is infilling, SHAP, TOBS, UHI, and equipment, to name just a few. (Not to mention the initial tweaking — outliers, etc.)
And, of course, homogenization is not supposed to increase the trend. After all, if all you are doing is, in effect, providing a weighted averaging of stations within a given radius (or grid box or whatever), the overall average would not change (or at least not much, depending on the weighting procedures).
Yet the adjusted data is considerably warmer than the raw data. Using Steve McIntyre’s 20th Century data: the average USHCN1 station has warmed 0.14C per century using raw data, but +0.59C per century using adjusted data.
The data we used for Fall, et al. (2011), for the 1979 – 2008 (positive PDO) period — using USHCN2 adjustment methods — showed a warming of 0.22 C/decade for raw data and +0.31 C/decade for adjusted data.
“This test showed that the USHCN homogenization software improves the homogeneity of the data and did not introduce any artificial (warming) trends.”
In that case, homogenization is not going to explain the differences. So homogenization code is not terribly relevant to my objections. We need the full and complete adjustment code. The part that creates the — very large — differences between aggregate raw and adjusted data.
As in the infamous Saturday Night Live “paraquat test”: It’s light! To conduct this test, we need an ounce. A FULL and COMPLETE ounce.
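To show what that raw-versus-adjusted comparison looks like in practice, here is a minimal sketch in Python (the series below are synthetic and merely echo the rough magnitudes quoted above; they are not USHCN data): fit a least-squares trend to an aggregate raw series and to an adjusted series, and compare.

import numpy as np

def trend_per_decade(years, anomalies):
    """Ordinary least-squares slope, expressed in degrees C per decade."""
    return np.polyfit(years, anomalies, 1)[0] * 10.0

years = np.arange(1979, 2009)
rng = np.random.default_rng(1)

# Synthetic aggregate series: roughly 0.22 C/decade "raw", with adjustments
# adding roughly another 0.09 C/decade on top (illustrative numbers only).
raw = 0.022 * (years - years[0]) + 0.05 * rng.standard_normal(years.size)
adjusted = raw + 0.009 * (years - years[0])

print(f"raw trend:      {trend_per_decade(years, raw):+.2f} C/decade")
print(f"adjusted trend: {trend_per_decade(years, adjusted):+.2f} C/decade")
print(f"difference:     {trend_per_decade(years, adjusted - raw):+.2f} C/decade")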

Evan Jones
Editor
February 5, 2012 8:37 am

“Rather than going through the code line by line, LazyTeenager is of course right that it is much smarter to try to understand the principle, write your own code, and apply it to the data.”
That statement is such an enormity it needs to be addressed separately.
Actually, for Independent Review purposes, not only does one have to understand the underlying principles, but one also has to go through the code line by line and be able to run the code and get the exact same results as NOAA.
For example, are they using the TOBS record from the actual B-91 and B-44 forms, or are they using some sort of mishmash regional guesstimate procedure? “Underlying principles” are not going to answer that one. Only a line-by-line review is going to shed any light on that.
Being “much smarter” would land you in the hoosegow if you tried to pull that in the private sector.