The scientific method is at work on the USHCN temperature data set

Temperature is such a simple finite thing. It is amazing how complex people can make it.

commenter and friend of WUWT, ossqss at Judith Curry’s blog

Sometimes, you can believe you are entirely right while simultaneously believing that you’ve done due diligence. That’s what confirmation bias is all about. In this case, a whole bunch of people, including me, got a severe case of it.

I’m talking about the claim made by Steve Goddard that 40% of the USHCN data is “fabricated”, which I and a few other people thought was clearly wrong.

Dr. Judith Curry and I have been conversing a lot via email over the past two days, and she has written an illuminating essay that explores the issue raised by Goddard and the sociology going on. See her essay:

http://judithcurry.com/2014/06/28/skeptical-of-skeptics-is-steve-goddard-right/

Steve Goddard aka Tony Heller deserves the credit for the initial finding; Paul Homewood deserves the credit for taking the finding and establishing it in a more comprehensible way that opened closed eyes, including mine, in his post entitled Massive Temperature Adjustments At Luling, Texas. Along with that is his latest follow-up, showing the problem isn’t limited to Texas but extends to Kansas as well. And there’s more about this below.

Goddard early on (June 2) gave me his source code that made his graph, but I couldn’t get it to compile and run. That’s probably more my fault than his, as I’m not an expert in the C++ language. Had I been able to, things might have gone differently. Then there was the fact that the problem Goddard noted doesn’t show up in GHCN data, and I didn’t see it in any of the data we had for our USHCN surface stations analysis.

But, the thing that really put up a wall for me was this moment on June 1st, shortly after getting Goddard’s first email with his finding, which I pointed out in On ‘denying’ Hockey Sticks, USHCN data, and all that – part 1.

Goddard initially claimed 40% of the STATIONS were missing, which I said right away was not possible. It raised my hackles, and prompted my “you need to do better” statement. Then he switched the text in his post from stations to data while I was away for a couple of hours at my daughter’s music recital. When I returned, I noted the change, with no acknowledgment of it on his post, and that is what really put up the wall for me. He probably looked at it like he was just fixing a typo; I looked at it like it was sweeping an important distinction under the rug.

Then there was my personal bias over previous episodes where Goddard had made what I considered grievous errors and refused to admit to them. There was the episode claiming CO2 freezes out of the air in Antarctica, later shown to be impossible by experiment; the GISStimating 1998 episode; and the comment thread where, once the old data is checked, it is clear Goddard/Heller’s claim doesn’t hold up.

And then just over a month ago there was Goddard’s first hockey stick shape in the USHCN data set, which turned out to be nothing but an artifact.

All of that added up to a big heap of confirmation bias. I was so used to Goddard being wrong that I expected it again. But this time Steve Goddard was right, and my confirmation bias prevented me from seeing that there was in fact a real issue in the data, and that NCDC has dead stations reporting data that isn’t real: mea culpa.

But that’s the same problem many climate scientists have: they are used to some skeptics being wrong on some issues, so they put up a wall. That is why the careful and exacting analyses we see from Steve McIntyre should be a model for us all. We have to “do better” to make sure that the claims we make are credible, documented, phrased in non-inflammatory language, understandable, and most importantly, right.

Otherwise, walls go up, confirmation bias sets in.

Now that the wall is down, NCDC won’t be able to ignore this. Even John Nielsen-Gammon, who was critical of Goddard along with me in the PolitiFact story, now says there is a real problem. So does Zeke, and we have all sent or forwarded email to NCDC advising them of it.

I’ve also been on the phone Friday with the assistant director of NCDC and chief scientist (Tom Peterson), and also with the person in charge of USHCN (Matt Menne). Both were quality, professional conversations, and both thanked me for bringing it to their attention.  There is lots of email flying back and forth too.

They are taking this seriously; they have to, as the final data as currently presented for USHCN is clearly wrong. John Nielsen-Gammon sent me a cursory analysis of Texas USHCN stations, noting he found a number of stations that had “estimated” data in place of actual good data that NCDC has in hand, and that appears in the RAW USHCN data file on their FTP site:

From: John Nielsen-Gammon

Sent: Friday, June 27, 2014 9:27 AM

To: Anthony

Subject: Re: USHCN station at Luling Texas

 Anthony –
   I just did a check of all Texas USHCN stations.  Thirteen had estimates in place of apparently good data.
410174 Estimated May 2008 thru June 2009
410498 Estimated since Oct 2011
410639 Estimated since July 2012 (exc Feb-Mar 2012, Nov 2012, Mar 2013, and May 2013)
410902 Estimated since Aug 2013
411048 Estimated July 2012 thru Feb 2014
412906 Estimated since Jan 2013
413240 Estimated since March 2013
413280 Estimated since Oct 2012
415018 Estimated since April 2010, defunct since Dec 2012
415429 Estimated since May 2013
416276 Estimated since Nov 2012
417945 Estimated since May 2013
418201 Estimated since April 2013 (exc Dec 2013).

What is going on is that while the RAW data file has the actual measurements, for some reason the final data they publish doesn’t get the memo that good data is actually present for these stations, so it “infills” with estimated data derived from surrounding stations. It’s a bug, a big one. And when Zeke did a cursory analysis Thursday night, he discovered it was systemic to the entire record: up to 10% of stations have “estimated” data, spanning over a century:
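The kind of check Zeke and John Nielsen-Gammon ran can be sketched roughly as follows. This is a toy illustration in Python, not their actual code; the station IDs and values below are made up for the example:

```python
# Hypothetical parsed records: {(station_id, year, month): (value, flag)}
# where flag "E" marks an estimated value and flag None a real measurement.
raw = {
    ("410498", 2012, 1): (512, None),   # real measurement exists...
    ("410498", 2012, 2): (640, None),
    ("999999", 2012, 1): (None, None),  # genuinely missing month
}
final = {
    ("410498", 2012, 1): (530, "E"),    # ...but final carries an estimate
    ("410498", 2012, 2): (655, "E"),
    ("999999", 2012, 1): (498, "E"),    # legitimate infill of a real gap
}

def estimated_despite_raw(raw, final):
    """Return keys where the final file holds an 'E' (estimated) value
    even though a real measurement exists in the raw file."""
    return sorted(
        key for key, (value, flag) in final.items()
        if flag == "E" and raw.get(key, (None, None))[0] is not None
    )

# The two 410498 months are flagged; the 999999 infill is not.
print(estimated_despite_raw(raw, final))
```

The point of the check is the distinction it encodes: infilling a genuinely missing month is one thing, but overwriting a measurement NCDC already has in hand is the bug described above.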

Analysis by Zeke Hausfather

And here is the real kicker, “Zombie weather stations” exist in the USHCN final data set that are still generating data, even though they have been closed.

Remember Marysville, CA, the poster child for bad station siting? It was the station that gave me my “light bulb moment” on the issue of station siting. Here is a photo I took in May 2007:

marysville_badsiting[1]

It was closed just a couple of months after I introduced it to the world as the prime example of “How not to measure temperature”. The MMTS sensor was in a parking lot, with hot air from a/c units from the nearby electronics sheds for the cell phone tower:

MarysvilleCA_USHCN_Site_small

Guess what? Like Luling, TX (which is still open but getting estimated data in place of the actual data in the final USHCN file), Marysville is still producing estimated monthly data, marked with an “E” flag, even though NOAA’s own metadata marked it closed in 2007:

USH00045385 2006  1034E  1156h  1036g  1501h  2166i  2601E  2905E  2494E  2314E  1741E  1298E   848i     0
USH00045385 2007   797c  1151E  1575i  1701E  2159E  2418E  2628E  2620E  2197E  1711E  1408E   846E     0
USH00045385 2008   836E  1064E  1386E  1610E  2146E  2508E  2686E  2658E  2383E  1906E  1427E   750E     0
USH00045385 2009   969E  1092E  1316E  1641E  2238E  2354E  2685E  2583E  2519E  1739E  1272E   809E     0
USH00045385 2010   951E  1190E  1302E  1379E  1746E  2401E  2617E  2427E  2340E  1904E  1255E  1073E     0
USH00045385 2011   831E   991E  1228E  1565E  1792E  2223E  2558E  2536E  2511E  1853E  1161E   867E     0
USH00045385 2012   978E  1161E  1229E  1646E  2147E  2387E  2597E  2660E  2454E  1931E  1383E   928E     0
USH00045385 2013   820E  1062E  1494E  1864E  2199E  2480E  2759E  2568E  2286E  1807E  1396E   844E     0
USH00045385 2014  1188E  1247E  1553E  1777E  2245E  2526E  -9999  -9999  -9999  -9999  -9999  -9999

Source:  USHCN Final : ushcn.tavg.latest.FLs.52i.tar.gz

Compare to USHCN Raw : ushcn.tavg.latest.raw.tar.gz

In the USHCN V2.5 folder, the readme file describes the “E” flag as:

E = a monthly value could not be computed from daily data. The value is estimated using values from surrounding stations
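For readers who want to scan these files themselves, record lines like the Marysville ones above can be picked apart with a short script. This is a sketch only: it assumes the whitespace-collapsed layout shown above, with monthly values in hundredths of a degree, a single trailing letter as the flag, and -9999 for missing:

```python
def parse_ushcn_line(line):
    """Parse one USHCN monthly record line as quoted above: station id,
    year, then 12 monthly values, each optionally suffixed with a
    single-letter flag ('E' = estimated); -9999 means missing."""
    tokens = line.split()
    station, year = tokens[0], int(tokens[1])
    months = []
    for tok in tokens[2:14]:
        flag = tok[-1] if tok[-1].isalpha() else None
        raw_value = int(tok[:-1]) if flag else int(tok)
        value = None if raw_value == -9999 else raw_value / 100.0
        months.append((value, flag))
    return station, year, months

line = ("USH00045385 2013   820E    1062E    1494E    1864E    2199E    "
        "2480E 2759E    2568E    2286E    1807E    1396E     844E       0")
station, year, months = parse_ushcn_line(line)
# Every month of 2013 for Marysville carries the 'E' (estimated) flag.
```

Running this over a whole state’s final file and counting “E” flags reproduces the kind of tally John Nielsen-Gammon sent for Texas.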

There are quite a few “zombie weather stations” in the USHCN final dataset, possibly up to 25% of the 1218 stations in the network. In my conversations with NCDC on Friday, I’m told these were kept in and “reporting” as a policy decision to provide “continuity” of data for scientific purposes. While there “might” be some justification for that sort of thinking, few people know about it; there’s no disclaimer or caveat in the USHCN FTP folder at NCDC or in the readme file that describes this. They “hint” at it, saying:

The composition of the network remains unchanged at 1218 stations

But that really isn’t true: some USHCN stations out of the 1218 have been closed and are no longer reporting real data, but instead are reporting estimated data.

NCDC really should make this clear. While it “might” be OK to produce a data file that has estimated data in it, not everyone is going to understand what that means, or that stations long dead are producing estimated data. NCDC has failed to notify the public, and even their colleagues, of this. Even the Texas State Climatologist, John Nielsen-Gammon, didn’t know about these “zombie” stations until I showed him. If he had known, his opinion might have been different on the Goddard issue. When even professional people in your sphere of influence don’t know you are infilling data for dead weather stations like this, you can be sure that your primary mission to provide useful data is FUBAR.

NCDC needs to step up and fix this along with other problems that have been identified.

And they are. I expect some sort of statement, and possibly a correction, next week. In the meantime, let’s let them do their work and go through their methodology. It will not be helpful to ANYONE if we start beating up the people at NCDC ahead of such a statement and/or correction.

I will be among the first, if not the first, to know what they are doing to fix the issues, and as soon as I know, so will all of you. Patience and restraint are what we need at the moment. I believe they are making a good faith effort, but as you all know the government moves slowly; they have to get policy wonks to review documents and all that. So, we’ll likely hear something early next week.

These lapses in quality control, and the thinking that infilling estimated data for long-dead weather stations is acceptable, are the sort of thing that happens when the only people you interact with are inside your sphere of influence. The “yeah, that seems like a good idea” approval mumble probably resonated in that NCDC meeting, but it was a case of groupthink. Imagine The Wall Street Journal providing “estimated” stock values for long-dead companies to provide “continuity” of its stock quotes page. Such a thing would boggle the mind, and the SEC would have a cow, not to mention readers. Scams would erupt trying to sell stock in these long-dead companies: “It’s real, see, it’s reporting value in the WSJ!”

It often takes people outside of climate science to point out the problems they don’t see, and skeptics have been doing it for years. Today, we are doing it again.

For absolute clarity, I should point out that the RAW USHCN monthly datafile is NOT being infilled with estimated data, only the FINAL USHCN monthly datafile. But that is the one that many other metrics use, including NASA GISS, and it goes into the mix for things like the NCDC monthly State of the Climate Report.

While we won’t know until all of the data is corrected and new numbers are run, this may affect some of the absolute temperature claims made in SOTC reports, such as “warmest month ever”, 3rd warmest, etc. The magnitude of such shifts, if any, is unknown at this point. The long-term trend will probably not be affected.

It may also affect our comparisons between raw and final adjusted USHCN data we have been doing for our paper, such as this one from our draft paper:

Watts_et_al_2012 Figure20 CONUS Compliant-NonC-NOAA

The exception is BEST, which starts with the raw daily data, but they might be getting tripped up into creating some “zombie stations” of their own by the NCDC metadata and resolution improvements to lat/lon. The USHCN station at Luling Texas is listed as having 7 station moves by BEST (note the red diamonds):

Luling-TX-BEST

But there have really been only two, and the station has been just like this since 1995, when it was converted to MMTS from a Stevenson Screen. Here is our survey image from 2009:

Luling_looking_north

Photo by surfacestations volunteer John Warren Slayton.

NCDC’s metadata only lists two station moves:

image

As you can see below, some improvements in lat/lon accuracy can look like a station move:

image

http://www.ncdc.noaa.gov/homr/#ncdcstnid=20024457&tab=LOCATIONS

image

http://www.ncdc.noaa.gov/homr/#ncdcstnid=20024457&tab=MISC

Thanks to Paul Homewood for the two images and links above. I’m sure Mr. Mosher will let us know if this issue affects BEST or not.

And there is yet another issue: the recent change to using something called “climate divisions” to calculate national and state temperatures.

Certified Consulting Meteorologist and Fellow of the AMS Joe D’Aleo writes in with this:

I had downloaded the Maine annual temperature plot from NCDC Climate at a Glance in 2013 for a talk. There was no statistically significant trend since 1895. Note the spike in 1913 following super blocking from Novarupta in Alaska (similar to the high-latitude volcanoes in the late 2000s, which helped with the blocking and the maritime influence that spiked 2010, as snow was gone by March with a steady northeast maritime Atlantic flow). 1913 was close to 46F, and the long-term mean just over 41F.

 CAAG_Maine_before

Seemingly in a panic late this frigid winter, big changes occurred at NCDC. I wanted to update the Maine plot for another talk and got this from NCDC CAAG:

CAAG_maine_after

Note that 1913 was cooled nearly 5 degrees F and no longer stands out. There is a warming of at least 3 degrees F since 1895 (they list 0.23/decade), and the new mean is close to 40F.

Does anybody know what the REAL temperature of Maine is/was/is supposed to be? I sure as hell don’t. I don’t think NCDC really does either.

In closing…

Besides moving toward a more accurate temperature record, the best thing about all this hoopla over the USHCN data set is the PolitiFact story where we had all these experts lined up (including me as the token skeptic) who stated without a doubt that Goddard was wrong, and rated the claim “Pants on Fire”.

They’ll all be eating some crow, as will I, but now that I have Gavin for dinner company, I don’t really mind at all.

When the scientific method is at work, eventually, everybody eats crow. The trick is to be able to eat it and tell people that you are honestly enjoying it, because crow is so popular, it is on the science menu daily.

copernicus34
June 28, 2014 10:33 pm

Mr Watts, as I’ve stated in the past, you are to be commended for your tireless work in this field. However, many of your readers, and Heller’s (aka Goddard’s), could see that he was on to something. Maybe some of us don’t have the bias that you claim against him. But I have to say, with all sincerity, it is you, sir, that need to do better. You need to open your eyes to the potential malfeasance of some in the climate community (especially those that work at the locations you are talking about here). Having said this, thank you for this post; it was so important to have this all displayed above the fold. It’s a credit to you, and the rest of the skeptics, that all this is open and free for all to follow, instead of what I believe could be an underground element working in the shadows within the climate community. Something is amiss here, and we shouldn’t be so quick to just write it off as ‘they are doing the best they can’. I know that’s not what you said, but it’s implied.

ferdberple
June 28, 2014 11:01 pm

http://notalotofpeopleknowthat.wordpress.com/2014/06/28/ushcn-adjustments-in-kansas
Paul Homewood appears to confirm that Goddard was correct on more issues than the zombie stations. Kansas was adjusted about 1/2 degree upwards in 2013. Man-made warming indeed.

June 28, 2014 11:33 pm

Joseph Bastardi says at 2:31 pm
One thing that keeps getting clearer to me is the amount of time, treasure etc wasted on 1/100th of the GHG, .04% of the atmosphere which [CO2] has 1/1000th the heat capacity of the ocean and next to the affects of the sun, oceans and stochastic events probably can not be measured outside the noise, is a giant red herring
— ——— ———— ————– ————— —————— ———
Good points all around, but a huge point about the ocean. Plus, with logarithmic absorption it is postulated that CO2 has essentially zero effects after perhaps 50ppm. And the actual evidence on CO2 affecting temperature? Not good: https://www.youtube.com/watch?v=WK_WyvfcJyg&info=GGWarmingSwindle_CO2Lag

richardscourtney
June 29, 2014 12:02 am

Anthony Watts:
Although I agree with much of what you have written here, I write to disagree with a point you make at June 28, 2014 at 4:40 pm.
You say

Fixing a few missing datapoints in a month with FILNET to make the record useable is one thing, wholesale reanimation of dead weather stations for years is something else altogether.

Sorry, but NO!
The matter is one of acceptable scientific procedure, and it is binary; i.e. it is right or it is wrong.
Changing one datum may have negligible effect but replacing 100% of the data certainly would have significant effect. If ‘some’ use of FILNET is “one thing” then how much use of FILNET is “something else altogether”? At present there is no determined answer to that question but there are an infinite number of possible opinions of “how much” is not significant. Politics is about opinions but science is about determination of the nearest possible approximation to reality.
The underlying problem is that there is no agreed definition of global average surface temperature anomaly (GASTA) and, therefore, any method to determine GASTA is correct (at the very least, it is impossible for it to be wrong). When GASTA is not defined, then data assumed by any use of FILNET, or anything else, is correct; i.e. it cannot be shown to be wrong, for the same logical reason that an asserted name of the Pope’s wife cannot be shown to be wrong.
Richard

June 29, 2014 12:02 am

Anthony, if by “agree to disagree,” you mean I say you’re misrepresenting the Polifact piece and Zeke’s post, and you simply hand-wave me away, sure. I laid out what those pieces were about, with quotes to back it up. If I’m wrong, it should be easy to show.
I don’t know why a simple point of what topics were covered in sources should require us “agree to disagree.” Even people who violently disagree should be able to agree what topics a source covers.
REPLY: And I don’t agree with your assessment. The article mixed topics. It conflated my quote on Goddard’s initial issue (his stick graph/missing data) with a later second issue. My experience with the entire affair differs from yours. What you are missing is the fact that post-2000, the amount of missing data, missing stations, and infills has increased. It all goes back to the first issue, in Goddard’s first graph. I’m sorry that you can’t see this.
I’ll point out that I don’t agree with your latest headline “Cook et al Lie Their Faces Off” but you don’t see me there trying to tell you how to change your story because I disagree with the title. It’s your blog, you get to make your own editorial choices. Same here, even if you disagree with those choices. -Anthony

Nick Stokes
June 29, 2014 12:40 am

ferdberple says: June 28, 2014 at 11:01 pm
“Kansas was adjusted about 1/2 degree upwards in 2013. man made warming indeed.”

All Paul Homewood has done is give a table of USHCN final-raw for stations in one state in one month. No news there.

Nick Stokes
June 29, 2014 12:54 am

Paul Homewood says: June 28, 2014 at 3:23 pm
“Does Nick Stokes really believe these are all due to faulty sensors?”

No. All you have done is given the difference between USHCN final (the file ..FLs.52i.avg) and raw (the file ..raw.avg) in the USHCN dataset. They include TOBS, and all the adjustments that have been endlessly described. There is nothing new there.

CEH
June 29, 2014 1:08 am

I found this blogpost about thermometer accuracy quite interesting
http://pugshoes.blogspot.se/2010/10/metrology.html

Greg Goodman
June 29, 2014 1:21 am

Luling TX , Mayland, Joe d’Aleo’s finding massive adjustments in Maine ….
If correcting this “bug” does not affect long term trends, there’s a whole other, bigger, problem to be dealt with.

richard verney
June 29, 2014 1:28 am

It is clear beyond doubt (see this and the other recent article on Steve Goddard’s claim regarding missing data and infilling, plus the poor siting issues that the surface station survey highlighted) that the land-based thermometer record is not fit for purpose. Indeed, it never could be, since it has always been strained well beyond its original design purpose. The margins of error far exceed the very small signal that we are seeking to tease out of it.
If climate scientists were ‘honest’ they would, long ago, have given up on the land-based thermometer record and accepted that the margins of error are so large that it is useless for the purposes to which they are trying to put it. An honest assessment of that record leads one to conclude that we do not know whether it is warmer today than it was in the 1880s or in the 1930s, but as far as the US is concerned, it was probably warmer in the 1930s than it is today.
The only reliable instrument temperature record is the satellite record, and that also has a few issues, and most notably the data length is presently way too short to be able to have confidence in what it reveals.
That said, there is no first-order correlation between the atmospheric level of CO2 and temperature. The proper interpretation of the satellite record is that there is no linear temperature trend, merely a one-off step change in temperature in and around the Super El Nino of 1998.
Since no one suggests that the Super El Nino was caused by the then-present level of CO2 in the atmosphere, and since there is no known or understood mechanism whereby CO2 could cause such an El Nino, the take-home conclusion from the satellite data record is that climate sensitivity to CO2 is so small (at current levels, i.e., circa 360ppm and above) that it cannot be measured using our best and most advanced and sophisticated measuring devices. The signal, if any, from CO2 cannot be separated from the noise of natural variability.
I have always observed that talking about climate sensitivity is futile, at any rate until such time as absolutely everything is known and understood about natural variation, what are its constituent forcings and what are the lower and upper bounds of each and every constituent forcing that goes to make up natural variation.
Since the only reliable observational evidence suggests that sensitivity to CO2 is so small, it is time to completely re-evaluate some of the cornerstones upon which the AGW hypothesis is built. It is at odds with the only reliable observational evidence (albeit that data set is too short to give complete confidence), and that suggests that something fundamental is wrong with the conjecture.

June 29, 2014 1:43 am

Morning All
From reading the comments on this thread, it is clear that bad stations, infilling of data, zombie stations and etc do not make for good scientific information.
But to hear that the raw data has been “adjusted” to cool older data, even if by a tenth of a degree, shows without a doubt that this is not sheer incompetence but a sustained and deliberate effort to “make the facts fit the theory”.
That is blatantly a corrupt practice.
If the US records are supposed to be one of the best, then the shit storm that this is causing is rightly justified.
I respect Anthony for having the balls to admit he was wrong. It takes a man to do that. But now, he should be chasing down the perpetrators of this and blowing SG’s whistle louder than loud.
However, this being the number one “denier” site, you can already bet any information from here will be, at best, marginalised.
The CAGW machine will only treat this as a speed bump, and like the climategate emails, this will be buried, adjusted or deliberately forgotten within months.

Greg Goodman
June 29, 2014 2:25 am

richard verney says: “Since no one suggests that the Super El Nino was caused by the then present level of CO2 in the atmosphere, and since there is no known or understood mechanism whereby CO2 could cause such an El Nino”
Oh yeah? Because you say so?
I’m not saying this is the case, but consider the following: GHGs cause heat to be retained, lessening the natural cooling of the oceans. The natural variability of the Nino/Nina cycles successively absorbs incoming solar energy and dumps it out to the atmosphere. If natural cooling is slowed and heat builds up, from time to time there is a large El Nino that dumps this excess heat.
What actually causes these intermittent “cycles” is very poorly understood, and they are often misrepresented as an “oscillation”, with the undeclared implication that it is symmetrical and will average out over time.
That is spurious. It is not a pendulum swing as Tisdale has pointed out. It is an active player.
So I don’t think your bland hand-waving declaration is either accurate or informed.

Editor
June 29, 2014 2:27 am

Zeke Hausfather says:
June 28, 2014 at 6:04 pm

If you don’t like infilling, don’t use it. It doesn’t change the result, almost by definition, since infilling mimics spatial interpolation:

Thanks, Zeke … if there is a perfect infilling method, you’d be correct. Are you claiming that such a method exists?
Zeke Hausfather says:
June 28, 2014 at 6:24 pm

Sunshinehours/Bruce:
Infilling makes no difference for Arizona:

Well, I can see a number of differences by naked eye … so the claim that it makes “no difference” is clearly untrue. The question is, how much difference?
Infilling operates on the ASSUMPTION that the station is highly correlated to its neighbors. And while on average this is true, for any individual station it may be far from true. But wait, it gets worse. Even if the overall correlation is good, the seasonal correlations may vary significantly. But wait, it gets worse. You are often infilling a single month … and even if the overall correlation is good, the correlation for that one particular month may be abysmal.
For example, annual temperatures in Anchorage and Gulkana in Alaska have a correlation of 0.86 … but despite that, a linear estimation of Anchorage temperature based on Gulkana temperature gives an error in the estimated annual data of up to a degree and a half … and that’s with quite good correlation.
In the case of Arizona, much of the earlier infilled data is lower than the non-infilled … so my question is, what are the trends for the two results?
Thanks for all your work on these questions,
w.
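Willis’s point about correlation can be illustrated with a toy simulation (synthetic numbers, not the actual Anchorage/Gulkana data): even a pair of series with a healthy overall correlation leaves individual-year infill errors far larger than the trends being sought.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50                                   # 50 "years" of annual anomalies
x = rng.normal(0.0, 1.0, n)              # "neighbor" station
y = 0.86 * x + rng.normal(0.0, 0.6, n)   # "target" station: correlated but noisy

r = np.corrcoef(x, y)[0, 1]              # overall correlation is fairly high...
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)  # ...yet per-year estimation errors remain
print(f"r = {r:.2f}, worst single-year error = {np.abs(residuals).max():.2f}")
```

Whatever the exact numbers drawn, the worst single-year residual stays an order of magnitude larger than a per-decade trend, which is the nub of the objection to treating infilled months as data.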

Greg Goodman
June 29, 2014 2:41 am

The exception is BEST, which starts with the raw daily data, but they might be getting tripped up into creating some “zombie stations” of their own by the NCDC metadata and resolution improvements to lat/lon. The USHCN station at Luling Texas is listed as having 7 station moves by BEST (note the red diamonds):
Luling-TX-BEST
But there really has only been two, and the station has been just like this since 1995, when it was converted to MMTS from a Stevenson Screen. Here is our survey image from 2009:
===
So if I’m reading that correctly, BEST will be “correcting” this station by about +1.5 since 1995, despite it being a well-sited, good-quality station in that period. That implies, conversely, that the whole regional average it is being compared against may have a warm bias of that magnitude.

Eamon Butler
June 29, 2014 2:56 am

Please excuse me, but as a layperson I sometimes struggle with the technical terminology. Is ”Crow” a bit like Humble pie? 😉
Always a pleasure to see Honesty rise above all else.
Eamon.

michel
June 29, 2014 3:06 am

Well done you. And Prof Curry. And Paul Homewood and Zeke and the others. And well done Goddard; even though he was confusing and partly wrong, he really was on to something, and it’s good to see everyone recognising that.

SandyInLimousin
June 29, 2014 3:30 am

Joseph Bastardi says:
June 28, 2014 at 4:41 pm
Hear Hear on all points

nevket240
June 29, 2014 3:35 am

http://www.channelnewsasia.com/news/lifestyle/hotter-and-larger-tropics/1219108.html
This blows the original hypothesis out of the water. And no one questions it??
regards

Eliza
June 29, 2014 3:59 am

# Red Flags:
1. What Sunshine hours is reporting with recent data.
2. You’re throwing pearls at pigs: findings should not have been reported to USHCN, NCDC etc. (as you are assuming they did not know, and may in fact have purposely fabricated it as SG maintains).
3. They won’t change anything anyway.
4. You will continue to defend them (harmless scientists).
5. It’s time for lawsuits, not talking.
6. Remember the BEST saga (mosh is still defending; that should be a REAL RED FLAG).
7. Thanks anyway re Goddard.

J Martin
June 29, 2014 4:06 am

Although the saying goes “never attribute to malice that which can be attributed to incompetence” (or something like that), the differences are so absurdly blatant that one is forced to wonder.
Can someone who carries some weight with NCDC, and hence might get an answer, ask them to justify the adjustments shown in the two graphs for Maine?
In the absence of a viable explanation, this should be considered fraud, designed to manipulate politicians into providing continued / increased funding programs.

Kasuha
June 29, 2014 4:29 am

The key question here is, were estimated data used to calculate gridded and higher level averages, or to adjust other stations?
Doing so would be an inexcusable error, something that should never happen to a real scientist; an error so huge I almost refuse to believe people at NCDC would make it willingly.
But if these estimates are just sitting there quietly and are not used for further processing, then they’re almost irrelevant.

Nick Stokes
June 29, 2014 4:47 am

Kasuha says: June 29, 2014 at 4:29 am
“The key question here is, were estimated data used to calculate gridded and higher level averages, or to adjust other stations?
Doing so would be an inexcusable error…”

They are used to calculate averages. That is the purpose of adjustment. And it isn’t an error.
A spatial average requires an integral, and they are in effect using numerical integration formulae. That is, summing interpolated values over the whole area, implicitly. Whether you interpolate values to include in the sum makes no difference, provided the interpolation is as accurate as that implied in the integration.
They interpolate all stations in each sum because they are using absolute temps, which includes the climatology. You need to keep the same climatology from month to month.
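Nick’s argument, that interpolation changes nothing provided it is as accurate as that implied in the integration, can be seen in a one-dimensional toy example (illustrative numbers only, not USHCN data):

```python
# Five equally spaced "stations"; station 2 is missing this month.
vals = {0: 10.0, 1: 12.0, 3: 16.0, 4: 18.0}

# Spatial average that simply skips the gap:
avg_skip = sum(vals.values()) / len(vals)

# Infill station 2 by linear interpolation of its neighbors, then average:
filled = dict(vals)
filled[2] = (vals[1] + vals[3]) / 2
avg_fill = sum(filled.values()) / len(filled)

# Here the underlying field is linear, so the interpolation is exact and
# the two averages agree; if the interpolation were biased (e.g. the
# neighbors ran warm), the infilled average would diverge from the true one.
print(avg_skip, avg_fill)
```

The dispute in this thread is precisely over the proviso: the arithmetic is harmless only when the interpolation really is as good as the integration assumes.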

Eliza
June 29, 2014 5:11 am

“They interpolate all stations in each sum because they are using absolute temps, which includes the climatology. You need to keep the same climatology from month to month”
Well that explains everything WTF?
Does this guy have any degree from anywhere?If its Australian I understand..

A C Osborn
June 29, 2014 6:05 am

Nick Stokes says: June 29, 2014 at 4:47 am
When are you going to respond to my questions? You justify infilling along with Zeke, but you are using already-“Estimated” values to infill with.
Please name all the stations you used to compare to Luling, so that the rest of us can check them for validity of use and values.
I have already told you that San Antonio is Estimated for that period; did you use it?

Latitude
June 29, 2014 6:10 am

Goddard headlined the Washington Times and Drudge again today….
http://www.washingtontimes.com/news/2014/jun/23/editorial-rigged-science/
He owes all of you guys for making this happen… accusing him of being wrong, and then having to eat it, made it a hundred times bigger.
