Two years ago, during the scorching summer of 2012, July 1936 lost its place on the leaderboard and July 2012 became the hottest month on record in the United States. Now, as if by magic, and according to NOAA’s own data, July 1936 is the hottest month on record again. The past, present, and future all seem to be “adjustable” in NOAA’s world. See the examples below.
Josh has been busy again and writes at Bishop Hill with a new cartoon:
The temperature adjustments story has been brewing for weeks, principally due to the many posts at ‘Real Science’, but it has been taken up by others, for example Paul Homewood, see here and here. Judith Curry has a great post about it here, as does Anthony here.
H/t to Real Science/Steven Goddard for suggesting including Toto. Cartoons by Josh
Bruce at Sunshine Hours has been doing some unthreading, er plotting, and at my request, prepared some USHCN maps of Kansas, first May’s high temperatures.
I’ve annotated the plot to include “zombie” weather stations that have been closed for years but still show “estimated” data from NOAA. Those marked NRF are “no report found”…typically meaning NOAA hasn’t yet received the data from the observer, which are often mailed in on paper B91 forms. It is interesting to note how NOAA has been changing the data, in most cases adjusting it higher, though in a couple of cases, lower.
Bruce also plotted some other maps of Kansas, for July 1936 and for July 2012. Note how in July 1936 the Tmax temperatures are almost all adjusted cooler, while in July 2012 almost all Tmax temperatures are adjusted warmer. Click images for larger versions.
Whatever happened to just using actual measured data? There is no justification for this.
And, NOAA can’t even keep their story straight about July 1936 temperatures. From a report I did in 2013:
NCDC’s SOTC July 2012:
http://www.ncdc.noaa.gov/sotc/national/2012/07
Screencap of the claim for CONUS Tavg temperature for July 2012 in the SOTC:
Note the 77.4°F value for July 1936. It is actually still in their SOTC for July 2012 today.
Now let’s look at some plots from NOAA’s Climate at a Glance. I just happened to have one from two years ago. It also says 77.4°F on the plot. The numbers match with the SOTC report. The annotations are mine.
Today, I ran the same plot again, and here is the NEW number for July 1936. The annotations are mine.
NOAA helpfully provided the data, which I have saved as an Excel file containing both the July 1936 and July 2012 data: NOAA_Tavg_Data_July_1895-2013 (.xlsx)
You can’t get any clearer proof of NOAA adjusting past temperatures.
This isn’t just some issue with gridding, or anomalies, or method; it is about NOAA being unable to present the historical climate information of the United States accurately. In one report they give one number, and in another they give a different one, with no explanation to the public as to why.
This is not acceptable. It is not being honest with the public. It is not scientific. It violates the Data Quality Act.
But wait, there’s more. In January 2013, I ran this story based on an article in the Wall Street Journal: July (2012) Was Hottest Month on Record
My story was: Does NOAA’s National Climatic Data Center (NCDC) keep two separate sets of climate books for the USA?
In that essay, I revised the WSJ graphic. At that time, it looked like this based on new numbers for July 2012 that I found from NOAA:
Now, with the new numbers in the Excel File above, output from NOAA, I had to revise it again. It looks like this now:
Now, once again, July 1936 is the hottest month in the US, even if by the slimmest of margins, all thanks to post-facto adjustments of temperature data by NOAA/NCDC.
I suggest that NOAA/NCDC have another one of those meetings, like the one where they decided to keep long-dead weather stations reporting as “zombies” (as I showed with Marysville yesterday), and work on getting their story straight.
This constant change from year to year of what is or is not the hottest month on record for the USA is not only unprofessional and embarrassing for NOAA, it’s bullshit of the highest order. It can easily be solved if NOAA stops the unsupportable practice of adjusting past temperatures so that the present looks different in the context of the adjusted past, and stops manufacturing data for weather stations that have long since closed.
NOAA has been accused by others of “fabricating” data, and while that is a strong word that I don’t like to use, it looks to be more and more accurate.
That said, I don’t believe this is a case where somebody purposely has their hand on a control knob for temperature data, I think all of this is nothing more than artifacts of a convoluted methodology and typical bureaucratic blundering. As I’ve always said, never attribute malice to what can be explained by simple incompetence.
We already showed yesterday that NOAA can’t get their output data files correct, and we are waiting on a statement and a possible correction for that. But I think the problem is even larger than that, and will require an investigation from an unbiased outside source to get to the root of the problem.
Eh, they are just preparing for another one of their splendid “warmest July on record” moments with press release &. great noise &. all.
Please note the temperature of July 1936 was decreased from 77.4°F to 76.8°F, making it far easier to surpass at some future date. The fact that July 2012 was adjusted downward a bit more is immaterial in this respect. Old news is never news.
Excellent that this is picked up. We have to push for real answers.
And this is by far the most clever Josh cartoon ever. Layer after layer of irony! Josh, you are a genius!
sunshinehours1…. Yes, I agree, there is likely no trend. Unfortunately, anyone checking NOAA’s site would never come to that conclusion and isn’t that the site most researchers and politicians (if they can read a graph) use?
I busted the NOAA within days of their opening of the Climate.gov web site as being propagandists with my own little graphic arts effort:
http://s16.postimg.org/54921k0at/image.jpg
What a gift this graphic deception was, since I was ready with its exposure when Climategate afforded so much attention to skepticism. I posted it and a few other infographics tens of thousands of times to news sites along with little read-along comments so I didn’t come off as just spam. It was easy converting conservatives that way, but liberals, oh boy, no, not so easy. I was even attacked for drawing a line at all, and often banned when I mocked attackers right back.
But now that Goddard’s zombie stations are found to be real, do they really form an adjustments hockey stick? This graphic artist amateur would like to know, as I’m sure would the media.
On the surface of things it seems some eager undergraduate or first year graduate student could spend some time with the newspapers of July 1936 and diligently record the official temperature data as given at that time, then write a nice paper. Or is this naive?
To quote a recent Secretary of State: “What difference, at this point, does it make?”
So the temperature data has been revised so many times that it’s like the Talking Heads’ hairstyle (“I changed my hairstyle, so many times now, I don’t know what I look like!”)
This is just one of the things that the government keeps revising. Take gross domestic product: when they revise that (twice) so that it goes from a quarterly increase of 0.1% to a decline of 2.9%, we’re in for a bumpy ride keeping track of all the government lies as we go deeper into the recession.
http://bea.gov/newsreleases/national/gdp/gdpnewsrelease.htm
‘As I’ve always said, never attribute malice to what can be explained by simple incompetence.’
I really hate to say it, but that statement is completely inaccurate. Believe it or not I happened to have known the original author of that statement (trust me 🙂) and the truly correct wording that he/she originally gave it was, “Never attribute incompetence to what can be explained as simple malice.”
He/she has repeatedly complained to me that the words ‘incompetence’ and ‘malice’ were switched around by someone who was either incompetent when repeating their statement or was acting with malicious intent.
REPLY: Maybe, but without a reference your comment is simply an opinion, not a fact. What I wrote reflects exactly what I wanted to say, no need to rewrite or adjust it. – Anthony
Is there a written description of the algorithm being used and has that description been compared with the code used to make adjustments? I am guessing that the algorithm makes adjustments to data – old and new – based on the entire temperature record available, which grows every month.
For example, adjustments made in 2014 to the temperatures of 1936 have nine years more data than adjustments made in 2005 to the temperatures of 1936. If there is a trend one way or the other from the beginning of the record to the present, the adjustments would reinforce the trend.
This is what GISS is doing with their process of estimating monthly and seasonal temperature values when they are missing from the GHCN record. The estimates are revised as present data comes in, as if temperatures recorded in 2014 somehow influence temperatures 20, 50, or a hundred years ago.
John Goetz, I’ve asked for a flowchart, and was rebuked.
Your idea makes a lot of sense though.
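If the infilling algorithm re-estimates past values from the full record on every run, the point above, that data recorded in 2014 can change an estimate for 1936, follows naturally. Here is a toy sketch of that mechanism (purely hypothetical mean-based infilling for a made-up station; this is not NOAA or GISS code, and the real pairwise homogenization is far more elaborate):

```python
# Toy illustration only (not NOAA/GISS code): if a missing monthly value
# is estimated from the mean of all reported values in the record, the
# estimate for a past year shifts every time new data arrive.

def estimate_missing(record):
    """Fill None entries with the mean of all reported values."""
    reported = [v for v in record.values() if v is not None]
    fill = sum(reported) / len(reported)
    return {yr: (fill if v is None else v) for yr, v in record.items()}

# Hypothetical station record (degrees F) with July 1936 missing.
record = {1934: 76.0, 1935: 76.4, 1936: None, 1937: 75.8}
first = estimate_missing(record)[1936]

# Nine later, warmer years come in; the 1936 estimate is revised upward.
record.update({2005 + i: 77.0 + 0.1 * i for i in range(9)})
later = estimate_missing(record)[1936]

print(first < later)  # True: new data changed the "past"
```

The structural point is the same whatever the actual method: any estimate derived from the whole record will be revised whenever the record grows, which would reinforce an existing trend exactly as described above.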
Mere incompetence – artifacts of a convoluted methodology and typical bureaucratic blundering – would reasonably be expected to produce a normal distribution of error, especially given the hundreds of thousands of data points impacted.
But that’s not what we see. All of the error serves to cool the past, and to warm the present – in other words, all of the error serves to bolster the very assertion being promoted. At the micro level, as original, historic station data are being adulterated on a monthly basis, the data are modified upward or downward subtly. But at the macro level, the net adulteration always – always – serves to cool the past, and to warm the present.
It is utterly implausible that the observed error is random.
REPLY: I never said it was random, you inserted that idea. I said it was likely due to incompetence, and confirmation bias tends to push that one direction – Anthony
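The “normal distribution of error” argument in the comment above is essentially a sign test. As a rough illustration (a hypothetical sketch, not an analysis of the actual USHCN adjustment files), the binomial probability that n direction-neutral adjustments would all land on the same side can be computed directly:

```python
# Hypothetical sign-test sketch, not an analysis of actual USHCN data:
# if each net adjustment were a fair coin flip between "warms the trend"
# and "cools the trend", how surprising is a long one-sided run?

from math import comb

def sign_test_p(n_one_way, n_total):
    """Two-sided binomial probability of at least n_one_way adjustments
    landing in the same direction out of n_total fair coin flips."""
    tail = sum(comb(n_total, k) for k in range(n_one_way, n_total + 1))
    return min(1.0, 2 * tail / 2 ** n_total)

print(sign_test_p(20, 20))  # all 20 one way: about 1.9e-06
print(sign_test_p(10, 20))  # an even split is entirely unremarkable: 1.0
```

Whether the real adjustments are in fact all one-sided is the empirical question in dispute here; the sketch only shows what a fair-coin baseline would predict.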
Bob Greene says:
June 29, 2014 at 12:53 pm
At a minimum, I’d say that USHCN’s error bars should reflect the amount a temperature sample varies in the database. I don’t see how they could possibly argue for less than that for their infilled/zombie data.
I’m uncomfortable using the word “fabricated,” but I do like “zombie.” I wonder how NCDC reacts to zombie. I wouldn’t be surprised if they adopt it themselves in internal meetings.
In 2007 NOAA published ‘documentation’ of USHCN homogenization V2. It is available on line at http://www.ncdc.noaa.gov/oa/climate/research/ushcn. Turgid prose. No flow chart. No valid back testing comparisons to V1. About all that can be learned is that the explicit separate UHI adjustment in V1 was eliminated as ‘no longer required’ by the new automated “PHA”. V2 has provably been tinkered with since 2007, but I can find no documentation of the subsequent modifications.
After erasing the Medieval Warm period and Little Ice Age, huge adjustments to centuries of well documented data, what’s the big deal with a degree or 2, here and there and there and here and here and there (:
http://hockeyschtick.blogspot.com/2014/01/the-rise-and-fall-of-hockey-stick-and.html
A little understated, I should have thought. Real Science has been banging on about this for years. See Thermometer Magic, posted September 2010. Anthony has highlighted the issue on these pages countless times. Jennifer Marohasy has been busy for several years trying to get some transparency on adjustments in Australia. The following is from a post by Michael Hammer on Marohasy’s site, dated June 2009.
Anthony, wake up! It’s intentional. They are trying to make it appear that the warming trend is greater than it is. It’s beyond obvious. Do you still believe Clinton didn’t inhale?
In case it has not occurred to you, someone, AW, should send the cartoon to Lamar Smith.
http://online.wsj.com/articles/lamar-smith-what-is-the-epa-hiding-from-the-public-1403563536
This was already reported over at American Thinker two months ago in an article entitled “July 2012 was Not the Hottest Month in U.S. History”: http://www.americanthinker.com/blog/2014/05/july_2012_was_emnotem_the_hottest_month_in_us_history.html
“That said, I don’t believe this is case where somebody purposely has their hand on a control knob for temperature data, ”
Yes they have.
Why else would the “oops slips” adjustments always reflect global warming.
This “one” may have been an “accidental adjustment”, but it was still a deliberate action by someone to again adjust temperature records away from the available raw data.
With the evidence that is available around the world, only a fool wouldn’t believe that.
And a fool you are not.
Do I or others really need to remind you that they are in control of historic weather data around the world and are continually “adjusting” the raw data to suit the fraud?
If they didn’t there simply would be no “justification” for continuing the fraud.
If you haven’t already read it (and I suspect you have), this is for others who might be under the impression that it happens “only in America”.
Yes, bureaus of meteorology right around the world are “controlling” the weather.
From rainfall gauges to temperature records and everything else that will prop up this fraud.
America is not alone when it comes to calling out what these fraudsters are doing.
http://joannenova.com.au/?s=bom+records+before+1910
“No, it was because Goddard originally claimed 40% of USHCN STATIONS were missing, which I knew from my survey to be wrong, and then he changed it to DATA after I complained but did not note the change in hist story. It seemed like sweeping the issue under the rug. Plus I could not get his code to run to replicate the problem, and our own USHCN data didn’t show the problem.”
===
You know……I honestly hope you stop saying this
(hey, I can have hopes too)
Shades of 1984.
We can adjust for that, too.
Surely they aren’t using an algorithm that purports to determine what the temperatures of the past are based on future temperatures?
“That said, I don’t believe this is case where somebody purposely has their hand on a control knob for temperature data, I think all of this is nothing more than artifacts of a convoluted methodology and typical bureaucratic blundering. As I’ve always said, never attribute malice to what can be explained by simple incompetence.”
That is an overly generous interpretation of what looks like plain old fraud. It is fraud generated by a bureaucracy that wants to provide its masters with exactly what they are looking for. It is also the action of a bureaucracy that senses a chill wind blowing from some unknown quarter, and that there may be questions asked later, and perhaps a bit of a witch hunt or two.
Hey, that’s my favorite movie!
Unless they have reinstituted temperature measurement since I was there last year, Buffalo and Cherokee OK will need to be added to the zombie list.
Tom J on June 29, 2014 at 2:58 pm
My sincere apologies. I meant no offense. I guess the joke didn’t work.
“Never attribute to malice that which can be adequately explained by stupidity, but don’t rule out malice.” – attributed to Albert Einstein. The ‘cock-up vs conspiracy’ meme has been around for a long time, and is reasonable when a process is more or less free from political agendas, but that is not true here. I think it is time to move on to malice: stupidity isn’t a sufficient explanation.