Two years ago, during the scorching summer of 2012, July 1936 lost its place on the leaderboard and July 2012 became the hottest month on record in the United States. Now, as if by magic, and according to NOAA’s own data, July 1936 is once again the hottest month on record. The past, present, and future all seem to be “adjustable” in NOAA’s world. See the examples below.
Josh has been busy again and writes at Bishop Hill with a new cartoon:
The temperature adjustments story has been brewing for weeks principally due to the many posts at ‘RealScience’ but taken up by others, for example, Paul Homewood, see here and here. Judith Curry has a great post about it here, as does Anthony here.
H/t to Real Science/Steven Goddard for suggesting including Toto. Cartoons by Josh
Bruce at Sunshine Hours has been doing some unthreading, er, plotting, and at my request prepared some USHCN maps of Kansas, starting with May’s high temperatures.
I’ve annotated the plot to include “zombie” weather stations that have been closed for years but still show “estimated” data from NOAA. Those marked NRF are “no report found”, typically meaning NOAA hasn’t yet received the data from the observer, which is often mailed in on paper B91 forms. It is interesting to note how NOAA has been changing the data, in most cases adjusting it higher, though in a couple of cases, lower.
Bruce also plotted some other maps of Kansas, for July 1936 and for July 2012. Note how in July 1936 the Tmax temperatures are almost all adjusted cooler, while in July 2012 almost all Tmax temperatures are adjusted warmer. Click images for larger versions.
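For readers who want to reproduce this kind of check themselves, here is a minimal sketch of the raw-vs-adjusted comparison shown in the maps. It assumes you have already parsed the USHCN raw and adjusted monthly Tmax files into per-station dictionaries keyed by station ID; the parsing step and the variable names are my own illustration, not Bruce’s actual script.

```python
def adjustment_delta(raw_tmax, adjusted_tmax):
    """Return adjusted-minus-raw Tmax (°F) per station, for one month."""
    return {
        station: adjusted_tmax[station] - raw_tmax[station]
        for station in raw_tmax
        if station in adjusted_tmax
    }

# Positive deltas mean the published value was adjusted warmer than the
# thermometer reading; negative means it was adjusted cooler.
# Station IDs and values below are purely illustrative.
raw_july_1936 = {"USC00141234": 104.1, "USC00145678": 101.3}
adj_july_1936 = {"USC00141234": 103.2, "USC00145678": 100.6}
print(adjustment_delta(raw_july_1936, adj_july_1936))
```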
Whatever happened to just using actual measured data? There is no justification for this.
And, NOAA can’t even keep their story straight about July 1936 temperatures. From a report I did in 2013:
NCDC’s SOTC July 2012:
http://www.ncdc.noaa.gov/sotc/national/2012/07
Screencap of the claim for CONUS Tavg temperature for July 2012 in the SOTC:
Note the 77.4°F value for July 1936. It is actually still in their SOTC for July 2012 today.
Now let’s look at some plots from NOAA’s Climate at a Glance. I just happened to have one from two years ago. It also says 77.4°F on the plot. The numbers match with the SOTC report. The annotations are mine.
Today, I ran the same plot again, and here is the NEW number for July 1936. The annotations are mine.
NOAA helpfully provided the data, which I have saved as an Excel file containing both the July 1936 and July 2012 data: NOAA_Tavg_Data_July_1895-2013 (.xlsx)
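For anyone who wants to verify the flip themselves, here is a minimal sketch of reading that spreadsheet and comparing the two Julys. The column names (“Year” and “Value”) are my assumption, not verified against NOAA’s export; adjust them to whatever the file actually contains.

```python
# Minimal sketch: compare the July 1936 and July 2012 CONUS Tavg values
# in the spreadsheet linked above. Column names are assumed, not verified.
import pandas as pd

df = pd.read_excel("NOAA_Tavg_Data_July_1895-2013.xlsx")

july_1936 = df.loc[df["Year"] == 1936, "Value"].iloc[0]
july_2012 = df.loc[df["Year"] == 2012, "Value"].iloc[0]

print(f"July 1936 CONUS Tavg: {july_1936:.2f} °F")
print(f"July 2012 CONUS Tavg: {july_2012:.2f} °F")
print("Hottest July on record:", 1936 if july_1936 > july_2012 else 2012)
```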
You can’t get any clearer proof of NOAA adjusting past temperatures.
This isn’t just some issue with gridding, or anomalies, or method; it is about NOAA not being able to present historical climate information for the United States accurately. In one report they give one number, and in another they give a different one, with no explanation to the public as to why.
This is not acceptable. It is not being honest with the public. It is not scientific. It violates the Data Quality Act.
But wait, there’s more. In January 2013, I ran this story based on an article in the Wall Street Journal: July (2012) Was Hottest Month on Record
My story was: Does NOAA’s National Climatic Data Center (NCDC) keep two separate sets of climate books for the USA?
In that essay, I revised the WSJ graphic. At that time, it looked like this based on new numbers for July 2012 that I found from NOAA:
Now, with the new numbers in the Excel File above, output from NOAA, I had to revise it again. It looks like this now:
Now, once again, July 1936 is the hottest month in the US, even if by the slimmest of margins, all thanks to post-facto adjustments of temperature data by NOAA/NCDC.
I suggest that NOAA/NCDC hold another one of those meetings, like the one where they decided to keep long-dead weather stations reporting as “zombies” (as I showed with Marysville yesterday), and work on getting their story straight.
This constant change from year to year of what is or is not the hottest month on record for the USA is not only unprofessional and embarrassing for NOAA, it’s bullshit of the highest order. It can easily be solved if NOAA stops the unsupportable practice of adjusting temperatures of the past so that the present looks different in the context of the adjusted past, and stops generating estimated data for weather stations that have long since closed.
NOAA has been accused by others of “fabricating” data, and while that is a strong word that I don’t like to use, it looks to be more and more accurate.
That said, I don’t believe this is a case where somebody purposely has their hand on a control knob for temperature data; I think all of this is nothing more than artifacts of a convoluted methodology and typical bureaucratic blundering. As I’ve always said, never attribute malice to what can be explained by simple incompetence.
We already showed yesterday that NOAA can’t get their output data files correct, and we are waiting on a statement and a possible correction for that. But I think the problem is even larger than that, and will require an investigation from an unbiased outside source to get to the root of the problem.
DavidR, I get -0.085C/dec for USA48 from 1998 to 2013.
Not as much of a match.
DavidR:
Your entire post at June 30, 2014 at 7:49 am is pure sophistry.
For example, in response to my having written
You reply with this non sequitur that completely ignores my point
In that case, the continuing NOAA surface data ‘adjustments’ are wrong.
Also, you attribute to me loads of stuff which I have not said. I gave you a link to my true views and you have ignored that, too.
Clearly, you are not making any attempt at constructive dialogue: you are merely another anonymous troll.
Richard
DavidR says:
June 30, 2014 at 7:49 am
//////////////////
Are the trends really similar?
The satellite data shows no linear trend, just a one-off step change around the 1998 Super El Niño!
The satellite data is picking up on a one-off event of natural variation; the NOAA surface data, on the other hand, shows something rather different.
What I can’t understand is why anyone would want to use the surface data after 1979. The satellite data has better spatial coverage and is not adversely affected by UHI and station dropouts. Of course, it is not perfect and may have issues with orbital decay and sensor degradation, but then again so does the land-based system with equipment and screen degradation. These issues are merely error bars. A realistic error bandwidth should be attached to the satellite data to reflect these issues.
Since the satellite measuring equipment is supposed to represent our most advanced technology and measuring accuracy, it seems weird that anyone would wish to continue with a data set which is far less capable, especially one which is being stretched way beyond its original design purpose and one which has so many fundamental issues with it.
The land-based data set should end in 1979. Thereafter the satellite data set should be the standard. Naturally, no attempt should be made to splice the one onto the other.
One can use the land-based instrument record (preferably just raw data, but with appropriate error bars to take account of the fact that the raw data comes warts and all) to shed light on what happened between, say, 1850 and 1979.
One can use the satellite data set to see what has happened since.
The only exception I would make is with CET. Given its length, it is useful to preserve that record.
Thus, going forward, we should in essence just be using the satellite data and ARGO.
Does this mean Goddard was correct… something is seriously rotten at NOAA… and perhaps worldwide?
If so, I would love to see polito’fact’ retract the “pants on fire” rating and change it to 100% true.
But don’t hold your breath on that.
DavidR, I did this pretty quick so I hope I’m right:
UAH by month doesn’t match up as well to USHCN by month.
Month   1979–2013   1998–2013
Jan       0.5        -0.14
Feb       0.26       -0.7
Mar       0.4         0.66
Apr       0.3         0.17
May       0.16       -0.46
Jun       0.16        0.33
Jul       0.19       -0.1
Aug       0.16       -0.05
Sep       0.2         0.03
Oct       0.11       -0.28
Nov       0.28       -0.21
Dec       0.03       -0.27
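(For anyone wondering how figures like these are produced: a common approach is to fit an ordinary least-squares line through each calendar month’s yearly values and express the slope per decade. The sketch below shows that calculation with made-up numbers; it is my illustration, not the commenter’s actual workflow.)

```python
# Rough sketch: per-month decadal trend via an ordinary least-squares fit.
import numpy as np

def decadal_trend(years, values):
    """Slope of an OLS fit through (years, values), expressed per decade."""
    slope_per_year = np.polyfit(years, values, 1)[0]
    return slope_per_year * 10.0

# Example with made-up January anomalies, just to show the call:
years = np.arange(1998, 2014)
jan_anomalies = np.random.default_rng(0).normal(0.0, 0.5, len(years))
print(f"Jan trend: {decadal_trend(years, jan_anomalies):+.2f} deg/decade")
```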
On the new July plot from NOAA’s State of the Climate, pulling the trend from 1936–2013 gets you a straight, level line – no trend for the past 77 years. None at all.
richard verney says: June 30, 2014 at 8:21 am
I admire your confidence in the UAH dataset. Personally I think it is very misplaced.
It does not measure the surface temperature, which is where we actually live, as the thermometer stations do.
Many times the regional “Lower Troposphere” anomalies have borne no relationship to the temperatures that we have experienced on the ground in both the Northern and Southern Hemispheres.
That you chase temperature readings for your own agenda (as they do also) clearly shows that you have no real concern about what is going on with our society and the people infiltrating it, eh?
never attribute malice to what can be explained by simple incompetence.
It seems more and more like a malicious incompetence…
Where are the honest scientists? Are there any left? Where is the accountability of NASA, NOAA and GISS? What other data are adjusted to look better?
Good work guys (and gals, where applicable.)
Tom J says on June 29, 2014 at 2:58 pm
… I really hate to say it, but that statement is completely inaccurate.
So, you have first-hand knowledge of the office procedures at the Weather Bureau (now known as NOAA) from the 1900s up to the present day?
Can you chart for us the progress from manual procedures, including data and station sorting and average temperature calculations for each state, to the first use of tube-type IBM 700 series computers and punched cards, through the tape “dataset” era of the ’50s, and on to DASD (Direct Access Storage Device) and the instant, on-line access systems of today?
It would be very much appreciated if you could detail this for us.
The argument on the other side is already coming out, to explain ~70 years of non-warming:
It’s GLOBAL warming … and it’s warming everywhere else but the US…
Response:
FINALLY we have a border that stops something… not people, but HEAT…
All that money has managed to stop the 2nd law of thermodynamics… it’s amazing… perhaps we should start writing thesis papers on it!!!
As I’ve always said, never attribute malice to what can be explained by simple incompetence.
======
I’m curious about this… because I see too many cases where it would have to be out-and-out malice.
Which would it be when you originate a climate model on one set of data… and when you go to hindcast based on that exact same set of data… you see there’s a completely different set of data?
Not screaming your head off about it… would that be incompetence?
…I don’t think so
It’s GLOBAL warming … and it’s warming everywhere else but the US…
====
It’s plausible… we all know the MWP only happened in Cherry Hill, NJ
I wonder if NOAA gets their data from these weather stations (USHCN)… great project at surfacestations.org
NOAA Reinstates July 1936 As The Hottest Month On Record
The National Oceanic and Atmospheric Administration, criticized for manipulating temperature records to create a warming trend, has now been caught warming the past and cooling the present.
July 2012 became the hottest month on record in the U.S. during a summer that was declared “too hot to handle” by NASA scientists. That summer more than half the country was experiencing drought and wildfires had scorched more than 1.3 million acres of land, according to NASA.
According to NOAA’s National Climatic Data Center in 2012, the “average temperature for the contiguous U.S. during July was 77.6°F, 3.3°F above the 20th century average, marking the warmest July and all-time warmest month on record for the nation in a period of record that dates back to 1895.”
“The previous warmest July for the nation was July 1936, when the average U.S. temperature was 77.4°F,” NOAA said in 2012.
This statement by NOAA was still available on their website when checked by The Daily Caller News Foundation. But when meteorologist and climate blogger Anthony Watts went to check the NOAA data on Sunday he found that the science agency had quietly reinstated July 1936 as the hottest month on record in the U.S.
http://dailycaller.com/2014/06/30/noaa-quietly-reinstates-july-1936-as-the-hottest-month-on-record/#ixzz369vSm4WI
Stephen Richards says:
June 29, 2014 at 1:24 pm
NOAA has been accused by others of “fabricating” data, and while that is a strong word that I don’t like to use, it looks to be more and more accurate.
Stephen, whilst fabricating is a strong word I feel falsifying is more accurate.
falsify
/ˈfɔːlsɪfʌɪ, ˈfɒls-/
verb (gerund or present participle: falsifying)
1. alter (information, a document, or evidence) so as to mislead.
“a laboratory which was alleged to have falsified test results”
synonyms: forge, fake, counterfeit, fabricate, invent, alter, change, doctor, tamper with, fudge, manipulate, massage, adulterate, pervert, corrupt, debase, misrepresent, misreport, distort, warp, embellish, embroider, colour, put a spin on.
Does the NOAA knob go to eleven?? …Nigel Tufnel
Latitude says:
June 30, 2014 at 2:14 pm
NOAA Reinstates July 1936 As The Hottest Month On Record
The National Oceanic and Atmospheric Administration, criticized for manipulating temperature records to create a warming trend, has now been caught warming the past and cooling the present.
============
You didn’t notice? … warming the past and cooling the present???
I think NOAA owes the public an explanation of this flip-flop. Additionally, Congress should hire a third party to audit their processes and procedures for handling data. Does anyone know if NOAA is accredited by any notified body?
At the risk of repeating myself:
Oceania has always been at war with Eastasia.
One point I would suggest is that average temperatures cannot be “correct” unless there is a frozen data set and a fixed method for handling that data. Very likely neither malice nor incompetence is involved: slight changes in calculation methods, computing languages (there’s a reason that some software collections are authorized for use on critical data systems while other software is not), and potentially even hardware can all shift the results; numeric co-processors can be very temperamental, for instance.
Raw historical data should be frozen – that is, no “adjustments” should be applied. Once an adjustment process has been run, the result should be stored as a uniquely identified version with metadata that explains all adjustments.
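As a rough illustration of that “frozen raw data plus versioned adjustments” idea, something like the sketch below would do. The structure and field names are purely illustrative and not any agency’s actual system.

```python
# Illustrative only: raw observations are immutable; every adjusted value
# lives in a separately versioned object with metadata explaining the change.
from dataclasses import dataclass, field
from datetime import date

@dataclass(frozen=True)
class RawReading:
    station_id: str
    obs_date: date
    tmax_f: float          # as reported by the observer, never altered

@dataclass
class AdjustedSeries:
    version: str                                   # e.g. "v-example-1"
    method: str                                    # description of the adjustment
    applied_on: date
    readings: list = field(default_factory=list)   # derived values only

raw = RawReading("KS_EXAMPLE_01", date(1936, 7, 14), 110.0)
adjusted = AdjustedSeries("v-example-1", "hypothetical TOBS correction", date(2014, 6, 30))
adjusted.readings.append((raw.station_id, raw.obs_date, raw.tmax_f - 1.2))
print(raw, adjusted, sep="\n")
```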
old44 says:
June 30, 2014 at 2:30 pm …
Neither falsifying nor fabricating meets the “least explanatory complexity” criterion. The variations look like grad-student data mangling: attempts to “fix” a perceived problem or improve a computing process, with equivocal results.
Wow!
… In support of Mark Stoval (@MarkStoval) says: June 29, 2014 at 4:30 pm
I see senior people in government organisations asking for multiple statistical models to be run; each ‘run’ produces a different outcome depending on assumptions and parameters.
Now, armed with a multitude of ‘runs’, the one that best supports the personal view of the senior person can be (cherry) picked for publicity purposes. (Alternatively it might be the outcome that supports the government-of-the-day’s spin doctors).
If the technical person who has done the statistical work knows that the “official outcome” is unrepresentative of their many attempts, they are most unlikely to risk their career by speaking out publicly.
It truly doesn’t matter why the NOAA constantly changes their data about the past.
If the NOAA believes that their data about past temps was wrong, and was wrong again after adjustment, and was wrong again 3 months ago, and 2 months ago, and last month…
and if they believe that this data will continue to be wrong, as indicated by the fact that they don’t seem to have any plans to shut down the adjustment software…
Why would anyone use their dataset for any policy or scientific purpose?
Oh, I understand that it could never be truly said to be correct… but there’s wrong, and then there’s a change in the dataset over just two years that shifts the trend from 1.24°F/century to 1°F/century…