Guest post by David Middleton

The headline was, of course, accompanied by nonsense like this…
Senior Meteorologist Stu Ostro (Twitter) says, “Exceeding July 1936 at the peak of the Dust Bowl heat — is BIG.”
The “record” is less than 120 years long. The warmest month on record in the US isn’t any more significant to climate change than this past weekend’s abnormally cool weather was.
NOAA’s hottest month ever is based on the homogenized US Historical Climatology Network (USHCN). The USHCN is a subset of the GHCN…
Investigation of methods for hydroclimatic data homogenization
Steirou, E., and D. Koutsoyiannis, Investigation of methods for hydroclimatic data homogenization, European Geosciences Union General Assembly 2012, Geophysical Research Abstracts, Vol. 14, Vienna, 956-1, European Geosciences Union, 2012.
We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. Based on a systematic study of scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant.
From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in the two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends.
[…]
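As a rough sketch of the kind of raw-versus-adjusted comparison that abstract describes (not the authors’ actual code), one can fit 100-year linear trends to both versions of a station series and difference them. The file name and column names below are hypothetical placeholders, not the real GHCN layout:

```python
import numpy as np
import pandas as pd

# Hypothetical input: one station's annual mean temperatures, with both the
# raw and the homogenized ("adjusted") values. Column names are placeholders.
df = pd.read_csv("station_annual.csv")  # columns: year, raw_temp, adj_temp

def century_trend(years, temps):
    """Least-squares linear trend, expressed in degrees per century."""
    mask = ~np.isnan(temps)
    slope = np.polyfit(years[mask], temps[mask], 1)[0]
    return slope * 100.0

raw_trend = century_trend(df["year"].values, df["raw_temp"].values)
adj_trend = century_trend(df["year"].values, df["adj_temp"].values)

# A positive difference means the adjustments made the warming trend larger
# (or a cooling trend smaller) for this station.
print(f"Raw trend:      {raw_trend:+.2f} deg/century")
print(f"Adjusted trend: {adj_trend:+.2f} deg/century")
print(f"Effect of adjustments on the trend: {adj_trend - raw_trend:+.2f} deg/century")
```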
Poor station siting in the USHCN is also the reason that NOAA’s new U.S. Climate Reference Network (USCRN) shows July 2012 to be 75.5°F (about 2°F cooler than the USHCN). The homogenized USHCN supports AGW by artificially cooling the past and artificially warming the present.
While there are valid technical reasons for homogenizing the older records (to account for changes in methods, instruments, station locations, and environmental alteration around stations), the methods used to homogenize the data induce artificial warming. This is a cold, hard, empirical fact. The older portion of the USHCN record used by NOAA has been artificially cooled in the effort to “homogenize” the older data and methods with the modern data and methods. Remove the homogenization and July 2012 would be 1.0°F cooler than July 1936.
However, even if the homogenization is producing a more accurate temperature record, the “record” is not long enough to say much of anything about climate.
Meteorologically speaking, “climatology” refers to periods of 30 years or longer. The “normal” climatology is generally the most recent 30-yr average. The 30-yr average is climate (what you expect). July and last weekend were weather (what you get). July was significantly warmer than average. Very few months (or weekends) are “average.” Most are above or below average.
Above Normal – One of the outlook categories that are based on the 1981-2010 climatological normal. During this 30 year reference period, average three-month temperature was observed in Above Normal category (10 warmest years) 1/3 (33.3%) of the time.
Anomaly – The deviation of a measurable unit, (e.g., temperature or precipitation) in a given region over a specified period from the long-term average, often the thirty year mean, for the same region.
B – Is used on climate outlooks to indicate areas that will likely be below normal.
Below Normal – One of the outlook categories that are based on the 1981-2010 climatological normal. During this 30 year reference period, average three-month temperature was observed in Below Normal category (10 coolest years) 1/3 (33.3%) of the time.
Climate – Prevailing set of weather conditions at a place over a period of years.
Climate Change – A non-random change in climate that is measured over several decades or longer. The change may be due to natural or human-induced causes.
In reality, climate is a combination of prevailing weather patterns and physical geography. Climatological normals are the mean weather patterns during a 30-yr reference period. Climate change occurs over “several decades or longer.”
NOAA’s NCDC trumpets warm months as all-time climate records. The “hottest month on record” in an NCDC headline is an assertion of climatic significance, when it has no more climatic significance than the coldest August 18-19 on record. Weather records are broken all of the time.
“Climate Normals”
The current climate normals (1981-2010) were adopted on July 1, 2011…
NOAA’s 1981-2010 Climate Normals
NOAA’s National Climatic Data Center (NCDC) released the 1981-2010 Normals on July 1, 2011. Climate Normals are the latest three-decade averages of climatological variables, including temperature and precipitation. This new product replaces the 1971-2000 Normals product. Additional Normals products; such as frost/freeze dates, growing degree days, population-weighting heating and cooling degree days, and climate division and gridded normals; will be provided in a supplemental release by the end of 2011.
The previous climate normals were from 1971-2000. I assumed that the climate normals are updated once per decade. If my assumption is correct, these are the climate-normal reference periods since 1931:
| Decade | Reference Period | Mean July Temp. (°F) |
|-----------|------------------|----------------------|
| 1931-1940 | 1901-1930 | 73.79 |
| 1941-1950 | 1911-1940 | 74.51 |
| 1951-1960 | 1921-1950 | 74.59 |
| 1961-1970 | 1931-1960 | 74.73 |
| 1971-1980 | 1941-1970 | 74.23 |
| 1981-1990 | 1951-1980 | 74.35 |
| 1991-2000 | 1961-1990 | 74.33 |
| 2001-2010 | 1971-2000 | 74.38 |
| 2011-2020 | 1981-2010 | 74.76 |
The NOAA data are available here: NCDC CDO
It appears to me that there is nothing terribly anomalous about the current 1981-2010 reference period. It’s 0.03°F warmer than the 1931-1960 reference period.
Here’s a plot of U.S. July temperatures (1895-2012)…

I’m sure that the actual July 2012 temperature must have been a few hundredths of a degree warmer than 77.4°F; otherwise July 2012 would actually be a bit cooler than July 1936, despite the homogenization.
Rather than calculate a temperature anomaly relative to a fixed reference period, I decided to calculate it against what I think the contemporaneous reference period would have been (AKA a different take).
Example: The 1931-1940 anomaly is calculated against the 1901-1930 reference period.
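For anyone who wants to reproduce this “different take,” here is a minimal sketch of the calculation. It assumes the NCDC July series has been saved as a simple two-column file; the file name and column names are mine, not NCDC’s:

```python
import pandas as pd

# Hypothetical input: one row per year with the CONUS mean July temperature.
july = pd.read_csv("us_july_temps.csv").set_index("year")["temp_f"]

def contemporaneous_anomaly(year, series):
    """Anomaly vs. the 30-yr normal that would have been in effect that decade.

    Example: any year in 1931-1940 is compared against the 1901-1930 mean,
    mirroring the reference periods in the table above.
    """
    decade_start = ((year - 1) // 10) * 10 + 1      # e.g. 1936 -> 1931
    ref_start, ref_end = decade_start - 30, decade_start - 1
    normal = series.loc[ref_start:ref_end].mean()   # inclusive label slice
    return series.loc[year] - normal

for yr in (1936, 2012):
    print(yr, round(contemporaneous_anomaly(yr, july), 2))
```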
Since “climate is what you expect and weather is what you get,” the very hot year of 1936 should be measured against the contemporaneous expectation. Here’s the July temperature anomaly with a different take…

It’s quite evident that July 1936 was a lot hotter than July 2012, relative to what was expected (climatology). One other thing should be evident: the climate normals of the 20th century did not vary a lot, and the current climate normals are not anomalous. The weather has varied a lot; the climate, not so much.
Being a total layman looking at the numbers and all these nice calculations, I wonder: when did the climate decide to go by nice decades and centuries? Why not start in, e.g., 1895 and use, e.g., 11.3 years (the sun cycle) as a standard? Seems to me that with all sort of computer power available today this should be no problem. Question is, would anything different show up?
On the other hand, July was not that hot globally. I have bolded the July reading and the all-time record for each data set in the paragraphs below, for an easy comparison across the four sets that have their July numbers out.
With the UAH anomaly for July at 0.28, the average for the first seven months of the year is (-0.089 -0.111 + 0.111 + 0.299 + 0.289 + 0.369+ 0.28)/7 = 0.164. If the average stayed this way for the rest of the year, its ranking would be 9th. This compares with the anomaly in 2011 at 0.153 to rank it 9th for that year. On the other hand, if the rest of the year averaged the July value, which is more likely if the El Nino gets stronger, then 2012 would come in at 0.212 and it would rank 5th. 1998 was the warmest at 0.428. The highest ever monthly anomalies were in February and April of 1998 when it reached 0.66. In order for a new record to be set in 2012, the average for the last 5 months of the year would need to be 0.80. Since this is above the highest monthly anomaly ever recorded, it is virtually impossible for 2012 to set a new record.
With the GISS anomaly for July at 0.47, the average for the first seven months of the year is (0.34 + 0.40 + 0.47 + 0.55 + 0.66 + 0.56 + 0.47)/7 = 0.493. This is about the same as in 2011 when it was 0.514 and ranked 9th for that year. 2010 was the warmest at 0.63. The highest ever monthly anomalies were in March of 2002 and January of 2007 when it reached 0.88. If the July anomaly continued for the rest of the year, 2012 would end up 10th. In order for a new record to be set in 2012, the average for the last 5 months of the year would need to be 0.82. Since this is close to the highest monthly anomaly ever recorded, it is virtually impossible for 2012 to set a new record.
With the sea surface anomaly for July at 0.386, the average for the first seven months of the year is (0.203 + 0.230 + 0.241 + 0.292 + 0.339 + 0.351 + 0.386)/7 = 0.292. This would rank it 11th compared to 2011 when it was 0.273 and ranked 12th for that year. 1998 was the warmest at 0.451. The highest ever monthly anomaly was in August of 1998 when it reached 0.555. If the July anomaly continued for the rest of the year, 2012 would end up 10th. In order for a new record to be set in 2012, the average for the last 5 months of the year would need to be 0.67. Since this is above the highest monthly anomaly ever recorded, it is virtually impossible for 2012 to set a new record.
With the RSS anomaly for July at 0.292, the average for the first seven months of the year is (-0.058 -0.121 + 0.073 + 0.332 + 0.232 + 0.339 + 0.292)/7 = 0.156. If the average stayed this way for the rest of the year, its ranking would be 12th. This compares with the anomaly in 2011 at 0.147 to rank it 12th for that year. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. If the July anomaly continued for the rest of the year, 2012 would end up 10th. In order for a new record to be set in 2012, the average for the last 5 months of the year would need to be 1.10. Since this is above the highest monthly anomaly ever recorded, it is virtually impossible for 2012 to set a new record.
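For what it’s worth, the “required average for the last five months” figures above follow from straightforward arithmetic. A minimal sketch, using the UAH numbers quoted above, looks like this:

```python
# January-July 2012 anomalies for UAH, as quoted above.
ytd = [-0.089, -0.111, 0.111, 0.299, 0.289, 0.369, 0.28]
record_annual = 0.428          # warmest calendar year (1998) in this data set

months_left = 12 - len(ytd)
ytd_mean = sum(ytd) / len(ytd)

# Average the remaining five months would need for the 12-month mean to tie
# the record annual value.
needed = (record_annual * 12 - sum(ytd)) / months_left

print(f"Jan-Jul mean: {ytd_mean:.3f}")                       # 0.164
print(f"Required Aug-Dec average to tie 1998: {needed:.2f}")  # 0.80
```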
Good write up David!
Glad to still see you in action as well.
And the low tonight here in S.E. Washington state at elevation 800 ft. is predicted to be 48°F… and it’s August!
Great statistical/mathematical approach. Considering that a person generally cannot tell the difference of half a degree Fahrenheit, the differences in climatological normals by decade are insignificant. There is definitely no discernible upward trend.
In reply to Joe Prins (August 24, 2012 at 9:38 pm), that’s a good question. Yes, it may very well make a difference. I don’t work with weather/climate data but I’ve spent quite a long time looking at time series data, including discrete events, periodic sampling aggregates, and end state recordings. It is frequently handy to report on arbitrarily chosen intervals, but it is always a mistake not to do a thorough job of establishing that you do not introduce distortions when you do so.
A very simple example is to look at a bi-weekly payroll by month. You have roughly 26 pay events for each year of 12 months, meaning that a few months each year show a 50% higher dollar total than the rest. That is, of course, unrelated to the real world, unless you have budgetary requirements dictating using the one month interval. But IMHO that’s another kind of artificial constraint on things.
When doing any analysis using arbitrarily chosen intervals, a conscientious analyst will include a description of why that interval was chosen, what other intervals look like, and what the effect of using the particular interval is. I don’t see that very often.
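A toy illustration of that biweekly-payroll point, with made-up pay dates and amounts, showing how a calendar-month roll-up hands a couple of months a third payday and therefore a 50% higher total:

```python
from collections import Counter
from datetime import date, timedelta

# Hypothetical payroll: the same amount paid every 14 days through 2012.
payday, amount = date(2012, 1, 6), 1000
monthly_totals = Counter()
while payday.year == 2012:
    monthly_totals[payday.month] += amount
    payday += timedelta(days=14)

for month in range(1, 13):
    print(f"month {month:2d}: {monthly_totals[month]}")
# Most months total 2000, but two months total 3000 -- an artifact of the
# reporting interval, not of anything in the underlying payroll.
```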
If it has been established that “in two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trend,” then we are dealing with faked temperature data. There is no need to argue about anything else: they faked the data, and therefore they have lost the argument.
Why argue on the basis of biased assumptions imposed by those whose financial well-being depends on the bias? Shouldn’t we, first and foremost, insist that they use honest, unadjusted data, and that otherwise anything they say is scientifically meaningless?
We have had average heat here in NW Oregon this summer. We have also enjoyed a month with no rain, which has been great for the stone fruit, and according to the predictions I’ve heard, possibly two more weeks of dry, which would allow us to finish our stone fruit season in fine style. I doubt very much if anyone could classify this as drought, as we have had other summers with this type of weather and suffered no yearly loss of moisture. It is enjoyable, though. On the other hand, it does appear that we are in a cooling mode, as we have noticed for four years that our season for apples is shorter. We did not have an exceptionally warm July this year, certainly not the warmest. It was very pleasant, and I could enjoy more Julys just like it, I think.
It does not surprise me in the least that the American URBAN land temperature is higher than in 1936.
I’d be VERY worried if the URBAN land temperature wasn’t going up.
A colder climate would NOT be good.
Joe Prins says “When did the climate decide . . . ”
In 1935. The idea was to have a reference period that an adult could relate to. A person, say age 40 to 50, would have “normals” associated with his or her teenage and young adult years. It seemed to make sense to the folks that were interested in how weather records should be tallied and reported. There was no serious intention that these data would be used in the name of CAGW. I don’t have time to look up the reasoning tonight but you can have a go at it. Here is a start:
The World Meteorological Organization (WMO)
http://en.wikipedia.org/wiki/World_Meteorological_Organization
. . . was preceded by the (IMO)
http://en.wikipedia.org/wiki/International_Meteorological_Organization
and (I think) the manner of defining normals (as done in this WUWT post) was established in 1935 at the Warsaw Conference of Directors (see above link for some history).
Note that these are international organizations with many member countries, so many folks will have to agree to changes. Also note that this manner of defining and reporting weather data occurred before “all sort of computer power.”
[Back in the 1930s, computers were women: http://www.americanheritage.com/content/girl-computers ]
Given my belief that motor vehicle emission controls were an important contributor to the post 1970s warming, I thought I’d have a look at vehicle sales in the early 1930s.
Between 1929 and 1933 motor vehicle sales fell more than 75%. Farm tractor sales by even more. One can assume economic conditions resulted in greatly reduced vehicle usage.
Vehicles at this time produced a lot of cloud-seeding particulates. Reduce the particulates and clouds decrease and solar insolation increases, resulting in higher summer daytime highs.
Was the early 1930s warming anthropogenic, due to reduced vehicle usage during the Great Depression?
Do the 1930s warming and post 1970s warming have a common cause?
Can we conclude from the continental land summer temperature that the global temperature is doing the same?
Unlikely.
Global temperature is governed by the oceans, which cover roughly 70% of the surface. Since SST measurements are ‘good’ only in the last few decades, the long-term estimate is best obtained by observing the long trend of a specific land-area temperature that is governed by a nearby ocean, such as the CET.
http://www.vukcevic.talktalk.net/CET-July.htm
Long and short trends are absolutely clear.
Why are we so obsessed about recent record highs, trying desperately to show that there was one other moment (or month) in a hundred years or so that happened to be warmer? So what if that were the case? Are we afraid that our skeptic arguments are no longer valid or have been falsified by a record high?
PS.. I suggest that whenever anyone mentions Global Land Temperature, we all respond by adding the word “URBAN” .. in capitals.
That is what it mostly measures, after all.
The 30 year convention is arbitrary. You cannot establish what a statistically significant interval is if you do not know the long term correlations in the data. And, nobody does. There very definitely appears to be an approximately 60 year quasi-cyclic process evident in all the temperature sets. For such a correlated process, 30 years is just about the worst possible time interval to choose.
Let’s suppose this was the hottest July since 1936. So what? Something caused that hot July as well. It is impossible to rule out that that same something caused this one, possibly by interfering constructively with other processes which have previously caused a relatively hot July to appear.
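A quick synthetic illustration of the point about a ~60-year quasi-cycle and 30-year windows, using a pure 60-year sinusoid rather than any real temperature data: consecutive 30-year “normals” of such a signal swing between extremes, and a 30-year linear fit across a rising half-cycle shows a large “trend” even though the long-run mean is flat.

```python
import numpy as np

# Purely synthetic: a 60-year cycle with 0.5-unit amplitude and no long-run trend.
years = np.arange(1900, 2021)
signal = 0.5 * np.sin(2 * np.pi * (years - 1900) / 60.0)

# Consecutive 30-year "normals" alternate from one extreme to the other.
for start in range(1900, 2020, 30):
    window = (years >= start) & (years < start + 30)
    print(f"{start}-{start + 29} mean: {signal[window].mean():+.3f}")

# A 30-year window over a rising half-cycle produces a strong apparent trend.
window = (years >= 1945) & (years < 1975)
slope = np.polyfit(years[window], signal[window], 1)[0]
print(f"1945-1974 linear trend: {slope * 10:+.3f} per decade")
```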
Bart,
My background is range science, where estimating standing pasture yield is a common measurement. Commonly done with a sampling frame (quadrat) and a set of clippers.
And sampling statistics say that your quadrat should be long enough to span at least the smallest pattern in the vegetation (see Greig-Smith, Quantitative Plant Ecology, for more of a start).
So, IMO, you should not be using a 30-year quadrat where there is a known 60-year pattern.
And Bart, it may very well be that the so-called “hottest” comes from a much greater concentration of urban or otherwise significantly affected stations, and the “unaccounted for” loss of all the rural stations JUST AT THE RIGHT TIME TO PERPETUATE THE MYTH. How darn convenient was that!!!
Is it really warmer?
After all the ADJUSTMENTS, and all the massive urbanisation since 1936, do we really know?
I DON’T THINK SO!!!
Chris Schoneveld says:
August 25, 2012 at 1:17 am
To be fair, the skeptic argument is more that there is not a valid, testable AGW argument (especially when it is based on potentially quite flawed data).
But further, to invoke that a current hottest day/week/month/year indicates AGW is also completely false, as the correlation is simply unproven (especially when we know the land temp data is largely flawed).
As I see it, David’s post also shows how the alarmist use of the (flawed) data is misleading.
The whole frickin thing is one circular argument, with no testable, measurable indicator of a distinct anthropogenic signal above and beyond the natural temp variability. Saying such a signal can be seen in urban areas is completely invalid due to UHI effects. Hence the data gridding and averaging and adjusting to ‘see’ the so-called anthropogenic signal, and round we go again!
My personal take is that the majority of any global warming (if significantly present) will be largely natural and, moreover, likely to be within the natural climate variability; a large step change would be required to truly determine any such change above and beyond the natural variability. 30 years, 50 years, 100 years are all insufficient to detect a definitive change in temps based on the current data we have available.
Sure, many thousands of identical sensors, in identical (rural) types of location, spread at carefully determined spatial and altitudinal settings, might, after a few decades, give us a real indication of some upward trend, but even then, that could still be natural variability! Anyone who thinks the climate is, or rather ‘was’, in some kind of ‘static’ zone in pre-industrial times is delusional in the extreme. One only has to look at ice ages to know that any such ‘static’ zones are actually only ‘temporary’.
Bart says: August 25, 2012 at 1:29 am
……
Agree. If the long-term trend is eliminated, then it is possible to estimate the natural variation (oscillation), which indeed has a variable period of roughly 60 years. Currently the N. Atlantic SST appears to be at the top of such a cycle; the peak-to-peak amplitude is 0.6 degrees C, which is reflected in a fractionally higher difference of 0.7 degrees C for the entire Northern Hemisphere.
http://www.vukcevic.talktalk.net/GSC1.htm
Natural variation has a number of explanations, none with universal acceptance.
Jesse Owens’ world long jump record was unbeaten for 25 years. The US July temperature record has held (if the measurements are accurate) for 76 years.
Chris Schoneveld: “Why are we so obsessed about recent record highs, trying desperately to show that there was one other moment (or month) in a hundred years or so that happened to be warmer?”
Well, the idea is that you can win hearts and minds by trying to link high temperatures and global warming together. Hence the NCDC article title, Ostro’s tweet, and the newest cover of National Geographic. Such non sequiturs are a very effective strategy, as any incumbent politician will note.
Ah, wait. You meant: why would skeptics be skeptical of non sequiturs? Who knows; they probably blame it on the rain.
Proving, once again, that you can’t, scientifically, have the flam without the flim.
I find it humorous that, in analyzing the “hot and dry” summer of 2012 in the Midwest (Kansas City), this year ranks in the top ten for hot and dry according to the NWS, until you get to August. Yes, August has been dry, until today (raining!!), but August is only the 85th warmest on record, out of about 130 years. Those Dust Bowl years are still at the top, depending on how the data are sliced.
My Mother in NW Kansas lived through the Dust Bowl. This is not Dust Bowl conditions by any stretch. It was partially exacerbated by poor farming practices. But boy, the particulates in the air were amazing: black muddy rain when it rained, headlights on in the midday, etc. My Father, on the other hand, lived in NE Oregon (cowboy-logger). It was dry here too, but the worst year in his memory was 1933, when the Tillamook Burn and several other big fires happened. Then it was their turn to drive with headlights on in midday. Though it’s a “Far Piece” from Tillamook to the High Lonesome of NE Oregon…
Joe Prins says:
August 24, 2012 at 9:38 pm
Question is, would anything different show up?
As we are aware, Hadcrut4 has replaced 1998 as the hottest year with 2005 and 2010 being warmer. The average anomalies for these three years are as follows according to the woodfortrees numbers: 0.523, 0.535 and 0.5375 respectively. However when one digs a bit deeper, an interesting fact emerges. The hottest consecutive 12 month period is still from the previous century. The hottest 12 month period around 1998 is from September 1, 1997 to August 31, 1998. Here, the anomaly according to Hadcrut4 is 0.5675. 2005 is not changed by adding or subtracting months. However for the period around 2010, the hottest 12 month period is from August 1, 2009 to July 31, 2010. And for this period, the average anomaly is 0.565, which is 0.0025 below the 1998 value. Of course I am NOT going to suggest any significance to this, just like there is no significance to 2010 being 0.0145 warmer than 1998 with the error bar being about 0.1. But it is something to keep in mind in case someone comments that 2010 was the warmest year due to the “fluke” of how our calendar is constructed.
You can also see it here that 1998 was warmer by a line width:
http://www.woodfortrees.org/plot/hadcrut4gl/from:1980/mean:12
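If anyone wants to check the consecutive-12-month comparison above, here is a minimal sketch. It assumes the monthly HadCRUT4 global series has been saved as a two-column CSV (date, anomaly), e.g. exported from woodfortrees; the file name and column names are placeholders, not an official format:

```python
import pandas as pd

# Placeholder file: monthly HadCRUT4 global anomalies, columns: date, anomaly.
h4 = pd.read_csv("hadcrut4_monthly.csv", parse_dates=["date"])
h4 = h4.set_index("date")["anomaly"]

# Rolling 12-month means; each value is labelled by the last month it covers.
roll = h4.rolling(12).mean().dropna()

best_end = roll.idxmax()
print(f"Hottest consecutive 12 months end in {best_end:%B %Y}, "
      f"mean anomaly {roll.max():.4f}")

# Compare specific windows: Sep 1997-Aug 1998 vs. Aug 2009-Jul 2010.
for end in ("1998-08", "2010-07"):
    print(end, round(roll.loc[end].iloc[0], 4))
```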