'Record temperatures' placed in context with station history

DayRec: An Interface for Exploring United States Record-Maximum/Minimum Daily Temperatures

Essay by Greg Kent

Foreword: There is a new resource for obtaining high/low temperature extremes. The DOE has released the DayRec tool, which imposes requirements for long and complete station records. I think this is a nifty source of information for cutting through the BS, since so many stations have short histories or very incomplete records. I've done a little analysis of the data as it is presented, to see the distribution of max/min records by decade over the last century for the 424 stations with the most complete records. But there is a lot more that could be done with the data by someone who can use tools more sophisticated than pivot tables. I'm hoping that your regular readers will find this useful.

The NCDC website http://www.ncdc.noaa.gov/cdo-web/datatools/records tracks the number of “record” daily high TMax or low TMin temperatures for each of the US stations.

The NCDC has a rather loose requirement for which stations can be counted as setting records. Their website is having some problems with the map plotter at the moment, but if I recall correctly, a station can be included if it has a 30-year record and each year has coverage of at least 182 days. Therefore, any station that has recorded temperature data for at least half of each year since 1984 can be counted as setting a new record.
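
For illustration, here is a minimal sketch of that eligibility test in Python, assuming the thresholds I recall above (30 years, at least 182 days of coverage per year) are right; the obs_days_per_year mapping is hypothetical:

```python
def ncdc_eligible(obs_days_per_year, min_years=30, min_days=182):
    """Loose NCDC-style test: a station qualifies if at least min_years
    of its years each have min_days or more days of data."""
    full_years = [yr for yr, days in obs_days_per_year.items()
                  if days >= min_days]
    return len(full_years) >= min_years

# A station reporting 200 days a year since 1984 would qualify.
print(ncdc_eligible({yr: 200 for yr in range(1984, 2014)}))  # True
```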

Such a loose definition, however, is likely very misleading to the media, politicians, and the general public. When most people hear that a new "record high temperature" has been set, they usually have a longer period of time in mind. I suspect that most people would react differently to an announcement of a "record high temperature" if they understood that the records began when Ronald Reagan was president. Of course, it is convenient for political reasons to maximize the number of "record high temperatures" because it fits the mandatory narrative about global warming or climate change.

Skeptics have been quick to point out that many of the stations in question have records spanning only relatively short periods, so these so-called record temperatures are relatively meaningless in the context of the longer time spans that most people have in mind. Skeptics are also quick to point out when even the skewed NCDC records go the wrong way. For instance, the fact that 2013 had more daily record lows than daily record highs was fairly well communicated on skeptic blogs. (BTW, this trend has also continued for January and February of 2014, with more record lows than record highs.)

But even if the NCDC's records website is of limited value, it would still be interesting and informative to understand how "extreme" recent years' temperatures have been in the context of a longer time period. And wouldn't it be useful if this were an "official" set of temperature records that everyone (warmists and skeptics alike) could agree on? In fact, such a tool has recently been released.

The Carbon Dioxide Information Analysis Center of the Department of Energy has released a tool that is intended to provide more meaningful analysis of extreme temperatures over time. The DayRec tool, as it is known, is described at the following link: http://cdiac.ornl.gov/climate/temp/us_recordtemps/dayrec.html

What makes this tool and its corresponding dataset useful is that it has more stringent (and realistic) criteria for determining which stations should be tracked for record temperatures. Instead of the 30 years used by NCDC, the DayRec tool requires a 100-year temperature record, from 1911-2010. This is a much more meaningful timeframe to use when speaking about records. The DayRec tool also has stringent criteria for the amount of data that can be missing. Only a small amount of missing data is permitted, and those "missing observations [must] be spread out relatively evenly over time (both seasonally and over the full period of record), so as to avoid time-dependent bias that could on its own give misleading impressions of changes in the frequency of record-setting temperatures over time at a given station." Only those GHCN stations that meet these criteria are included in the DayRec analysis. As a result, only the stations with the longest, most complete, and least distorted records are considered for extreme high and low temperatures. This makes the DayRec records much more meaningful than the NCDC records website.
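
As a rough illustration of this kind of filter (not CDIAC's actual procedure), a completeness check over 1911-2010 might look like the sketch below; the 97.6% threshold matches the expanded station set described next, and the input is assumed to be one station's observation dates:

```python
import pandas as pd

def passes_completeness(obs_dates, start="1911-01-01", end="2010-12-31",
                        min_frac=0.976):
    """Keep a station only if it reports on at least min_frac of all days
    in the 1911-2010 window (the seasonal-evenness check is omitted)."""
    window = pd.date_range(start, end, freq="D")
    obs = pd.to_datetime(pd.Series(obs_dates)).drop_duplicates()
    observed = obs.between(start, end).sum()
    return observed / len(window) >= min_frac
```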

There are two versions of DayRec. The first version was released in 2012 and had very stringent missing-data requirements (98.5% complete data). It contained 200 stations, but they were not geographically well distributed. The desert southwest, for example, was completely excluded. In November 2013, a new version was released containing 224 additional stations. The aggregate 424 stations provide more thorough geographic coverage, but with slightly more missing data allowed (97.6% complete data). This link shows the geographic distribution of the stations:

http://cdiac.ornl.gov/climate/temp/us_recordtemps/ui.html

[Figure: map of the 424 DayRec station locations across the lower 48 states]

The blue markers indicate the original 200 stations with the most complete records. The additional 224 stations, with less complete records, are shown as green markers; these help fill in the gaps. The 424 stations cover most of the lower 48 states (every state except Connecticut and Delaware).

The primary intended use appears to be through a GUI that lets you select a single station and generate reports. However, data files are available with all of the records for each day at each station. The files and layouts are located here:

http://cdiac.ornl.gov/ftp/us_recordtemps/sta424/ and http://cdiac.ornl.gov/ftp/us_recordtemps/sta200

The DayRec tools provided on the website are intended to analyze daily record temperatures for a single station. Using the data files, however, makes it possible to analyze all of the stations together.
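
For anyone who wants to repeat the analysis below, the general shape of it in Python/pandas is sketched here. The file naming and column layout are hypothetical (check the layout descriptions at the FTP links above before relying on them); the point is simply to get one row per standing daily record across all stations:

```python
import glob
import pandas as pd

# Hypothetical layout: one file per station listing, for each calendar
# day, the year the standing record was set and whether it is a
# record high ("hi") or record low ("lo").
frames = [pd.read_csv(path, names=["station", "month", "day",
                                   "record_year", "record_type"])
          for path in glob.glob("sta424/*.txt")]
records = pd.concat(frames, ignore_index=True)
```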

When the record daily high temperatures across all 424 stations are analyzed, it's easy to determine whether it has gotten hotter in recent decades. When record temperatures are put into buckets by decade, it's easy to see that the 1930s was by far the hottest decade across the continental United States. The number of record highs in the 1930s is double the number of record highs in the 2000s. The 2nd, 3rd, 4th, and 5th hottest decades were also in the early period, 1911-1960. The most recent decade, which has been affected the longest by global warming and is supposed to have the most "climate change", is fairly anemic in 7th place with respect to record high temperatures. At least in the United States, the 2000s were not very extreme at all.
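
With the hypothetical records table from the sketch above, the decade bucketing is a couple of lines; the buckets run 1911-1920, 1921-1930, and so on, to match DayRec's 1911-2010 period:

```python
highs = records[records["record_type"] == "hi"]
decade = (highs["record_year"] - 1911) // 10 * 10 + 1911  # decade start year
print(decade.value_counts().sort_index())  # standing record highs per decade
```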

The same message holds for record high temperatures during the summer months. When the average person hears about global warming and record high temperatures, their mind likely goes to a sweltering July day. The good news is that the high temperature records for the summer months (JJA) are even more skewed toward the older decades. Whereas 60% of all daily record highs occurred before 1960, 70% of all record highs for the summer months occurred before 1960. The hottest days at the hottest time of the year were much more likely to occur in the past than in the present. The 1990s and 2000s set relatively few summer-time record highs; the vast majority of high temperature records in recent decades were set in the cold or cool seasons. For example, in the 1990s, nearly twice as many daily record high temperatures were set in the winter months (4,708) as in the summer months (2,473). To the extent that global warming is happening, it is raising winter temperatures a few degrees, which is something most of the public would not be terribly concerned about. In fact, many folks would consider this a good thing.

[Figure: daily record high temperatures by decade]
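
The seasonal split is the same aggregation with a month filter added; again a sketch on the assumed column names:

```python
seasons = {12: "DJF", 1: "DJF", 2: "DJF", 6: "JJA", 7: "JJA", 8: "JJA"}
by_season = highs.assign(
    season=highs["month"].map(seasons),
    decade=(highs["record_year"] - 1911) // 10 * 10 + 1911)
print(by_season.dropna(subset=["season"])
               .groupby(["decade", "season"]).size().unstack(fill_value=0))
```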

Taking a more granular look at individual years gives largely the same picture. The year with the most record high temperatures was 1936. The next few hottest years were also in the 1930s. Analyzing the 15 years with the most high temperature records shows that every single one of them occurred before the era of global warming began in the 1970s. The more recent years, which are supposed to have been particularly extreme, are dwarfed by the number of records set in 1911, 1925, and the 1930s. On a century-long scale, recent years look neither particularly hot nor particularly extreme.

[Figure: daily record high temperatures by year]
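
The per-year ranking behind this chart is one more line on the same assumed frame:

```python
# The 15 years holding the most standing daily record highs.
print(highs["record_year"].value_counts().nlargest(15))
```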

What about record low temperatures? The data show that there has been a reduction in the number of record lows in recent decades, especially in the winter months. The last two decades, for example, had very few record low temperatures. In other words, winter nights have become a little warmer over the last two decades. Most people would likely consider this a good thing, or at least not something to be especially worried about. (Note: the number of low temperature records is plotted as a negative number to differentiate them from high temperature records.)

[Figure: daily record low temperatures by decade, plotted as negative counts]

Comparing the number of record highs and record lows across decades actually gives a good clue as to what "global warming" actually means in the continental United States. Global warming does not mean that summer days are getting warmer in comparison to the last century. Instead, it means that winter nights are getting less cold and are therefore setting fewer low temperature records. In the 2000s there were half as many record cold winter night-time lows (1,693) as there were record hot summer day-time highs (3,151). In the context of other decades, this is not because there were a lot of high temperature records – as discussed above, the number of high temperature records in the 2000s was rather low by historical standards. Rather, it is because the number of low temperature records was disproportionately low.
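
The decade-by-decade comparison of highs to lows falls out of one more groupby on the assumed frame:

```python
counts = (records
          .assign(decade=(records["record_year"] - 1911) // 10 * 10 + 1911)
          .groupby(["decade", "record_type"]).size().unstack(fill_value=0))
counts["hi_per_lo"] = counts["hi"] / counts["lo"]  # ratio of highs to lows
print(counts)
```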

The DayRec website notes that "changes through time in record high and low temperatures (extremes) are also an important manifestation of climate change." It goes on to discuss the Meehl et al. (2009) finding that "twice as many high temperature records are being set as low temperature records." Analyzing the complete DayRec data set sheds light on this finding. The common interpretation of Meehl et al. is that the number of record high temperatures has increased. The DayRec data (the longest-term, most complete data sets available) show that this is incorrect. The reason for Meehl's finding is that although the number of record high temperatures is lower than in the first half of the century, the number of record low temperatures has decreased even further in recent years.

To illustrate the types of reports that are available, the DayRec website includes a sample plot of high and low records for one example station. The text says: “Decadal frequency of record-setting Tmax’s and Tmin’s for Fort Collins, Colorado. The plot reflects record temperatures set over all days of the year. Note that over 2001-2010 there were just three record-low Tmin’s set, while there were 104 record-high Tmax’s set.” The plot is shown below:

[Figure: decadal frequency of record-setting Tmax's and Tmin's for Fort Collins, Colorado]

The Fort Collins graph shows a definite stair-step pattern, with an increasing number of high temperature records and a decreasing number of low temperature records. This plot corresponds remarkably well with the global warming/climate change narrative, and the stair-step rapid rise in the number of high records in the 2000s is almost frightening. Two relevant facts, though, can put our minds at ease. First, Fort Collins (station ID 53005) was rated by the Surface Stations project and assigned a CRN rating of 4, which indicates poor siting that can distort temperature readings by up to 2 degrees. Second, the distribution of high temperature records at this station is not representative of the entire population of 424 sites. Comparing the Fort Collins graph to the earlier graph of all 424 stations shows how extreme the Fort Collins temperature record is. As mentioned previously, the decade of 2001-2010 ranked 7th overall in terms of the number of high temperature records. Clearly, Fort Collins is an outlier: it has the 5th highest number of record highs in the 2000s (104) out of all 424 stations. In comparison, the average (mean) number of high temperature records in the 2000s across all 424 stations is 39, and the median is 35.
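
Those summary numbers are easy to recompute from the assumed frame:

```python
in_2000s = highs[highs["record_year"].between(2001, 2010)]
per_station = in_2000s.groupby("station").size()
print(per_station.mean(), per_station.median())  # ~39 and 35 in my analysis
print(per_station.nlargest(5))  # Fort Collins (104) ranks 5th
```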

The DayRec website states that the public "would benefit from additional ways to get climate extremes information … and assess it." That is certainly true. The trouble for the warmist establishment is that the data go against the common global warming/climate change narrative. Of course, that is true only if you consider the data on extreme temperatures for all 424 stations. An analysis of the original 200 stations included in the DayRec database tells largely the same story. This is in contrast to the impression given by the Fort Collins plot shown on the webpage. It is intriguing that this particular station, which fits the global warming narrative so nicely, was chosen for the webpage even though it is an outlier. One wonders what might have motivated this choice.

*************************************

Further avenues for analysis for the DayRec stations:

1) Updating the data with records set in 2011-2013 (2012 will likely add a bunch of highs, but 2013 a bunch of lows)

2) Looking at the full record for these 424 stations and calculating a yearly temperature and trend vs. the “official” NCDC contiguous US temperature trend

3) Looking at the missing data and in-filling. As pointed out in a recent paper, skipping missing data implicitly assumes that the missing data is the average of the data that is present. In-filling where possible might give a more accurate result (a minimal sketch follows this list).

4) Correlating these 424 station histories with their CRN score (or Leroy ratings) and seeing what the relationships are
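
On item 3, here is a minimal sketch of one possible in-filling approach, assuming a single station's daily temperature series indexed by date; time-weighted interpolation over short gaps is just one choice among many:

```python
import pandas as pd

def infill(daily_temps, max_gap_days=5):
    """Fill short gaps in a daily temperature series by time-weighted
    interpolation; gaps longer than max_gap_days stay missing."""
    regular = daily_temps.asfreq("D")  # insert missing calendar days as NaN
    return regular.interpolate(method="time", limit=max_gap_days)
```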

43 Comments

richardk
March 10, 2014 10:29 pm

If a temp record falls + or- do the trees hear it? Ask MM.

Mike T
March 10, 2014 10:43 pm

Excellent article, although it could do with some proof-reading/editing. It would be interesting to see a similar decadal analysis done for Australia (which has a surprisingly good data set, at least back to about 1910). Brings to mind state capital-based news reports which excitedly exclaim “hottest February day in MelPerBrisSyd for two years”. Yeah, right.

Keith W.
March 10, 2014 10:56 pm

Got to love the Google maps plot of the location nearest to my home and work, Hernando, MS. The highest magnification on satellite view put the sensor right in the middle of a swimming pool in the backyard of a residence.

Streetcred
March 10, 2014 11:21 pm

Mike T … we’ve been having cool days here in Brisbane for over a month already … MSM is schtum! February wasn’t at all hot … normally +30C and humid but can’t say it was near this for much of the month.

Mike T
March 10, 2014 11:36 pm

Streetcred, I’m not in a capital city, but in outback WA we were 3 degrees for max and 2 degrees for min warmer than average for Feb, both Jan and Dec also warmer than average. Roll on winter, I say!

Rob
March 10, 2014 11:45 pm

Surprising gaps in the Gulf Coast region. Especially considering that I’ve got stations with excellent data. Makes me wonder!!!

asbot
March 11, 2014 12:38 am

@ rob, I am not at all an expert on the region, but thinking about its geography – a wet, swampy (and in those days definitely no flood control, etc.), unhealthy malarial area – I think there might not have been many weather stations 100 years ago. And if I understand the parameters of the project correctly, I think that is a similar reason the SW was left out of the initial statistics as well; maybe there is just not enough long-term data. Help me out, I am a bit dense on the SE part of the USA. Thanks, Asybot.

Another Ian
March 11, 2014 12:45 am

Streetcred and Mike T
Note that the current BOM usual data set is 1913 onward.
Which conveniently leaves out the Federation drought.
And I’ll back the thermometer readers of that era vs those of now when it comes to reading a glass thermometer.

March 11, 2014 12:51 am

If in 10k years winter will start in June, then there will always be "record high temperatures" as the seasons track through the months? What a mess of the climate and the prospects for agriculture and nature that all will make? In 5k years' time, when winter starts in March, one might expect some violent climate flipping and extreme weather?

Patrick
March 11, 2014 12:54 am

“Mike T says:
March 10, 2014 at 10:43 pm”
I recall from ClimateGate that there were issues with data from Australia. My memory could be faded about that, but I am pretty sure the person inputting data was frustrated at the lack of integrity and simply made it up… or comments to that effect!

Henry Clark
March 11, 2014 12:55 am

As indirectly suggested by this and much other data, an unfortunate aspect is that, when the next LIA comes, it won't just be about milder summers but about really cold winters unseen since the early 20th century, or rather worse (moderated somewhat by agricultural irrigation land use changes, but not enough to stop it).
The pattern in the plots mainly fits the usual: warming in the 1920s-1930s leading to high warmth and low cold in the 1930s-1950s, then cold driving the global cooling scare in the 1960s-1970s, and then the warmth which allowed the global warming scare.
It is a double-peak history, although it is clearer in a plot of average temperatures (if non-revisionist) than in these min/max plots.
While the 1930s U.S. TMax records were presumably extra enhanced by the agricultural Dust Bowl, overall that double-peak history for the U.S. (and, as separately illustrated subsequently, arctic, Northern Hemisphere, and non-revisionist global temperatures) coincides with the pattern in solar activity over the same period, as seen in http://img213.imagevenue.com/img.php?image=62356_expanded_overview3_122_1094lo.jpg

steveta_uk
March 11, 2014 1:01 am

The number of record highs in the 1930s is double the number of record highs in the 2000s.

Someone is being stupid here (possibly me). Surely the number of record highs tells you nothing about whether the 1930's was warmer than the 2000's – only that the 1930's saw those high temperatures for the first time. In the limiting case, it would only take one single record high in the 2000's, with all other measurements identical, for the 2000's to be a hotter decade overall.
Can you not compare actuals, rather than records?

ROM
March 11, 2014 1:24 am

I hope that somebody downloads the data from every single one of those 424 stations, and does it pronto.
Based on the past record of the more extremist elements of the global warming faith, they are highly likely to get in amongst that data and do some serious “adjustments” to ensure the unchallengeable status of the current climate catastrophe meme.
As a layman science promoter and supporter for nigh on half a century now I guess I am a bit sad that I have come to the point where I no longer trust the actions, the motives and even the ethics of the majority of scientists in a whole section of science, that of climate science.

Henry Clark
March 11, 2014 1:31 am

steveta_uk:
Indeed it was easier to set a record-so-far high back in the 1930s than afterwards. That (along with the Dust Bowl) is presumably a contributor to why the 1930s appear more exceptional in the U.S. TMax plot than they do in most other plots. What a plot of average temperature history shows is a double peak over the past century and a hot but not *quite* so extreme 1930s in context.
Can you not compare actuals, rather than records?
Yes, I wouldn’t look at records *alone*, with them an extra demonstration but best in the context of other data as well.
Some average temperature plots can be seen in the link in my prior comment, around a third to half way down in the large image.

March 11, 2014 2:08 am

Of all the 400+ stations they could have chosen to demonstrate the capabilities of the system, isn’t it really surprising that they just happened to choose 4-rated Fort Collins that shows a strong GW signal? I’m totally amazed by that random selection of a temperature record, which is completely unexpected. /sarc off

D Carroll
March 11, 2014 3:04 am

I'm no mathematician/statistician, but this seems to me to be completely misleading.
If one takes a 100 year period, the second year of records has to break a record!! Either high or low!! The next year has a 50/50 chance of going either way?! So in the early years of records, they'll be broken all the time. It's only after many years that it tends to even out.

jhapp
March 11, 2014 3:41 am

Records make more sense if you run time in reverse.

Bloke down the pub
March 11, 2014 4:00 am

‘One wonders what might have motivated this choice.’
You old cynic.

Editor
March 11, 2014 4:04 am

D Carroll
If one takes a 100 year period, the second year of records has to break a record!! Either high or low!! The next year has a 50/50 chance of going either way?! So in the early years of records, they'll be broken all the time. It's only after many years that it tends to even out.
These are not daily records set at the time, and beaten since. They are the daily records as of now.
The only question I would ask is if subsequent ties are included.

Bruiser
March 11, 2014 4:14 am

Australian temperature stations with recordings that go back to the late 19th century usually show 1889 as the hottest year. The Cape Otway Lighthouse has records dating back to 1865; all of its hottest 95th-percentile readings occur prior to 1890. The lighthouse is unaffected by agriculture, industry or urban encroachment. New records are frequently claimed for towns where the weather station has been moved to the airport and the old records are ignored.

Mike T
March 11, 2014 4:50 am

Patrick: to my knowledge, there is nothing wrong with the data, as with all data it’s the way it’s handled that can be problematical. I suggested 1910 because before that, temperature data becomes “iffy” due to instrument exposure issues.

Leigh
March 11, 2014 4:53 am

In reply to Mike T .
If you go here you’ll see we have considerable problems with our BOM.
http://joannenova.com.au/2012/03/australian-temperature-records-shoddy-inaccurate-unreliable-surprise/
JoNova as well as a few others have been on the BOM’s back for years to explain themselves.
So far they have resisted demands to have the Australian government's independent auditor audit what they have done with our temperature records.

F.A.H.
March 11, 2014 5:03 am

I got curious about records applied to the year to year changes in anomalies. I took a simple difference of (year n+1 anomaly – year n anomaly) in HADCRUT4, then I took averages of those differences over each decade, starting with 1904-1913. The result for the 11 decades of the past century plus one, sorted in descending order of average year to year change in anomaly, are
2003 0.036
1943 0.027
1923 0.0164
1983 0.0128
1953 0.0104
1913 0.0042
1973 0.0013
1933 0.0002
2013 -0.0044
1993 -0.0048
1963 -0.0051
It looks like the current decade was the third most cooling decade on record for the past 110 years. Of the last three decades, two have been among the three most cooling decades on record. 2003 was the most warming decade on record. It looks like the warming has come and gone.

Mike T
March 11, 2014 5:04 am

Another Ian: “And I’ll back the thermometer readers of that era vs those of now when it comes to reading a glass thermometer.” No reason why you should. In any case temperature nowadays is taken from electronic probes, not mercury in glass, or alcohol in glass thermometers. I do have a slight issue with the probes, in that they are more sensitive than say mercurial maximum thermometers, and will give a slightly higher reading. I don’t know if that’s been corrected for over the long term, but it quite possibly has, and may be one of the reasons data is “massaged”. I don’t see anything wrong with “matching” data sets to account for varying instrumentation. As for data before 1910, it’s not considered to be “high quality data” because of instrument exposure issues.

Smoking Frog
March 11, 2014 5:06 am

Paul Homewood says:
March 11, 2014 at 4:04 am
These are not daily records set at the time, and beaten since. They are the daily records as of now.
Then the essay ought to say so. Do you know for a fact that they are records as of now, or are you only inferring it from the fact that otherwise the essay would be worthless? I don’t like asking such an insulting question, but if they are records as of now, I find it baffling that the essay does not say so. Is this “as of now” something that is widely known? I doubt it.

Editor
March 11, 2014 5:52 am

Smoking Frog
Do you know for a fact that they are records as of now,
Yes. If you check the Fort Collins example, you get (at an eyeball count) around 450 high max records, whereas if you counted every time a record had been set, you would probably be into the thousands. (This also answers my question – yes, it does include ties.)
You can check this out with an individual database, such as this one in Alabama.
http://cdiac.ornl.gov/cgi-bin/broker?id=013816&_PROGRAM=prog.tempdoy_data_d9k_424.sas&_SERVICE=default&_DEBUG=0&tempvar=htxctx
Click on Data File, and you get the list of daily records. There is one for each day, except where there are ties.
In other words, these are all time daily records.

mellyrn
March 11, 2014 6:28 am

fwiw, the world’s all-time hottest temperature, 56.7C, 134F, was set in 1913, and remains unbroken. Second-hottest appears to be 55C, 131F, in 1931. Odd that catastrophic warming would not yet have broken either.
And the coldest, -89.2C, -128.6F, was set in 1983; the second-coldest, -82.5C, -116.5F, in 2005.

Patrick (the other one)
March 11, 2014 6:51 am

Interesting note on this page:
http://cdiac.ornl.gov/climate/temp/us_recordtemps/ui.html
Note: Currently, DayRec does not include any stations in Connecticut and Delaware.
Also, New York has 9 stations listed, but not from the areas I expected.

Steve Keohane
March 11, 2014 6:59 am

Interesting you chose Ft Collins, CO as an example. Having moved there in January of 1972, with about 24,000 people, it is now 150,000. Something people may not understand back east and in other high-density population areas where towns are joined together: often most of the population is in the county, just outside town limits. So population growth is not limited to the town proper, and would cause something of a UHI effect beyond, or different from, a scenario of just a growing town. I now live on the western slope, in a valley with four towns of about 5,000 people each. Another 30,000 people live in the county, outside of the towns, in a 75 sq. mile area.

Evan Jones
Editor
March 11, 2014 7:14 am

Numbers of hi-lo records are not necessarily indicative:
1.) Early on in a series there will tend to be far more records (high or low) than later, for reasons that should be obvious.
2.) If we are on a high, but non-increasing, plateau, we will tend to see more warm records than cold, because from up on that "plateau" you need just a teeny nudge up to get a warm record, but a big drop to get a record cool.

Jeffrey
March 11, 2014 7:27 am

In my opinion, this whole “record temperatures” fooferaw is about warmie PR, not science – because quantity is removed, and science is only quantity. After all, a “record” could be anywhere from a millidegree to a kilodegree above the previous “record,” so what does a “record” tell us about the heat energy content of the biosphere?

Evan Jones
Editor
March 11, 2014 7:27 am

Patrick: to my knowledge, there is nothing wrong with the data, as with all data it’s the way it’s handled that can be problematical.
It can be both, of course.
I suggested 1910 because before that, temperature data becomes “iffy” due to instrument exposure issues.
Around four out of five of today's USHCN stations (the "crown jewel" of the HCN) are "iffy" because of instrument exposure issues.
Furthermore, the data from a particular station is worse than useless without the accompanying metadata. If there is no record of TOBS and station moves, the data from the station essentially conveys no meaning.
But the problem arises that there is no decent or consistent metadata for the USHCN going back more than 40-50 years or so. We don’t know how well they were located. We don’t know the time of observation history (utterly critical). And, for that matter, we don’t know the history of station moves. And the USHCN is the best. For the rest of the world (except maybe Aus/NZ), it’s even worse.
So they are pulling those earlier trends out of their behinds, anyway.

Pachygrapsus
March 11, 2014 8:17 am

With fewer records in either direction being set in the most recent decade, doesn’t that firmly contradict the recent press releases stating that we’re seeing more extreme weather? Every time I hear the “warm means cold” explanation, I think that it’s an extraordinary claim that requires extraordinary proof. This data seems to show that it’s just not true.

Editor
March 11, 2014 8:33 am

The lack of “cold records” reflects the fact that many of the stations only date back to after the war.
Therefore, they include plenty of cold records from the 1960’s and 70’s, but have none of the hot records from the 1930’s and 40’s.

Richard M
March 11, 2014 9:24 am

The lack of more cold records could easily be due to UHI. UHI is much stronger at night (where I live) with very little seen at the daytime highs. It’s too bad something like this doesn’t exist for global data. The alarmists will simply dismiss this as regional.

David A. Evans
March 11, 2014 1:01 pm

steveta_uk
As has been noted before, yes, records are easier to set at the start of a series than later. However, what is important is that these records have not been broken; they are standing records, not just records that were broken at the time.
I do understand your confusion, however, as every measurement at the start of a time series is a record.
DaveE.

David A. Evans
March 11, 2014 1:11 pm

D Carroll
These are standing records, not since beaten
DaveE.

D Nash
March 11, 2014 2:23 pm

Not sure how useful the number of max records in a year/decade really is. Saying that more records were broken on a daily basis at the beginning of the record-keeping than at the end doesn't seem to say much (am I missing something here?). As noted above, you will see many more early on and fewer as the max temp gets higher. I can see that if you list by year/decade the number of existing record Tmax values of the 424 stations, so that there are 424 (excluding ties) data points, you might get some information from the numbers. Alternatively, showing how many days the Tmax was above the average for the whole period might be of interest.

Mike T
March 11, 2014 5:30 pm

I note links above to BOM data sets which have issues. The rounding which occurred in the pre-Celsius era is of great concern, and is new to me. I note also that the BOM's own stations appear in the "good" column, i.e. no rounding, either in F or C. Some of those stations are "climate reference stations"; I have no idea what other (i.e. "volunteer" or "contract") stations are used for calculating averages etc. Presumably, stations with evidence of rounding would be left out. As for the metadata, all stations have a filed paper history, but not all are accurate, as early record-keeping was not always efficient.
I do have an issue with Oz maxima, especially for the eastern states. It’s not well known, but maximum thermometers aren’t reset until 9am (which is actually 8am, in DST) so one could have a mild day followed by a hot one. It’s not unusual for the temperature at 8am the following day to be higher than the previous day’s max temp, by 9am it’s well above… the maximum to 9am is recorded for the previous day. So, as an example, it’s 26C max on Tuesday, by 9am Wednesday the temp is 29C… that’s what goes into the record for Tuesday, 29C, not 26C. Ridiculous, really, but as far as I know it has always been done this way so the whole record is skewed. A similar thing can happen with minima in winter, the minimum thermometer is reset at 9am, it gets colder during the day, then warms up through the night (cloud comes over, or whatever). It might get down to 9C during the day, but nothing colder than 12C overnight- but 9C is the minimum temperature for the next day despite the fact it might have occurred at 10am the previous day. Given the fact that observers around the world would rarely work until midnight (when the 24 hour maximum clock should stop) this may be an issue in other countries’ data sets as well.

lee
March 12, 2014 12:06 am

Records Australia – before the Stevenson screen, most used the Glaisher stand. Apparently BOM don't like it, as the records vary. However, from Warwick Hughes' page –
‘A pdf of the above can be downloaded
BoM swatting me down. But note on page 709 they conclude – Over the year, the mean temperatures were about 0.2 degrees C warmer in the Glaisher stand, relative to the Stevenson screen. – So it would seem that any questionable pre-1910 Australian data could be used after all in “global warming” studies, with this minor adjustment. I could be happy with that.’
extract: http://www.warwickhughes.com/papers/ozstev.htm
‘In 1991-92 when I asked Australian Bureau of Meteorology (BoM) people about the late 19C warmth, I was told that this was due to older style open thermometer stands being used before the BoM took over weather recording for the Commonwealth in 1907 (now they say the BoM was formed in 1908), when the Stevenson Screen was introduced across Australia.
This lead me in 1991 to researching in the BoM library to see what documentary evidence there was for this and I was unable to confirm the BoM version of history. I did find some proceedings of late 19C Intercolonial Conferences held in 1879, 1881 and 1888. It was clear that the scientists running the meteorological observatories for the various Colonies (that now make up Australia), were all aware of the Stevenson Screen and there was much evidence of its use, or the use of a similar local variant.’
Same source,
So if the BOM was formed in 07-08, why do the records only go back to 1910 – let alone the previous record keeping?

Smoking Frog
March 12, 2014 4:12 am

Paul Homewood says:
March 11, 2014 at 5:52 am
Thanks.

James Sefton
March 15, 2014 10:29 am

Using records proves nothing… as time goes on, the chance of setting a new record gets less likely, as we would need either a higher Tmax or a lower Tmin than the previous one to set a "new" record… sort of self-defeating, isn't it? In reality that means as the dataset's time span grows, new records get harder and harder to reach each time a new one is set. Unless we do what they do in sport these days and start recording temperatures to thousandths of a degree?