Guest Post by Willis Eschenbach
As Anthony discussed here, some Australian climate scientists think that there was an “angry summer” in 2012. Inspired by the necromantic incantations in support of the Aussie claims coming from the irrepressible Racehorse Nick Stokes, I went to take a look at the Australian temperature data. I found out that in response to a host of complaints about their prior work, in March of 2012 the Australian Bureau of Meteorology (BoM) released a new temperature database called ACORN-SAT. This clumsy acronym stands for the Australian Climate Observations Reference Network – Surface Air Temperature (overview here, data here).
It’s a daily dataset, which I like. And they seem to have learned something from Anthony Watts and the Surfacestations project: they have photos, descriptions, and metadata for each individual station. Plus the data is well error-checked and vetted. The site says:
Expert review
All scientific work at the Bureau is subject to expert peer review. Recognising public interest in ACORN-SAT as the basis for climate change analysis, the Bureau initiated an additional international peer review of its processes and methodologies.
A panel of world-leading experts convened in Melbourne in 2011 to review the methods used in developing ACORN-SAT. It ranked the Bureau’s procedures and data analysis as amongst the best in the world.
and
Methods and development
Creating a modern homogenised Australian temperature record requires extensive scientific knowledge – such as understanding how changes in technology and station moves affect data consistency over time.
The Bureau of Meteorology’s climate data experts have carefully analysed the digitised data to create a consistent – or homogeneous – record of daily temperatures over the last 100 years.
As a result, I was stoked to find the collection of temperature records. So I wrote an R program and downloaded the data so I could investigate it. But when I had just gotten all the data downloaded and started my investigation, in the finest climate science tradition, everything suddenly went pear-shaped.
What happened was that while researching the ACORN-SAT dataset, I chanced across a website with a post from July 2012, about four months after the ACORN-SAT dataset was released. The author made the surprising claim that on a number of days in various records in the ACORN-SAT dataset, the minimum temperature for the day was HIGHER than the maximum temperature for the day … oooogh. Not pretty, no.
Well, I figured that new datasets have teething problems, and since this post was from almost a year ago and was from just after the release of the dataset, I reckoned that the issue must’ve been fixed …
…
…
… but then I came to my senses, and I remembered that this was the Australian Bureau of Meteorology (BoM), and I knew I’d be a fool not to check. Their reputation is not sterling; in fact, it is pewter … so I wrote a program to search through all the stations to find all of the days with that particular error. Here’s what I found:
Out of the 112 ACORN-SAT stations, no less than 69 of them have at least one day in the record with a minimum temperature greater than the maximum temperature for the same day. In the entire dataset, there are 917 days where the min exceeds the max temperature …
I absolutely hate findings like this. By itself the finding likely makes almost no difference for most applications. These are daily datasets, with each station having around 100 years of data at 365 days per year, so the whole dataset has about 4 million records; the 917 errors are about 0.02% of the data … but it means that I simply can’t trust the results when I use the data. It means whoever put the dataset out there didn’t do their homework.
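For anyone who wants to reproduce the check, the scan itself is trivial. I did it in R; the Python sketch below is just illustrative — it assumes a simplified CSV layout with hypothetical column names and 99999.9 as the missing-value flag, so you would need to adapt the reader to the actual per-station file format.

```python
import csv
from collections import Counter

def find_bad_days(path):
    """Count days per station where the recorded minimum exceeds the maximum.

    Assumes a CSV with 'station', 'date', 'tmin', 'tmax' columns and
    99999.9 as the missing-value sentinel; the real ACORN-SAT data comes
    as one file per station, so adapt the reader accordingly.
    """
    MISSING = 99999.9
    bad = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            tmin, tmax = float(row["tmin"]), float(row["tmax"])
            if tmin == MISSING or tmax == MISSING:
                continue  # skip days with missing data
            if tmin > tmax:
                bad[row["station"]] += 1
    return bad
```

Run over all 112 station files, a scan like this takes seconds, which is what makes the errors sitting there for a year so remarkable.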
And sadly, that means that we don’t know what else they might not have done.
Once again, the issue is not that the ACORN-SAT dataset had these problems. All new datasets have things wrong with them.
The issue is that the authors and curators of the dataset have abdicated their responsibilities. They have had a year to fix this most simple of all the possible problems, and near as I can tell, they’ve done nothing about it. They’re not paying attention, so we don’t know whether their data is valid or not. Bad Australians, no Vegemite for them …
I must confess … this kind of shabby, “phone it in” climate science is getting kinda old …
w.
THE RESULTS
Station, Bad days in record (min temperature exceeding max temperature)
Adelaide, 1
Albany, 2
Alice Springs, 36
Birdsville, 1
Bourke, 12
Burketown, 6
Cabramurra, 212
Cairns, 2
Canberra, 4
Cape Borda, 4
Cape Leeuwin, 2
Cape Otway Lighthouse, 63
Charleville, 30
Charters Towers, 8
Dubbo, 8
Esperance, 1
Eucla, 5
Forrest, 1
Gabo Island, 1
Gayndah, 3
Georgetown, 15
Giles, 3
Grove, 1
Halls Creek, 21
Hobart, 7
Inverell, 11
Kalgoorlie-Boulder, 11
Kalumburu, 1
Katanning, 1
Kerang, 1
Kyancutta, 2
Larapuna (Eddystone Point), 4
Longreach, 24
Low Head, 39
Mackay, 61
Marble Bar, 11
Marree, 2
Meekatharra, 12
Melbourne Regional Office, 7
Merredin, 1
Mildura, 1
Miles, 5
Morawa, 7
Moree, 3
Mount Gambier, 12
Nhill, 4
Normanton, 3
Nowra, 2
Orbost, 48
Palmerville, 1
Port Hedland, 2
Port Lincoln, 8
Rabbit Flat, 3
Richmond (NSW), 1
Richmond (Qld), 9
Robe, 2
St George, 2
Sydney, 12
Tarcoola, 4
Tennant Creek, 40
Thargomindah, 5
Tibooburra, 15
Wagga Wagga, 1
Walgett, 3
Wilcannia, 1
Wilsons Promontory, 79
Wittenoom, 4
Wyalong, 2
Yamba, 1

Janama
Yes, I have mentioned this before in other posts. The reasoning behind it is that the Stevenson screens weren’t rolled out to all stations until 1910/11. However, I am sure that Bourke would have had one then, as it was an important weather station. It has daily temps back to 1871.
By discarding all temps prior to 1910 and adjusting raw data, it makes it easy for the BOM to make that claim about the ‘hottest summer’.
Check Bourke’s temps for 1896 – 22 days straight over 40C (highest 48.6C twice).
http://www.bom.gov.au/jsp/ncc/cdio/weatherData/av?p_nccObsCode=122&p_display_type=dailyDataFile&p_startYear=1939&p_c=-461101351&p_stn_num=048013
Now that’s a heatwave.
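(Verifying a run like “22 days straight over 40C” against a daily file takes only a few lines; this Python sketch counts the longest streak of maxima over a threshold, using made-up numbers.)

```python
def longest_run_over(temps, threshold=40.0):
    """Length of the longest consecutive run of daily maxima above threshold."""
    best = run = 0
    for t in temps:
        run = run + 1 if t > threshold else 0
        best = max(best, run)
    return best

# Toy check: three hot days, a cool day, then two hot days -> longest run is 3.
print(longest_run_over([41.2, 44.0, 43.5, 38.9, 42.1, 45.0]))  # 3
```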
Willis, The reason the BOM went to the trouble of doing a whole new data set was because a group including Sen. Cory Bernardi, myself, Ken Stewart, and the BOM independent audit team (which I put together in 2010) asked the Australian National Audit Office to do an independent audit of the BOM data. Bernardi raised this in parliament. They had to respond.
The original audit request: http://joannenova.com.au/2011/02/announcing-a-formal-request-for-the-auditor-general-to-audit-the-australian-bom/
My write up of the BOM response:
http://joannenova.com.au/2012/06/threat-of-anao-audit-means-australias-bom-throws-out-temperature-set-starts-again-gets-same-results/
There are 20 + articles analyzing the BOM data: http://joannenova.com.au/tag/australian-temperatures/
The independent audit team are the ones who spotted the problems with F–>C conversions, with different datasets, with weighting, grids, maxes greater than mins, inexplicable adjustments. Several of them engage with the BOM constantly asking for the original data and methods.
Credit to Chris Gillham, Ken Stewart, Geoff Sherrington, Ed, Andrew, Ian, Lance, Janama, David Stockwell, Warwick Hughes and several others behind the scenes. They keep the BOM under pressure.
Nick Stokes says (my emphasis):
June 29, 2013 at 3:08 pm
I’m sure listening to your explanations is never the slightest pain for you … I was talking about pain for us.
Well, let’s see. You haven’t said what “recent” means, so that’s useless. And surprise of surprises, you haven’t provided anything but your big mouth to back up your claims.
The Aussies say that “ACORN-SAT is a complete re-analysis of the Australian homogenised temperature database.” So definitely, it is something more than your claim that it’s just the regular results.
So your claim that “recent data, automatically recorded every half hour, doesn’t change” is obviously nonsense. First, the automatically recorded data goes through automated and semi-automated quality control. Of course, if they find errors, the data changes.
Then, after the automated quality control, there are a number of other “quality control checks”, and these will also result in changed data.
As a check, take a look at the standard BoM Bourke data here, and the ACORN-SAT Bourke data for the same period. Take a look at December 12, 2009; that’s recent, only four Decembers ago … the standard Bourke data has a max temp for that day of 32.7°C … but the ACORN-SAT data has thrown that data point out entirely; they have 99999.9 for that day, missing data. So your claim is, as usual, just another one of your fantasies. December 2009 is in the data, and there may be others; I just took a quick look and found several.
So as usual, Nick, you’re just spouting the first BS that comes to mind … the Australians say that the data goes through a number of checks after it’s first collected, and obviously, and contrary to your specious claim, the data changes as a result of that quality control.
But if you do have the ACORN-SAT data for this year, I’m glad to take a look at it. And eventually, I’m sure the Aussies will get off of their duffs and get around to posting it. By then, of course, their bogus claims of an “angry summer” will be long forgotten … which may be only coincidental.
w.
Max/Min transposition may be explained in certain circumstances, but not all, but there are many, many other errors in Acorn. There are many data entry errors e.g. 26.8 instead of 36.8 (Alice Springs 28/01/1944) as well as obviously wrong adjustments e.g. Rutherglen maxima adjusted by -8.1C (13/10/1926) to produce a glaringly obvious anomaly compared to previous and following days; and the metadata in the Station Catalogue is still poor with much missing information.
Further, the international review panel wrote : “(T)he surface temperature observation network fails to meet the internationally recommended minimum spatial density through much of inland Australia.” Acorn’s lead author Blair Trewin admits this, saying “Even today, 23 of the 112 ACORN-SAT locations are 100 kilometres or more from their nearest neighbour, and this number has been greater at times in the past, especially prior to 1950.”
They also note: “The WMO Guide states that an acceptable range of error for thermometers (including those used for measuring maximum and minimum temperature) is ±0.2 °C. However, throughout the last 100 years, Bureau of Meteorology guidance has allowed for a tolerance of ±0.5 °C for field checks of either in-glass or resistance thermometers. This is the primary reason the Panel did not rate the observing practices amongst international best practices.”
The introduction of Acorn was rushed and resulted in many errors, but it is odd that they have not reviewed and corrected them.
Willis, I urge you to read my preliminary analysis at http://kenskingdom.wordpress.com/2012/05/14/acorn-sat-a-preliminary-assessment/
which might give you some further background. There are many other faults to be highlighted.
Ken Stewart
Philip Bradley said June 29, 2013 at 2:49 pm
Given the rather infrequent application of the WMO Standard at temperature recording stations, it does seem rather pointless. Then some of us find the idea of using average temperature as a proxy for entropy rather pointless.
Gonzo
You left off the dates for Marble Bar’s record hot spell: Oct. 30, 1923 to Apr. 7, 1924.
D’you think Ms Gaia was angry that summer, too? 😉
Willis,
“the standard Bourke data has a max temp for that day of 32.7°C … but the ACORN-SAT data has thrown that data point out entirely”
Yes, the 32.7 is there – but you didn’t mention that it is flagged “Not quality controlled or uncertain, or precise date unknown”.
By “recent” I was referring to the data set I linked to “Recent Months at Bourke”. Basically the last year. That’s enough to cover any post-ACORN period. Just don’t use flagged data, if you see any.
Streetcred said June 29, 2013 at 5:00 pm
How odd! Back in the early 1970s I worked in the CES and PMG department. I was told that since I was not Roman Catholic and I was a member of the Labor Party, that I would never be promoted. After teaching two people their job when they were promoted above me, I realised the truth of this and quit.
Nick Stokes said June 29, 2013 at 6:09 pm
Contradiction. The number in the record is either 99999.9, or 32.7. Someone’s telling porkies…
Ian George says: June 29, 2013 at 5:39 pm
“However, I am sure that Bourke would have had an SS then as it was an important w/s.”
According to Trewin, the screen at Bourke was installed August 1908.
In outback Australia things are different and you should be surprised by such extreme temperatures because it can get pretty hot out there. It is far from the sea and the surrounding desert acts as a heat trap.
Of course if you come from the city you have no idea about what the outback is like. The scale of everything is bigger and more extreme.
In the city you are spoiled and pampered in every way. Unless you have lived out there you really have no idea of the scale of things.
For example you take it for granted that you will have instant information on tap. In the Australian outback even the radio news is three days old. Just going from your front door to your letter box you take at least a week’s rations.
As for extreme weather, you can be sure of it out there. The dust storms are so thick that the rabbits dug warrens in them. The wild life has adapted to the environment and become more fierce. The mosquitoes don’t suck blood, they suck bone marrow.
Of course people also adapt. One of the well known characters from the old days was Crooked Mick who worked on stations (ranches) out there. One of his jobs was putting up fences and he was the best and fastest. He would lay so much fence in one day that it took him three days to walk back to the start.
So the idea that minimum temperatures can exceed maximum temperatures is no big deal.
Lew Skannen
Wasn’t Crooked Mick the bloke with a face like a buffalo turd? And if you think that’s bad, you should have seen his bird! 🙂
The Git has fond memories of an outback cop called Blue who introduced him to the philosophy of Spinoza over a few beers…
Lew Skannen – thanks for introducing some truth into the discussion of the BOM’s data. 🙂
Like the UK Met office, the BOM was taken over in the late 1990s by spivs with hair gel and communications degrees whose job was to hype CAGW. They easily found naive and compliant people with science degrees (jobs in science always being hard to find) and away they went.
In conjunction with CSIRO, the nation’s two most respected science bodies proceeded to swerve completely off the path of impartial research and measurement and into propaganda. Worrying about “global warming” (later rebadged “climate change”) found its way into their mission statements and corporate plans. This peaked in 2007, when our briefly resurrected Prime Minister, Kevin Rudd, announced that it was the greatest moral challenge facing our generation.
Like the Royal Society in the UK, and many other prestigious institutions, they sold their hard-won reputations for a mess of pottage. That the BOM had to invent a new metric called the national temperature average (or whatever they call it) and go along with unscientific crap like the “angry summer” must have the traditional scientists who once worked there, and were quickly moved along, crying in their beer.
Jo Nova says:
June 29, 2013 at 5:41 pm
Willis, The reason the BOM went to the trouble of doing a whole new data set was because a group including Sen. Cory Bernardi, myself, Ken Stewart, and the BOM independent audit team (which I put together in 2010) asked the Australian National Audit Office to do an independent audit of the BOM data. Bernardi raised this in parliament. They had to respond.
… more good stuff snipped …
First, Jo, thanks for your comments giving credit to those to whom it is assuredly due.
Next, thanks for your great blog, which is always worth reading.
I fear my knowledge of the Australian situation is somewhat out of date. I knew that complaints about shabby records had forced a re-build of the whole deal, but I didn’t realize you were on the front lines.
In any case, my very best to you, and congratulations, keep them on the hop.
w.
Yes, the Stevenson screen was installed at Bourke in 1908.
Here’s the adjustments made to the Bourke temperature record by Simon Torok in 1996.
This is the code they used.
Station
Element (1021=min, 1001=max)
Year
Type (1=single years, 0=all previous years)
Adjustment
Cumulative adjustment
Reason : o= objective test
f= median
r= range
d= detect
documented changes : m= move
s= stevenson screen supplied
b= building
v= vegetation (trees, grass growing, etc)
c= change in site/temporary site
n= new screen
p= poor site/site cleared
u= old/poor screen or screen fixed
a= composite move
e= entry/observer/instrument problems
i= inspection
t= time change
*= documentation unclear
48013 1021 1965 0 -0.2 -0.2 odn
48013 1021 1909 0 +1.0 +0.8 ords*
48013 1021 1897 0 -1.7 -0.9 ord
48013 1021 1885 0 -1.5 -2.4 ord
48013 1021 1880 1 +2.0 -0.4 rd
48013 1001 1965 0 +0.3 +0.3 fn
48013 1001 1915 0 +0.6 +0.9 frd
48013 1001 1909 0 -1.5 -0.6 ords*
48013 1001 1898 0 +0.5 -0.1 od
48013 1001 1893 0 -1.0 -1.1 od
48013 1001 1882 1 +0.9 -0.2 od
48013 1001 1881 1 +0.9 -0.2 od
48013 1001 1880 1 +0.9 -0.2 od
48013 1001 1879 1 +0.9 -0.2 od
48013 1001 1872 1 +5.0 +3.9 d
As you can see, adjustments for the Stevenson screen were made to both max and min.
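For what it’s worth, rows in that format are simple to parse mechanically. This is only a sketch — the field meanings are taken from the key above, and the function name is mine:

```python
def parse_adjustment(line):
    """Split one adjustment row into its fields, per the key above:
    station, element (1021=min, 1001=max), year, type (1=single year,
    0=all previous years), adjustment, cumulative adjustment, reason codes."""
    station, element, year, typ, adj, cum, reason = line.split()
    return {
        "station": station,
        "element": "min" if element == "1021" else "max",
        "year": int(year),
        "single_year": typ == "1",
        "adjustment": float(adj),
        "cumulative": float(cum),
        "reason_codes": reason,
    }

row = parse_adjustment("48013 1021 1909 0 +1.0 +0.8 ords*")
print(row["element"], row["year"], row["adjustment"])  # min 1909 1.0
```

With the rows parsed, it is easy to total the adjustments per element and see how the min and max series were shifted independently.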
Nick Stokes says:
June 29, 2013 at 6:09 pm
No need to mention it, it’s relevant. You claimed that the two records were “identical”. I assume you understand what “identical” means. It doesn’t mean “almost the same, but one is flagged 32.7°C and one is 99999.9”. Here’s your claim:
I showed very clearly that they are NOT IDENTICAL. But you, being the nit-picking ridiculous jailhouse lawyer that you are, you are just being true to your sworn duty to never, ever admit that you are wrong.
Sad to say, Nick, they are not identical. So your claim of no further processing after the data is published, your idea that historic data “doesn’t change” is just more patented Stokes BS, thrown out to deceive the unwary.
But heck, keep it up. The entertainment value of watching you wriggle and squirm trying to prove that “identical” really means “sorta similar” is priceless.
w.
That is exactly what they do. They flag what seem to be anomalies and investigate them. Try reading their material on methods.
http://cawcr.gov.au/publications/technicalreports/CTR_049.pdf
Patrick,
“They always come crawling back to the “The Zapper” for some more…. sweet, sweet candy”
Great character!! LOL
If only they would invest as much time into doing the science as they have done in coming up with an acronym. Usually you know you are face-to-face with a BS machine when they have an acronym longer than four letters. In most cases they have thought up the acronym first, then designed the subject to fit the acronym.
Willis, you really should store the data in an Access database. No programming required to find those errors; a simple SQL query would have done it. Access will also allow you to do other data mining, such as the number of days per year the temp is above a certain line, the number of record-breaking days, etc. All simple SQL, the results of which you copy and paste into Excel for plotting.
Just a suggestion.
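For illustration, the same one-query approach works with any SQL engine, not just Access. Here is a sketch using Python’s built-in sqlite3; the table layout and sample values are made up for the example:

```python
import sqlite3

# In-memory database standing in for the Access database suggested above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE temps (station TEXT, date TEXT, tmin REAL, tmax REAL)")
conn.executemany(
    "INSERT INTO temps VALUES (?, ?, ?, ?)",
    [
        ("Alice Springs", "2010-03-01", 21.7, 20.8),  # min > max: the error
        ("Alice Springs", "2010-03-02", 17.4, 20.5),  # a normal day
    ],
)
# One query finds every bad record -- no per-row programming needed.
rows = conn.execute(
    "SELECT station, COUNT(*) FROM temps WHERE tmin > tmax GROUP BY station"
).fetchall()
print(rows)  # [('Alice Springs', 1)]
```

The other checks mentioned (days over a threshold, record-breaking days) are equally short queries once the data is loaded.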
I downloaded some of the data; I wanted to see what the temp range was 5 days on either side of the bad records, such as:
ID Location TMin TMax Date DateString
15590 Alice Springs 21.7 20.8 01-Mar-10 19100301
Once plotted you can’t tell if the TMax is wrong or the TMin is wrong. It looks like a cold front moved through, as the days before are in the high 30s but after, the mid 20s.
Temp_Data.Date Temp_Data.MinTemp Temp_Data.MaxTemp
24-Feb-10 22.4 34.1
25-Feb-10 22.4 34.8
26-Feb-10 23.1 36
27-Feb-10 23.1 34.8
28-Feb-10 23.1 30.2
01-Mar-10 21.7 20.8
02-Mar-10 17.4 20.5
03-Mar-10 17.4 27.9
04-Mar-10 20.5 26.6
05-Mar-10 15.1 20.8
06-Mar-10 17.4 23.8
So how you “fix” such mistakes will be a head scratcher.
Nick
Thanks, I stand corrected re Bourke’s SS date of siting.
Also thanks to Janama for the adjustment formula. Still, I can’t see any logic in the adjustments, with high temps reduced by differing amounts and low temps increased.
Willis, I figured out why some days have a higher TMin than TMax. It’s an issue of timing of when TMax and TMin are taken. The one time I checked the temps 5 days on either side, it was clear a cool frontal system had moved through. Follow this: the TMin is taken at night, say just after midnight, but by the time the TMax is taken, some time the following afternoon, the nighttime temp could be higher than the following day’s high, as the cold front moved through. Hence the data is actually correct. Nothing to fix.
What is also interesting is that the anomalous TMin>TMax was in 2010, which was actually an abnormally cool year: not once did it get above 29C, and 232 days were below average. That one anomalous day, Mar 1, and the day before had the coolest TMax of the year, a large plunge down for TMax in that period.
If you read ACORN’s procedures you will see that when they measure temperatures they record the minimum and maximum temperatures from separate thermometers. If they read within 0.5C of each other the temperatures are accepted. A temperature recorded as 14.5 for the maximum and 14.8 for minimum would be acceptable because of the uncertainty of the thermometers. If the temperature then went down for the rest of the day (due to a passing cold front) the minimum would be higher than the maximum. Anyone who has measured data is aware that issues like this come up, especially when you have millions of data points.
Some of the differences are too large for this explanation. At Open Mind the scientist in charge of the ACORN data set posted this response:
“The situation actually arises because, where adjustments are carried out to the data (e.g. because of site moves), the maxima and minima are adjusted independently. What this means that if the maxima at a site in a given year are adjusted downwards because the former site is warmer than the current one (or if the minima are adjusted upwards because the former site is cooler), and you have a day when the diurnal range in the raw data is zero or near zero, you could end up with the adjusted max being lower than the adjusted min (e.g. if the raw data have a max of 14.8 and a min of 14.6, but the mins are adjusted up by 0.4, you would end up with a max of 14.8 and a min of 15.0).
What this reflects, in essence, is uncertainty in the adjustment process (the objective of which is to provide the best possible estimate of what temperature would have been measured at a location if the site on that day was as it was in 2013). Clearly in these cases either the estimate of the max is too low or the min is too high; however, providing the adjustment process is unbiased, these cases will be offset by cases where the max is too high/min is too low, and there is no overall bias.
We’ve decided, though, that the internal inconsistency (which, as Tamino notes, affects only a tiny percentage of the data) looks strange to the uninitiated, so in the next version of the data set (later this year), in cases where the adjusted max < adjusted min, we'll set both the max and min equal to the mean of the two."
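The fix described in that last quoted paragraph amounts to a one-line rule; a Python sketch of it (function name mine):

```python
def reconcile(tmax, tmin):
    """The fix described above: where the adjusted max is below the
    adjusted min, set both to the mean of the two."""
    if tmax < tmin:
        mean = round((tmax + tmin) / 2, 2)
        return mean, mean
    return tmax, tmin

print(reconcile(14.8, 15.0))  # (14.9, 14.9) -- Trewin's own example
print(reconcile(34.8, 23.1))  # (34.8, 23.1) -- consistent days pass through
```

Note that this makes the published record internally consistent without changing station means, since the average of the pair is preserved.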
Any real data sets with millions of points have issues where adjustments are required. If you ask you can get explanations for many things.