More hot days — or “purpose-designed” temperature sensors at play?
I don’t believe in conspiracies of silence except when it comes to Harvey Weinstein and the Australian Bureau of Meteorology.
For some time, weather enthusiasts have been noticing rapid temperature fluctuations at the ‘latest observations’ page at the Bureau’s website. For example, Peter Cornish, a retired hydrologist, wrote to the Bureau on 17 December 2012 asking whether the 1.5 degrees Celsius drop in temperature in the space of one minute at Sydney’s Observatory Hill, just the day before, could be a quirk of the new electronic temperature sensors. Ken Stewart, a retired school principal, requested temperature data for Hervey Bay after noticing a 2.1 degrees Celsius temperature change in the space of one minute on 22 February 2017.
In both cases, the Bureau assured these ‘amateurs’ that they didn’t understand what they were noticing. Dr Cornish was referred to a Bureau report, which in turn references international studies explaining how measurements from fast-responding electronic sensors are made comparable with measurements from the more inert mercury thermometers by averaging over at least one minute – except the Bureau does not actually average any of the measurements recorded from its custom-built sensors.
Electronic sensors, which began replacing mercury thermometers in Bureau weather stations some twenty years ago, can capture rapid changes in temperature.

On a hot day, the air is warmed by turbulent streams of ground-heated air that can fluctuate by more than two degrees on a scale of seconds. So, if the Bureau simply changed from mercury thermometers to electronic sensors, it could increase the daily range of temperatures, and potentially even generate record hot days approaching 50 degrees Celsius, because of the faster response time of the sensors.
Except that, to ensure consistency with measurements from mercury thermometers, there is an international literature – and there are international standards – specifying how spot readings need to be averaged, a literature and methodology being ignored by the Bureau. For example, the UK Met Office takes 60 one-second samples each minute from its sensors and then averages these. In the US, this has been judged too short a period, and the standard there is to average over a fixed five-minute period.
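To make the contrast concrete, here is a minimal Python sketch of the two standards applied to the same stream of one-second samples. The numbers are my own illustrative values, not any agency’s actual code or data:

```python
# Sketch (illustrative only): contrast per-minute averaging (UK style)
# with fixed five-minute averaging (US style) on one-second samples.
import random

random.seed(1)
# Ten minutes of synthetic one-second readings around 30 degrees C,
# with turbulent noise of up to a degree or two.
samples = [30.0 + random.gauss(0, 0.7) for _ in range(600)]

# UK Met Office style: average each block of 60 one-second samples.
minute_means = [sum(samples[i:i + 60]) / 60 for i in range(0, 600, 60)]

# US style: average over a fixed five-minute (300-sample) block.
five_minute_means = [sum(samples[i:i + 300]) / 300 for i in range(0, 600, 300)]

print("highest 1-s spot reading :", round(max(samples), 2))
print("highest 1-min average    :", round(max(minute_means), 2))
print("highest 5-min average    :", round(max(five_minute_means), 2))
```

The highest one-second spot reading is always at least as high as the highest one-minute average, which in turn is at least as high as the highest five-minute average – which is precisely why the choice of averaging window matters for temperature records.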
The Weather Observer’s Handbook (2012), for example, states that averaging over anything shorter than a five-minute period affects temperature extremes. An example is even provided. Dodge City, Kansas, has a long temperature record dating back to 1875. The hottest day on record stood at 43.3 degrees Celsius. Then, during a heatwave in 2011, the highest reading from an electronic sensor was 43.9 degrees Celsius. But when it was found that this record came from readings averaged over only one minute, the new record was scratched – because when the same readings were averaged over the ASOS standard of five minutes, the maximum temperature was 43.3 degrees Celsius: a tie.
In Australia, our Bureau takes not five-minute averages, nor even one-minute averages. As Ken Stewart discovered when he persisted in understanding the nature of the data he had been provided by the Bureau for Hervey Bay: the Bureau just takes one-second spot readings.

More examples from Ken Stewart here and here
Check the temperatures at the ‘latest observations’ page at the Bureau’s website and you would assume the values had been averaged over perhaps 10 minutes. But it is dangerous to assume anything when it comes to our Bureau. The values listed at the ‘observations’ page actually represent the last second of the last minute. The daily maximum (which you can find at a different page) is the highest one-second reading for the previous 24-hour period: a spot one-second reading, in contravention of every international standard. There is absolutely no averaging.
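The effect of taking the daily maximum from one-second spot readings, rather than from averaged values, can be shown with a toy example. The numbers below are invented purely for illustration:

```python
# Toy illustration (not Bureau code): a three-second turbulent spike of
# +2 degrees C riding on a steady 35 degree C afternoon minute.
spike = [35.0] * 60
spike[20:23] = [37.0, 37.0, 37.0]  # three hot seconds within the minute

spot_max = max(spike)                  # what a one-second spot reading records
minute_mean = sum(spike) / len(spike)  # what a one-minute average would report

print(spot_max)               # 37.0
print(round(minute_mean, 2))  # 35.1
```

A one-second spot maximum banks the full spike as the day’s high; a one-minute average all but erases it.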
For about five weeks now the Bureau has been obfuscating on this point. There is ‘more than one way’ of achieving compliance with WMO guidelines, they write in a ‘Fast Facts’ published online on September 11 – after I wrote a blog post detailing how their latest ‘internal review’ confirmed they were in contravention of international standards. I even suggested that the last 20 years of temperature recordings by the Bureau will be found not fit for purpose, and will eventually need to be discarded. (I would have written ‘like Harvey Weinstein’, except that was a few weeks before this other scandal broke.)
The Bureau has been insisting for some time that they don’t need to average because their sensors have a long response time, so each reading already represents an average value, with a time constant the same as that of a mercury thermometer. How this is achieved in practice was detailed for the first time in a letter from the new head of the Bureau, Andrew Johnson, last Friday.
The letter explains that all the sensors the Bureau uses have been ‘purpose-designed’. I had been requesting the manufacturer’s specifications, but instead I received advice that the sensors are built to Bureau specifications and, by inference, that there is no documentation. To be clear, there are also no reports detailing the laboratory and field tests that explain how the behaviour of the custom-built devices ‘closely mirrors’ that of mercury thermometers, including the time constants – to quote from Dr Johnson’s letter of last Friday.
The few values quoted in this letter from Dr Johnson indicate that the Bureau has rolled out a network of electronic sensors that, under hot and windy conditions, will potentially capture temperature spikes – as noted by Mr Stewart and Dr Cornish – which would be impossible from a mercury thermometer.
I am not blaming the sensors for being so responsive, just the Bureau for pretending one-second spot readings from their purpose-designed sensors are comparable with instantaneous readings from mercury thermometers – while providing no proper documentation. I’ve suggested recording in a way that will facilitate averaging, but Dr Johnson has indicated that the Bureau would be ‘unable to meet this request’. I am not blaming Dr Johnson exactly – he didn’t put the system in place – but he is going to defend it.
If you believed in conspiracies, you might believe the increase in the incidence of hot days across Australia was because of the purpose-designed sensors, but really it has more to do with the system of one-second spot readings. Either way, the Bureau can give us a hottest winter on record, even when there are record snow dumps in the Alps, and record numbers of frosts on the flats.
While it may be the expectation of the Australian community that temperatures be measured consistent with some standard, clearly this is not the case. The only real question now is whether the Bureau is such a big and important Australian institution that, like Harvey Weinstein, its transgressions are best ignored – at least for the moment, and not under the current Minister’s watch. What? I had my eyes closed.
Jennifer Marohasy is a Senior Fellow at the Institute of Public Affairs and blogs at jennifermarohasy.com.
They’re not “faulty” if they’re designed to work that way.
Richard. Design faults are real. Which could still make you correct if the designer is faulty.
US temps based on five minute averages can be compared with Australian averages based on one second, two second, three second, one minute or whatever it is the BoM uses.
Below are NOAA temps based on the US Climate Divisional Database and Australian BoM figures, both starting in 2000, when pretty well all temps in both countries were recorded via AWS electronic thermometers.
The period can be broken into two comparisons which have the same number of years.
Averages 2000-2013
Min
US 2000-2006 : 41.6F
US 2007-2013 : 41.2F
AUS 2000-2006 : 15.4C
AUS 2007-2013 : 15.5C
Max
US 2000-2006 : 65.4F
US 2007-2013 : 65.1F
AUS 2000-2006 : 29.1C
AUS 2007-2013 : 29.1C
__
Averages 2000-2015
Min
US 2000-2007 : 41.6F
US 2008-2015 : 41.3F
AUS 2000-2007 : 15.4C
AUS 2008-2015 : 15.5C
Max
US 2000-2007 : 65.4F
US 2008-2015 : 65.1F
AUS 2000-2007 : 29.1C
AUS 2008-2015 : 29.2C
The northern hemisphere has supposedly had more warming than the southern hemisphere.
Did someone open the Stevenson Screen door at 1.00 pm, the time of the peak?
I think some may have missed the significance of that question, neilc.
Many adjustments to the historical temps are valid. But the trick is… heads we adjust, tails we ignore.
Because of the complexity, it is easy to obfuscate.
But the good news is that there is a limit to this deception, and the manufactured gain in temperature trend is offset by the loss of credibility.
Special thanks to those who are watching the rabbits and their cabbage patch
I’m an instrumentation electronics engineer, and I can absolutely second Leo Smith, that there are many ways of averaging a signal to any degree desired, even in hardware before a “reading” is produced.
It seems to me that the real issue here is that the BOM pretends that because the equipment is purpose-built, somehow there is no way of specifying what this averaging is. This is nonsense. As Matt puts it, generally we just specify a response time constant, and that’s an end of it. There are other ways; but they can all be described, as they must be for the engineering to be done.
Hi Jeffrey. The time constant changes with wind speed – the faster the wind, the faster the heat transfer. However, the in-glass thermometer has two time constants and moving parts (the liquid and the recording pointer). The bore expands first, then the liquid. So matching a PRTD is not possible, because the in-glass does not follow the classic one-pole RC time-constant curve. If you get the 63.2% time the same, then the 5% time will be different. Your thoughts please.
In regards to a more accurate method: surely the precision of the existing BOM sensors, just recording temperature every second and not throwing out 59 seconds of data, is enough to feed a simple algorithm (I hate the “m” word) that could closely approximate the in-glass thermometer, given its inertia.
Just in case I’m not being entirely clear, mercury thermometers didn’t record wind speed either*.
*Perhaps they did indirectly but wouldn’t a precise enough sensor capture enough data to model these time constants?
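For what it’s worth, the kind of algorithm being suggested here can be sketched as a first-order low-pass filter over the one-second probe readings. The 40-second time constant below is an assumed, illustrative figure – as noted above, the real time constant of a liquid-in-glass thermometer varies with wind speed and is not strictly single-pole:

```python
# Sketch of the commenter's suggestion (not any official method): emulate
# the inertia of a liquid-in-glass thermometer by running one-second
# readings through a first-order (single-pole RC) low-pass filter.
import math

def emulate_in_glass(samples, tau=40.0, dt=1.0):
    """Exponential response with an assumed time constant tau (seconds)."""
    alpha = 1.0 - math.exp(-dt / tau)
    out = [samples[0]]
    for s in samples[1:]:
        out.append(out[-1] + alpha * (s - out[-1]))
    return out

# A step from 30 to 32 degrees C: after one time constant the filtered
# reading should have covered about 63.2% of the step.
step = [30.0] + [32.0] * 120
filtered = emulate_in_glass(step)
print(round(filtered[40], 2))  # ~31.26, i.e. 63.2% of the way to 32
```

After one time constant the filtered value has covered about 63.2% of the step, matching the definition discussed in the comments above; a two-pole model would be needed to also match the 5% time.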
Thank you Jennifer.
It seems likely to be swept under the carpet again.
90% of the top people seem to be warmist apologists.
With people like Lewandowsky, Karoly, Gergis and Nick Stokes in the background, and hotbeds of misinformation at Melbourne Uni, the Uni of Western Australia and Queensland Uni – not to mention the Ship of Fools fellow (Canberra?) – it is a forlorn hope getting action.
Still, worldwide is better than Australia-wide and, who knows, Malcolm might come riding to the rescue.
Further reflection on the actual digital sensor itself. This device is not up to the task it is being asked to perform. If the sensitivity to transients is of the order of degrees within seconds, then it is no more measuring temperature than a slow-response device that requires hours to detect degrees. There is no way to average the data to extract the temperature signal. Someone opening the screen to take a mercury reading, by merely breathing, would introduce a bias. The best one could hope for is to aggressively filter outliers and then average over time, provided the raw second-by-second data was saved.
Get a jet engine blast on one of those weather stations at an airport and you can get all sorts of record temps for a minute.
Funnily enough, the highest UK temperature on record came from … ‘Heathrow Airport’.
The idea of using airport-based weather stations for wider-scale weather is frankly hilarious. Anyone who has even worked at one can tell you how hot it can get compared to the surrounding areas, even without any wash from the jets.
Basically it’s the classic question of ‘better than nothing’, not that the measurements have real scientific validity.
Same here in Sydney. Either Sydney Airport or Bankstown Airport is regularly quoted as “highest on record” etc. or “x higher than average”. Sydney Airport is surrounded by lots of water. Bankstown is 30–40 km further west, surrounded by land. Then there is Observatory Hill, moved in the 1930s IIRC, surrounded by buildings.
Exactly.
The same thing has been reported from Germany,
http://notrickszone.com/2015/01/13/weather-instrumentation-debacle-analysis-shows-0-9c-of-germanys-warming-may-be-due-to-transition-to-electronic-measurement/
Every time I read an article like this I am struck by the sheer silliness of trying to measure global temperatures with thermometers, whether analog or digital. A century from now, scientists will look back with amazement at the amount of time and money wasted on possibly the dumbest exercise in scientific inquiry in history. Even 17th-century astronomers who tried to prove the Sun was orbiting Earth did work that was less silly than this.
A proper instrument to measure temperature is a caliper.
There is an adjustment for a change of location, for a time of observation, and probably also one for a change of instrumentation. I wonder if it has been applied during a transition to electronic thermometers. Perhaps Steven Mosher could share some of his wisdom.
The only way to make an adjustment for change of instruments is to have both instruments running side by side for a period of time so that a delta between them can be determined.
From the article, no such activity took place. They just removed one and put the new one in its place.
It’s possible that a generic delta can be calculated and applied to all of the new instruments, but such an adjustment is only an average at best and should (if those involved were being honest) result in an increase in the uncertainty of the measurements.
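In the simplest case, the side-by-side comparison described above reduces to a mean difference between co-located instruments, with the spread of the differences feeding the uncertainty estimate. The numbers below are made up purely for illustration:

```python
# Toy sketch of a side-by-side instrument comparison (made-up numbers):
# the candidate adjustment is the mean difference between co-located
# instruments, and its spread adds to the measurement uncertainty.
import statistics

mercury = [30.1, 28.4, 31.0, 29.7, 30.5]     # daily maxima, old instrument
electronic = [30.4, 28.6, 31.5, 29.9, 30.9]  # same days, new instrument

diffs = [e - m for e, m in zip(electronic, mercury)]
delta = statistics.mean(diffs)    # candidate adjustment (delta)
spread = statistics.stdev(diffs)  # extra uncertainty the adjustment introduces

print(round(delta, 2), round(spread, 2))  # 0.32 0.13
```

Even in this toy case the adjustment is only an average, and the non-zero spread is exactly the honest increase in uncertainty the commenter describes.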
From Jennifer’s posts I gather they have done this with some stations. The difficulty she faces is getting access to the data. I think we all know why that will be, and what the data will show.
The main Broome observations in Western Australia are taken at the airport, where common spikes of 1-1.5C happen every day, especially at times of scheduled arrivals and departures. Then 8km down the road at Broome Port, the temperature often stays within 1.5C over a 24 hour period, but never any obvious spike. The Port is surrounded by water.
From the article: “if the Bureau simply changed from mercury thermometers to electronic sensors, it could increase the daily range of temperatures, and potentially even generate record hot days approaching 50 degrees Celsius…”
If the range is increased, then the standard deviation is also increased. In other words, what BOM has done is increase the uncertainty of their data. The spikes need to be qualified as having less certainty than the data from the mercury thermometers. Once again, what is missing in the picture is a rigorous statement about the estimated uncertainty and how it was determined.
Hi Clyde,
So far, we don’t have access to parallel data (measurements from the two types of equipment co-located within the one shelter) for any locations – despite this existing for more than 20 years. Where are the white-hat hackers within our community?
We do have access to a limited amount of data which has been generated by the sensors and recorded as the highest, lowest and last one-second spot readings for each one-minute interval. Given the somewhat peculiar nature of this sampling, how would you analyse it to quantify uncertainty? Would it be fair, for example, to simply treat each of the one-second measurements (i.e. the highest, lowest and last) within each five-minute interval as equivalents within the available sample, from which some statistics (including uncertainty values) could be derived?
jennifermarohasy,
While I think it is unfortunate that the BOM has decided to use the current sampling protocol, I think that in order to be sure that the data are being compared to similar quantities from the rest of the world, it should be reduced to a best estimate of the diurnal maximum and minimum.
It seems to me that the most likely bias is ground-level convective thermal bursts during the Summer. Although, a gust of wind in the Winter could also represent an anomalous temperature. I think that what should be done is to use a 5-minute moving average on the Tmax and separately on the Tmin temperatures. It would be instructive to compare those two graphs with a 5-minute moving average on the last temperature for each minute. I think that being dogmatic about what should be done subsequently, without seeing the results of my suggestion, would be premature. However, basically, I think that the largest Tmax and smallest Tmin should be obtained from the two filtered results.
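Clyde’s moving-average suggestion might look something like this in outline. The values are illustrative only, and the handling of the window at the series edges is just one of several possible choices:

```python
# Sketch of a 5-minute centred moving average applied to per-minute Tmax
# values before taking the daily maximum (illustrative values only).
def moving_average(series, window=5):
    """Centred moving average; windows are truncated at the edges."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

# Per-minute highest readings with one suspect single-minute spike.
tmax_per_minute = [33.0, 33.2, 33.1, 35.6, 33.3, 33.2, 33.1]
smoothed = moving_average(tmax_per_minute)

print(round(max(tmax_per_minute), 1))  # 35.6 raw spot maximum
print(round(max(smoothed), 1))         # spike damped by the filter
```

The single-minute spike that would have set the raw maximum is heavily damped once a five-minute filter is applied, which is the point of comparing the filtered Tmax and Tmin series before declaring extremes.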
Surely this would work both ways?
If the instruments react to short periods of extreme warmth they should also react to short periods of extreme cold. If Australia really isn’t warming, despite the evidence from both surface and satellite data, then this BOM compilation method should produce as many new record cold temperatures as it does warm ones.
As has been explained above, there are more sources of spurious warm readings than there are for spurious cold readings.
Plus the algorithm already truncates spurious low readings. It doesn’t truncate spurious warm ones.
Precisely. I have never experienced a jet aircraft give out a short burst of cold air.
Night time temperature is not as corrupted because there’s less wind movement as the lapse rate collapses.
While running a test cell for gas turbine engines, we encountered numerous anomalies with temperature measurements using “custom” temperature measurement devices. To calculate horsepower and horsepower losses, a number of higher-end electronic sensors were employed at various points on the engine and its subsystems. Most of these sensors were expensive and had an error rate of 0.01 degrees C. We had to constantly calibrate the system using a calibrated heat bath, because not only were the sensors prone to drift, but the monitoring equipment reading the temperature probes was also prone to error. We found that if we did not calibrate the temperature system on a three-month schedule, the turbines we were testing would fail test due to being out of parameters. We also found that fluctuations in the power supplies used for the monitoring equipment had a large influence on the output readings. I wonder how often the weather bureau actually checks these sensors for proper calibration.
From what I have read, they are checked prior to installation, and that’s it.
Sadly, the most surprising part of this is that the heat record for Dodge City is only 110F, must be the elevation.
I’m sure the BoM is looking into this, and will inexplicably decide that the electronic sensors are actually creating a cooling bias, and the raw data will need to be adjusted upward.
O/T somewhat
Check this description of a weather station site
http://www.smalldeadanimals.com/2017/10/the-sound-of-se-596.html#comment-1131063
Why do weather stations still need to be physically opened to read values?
This could indeed disturb the inner air and affect measurements.
Can recorded data not be transmitted electronically?
That’s just great – a random systematic* error!
Nick S and his silly mate with the 8 foot measuring stick will be along shortly to tell us it doesn’t matter!
He’ll say:
“We’ve got probability distribution, you don’t need real measurements.
And besides, accuracy is not a numerical quantity, precision is what matters and these sensors are very precise.” 😉
*Systemically induced random truth generating sensors!
The lack of commentary by Nick Stokes on this sensing time/measurement time issue is oddly curious.
But refreshing.
Anthony,
It is my experience that the other side will not engage at all on this issue. And recently the Bureau made the decision to refuse to provide comment to any journalist requesting information on anything related to this issue of measurement.
It’s almost as if they are reading from Benjamin Disraeli’s playbook:
‘never apologise, never explain’.
Perhaps even thinking about surface measurements being a confused mix of temperature and wind speed hurts them right in the religious parts.
Another case of the dog that did not bark I guess.
If that sensor behaviour is as described, what is there to comment on? The sensor problem creates a form of Cauchy-like distribution: there is no stable mean, because it depends on the length of application of the spike. It’s one of those pathological cases where, if someone like Nick S tries to apply his Central Limit Theorem, you get garbage.
You would need to measure the asymptotic effect, put some likelihood parameters on it, and approach the statistics that way to make the sensor readings work properly. That means measuring all sorts of spike behaviour across application times and environmental conditions. It’s doable, but would be tedious.
The obvious solution is to fix the damn sensors.
[mr] mosher also.
that should have been mr mosher.
It can be readily shown that there is no way of averaging a quick-response thermometer to match the old-fashioned mercury one, as the enclosure is not perfect, so the way readings need to be averaged varies with the sunshine levels and the wind strength.
Scientists need proper training in the non-theoretical work they do. There is no kudos, and probably even less money, in the sort of study that checks the practical aspects of their work – and it is over half the real work.
David “as the enclosure is not perfect”. Yes and the old Australian screen was way bigger, had white wooden legs not a grey steel pole and may have let less light in through the base.
Please give us an edit facility, as I always notice the bad typing after I send, no matter how often I look beforehand.
Why does this not surprise me….???
Jennifer,
For statistical advice, contact Prof. William Briggs of Heartland Institute and Cornell University
https://www.heartland.org/about-us/who-we-are/william-briggs
Or Steve McIntyre who debunked Mann’s hockey stick
https://climateaudit.org/contact-steve-mc/
Jennifer ==> In the US, with tide gauges, the standard is to average 181 one-second readings (centred on the 1/10th hour) to arrive at a six-minute (1/10th-hour) reading. ONLY the six-minute readings are permanently recorded. This is all done in a dedicated “computer”, officially called a “primary data collection platform”. Not only does it find the mean of the 181 one-second readings, it does it once, throws out 3-sigma outliers, then does it again before recording and transmitting the official six-minute measurement. Thus, hopefully, the spurious one-second readings are dealt with, and the rapid fluctuations are smoothed out to what we would agree might be a dependable idea of the tide height, every six minutes.
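The two-pass procedure described here – mean, discard 3-sigma outliers, mean again – can be sketched as follows. This is illustrative only, not NOAA’s actual firmware, and the tide-height values are invented:

```python
# Sketch of a two-pass mean with 3-sigma outlier rejection (illustrative,
# not NOAA's actual data collection platform code).
import statistics

def two_pass_mean(readings, k=3.0):
    """Mean, discard readings more than k sigma from it, then mean again."""
    mu = statistics.mean(readings)
    sigma = statistics.pstdev(readings)
    kept = [r for r in readings if abs(r - mu) <= k * sigma] if sigma else readings
    return statistics.mean(kept)

# 181 one-second tide readings around 2.00 m with one spurious spike.
readings = [2.0] * 180 + [9.9]
print(round(statistics.mean(readings), 3))  # 2.044 - raw mean pulled up
print(round(two_pass_mean(readings), 3))    # 2.0   - spike rejected
```

A single bad one-second reading drags the naive mean up, but fails the 3-sigma test on the first pass and is excluded from the recorded value.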
The ASOS User’s Guide relates what an ASOS is supposed to do with ambient temperature:
Just from the rounding up, a warming signal would emerge.
Note they are recording actual temperature.
This rounding alone would lead to a temperature measurement step when electronic thermometers were instituted.
Just imagine the possibilities open here were banks to use this form of ‘averaging’.
There would be an enquiry.
Err, hang on……
Kip Hansen,
Thanks for sharing this!
How cool: “Not only does it find the mean of the 181 one-second readings, it does it once, throws out 3-sigma outliers, then does it again before recording and transmitting the official six-minute measurement.”
Something like this should be installed by our Bureau for air temperatures!
JM ==> In actual fact, the NOAA’s ASOS User’s Guide calls for a system very like the Tide Gauges system.
So, six ten-second readings are averaged to produce a one-minute measurement recorded for the time Hr:Minute:00.
Five one-minute values are used to calculate a 5-minute average; there is some evaluation to determine whether there are at least four valid one-minute values, so maybe there is an out-of-range check in there.
Similar to the tide-gauge system, these 5-minute values should be the only permanently recorded temperatures, recorded to the nearest degree F and converted to degrees C to the nearest 0.1.
That’s the official NOAA process inside the ASOS.
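Pulling the steps described above together, a rough sketch of that pipeline might look like this. The details (function names, the exact validity rule) are my assumptions; the ASOS User’s Guide is the authoritative source:

```python
# Sketch of the ASOS-style pipeline as described in the comments above
# (assumed details; consult the ASOS User's Guide for the real procedure):
# ten-second readings -> one-minute means -> five-minute mean, then
# rounding to whole degrees F and conversion to the nearest 0.1 C.

def one_minute_value(ten_second_readings_f):
    """Average six 10-second readings into a one-minute value (deg F)."""
    return sum(ten_second_readings_f) / len(ten_second_readings_f)

def five_minute_value(minute_values_f):
    """Average five one-minute values; require at least four valid ones."""
    valid = [v for v in minute_values_f if v is not None]
    if len(valid) < 4:
        return None  # not enough valid minutes to report
    avg_f = sum(valid) / len(valid)
    rounded_f = round(avg_f)                      # recorded to nearest deg F
    celsius = round((rounded_f - 32) * 5 / 9, 1)  # converted to nearest 0.1 C
    return rounded_f, celsius

minutes = [one_minute_value([95.2, 95.4, 95.1, 95.6, 95.3, 95.4]) for _ in range(4)]
minutes.append(None)  # one minute missing - still reportable with four valid
print(five_minute_value(minutes))  # (95, 35.0)
```

Note how the rounding to whole degrees F happens before the Celsius conversion, which is where the rounding step mentioned in the earlier comment enters the record.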
Recording record temperatures by the second now, are we? One hair dryer and records are shattered.
So the science is settled, is it? But the way to compare and record temperatures isn’t. This climate fraud just gets too bizarre for words. From water bottles randomly thrown off ships to split-second temperature readings sprinkled around the planet and airport tarmacs. Yes, that sure sounds like something the broke nations of the world need to spend $$trillions on. Good grief!
Find a new scary religion that the pant-wetters can pay for if they wish.