Australia Weather Bureau Caught Tampering With Climate Numbers

Climate Change wooden sign with a desert background (By ESB Professional/Shutterstock)

From The Daily Caller

Chris White

9:57 PM 07/31/2017

Australian scientists at the Bureau of Meteorology (BOM) ordered a review of temperature recording instruments after the government agency was caught tampering with temperature logs in several locations.

Agency officials admit that the problem with instruments recording low temperatures likely occurred in several locations throughout Australia, but they refuse to admit to manipulating temperature readings. The BOM located missing logs in Goulburn and the Snowy Mountains, both of which are in New South Wales.

Meteorologist Lance Pidgeon watched the 13-degree-Fahrenheit Goulburn recording from July 2 disappear from the bureau’s website: the reading fluctuated briefly, then vanished.

“The temperature dropped to minus 10 (14 degrees Fahrenheit), stayed there for some time and then it changed to minus 10.4 (13 degrees Fahrenheit) and then it disappeared,” Pidgeon said, adding that he notified scientist Jennifer Marohasy about the problem; she then brought the readings to the attention of the bureau.

The bureau later restored the original minus 10.4 (13 degrees Fahrenheit) reading after a brief question-and-answer session with Marohasy.

“The bureau’s quality control system, designed to filter out spurious low or high values, was set at minus 10 minimum for Goulburn, which is why the record automatically adjusted,” a bureau spokeswoman told reporters Monday. The BOM added that there are limits placed on how low temperatures can go in some very cold areas of the country.
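The kind of quality-control filter the spokeswoman describes can be sketched as a simple fixed-range check. This is an illustration only: the −10 floor mirrors the reported Goulburn setting, while the +55 ceiling is an assumption (it matches the range quoted from BOM documentation in the comments below), not a confirmed configuration.

```python
# Illustrative sketch of a fixed-range quality-control check:
# readings outside preset bounds are auto-rejected as "spurious".
FLOOR_C = -10.0    # reported Goulburn floor
CEILING_C = 55.0   # assumed upper bound

def qc_filter(readings_c):
    """Split readings into those accepted and those auto-rejected."""
    accepted = [r for r in readings_c if FLOOR_C <= r <= CEILING_C]
    rejected = [r for r in readings_c if not (FLOOR_C <= r <= CEILING_C)]
    return accepted, rejected

accepted, rejected = qc_filter([-10.4, -10.0, -8.2, 3.5])
# the -10.4 reading is dropped even if the instrument measured it correctly
```

Note that the check cannot distinguish an instrument fault from a genuinely cold night; anything below the floor is discarded either way.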

Bureau Chief Executive Andrew Johnson told Australian Environment Minister Josh Frydenberg that the failure to record the low temperatures at Goulburn in early July was due to faulty equipment. A similar failure wiped out a reading of 13 degrees Fahrenheit at Thredbo Top on July 16, even though temperatures at that station have been recorded as low as 5.54 degrees Fahrenheit.

Failure to observe the low temperatures had “been interpreted by a member of the community in such a way as to imply the bureau sought to manipulate the data record,” Johnson said, according to The Australian. “I categorically reject this implication.”

Marohasy, for her part, told reporters that Johnson’s claims are nearly impossible to believe given that there are screenshots showing the very low temperatures before they were “quality assured” out. It could take several weeks before the equipment is tested, reviewed and ready for service, Johnson said.

“I have taken steps to ensure that the hardware at this location is replaced immediately,” he added. “To ensure that I have full assurance on these matters, I have actioned an internal review of our AWS network and associated data quality control processes for temperature observations.”

BOM has been put under the microscope before for similar manipulations: in August 2014 the agency was accused of tampering with the country’s temperature record to make it appear as if temperatures had warmed over the decades.

Marohasy claimed at the time that BOM’s adjusted temperature records are “propaganda” and not science. She analyzed raw temperature data from places across Australia, compared them to BOM data, and found the agency’s data created an artificial warming trend.

Marohasy said BOM adjustments changed Aussie temperature records from a slight cooling trend to one of “dramatic warming” over the past century.

HT/MarkW and Joe Perth

marknutley
August 2, 2017 12:11 pm

How does limiting how low the temperature goes not introduce a bias?
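The commenter’s point can be demonstrated numerically: removing only the cold tail of a distribution necessarily pulls the computed mean upward, because every discarded value lies below the average. A minimal simulation with synthetic data (invented numbers, not BOM’s):

```python
import random

random.seed(1)  # deterministic synthetic data
# synthetic winter-night temperatures centred near the cutoff, in degrees C
readings = [random.gauss(-8.0, 3.0) for _ in range(10000)]

full_mean = sum(readings) / len(readings)
kept = [r for r in readings if r >= -10.0]   # one-sided floor at -10 C
clipped_mean = sum(kept) / len(kept)

# every discarded value is below the mean, so the mean can only rise:
# a systematic warm bias, with no compensating trim of the hot tail
warm_bias = clipped_mean - full_mean
```

Since every rejected reading is colder than −10 °C, and hence colder than the overall mean, `warm_bias` is strictly positive whenever anything is rejected.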

RWturner
Reply to  marknutley
August 2, 2017 1:04 pm

I’d describe it as fraud more than simple bias.

Paul Mackey
Reply to  RWturner
August 3, 2017 12:34 am

Hear, hear. Why limit it? If there is any trend, eventually your spurious limit will be broken. Take all the readings, then let the stats do the work. Setting these artificial limits is essentially picking and choosing your data points to suit a preconceived idea. Not science.

M Seward
Reply to  marknutley
August 2, 2017 2:19 pm

How does ‘replacing the hardware’ not lock in the fake trend? (I assume ‘hardware’ means equipment that automatically self-censors low temperatures so as not to let such embarrassing matters get out in public again.)
These people are not a scientist’s bootlace.

BigBubba
August 2, 2017 12:11 pm

Edit required: -10 C is 14F and -10.4 C is 13.28 F (the article has them the other way round)

AussieBear
Reply to  BigBubba
August 2, 2017 2:54 pm

Geez, at least choose a unit of measurement, C or F, and stick with it. Don’t write minus 10 and then say it’s 14F. I know the story and it was still very distracting. Personally, I would use C like the rest of the world…

PaulH
Reply to  AussieBear
August 2, 2017 5:04 pm

I agree. The unnecessary mixing of Celsius and Fahrenheit makes the article difficult to follow.

August 2, 2017 12:12 pm

Governments worldwide are corrupt. Thankfully, not all of us are.

Tom Halla
August 2, 2017 12:14 pm

And the BOM has never, ever done anything remotely like this in the past./sarc

Auto
Reply to  Tom Halla
August 3, 2017 2:16 pm

Tom Halla,
I am astonished.
The watermelons, as you rightly imply, will use every deceit in the book – or so it seems to this outsider – to massage the Adjustocene records.
Sad.
Sad – but not science; advocacy; or taking thirty pieces of silver.
Auto

August 2, 2017 12:20 pm

It’s not only recent records: earlier record high temperatures from the early part of last century have also been adjusted, at times based on temperatures recorded at other weather stations hundreds of kilometres away. Strangely, they were generally adjusted down!
http://jennifermarohasy.com/2014/03/fiddling-temperatures-for-bourke-part-1-hot-days/
http://joannenova.com.au/2014/09/the-mysterious-lost-hot-sunday-in-bourke-did-it-really-happen/

Bryan A
August 2, 2017 12:21 pm

Oh, how the AGWers would SCREAM if there were a maximum-temperature preset and measurements were automatically dropped when readings went above that threshold.

skorrent1
Reply to  Bryan A
August 2, 2017 7:37 pm

Many years ago the Thai Tourist Bureau decreed that temperatures at Bangkok airport would never be greater than 95F. Bad for tourism, donchano.

Robert B
Reply to  skorrent1
August 3, 2017 5:54 am

It might have been an issue once at an important station in my home town in Australia during the 70s and early 80s. A friend who lived nearby said they watered the lawns in the middle of the afternoon around the station on very hot days. January means show a negative trend until 1990.
http://www.bom.gov.au/tmp/cdio/076031_36_01_5197161740161394903.png

August 2, 2017 12:36 pm

This is simply bogus. They’re hiding behind automated “outlier” rejection as an excuse.
If you automatically adjust “outliers”, you’ve programmed the results. It isn’t science.

Latitude
Reply to  Bartleby
August 2, 2017 12:56 pm

…wonder how many record lows they’ve missed

Reply to  Latitude
August 2, 2017 8:49 pm

From here,
http://www.bom.gov.au/climate/change/acorn-sat/documents/ACORN-SAT_Observation_practices_WEB.pdf
The range could be
[…Since 1938, for the meteorological range of –10 to +55°C…]
And if that is indeed the range, this could easily be one of those “adjustments” whereby more legitimate cold data is rejected than hot data thus biasing the more modern automated readings to be warmer. One might wonder what ranges are in place in other countries.

Reply to  Latitude
August 2, 2017 8:59 pm

Further down in the document is

Unlike the MiG and AiG thermometers, automated processes developed in the late 1980s and the next two decades enabled all PRS purchased for use to be checked prior to use in the field, to ensure they met manufacturing standards and provided readings within measurement tolerances of ±0.08°C over a temperature range from –10 to +55°C. If a PRS failed to satisfy those tolerances, it was returned to the manufacturer for replacement. As they were also used for wet-bulb thermometry, the PRS were batch-tested to ensure the platinum-resistance element was located in the required position in the probe shaft.

So the reading of -10.4 was removed because it was no longer in spec for guaranteed accuracy.

Grant
August 2, 2017 12:38 pm

Bet the high values are not automatically adjusted.

August 2, 2017 12:38 pm

Failure to observe the low temperatures had “been interpreted by a member of the community in such a way as to imply the bureau sought to manipulate the data record,” Johnson said, according to The Australian. “I categorically reject this ­implication.”

It was just a lucky coincidence.
Like the boss winning the company raffle.

schitzree
Reply to  M Courtney
August 2, 2017 7:16 pm

Like the boss winning the company raffle.

Three times in a row.
~¿~

Louis
Reply to  M Courtney
August 3, 2017 10:54 am

When climate scientists resort to manipulating data, it tells me they are losing confidence that CO2 can warm temperatures by itself. They obviously feel the need to provide an assist. The political ends justify the means.

Chris Schoneveld
Reply to  Louis
August 6, 2017 12:57 am

CO2 doesn’t warm temperatures, it warms the air or the water.

Thomas Homer
August 2, 2017 12:42 pm

Imagine if two thermometers were installed at each location. With some level of redundancy, independent bad readings would be more apparent (even within acceptable ranges) and outlier readings where both match would be more accepted.
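One way to implement the suggestion, sketched here with an arbitrary agreement tolerance of 1 degree (my number, not any agency’s standard): flag a reading only when the paired instruments disagree, and accept even extreme values when they corroborate each other.

```python
def classify(sensor_a, sensor_b, tolerance=1.0):
    """Paired-sensor check: agreement lends credibility even to outliers;
    disagreement flags the reading for review rather than deleting it."""
    if abs(sensor_a - sensor_b) <= tolerance:
        return "accept"
    return "suspect"

result_cold = classify(-10.4, -10.3)  # extreme but corroborated
result_bad = classify(-10.4, -3.0)    # likely one faulty sensor
```

Under this scheme a genuinely cold night that both instruments record survives QC, while a lone implausible reading is merely flagged, not overwritten.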

randy julander snow survey
Reply to  Thomas Homer
August 2, 2017 1:04 pm

a man with 2 watches never knows what time it is. we do have sites with multiple temp sensors. they can be up to 10 degrees different in any given hour. slight differences in solar exposure make a big difference. differences of up to 1/2 a degree simply in sensor accuracy. would rather just have one and say the temperature is ‘this’….

Reply to  randy julander snow survey
August 2, 2017 3:27 pm

I suppose I was just slow on that one 🙂

Reply to  Thomas Homer
August 2, 2017 3:26 pm

Thomas, we have an old saying among statisticians. It goes like this:
“A man with a wristwatch always knows the time. A man with two wristwatches can never be sure…”

John Harmsworth
Reply to  Bartleby
August 2, 2017 4:48 pm

An Australian knows enough to paint the optimal time on his wrist!

D. J. Hawkins
Reply to  Bartleby
August 2, 2017 5:09 pm

Harmsworth
Which would, of course, be “beer o’clock”, right? 😀

schitzree
Reply to  Bartleby
August 2, 2017 7:19 pm

As the song says, ‘It’s Five O’Clock Somewhere.’
^¿^

tetris
Reply to  Bartleby
August 3, 2017 11:05 am

Another saying from when I did stats [long, long ago in a faraway country..]:
statistics is like a bikini – what remains hidden is probably more interesting than what’s revealed.

August 2, 2017 12:46 pm

I can see how administrators could convince themselves that “quality control” was not “manipulation”.
But still I would say that this is a soft form of negligence: a reasonable person with reasonable intelligence in their field of expertise should know that such temperatures are possible, and to deny that possibility in the guise of “quality control” is a form of misrepresentation, hence fraud. It’s fraud used to make oneself complacent in one’s role of carrying out the rules. And if the rules are based on fraud, then the actions of those who enforce those rules are based on fraud.
Fraud, fraud, fraud. Have I said it enough now?

John Harmsworth
Reply to  Robert Kernodle
August 2, 2017 4:55 pm

If it isn’t fraud, then what are the limits on the highs? What do lower temperature records actually mean? If the record is -10, then maybe that wasn’t really possible either, so they should raise the limit to -9.9C. Hmmmm… it’s fraud!

Resourceguy
August 2, 2017 12:46 pm

Is this the AI threat we were warned about recently? I suppose we could miss the next ice age’s signals with this ‘hand on the dial’ approach.

August 2, 2017 12:52 pm

If you think the sun done it:
the solar cycle 24 sunspot number for July 2017 in the old money (Wolf SSN) is fractionally down to 11 points, while Svalgaard’s new reconstructed number is at 18.3.
Composite graph is here
SC24 is nearing what might be a prolonged minimum (with a late start of SC25), but a ‘dead cat bounce’ from these levels cannot be excluded.

Resourceguy
Reply to  vukcevic
August 2, 2017 12:56 pm

The southern US is seeing temps and weather fronts that look a lot like the cool summer of 2009 during that solar minimum. Two or three years of that pattern in place of one will be more noticeable.

randy julander snow survey
August 2, 2017 1:01 pm

sorry, I don’t agree with the supposed ‘tampering’ issue here. we at snotel use exactly the same process. since we collect hourly data from 900 stations we have to rely on computer programs as an initial filter to screen out howlers and screamers – bad data. we set an arbitrary limit that alerts us when a particular parameter is near or exceeding a record – high or low… cause that’s stuff you want to know and be aware of. and if its outside a reasonable bound, the computer sets it to suspect so its not incorporated into various hydrologic models as ‘good’ data. so that value would be removed from ‘public consumption’ until there is a validity check via neighboring site, an onsite visit, etc. this to me seems to be more of a standard protocol in the data collection business than an intent to manipulate data.
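The flag-and-withhold protocol described above can be sketched as follows. The bounds and field names here are illustrative, not SNOTEL’s actual configuration; the key property is that the original value is preserved and only its publication status changes.

```python
def screen(value, low=-60.0, high=45.0):
    """Flag out-of-bounds readings as suspect instead of altering them.
    The original value is always preserved alongside the flag."""
    ok = low <= value <= high
    return {"original": value,
            "flag": "ok" if ok else "suspect",
            "published": value if ok else None}

rec = screen(-69.4)  # an obvious howler: withheld from models, not destroyed
```

The contrast with a hard clip is the whole argument of the thread: here a validity check against a neighbouring site or a field visit can still reinstate the reading, because nothing was deleted.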

Latitude
Reply to  randy julander snow survey
August 2, 2017 1:22 pm

that alerts us… randy, that’s not the same thing…. read their two excuses again: “designed and automatically adjusted” … “faulty equipment” … can’t be both

RWturner
Reply to  randy julander snow survey
August 2, 2017 1:33 pm

“She analyzed raw temperature data from places across Australia, compared them to BOM data, and found the agency’s data created an artificial warming trend.
Marohasey said BOM adjustments changed Aussie temperature records from a slight cooling trend to one of “dramatic warming” over the past century.”
Does your method also turn cooling trends into warming trends?
Think about this just a little:
““The temperature dropped to minus 10 (14 degrees Fahrenheit), stayed there for some time and then it changed to minus 10.4 (13 degrees Fahrenheit) and then it disappeared,” ”

“Bureaus Chief Executive Andrew Johnson told Australian Environment Minister Josh Frydenberg that the failure to record the low temperatures at Goulburn in early July was due to faulty equipment. A similar failure wiped out a reading of 13 degrees Fahrenheit at Thredbo Top on July 16, even though temperatures at that station have been recorded as low as 5.54 degrees Fahrenheit.”
If you’re too lazy to think about it, Marohasy has already thunk it out for you:
“Marohasy, for her part, told reporters that Johnson’s claims are nearly impossible to believe given that there are screen shots that show the very low temperatures before being “quality assured” out.”
What are we to make of this? What was faulty about equipment that was giving accurate data? I take it that the “faulty equipment” was actually purposely programmed equipment, designed to systematically remove inconvenient data.

bitchilly
Reply to  RWturner
August 2, 2017 4:54 pm

some people appear to be missing this very important point. there was nothing wrong with the equipment. the problem arose with the program processing the data. someone from the bom wrote that program to prevent temperatures below a threshold being recorded. is this happening at other stations?

knr
Reply to  randy julander snow survey
August 2, 2017 1:42 pm

ever heard of current measurements running ‘too high’ and needing to be adjusted down, an approach readily used for historic records?

Gary Kerkin
Reply to  randy julander snow survey
August 2, 2017 3:08 pm

we have to rely on computer programs as an initial filter to screen out howlers and screamers – bad data

I appreciate your problem with the collection and recording from such a large number of stations but the term “computer programs” flags an alarm with me. The term suggests a set of rules for decision processes and I have to wonder what the bases for the rules are. Are they based on statistics? If so, what form of statistics? Gaussian? Normal? Skewed? Multimodal? Can the decisions determine that an outlier is an accident rather than a genuine reading?
Are the rules based on physics? If so, what physics? What assumptions are involved? Have those assumptions been rigorously verified?
In systems in which natural forces predominate I am reluctant to accept the rejection of outliers. To my mind the only acceptable form of modification is to correct demonstrable instrumentation problems such as, say, an inexplicable step change in the output from an instrument. In the shorter term nature does not display step changes (but please don’t challenge me with “earthquakes”—I live in New Zealand!)

randy julander snow survey
Reply to  Gary Kerkin
August 3, 2017 6:20 am

temperature data flagging is based on straight normal statistical bounds. we keep all original data. period. everything. we don’t estimate and replace any temperature data. if a data point is clearly bad (and all data collection systems based on thermistors and electronic transfer will have occasional clearly errant data, e.g. -69.4 or 2021) we set that to null in our data base (no data)… which as previously stated maintains 2 traces: original and edited. so anyone can go in and see what the original was and what the edited is.
when it comes to verification, most would be surprised to know how difficult testing in the field can be. using a certified thermometer as close to the thermistor as possible and recording the two instruments over a few hours, one can observe differences of 0 to 10 degrees. the only way to accurately certify a thermistor is to remove it, put it in an environmental chamber, and run it from bottom to top of scale. that’s one of the reasons we don’t even try to ‘estimate’ what a value might have been for any given observation.
we run on the assumption that all data are innocent and valuable until proven guilty.

Paul Penrose
Reply to  Gary Kerkin
August 3, 2017 7:16 am

Randy,
Your procedure sounds fine to me, but that’s not what the BOM was doing. And that’s the problem.

D. J. Hawkins
Reply to  randy julander snow survey
August 2, 2017 3:42 pm

I don’t see a problem with red-tagging data that might be suspect. A bad platinum RTD might throw a reading of “999.99”, so clearly something must be done. The better approach, in my thinking, is to leave the data up, displayed in a color indicating that it is under review. After the QA/QC process, a note indicating the result could be added to the final reading. In the meantime, if other users are scraping the data from your site for their own use, maybe you return a “no record” result for that time slot until the final determination is made.

John Harmsworth
Reply to  D. J. Hawkins
August 2, 2017 5:02 pm

Sounds good! Don’t bother applying to the Australian BOM!

Hivemind
Reply to  D. J. Hawkins
August 3, 2017 12:49 am

“Red-tagging” is fine. But the BOM didn’t do that. They CHANGED THE DATA. Totally unconscionable.

Quilter52
Reply to  randy julander snow survey
August 2, 2017 10:53 pm

Randy
Sorry, but this Aus citizen thinks it’s a fraud. Goulburn, where this occurred, is one of Australia’s coldest inland places, and I can certainly recall times over many years when the temperatures were below -10 C. A properly chosen lower limit there would meet the “this temperature’s dodgy” needs you mention in your post without artificially creating a warming trend by deleting low temperatures that really do occur. By the way, we have had a very cold winter here so far in my area of the ACT, with severe frost most mornings in July. No trumpeting of the “cold weather” because it doesn’t fit, although it is probably also due to a very dry winter so far. My non-meteorological observation is that dry weather usually means the winter is colder here in Canberra.

Hivemind
Reply to  randy julander snow survey
August 3, 2017 12:47 am

“alerts us when a particular parameter is near or exceeding”
Alerts, yes. Flags data as suspect, ok. But changing data to something else? That’s unconscionable.

RWturner
August 2, 2017 1:02 pm

Add it all up and the Earth is probably already in a slight cooling trend since at least 2003.
On a side note, it appears there are signs of a slight La Nina forming:
http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/ocean/weeklyenso_clim_81-10/wkteq_xz.gif

M Seward
Reply to  RWturner
August 2, 2017 4:52 pm

RW Turner,
what you offer is obviously spurious data from faulty instruments and really should have been taken off line as soon as the Director of Climate Control was made aware. S/he will doubtless be sent for counselling in any case, as El Nino/La Nina is a male-invented dichotomy used to finagle the truth about the relentless and almost uniform progress of the global warming disaster hitherto created by the male-dominated economic and political regime now crumbling before a feminist-influenced reality.
The Director of Public Persecution will be in touch……meanwhile reflect on the truthiness of the Oz BoM data after it is purified.

TonyL
August 2, 2017 1:02 pm

“The bureau’s quality control system, designed to filter out spurious low or high values, was set at minus 10 minimum for Goulburn, which is why the record automatically adjusted,” a bureau spokeswoman told reporters

From a media flack, OK.
Then there is this:

Bureau Chief Executive Andrew Johnson told Australian Environment Minister Josh Frydenberg that the failure to record the low temperatures at Goulburn in early July was due to faulty equipment.

Somebody is not telling the truth.
What on earth is a “spurious” value in a linear measurement system, anyway? Either your equipment is functioning correctly or it is not. If your equipment is functioning correctly, you do not get to pick the data you like and discard the rest. If your equipment is broken, you fix it. Again, you do not get to pick and choose what data you like from a broken system.

RWturner
Reply to  TonyL
August 2, 2017 1:36 pm

Good point: faulty equipment would give an erroneous reading at all times and would be obvious at all times, not just when the temperature dropped below an inconvenient point.

Paul Penrose
Reply to  RWturner
August 3, 2017 7:22 am

RWturner,
No, faulty equipment can have intermittent failures. I’ve seen it many times in electronic systems. All it takes is a cold solder joint that has thermally cycled enough to crack, and/or started to oxidize. Or a cracked diode. There are many things that will appear to work fine at certain temperature ranges, but outside that they begin to fail. And they might not be very consistent even then. Lots of non-linear effects going on.

RWturner
Reply to  RWturner
August 3, 2017 8:07 am

I’ve never seen electronic hardware intermittently fail due to physical effects caused by external temperature. Electronics can overheat, then cool off, and work fine after that, sure. But solder, diodes, or capacitors fail intermittently due to cold? Never witnessed that. Once they fail, they fail in my experience. Perhaps there are complex problems that arise in extreme cold that I’m not familiar with, but certainly much colder than a few degrees below freezing.

knr
Reply to  TonyL
August 2, 2017 1:38 pm

clearly you never worked in climate ‘science’, where a perk of the job is that you do get to pick the data you like and discard the rest. hell, you can even just make it up and still be fine, as long as your results give the ‘right’ results; no one in that profession gives a damn how you got them.

TonyL
Reply to  knr
August 2, 2017 1:52 pm

clearly you never worked in climate ‘science’

True, very true.
I have always had real jobs, where what I built, hardware and software, had to work, and work correctly.
It is harder to do things this way, but people do insist, and they can be fussy about it.
I have been following the Climate Wars for a long time, yet I never cease to be amazed at how even the most basic precepts of data integrity are so flagrantly ignored.

D. J. Hawkins
Reply to  knr
August 2, 2017 3:48 pm

@TonyL
Ahhh, yes. To quote Dr. Ray Stantz: “I’ve worked in the private sector. They expect results.”

bitchilly
Reply to  TonyL
August 2, 2017 4:56 pm

you might want to mention that to those processing the argo data .

Svend Ferdinandsen
August 2, 2017 1:18 pm

“I have taken steps to ensure that the hardware at this location is replaced immediately,”
What does that mean? Was it an equipment failure that measured too low, or will the new equipment limit itself to -10?
Before they change the equipment it would be wise to check it, and tell us whether it was a failure or not.

RWturner
Reply to  Svend Ferdinandsen
August 2, 2017 1:38 pm

I imagine the equipment has already been crushed into a tiny cube. These types of instances seem to be the only times that bureaucracies fail to follow proper procedures, like Clinton accidentally destroying so many devices and hard drives after the FBI asked for them.

Rotor
Reply to  RWturner
August 2, 2017 8:01 pm

That equipment has joined the IRS hard drives in the great computer beyond.

Doonman
Reply to  Svend Ferdinandsen
August 2, 2017 10:32 pm

Anyone who takes official measurements with instrumentation has a certified calibration program in place that must keep periodic records on the performance of all instruments to specification and accuracy against independent standards. Any malfunction or suspected malfunction of an instrument is immediate cause for shutdown and return to the calibration agency where it is impounded and retested. Out Of Tolerance reports are generated and the performance trends of instrument populations are followed as well.
This should be easy to check. If they don’t have these records available, then their data is not data anyway.

knr
August 2, 2017 1:36 pm

Funny how these ‘mistakes’ always end up favoring ‘the cause’. With luck like that, they are wasting their time being climate ‘scientists’ when they could be hitting the tables in Vegas.

The Reverend Badger
Reply to  knr
August 2, 2017 2:22 pm

Returns are better in their day job.

drednicolson
Reply to  knr
August 2, 2017 5:16 pm

Casino owners employ a lot of statisticians to keep detailed data on winnings and losses, for two main purposes: to ensure that the odds stay in favor of the house (otherwise they’re losing money) and to find potential cheaters (who are winning more than they “should”).
So someone with a climate scientist’s run of luck would quickly be interrogated by security and/or escorted off the grounds. ;]

Curious George
August 2, 2017 1:38 pm

We the Government set the low limits for a temperature, and no damn instrument has the authority to override our decisions.

PSU-EMS-Alum
August 2, 2017 2:03 pm

This reminds me of a data set I was playing with recently.
I was curious what the difference would be between calculating a daily average as (H+L)/2 and taking the average of all observations in a day.
I’ll have to dig up the code, but I think the average difference for each day over a decade was roughly 0.3 ± 1.5 °F between the two methods.
In the case where the daily average is a mean of all observations, ignoring a seemingly wrong value would have a much smaller effect on the result.
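The comparison described, (H+L)/2 versus the mean of all observations, looks like this for one invented day of hourly readings (numbers are mine, chosen only to show that the two statistics differ):

```python
# 24 hourly temperature readings for a hypothetical day, in degrees C
hourly = [12.0, 11.5, 11.0, 10.8, 11.2, 12.5, 14.0, 16.0,
          18.0, 19.5, 20.3, 20.8, 21.0, 20.6, 19.8, 18.5,
          17.0, 15.5, 14.5, 13.8, 13.2, 12.8, 12.4, 12.1]

midrange = (max(hourly) + min(hourly)) / 2   # (H+L)/2, the traditional method
true_mean = sum(hourly) / len(hourly)        # mean of all 24 observations

# here min = 10.8 and max = 21.0, so the midrange is 15.9,
# while the full mean is about 15.37: roughly half a degree apart
```

The midrange depends entirely on the two extreme readings, which is why a single discarded low changes it so much more than it changes the full-day mean.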

Bryan A
Reply to  PSU-EMS-Alum
August 2, 2017 2:14 pm

I would imagine that (H+L)/2 only gives you the midrange, not necessarily the average, which could fall above or below it.
Take this set:
15.1
16.3
17.4
18.1
19.1
(H+L)/2 gives (15.1 + 19.1)/2 = 34.2/2 = 17.1,
while the average is 17.2.

Bryan A
Reply to  Bryan A
August 2, 2017 2:17 pm

Throwing out the low in this case would give an average of 17.725, about half a degree warmer than the 17.2 average with the low reading included.

The Reverend Badger
Reply to  Bryan A
August 2, 2017 2:28 pm

Surely we are interested in heat flow so the temperatures should be in K absolute, you get a better perspective on it then so as to comprehend the real physical processes. Arbitrary scales of C or F may hinder proper understanding. One can always convert back to C/F later for the non-scientists.

Bryan A
Reply to  Bryan A
August 2, 2017 2:37 pm

C is far less arbitrary than F, though. 1 C and 1 K are the same animal; the only difference is the zero-point location.

Bryan A
Reply to  Bryan A
August 2, 2017 2:39 pm

Water boils at 100 C, which is 100 K higher than where it freezes.

Curious George
Reply to  PSU-EMS-Alum
August 2, 2017 3:17 pm

There is the min-max thermometer, which records a min (L) and a max (H). Most historical data were recorded with min-max thermometers, read once a day. If you switch to an average of 1,000 observations per day, you have a different animal.

schitzree
Reply to  Curious George
August 2, 2017 7:46 pm

True, but if you record 1,000 observations a day you can still pick out the min and max for a particular 24-hour period; it is then the same animal and can be compared to the historic data.
I assume this is how it’s being done: you get the high-quality data for accurate weather forecasting but can still use it with historic data for long-term trends.
I mean, it would be stupid to limit ourselves to the data quality levels of a hundred years ago, wouldn’t it?
~¿~
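Recovering min-max-thermometer-equivalent values from dense automated records, as suggested above, is mechanical. A sketch (the date-keyed tuple input format is my own assumption, not any agency’s schema):

```python
from collections import defaultdict

def daily_min_max(observations):
    """observations: iterable of (date_string, temp_c) pairs.
    Returns {date: (min, max)}, directly comparable to once-a-day
    min-max thermometer logs."""
    by_day = defaultdict(list)
    for day, temp in observations:
        by_day[day].append(temp)
    return {day: (min(temps), max(temps)) for day, temps in by_day.items()}

obs = [("2017-07-02", -10.4), ("2017-07-02", -6.0), ("2017-07-02", 4.1),
       ("2017-07-03", -3.2), ("2017-07-03", 7.5)]
daily = daily_min_max(obs)
```

Note that any reading deleted by upstream QC, such as a clipped −10.4, silently becomes a too-warm daily minimum here, which is the crux of the whole dispute.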

August 2, 2017 2:18 pm

Speaking from experience, this idea of readings being either acceptable or outliers reminds me of the “normal ranges” that pathology laboratories have established for assessing blood-test results. Results are deemed to be within the “normal range”, outside it, or borderline, but that range is only what has been deemed normal for the average person through tests conducted over time by that laboratory. It doesn’t allow for the fact that for some people “normal” may lie outside the range, and a test that falls within the range may, rather than indicating that all is well, actually be a sign that something is not right. The same applies to blood pressure readings.

hunter
August 2, 2017 2:24 pm

Jennifer Marohasy deserves the hat/tip for this. She outed the BOM on this and has outed them on earlier occasions for “modifying” the past.
She is a true heroine of this – rigorously devoted to the science and the truth, and paying in her professional life the price of being an honest and vocal skeptic.

Quilter52
Reply to  hunter
August 2, 2017 10:59 pm

Absolutely true re Jennifer. She has always called the crap and paid a significant penalty for it I believe. But eventually we will see through all this stuff and she will be recognised as a heroine. Meanwhile, we will get the suitably adjusted data to reflect the world they want rather than the world we actually have.

August 2, 2017 2:26 pm

Essay ‘When Data Isn’t’ provides a number of examples from Australia, including most famously Rutherglen, also exposed first by Jen Marohasy. In the cases of Darwin and Rutherglen, it appeared to be faulty homogenization, with the bad contaminating the good. Regional-expectation homogenization is always logically flawed if comparison stations are of dissimilar quality or circumstances. For a concrete example, see footnote 26 on BEST 166900.

August 2, 2017 2:43 pm

Bureau’s Chief Executive Andrew Johnson told Australian Environment Minister Josh Frydenberg that the failure to record the low temperatures at Goulburn in early July was due to faulty equipment.

“I have taken steps to ensure that the hardware at this location is replaced immediately,” he added. “To ensure that I have full assurance on these matters, I have actioned an internal review of our AWS network and associated data quality control processes for temperature observations.”

Josh Frydenberg should be asking (in no particular order):
1. how do you know which equipment is faulty before any testing? Was it the sensor, cabling, the recording equipment, ???
2. if the temperature monitoring equipment is to be (arbitrarily?) replaced immediately, what steps are in place to reconcile the readings between the old and new equipment? Dual readings to cater for slight differences between new and old are not possible, as the old equipment is (apparently) faulty. The magical, secretive homogenising process rears its ugly head here.
3. knowing that this problem is faulty equipment, why is it necessary to action an internal review of the entire system? This seems like overkill for a known hardware fault at one location.
4. if the problem is faulty equipment, why did the reported data change? – “The temperature dropped to minus 10 (13 degrees Fahrenheit), stayed there for some time and then it changed to minus 10.4 (14 degrees Fahrenheit) and then it disappeared,”
5. what temperature will be recorded into the ‘official’ records for the moment in question?
6. how long has the equipment been faulty, how can you determine this with any certainty and how many records for this site will be adjusted or discarded due to the faulty equipment?
Frydenberg should ensure that the answer to the next questions is a categorical YES:
7. will the internal review be overseen by any external authority or expert panel to ensure no cover-ups?
8. will the internal review be made public?
9. will this spokeswoman lose her job, either for giving misinformation about this issue (she is at odds with the bureau chief) or for letting the general public know (if the bureau chief is wrong) that there is a system in place to limit the recording of temperature data?

“The bureau’s quality control system, designed to filter out spurious low or high values was set at minus 10 minimum for Goulburn which is why the record automatically adjusted,” a bureau spokeswoman told reporters Monday. BOM added that there are limits placed on how low temperatures could go in some very cold areas of the country.

10. if (when) the internal review reveals that there is an arbitrary lower limit on temperature ‘data’, will the bureau chief be asked to resign for lying to the minister and the public, having claimed with some certainty that the issue is faulty equipment?
No doubt smarter people than me can think of many more questions, both about this incident and also about the entire process.
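On question 2, the standard way to reconcile two instruments (when both actually work) is a parallel overlap period, running them side by side and estimating the systematic offset between them. A minimal sketch with made-up numbers, not BOM procedure:

```python
# Hypothetical overlap readings (deg C) from old and new sensors run in parallel.
old_sensor = [12.1, 13.4, 11.8, 12.9]
new_sensor = [12.4, 13.6, 12.1, 13.2]

# Estimate the mean offset between the two instruments.
offset = sum(n - o for n, o in zip(new_sensor, old_sensor)) / len(old_sensor)

# The historical series from the old sensor could then be shifted by this
# offset to splice the records - which is exactly what cannot be done here
# if the old sensor is declared faulty and removed immediately.
print(round(offset, 3))
```

The point of the question stands: replacing the hardware before any overlap destroys the only direct way to tie the two records together.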

DonK31
Reply to  John in Oz
August 2, 2017 5:55 pm

One more question. If the equipment is faulty, then why is any of the data from that equipment accepted? How long was it faulty and how much data from that site should be thrown down the memory hole?

schitzree
Reply to  John in Oz
August 2, 2017 8:06 pm

“The temperature dropped to minus 10 (13 degrees Fahrenheit), stayed there for some time and then it changed to minus 10.4 (14 degrees Fahrenheit) and then it disappeared,”

“The bureau’s quality control system, designed to filter out spurious low or high values was set at minus 10 minimum for Goulburn which is why the record automatically adjusted,” a bureau spokeswoman told reporters Monday. BOM added that there are limits placed on how low temperatures could go in some very cold areas of the country.

… Wait a minute.
If I’m reading that correctly, they are saying that they have the system rigged so that if the temperature goes below -10C it only records it as -10C, and the fault for which they are replacing the equipment is that it (briefly) showed a temp lower than that.
They aren’t the least bit apologetic that they have their metaphoric thumb on the scales; they are only worried because for a moment it became visible.
>¿<
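The behaviour schitzree describes amounts to a hard floor on the data. A minimal sketch of that assumed logic (this is an illustration, not BOM’s actual code):

```python
QC_MIN_C = -10.0  # assumed quality-control floor for the Goulburn site

def qc_clamp(reading_c, qc_min=QC_MIN_C):
    """Clamp any reading below the configured floor to the floor itself."""
    return max(reading_c, qc_min)

print(qc_clamp(-10.4))  # -10.0: the genuine extreme is lost
print(qc_clamp(-9.6))   # -9.6: readings above the floor pass unchanged
```

With a floor like this in place, no observation colder than -10C can ever enter the record, no matter what the instrument measures.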

Resourceguy
August 2, 2017 2:44 pm

This is equivalent to the statements of national and provincial leaders in various countries saying they have no gay people there. And there are no freezing temps in Australia either.

August 2, 2017 2:45 pm

Another major difficulty with believing data from the BOM are ‘raw’ is this: all temperature data in Australia are ONE SECOND READINGS, so a spike due to instrument error is recorded as the day’s maximum, unlike the rest of the world where typically 5 minute averages are used.
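The difference the comment above describes can be shown with made-up numbers: a one-second instrument spike becomes the day’s maximum under instantaneous sampling, while a 5-minute average barely notices it.

```python
def daily_max_instantaneous(samples_c):
    """Daily max taken from raw one-second samples."""
    return max(samples_c)

def daily_max_averaged(samples_c, window=300):
    """Daily max taken from non-overlapping 5-minute (300 s) averages."""
    averages = [sum(samples_c[i:i + window]) / window
                for i in range(0, len(samples_c) - window + 1, window)]
    return max(averages)

# A steady 20 C stretch with a single one-second spike to 35 C:
samples = [20.0] * 600
samples[300] = 35.0

print(daily_max_instantaneous(samples))  # 35.0: the spike IS the maximum
print(daily_max_averaged(samples))       # 20.05: the spike is damped away
```

Under one-second sampling, any transient instrument error is a candidate for a record high.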

August 2, 2017 3:20 pm

The Grand Project is to make 2017 appear ‘the hottest year ever’ for Global Warming, and the BOM is being paid to assist in this by limiting low temperatures and adjusting high temperatures even higher. Routinely the past temperatures are adjusted lower and current temperatures are reported higher. As a result of this BOM fraud’s success, the CSIRO is now emboldened to call for the recruiting of even more Climate Scientists to assist in this giant fraud under the guise of improving ‘Models’ for future planning. Their ‘Models’ have never worked and never will as they are based solely on the action of CO2, a trace gas.

August 2, 2017 3:38 pm

Zeke, we need you! Please explain this to us dunderheads.

Alan Esworthy
August 2, 2017 4:16 pm

The temperature dropped to minus 10 (13 degrees Fahrenheit), stayed there for some time and then it changed to minus 10.4 (14 degrees Fahrenheit) and then it disappeared…

That didn’t make sense to me, and no wonder. Those C-to-F conversions are wrong.
-10.0° C = 14.0° F (not 13° F)
-10.4° C = 13.3° F (not 14° F)
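The standard conversion confirms the correction above:

```python
def c_to_f(c):
    """Convert Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

print(c_to_f(-10.0))           # 14.0
print(round(c_to_f(-10.4), 1)) # 13.3
```

So the article’s parenthetical conversions are indeed swapped: -10.0 C is 14 F and -10.4 C is about 13.3 F, not the other way around.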

Gary Pearse
August 2, 2017 5:06 pm

The BOM’s ENSO graph jumped down into La Nina territory at the same time as their getting caught trimming off cold temperatures, after sitting in a holding pattern at the El Nino threshold for months. Coincidence? You get too many of these and trust is gone.

Amber
August 2, 2017 5:58 pm

Can we expect a new hockey stick graph? Too cold, you say? That cold is just science-fictionally impossible and must be ignored. Does “hide the decline” ring a bell?
When Antarctica continues to grow, why is it surprising that Australia doesn’t get side-swiped?
How did a copy of the NOAA manual of temperature “adjustments” end up in Aussie hands?

TonyL
Reply to  Amber
August 2, 2017 6:35 pm

How did a copy of the NOAA manual of temperature “adjustments ” end up in Aussie hands ?

Clandestine donations to the Clinton Foundation?

Robert from oz
August 2, 2017 9:31 pm

A min temp for Goulburn below -10 ringing an alarm bell, I suppose that’s plausible. But for Thredbo, which is above the snow line, -10 as a lower limit is going to take some ’splaining.

nankerphelge
August 3, 2017 5:06 am

“Marohasy, for her part, told reporters that Johnson’s claims are nearly impossible to believe given that there are screen shots that show the very low temperatures before being ‘quality assured’ out.”
Heck, one could even lose their job or university tenure for something like this.
Ah, just like the good Dr Marohasy did!!!

August 3, 2017 9:52 pm

Surely the solution to this, or fix, is to use fault-tolerant weather stations, e.g. with three thermometers in close proximity. When all are working, take the average. When one diverges wildly, take the average of the other two. If one thermometer consistently reads low, replace the instrument or station.
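The triple-sensor scheme proposed above can be sketched as median-based voting (the 0.5 C tolerance is an illustrative choice, not a standard):

```python
def fault_tolerant_reading(a, b, c, tolerance=0.5):
    """Average three sensors; drop any one that diverges from the median."""
    readings = [a, b, c]
    med = sorted(readings)[1]  # the median is robust to a single outlier
    good = [r for r in readings if abs(r - med) <= tolerance]
    return sum(good) / len(good)

# All three agree: average of all three.
print(round(fault_tolerant_reading(-10.4, -10.3, -10.5), 2))  # -10.4

# One sensor diverges wildly: it is dropped, the other two are averaged.
print(round(fault_tolerant_reading(-10.4, -10.3, -3.0), 2))   # -10.35
```

A scheme like this rejects a genuinely faulty sensor without imposing any arbitrary floor on what the temperature is allowed to be.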

tango
August 4, 2017 5:20 pm

BOM AND CSIRO SCIENTISTS ARE LUCKY THEY BELIEVE IN THEMSELVES, AS THE AVERAGE AUSTRALIAN DOESN’T BELIEVE THEM

Henry
August 6, 2017 1:29 pm

Apparently the record low at Goulburn Airport AWS was -10.9 Celsius on 17 Aug 1994, according to the BOM at http://www.bom.gov.au/climate/averages/tables/cw_070330_All.shtml, so if new measurements below -10C are now rejected, this record can never be broken.