January 26, 2022 By Jennifer Marohasy
It could be that the last 26 years of temperature recordings by the Australian Bureau of Meteorology will be found not fit for purpose and will eventually need to be discarded. Given the size of Australia, this would leave a rather large hole in the calculation of global warming.
The Australian Bureau of Meteorology undertakes industrial-scale remodelling of historic temperatures, through the process of homogenisation, in ways that generate more global warming for the same weather; remember the work I did some years ago on Rutherglen. The process is neither transparent nor scientific, and it lacks consistency with the Bureau’s own policies. Also of concern: since 1996 the Bureau has converted to custom-made electronic probes for temperature recording, and rather than averaging temperatures over one or five minutes, as is standard practice around the world for such equipment, the Australian Bureau records one-second extrema. To be clear, a ‘hottest temperature’ record is now a one-second automatic download from a supersensitive electronic probe, rather than a reading from a more inert mercury thermometer. This is another way the Bureau gets more global warming for the same weather, with its own third-generation probes. I am going to explain all of this in more detail in a book I’m sketching out, to write this year.
Theoretically the probes should also bias the minima downwards, except remember Thredbo? A few years ago I showed that the Bureau has placed limits on how cold an individual weather station can record, so most of the bias is going to be warmer.
Journalist Graham Lloyd, ignoring all the problems that I’ve documented in such detail over the years, was arguably somewhat premature when he published uncritically for the Bureau on 14th January.
… according to the Bureau of Meteorology annual climate statement, 2021 was the coolest year in nearly a decade and wettest since 2016. By the end of 2021 – and for the first time in five years – no large parts of the country were experiencing rainfall deficits and drought conditions.
Announcing BoM’s 2021 temperature data, climatologist Dr Simon Grainger says: ‘After three years of drought from 2017 to 2019, above-average rainfall last year resulted in a welcome recharge of our water storages but also some significant flooding to eastern Australia.’
In 2021, Australia’s mean temperature was 0.56C above the 1961-1990 climate reference period. It was the 19th-warmest year since national records began in 1910, but also the coolest year since 2012. Rainfall was 9 per cent above the 1961-1990 average, making 2021 the wettest year since 2016, with November the wettest on record.
I went looking for the Annual Climate Statement, and the supporting data. There was a note at the Bureau’s website saying the Annual Climate Statement wouldn’t be published until February. There was no data published on 14th January, just the promise.
I am keen to see which years they have ranked ahead of 2021 as hotter. But alas, I will have to wait until February. Of more concern, even in February the dilemma will remain that many of the earlier annual average temperatures were calculated with a different mix of weather stations.
Very few people realise – though I have explained all of this multiple times, including to multiple journalists – that when the Bureau of Meteorology transitioned in 2011 to the new Australian Climate Observation Reference Network – Surface Air Temperatures (ACORN-SAT) system for calculating the national average temperature, it removed 57 stations from its calculations, replacing them with 36 on-average hotter stations. This had the effect of increasing the recorded Australian average temperature by 0.42 degrees Celsius, independently of any actual change in the weather.
Of the 57 stations removed from the calculation of the national average temperature, only 3 of these had closed as weather stations. I will explain all of this in my book. You should be able to order it for Christmas.
There are so many problems with Australia’s official temperature record including the changing combination of stations, the use of custom-designed probes without averaging, not to mention all the homogenisation.
Australia has reliable historical temperature data for the period from about 1889 until 1996 measured using liquid-in-glass thermometers – mercury for maxima and alcohol for minima. Averaging the maxima and minima gives a mean temperature. In some of the records there is a cooling trend to about 1960, and then warming to the present. This is particularly the case at inland locations. Within the longer trend there are short cycles with temperatures generally trending up during drought, and down during wetter years.
When the longest continuous temperature series are combined, with a transparent system of area weighting, as I did in an analysis of temperature trends for south-eastern Australia published as a book chapter by Elsevier in 2016, the overall warming trend is only 0.3 degrees Celsius per century (1887 – 2013). This is significantly less than the Australian Bureau of Meteorology 0.7 degrees Celsius for the same region but a shorter period (1910 – 2016). NASA recently announced a rate of 1.1 degrees Celsius since the late 19th century average, the start of the industrial revolution.
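The arithmetic of area weighting is simple enough to sketch. In the Python snippet below, the station names, temperatures, and area weights are entirely hypothetical, for illustration only; they are not the data from the 2016 analysis.

```python
# Area-weighted mean temperature: each station's series is weighted by the
# land area it is taken to represent. All values below are hypothetical.
def area_weighted_mean(series, areas):
    """series: dict of station -> annual mean temperature (degrees C)
    areas: dict of station -> represented area (km^2, same keys)"""
    total_area = sum(areas[s] for s in series)
    return sum(series[s] * areas[s] for s in series) / total_area

temps = {"StationA": 18.2, "StationB": 21.5, "StationC": 16.9}
areas = {"StationA": 50_000, "StationB": 120_000, "StationC": 30_000}
print(area_weighted_mean(temps, areas))
```

Each station's temperature counts in proportion to the area it represents, so a sparsely instrumented inland region is not swamped by a cluster of coastal stations.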
The daily maximum and minimum values in ‘the national temperature dataset’ (the homogenised ACORN-SAT data) are different from the actual recorded historical values, often by several degrees, usually cooler. The further back you go in time, the more significant the cooling thus making what was a modest temperature increase over the period appear greater than it is.
This remodelling is in a different category from correcting for outliers that might have been caused by transcription errors or faulty equipment. It cannot be confused with legitimate data hygiene or quality assurance.
Furthermore since 1996, at an increasing number of weather stations platinum electronic probes have replaced the traditional mercury and alcohol thermometers. For example, the Rutherglen agricultural research station has a long, continuous, temperature record with minimum and maximum temperatures first recorded using standard and calibrated equipment in a Stevenson Screen back in November 1912. Considering the first 85 years of summer temperatures – unadjusted/not homogenized – the very hottest summer on record at Rutherglen is the summer of 1938/1939.
At Rutherglen, the first significant equipment change happened on 29 January 1998. That is when the mercury and alcohol thermometers were replaced with an electronic probe – custom built to the Australian Bureau of Meteorology’s own standard, with the specifications still yet to be made public.
According to Bureau policy, when such a major equipment change occurs there should be at least three years (preferably five) of overlapping/parallel temperature recordings. However, the mercury and alcohol thermometers (used to measure maximum and minimum temperatures, respectively) were removed on the same day the custom-built probe was placed into the Stevenson screen at Rutherglen, in direct contravention of this policy.
In 2011, the Bureau made a further change: it stopped averaging the one-second readings from the probe at Rutherglen over one minute. The maximum temperature recorded each day at Rutherglen is now the highest one-second spot reading from the custom-built probe. That is correct: a spot reading.
So, to reiterate, we now have a non-standard method of measuring (spot readings) from non-standard equipment (custom-built probes) making it impossible to establish the equivalence of recent temperatures from Rutherglen with historical data.
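The difference between spot readings and averaged readings is easy to demonstrate with a toy simulation; the numbers below are synthetic, not Bureau measurements. The maximum of raw one-second samples will almost always exceed the maximum of one-minute averages of the very same signal, because brief noise spikes survive in the spot readings but are smoothed away by averaging.

```python
import numpy as np

# Toy model: a smooth daily temperature curve sampled once per second,
# plus small random sensor/turbulence fluctuations. Purely illustrative.
rng = np.random.default_rng(0)
seconds = np.arange(86_400)
base = 20 + 10 * np.sin(2 * np.pi * (seconds - 21_600) / 86_400)  # peaks at midday
samples = base + rng.normal(0.0, 0.3, size=seconds.size)          # 1 Hz readings

spot_max = samples.max()                                  # one-second extremum
minute_max = samples.reshape(-1, 60).mean(axis=1).max()   # max of one-minute means

print(f"spot max: {spot_max:.2f}  minute-averaged max: {minute_max:.2f}")
```

In this sketch the one-second maximum sits noticeably above the one-minute-averaged maximum for identical weather, which is the point being made about the change of method.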
At Rutherglen, a modest rate of warming in the historical maximum temperatures of 0.7 degrees Celsius per Century was changed to 1.3 degrees Celsius in ACORN-SAT Version 2. Changes to the minimum temperature trend are more dramatic: a slight cooling trend of 0.3 degrees Celsius in the historic dataset was changed to warming of 1.9 degrees in ACORN-SAT Version 2 for Rutherglen.
There is much more detail concerning temperatures at Rutherglen and surrounding stations in a report I wrote a few years ago, that can be downloaded by clicking here.
The number of weather stations with electronic probes has slowly increased since 1996, and the Bureau now has a network of about 700, referred to as automatic weather stations (AWS). In a report released in September 2017 it acknowledged issues with the performance of just two of these: Goulburn Airport (Goulburn) and Thredbo Top Station (Thredbo). These are the same two weather stations that I had reported, in blog posts on 5th and 18th July 2017 respectively, as not recording temperatures below minus 10 degrees.
While the Bureau strenuously denied it was setting limits, Minister Josh Frydenberg nevertheless insisted on a review of the entire AWS network.
When the report was published the Bureau’s investigations confirmed that Goulburn and Thredbo were the only sites where temperature records had been affected by the inability of some Bureau AWS to read low temperatures. What are the chances? Of the nearly 700 weather stations, I stumbled across the only two with problems.
Goulburn was discovered because my friend Lance Pidgeon lives nearby and was up early on the morning of 2 July, concerned his pipes were going to freeze and burst. He watched the live AWS temperature readings tick over at that weather station and let me know when the July record of minus 10.4 was reached, only to see it rounded up to minus 10.0.
Thredbo was discovered because, after making a fuss about Goulburn, I wanted to check that the Bureau had lifted the limits on readings below minus 10. So, two weeks later I decided to get up early and watch the one-second reading at one of the stations in the snow fields on the Sunday morning of 16th July thinking it might be a cold morning. Why did I choose Thredbo – of all the weather stations in the Australian Alps? Simply because my school friend Diane Ainsworth died in the landslide there twenty years ago.
I wrote this all up some years ago at my blog, and it was republished by The Spectator Australia online; you can read more by clicking here.
When the Australian Bureau of Meteorology announces how much hotter last year was relative to earlier years, which it usually does at the beginning of each year, a reasonable person might assume it had simply added up and averaged recorded temperature measurements, perhaps with an area weighting. But it is much more complicated than that. In an article in the Weekend Australian of January 22-23 entitled ‘BoM cools the past, warms present’, Graham Lloyd explains that the Bureau has remodelled Australia’s official temperature record for a third time in nine years. He wrote this the week after announcing to readers of The Australian that the Bureau had published the data for 2021, when it hadn’t.
The Bureau is not saying how much the most recent remodelling (December 2020) has warmed the overall trend. It acknowledged back in November 2018 that the second remodelling added 23% to the overall warming trend.
The Bureau claims the changes are necessary because of changes in the location of recording equipment, or because of abrupt warming or cooling relative to other sites in the region. The article concludes with assurances from the Bureau that what it does is world’s best practice.
In June 2014 I gave a presentation to the Sydney Institute entitled ‘Modelling Australian and Global Temperatures: What’s Wrong? Bourke and Amberley as Case Studies’. I chose Amberley because it is a military base with temperatures recorded by military personnel, at the same location since 1941. My analysis was of the ACORN-SAT version 1 dataset for this location, for the period to 2013.
At Amberley the historic minimum temperatures showed cooling at a rate of about 1 degree Celsius per century from 1970. The Bureau changed this to warming: it first determined that there were two statistical discontinuities in the data, in 1980 and 1996, and to correct for these it changed all the historical temperatures from 1996 back to 1941, creating a warming trend of 2.5 degrees Celsius per century, a combined absolute temperature change of 1.5 degrees Celsius. This is a very large adjustment.
According to various peer-reviewed papers, and technical reports, homogenization is a technique that enables non-climatic factors to be eliminated from temperature series. It is often done when there is a site change (for example from a post office to an airport), or equipment change (from a Glaisher Stand to a Stevenson screen). But at Amberley neither of these criteria can be applied. The temperatures have been recorded at the same well-maintained site within the perimeter of the air force base since 1941.
My criticisms were published in Issue 26 of The Sydney Papers Online. The Bureau has a policy of not responding to my enquiries, submissions, or peer-reviewed journal articles. Interestingly, however, Gavin Schmidt, then the new Director of NASA’s Goddard Institute for Space Studies in New York, came to the Bureau’s defence – on Twitter.
Dr Schmidt was quite blunt about what had been done to the Amberley minimum temperature series, ‘@jennmarohasy There is an inhomogenity detected (~1980) and based on continuity w/nearby stations it is corrected. #notrocketscience’.
When I sought clarification regarding what was meant by “nearby” stations I was provided with a link to a list of 310 localities used by climate scientists at Berkeley when homogenizing the Amberley data. The inclusion of Berkeley scientists was perhaps to make the point that all the key institutions working on temperature series (the Australian Bureau, NASA, and also scientists at Berkeley) appreciated the need to adjust up the temperatures at Amberley.
But these 310 ‘nearby’ stations stretch to a radius of 974 kilometres and include Frederick Reef in the Coral Sea, Quilpie post office and even Bourke post office.
Considering the unhomogenized data for the six nearest stations that are part of the Bureau’s ACORN-SAT network (old Brisbane aero, Cape Moreton Lighthouse, Gayndah post office, Bundaberg post office, Miles post office and Yamba pilot station), the Bureau’s jump-up for Amberley creates an increase in the official temperature trend of 0.75 degrees Celsius per century. Old Brisbane aero, the closest station that is also part of the ACORN-SAT network, also shows a long-term cooling trend. Indeed, perhaps the cooling at Amberley is real. Why not consider this, particularly in the absence of real physical evidence to the contrary?
In the Twitter conversation with Dr Schmidt we suggested it was nonsense to use temperature data from radically different climatic zones to homogenize Amberley, and repeated our original question asking why it was necessary to change the original temperature record in the first place. Dr Schmidt replied, ‘@jennmarohasy Your question is ill-posed. No-one changed the trend directly. Instead procedures correct for a detected jump around ~1980.’
If Twitter had been around when George Orwell was writing his dystopian novel Nineteen Eighty-Four, I wonder whether he might have borrowed some text from Dr Schmidt’s tweets, particularly where phrases like ‘procedures correct’ refer to mathematical algorithms reaching out to ‘nearby’ locations across the Coral Sea and beyond the Great Dividing Range, to change what was a mild cooling trend into dramatic warming for an otherwise politically incorrect temperature series.
There is more detail in the article published by the Sydney Institute following the presentation I gave there a few years ago; you can read it by clicking here.
The feature image, at the very top of this blog post, shows me at the Goulburn weather station in August a few years back.
Are any ground station “records” fit for purpose?
If they were interested in accurate data they would convert minute averages into hourly averages then into daily averages. The only reason for not doing this is because they want bad data.
That’s just one ploy in their bag of tricks, but an important one.
only takes 1 to stuff the data
Nothing is “fit for purpose” if it refutes your ideology.
Conversely, everything is fit for purpose if it supports your ideology.
As usual, Loydo doesn’t even bother dealing with the facts presented, just throws out insults and slinks away.
Is UAH data fit for purpose? Please ‘splain.
Yes, they are. They show warming well within natural variability. Nothing to worry about.
I’m surprised that you think UAH is fit for purpose since it shows that nothing unusual or dangerous is happening.
How does that trend from the height of 20th century warmth in 1979 in any way justify destroying industrial civilization, which supports eight billion rather than one billion people?
I pointed out two closely correlated trends.
What do you think Jen? Pretty close? And what does it mean? That the BOM and a strikingly different methodology come up with the same trend? And which also happens to be a cool outsider? Pretty close? Or not really that close? Or close-ish but not immediately proximate in any extraordinary way?
No matter, obfuscating the glaring warming trends seems precisely what this post is about and frankly, what all your posts are about.
Presenting a single line for “global temperature” is a fool’s game.
And Loydo is just the fool to try.
Once you cook the books enough, you can get them to say whatever you want.
BTW, it’s been warming since around 1850. Prior to 1950, the warming could not have been caused by CO2, and since 1950 the rate of warming has not increased.
Just because it has warmed a little bit is not evidence that CO2 is the cause, nor is it evidence that the warming is a problem.
Both CO2 and the warming we have seen since 1850 are entirely beneficial. We still have several degrees to go before we get back to the temperatures the world enjoyed during the Holocene Optimum.
Flat temperature from 1997 to 2016 and cooling since then refutes your antiscientific faith, as of course does cooling from 1945 to 1977 under rising CO2.
I vote for more global warming as Perth clocked up 5 days in a row over 40 degrees C for the first time ever. We spent more time at the beaches than usual, and other than the odd shark alert it was the best summer ever.
I do not know your data source, but Perth exceeded 5 consecutive days over 40 deg C in each of these years:
1956 1933 1961 1933 1933 1961 1965 1956 1961 1961 1933 1956 1964 2016 1978 1980 1991 1991 2016 2007 1980 1962 1956 1989 1985 1961 2003 1961 1964 1985 1989 2007 1968 1975
So, WA, like the US, was hotter in 1933 than 2021.
Yes, the 1930’s were just as warm as today in Australia, too. All over the world, actually. If I had access to my Tmax charts, I would demonstrate.
Bob Tisdale, do you have a link to your Tmax charts on your website. I’ve been looking on your website but cannot find the right page.
Bob has Tmax charts showing that it was just as warm in the Early Twentieth Century as it is today, and these Tmax charts represent nations from all over the world, and both hemispheres including Australia.
My copies of the Tmax charts are on a harddisk I can’t access at present.
The Temperature Data Mannipulators are just as busy in Australia as elsewhere. They all have a similar goal. They are all lying to us in order to accomplish that political goal of demonizing CO2.
This is what I have seen living in Perth
The BOM and Australian media channels are incompetent and live on the east coast – they wouldn’t have a clue, and only want our Goods and Services Tax money for their losers
That was the ABC story; whether it was accurate or not I didn’t check, as too much time down the beach 🙂
You looked out your window; good for you, and say 10 or 15 in a row would be ok too? Can you see outside of a western, a/ced, beached city like Perth?
The green inner city greentard does a greta … how dare you 🙂
Loydodo projecting again
you do realize the earth is still in an inter glacial period, correct?
The BoM’s 7-day forecasts in Brisbane always run hot and rarely do they match actuals … you have to be quick to check the actuals before they ‘homogenise’ them for publication the following day.
“Those who ~~vote~~ temperature stations decide nothing. Those who ~~count the vote~~ select the temperature stations decide everything.”
— Zombie Joseph Stalin
seriously doubt MY local station is even close to correct
in 14+ yrs we’ve had the barometer work 2x briefly
if it works? the temp readings vanish
rainfall reported there is seriously UNDER what multiple locals record
There are just questions it is not polite to ask about how one supports The Narrative.
Oh what a tangled web they weave. All for settled science?
Precisely. They are trying Far Too Hard to find the trapped heat.
Those ‘once-a-second’ thermometers tell as much.
In many ways similar to Germany’s Engywendy
Why. What has Australia got to feel guilty about – what are they trying to prove?
It’s obvious that in their heart-of-hearts they know it is Junk Science.
Hence also the ‘Centres Of Excellence’ that are looking for the trapped heat – they are trying to convince/brainwash themselves as much as everyone else.
do you laugh or cry
I thought it was the “hidden heat!”
And now the virtuous EU are crying out for our cheap gas, You reap what you sow morons, Enjoy the cold.
Settled science is social science.
The “scientists” said to have settled climate science were exposed just before the IPCC Copenhagen Conference, when hacked emails, released in two batches dubbed “Climategate 1 & 2”, showed them discussing how to create a warming-trend computer model to support the climate hoax, as compared with natural climate change over time since the beginning.
How many people remember “Kiwigate”?
The New Zealand Met Service’s surface temperature data record showed no discernible warming during the last century. But that didn’t stop New Zealand climate scientist Jim Salinger (once a lead author of the IPCC, and a former employee at Britain’s CRU, the centre of the “Climategate” scandal) from creating a global warming trend in New Zealand that was used by the IPCC.
Fortunately, “Kiwigate” was exposed and Salinger was fired.
Better get the story straight, lest legal action be taken. Salinger was not sacked for fiddling data, but for breaching NIWA’s rules regarding contact with the media, something he had been doing for many years. It is described in a Wikipedia entry: https://en.wikipedia.org/wiki/Jim_Salinger#Employment_issue
NIWA was sued by the NZ Climate Science Coalition. The Coalition claimed that the organisation had used a methodology to adjust historic temperature data that was not in line with received scientific opinion. The Coalition lost the case and the subsequent appeal, had costs awarded against them, and had to go into liquidation.
They had a weak case as scientists are free to use whatever methodology they think is appropriate. Mandated methodology comes from legal requirements, especially in medical science and much of chemistry.
I think NIWA is wrong, but it was unintentional. A 1968 student thesis showed conclusively the cause of any rise (and there wasn’t much) was due to the urban heat island effect.
That is why I accept the official data sets and analyse them. There is plenty of evidence even in these, and even if they have been “adjusted”, to show that global warming is coming to an end. After all, the climate changes, it always changes.
Can you explain for us casual observers what the problem is with homogenization? I feel like I have a vague understanding, but how does it create bad data, if that is the case?
My understanding is that science is about observation, confirmation, accuracy, precision and replication of results.
In layman’s terms –
“you can’t make sh1t up as you go along according to your own rules and call it observed & recorded data”
If the raw data reads high or low it will continue to reflect increases or decreases and so is useful, if not indispensable. Once homoed it corrupts the historic record and also the raw data has a tendency to disappear once homoed data appears. Plus dozens of other things.
Assumption 1: Temperature profiles are nearly sine waves.
Thus the temperature readings at two different stations are related by:
Station 1 – sin(t)
Station 2 – sin(t + a)
The correlation of the two sine waves turns out to be cos(a).
“a” is affected by distance, elevation, pressure, and humidity plus probably other variables such as terrain, nearby bodies of water, and ground cover under the stations.
A distance of about 50 miles results in a correlation factor of .8 or less. This is typically low enough that most people do not consider it representative of useful correlation.
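That cos(a) result is easy to verify numerically: sample both sine waves over a full period and take the Pearson correlation. A quick sketch in Python:

```python
import numpy as np

# Pearson correlation of sin(t) and sin(t + a) over one full period equals cos(a).
t = np.linspace(0, 2 * np.pi, 10_000, endpoint=False)
for a in (0.0, 0.5, 1.0, np.pi / 2):
    r = np.corrcoef(np.sin(t), np.sin(t + a))[0, 1]
    print(f"a = {a:.3f}: correlation = {r:.4f}, cos(a) = {np.cos(a):.4f}")
```

A correlation of 0.8 corresponds to a phase offset a of about 0.64 radians (arccos of 0.8).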
Trying to homogenize non-correlated (or at least only lightly correlated) temperatures leads to nothing but mis-applied temperature corrections to stations being homogenized. It is certain to cause mis-identification of stations that are “wrong”. Hubbard and Lin did a study in 2002 on the application of corrections to measuring stations and their conclusion was that any type of correction should only be done on a station-by-station basis. Even such a difference in micro-climate as one station having green grass under it while another has pea gravel can result in a difference in temperature readings between stations.
The only *accurate* way to identify a discontinuity in temperature readings at a station is by analyzing the actual station readings – and that is nearly impossible, since it is difficult to separate drift in the measurement device from actual climate change.
Ms. Marohasy’s statement of “According to Bureau policy, when such a major equipment change occurs there should be at least three years (preferably five) of overlapping/parallel temperature recordings. ” is on point. It should, however, also require that both stations be compared to a recently calibrated third measurement device in order to determine the accuracy of each of the two stations. Assuming that the newest device is the most accurate is not a justified assumption.
It’s a fair question, and I wanted to hear the author’s explanation.
Thanks Tom. I explain in some detail in the article at this link:
If you put ‘Rutherglen’ into the search engine at my site you will find a lot more. Including how they claim Rutherglen does not move in synchrony with other stations in that region … but it does, perfectly, except they ‘homogenised’ the other stations and then compared it to Rutherglen. I kid you not. Search Rutherglen and ‘Benalla’ or Rutherglen and ‘Deniliquin’ at my site.
To consider just one example of a change forced by homogenisation, take January 1939. That was when a terrible bushfire, known as the Black Friday bushfire, ravaged the state of Victoria. The actual recorded minimum temperature on the hottest day at Rutherglen was 28.3 degrees Celsius. This temperature was changed in 2011 to 27.8, and then with the second lot of revisions, in 2018, to 25.7. I haven’t checked what it is in the latest iteration (ACORN-SAT 2.2).
Anyone wanting to, say, model past temperatures during extreme events such as bushfires will not get accurate data from the official Australian record. They will be out by a few degrees.
A reason that Rutherglen shows continual cooling in its minimum temperatures over the 20th century is because it is an area that saw the development of major irrigation schemes. Water cools the landscape. We can see this in the Rutherglen record, before homogenisation.
And to be clear, the temperatures at Rutherglen until they changed to the probe on 29th January 1998, were measured using standard equipment and in the same paddock at an agricultural research station.
Thanks for the explain and the link.
14 January 1939 minimum at Rutherglen was : RAW 28.3C / ACORN 1 27.8C / ACORN 2 25.7C / ACORN 2.1 25.7C / ACORN 2.2 26.0C.
11 January 1939 minimum was a slightly warmer night at Rutherglen : RAW 28.8C / ACORN 1 28.3C / ACORN 2 26.2C / ACORN 2.1 26.2C / ACORN 2.2 26.5C.
For all of January 1939 averaged at Rutherglen, ACORN 2.2 added 0.3C warmth to ACORN 2.1 : RAW 18.0C / ACORN 1 16.7C / ACORN 2 15.2C / ACORN 2.1 15.2C / ACORN 2.2 15.5C.
ACORN 2.2 added about 0.35C warming to all ACORN 2.1 minima at Rutherglen before 2016 and back to 1913. There were supposedly NIL changes to maxima in ACORN 2.2 but analysis shows some twiddling – e.g. 1913-1922 max in ACORN 2.1 averages 21.32C while in ACORN 2.2 the average is 21.35C for Rutherglen (RAW 21.84C).
With Rutherglen’s installation of an AWS temperature probe in 1998, the station’s average max in 1988-97 was 21.45C and in 1999-2008 it was 22.40C – a 0.95C warming. Min only increased 0.03C.
Example: February average rainfall 1988-97 = 31.4mm, 1999-2008 = 36.3mm. February average maximum 1988-1997 = 30.37C, 1999-2008 = 30.95C.
Rutherglen’s temperature history has been battered and bruised by a combination of AWS installation and ACORN adjustments.
Overall at the 104 non-urban ACORN stations used by the BoM to calculate Australia’s national temperature trends, and referenced to 2010-19, ACORN 2.2 cooled 1910-19 by 0.06C max and 0.11C min when compared to ACORN 2.1.
With regard to 2021 being Australia’s 19th warmest year since records began in 1910, absolute average annual temperatures based on the ACORN 2.2 daily dataset show the mean 1961-90 temperature at the 104 non-urban stations was 18.93C in ACORN 2.2 and 19.15C in RAW.
The 1910 absolute mean temperature was 18.73C in ACORN 2.2 and 19.26C in RAW averaged at the 104 stations. The 2021 absolute mean temperature at the 104 stations was 19.48C in RAW with ACORN 2.2 likely to be exactly the same (note difference of absolute 18.93C in 1961-90 and 19.48C in 2021 is 0.55C, tying in with the official 0.56C anomaly for 2021).
The 104 station data shows 2021 was 0.75C warmer than 1910 in adjusted ACORN 2.2, and 0.22C warmer in unadjusted RAW.
A full analysis of RAW and all ACORN datasets up to v2.2 at all 112 ACORN stations is at http://www.waclimate.net/acorn2/index.html
To disprove the analysis, the BoM should release preferably all years of absolute mean temperatures in both ACORN 2.1 and ACORN 2.2 so they can be compared.
If not, the BoM should at least publish the absolute annual 1910 v 2021 comparison in its 2021 Annual Climate Statement to correct the mistaken media/academic/commentator/political belief that Australia has warmed 1.44C since 1910 (which was the 1910 v 2019 warming).
The most common method of homogenization is to replace missing data. To do this they will examine stations that are “near” the station with the missing data. They will examine trends in the nearby stations. If they find that on average the nearby station is 1 degree warmer, then they conclude that on the day the data is missing, the nearby stations were likely 1 degree warmer as well. So they take the nearby data, subtract 1 degree, and put that in to replace the missing data.
Of course nearby can be any station within 500 to 600 kilometers.
The exact method of determining the average for the nearby stations, and of using that average to replace the missing data, varies from one researcher to another, and to a large degree is considered a trade secret and not shared.
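As a sketch of the general idea only (as noted, the exact algorithms vary between researchers and are mostly unpublished), the offset-and-infill step might look like this, with made-up readings:

```python
def infill(target, neighbour):
    """Fill gaps (None) in `target` using `neighbour`'s readings,
    shifted by the mean offset computed on days where both stations
    have data."""
    pairs = [(t, n) for t, n in zip(target, neighbour) if t is not None]
    offset = sum(n - t for t, n in pairs) / len(pairs)   # neighbour runs this much warmer
    return [n - offset if t is None else t for t, n in zip(target, neighbour)]

target    = [15.0, 16.0, None, 14.0]
neighbour = [16.0, 17.0, 18.0, 15.0]   # consistently about 1 degree warmer
print(infill(target, neighbour))       # the gap is filled with 18.0 - 1.0 = 17.0
```

The filled value inherits whatever the neighbour happened to record, minus the estimated offset, which is why the choice of “nearby” stations matters so much.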
I thought 1200 km, but that might be diameter rather than radius. However in polar regions, the radius could be that great.
Please correct if wrong.
I’m working from memory so you have as good a chance of being right as I do. I wanted to keep it on the low side so that the usual suspects wouldn’t be able to use that as a distraction.
“Within 1200 km of a reporting station”:
On an airline, it’s 1339 km from Churchill, Manitoba to Duluth, MN. From Tucson to Boise is 1352 km. Las Vegas to Portland, 1227 km.
A little research tells me it probably is not what I thought. It seems to be more a process of identifying discontinuities or step changes in the data that mean something changed with the instrument or its surroundings. The homogenization tries to remove the discontinuities, perhaps by looking at nearby temperature records, or by some other means. Temperature readings tend to err on the high side (although not for sea surface temperatures), so to the extent that there are more of these “discontinuities” in older data, which seems logical for land-based instruments, homogenization would tend to increase the long-term trend. Not sure it’s a very big deal, since these kinds of adjustments will occur less in more recent data.
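The step-change idea described above can be illustrated with a crude detector (a sketch only; operational pairwise homogenisation schemes are far more elaborate, and the series below is made up):

```python
def find_step(series, window=3):
    """Return the index with the largest jump between the means of the
    readings just before and just after it, a crude stand-in for the
    'discontinuity' detection described above."""
    best_i, best_jump = None, 0.0
    for i in range(window, len(series) - window + 1):
        before = sum(series[i - window:i]) / window
        after = sum(series[i:i + window]) / window
        if abs(after - before) > abs(best_jump):
            best_i, best_jump = i, after - before
    return best_i, best_jump

# Toy series: a 0.8-degree step after index 4 (e.g. a screen replacement).
series = [15.0, 15.2, 14.9, 15.1, 15.0, 15.8, 15.9, 15.7, 16.0, 15.8]
idx, jump = find_step(series)
print(idx, round(jump, 2))  # 5 0.8
```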
I cannot imagine, as a scientist, disappearing actual older data, which seems to be a marker for misconduct.
Ahhh, but that’s not so. Now that they have modern instrumentation, they merrily add degrees to the latest temperatures.
Discontinuities and step changes occur at specific times when specific changes are made. The offset that results should be a set value until the next step change is identified. They are too lazy, or it is too difficult, to actually identify what changes occurred at each location, so they create an algorithm to homogenise the data. So on day X the alteration value might be 0.5° and on day X+1 might be -0.25°. It isn’t done the correct way.
It appears that you thought you knew the answer before you asked the question. That might explain why you received so many down votes for a question. Your reputation precedes you.
You said, “Temperature readings tend to error [sic] on the high side.” Do you have a citation or at least a reasonable explanation for that?
How do you differentiate between a change in the station measurement device and a change in the actual climate at that location?
If something changed in the surroundings then why is an adjustment justified? The measurements at a station are supposed to be indicative of what is happening at that station. If someone built a pond upwind of the station then how do you determine what adjustment is needed?
As I posted in a different message, homogenization is a losing battle because cos(a), the correlation between the temperatures at two different stations, decreases as the differences in distance, elevation, humidity, terrain, ground cover, etc. increase.
You are trying to deal with a historic record and you think it’s okay to change the historic data where you can’t test and check the error. Mate, I think you lack any understanding at all 🙂
“Not sure it’s a very big deal since these kinds of adjustments will occur less in more recent data.”
Here is an example of how the Temperature Data Mannipulators have changed the temperature profile of the Earth:
The actual temperature profile of the Earth is on the left on the webpage (Hansen 1999, U.S. chart). The Data mannipulators have changed the profile on the left, into the profile of the bastardized Hockey Stick chart on the right of the webpage. The profiles are radically different, as you can see.
The chart on the left shows there is no unprecedented warming today since it was just as warm in the 1930’s. The chart on the right shows a lot of unprecedented warming. The chart on the left is taken from actual temperature readings made by human beings. The chart on the right is a bastardization of the temperature record that comes out of a computer directed by dishonest people who are making an effort to demonize CO2, and this is how they do it.
The Hockey Stick chart and the claim we are experiencing unprecedented warming today because of CO2 is a BIG LIE perpetrated by the Temperature Data Mannipulators. Their only “evidence” for these claims is this bastardized Hockey Stick chart which they made up out of whole cloth in the computers. The Hockey Stick chart does not represent the real world.
I keep reading Tom.1’s posts and it is quite clear to me that he is a Troll who acts like a layman. He makes questioning statements like he is trying to learn something and then comes back after with a programmed response.
My suggestion to all would be stop feeding the Troll; it is most likely just a new name for an old Troll.
What’s “homogenization” you say[, Tom Dot1]? Some kind of dairy product treatment?
Well no, not quite. It is data that has been put through a series of processes so that the end result is like comparing the temperatures of several bowls of water that have been mixed together, then poured back into the original bowls, with the temperature of each measured again. What you get is an end temperature for each bowl that is a mixture of the other nearby bowl temperatures. ***
Admittedly, raw data can have its own problems, but there are ways my friends and I at the Pielke research team can make valid station trend comparisons without making numerical adjustments to the actual raw data. ***
(Source: https://wattsupwiththat.com/2009/07/30/on-climate-comedy-copyrights-and-cinematography/ )
Investigation of Methods for Hydroclimatic Data Homogenization, Steirou, E., and D. Koutsoyiannis, European Geosciences Union General Assembly 2012, Geophysical Research Abstracts, Vol. 14, Vienna, 956-1, European Geosciences Union (2012). —
‘We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. … From the global database, GHCN-Monthly Version 2, we examined all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. … in the two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends. ***
1. Homogenization is necessary to remove errors introduced into climatic time series.
2. Homogenization practices used until today are mainly statistical, not well-justified by experiments and rarely supported by metadata. …
3.  Homogenization is expected to increase or decrease the existing multiyear trends in equal proportions[, however,] in 2/3 of the cases, the trends increased after homogenization.
4. The above results cast some doubts on use of homogenization procedures and … indicate that the global temperature increase during the last century is smaller than 0.7 – 0.8˚ C.
(Edited by me for readability)
(Source: https://wattsupwiththat.com/2012/07/17/new-paper-blames-about-half-of-global-warming-on-weather-station-data-homgenization/ )
I gave a detailed explanation with several examples from around the world in essay When Data Isn’t in ebook Blowing Smoke.
An actual technique doesn’t create bad data unless misapplied. This thing that is called ‘homogenization’ is a bespoke technique made up by climate scientists with an agenda. That alone isn’t the evidence that it is bad; the evidence is the stupid results that are spat out.
The problem for the believers is that it isn’t enough to fudge temperature records for the northern hemisphere. It must be repeated in the southern hemisphere, and guess what – most of the measurements in the SH are in Australia, although there has been some tampering in one of the South American countries. If those of us in business homogenised our financial records the same way as BoM do to the temperature records, we would end up in jail.
I was aware of the scam over Amberley, as I had collected a large quantity of Amberley records in order to advise a customer nearby on CRD (climate-responsive design) issues. The raw figures show a decline in average temperature.
“The raw figures show a decline in average temperature.”
“Raw” (actual temperature readings) figures are what should be used. Using anything else is a con job.
Casual observer? You? Bullshit!!
For the instrument era temperature data, the Temperature Data Mannipulators take the actual temperature readings, which show it was just as warm in the Early Twentieth Century as it is today, and put this data through their computers, and what comes out says it was much cooler in the Early Twentieth Century than it is today. This allows the Data Mannipulators to claim that the Earth is experiencing unprecedented warming today and that this warming is caused by CO2.
But they couldn’t say that if the Early Twentieth Century was just as warm as today without the benefit of current CO2 levels, which means there is no unprecedented warming today and also means that CO2 has little to do with affecting the Earth’s temperatures.
So in order to demonize CO2, the Temperature Data Mannipulators “homogenize” the data so it fits their CO2 narrative.
They couldn’t do that if they just went by the actual temperature data that was written down by human beings over the years, so they get their computers and mannipulate the data to make it fit their Human-caused Climate Change meme.
Actual temperature readings put the lie to the Human-caused Climate Change narrative. That’s why the Temperature Data Mannipulators change the temperature profile using their computers.
Casual observers 😉
Seismometers in Australia must surely be buzzing with the vibrations caused by long deceased weather station personnel spinning as their carefully recorded data is tossed on the bonfires of politically correct climate science.
I am particularly surprised to find out that each station is custom built.
The idea that you can build a climate database when each station is unique is something that is so dumb that only a climate scientist could come up with it.
This is also further proof that the claims that they can measure the “average” temperature to within a few thousandths of a degree are nothing but pure fantasy.
To use multiple readings to improve precision the first and most basic requirement is that you need to measure the same thing, using the same thing.
Using a thousand identical sensors to measure a thousand different patches of air already violates both of those requirements. But to find out that the sensors are not in fact identical blows their claims so far out of the water that you will need to use low flying planes to find it.
If we were talking about actual science, that fact would be surprising. But in “climate science” (TM), just SOP.
Good post re. Australian data tampering and climate modeling (redundant, I know). Expecting the antipodal CAGW support team any moment now…
I can just hear Nick proclaiming that as long as the sensor records what you were expecting to see, that it must be good.
I would actually like to hear Nick’s take on this.
IMO, hard to find fault with JM’s objections, but if anyone could, it would be our Nick.
BTW, a super hotty turning the Big 6-0 next year:
Which as we all know is the new 5-0.
Who lectures in pearls and a LBD. Prepared for the post-function cocktail parties. OK, maybe a medium black dress, for the formal, academic occasion.
Nick is clutching his pearls.
I kind of agree, I would like to hear what Nick would have to say on the subject as well, he has kind of disappeared for a while. I wonder if he has finally come to the conclusion that he is a sceptic after all.
The National Institute of Water and Atmospheric Research, NIWA, has been fiddling with the New Zealand temperature record too, always tending to show an upward trend. When the record was challenged, the Australian Bureau of Meteorology was asked to do a review. NIWA refused to release the paper. I think it is still not on the public record. Good work Ms Marohasy, looking forward to your book. A nice photo.
Thanks Peter. But I don’t know why you can’t refer to me as either Jen, Jennifer or Dr Marohasy. Please. I get tired of the ‘Ms Marohasy’.
My apology. I should have used the honorific Dr.
I was told years ago that when a person insists on being referred to as Ms something is missing in their life and mind.
$9.2 Trillion Per Year To Save The World
Multiply that by about 50 because the report was done by idiots who assumed the most crazy stats for green technology.
Would love to see him behind bars.
She actually cited “Don’t Look Up” as a film that is analogous to what these shrieking idiots call the climate crisis. LOL. It’s at that point that any credibility is out the window. Assuming there was any in the first place.
If they are unfit in Australia, temps will be estimated by homogenization between “fit” stations in South America and Africa, then published in many places as “data”. This will be useful in future to check if the parameters of future models fit past data. /s
So what we have are tamperature measurements.
Tamperatures, period. No actual measurements need apply.
Do you record tamperatures monthly?
What difference does it make?
They’re all tampered with.
Please don’t throw a tamper tantrum.
Shameless. Funny, but shameless.
Do it for one second and it will be a new record on the BOM digital censors as a new century catastrophic high.
Crimate change has a well paying agenda . Not just Aussie but most of the western societies.
I never trust anyone to fiddle with data. If it is necessary to adjust, an explanation must be included along with the raw data.
As soon as you fiddle with recorded measurements, you are dealing with numerical constructs derived through assumptions, not “data”, “observations”, “facts”, “realities”.
If an explanation is required then a whole new data set should be started. You should not try to adjust the old data to match the new data. It’s too bad if that affects the length of the records available but that is far better than just making it up out of nothing.
It’s like the Indiana Jones movie where he says “Bad data”.
Does anyone know the mass of the new temperature probes compared to the big blob of mercury and glass making up the old thermometers? A thermistor can be a tiny component standing above a circuit board on a long pair of pigtails, or it could be encased in a big blob of epoxy or something to increase its thermal mass. Honest scientists would match the physical mass of the sensors to ensure identical response to temperature change.
Let alone comparing temperatures recorded with mercury thermometers vs. those digitally, with a nearby electrical power source.
Well, honest scientists would match the response curves of the sensors to ensure identical response to identical inputs. This may or may not require identical masses. It’s important to keep your eye on what really matters.
Computers can average readings over any time period, but keeping the thermal mass of measuring devices the same is probably a good starting point.
Thermal mass is especially important if you are going to measure temperatures every second.
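The point about one-second sampling can be simulated crudely (a sketch; the noise level below is an assumption standing in for a fast, low-mass probe's fluctuations, not a measured characteristic of the BoM probes):

```python
import random

random.seed(1)

# Sixty one-second readings around a true air temperature of 30.0 C,
# with random noise standing in for rapid fluctuations seen by a
# low-thermal-mass probe.
true_temp = 30.0
samples = [true_temp + random.uniform(-0.3, 0.3) for _ in range(60)]

one_second_max = max(samples)                  # what a 1-second extremum records
one_minute_avg = sum(samples) / len(samples)   # what 1-minute averaging records

# The 1-second maximum can only be >= the 1-minute average, so the
# extremum method has a built-in warm bias for the daily maximum.
print(round(one_second_max - one_minute_avg, 2))
```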
It’s not just the sensor, it is the whole measuring station that needs to be standardized. While the Argo floats use calibrated thermistors (calibrated at least initially), the actual uncertainty of the temperature measured by the float is +/- 0.5C. Not much better than a LIG thermometer. Anything that changes the water flow rate or salinity of the samples measured by the thermistor, e.g. a barnacle in the water intake or output, will affect the temperature measurement.
BOM seem to do many strange things. Mildura had a weekend of 123 F and 124 F in 1906. They, reasonably, insisted that it was read in the shade but not a Stevenson Screen so was higher than it would be with the more modern equipment.
When they did a study to correct it down over 4 C, they used a comparable site, Deniliquin, over 300 km east of Mildura. During that month, they differed a lot, from D being warmer than M to being 14 C cooler. How it could possibly be considered to be comparable is beyond me.
Despite a good reason to think that it was too high because of equipment, they ignored the second hot day completely even though it was reported in newspapers. They also converted the 123 F and 124 F to 50.1 C and 50.7 C, or 122.18 F and 123.26 F. Basically, they assumed that the readings were rounded up from 122.5 and 123.5 and, because the precision was 0.25 F, used the lowest possible values in C that could get reported as 123 F and 124 F.
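The Fahrenheit-to-Celsius arithmetic in that adjustment can be checked directly (a sketch of the conversions quoted above, not the BoM's actual procedure):

```python
def f_to_c(f):
    return (f - 32) * 5 / 9

def c_to_f(c):
    return c * 9 / 5 + 32

# The published conversions: 123 F became 50.1 C and 124 F became 50.7 C.
print(round(c_to_f(50.1), 2))  # 122.18, well below the recorded 123 F
print(round(c_to_f(50.7), 2))  # 123.26, well below the recorded 124 F

# A straight conversion of the recorded values gives higher figures:
print(round(f_to_c(123), 1))   # 50.6
print(round(f_to_c(124), 1))   # 51.1
```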
They are clearly looking for any reason to cool the past and warm the present, but ignoring any reason to adjust temperatures the other way.
Another example from the late 1800s: a BoM weather station was located at the Bourke NSW Post Office, and during days of heatwave conditions the operator responsible for recording the data decided to check it on a Sunday, not normally done on that day, and one of the hottest days on record was recorded there. BoM ignores that data and only uses record data after 1910.
Using data from different stations, no matter where they are, is called lying.
Dr. Marohasy, I definitely will be buying your forthcoming book if available in ebook form. Long ago ran out of real book shelf space at both homes. Your Rutherglen research analysis lives on (with text credits and live footnotes to your site) in essay When Data Isn’t in ebook Blowing Smoke.
What is being described is a special case of ‘averaging’ that should more properly be called the “mid-range.” It is really just the midpoint of two numbers: equidistant between the two extremes, and therefore strongly affected by error in either of the two measurements. It lacks the redeeming feature of a true arithmetic mean over many readings, namely the redundancy that reduces sensitivity to singular errors; also, any calculation of a standard deviation is meaningless. Therefore, one cannot say anything about the probability of observations from a population.
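The mid-range's sensitivity to a single bad extreme is easy to demonstrate (a sketch with made-up readings):

```python
def midrange(values):
    """The 'average' used for daily temperature: halfway between extremes."""
    return (max(values) + min(values)) / 2

# Hourly-style readings for a day (invented for illustration).
readings = [14.0, 13.5, 15.0, 18.0, 21.0, 22.0, 20.0, 16.0]

true_mean = sum(readings) / len(readings)
print(round(midrange(readings), 2), round(true_mean, 2))

# One spurious spike moves the mid-range by half the error size,
# while the arithmetic mean over all readings barely moves.
spiked = readings + [30.0]  # a single reading 8 degrees too high
print(round(midrange(spiked), 2), round(sum(spiked) / len(spiked), 2))
```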
A little OT but here’s an interview with John Cook, by YouTuber Mallen Baker. It’s quite revealing. Cook comes over as a small minded hate-filled person, for whom the underlying motivation of his whole “debunking” campaign is hatred of his own father-in-law.
His unexceptional intellect and lack of any scientific background at all make him highly susceptible to alarmist pseudoscience, of which he has become an obsessive devotee. He is a curious example of a real-time self-debunker with practically everything he says.
I’m not going to torture myself by watching the video, but he always seemed to me as a “hanger on”. Someone who attempts to garner influence by being a lapdog to people more intelligent that he.
Jennifer Marohasy and colleagues have monitored various BoM automatic weather stations and discovered that the temperature readings are often not reported accurately: for example, 29.2 C actual might become 29.9 C reported. Obviously, spread over the network of weather stations, the impression of a temperature rise trend can be created over time.
They also noted where weather stations were located in or near heat sinks: roads, airport infrastructure and others. One of the original locations, at the original Observatory now close to the Sydney Harbour Bridge and alongside the motorway for traffic heading north onto the bridge, was open parkland but now has the motorway and buildings not far away.
Recent headlines seen like “Australia has provisionally just had its joint hottest day on record” regarding the 13th Jan 2022.
3rd January 1909, Bourke 51.7°C (125°F) & Brewarrina 50.6°C (123°F).
What did the rest of Australian weather stations record for min/max/avg on Thurs 13th Jan 2022 compared to historical records?
Headline statement should say A town in Aust NOT ALL.
“Hottest Day Ever in Australia Confirmed: Bourke 51.7°C, 3rd January 1909” Jennifer Marohasy
(And several other pages/comments regarding this & similar cases)
The latest Onslow obs was at the Onslow Airport. Checking the BOM records shows they previously had another station in town. The airport read ~0.7C hotter than the town in the early decades (1941-1970), and ~1.4C hotter towards the last decades of the overlap (1991-2020). I question the significance of the news.
See January mean of daily max.
The crude comparison above may not exactly quantify a fixed offset or bias. But it is worth further investigation of raw data & the differences need better explanations. It’s easier to write the sensational headlines than to check the details & explain the uncertainties.
Weather records are supposed to be an historical record of what actually happened, but these days the official records are simple manipulated propaganda for the climate change religion. Crying “wolf” is now the highest of “scientific” achievements, – ranked high above doing actual science.
I’m seeing more and more alarmist cr8p appearing in our Australian newspapers, particularly the Courier Mail; the others have always been ‘full of it’ … mostly without accreditation.
Living in Perth, we hear that we have had a record with heat over 40 deg C for five days.
When I moved to Perth in 1993 we had several periods of five days over 40.
Interestingly, none of our talking-head media point out that in 1993 Perth’s city land area was about 50% of today’s; only 1.5 million people lived in the state, with about 700,000 in the Perth city area. Today Perth has swallowed the satellite cities of Joondalup to the north (40 km from Perth) and Mandurah to the south (80 km from Perth) and infilled all the land in between.
To achieve this we have added thousands of square km of suburbs with hundreds of square km of bitumen; that’s our heat island.
But why look at facts when you can espouse the BOM BS?
Station moves when encroaching urban development occurs seems reasonable, but what has happened is the fiddlers realized that this would be a good tool to further the “crisis warming” fiction.
I first suspected this trick when, every time I mentioned a case of large changes from raw temperature readings, ‘station moves’ were always brought up by BEST, Nick Stokes and other ‘interested parties’.
‘Raw’ temperatures at Capetown, South Africa, and indeed a number of other African stations, along with Canadian, Greenlandic, European, Paraguayan, Ecuadoran… all have temperature patterns indistinguishable from the US one, with all-time highs in the late 1930s, 35 years of steep cooling from the mid 1940s to 1980, followed by the two-decade warming to 1998 (the super El Niño).
I recall Hansen’s seeming disappointment at the time that 1998 was not a new T record. People knew how to take and record temperatures, and a spurious one here and there doesn’t call for massive adjustment. It just provides cover for unwarranted fiddling with a swathe of temperatures.
Jennifer, here is the Capetown ‘Raw’ as an example:
And here is what the fiddlers did to it:
I note Australia and New Zealand also used to have the late-1930s highs; these similar patterns around the globe are corroborating, and the temperature folk know this. Paul Homewood has done a lot of this and you could get data and graphics from him if you wanted to extend your studies.
“‘Raw’ temperatures at Capetown, South Africa, and indeed a number of other African stations, along with Canadian, Greenlandic, European, Paraguayan, Ecuadoran… all have temperature patterns indistinguishable from the US one, with all-time highs in the late 1930s, 35 years of steep cooling from the mid 1940s to 1980, followed by the two-decade warming to 1998 (the super El Niño).”
That’s exactly right. The real temperature profile of the Earth is the one represented by the U.S. surface temperature chart where it shows it was just as warm in the Early Twentieth Century as it is today. The chart on the left in the link below (Hansen 1999):
All unmodified, regional surface temperature charts show this same temperature profile.
This temperature profile tells us we are not experiencing unprecedented warming today, because it was just as warm in the recent past. It also tells us CO2 is a minor player in the Earth’s atmosphere as far as temperatures are concerned: it was just as warm in the ETC as it is today, but with much less CO2 in the atmosphere then than now, yet it is no warmer today than then, so CO2 has not managed to raise the temperatures even though there is a lot more of it in the atmosphere today.
So…based on total lies and deception, we are supposed to feel that 200 sq km of prime agricultural land (and that’s only what’s in the pipeline near us so far) being handed over for wind and solar is justifiable? To put a halt to a problem that doesn’t exist, with infrastructure that isn’t fit for purpose! Let alone have any effect on the temperature!
They don’t even tell the host landholders that it’s only if they can’t pay the cleanup bill that the local council (ratepayers) have to. In fact some are told there are profits to be made from recycling. I assume they would have to declare bankruptcy. With the amount of renewables planned for our area, it’s unlikely even the councils will be able to foot the bill.
So many lies. Do people really wonder why we balk when they say “trust the science”?
I’m sure there’s a perfectly innocent explanation for this, and that the BOM will provide it when they have a few spare moments. Right now they are too busy beating back Man Made Global Warming Covid Climate Change. When that is done, they will let us know.
Record temperature of 50.7ºC recorded in Australia – Panorama | Armenian news
And in Western Australia:
Perth last week clocked an all-time record, with six consecutive days above 40 degrees Celsius.
The state has now had more individual days over 40C than during any other summer.
Earlier in January, the small town of Onslow, on the Pilbara Coast, equalled the Australian temperature record of 50.7C.
And last month, Marble Bar had 16 days above 45C — the most on record.
Australia is warming and getting more extreme heat.
We knew you would eventually show up. Where is colluuuusion clown Simon?
Did you ask the aboriginals what the hottest run of days at Marble Bar was before 1901, or an old timer what the max temp at Onslow airport was before 1940, griff?
You want to be careful with that unwoke attitude of yours to measuring warm days mate given all your buddies screaming Invasion Day on Australia Day. You never know where they’ll want to stick those culturally oppressive thermometers you’re so besotted with and they’ve got a lot bigger with automation.
From Griff’s link: “Temperatures reached 50.7 degrees Celsius at Onslow at 2.26pm, matching Oodnadatta in outback South Australia, which hit 50.7C on January 2, 1960.”
So it was just as warm in 1960, as it is today.
Why isn’t it hotter now than then, since there is so much more CO2 in the atmosphere today than there was in 1960? When is this predicted temperature rise to 2.0C above the average going to start?
Griff has done this a number of times. Linking to a ” new record” but making no mention of when the previous record was set
For example he trumpeted the fact that a new New Years Day record was set in the UK this year but ignored the fact that the previous record had been set in 1916 – 106 years earlier!
That’s one reason I looked at Griff’s link. I suspected this high temperature was pretty close to one in the past, and it was.
Nothing unprecedented here. Griff is his own worst enemy. 🙂
Griff may have missed my earlier comments regarding Onslow, Bourke & averages.
If the Onslow town temp was accurate, then the value recorded at Onslow Airport can have at least 1.4C subtracted from it (correcting bias) before comparing with historical records. Such discrepancies exist between stations within 3 to 10 km of each other in many places around Australia, but no explanations are provided by BOM. Why is Penrith NSW close to, but often warmer than, Richmond NSW? How much could it be related to just clouds and wind? How much of the difference could be population density, traffic, land use (city vs farms), irrigation (we can no longer water during 10am to 4pm as we used to in the city, but farms can still irrigate large areas day and night)? Is temperature a reliable measure of energy when absolute humidity can vary so much?
Once again griff demonstrates an extreme ignorance of basic statistics.
As usual, he trumpets every record high, no matter how small, as if it actually meant something, but completely ignores all record lows.
When record highs outnumber record lows by a factor of two or more for several years running, just maybe, it will be meaningful.
In today’s world we have about as many record lows as record highs.
It is a mistake to assume that the arithmetic average of the maximum and minimum temperatures in a day gives the average temperature. It depends on the temperature curve during the day, and the difference between these two calculated quantities can be quite large.
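The point above can be checked with a simple asymmetric daily temperature curve (a sketch; the curve shape is invented for illustration, a short warm afternoon spike on a cool base):

```python
import math

def temp_at(hour):
    """A made-up asymmetric daily cycle: cool most of the day, with a
    sharp Gaussian peak around 15:00."""
    return 15.0 + 10.0 * math.exp(-((hour - 15.0) ** 2) / 4.0)

hours = [h / 4 for h in range(96)]         # readings every 15 minutes
readings = [temp_at(h) for h in hours]

true_mean = sum(readings) / len(readings)
minmax_avg = (max(readings) + min(readings)) / 2

# Mid-range is about 20.0, the true mean only roughly 16.5: the short
# spike drags the min/max average well above the actual daily average.
print(round(minmax_avg, 2), round(true_mean, 2))
```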
It would also be curious to know how a probe reacts when warming vs cooling: is thermal inertia the same for both, and is it the same for moist air vs dry air? If the stations are taking readings constantly vs an average over a 5 or 10 minute period, this could be relevant.
With scientists and politicians agonizing over the difference between 1.5 deg warming, and 2.0 degrees, you’d think those responsible for measurements would want to ensure that a reporting error of 0.42 degrees isn’t introduced out of laziness. Just what would have been so wrong to run the new thermometers in parallel to validate their consistency? It’s inexplicable that they didn’t do that.
This is what warming in the Northern Hemisphere looks like in the era of the weak Sun. More snow mass year to year (not counting mountains).
Spare a thought for Africa.
WMO: “Because the data with respect to in-situ surface air temperature across Africa is sparse, a one-year regional assessment for Africa could not be based on any of the three standard global surface air temperature data sets from NOAA-NCDC, NASA-GISS or HadCRUT4. Instead, the combination of the Global Historical Climatology Network and the Climate Anomaly Monitoring System (CAMS GHCN) by NOAA’s Earth System Research Laboratory was used to estimate surface air temperature patterns.”
I feel fed up that BOM won’t be transparent about their methodology. If it was ‘real’ science, rather than science-with-an-agenda they should publish the base dataset and method/rationale for adjusting it for peer review and scrutiny.
If it weren’t for individuals like Jennifer Marohasy, nobody would get to know the truth.