The ‘trick’: How More Cooling Generates Global Warming

From the “we’ll fix that in post” department comes this from down under courtesy of Dr. Jennifer Marohasy.

COOLING the past relative to the present has the general effect of making the present appear hotter – it is a way of generating more global warming for the same weather.

The Bureau of Meteorology has rewritten Australia’s temperature in this way for the second time in just six years – increasing the rate of warming by 23 percent between Version 1 and the new Version 2 of the official ACORN-SAT temperature record.

The Rutherglen research station in rural Victoria is one of the 112 weather stations that make up ACORN-SAT. Temperatures have been changed here by Blair Trewin, under the supervision of David Jones at the Bureau.

Dr Jones’s enthusiasm for the concept of human-caused global warming is documented in the notorious Climategate emails; he wrote in an email to Phil Jones at the University of East Anglia Climatic Research Unit on 7 September 2007 that:

“Truth be known, climate change here is now running so rampant that we don’t need meteorological data to see it.”

We should not jump to any conclusion that support for human-caused global warming theory is the unstated reason for the Bureau’s most recent remodelling of Rutherglen. Dr Jones is an expert meteorologist and an honourable man. We must simply keep asking,

“What are the scientifically valid reasons for the changes that the Bureau has made to the temperature records?”

In 2014, Graham Lloyd, Environmental Reporter at The Australian, quoting me, explained how a cooling trend in the minimum temperature record at Rutherglen had been changed into a warming trend by progressively reducing temperatures from 1973 back to 1913. For the year 1913, there was a large difference of 1.7 degrees Celsius between the mean annual minimum temperature, as measured at Rutherglen using standard equipment at this official weather station, and the remodelled ACORN-SAT Version 1 temperature. The Bureau responded to Lloyd, claiming that the changes were necessary because the weather recording equipment had been moved between paddocks. This is not a logical explanation in the flat local terrain, and furthermore the official ACORN-SAT catalogue clearly states that there has never been a site move.

Australians might nevertheless want to give the Bureau the benefit of the doubt and let them make a single set of apparently necessary changes. But now, just six years later, the Bureau has again changed the temperature record for Rutherglen.

In Version 2 of ACORN-SAT for Rutherglen, the minimum temperatures recorded in the early 1900s have been further reduced, making the present appear even warmer relative to the past. The warming trend is now 1.9 degrees Celsius per century.

The Bureau has also variously claimed that they need to cool the past at Rutherglen to make the temperature trend more consistent with trends at neighbouring locations. But this claim is not supported by the evidence. For example, the raw data at the nearby towns of Deniliquin, Echuca and Benalla also show cooling. The consistent cooling in the minimum temperatures is associated with land-use change in this region: specifically, the staged introduction of irrigation.

Australians trust the Bureau of Meteorology as our official source of weather information, wisdom and advice. So, we are entitled to ask the Bureau to explain: If the statements provided to date do not justify changing historic temperature records, what are the scientifically valid reasons for doing so?

The changes made to ACORN-SAT Version 2 begin with changes to the daily temperatures. For example, on the first day of temperature recordings at Rutherglen, 8 November 1912, the measured minimum temperature is 10.6 degrees Celsius. This measurement is changed to 7.6 degrees Celsius in ACORN-SAT Version 1. In Version 2, the already remodeled value is changed again, to 7.4 degrees Celsius – applying a further cooling of 0.2 degrees Celsius.

Considering historically significant events, for example temperatures at Rutherglen during the January 1939 bushfires that devastated large areas of Victoria, the changes made to the historical record are even more significant. The minimum temperature on the hottest day was measured as 28.3 degrees Celsius at the Rutherglen Research Station. This value was changed to 27.8 degrees Celsius in ACORN-SAT Version 1, a reduction of 0.5 degrees Celsius. In Version 2, the temperature is reduced again, to 25.7 degrees Celsius, a total of 2.6 degrees Celsius below the measured value.

This type of remodelling will potentially have implications for understanding the relationship between past temperatures and bushfire behavior. Of course, changing the data in this way will also affect analysis of climate variability and change into the future. By reducing past temperature, there is potential for new record hottest days for the same weather.

Annual average minimum temperatures at Rutherglen (1913 to 2017). Raw temperatures (green) show a mild cooling trend of 0.28 degrees Celsius per 100 years. This cooling trend has been changed to warming of 1.7 degrees Celsius per 100 years in ACORN-SAT Version 1 (orange). These temperatures have been further remodelled in ACORN-SAT Version 2 (red) to give even more dramatic warming, which is now 1.9 degrees Celsius per 100 years.
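The per-century trends quoted above are just ordinary least-squares slopes scaled to 100 years. Here is a minimal sketch of that calculation, using a synthetic annual series rather than the actual Rutherglen data:

```python
# A per-century trend is the slope of an ordinary least-squares fit,
# scaled to 100 years. Toy sketch with made-up annual minima
# (not the actual Rutherglen series).
years = list(range(1913, 2018))

# Synthetic series: an exact cooling of 0.28 C per century with no noise,
# so the fit recovers the trend exactly.
temps = [10.0 - 0.28 * (y - 1913) / 100.0 for y in years]

n = len(years)
mean_y = sum(years) / n
mean_t = sum(temps) / n

# OLS slope: cov(year, temp) / var(year)
num = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
den = sum((y - mean_y) ** 2 for y in years)
slope_per_year = num / den

trend_per_century = slope_per_year * 100.0
print(f"trend = {trend_per_century:+.2f} C per century")
```

Run against the real raw and adjusted daily files, the same few lines would reproduce the -0.28 and +1.9 figures quoted in the caption.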
nw sage
March 5, 2019 5:22 pm

Actually, by changing the data to fit the global warming meme they are simply being honest – making the data truthful: Since the data is now man made, the global warming trend it shows using the man made data is also man made! Who could argue with that?
The data never existed before in the real world therefore it is man-made. It certainly didn’t exist at the time the records were recorded.

Tom Abbott
March 5, 2019 5:47 pm

I hope Happer is taking note. 🙂

Rod Evans
March 5, 2019 5:47 pm

The BOM are really getting into the swing of the adjustocene aren’t they?

March 5, 2019 5:52 pm

Once again, there are five significant digits and four decimal places from data barely resolved to a tenth of a degree. This doesn’t seem possible if one propagates uncertainty throughout the calculations.

I’ve been going over Nick Stokes’s TempLS R program for the past few days, and it gave me cause to look at the GHCN monthly summary data it uses. This data has two digits right of the decimal point, but it starts with GHCN daily data in tenths of a degree. No result derived from it should have more than one place to the right of the decimal. NOAA also provides no uncertainty for the series, so I tried my own evaluation.

I picked a month at random from a site with good numbers for the year. The name of the station is Dumont D’Urville, in the Antarctic. It has two different IDs in the two data sets: AYM00089642 in the daily data and 70089642000 in the monthly summaries, but the latitude and longitude agree, so it’s the same place.

Right off the bat I saw the numbers don’t agree between data sets. The monthly average in the summary is given as 0.3C for January 2011, but the daily TAVG, averaged for the month, gives 0.5C. An interesting aspect of the daily data set is that it has TMAX, TMIN, and TAVG, but the TAVG is averaged from hourly measurements, not the average of TMAX and TMIN. The monthly data set says only that it is in hundredths of a degree.

When I ran the numbers on the dailies for the month of January 2011, the error in the mean (ΔX/√N) was 0.2°C. I can’t see how, in good faith, values in the hundredths place can be published when the uncertainty in the mean is an order of magnitude larger. The monthly average for January 2011 was 0.5±0.2°C. That’s pretty astounding. Am I wrong somewhere?
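The standard-error calculation described above is easy to reproduce. A minimal sketch, with made-up daily values standing in for the actual Dumont d’Urville record:

```python
import math

# Hypothetical daily TAVG values for one month, in degrees C to one decimal
# place (synthetic numbers for illustration -- not the actual station data).
daily_tavg = [0.8, -1.2, 1.5, 0.3, -0.5, 2.1, 0.9, -0.2, 1.1, 0.4,
              -1.8, 0.7, 1.3, -0.6, 0.2, 1.9, -0.9, 0.5, 1.0, -0.3,
              0.6, 1.4, -1.1, 0.1, 0.8, -0.4, 1.2, 0.0, 0.9, -0.7, 0.5]

n = len(daily_tavg)
mean = sum(daily_tavg) / n

# Sample standard deviation of the daily values
var = sum((t - mean) ** 2 for t in daily_tavg) / (n - 1)
sigma = math.sqrt(var)

# Standard error of the monthly mean: sigma / sqrt(N)
sem = sigma / math.sqrt(n)

# Report at a precision consistent with the uncertainty (one decimal place)
print(f"monthly mean = {mean:.1f} ± {sem:.1f} °C")
```

With a month of daily values scattered over a degree or two, the standard error of the mean comes out in the tenths, which is the point of the comment: quoting the monthly mean to hundredths implies more precision than the spread of the dailies supports.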

Steven Mosher
Reply to  James Schrumpf
March 6, 2019 1:19 am

“It has two different IDs in the two data sets: AYM00089642 in the daily data and 70089642000 in the monthly summaries, but the latitude and longitude agree, so it’s the same place.”

Nope! Not necessarily.

Having the same lat/lon is no assurance it is the same station. You have no assurance that the same location reported in the metadata MEANS that the stations are the same or, more importantly, that the sensor is the same.

WHY?
because some locations ( airports, research centers) have TWO sensors located roughly in the same location.

One sensor may report hourly
One sensor may report daily.

Further, just because the metadata is different (site name and site location) doesn’t MEAN that the stations are DIFFERENT!

WHY?

Well, because just as data can be wrong, so can metadata.

Potential mistake: believing the metadata without question.

So: there is a way to tell, but you need to do more work.

For that station, 70089642000, there are 6 alternative sources and at least one controlling daily source: GSOD.

From the looks of it you went hunting for a cooling station:

There are at least two candidate stations there, with very slightly different names and slightly different lat/lons. The data series are different, and the sources for the data are different.

http://berkeleyearth.lbl.gov/stations/151563

http://berkeleyearth.lbl.gov/stations/3022

Welcome to Hell! Are these the same station, reported in different databases? Two sensors at one location? Two different records of the same station?

The approach folks take (see ISTI and Berkeley) is to apply tests to both the metadata and the time series data to deconflict multiple sources. You have to check all sources, check all metadata, and check whether the time series match. Then you can deconflict. The deconfliction process is probabilistic.

Reply to  Steven Mosher
March 6, 2019 9:36 am

The stations have the same name and check to within 10.8″ in latitude and longitude (about 880 feet difference), but I shouldn’t trust that because having the same name, latitude, and longitude in an official US Government publication is no reason to trust that it’s the same station. However, your BEST location data doesn’t match either of the NOAA station location data, so maybe there’s actually FOUR stations running down there.

Do you realize how that sounds?

What “all sources” and “all metadata” take precedence over the officially published station IDs and metadata in official NOAA data sets? Why do they have different IDs? It’s the Federal government, and one data set is daily records and the other is monthly summaries, so why should they have the same ID in two different data sets? They don’t explain it; they just tell you “These are the station ID.” It actually says, in the daily files readme.txt, “The name of the file corresponds to a station’s identification code. For example, “USC00026481.dly” contains the data for the station with the identification code USC00026481).”

Do you believe THAT, or do you need independent verification from a third party for that as well?

Anyway, it’s obvious they ARE the same station, unless one is of the paranoiac type. That isn’t the point, anyway. The point is that if you take the daily temps and get the average, and actually pay attention to the uncertainty, instead of treating scientific measurements like a calculator game, the mean for the month of January 2011 at that station is 0.5±0.2°C. You realize how large that uncertainty is, right? That was using the ΔX/√N formula to get the error in the mean.

The figure for that month from the monthly summary is only different by the uncertainty from the daily measurements, so I suppose that’s a win. The other months have similar uncertainties, so using results with two digits to the right of the decimal is definitely not supported by the raw data.

I’m curious; how do you justify using such precision? I see you have some uncertainties on the pages you linked, but they look overprecise for the starting data as well. What’s the method for taking daily data in tenths of a degree and averaging them monthly into hundredths of degrees? Would you take a moment and explain the process?

Steven Mosher
Reply to  James Schrumpf
March 7, 2019 3:21 am

“The stations have the same name and check to within 10.8″ in latitude and longitude (about 880 feet difference), but I shouldn’t trust that because having the same name, latitude, and longitude in an official US Government publication is no reason to trust that it’s the same station. However, your BEST location data doesn’t match either of the NOAA station location data, so maybe there’s actually FOUR stations running down there.

Do you realize how that sounds?”

Sounds like you never looked at metadata.

“What “all sources” and “all metadata” take precedence over the officially published station IDs and metadata in official NOAA data sets? Why do they have different IDs?”

1. NOAA/NCDC sources are not “official”
2. Both GHCN D and GHCN M are aggregated from other sources
3. There is a multitude of IDs, all assigned by different agencies

Your mistake is thinking that NCDC is a source of data from that part of the world. They are not. People send them data; they repost it. At the same time people send them data, they also send the data to other folks like WMO and USAF.

That’s right: other governments and other scientific agencies report their data to NCDC. They also report it to WMO.

NCDC just collects the data.

But let’s do a test.

go to GHCN M

Check AGM00060355 SKIKDA

Find the location?

Let me help

Lat 36.9330
Longitude 6.9500

Do a google map of that
Find something?

Yup, it’s in the water.

Now go to WMO oscar metadata.

https://www.wmo-sat.info/oscar/

Look up the same site. Its correctly located.

Why does NCDC have the station in the water and WMO have it correctly placed?

Because WMO did a whole program to update the metadata of sites and got people to update their positions. The program required people to do reports in degrees, minutes and seconds, so WMO metadata is usually pretty good; they did an update.

This new data gets reported to WMO.
This data doesn’t always get reported to NCDC! DOH!

So, sometimes NCDC metadata is out of date.

It goes like this.

Country X has a station
1. They report some data to WMO
2. They report some data to NCDC
3. They report some data to USAF
4. They keep and publish their own local records
5. They don’t always update metadata to all sources (1, 2, 3, 4)

So just because NCDC puts the SKIKDA station in the middle of the ocean doesn’t mean it’s in the middle of the ocean, just because you think NCDC is “official”.
You need to check:
Which country owns the site?
Do they keep a record?
Do they send records to WMO?
Do they send records to NCDC?
Do they send records to USAF, METAR?

Then when you look at all the data you can deconflict.

So remember

NCDC is a repository. Other people own the data and send it to them.
That “same” data is also sent to other repositories.
The repositories don’t always agree.
First job is deconflicting.
Lat/lon and name are not enough to deconflict, because two sensors can be at the same location reporting different measures.
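A toy version of that deconfliction check can be written in a few lines. The thresholds, the rough distance formula, and the station records below are all invented for illustration; they are not ISTI’s or Berkeley Earth’s actual tests:

```python
import math

# Toy deconfliction check along the lines described above: compare metadata
# (distance between reported positions) and the time series themselves.
# All station records here are invented for illustration.

def distance_km(lat1, lon1, lat2, lon2):
    # Rough equirectangular approximation; fine at sub-kilometre scales
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    return 6371.0 * math.hypot(dlat, dlon)

def correlation(xs, ys):
    # Pearson correlation of two equal-length series
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = math.sqrt(sum((a - mx) ** 2 for a in xs) *
                    sum((b - my) ** 2 for b in ys))
    return num / den

a = {"lat": -66.663, "lon": 140.001, "series": [0.3, -1.2, 0.8, 2.1, -0.5, 1.4]}
b = {"lat": -66.662, "lon": 140.003, "series": [0.4, -1.1, 0.8, 2.0, -0.6, 1.5]}

close = distance_km(a["lat"], a["lon"], b["lat"], b["lon"]) < 1.0
similar = correlation(a["series"], b["series"]) > 0.99

# Both tests passing suggests (but does not prove) one underlying station.
print(close and similar)
```

Real pipelines weigh many such tests probabilistically rather than applying hard cutoffs, which is the point about deconfliction being probabilistic.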

Steven Mosher
Reply to  James Schrumpf
March 7, 2019 3:47 am

“What “all sources” and “all metadata” take precedence over the officially published station IDs and metadata in official NOAA data sets?”

For GHCN D (daily), they list the ORIGINAL sources for the daily data with every station.

See this:
ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/daily/readme.txt

So for the station in question, you go back to the original sources before it gets into GHCN D.

Sometimes this will be GSOD, or FSOD, or any number of other original sources.

“I’m curious; how do you justify using such precision? I see you have some uncertainties on the pages you linked, but they look overprecise for the starting data as well. What’s the method for taking daily data in tenths of a degree and averaging them monthly into hundredths of degrees? Would you take a moment and explain the process?”

Sure. First, you can read our published paper and appendix.

Next: I put a little tutorial together. Should be done in a couple of weeks.

Mostly people don’t understand what a global spatial average is. Hint: it’s not an average and not a measurement.

Second, try this: add as much error as you like to the measures, then compute a global average and take the anomaly. What do you notice?
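The exercise suggested above can be sketched in a few lines. Everything here is invented for illustration (the 0.5 C “true warming”, the station count, the per-reading error size); it is not Berkeley Earth’s actual method, just a demonstration that independent per-reading errors largely cancel in a many-station anomaly average:

```python
import random

random.seed(42)

n_stations = 1000

# True temperatures: each station has its own climatology; the second year
# is 0.5 C warmer everywhere (the assumed "true" signal).
climatology = [random.uniform(-10.0, 30.0) for _ in range(n_stations)]
true_warming = 0.5

def noisy_reading(true_value, error_sd):
    # Add independent measurement error to each reading
    return true_value + random.gauss(0.0, error_sd)

error_sd = 1.0  # deliberately large per-reading error, as the exercise suggests

year1 = [noisy_reading(c, error_sd) for c in climatology]
year2 = [noisy_reading(c + true_warming, error_sd) for c in climatology]

# Anomaly approach: difference each station from its own year-1 reading,
# then average across stations. Independent errors shrink like 1/sqrt(N).
anomalies = [(b - a) for a, b in zip(year1, year2)]
mean_anomaly = sum(anomalies) / n_stations

print(f"recovered warming = {mean_anomaly:.2f} C (true value {true_warming})")
```

What you notice is that even with 1 C of random error on every single reading, the averaged anomaly lands close to the true 0.5 C signal. This is the usual argument for quoting the averaged quantity at higher precision than any single thermometer; whether the real errors are independent enough for that to hold is exactly what the thread is arguing about.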

Steven Mosher
Reply to  James Schrumpf
March 7, 2019 3:51 am

““These are the station ID.” It actually says, in the daily files readme.txt,”

So you read the readme, but missed the part where they tell you what the original sources are.

Then you ask me why these files (GHCN D) are not the only sources.

Weird.

Walter Sobchak
March 5, 2019 6:20 pm

Welcome to the Adjustocene. I am a denier. And the first thing I deny is that the “climate scientists” are honest or well intentioned.

If the question is whether it is warmer now than it was 150 years ago, the answer is maybe. Show me honest unadjusted evidence with plausible error bars.

Tom Abbott
Reply to  Walter Sobchak
March 6, 2019 4:35 am

“If the question is whether it is warmer now than it was 150 years ago, the answer is maybe.”

The real question to ask is whether it was as warm in the 1930’s as it is today. The answer is yes. Unmodified charts from all over the world show this.

So if it was as warm in the 1930’s as it is today, then that means there is no unprecedented amount of heat that has been added to the Earth’s atmosphere between then and now, due to CO2 or anything else. The 1930’s warmth rose to a certain level, then cooled off for a few decades, and now, today, the warmth has risen back to the same level as in the 1930’s. This means CO2 is not a player in this game, it is an also-ran.

It was just as warm in the 1930’s as now. No unprecedented warmth is required. There is no CAGW!

In fact, current temperatures are about 1C cooler than the high temperatures of the 1930’s. The year 2016 was designated by NASA/NOAA as the “hottest year evah!”. According to the Hansen 1999 US surface temperature chart, 1934 was 0.5C warmer than 1998, and was 0.4C warmer than 2016. And now we have cooled about 0.6C from the 2016 highpoint. “Hotter and hotter” is no longer applicable.

There is no unprecedented heat in the Earth’s atmosphere. There is no CAGW.

WXcycles
Reply to  Tom Abbott
March 6, 2019 7:23 am

‘Scientific’ prediction: 2020, 2021 and 2022 will be the hotterest years ever in BOM land and new unprecedented all-time high max and min records will be set every other day. And it will require drastic collective actions and the outright rejection of all debate and vilification of all dissent in order to ‘fix’ it.

John Endicott
Reply to  WXcycles
March 6, 2019 10:22 am

ah, more of the same then.

March 5, 2019 7:03 pm

For anybody interested in the ACORN 2 rewriting of Australia’s temperature history, I’ve uploaded a state-by-state breakdown of all ACORN stations, starting with a national analysis of the 57 locations that have a start year of 1910 at http://www.waclimate.net/acorn2/index.html

Actually, only 111 of the 112 network stations could be analysed, because the bureau’s ACORN 2 daily temp download files provide Halls Creek temperatures in the Kalumburu maximum file. Pretty sloppy for a world-class dataset.

The above linked national page links to NSW, the Northern Territory, South Australia, Tasmania, Victoria and WA with min/max charts and data summaries for each site, as well as Excel downloads containing the daily temps in all three datasets (ACORN 1, ACORN 2 and RAW) and tabulated averages of their annual weekday temps, annual monthly temps and annual temps (averaged to be compliant with ACORN procedures).

Since there’s been some comment above on decimal precision by observers, the tables also show the annual x.0F/C rounding percentages and the averages before and since 1972 metrication. The bureau has conceded that metrication caused a +0.1C breakpoint (probably because a bit over 60% of all F temps before 1972 were recorded as x.0F with a downward observer bias) but chose not to adjust for this in the homogenised ACORN because the extra warmth may have been caused by major La Ninas in the early to mid 1970s accompanied by the wettest years and highest cloud cover ever recorded across Australia. Good luck figuring that one out.
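The rounding effect described above is easy to demonstrate in principle. A toy simulation with uniform synthetic readings; it illustrates the direction and rough size of a truncation bias, not the Bureau’s actual +0.1C figure, which also depends on what fraction of observers truncated:

```python
import random

random.seed(0)

# Hypothetical "true" Fahrenheit readings with uniformly distributed decimals
true_temps = [random.uniform(50.0, 100.0) for _ in range(100_000)]

# An observer who truncates to the whole degree (writes 86.7F as 86F)
truncated = [float(int(t)) for t in true_temps]

# An observer who rounds to the nearest whole degree (writes 86.7F as 87F)
rounded = [round(t) for t in true_temps]

def mean(xs):
    return sum(xs) / len(xs)

bias_trunc_f = mean(truncated) - mean(true_temps)   # about -0.5 F
bias_round_f = mean(rounded) - mean(true_temps)     # about 0.0 F

# Convert the truncation bias to Celsius: a Fahrenheit difference times 5/9
print(f"truncation bias ≈ {bias_trunc_f * 5 / 9:.2f} C")
```

So an all-truncating observer cools the record by roughly 0.3 C; a mixed population of truncators and rounders produces something smaller, in the general neighbourhood of the breakpoint the bureau concedes.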

Reply to  Chris Gillham
March 5, 2019 8:21 pm

Loved this part:

The rounding relationship

An average 60.25% of all Fahrenheit temperatures recorded from 1910 to 1971 were x.0 rounded numbers without a decimal, with a likely but unproven cooling influence due to a greater number of observers (many farmers or post office staff rather than weather bureau employees) writing, for example, 86.7F as 86 rather than 87 because they considered it more accurate and honest.

Now they’re mind-readers, and can tell that apparently soft-hearted “farmers or post office employees” would be more likely to cheat the temps down than would steely-eyed weather bureau employees.

Is there no excuse to modify data that’s too embarrassing to use?

Tom Abbott
Reply to  James Schrumpf
March 6, 2019 4:39 am

“Is there no excuse to modify data that’s too embarrassing to use?”

Any excuse will do.

David Stone
March 5, 2019 7:05 pm

Former Prime Minister Tony Abbott actually did call for an investigation into BoM shenanigans. Not long after, he was goooone and … nothing happened. The great and powerful Tony Jones, highest-paid “journo” in Australia’s government broadcaster, the ABC, politely but firmly ordered the lily-livered Minister of the Environment and Energy, Greg Hunt, to cease and desist with anything of the sort and … nothing happened.
What we need in Australia is an investigation into the BoM and a Royal Commission into the wind and solar sector before they turn a first-world nation into a third-world nation faster than you can say “renewable energy”. Neither will happen. The Coalition won’t do it; they had their chance and wilted under mild heat from Jones and, ironically, Malcolm Turnbull. Understandably so: yet another former PM, Malcolm Turnbull, would be revealed as corrupt, placing the financial interests of his son Alex, a big investor in Infigen Energy, above those of the nation he led. Watch the Turnbull fortune grow now, maybe not quite at Al Gore speed, but it’ll be impressive. The most dangerous place to be in Australia is between a Turnbull and a bucket of money.
The Labor party won’t either. They’ll be preoccupied overseeing the tsunami-style collapse of the Australian economy: swift, broad, deep and complete.

Patrick MJD
Reply to  David Stone
March 5, 2019 9:14 pm

I think with reserve bank interest rates on hold again, and house prices and the economy tanking, the scare campaigns will grow in strength; people will vote on how they feel rather than with any kind of sense of the matter.

Either way, coal will be blamed.

March 5, 2019 7:53 pm

The Bureau has also variously claimed that they need to cool the past at Rutherglen to make the temperature trend more consistent with trends at neighbouring locations. But this claim is not supported by the evidence.
It wouldn’t be right even if the claim were “supported by evidence”. Why should the trend at one station match the trend at neighboring stations, which could be hundreds of miles away? Planets have been discovered because of small perturbations in the data. Take the discovery of Neptune, for example (quotes from the astronomy site https://cseligman.com/text/history/discoveryneptune.htm):

The Discovery of Neptune
The first accurate predictions of Uranus’ motion were published in 1792. Within a few years it was obvious that there was something wrong with the motion of the planet, as it did not follow the predictions. Alexis Bouvard, the director of the Paris Observatory, attempted to calculate improved tables using the latest mathematical techniques, but was unable to fit all the observations to a single orbit, and finally decided to rely only on the most modern observations, while suggesting, in his 1821 publication of his results, that perhaps there was some unknown factor that prevented better agreement with the older observations.

That’s Old School science, the kind that got things done, like discovering the Law of Gravity and unseen planetary bodies. This is how Post Modern Climate Science would have handled this pesky perturbation:

How Neptune Didn’t Fit the Model
The first accurate predictions of Uranus’ motion were published in 1792. Within a few years it was obvious that there was something wrong with the motion of the planet, as it did not follow the predictions. Michelmas Mannus, the director of The Team Observatory, attempted to calculate modifications to the observations using the latest statistical techniques that he had thought up that morning, but was unable to fit all the observations to a single orbit, and finally decided to rely only on the observations that fit the existing model, while suggesting, in his 1821 publication of his results, that perhaps the Deniez had prevented better agreement with the approved observations.

Robert Coutts
March 5, 2019 8:16 pm

How the Snip can those who doctor raw temperature data call themselves scientists??

Profanity is profanity no matter how many ‘*’ you use – MOD

UNGN
March 5, 2019 9:11 pm

UHI of the Dallas-Fort Worth Metroplex is massive. In the middle sits DFW Airport, where all of the media’s temps (for at least 5 million people) are recorded. This morning, the low temp was just 1 degree from the absolute 120-year record low for that date in the area.

Of course, just 15 miles outside of the DFW UHI, temps were 5-6 degrees colder (and would have SMASHED the 120-year record). 120 years ago there was no UHI at DFW airport. Even 40 years ago, there was no MASSIVE UHI at DFW airport… so why, in 2019, would ANYONE be adjusting temps from the past upward? If anything, they should be adjusting modern temps downward.

Prjindigo
March 5, 2019 11:02 pm

The problem really lies with the fact that ALL the data sources are anecdotal in nature. It doesn’t matter how sensitive and accurate a thermometer is if its data is being averaged over a distance starting at its location. Every single smoothed datapoint is falsified information, and wrong, which means that the entire dataset is abjectly wrong when used. The density of the thermometers isn’t even sufficient to be used in forecasting anymore. I’ve constantly seen a cold AND hot bias from my local airport only 6 miles from me, while my two outside thermometers (by the house and on a post 30′ away) agree all the time on the local temp, but not with the airport.

Until they improve the methodology by measuring the actual bias curves, increase the SENSOR resolution, and start actually monitoring atmospheric DEPTH with thermometers as well, we will continue to have shit-worthless “science” done by consensus and bad headlines.

There simply isn’t enough resolution of weather data to prove them wrong.
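For readers unfamiliar with the kind of spatial smoothing being objected to above, here is a minimal sketch of inverse-distance weighting, one common way gridded products fill in values between stations. The station names, positions and temperatures are invented:

```python
import math

# Inverse-distance weighting: a simple form of the spatial averaging the
# comment objects to. Stations and readings are made up for illustration.
stations = [
    {"name": "airport", "x": 0.0, "y": 0.0, "temp": 21.5},
    {"name": "farm",    "x": 6.0, "y": 1.0, "temp": 19.0},
    {"name": "ridge",   "x": 3.0, "y": 8.0, "temp": 17.5},
]

def idw_estimate(x, y, stations, power=2.0):
    """Estimate temperature at (x, y) as a distance-weighted average."""
    num = den = 0.0
    for s in stations:
        d = math.hypot(x - s["x"], y - s["y"])
        if d == 0.0:
            return s["temp"]  # exactly at a station: use its reading
        w = 1.0 / d ** power
        num += w * s["temp"]
        den += w
    return num / den

# A gridpoint between the stations gets a value no thermometer ever recorded,
# which is precisely the commenter's complaint about smoothed datapoints.
print(f"{idw_estimate(2.0, 2.0, stations):.2f}")
```

The interpolated value always lies between the surrounding readings; whether that constitutes useful infilling or “falsified information” is the argument being made.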

knr
March 6, 2019 1:24 am

The idea that you can make scientifically meaningful changes (and it is science they claim to be doing) when you cannot correctly define what needs changing, or to what degree, is a load of dingo’s droppings. You have to ask, given this approach, amongst many others: does climate ‘science’ have any standards at all beyond supporting ‘the cause’ and thinking ‘headlines first’?

Tom Abbott
Reply to  knr
March 6, 2019 5:09 am

“does climate ‘science’ have any standards at all beyond supporting ‘the cause’ and thinking ‘headlines first’?”

The answer is no.

Doug Prior
March 6, 2019 3:09 am

Excellent article from Jennifer here on WUWT.

Australia’s BoM wasn’t forthcoming with an ACORN-SAT version 2 media release, though after it was published, there was some media coverage.

Around the same time, what I found interestingly honest from the BoM was a Special Climate Statement on the recent Queensland floods. I’m not sure why they didn’t put out a Special Weather Statement, but even I am a bit confused about the difference between climate and weather.
This Statement was released on social media, and I took up the opportunity to converse. Without putting up all the details here (@BOM_au retweeted @BOM_Qld 15 Feb), someone called David Grimes tweeted “More than a years annual average precipitation in days… incredible”. So I just had to reply “David, you think that is incredible? I think it is incredible that the BoM didn’t see it coming and didn’t record the rain in their own gauges (as I mentioned on the BoM Facebook page, attached).” I conversed with the BoM on Facebook as indicated, where I pointed out “So what the BoM can’t report on in this Statement is either a forecast or rainfall totals in their own gauges. I thought they were meant to be a Bureau of Meteorology.”

Hopefully, more people are commenting on the BoM social media feeds from the viewpoint of denying anthropogenic global warming, as opposed to the belief system of the BoM.

I realise this post of mine is just my bit of a say, but in the context of what the BoM does here in Australia, they have been on the back foot this year, without sensible staff to make them look reasonable in a lot of people’s eyes. The more pushback, the better.

WXcycles
Reply to  Doug Prior
March 6, 2019 7:43 am

I documented the entire event here:

https://community.windy.com/post/18456

BOM was indeed slow to wake up. Their whole job is to provide a superior FOREcast for precisely this sort of event, not an AFTERcast.

Doug Prior
Reply to  WXcycles
March 6, 2019 4:10 pm

WXcycles, that is good documentation, and thanks for pointing me to windy.com.
It is poor form from the BoM not to be able to forecast better. Maybe a lot of staff were still on holidays.

As far as the build-up to the monsoon goes, I noticed from mid-January until the monsoon arrived in the north, and again when it took hold in Queensland, a strong constant flow, I guess hot overland trade winds, from southern Queensland to northern Western Australia. Looking back, that must have been a bit novel to BoM, as it was to me.
Weather records need to be understood in the context of cycles such as solar minima and maxima.

WXcycles
Reply to  Doug Prior
March 6, 2019 6:38 pm

You’re welcome Doug.

Windy is an incredible tool; I couldn’t recommend it highly enough. I use it to plan daily. It’s incredibly accurate over 24 to 48 hour periods and will get even more accurate by mid-year, as the ECMWF model is getting a major expansion to its continuous data inputs and automated processing. Windy now even contains global MET radar data feeds, such as from BOM with lightning overlaid, and has recently implemented local observations versus model output for direct comparisons with the models.

I don’t need to use BOM any more and increasingly rarely do – ENSO data is about all I need from BOM now. I can get an order of magnitude better cyclone forecasts from a combination of Windy and US Navy cyclone forecasts and its aggregated Sat imagery. BOM is staggeringly rubbish at cyclone forecasts, a joke really, and that’s a pretty serious core-forecasting function to be so terrible at.

The one thing BOM did OK during the recent NQ rain event was to report river rises and flood level data fairly well, but that’s also nowcasting and aftercasting. The 5 to 7 days of regional warning that they could have provided well in advance, and which the extremely impressive ECMWF model in Windy did provide, with very impressive accuracy for areas, timing, intensity and scale, they did not provide to the areas to be affected. The models showed, 7 to 10 days in advance, that north Queensland was in for record-setting rainfall and floods over a vast land area.

RE solar interpretations of WX cycles: I find inter-regional ocean temp effects on seasonal wind and pressure trends much more enlightening, predictive and practical. It doesn’t take long to see what the season is likely to bring by examining those and the related trends, for which Windy is an ideal tool. Frankly, examining that makes something of a joke of BOM’s regional medium and long-range ‘forecast’ maps, which are notoriously unreliable.

BOM will just become less and less relevant and credible to the public, as the ABC has over the past 20 years, and both will get axed. Both organizations have aligned themselves with a bunch of radical left-wing hysterical fools who sound more and more bonkers every day. No soup for such deadwood deceivers.

The allure of Windy is that it came from ordinary everyday people who simply needed more accurate and detailed forecasts, anywhere, anytime, so they could do whatever they needed to do better, with more efficiency, more safety.

AGW is not Science
Reply to  Doug Prior
March 6, 2019 9:28 am

I’d have looked for some historical events along the same lines. There are usually multiple examples to defuse the suggestion that anything happening with the weather is “unprecedented.” Then point out that “average” precipitation is nothing more than a midpoint of extremes; it is NOT “normal” or “expected” precipitation, any departure from which is to be viewed as an “anomaly.”
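The point that an “average” is a midpoint of extremes rather than a typical year can be shown in a few lines of Python. The rainfall figures below are purely illustrative, not actual records – they just have the right-skewed shape rainfall series often have:

```python
import statistics

# Hypothetical annual rainfall totals (mm) for ten years at one site,
# skewed by two very wet years, as rainfall records often are.
rain = [300, 320, 350, 360, 380, 400, 420, 450, 900, 1200]

mean = statistics.mean(rain)      # 508.0 mm -- the "average"
median = statistics.median(rain)  # 390.0 mm -- the middle year

# In a skewed record, most years fall below the "average":
below_average = sum(1 for r in rain if r < mean)
print(mean, median, below_average)  # 8 of 10 years are "below average"
```

With numbers like these, a “below average” year is the rule, not an anomaly.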

Finish up with one of THEIR favorites, whenever weather which seems to conflict with their belief system occurs – “It’s WEATHER – Not “climate.”

Doug Prior
Reply to  Doug Prior
March 6, 2019 4:07 pm

The link to BoM Special Climate Statements is:
http://www.bom.gov.au/climate/current/statements/
showing for many years the mixing up of weather and climate.

The David Grimes referred to in my previous post is the President of the World Meteorological Organisation, an agency of the United Nations.

Chris Gillham
Reply to  Doug Prior
March 7, 2019 10:36 pm

“Australia’s BoM wasn’t forthcoming with an ACORN-SAT version 2 media release, though after it was published, there was some media coverage.”

To clarify … the only media coverage was the front page of The Australian newspaper several days after the ACORN 2 story was first revealed anywhere, by WUWT via a post of mine. The Australian followed up a couple of days later with an excellent editorial.

Apart from that front page on Australia’s national newspaper, there has been no mainstream media coverage at all. The BoM has still not issued a press release that might prompt other media to inform the public that Australia’s temperature history has been rewritten with a 23% increase in estimated warming since 1910.

ACORN 2 also hasn’t yet been propagated through the BoM website – presumably it will be, to make the whole exercise worthwhile – and with a federal election expected in two months, the timing of that propagation and of any media release will be interesting.

Visit http://www.waclimate.net if you want full analysis of ACORN 2 including a state-by-state breakdown.

Doug Prior
Reply to  Chris Gillham
March 8, 2019 2:05 am

Chris, your initial post to WUWT brought this ACORN 2 story to the attention of many including me, thanks.
At least Sky News Australia was able to interview Jennifer live. I realise the mainstream media will give it little coverage unless more journalists like Graham Lloyd take up the story.
Good analysis on http://www.waclimate.net though I won’t be getting into the detail.
I have had an interest in weather since my teenage years, even spending a week of high-school work experience with cloud-seeding scientists in the mid-1970s. It would be great if some well-trained semi-retired scientists became more vocal in their concerns. I guess some are posting on sites like this.

AGW is not Science
March 6, 2019 9:34 am

midpoint OF extremes

Really need that edit function back!

beng135
March 6, 2019 10:01 am

Searched the responses — where’s Stokes?

Steven Mosher
Reply to  beng135
March 7, 2019 2:41 am

he is probably reading the scientific report

that would be a good move for folks here too

Reply to  Steven Mosher
March 7, 2019 6:20 am

That would be fine. Except that then he’d come here and put forward some predictably lame/inexplicable excuse justifying it (in his mind).

E J Zuiderwijk
March 6, 2019 10:47 am

Scientific misconduct, pure and simple. It will cost the perpetrators their jobs.

March 6, 2019 1:12 pm

Being unaware that CO2 is not a pollutant and is required for all life on earth is science ignorance.
Failure to discover that CO2 has no significant effect on climate but water vapor does is science incompetence.
Changing measured data to corroborate an agenda is science malpractice.

Peter O'Brien
March 6, 2019 1:16 pm

The disconnect between the minuscule tenths-of-a-degree increases (within the error margin) that warmists use to justify their claim of global warming and the adjustments of up to 3 degrees applied to individual stations’ daily records from up to a century ago is startling.

March 6, 2019 1:22 pm

Water vapor is a GHG. It increased about 7% from 1960 to 2002. During that time, atmospheric WV increased by 5 molecules for each CO2 molecule added to the atmosphere. The level of WV in the atmosphere is self-limiting. Except for the aberration of El Niños, WV appears to have stopped increasing in about 2002-2005.

Robert B
March 6, 2019 2:04 pm

Deniliquin had a new irrigation area opened in 1939 and another in 1955, so we are comparing apples to oranges. http://www.irrigationhistory.net.au/history/continuing-growth.asp
Water allocations began in the late 60s, and severe restrictions on water use during droughts should mean an artificial warming since 1967.

Other stations also have reasons why the site has changed dramatically (tourist towns watering surrounding lawns to keep maxima down and not scare off tourists). But to adjust the data and then fit a linear trend to it, talk about breaking records by a fraction of a degree, use the ratio of heat records to cold records broken as evidence of climate change, or – my pet hate – use a heatwave index that will exaggerate any fraction-of-a-degree shift past the 90th percentile to look like a 30% increase in extreme heat, is so blatantly unscientific that it’s extreme incompetence.
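The arithmetic behind that last complaint can be sketched in a few lines of Python. The numbers are illustrative assumptions (a Gaussian distribution of daily maxima with a 25 °C mean and 3 °C spread, not real station data), but they show how a fixed 90th-percentile threshold turns a half-degree mean shift into a roughly 30% jump in “extreme heat” exceedances:

```python
from statistics import NormalDist

# Assumed baseline distribution of daily maxima (illustrative only).
baseline = NormalDist(mu=25.0, sigma=3.0)
threshold = baseline.inv_cdf(0.90)  # fixed 90th-percentile "heatwave" threshold

# Shift the whole distribution up by half a degree.
shifted = NormalDist(mu=25.5, sigma=3.0)

p_base = 1 - baseline.cdf(threshold)    # 0.10 by construction
p_shift = 1 - shifted.cdf(threshold)    # ~0.13
increase = (p_shift - p_base) / p_base  # roughly 30% more "extreme heat" days
print(round(p_shift, 3), round(increase * 100, 1))
```

A 0.5 °C shift against a 3 °C spread is a small change in the weather, yet an exceedance-based index reports it as about a one-third increase in extremes.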

Steve O
March 6, 2019 3:52 pm

If the data adjustments create a warming trend that isn’t there, then models created today based on the adjusted data will run hot. Future temperature trends will disappoint the forecasters and will need to be adjusted downward.

Rick
March 6, 2019 3:55 pm

I have to commend S. Mosher for making a lengthy posting on this thread.
He explains the case for CAGW very well.
The problem that a skeptic like myself has with the whole CAGW theory boils down to my own personal observation of local weather conditions.
Where I live in Canada, we have long cold winters and short hot summers.
Like most farmers I have extensive records of historic rainfall, temperature, and general weather conditions going back over the 50-plus years I’ve farmed in this area, and before that my father took an interest in the weather.
The weather in 2018/2019 seems very similar to the weather we had in 1968/1969.
Where is the catastrophic warming for our area?
Why haven’t we experienced the man-made warming that should have brought us milder winters and hotter summers by now?

beng135
Reply to  Rick
March 7, 2019 6:35 am

Where is the catastrophic warming for our area?

There isn’t any. There never will be — not from CO2 (cyclical/natural climate changes have and will continue to occur). CO2 slightly warms the coldest areas & has little effect on the already warm areas. That’s a good/beneficial effect, and then there’s the additional fertilization effect on the biosphere’s plants.

Reply to  beng135
March 7, 2019 11:46 am

If by ‘slightly’ you mean less than a measurable amount, well, maybe. I have never found, in a decade-plus of searching, ANY warming that could not be completely explained by factors other than CO2.

March 7, 2019 6:55 pm

“The Bureau has also variously claimed that they need to cool the past at Rutherglen to make the temperature trend more consistent with trends at neighbouring locations.”

That may be the most unscientific pronouncement I’ve seen in years. So much wrong in such a small number of words.

First: raw data is not changed to be “more consistent with trends at neighboring stations.” One doesn’t know that the neighboring stations are any more trustworthy than the station whose data is being changed.

Second: scientists have to “Know when to hold ’em, know when to fold ’em.” I’ve seen it stated many times that the data “must be adjusted” so that the statistics they want and need for their purposes can be calculated. That is not scientific methodology; that is fabrication.

If a station has a move, or a sensor change causes a difference in readings, then that station record should be terminated, and a new one begun. One does the science with the data one has, not the data one wishes one had.

Bill in Oz
March 8, 2019 4:53 pm

The Bureau of Misinformation has a facebook page.

I put a short post up about BOM’s ACORN 2 with a link to this article on WUWT.

It never appeared.

BOM is not interested in hearing criticisms.

From which I conclude it’s time to shut it down.