Robots Recreating Past Temperatures Are Best to Avoid Australian Data

Reposted from Jennifer Marohasy’s Blog


May 17, 2019 By jennifer

At an artificial intelligence (AI) conference in New York recently, Sean Gourley explained Wiener's Law: automation will routinely tidy up ordinary messes but will occasionally create an extraordinary one, a mess that so mimics what could have been that the line between what is real and what is fake becomes impossible to decipher, even by the experts.

AI research over the last couple of years at the University of Tasmania could have been a check on the existing mess with historical temperature reconstructions, reconstructions that suggest every year is hotter than the last the world over. Except that Jaco Vlok began with the Australian Bureau of Meteorology's temperature datasets without first undertaking adequate quality assurance (QA).

Remember the infamous Climategate emails, and in particular the ‘Harry read me files’? Harry, working at the Climate Research Unit (CRU) at the University of East Anglia, wrote:

Getting seriously fed up with the state of the Australian data. so many new stations have been introduced, so many false references … so many changes that aren’t documented. Every time a cloud forms I’m presented with a bewildering selection of similar-sounding sites, some with references, some with WMO codes, and some with both. And if I look up the station metadata with one of the local references, chances are the WMO code will be wrong (another station will have it) and the latitude/longitude will be wrong too.

For years, the Australian Bureau of Meteorology has been capitalizing on this mess, which by its very nature throws up 'discontinuities' that can subsequently be 'homogenized' … so Blair Trewin is obliged to apply algorithms to ensure every reconstruction shows steadily rising temperatures in accordance with theory.

As Christopher Booker explained some years ago:

What is tragically evident from the Harry Read Me file is the picture it gives of the CRU scientists hopelessly at sea with the complex computer programmes they had devised to contort their data in the approved direction, more than once expressing their own desperation at how difficult it was to get the desired results.

In short, Phil Jones at the Climatic Research Unit in the UK, Gavin Schmidt at NASA GISS in New York, and even David Jones at the Australian Bureau in Melbourne have overseen the reworking of climate data until it fits the theory of catastrophic anthropogenic global warming (CAGW).

They have, in fact, become the masters of Wiener’s Law, without actually knowing the first thing about AI.

They have overseen the use of algorithms, independently of the checks and balances routinely applied in the mainstream AI community, to recreate past temperatures. In the process, the Medieval Warm Period (MWP) and the temperature extremes of the late 1930s, so evident in the raw data for both Australia and the US, have been removed from our historical temperature records. Thus we have the Paris Accord, and a federal election in Australia where both candidates for Prime Minister are committed to saving the environment from rising temperatures even if it means ruining the economy.

The history of science suggests that disproving a failed paradigm is always more difficult than replacing it, and so I have thought that beginning afresh with the latest AI techniques has merit. But this work is only likely to succeed if the Australian raw temperature database, known as ADAM, is reworked from the beginning. Otherwise artificial warming from both the Urban Heat Island (UHI) effect and the Bureau's new electronic probes in Automatic Weather Stations (AWS), which record hotter for the same weather, will keep creating hockey sticks as inescapably as Groundhog Day.

While artificial intelligence, and in particular artificial neural networks (ANNs), is now considered a mature technology used for a variety of tasks that require pattern recognition, decision making and forecasting, its capacity is denied by mainstream climate scientists. One reason is that leading climate scientists claim the natural climate cycles have been so perturbed by carbon dioxide that the patterns no longer persist. This is of course little more than a hypothesis, which can be tested using ANNs as a research tool.

It has been my experience that the raw measurements of any variable associated with weather and climate, when arranged chronologically, show a pattern of recurring cycles.

These oscillations may not be symmetrical, but they will tend to channel between an upper and a lower boundary, over and over again. Indeed, they can be decomposed into a few distinct sine waves of varying phase, amplitude and periodicity. It could be the case that they represent actual physical phenomena, which drive continuous climate change.

If this is the case, it may be possible to forecast the climate, including temperature, wind speed and direction, and even rainfall, by understanding its component parts. As long as the relationships embedded in the complex oscillation continue into the future, a skilful weather and climate forecast is theoretically mathematically possible using ANNs – despite chaos theory.
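
To make the decomposition idea concrete, here is a minimal sketch in Python under assumptions of my own: a synthetic monthly series (an annual cycle, a slower oscillation and noise) stands in for real station data, the dominant sine components are picked out with a Fourier transform, and those components are extended forward as a naive forecast. It illustrates the principle only, not the method used in the report discussed below.

import numpy as np

rng = np.random.default_rng(0)
n_months = 600                                 # 50 years of monthly values
t = np.arange(n_months)

# Synthetic series: annual cycle + a slower oscillation + noise (invented data)
series = (10.0
          + 8.0 * np.sin(2 * np.pi * t / 12)
          + 1.5 * np.sin(2 * np.pi * t / 223)  # roughly an 18.6-year cycle
          + rng.normal(0, 1.0, n_months))

# Decompose: keep only the k strongest frequency components of the series
k = 3
spectrum = np.fft.rfft(series - series.mean())
keep = np.argsort(np.abs(spectrum))[-k:]       # indices of the dominant components
freqs = np.fft.rfftfreq(n_months)              # cycles per month

def cycles(times):
    """Rebuild the kept sine components so they can be evaluated at any time."""
    out = np.full_like(times, series.mean(), dtype=float)
    for i in keep:
        amp = 2 * np.abs(spectrum[i]) / n_months
        out += amp * np.cos(2 * np.pi * freqs[i] * times + np.angle(spectrum[i]))
    return out

future = np.arange(n_months, n_months + 120)   # extrapolate the cycles 10 years ahead
print("fit RMSE:", round(float(np.sqrt(np.mean((cycles(t) - series) ** 2))), 2))
print("first forecast values:", np.round(cycles(future)[:3], 2))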

Skilful weather and climate forecasts using ANNs represent a new application for an existing technology. Indeed, if only a fraction of the resources spent applying this technology to mining social media data for advertising could be diverted to the goal of better climate forecasting, I'm sure major advances would be made very quickly. But in the case of Australia, the databases will first need to be reworked to restore some integrity.

In particular, every time there is a significant equipment change (for example, a change from a mercury thermometer to an electronic probe in an automatic weather station), that temperature series needs to be given a new ID. In this way the ANN has some hope of separating the real patterns of climate change from the artificial warming embedded with the new equipment … or the growth of a city.
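
Here is a minimal sketch of that bookkeeping, with the column names, station number and change dates all invented for illustration: one station's record is split into separately identified series wherever a documented equipment change occurs, so that a downstream model never treats the segments as one continuous instrument.

import pandas as pd

# Hypothetical observations for one station (values and dates are made up)
obs = pd.DataFrame({
    "date": pd.date_range("1990-01-01", periods=8, freq="YS"),
    "tmax": [31.2, 30.8, 31.5, 31.1, 31.9, 32.4, 32.1, 32.6],
})
obs["station_id"] = "070074"                       # hypothetical Bureau-style station number

# Documented changes, e.g. a mercury thermometer replaced by an electronic probe
equipment_changes = pd.to_datetime(["1994-01-01", "1997-01-01"])

# Assign a new series ID for each segment between documented changes
segment = obs["date"].apply(lambda d: (d >= equipment_changes).sum())
obs["series_id"] = obs["station_id"] + "_" + segment.astype(str)

print(obs[["date", "series_id", "tmax"]])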

Innovation, while usually technological, often has real political implications. For example, with the invention of the printing press in the 1430s, there was suddenly an efficient way of replicating knowledge; it became harder to control the information available to the masses.

Since the printing press, there have been many other inventions that have dramatically improved our quality of life including the invention of the steam engine in 1712, the telephone in 1876, penicillin in 1928 and personal computing as recently as the 1970s.  Today more people are living longer, healthier and more connected lives thanks to these and other innovations.  But when we consider the history of any single invention we find that it rarely emerged easily: there was initially confusion, followed by resistance.

The history of innovation (and science) would suggest that only when there is opportunity for competition do new and superior technologies take hold. Of course, this does not bode well for the adoption of AI for weather and climate forecasting by meteorological agencies, because they are government-funded monopolies. Furthermore, they are wedded to general circulation modelling, a completely different technique based on simulation and on next year being hotter than the last.

To be clear, there is the added complication that simulation modelling is integral to demonstrating anthropogenic global warming, while ANNs rely exclusively on assumptions about the continued existence of natural climate cycles. To reiterate, it has been said that because elevated levels of carbon dioxide have perturbed weather systems, the climate is on a new trajectory and ANNs will not work into the future. Conversely, if ANNs can produce skilful climate forecasts, then arguably anthropogenic climate change is not as big an issue as some claim. Clearly, as with the printing press, there are political consequences that would follow the widespread adoption of AI in climate science, for historical temperature reconstructions and also for weather and climate forecasting. I'm hoping this could begin with more funding for the important work of Jaco Vlok, but perhaps not at the University of Tasmania or with Australian temperature data.

The new report by Jaco Vlok ‘Temperature Reconstruction Methods’ can be downloaded here, and my explanation of its importance and limitations ‘New Methods for Remodelling Historical Temperatures: Admirable Beginnings Using AI’ can be downloaded here.

The feature image (at the very top) shows Jaco Vlok (left), then Jennifer Marohasy, John Abbot and JC Olivier.

**********


Figure 50 from the new report by Jaco Vlok, showing monthly mean maximum temperatures from the 71 locations used to recreate the temperature history at Deniliquin.


And here is Jennifer Marohasy’s report explaining it all in more detail.

Robert Kernodle
May 17, 2019 2:17 pm

The phrase "upscale garbage compactor" comes to mind when thinking of models associated with climate projections or reconstructions.

Editor
Reply to  Robert Kernodle
May 17, 2019 4:09 pm

Robert, for the past week or so, every time I’ve read the initials GCM, I’ve read them as GPM for General Propaganda Model.

Regards,
Bob

Greg
Reply to  Bob Tisdale
May 18, 2019 12:56 am

AI will not be a solution until there is proper auditing and accountability of the people screwing with the data. If those in charge are corrupted by political bias or "good intentions" to get a particular result, they will just tune the AI or selectively remove unfavourable results and only report the ones which fit the program.

Worse, any results will then be presented as beyond dispute because they were the result of “objective” computers and therefore free from human bias.

So far the BoM is still refusing to publish its data manipulation methods for validation. Scientifically that makes the adjusted record worthless, but this gets bypassed and they are allowed to continue.

ATheoK
Reply to  Robert Kernodle
May 17, 2019 5:10 pm

Upscale?
I have always been puzzled by the rationale of garbage compactors that turn fifty pounds of garbage into fifty pounds of garbage.

If one removes the trash normally, the bag might weigh fifteen to twenty pounds.
Repeatedly compressing the garbage eliminates the nice light air, and one reaches weights not normally reached in home garbage pails.

Greg
Reply to  ATheoK
May 18, 2019 12:48 am

The object of compacting is to reduce volume, not to reduce mass. Duh.

MarkW
Reply to  ATheoK
May 18, 2019 3:56 pm

A lot of people pay for trash pickup based on how many cans they have out at the street.

Kurt Linton
May 17, 2019 2:36 pm

Ha Ha! CO2 broke physics! It really CAN do anything.

TW2019
May 17, 2019 2:36 pm

What is an ANN? All other acronyms are defined at first use, but not that one.

WarrenTR
Reply to  TW2019
May 17, 2019 3:06 pm

Artificial Neural Network, a type of black-box model.

Michael Penny
Reply to  TW2019
May 17, 2019 3:17 pm

Artificial Neural Network

Richard M
Reply to  TW2019
May 17, 2019 3:17 pm

I assume it meant Artificial Neural Network.

Frederick Michael
Reply to  TW2019
May 17, 2019 3:29 pm
jtom
Reply to  TW2019
May 17, 2019 3:47 pm

Artificial Neural Networks

Reply to  TW2019
May 17, 2019 3:48 pm

Artificial neural networks. There’s a wikipedia page on it. Not sure if the subject has been messed up yet.

MarkW
Reply to  TW2019
May 18, 2019 3:57 pm

Having to wait an hour to find out if anyone else has answered a question doesn’t seem to be working.

Mr.
May 17, 2019 2:39 pm

It all comes down to the willingness of elected politicians to question, challenge and confront career bureaucrats and academics.
Apart from President Donald Trump, who else among the ranks of western political "leaders" do we see holding bureaucrats and academics to account?

Wes C.
Reply to  Mr.
May 18, 2019 12:00 pm

Certainly not here in Canada … they are about to vote on a motion set by our Climate Barbie to declare a climate emergency. This is obviously a ploy to lock down emission commitments before the coming election, ensuring any challenge by the new government would be political suicide. Climate change doesn't get any more political than this!!

thingadonta
May 17, 2019 3:05 pm

I had a computer science friend who believed in aliens- he doesn’t get out much.

Ian Bryce
May 17, 2019 3:28 pm

Deniliquin, like many inland towns in Australia, shows no global warming over a 100 year period.

Mr.
Reply to  Ian Bryce
May 17, 2019 4:17 pm

Oh I dunno.
Things get a bit heated there every year when the Denny Ute Muster is on. 🙂

J Mac
May 17, 2019 3:30 pm

When you cite ‘ANNs’, are you referring to Artificial Neural Networks?

Loydo
Reply to  J Mac
May 17, 2019 4:21 pm

There is only one acronym you need to know: IPA, Marohasy's paymaster, owned by Australia's wealthiest mining tycoon, Gina Rinehart.
https://independentaustralia.net/politics/politics-display/how-gina-rinehart-bought-the-ipa,11749

Paid disinformers.

Schitzree
Reply to  Loydo
May 17, 2019 6:02 pm

You know, Loydo, if you don’t have anything intelligent to add to this discussion (and you never do), maybe you should refrain from posting.

Or you could just keep thread jacking and posting Fake News for your paymasters. Just don’t expect us to respect you for it.

~¿~

clipe
Reply to  Loydo
May 17, 2019 6:21 pm

desmogblog disinformation Loydo

One of the most straightforward climate change storylines is the link between global warming and coral reefs such as the Great Barrier Reef.

How’s that working out?

Editor’s note: An earlier version of this article implied that Gina Rinehart has provided direct financial support to Andrew Bolt. There is no evidence to support this claim. This has been corrected.

https://theconversation.com/drowning-out-the-truth-about-the-great-barrier-reef-2644

LdB
Reply to  Loydo
May 17, 2019 7:41 pm

Loydo must be a paid disinformer … apparently that is all you have to do make a claim and it’s true.

lee
Reply to  Loydo
May 17, 2019 8:23 pm

So different from the left-leaning Lowy Institute, which found, surprise surprise, that most Australians are fearful of climate change. Of course they never released the questions asked. 😉

Bryan A
Reply to  Loydo
May 17, 2019 9:31 pm

Sounds like Loydo suffers from a strong case of Psychological Projection

MarkW
Reply to  Bryan A
May 18, 2019 3:59 pm

When you only have one skill, make the most of it.

J Mac
Reply to  Loydo
May 17, 2019 11:40 pm

Still muddying the waters by continuing your specious fulminations, eh Loydo? You would be better named ‘Worm Tongue’…

MarkW
Reply to  Loydo
May 18, 2019 3:58 pm

Ah yes, the standard left wing line that anyone who disagrees with me does so only because he’s paid to.

I guess it’s easier than actually trying to refute their arguments.

Gwan
May 17, 2019 3:32 pm

When is Stokes coming to try and defend the manipulation of the Australian climate data?
Anyone with half a brain who changed thermometers would run the old thermometer alongside the newer one for at least 3 years.
This was required but it never happened. Why?
The temperature record around the world has been inflated by these practices, which are either deliberate or incompetent.
The high temperatures in the 30s and 40s have been reduced and the Medieval Warm Period has been downgraded so that the present warming can be touted as the warmest EVER.
Inconvenient facts become old wives' tales; for example, the Vikings farmed in Greenland 800 years ago.
How about some truthful answers, Stokes?

Nick Stokes
Reply to  Gwan
May 17, 2019 3:41 pm

“When is Stokes coming to try and defend the manipulation of the Australian climate data?”
I gave my account of Australian temperature data in a WUWT article here. You can track BoM data online from the time of measurement (within 30 mins) to its final appearance in the GHCN unadjusted file. It isn't manipulated.

And I have shown, endlessly, with my program TempLS, which I run every month, that you can use the unadjusted data to get very similar results to adjusted. I think the adjustments are something any scientist has to do, but they don’t make much difference overall.

ps No-one disputes that the Vikings farmed in Greenland.

Macha
Reply to  Nick Stokes
May 17, 2019 4:38 pm

Nick, then why do so many people find crappy Data???

beng135
Reply to  Macha
May 19, 2019 6:10 am

Supposedly it all balances out by some extraordinarily good data.
/sarc

Mark - Helsinki
Reply to  Nick Stokes
May 17, 2019 4:52 pm

The Oz temp record from BOM is junk, Stokes, and you know it's junk; not admitting as much makes you worse than a liar.

Reply to  Nick Stokes
May 17, 2019 10:34 pm

Chiefio (EM Smith) has been analysing the whole GHCN database and has just shown that instruments do make a difference: see https://chiefio.wordpress.com/2019/05/16/what-difference-does-instrument-selection-make-to-anomaly-processed-temperature-data/
It is logical. An electronic instrument can show a pulse of 1 sec duration. The BOM is using resistance thermometers which have no correction for impulses, such as a jet going past a weather site at an airport. The old mercury-in-glass thermometers take some time to respond. I have read that the WMO requires that signals be averaged over 3 minutes and that signals be checked to eliminate short-term electronic surges. The BOM does not follow best practice.

The other thing with the BOM is that they have dropped hundreds of stations in rural locations and have increased the stations at airports, many of which, such as those in SE Qld (Gold Coast, Brisbane, Sunshine Coast and Hervey Bay), are located close to the ocean. These sites are treeless; have large expanses of black tarmac which absorb heat; have concrete buildings, footpaths, roads etc. which reflect heat; and have poorly located instruments subject to changing turbulence. The BOM is a disgrace.

Jennifer Marohasy
May 17, 2019 3:33 pm

Thanks so much for re-posting. ANN stands for Artificial Neural Network … a form of Artificial Intelligence (AI). Apologies for omitting this in the original post.

ANNs are at the cutting edge of AI technology, with new network configurations and learning algorithms continually being developed. In 2012, when John Abbot and I began using ANNs for monthly rainfall forecasting, we chose a time delay neural network (TDNN), which was considered state-of-the-art at that time. The TDNN used a network of perceptrons whose connection weights were trained with backpropagation. More recently we have been using General Regression Neural Networks (GRNNs), which have no backpropagation component.
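
For readers wondering what a GRNN does in practice, here is a minimal sketch with made-up training data and a single smoothing parameter. There is no backpropagation step: each prediction is simply a distance-weighted average of the stored training targets, in the spirit of Nadaraya-Watson kernel regression. It illustrates the general technique only, not our actual forecasting code.

import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=1.0):
    """Predict targets for x_query as a kernel-weighted average of training targets."""
    # Squared Euclidean distances between every query point and every training point
    d2 = ((x_query[:, None, :] - x_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))          # pattern-layer activations
    return (w @ y_train) / w.sum(axis=1)          # weighted average of stored targets

# Toy example: predict monthly rainfall from two hypothetical climate indices
rng = np.random.default_rng(1)
x_train = rng.normal(size=(200, 2))
y_train = 50 + 20 * np.sin(x_train[:, 0]) + 5 * x_train[:, 1]
x_query = np.array([[0.5, -0.2], [1.5, 0.8]])

print(grnn_predict(x_train, y_train, x_query, sigma=0.5).round(1))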

Jaco Vlok has been experimenting with the application of ANNs for temperature reconstruction. I would like him to have the resources to do this with data from the USA that is not UHI contaminated. I think this is the future.

ironargonaut
Reply to  Jennifer Marohasy
May 18, 2019 10:42 am

If a new factor comes into play that is not input into the data, does that destroy the predictive ability? My guess is that if this doesn't accurately predict, then it will be used as "proof" of man-made warming with no attempt to find the factor. It will be assumed it is CO2.

Tom Vojtek
May 17, 2019 3:42 pm

What is an ANN?

Looks like ANN is artificial neural network.

Tom Vojtek
Reply to  Tom Vojtek
May 17, 2019 3:44 pm

Sorry, didn’t see the comment made at 3:33.

old construction worker
May 17, 2019 3:54 pm

AI and Temps: GIGO

commieBob
May 17, 2019 5:00 pm

As long as the relationships embedded in the complex oscillation continue into the future, a skilful weather and climate forecast is theoretically mathematically possible using ANNs – despite chaos theory.

Well, if we have a reliably repeating pattern, we can predict what will happen. We don’t have reliable patterns. We have pseudo periodic functions that lead us into serious error.

We’ve been throwing billions (by now) of dollars at ever more expensive computers and ever more sophisticated programs and accurate weather prediction is as elusive as it ever was. ANN isn’t getting around chaos.

DocSiders
May 17, 2019 5:13 pm

Is anyone executing any serious audits on all the adjusted temperature data?

Tony Heller shows graphs of adjusted 20th century temperatures…then and now…from NOAA. NOAA took the temperature decline into the 1970’s and turned it into a barely downward sloping trend. So, we are supposed to believe that the Ice Age Scare during that time was motivated by a barely detectable trend?

They nearly erased the world wide 1930’s heatwaves. Why and how are they getting away with this?

Jennifer Marohasy
May 17, 2019 5:15 pm

commie Bob,

I disagree. All the money is being thrown at General Circulation Models (GCMs) which are the tool underpinning Catastrophic Anthropogenic Global Warming Theory (CAGWT).

Virtually no money (beyond the generosity of the B. Macfie Family Foundation) has been spent on trying ANNs for the development of a new theory. In particular, this theory could be based on the recurrent patterns so evident in temperature and rainfall data at daily, monthly, 18.6-year and 21,000-year scales, and so on.

It is when there is phase alignment of the different cycles that we see the really big events … as we see the biggest sea tides at particular times of the year.

Some of the work I’ve done, and defended with respect to temperature proxies using ANNs is here: https://jennifermarohasy.com/temperatures/response-to-criticism-of-abbot-marohasy-2017-georesj/
Most of my work has been with monthly rainfall forecasting, publications listed here: http://climatelab.com.au/publications/

We've hardly begun to consider the application of ANNs for forecasting weather and climate. Jaco Vlok's work is fundamental to this … though he has not got off to a good start because of QA issues.

commieBob
Reply to  Jennifer Marohasy
May 17, 2019 7:11 pm

Faced with a signal, the temptation is to try to understand it as the sum of a bunch of sine waves. There are other possibilities. Here’s an example:

In summary, we have introduced a stochastic threshold model, which exhibits a pseudo resonance quasi-periodicity as a function of the noise intensity. The model is not forced at that frequency, and thus offers an explanation of apparent regularity in observed time series completely different from that of the linear trend TAR model and the stochastic resonance model. link

So, it may be all about noise. The trouble is that you can’t predict the instantaneous value of a noisy signal.
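
As a crude sketch of that idea (and emphatically not the model from the linked paper), here is a noise-driven threshold-and-reset process: there is no periodic forcing at all, yet "events" recur with an apparent regularity that depends only on the noise intensity. All parameter values are arbitrary.

import numpy as np

def threshold_model(noise_sd, n_steps=100_000, drift=0.001, threshold=1.0, seed=0):
    rng = np.random.default_rng(seed)
    shocks = drift + rng.normal(0.0, noise_sd, n_steps)  # slow drift plus random shocks
    x, events = 0.0, []
    for step, shock in enumerate(shocks):
        x += shock
        if x >= threshold:                     # threshold crossed: record an event, reset
            events.append(step)
            x = 0.0
    intervals = np.diff(events)
    return intervals.mean(), intervals.std() / intervals.mean()

for sd in (0.005, 0.02, 0.08):
    mean_gap, cv = threshold_model(sd)
    print(f"noise sd {sd:>5}: mean interval {mean_gap:7.1f} steps, "
          f"coefficient of variation {cv:.2f}")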

May 17, 2019 5:19 pm

There have always been people, and organizations staffed by such people, who like things just the way they are. This is driven by both cultural and commercial considerations.

Sadly, only after a very drastic event, such as the collapse of a civilisation or a defeat in a war, can real change take place.

A good example is the First World War. A combination of new equipment such as the machine gun, plus a very old way of fighting wars, meant that what had started off in 1914 as a mobile war soon bogged down into a trench war for almost three years. Then the invention of the tank, initially as a sort of mobile pill box, changed it back to a mobile war.

But at the end, in 1918, the Allies were convinced that defence was the key factor, not mobility, so the Allies went back to the old way of soldiering.

While the British and the French did retain the tank, they did not really develop it. The thinking among the Allies was: we won, so the way we did things must be right.

But the Germans, having lost, thought that there had to be a better way. They reasoned that cavalry was the way, but as horses were no longer practical on a battlefield dominated by the machine gun, they would use the tank as they once used horses.

Hence Blitzkrieg, "lightning war", was born.

The German General Staff embraced it, but the Allies by 1939 had generals who had been junior officers in WW1 and still thought of tanks as just a minor factor, to help the troops walking forward from the trenches.

The point I am attempting to make is that things change, but not always for the better.

We may have to do it the hard way, through a collapse of the economy, just as happened in the 1930s, with all of the misery, and sadly the possibility of another war.

I doubt I will be around to see it, but I fear for our children and grandchildren.

MJE VK5ELL

Writing Observer
Reply to  Michael
May 17, 2019 10:25 pm

Hand waggle…

What the Germans did was NOT return to the concept of cavalry as practiced from (approximately) the 15th through 19th centuries.

If you want a parallel of armored vehicles to mounted soldiers, you need to go back to the heavily armored mounted cavalry – starting with the ancient Persian dehgans and Roman cataphracts up through about the 14th century. (You can, though, compare later cavalry, and the auxiliary mounted archers of antiquity, to the light armored cars used in Blitzkrieg tactics. Harassers of ground troops along the front, pursuit of routed forces, flankers, scouts, etc. – not the “punch through” forces.)

Rod Evans
Reply to  Writing Observer
May 18, 2019 12:44 am

Whilst I accept the premise that change often has to be forced on society by disruptive events (wars, prolonged droughts, ongoing volcanic activity, etc.), I suspect these are simply what gets recorded by history and so becomes recognised as significant.
The more subtle changes that impact society in the long term often get forgotten. An example from the 19th century here in the UK saved many lives and set in train a revolution in health improvement never yet repeated: Joseph Bazalgette and the civil engineering he oversaw put sewers in London; he saved lives, reduced illness, and influenced and set the standard used worldwide.
On the outcomes of change brought by the First World War, I would add female emancipation to the list. Also, it would be hard to argue against aircraft development in that conflict being the most significant change in military engagement thereafter. The evolution of the aeroplane was even more dramatic than that of the tank, which, though it broke the deadlock, was still just heavy ground-based warfare, traditional might-is-right thinking.

MarkW
Reply to  Michael
May 18, 2019 4:01 pm

Blitz Grieg

MarkW
Reply to  Michael
May 18, 2019 4:03 pm

Blitzkrieg was also the total package, fighters, bombers, paratroopers, troop transports, AND tanks.

Brett Keane
Reply to  MarkW
May 19, 2019 11:11 am

MarkW et al: Tanks took care of the machine gun nests with signalling from the ground troops. British sight and sound ranging, with electronic help thanks to Rutherford and his teams etc., destroyed German artillery strength. The 12,000-strong RAF airplane presence was also well worked in with ground intelligence by then. Good coordination under French overall command was finally achieved.
The March 1918 defence, followed by the August breakthrough near Cambrai, fatally weakened German power, leading to their decision to seek an armistice. This was aided by home front breakdown as our long Royal Navy blockade continued to tighten. Growing American power helped make resistance seem hopeless to Ludendorff and Wilhelm. Brett Keane

Marcus
May 17, 2019 5:31 pm

“TRUMP ADMINISTRATION PROPOSES MAJOR CUTS IN FEDERAL SCIENCE PROGRAMS”

https://www.heartland.org/news-opinion/news/trump-administration-proposes-major-cuts-in-federal-science-programs?fbclid=IwAR0_gc51Hr_C_F7rv5zrfQbpQHfQqJ9llfRx-UMyl6-wTpdV_P9IHlb7tm0

“EPA’s budget would be reduced from $8.8 billion to $6.1 billion, with spending for science and technology cut by $440 million. The White House proposes cutting EPA’s Atmospheric Protection Program, which reports on greenhouse gases, by 90 percent, and reducing funding for the Office of Energy Efficiency and Renewable Energy by 70 percent.”

observa
May 17, 2019 5:34 pm

You just have to get rid of those pesky humans interfering with their overarching AI-

‘A James Cook University associate professor has resigned from her honorary position over the sacking of professor Peter Ridd, who was dismissed after he criticised the institution’s climate change science.
Sheilagh Cronin resigned from the unpaid role at the Townsville university in protest and said she was “ashamed” that she had not done so earlier.

A marine physicist who had worked at the university for 30 years, Professor Ridd was censured three times before being sacked last year. He challenged the dismissal in the Federal Court and on April 16 judge Salvatore Vasta found all 17 findings used by the university to justify the sacking were unlawful…….

Dr Cronin, an adjunct associate professor with the university’s Mount Isa Centre of Rural and Remote Health and a former president of the Rural Doctors Association of Australia, sent a letter to vice-chancellor Sandra Harding last week outlining her reasons for resigning.
“I am coming to the end of my professional career but my main reason for resigning is my disquiet over the dismissal of the respected physics professor … Peter Ridd,” Dr Cronin wrote. “I believe his treatment by yourself and your board is completely contrary to the philosophy of open discussion and debate that should be at the heart of every university. It saddens me that the reputation of JCU is being damaged by the injustice of Professor Ridd’s case.”’
(report in The Australian 18/5/19)

Brett Keane
Reply to  observa
May 19, 2019 10:41 am

Bless you Shelagh. Brett Keane

JD Vlok
May 17, 2019 5:53 pm

Thanks for re-posting and providing the link to my report. To summarize the use of artificial intelligence in temperature reconstruction: the artificial neural network (ANN) is trained on existing raw data from each weather station location. The trained ANN is then used to perform spatial interpolation, i.e. to estimate temperature values for locations where no measurements were taken. Using the same principle, the ANN can also estimate or infill missing values in existing temperature records.

I evaluated the estimation performance of the technique by withholding known data from the ANN, and then seeing how accurately those values could be estimated. The ANN performs slightly better than the benchmark I chose (inverse distance weighting).

The accuracy of the estimation is obviously limited by the raw data quality. So, to obtain a high-quality reconstruction (from which e.g. global long-term trends can be derived) incorrect raw measurements must be excluded (or corrected). Now the question becomes how to identify incorrect data, amidst non-climatic effects such as urban heat islands, switch-over from liquid-in-glass to automatic weather stations, and undocumented historical site moves. Much more work is required.
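
Here is a minimal sketch of that evaluation idea, with synthetic station data and an off-the-shelf scikit-learn MLP standing in for the ANN: train on location and month, withhold one station entirely, and compare the network's estimates there against a simple inverse distance weighting (IDW) benchmark. It illustrates the procedure only, not the implementation in the report.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n_stations, n_months = 30, 120
lat = rng.uniform(-38, -30, n_stations)
lon = rng.uniform(142, 150, n_stations)
station = np.repeat(np.arange(n_stations), n_months)   # station index for every record
month = np.tile(np.arange(n_months), n_stations)       # month index for every record

# Synthetic "raw" maxima: latitude gradient + seasonal cycle + noise (invented data)
tmax = (24 - 0.8 * (lat[station] + 34)
        + 8 * np.sin(2 * np.pi * month / 12)
        + rng.normal(0, 1.0, n_stations * n_months))

# Inputs: location plus a seasonal encoding of the month
X = np.column_stack([lat[station], lon[station],
                     np.sin(2 * np.pi * month / 12), np.cos(2 * np.pi * month / 12)])

withheld = 0                                           # station left out of training
train, test = station != withheld, station == withheld

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0))
ann.fit(X[train], tmax[train])
ann_rmse = np.sqrt(np.mean((ann.predict(X[test]) - tmax[test]) ** 2))

# IDW benchmark: weight every other station's same-month value by 1 / distance^2
d = np.sqrt((lat - lat[withheld]) ** 2 + (lon - lon[withheld]) ** 2)
d[withheld] = np.inf                                   # exclude the withheld station itself
w = 1.0 / d ** 2
monthly = tmax.reshape(n_stations, n_months)           # rows are stations, columns months
idw_estimate = (w @ monthly) / w.sum()
idw_rmse = np.sqrt(np.mean((idw_estimate - monthly[withheld]) ** 2))

print(f"withheld-station RMSE  ANN: {ann_rmse:.2f}   IDW: {idw_rmse:.2f}")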

Nick Stokes
Reply to  JD Vlok
May 18, 2019 12:14 am

I may be unique here in having read the report. And I think it explains many things well. It is basically a conventional homogenisation technique, or at least includes the part where values at a location are inferred from neighbors. What I don’t see is the essential role of ANN or AI. The techniques used predate those schemes, and in the case of Deniliquin, for example, it was shown that the interpolation using ANN was successful, but there is no comparison with an interpolation not using ANN. I don’t believe ANN confers an advantage.

I have developed a method based on LOESS. It is like the one here, and could be used with IDW, although I use exponential, which seems to match reality better (Hansen and Lebedeff, 1987). But I think either would be fine. LOESS means a locally weighted regression, in my case first order. This is better than simple averaging, as it compensates for any asymmetry in station location which might interact with trend in the data. I used it for global averaging, as described here, where it is very successful. It is slightly different in that it interpolates on an extra set of nodes rather than on stations, but can be used for both.

I would encourage JD Vlok to continue with his numerical analysis, but drop the ANN and AI. It is a dead end here.
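
Here is a minimal sketch of the locally weighted first-order regression idea, not TempLS itself, with a toy data set and an arbitrary exponential length scale: to estimate a value at a target point, fit a plane to nearby station values with distance-based weights, and read off the intercept at the target. The local fit compensates for asymmetry in station placement in a way that a plain weighted average does not.

import numpy as np

rng = np.random.default_rng(4)
n = 40
lat, lon = rng.uniform(-38, -30, n), rng.uniform(142, 150, n)
value = 24 - 0.8 * (lat + 34) + 0.1 * (lon - 146) + rng.normal(0, 0.3, n)  # invented data

def local_regression_estimate(target_lat, target_lon, scale=2.0):
    """Estimate the value at a target point by a distance-weighted first-order fit."""
    d = np.sqrt((lat - target_lat) ** 2 + (lon - target_lon) ** 2)
    w = np.exp(-d / scale)                       # exponential distance weighting
    # Weighted least squares of value ~ 1 + dlat + dlon, centred on the target
    X = np.column_stack([np.ones(n), lat - target_lat, lon - target_lon])
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(X * sw[:, None], value * sw, rcond=None)
    return coef[0]                               # the intercept is the estimate at the target

print(round(local_regression_estimate(-34.0, 146.0), 2))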

JD Vlok
Reply to  Nick Stokes
May 18, 2019 5:26 am

Regarding “there is no comparison with an interpolation not using ANN”: I compared the performance of inverse distance weighting (IDW) and the ANN technique, with Deniliquin results summarized in Table 4 (p. 104) and for other stations in the same region in Fig. 59 (p. 106). The ANN performs better when estimating temperature data at unsampled locations.

I used IDW as yardstick, and leave-one-out cross-validation (LOOCV) as performance measure to simplify possible future comparison with other spatial interpolation techniques.

One advantage of an AI temperature reconstruction approach is that it reduces the burden of determining the weights of neighboring stations, i.e. how much each neighboring station should contribute to estimating the temperature at a given location. The AI algorithm infers these weights indirectly from the training data. It seems that the ANN will typically provide estimates better than those obtained by an optimal IDW approach (see the IDW grid-search results in Fig. 57, p. 105).

The performance advantage of ANNs (at least in this case) is derived from the fact that they are data driven (see Jennifer Marohasy’s Appendix 2 in “New Methods for Remodelling Historical Temperatures” – the last link in the post).

This does not necessarily imply that ANNs or AI is the ultimate solution, but I think it is worthwhile to consider.
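
Here is a minimal sketch of the LOOCV yardstick applied to a grid search over the IDW power parameter, with synthetic station values; the procedure, not the data, is the point.

import numpy as np

rng = np.random.default_rng(3)
n = 25
lat, lon = rng.uniform(-38, -30, n), rng.uniform(142, 150, n)
value = 24 - 0.8 * (lat + 34) + rng.normal(0, 0.5, n)   # one month's mean tmax (invented)

def loocv_rmse(power):
    """Leave each station out in turn and estimate it from the rest by IDW."""
    errors = []
    for i in range(n):
        d = np.sqrt((lat - lat[i]) ** 2 + (lon - lon[i]) ** 2)
        d[i] = np.inf                            # the withheld station gets zero weight
        w = 1.0 / d ** power
        errors.append((w @ value) / w.sum() - value[i])
    return np.sqrt(np.mean(np.square(errors)))

# Grid search over the IDW power parameter, scored by LOOCV
for p in (1, 2, 3, 4):
    print(f"IDW power {p}: LOOCV RMSE {loocv_rmse(p):.3f}")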

Nick Stokes
Reply to  JD Vlok
May 18, 2019 5:58 am

“with Deniliquin results summarized in Table 4 (p. 104) and for other stations in the same region in Fig. 59 (p. 106)”
Apologies, I missed that.

Nick Stokes
Reply to  JD Vlok
May 19, 2019 7:44 am

A late thought: your modified technique seems to be simply using anomalies, which I think should always be done. And with modification, it seems that modified IDW and modified ANN perform almost exactly the same (and well). So I still can't see any reason to use ANN.

JD Vlok
Reply to  Nick Stokes
May 19, 2019 6:20 pm

Both modified techniques are designed to infill or estimate missing values in an existing data record. They use the difference in mean values between the record to be infilled and its neighbors. They perform approximately the same – as you have said.

On the other hand, the plain (or not modified) techniques are designed to perform spatial interpolation – to estimate temperature data at locations where there were no measurements taken. In this case, the ANN generally performs better than IDW.

Reply to  Nick Stokes
May 19, 2019 6:59 pm

” designed to infill or estimate missing values in an existing data record”
They may have been designed for that. But you can use the same idea for spatial infilling. I think what is happening here is
1. Modification is effectively using anomaly. It isn’t obvious the way you’ve written it, but instead of averaging the temperatures, you are averaging the discrepancy between the temperature and a local average. As with time anomalies, this has the merit of reducing data inhomogeneity, so biased sampling won’t matter much. It is always a good thing to do.
2. Unmodified, you suffer from bias in the sample. There is variation in whether you are including hot and cold places, which is reflected in the average. ANN helps reduce the bias in the sample, and so improves the result.
3. But you don’t need to do that. Taking anomalies is far simpler than ANN or AI, and as you have shown, works well. Since it makes the bias unimportant, ANN gives no further improvement.

The rule is simple. Whenever you average (or sum) any sampled set, make sure you first subtract any source of predictable variation. That then won’t be confounded with sample biases. That is what your modified does.
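
A minimal sketch of that rule, with two invented stations: subtract each station's own mean (its predictable part) before averaging, so a change in which stations report does not masquerade as a change in climate.

import numpy as np

hot_station  = np.array([30.0, 30.2, 29.9, 30.1, np.nan, np.nan])  # stops reporting
cool_station = np.array([20.0, 20.1, 19.9, 20.2, 20.0, 20.1])
data = np.vstack([hot_station, cool_station])

# Naive average of whatever stations report each month
raw_average = np.nanmean(data, axis=0)

# Subtract each station's own mean over the common period first, then average
climatology = np.nanmean(data[:, :4], axis=1, keepdims=True)
anomaly_average = np.nanmean(data - climatology, axis=0)

print("raw average    :", np.round(raw_average, 2))      # drops ~5 degrees when the hot station vanishes
print("anomaly average:", np.round(anomaly_average, 2))  # stays near zero throughout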

May 17, 2019 6:48 pm

Possibly I am being a bit hard on the lady, but is this a case of the "rats leaving the sinking ship"?

Expect a few more to resign, who wish to be considered "honest" academics.

MJE VK5ELL

John Pickens
May 17, 2019 9:54 pm

Please refer to the predicted climate catastrophe as CAGW, or Catastrophic Anthropogenic Global Warming. By bowing to the Orwellian changes to the nomenclature, you are helping these environmental Luddites.

John F. Hultquist
May 17, 2019 10:10 pm

Many posters above commented on the initially undefined ANN.
Yet Loydo, at 4:21, got a pass on IPA.

A look here will correct that: IPA

WR2
May 17, 2019 10:37 pm

Have there been any studies that quantify what percentage of the trend in these various temperature datasets is due to adjustments? My guess would be that compared with the raw data trend, it's probably on the order of 50%.

Carbon Based Lifeform
May 18, 2019 2:36 am

Surely the fundamental flaw is that the surface weather station network is not fit for purpose. It was created to measure local temp and weather, it is completely unsuitable to use it to determine global climate for a host of reasons, including site distribution, UHI, changes in equipment over time, etc.
Surely satellites are much more robust at measuring global changes.

Robert of Texas
May 18, 2019 8:36 am

If you want to actually take climate science seriously, you first have to abandon the idea that a TMAX and TMIN are the only valid measurements of temperature. You first need a way of describing a day’s variation in temperature kind of like a probability wave in quantum mechanics. So we need to define a “climate day-wave” (which would be a set of waves describing that days measurements). Humidity, rate of change, evaporation, wind speed and direction, and solar flux (measurements at several wavelengths) are all part of this. Abandon this nonsense of a Global Average Temperature which is just misleading.

Next you have to get quality data … not this junk we keep using that was never designed for climate science. Pristine sites set up to record the data you need for the research, and carefully managed metadata for any changes that are introduced. I personally do not think you need thousands of sites for studying climate change in the U.S.; a few hundred would suffice. You are not after local differences, which are usually just weather, but broad changes over time. I am not convinced that satellites can provide this data unless you get some stationary ones and resolve some of the issues of trying to infer data, so maybe a mixture of pristine land sites and satellite data.

You need to govern the data and protect the raw data, the processes used to adjust it, and the results from the manipulations of biased researchers. They can take a copy of the data and do what they want, but the source data must remain pristine and out of the hands of propagandists.

With this kind of effort in-hand, you can actually start to describe climate change. You still cannot separate natural from man-driven climate change, but at least you have a start on a meaningful description of change.

Or we can stick with the “Some places will get warmer and some cooler, some wetter and some dryer, but in general everyone is going to die…”


May 19, 2019 10:53 pm

Trying to change the world's weather sounds like a worthy goal until you get the bill for it. Meanwhile the climate itself will be having the last word. Recently it's been getting cooler and, according to Landscheidt, Sharp and Smith, we have already tipped into a mini ice age … See Paullitely.com for all the verifiable details.
