Reposted from Jennifer Marohasy’s Blog
August 12, 2021 By jennifer
I was at the Australian National University in October 2018, when the largest supercomputer in the Southern Hemisphere began running the simulations that have now been published as the IPCC’s Assessment Report No. 6 (AR6). It’s being touted as the most comprehensive climate change report ever. It is certainly based on a very complex ensemble of simulation models (CMIP6).
Many are frightened by the official analysis of the model’s results, which claims global warming is unprecedented in more than 2,000 years. Yet the same modelling claims the Earth has warmed by only fractions of a degree Celsius. Specifically, the claim is that we humans have caused 1.06 °C of the claimed 1.07 °C rise in temperatures since 1850, which is not very much. The real-world temperature trends that I have observed at Australian locations with long temperature records suggest a much greater rate of temperature rise since 1960, and cooling before that.
A little historical perspective shows that the IPCC is wrong to label the recent temperature changes ‘unprecedented’. They are not unusual in magnitude, direction or rate of change, which should diminish fears that recent climate change is somehow catastrophic.
To understand how climate has varied over much longer periods, over hundreds and thousands of years, various types of proxy records can be assembled, derived from the annual rings of long-lived tree species, corals and stalagmites. These records provide evidence for periods over the past several thousand years (the late Holocene) that were either colder than the present, such as the Little Ice Age (1309 to 1814), or experienced temperatures similar to the present, such as the Medieval Warm Period (985 to 1200). They show global temperatures have cycled within a range of up to 1.8 °C over the last thousand years.
Indeed, the empirical evidence, as published in the best peer-reviewed journals, would suggest that there is no reason to be concerned by a 1.5 °C rise in global temperatures over a period of one hundred years – that this is unusual in neither rate nor magnitude. That the latest IPCC report, Assessment Report 6, suggests catastrophe if we cannot contain warming to 1.5 °C is not in accordance with the empirical evidence; rather, it is a conclusion based entirely on simulation modelling that falsely assumes these models can accurately simulate ocean and atmospheric weather systems. There are better tools for generating weather and climate forecasts, specifically artificial neural networks (ANNs), which are a form of artificial intelligence.
Of course, there is nowhere on Earth where the average global temperature can be measured; it is very cold at the poles and rather warmer in the tropics. So, the average global temperature for each year since 1850 could never be a direct ‘observation’, but rather, at best, a statistic calculated from measurements taken at thousands of weather stations across the world. And can it really be accurately calculated to some fractions of a degree Celsius?
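To make the point concrete, here is a minimal sketch of how a ‘global average’ of this kind is typically formed: it is an area-weighted statistic over station anomalies, with each record weighted by the cosine of its latitude because grid cells shrink toward the poles. The station values below are invented for illustration; real datasets such as HadCRUT5 use thousands of stations and gridded averaging.

```python
import math

# Hypothetical station temperature anomalies (degrees C) with latitudes.
# These values are made up for illustration only.
stations = [
    {"lat": -12.4, "anomaly": 0.9},   # tropics (e.g. Darwin)
    {"lat": -33.9, "anomaly": 0.6},   # mid-latitude (e.g. Sydney)
    {"lat": 51.5,  "anomaly": 1.2},   # mid-latitude (e.g. London)
    {"lat": 64.1,  "anomaly": 1.8},   # high latitude (e.g. Reykjavik)
]

# Weight each record by cos(latitude), since the area represented by a
# degree of longitude shrinks toward the poles.
weights = [math.cos(math.radians(s["lat"])) for s in stations]
weighted = sum(w * s["anomaly"] for w, s in zip(weights, stations))
global_mean = weighted / sum(weights)

print(f"Area-weighted mean anomaly: {global_mean:.2f} C")
```

The result is a statistic, not a measurement: change the station list, the weighting scheme, or the baseline, and the ‘global temperature’ changes with it.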
AR6, which runs to over 4,000 pages, claims to have accurately quantified everything, including confidence ranges for the ‘observation’ of 1.07 °C. Yet I know from scrutinising the datasets used by the IPCC that the single temperature series inputted for individual locations incorporate ‘adjustments’ by national meteorological services that are rather large. To be clear, even before the maximum and minimum temperature values from individual weather stations are incorporated into HadCRUT5, they are adjusted. A key supporting technical paper (e.g. Brohan et al. 2006, Journal of Geophysical Research) states that: ‘HadCRUT only archives single temperature series for particular location and any adjustments made by national meteorological services are unknown.’ So the idea that the simulations are based on ‘observations’ with real, meaningful ‘uncertainty limits’ is just not true.
According to the Australian Bureau of Meteorology (BOM), which is one of the national meteorological services providing data for HadCRUT, the official remodelled temperatures are an improvement on the actual measurements. They may better accord with IPCC policy, but the result is a revisionist approach to our climate history. In general, the adjustments strip the natural cycles from the datasets of actual observations, replacing them with linear trends that accord with IPCC policy.
The BOM’s Blair Trewin, one of the 85 ‘drafting authors’ of the Summary for Policy Makers, in 2018 remodelled and published new values for each of the 112 weather stations used to calculate an Australian average over the period 1910 to 2016, so that the overall rate of warming increased by 23%. Specifically, the linear trend for Australian temperatures had been 1 °C per century, as published in 2012 in the Australian Climate Observations Reference Network − Surface Air Temperature (ACORN-SAT) database version 1. Then, just in time for inclusion in this new IPCC report released on Tuesday, all the daily values from each of the 112 weather stations were remodelled and the rate of warming increased to 1.23 °C per century in ACORN-SAT version 2, published in 2018. This broadly accords with the change between the 2014 IPCC report (Assessment Report No. 5), which gave warming of 0.85 °C since 1850, and this new report’s figure of 1.07 °C, an increase of about 26%.
Remodelling of the data sets by the national meteorological services generally involves cooling the past, by way of dropping down the values in the first part of the twentieth century. This is easy enough to check for the Australian data because it is possible to download the maximum and minimum values as recorded at the 112 Australian weather stations for each day from the BOM website, and then compare these values with the values as listed in ACORN-SAT version 1 (that I archived some years ago) and ACORN-SAT version 2 that is available at the BOM website. For example, the maximum temperature as recorded at the Darwin weather station was 34.2 °C on 1 January 1910 (this is the very first value listed). This value was changed by Blair Trewin in the creation of ACORN-SAT version 1 to 33.8 °C. He ‘cooled’ this historical observation by a further 1.4 °C in the creation of ACORN-SAT version 2, just in time for inclusion in the values used to calculate a global average temperature for AR6. When an historic value is cooled relative to present temperatures, then an artificial warming trend is created.
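The mechanism, that cooling the early part of a record steepens the fitted linear trend, can be illustrated with synthetic numbers. The series below is invented and is not actual ACORN-SAT or BOM data; the 0.5 °C adjustment and the 1971 cut-off are arbitrary choices for the sketch.

```python
# Synthetic illustration (not actual ACORN-SAT data): cooling the
# early part of a record steepens the fitted linear trend.

def linear_trend(years, temps):
    """Ordinary least-squares slope, in degrees C per year."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = list(range(1910, 2020, 10))          # 1910, 1920, ..., 2010
raw = [34.2, 34.0, 34.1, 33.9, 34.0, 34.2,
       34.1, 34.3, 34.4, 34.5, 34.6]         # invented January maxima

# "Adjust" by cooling every observation before 1971 by 0.5 degrees C.
adjusted = [t - 0.5 if y < 1971 else t for y, t in zip(years, raw)]

print(f"raw trend:      {linear_trend(years, raw) * 100:.2f} C/century")
print(f"adjusted trend: {linear_trend(years, adjusted) * 100:.2f} C/century")
```

Whatever the merits of any particular adjustment, the arithmetic is unavoidable: lowering early values relative to recent ones raises the slope.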
I am from northern Australia, I was born in Darwin, so I take a particular interest in its temperature series. I was born there on 26th August 1963. A maximum temperature of 29.6 °C was recorded at the Darwin airport on that day from a mercury thermometer in a Stevenson screen, which was an official recording station using standard equipment. This is also the temperature value shown in ACORN-SAT version 1. This value was dropped down/cooled by 0.8 °C in the creation of ACORN-SAT version 2, by Blair Trewin in 2018. So, the temperature series incorporated into HadCRUT5, which is one of the global temperature datasets used in all the IPCC reports, shows the contrived value of 28.8 °C for 26th August 1963, yet on the day I was born a value of 29.6 °C was entered into the meteorological observations book for Darwin. In my view, changing the numbers in this way is plain wrong, and certainly not scientific.
The BOM justifies remodelling because of changes to the equipment used to record temperatures and because of the relocation of the weather stations, except that they change the values even when there have been no changes to the equipment or locations. In the case of Darwin, the weather station has been at the airport since February 1941, and an automatic weather station replaced the mercury thermometer on 1 October 1990. For the IPCC report (AR5) published in 2014, the BOM submitted the actual value of 29.6 °C as the maximum temperature for Darwin on 26th August 1963. Yet in November 2018, when the temperatures were submitted for inclusion in the modelling for this latest report (AR6), the contrived value of 28.8 °C was submitted.
The temperature series that are actual observations from weather stations at locations across Australia tend to show cooling to about 1960 and warming since then. This is particularly the case for inland locations in southeast Australia. For example, the actual observations from the weather stations with the longest records in New South Wales were plotted for the period to 1960, and then from 1960 to 2013, for a presentation that I gave to the Sydney Institute in 2014. I calculated an average cooling of 1.95 °C from the late 1800s to 1960, and an average warming of 2.48 °C from the 1960s to the present, as shown in Table 1. Yet this new United Nations IPCC report claims inevitable catastrophe should warming exceed 1.5 °C, a threshold that can be shown to have already been exceeded at many Australian locations.
This is consistent with the findings in my technical report as published in the international climate science journal Atmospheric Research (volume 166, pages 141-149) in 2015, which shows significant cooling in the maximum temperatures at the Cape Otway and Wilsons Promontory lighthouses, in southeast Australia, from 1921 to 1950. The cooling is more pronounced in temperature records from the farmlands of the Riverina, including at Rutherglen and Deniliquin. To repeat, while temperatures at the lighthouses show cooling from about 1880 to about 1950, they then show quite dramatic warming from at least 1960 to the present. In the Riverina, however, minimum temperatures continued to fall through the 1970s and 1980s because of the expansion of the irrigation schemes. Indeed, the largest dip in the minimum temperature record for Deniliquin occurs just after the Snowy hydroelectric scheme came online. This is masked by the remodelling, which drops down/cools all the minimum temperature observations at Deniliquin before 1971 by 1.5 °C.
In my correspondence with the Bureau about these adjustments, it was explained that irrigation is not natural and therefore there is a need to correct the record by remodelling the series from these irrigation areas until they show warming consistent with theory. But global warming itself is not natural, if it is essentially driven by human influence, which is a key assumption of current policy. Indeed, there should be something right up front in the latest assessment of climate change by the IPCC (AR6) explaining that the individual temperature series have been remodelled before inclusion in the global datasets to ensure a significant human influence on climate in accordance with IPCC policy. These remodelled temperature series are then incorporated into CMIP6, which is so complex it can only be run on a supercomputer, and which generates many scenarios for a diversity of climate parameters, from sea level to rainfall.
In October 2018, I visited the Australian National University (ANU) to watch CMIP6 at work on the largest supercomputer in the Southern Hemisphere. It was consuming obscene amounts of electricity to run the simulations for this latest IPCC report, and it is also used to generate medium to long range rainfall forecasts for the BOM. The rainfall forecasts from these simulation models even just three months in advance are, however, notoriously unreliable. Yet we are expected to believe rainfall forecasts based on simulations that make projections 100 years in advance, as detailed in AR6.
There are alternative tools for generating temperature and rainfall forecasts. In a series of research papers and book chapters with John Abbot, I have documented how artificial neural networks (ANNs) can be used to mine historical datasets for patterns, and from these generate more accurate medium and long-range rainfall and temperature forecasts. Our forecasts don’t suggest an impending climate catastrophe, but rather that climate change is cyclical, not linear. Indeed, temperatures change on a daily cycle as the Earth spins on its axis, temperatures change with the seasons because of the tilt of the Earth relative to its orbit around the Sun, and then there are ice ages because of changes in the orbital path of the Earth around the Sun, and so on.
Taking this longer perspective, considering the sun rather than carbon dioxide as a driver of climate change, and inputting real observations rather than remodelled/adjusted temperature values, we find recurrent cycles greater than 1.07 degrees Celsius during the last 2000 years. Our research paper entitled ‘The application of machine learning for evaluating anthropogenic versus natural climate change’, published in GeoResJ in 2017 (volume 14, pages 36-46) shows a series of temperature reconstructions from six geographically distinct regions and gives some graphic illustration of the rate and magnitude of the temperature fluctuations.
ANNs are at the cutting edge of AI technology, with new network configurations and learning algorithms continually being developed. In 2012, when John Abbot and I began using ANNs for rainfall forecasting, we chose a time delay neural network (TDNN), which was considered state-of-the-art at the time. The TDNN used a network of perceptrons whose connection weights were trained with backpropagation. More recently we have been using General Regression Neural Networks (GRNN), which have no backpropagation component.
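For readers unfamiliar with GRNNs: in Specht’s original 1991 formulation, a GRNN is essentially a kernel-weighted average of stored training examples, which is why no backpropagation is needed. Here is a minimal sketch with toy data, a noisy-free sine curve standing in for a cyclical signal; it is not any of our published rainfall series, and the smoothing parameter sigma is chosen arbitrarily.

```python
import math

def grnn_predict(x_train, y_train, x_query, sigma=1.0):
    """General Regression Neural Network prediction for one query point.
    A Gaussian kernel weights every stored training example; the output
    is the weighted average of the training targets. 'Training' is just
    storing the examples -- there is no backpropagation."""
    weights = [math.exp(-((x - x_query) ** 2) / (2 * sigma ** 2))
               for x in x_train]
    return sum(w * y for w, y in zip(weights, y_train)) / sum(weights)

# Toy example: samples of a cyclical signal (illustrative only).
x_train = [i * 0.5 for i in range(20)]
y_train = [math.sin(x) for x in x_train]

print(grnn_predict(x_train, y_train, x_query=2.0, sigma=0.3))
```

The single parameter sigma controls how local the averaging is, which makes the method far simpler to tune than an iteratively trained network.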
A reasonable test of the value of any scientific theory is its utility – its ability to solve some particular problem. There has been an extraordinary investment in climate change over the last three decades, yet it is unclear whether there has been any significant improvement in the skill of weather and climate forecasting. Mainstream climate scientists and meteorological agencies continue to rely on simulation modelling for their forecasts, such as the CMIP6 models used in this latest IPCC report. There could be a better way, and we may not have a climate catastrophe.
Further Reading/Other Information
The practical application of ANNs for forecasting temperatures and rainfall is detailed in a series of papers by John Abbot and me that are listed here: https://climatelab.com.au/publications/
Chapter 16 of the book ‘Climate Change: The Facts 2020’ provides more detail on how the Australian Bureau of Meteorology takes a revisionist approach to Darwin’s history.
https://climatechangethefacts.org.au/wp-content/uploads/2021/08/MAROHASY-2020-Rewriting-Australias-Temperature-History-CCTF2020_16.pdf
There is an interactive table based on the maximum and minimum values as originally recorded for each of the 112 Australian weather stations used to calculate the official temperature values as listed in ACORN-SAT version 1 and version 2 at my website, click here:
https://jennifermarohasy.com/acorn-sat-v1-vs-v2/
The feature image, at the top of this blog post, shows Jennifer Marohasy in front of the supercomputer at the Australian National University in October 2018, which was running simulations for the latest IPCC report.
Its value is much higher than that, but I detect jealousy coming from you. How much traffic do you get on your website?
The bigger the computer, the more garbage it can hold.
A political report based on simulations is an admission of a lack of evidence.
Adjusting the past to remove human influence in the present is illogical.
There was no human influence in the past temperatures to remove.
Only recent temperatures have any human influence to be removed.
This entire process of adjusting the past to remove a human influence that was not present in the past is perhaps the weakest link in climate science.
You cannot correct an error that doesn’t exist. Instead all you can do is introduce error.
This problem of adjusting the past really needs to be exposed in paper after paper because it is mathematical nonsense.
The problem in modelling the future is that the errors grow exponentially with each iteration.
Even if your forecast error is 0.0001%, it is like compound interest. The interest grows and grows until it exceeds the principal.
This problem is well known in the field of Applied Mathematics. Even simple linear programming models used to introduce matrices in high school quickly develop loss of precision errors when run on computers.
It is inconceivable that climate models would not suffer the same fate. For example, total energy.
Climate models will not conserve energy. It is impossible due to LOP errors. Instead they will have to smear this error back into the model each iteration. Already your train has left the tracks.
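The compound-interest analogy above can be made concrete with a toy calculation. The 0.0001% starting error comes from the comment; the 5% per-iteration amplification is an arbitrary illustrative figure, not a property of any particular climate model.

```python
# A tiny initial error compounded multiplicatively over many iterations,
# as in the compound-interest analogy (illustrative only).
error = 1e-6          # 0.0001 % expressed as a fraction
growth = 1.05         # hypothetical 5 % amplification per iteration

for step in range(1, 501):
    error *= growth
    if error > 1.0:
        print(f"error exceeds 100 % after {step} iterations")
        break
```

Whether a real model amplifies or damps its errors each step is exactly the point in dispute; this sketch only shows how fast multiplicative growth overwhelms a small starting value when amplification does occur.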
Pat Frank has written extensively on the tendency of iterative methods to blow up due to error propagation.
The thing is CMIP6 predicts a warming rate of +0.07 C/decade from 1880-2020. The observation over this period is +0.07 C/decade. According to Pat Frank the probability that CMIP6 would essentially nail the warming rate over a 140 year period is vanishingly small yet it still happened. Don’t hear what I didn’t say. I didn’t say CMIP6 is perfect or that all of its predictions end up as good as the global mean warming trend. I’m just saying that Pat Frank’s hypothesis is that CMIP6 shouldn’t even have gotten in the ballpark on any of its predictions and yet those predictions somehow turned out to be reasonable and generally better than many other models and certainly far better than contrarian models that can’t even get the direction of the temperature change correct.
You genuinely believe we can know global temperatures from the 1880s to 2 decimal places? That’s assuming global temperature is even meaningful.
I didn’t say that.
In that case why are you even talking about global temperature?
I’m responding to your post regarding Pat Frank’s analysis of the uncertainty of the CMIP suite of models. His post on WUWT showed that the uncertainty on the global mean temperature is ±11 C over a 90yr period.
He is showing that the uncertainty range grows to unrealistic values, even if the ‘best guess’ plods along with what seems to be reasonable values.
Why is it that alarmists have such difficulty understanding the concepts of uncertainty? Maybe that is why they are alarmists.
Are you saying that Frank’s analysis of CMIP6 uncertainty on the global mean temperature after a 90yr prediction is not ±11 C?
No, I’m not saying that at all. What I’m saying is that you don’t understand what Frank is saying, or the implications for an uncertainty that grows with time.
My understanding of what Frank says is that the uncertainty on CMIP6’s prediction of the global mean temperature grows to ±11 C after 90 years.
How many times are you going to show you know nothing about metrology? Dr. Frank’s analysis does not provide “error bars” against which you can measure accuracy. Dr. Frank assessed the uncertainty of the current models. Uncertainty intervals are descriptive calculations of where you cannot know what the correct answer actually is. Worse, you can never know what the real answer is when your calculation is within the uncertainty interval.
I’m sorry you do not have a sufficient education in science and specifically metrology to understand uncertainty. As I have already said to you, some metrology courses and perhaps some training in professions where uncertainty is paramount would provide you with the knowledge to learn and accept what uncertainty is.
And what were the assessed uncertainties on the annual global mean temperature predictions from CMIP5, CMIP6, and GISS Model II?
I suggest you read Dr. Franks posts here and on the internet about the uncertainty of the projections made by models.
Suffice it to say that none of the projections lie outside the uncertainty intervals for 2100. That means any calculation that ends up in the interval has no proof that it will occur. Narrowing the uncertainty interval must be done in order to claim any certain foreknowledge of what may happen.
Like it or not, uncertainty means don’t know and you can never know what the real truth is.
I’ll answer the question. He shows an uncertainty for a 90yr prediction of the global mean temperature anomaly from CMIP6 of ±11 C (1σ). That is the equivalent to a trend uncertainty of ±1.2 C/decade (1σ). How do you think we can test to see if this is plausible?
You betray your lack of engineering and metrology every time you respond. Why? You cannot statistically reduce uncertainty. The GUM allows you to show uncertainty as 1 or 2 sigma when dealing with a single measurand. For iterative measurements, uncertainty is additive. The accepted method is to treat iterative measurements as orthogonal and combine them by root-sum-square (RSS, not RMS).
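As a toy illustration of the difference between the two combination rules (the 0.1 °C per-step uncertainty and 100 steps are hypothetical numbers, not from Frank’s analysis): the RSS total grows with the square root of the number of steps, while a plain linear sum, the fully correlated worst case, grows in direct proportion.

```python
import math

# Hypothetical per-iteration uncertainty combined over many steps.
u_step = 0.1     # per-step uncertainty, degrees C (made-up value)
n_steps = 100

# Fully correlated worst case: uncertainties add linearly.
linear_total = n_steps * u_step

# Orthogonal (independent) case: root-sum-square combination.
rss_total = math.sqrt(sum(u_step ** 2 for _ in range(n_steps)))

print(f"linear sum: {linear_total:.1f} C, RSS: {rss_total:.1f} C")
```

Which rule applies depends on whether successive errors are independent or correlated, which is precisely what the commenters are arguing about.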
Maybe you can check my work. The probability of an error > 0.01 C/decade, based on an uncertainty of ±1.2 C/decade (1σ), comes out to > 99%. Is that what you get?
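The commenter’s arithmetic can be checked directly, assuming the error is modelled as zero-mean normal with σ = 1.2 C/decade (that distributional assumption is the commenter’s, not established fact):

```python
import math

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

sigma = 1.2        # claimed trend uncertainty, C/decade (1 sigma)
threshold = 0.01   # error threshold, C/decade

# Two-sided probability that a zero-mean normal error exceeds the threshold.
p_exceed = 2.0 * (1.0 - normal_cdf(threshold / sigma))
print(f"P(|error| > {threshold} C/decade) = {p_exceed:.1%}")
```

With a threshold so small relative to the claimed sigma, almost all of the probability mass lies outside it, so the > 99% figure is arithmetically consistent with the stated assumptions.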
Aren’t the models tuned to the history?
You are overlooking the fact that the mean is the ‘best guess’, but that doesn’t guarantee it is correct. There is a 95% chance that the true value lies anywhere within ±2 sigma of the mean.
That’s my point. According to Pat Frank there is less than a 1% chance that CMIP6 would predict a warming trend of +0.07 C/decade over a 140 yr period. It was either a lucky shot by CMIP6, or Frank’s analysis of the uncertainty was faulty.
Uhh, Ingrown, the reason CO2 emissions dropped in 2020 was the Covid lockdown. Prior to that, CO2 emissions were steadily climbing. CO2 emissions are already on their way back up and will most likely exceed 2019 emissions soon. Let’s review your assertion in a few months and determine if you were lying (once again).
From the article: “Specifically, the claim is that we humans have caused 1.06 °C of the claimed 1.07 °C rise in temperatures since 1850, which is not very much.”
The highest temperature the Earth has reached in the 21st century occurred during the year 2016. That is the point where the 1.07C was reached.
We are not currently sitting at 1.07C above the baseline as this article and many other articles imply. We are actually sitting at about 0.5C above the baseline since temperatures have cooled considerably since 2016.
But nobody seems to take notice. They talk like we are at the hottest point in human history and are going higher. But the truth is we are currently going lower.
We need a little better accuracy when reporting our current temperature situation. We are no longer in 2016, and the temperatures are no longer 1.07C above the baseline.
Here’s the satellite record. It shows the real temperature profile of the Earth:
The 1.07C figure in AR6 is the 2010-2019 average. Per UAH the 2010-2019 anomaly is +0.12C. Over the most recent 10 years the figure is +0.15C for a change of +0.03C. That means we are currently sitting at 1.10C using the AR6 baseline and adding in the warming UAH recorded.
From the article: “AR6, which runs to over 4,000-pages, claims to have accurately quantified everything including confidence ranges for the ‘observation’ of 1.07 °C. Yet I know from scrutinising the datasets used by the IPCC, that the single temperature series inputted for individual locations incorporate ‘adjustments’ by national meteorological services that are rather large. To be clear, even before the maximum and minimum temperature values from individual weather stations are incorporated into HadCRUT5 they are adjusted.”
Yes, this is the whole problem.
The Data Manipulators are adjusting the temperatures in order to create a false, scary scenario where the Earth is getting hotter and hotter and is currently at the hottest temperature in human history. NOAA just made that claim yesterday, saying last July was the hottest July in human history. It felt rather pleasant to me, considering other Julys I have experienced.
The truth is, going by “unmodified” regional surface temperature charts, the Earth is NOT at the hottest point in human history. There are several periods in recent recorded history that were just as warm as today, in the 1880s and the 1930s. And there were other periods in history that were warmer than today, such as the Roman warm period. Claiming today is the hottest in human history is a total distortion of the temperature record. The Data Manipulators are lying through their teeth in an effort to sell their Human-caused Climate Change scam.
From the article: “According to the Australian Bureau of Meteorology (BOM), which is one of the national meteorological services providing data for HadCRUT, the official remodelled temperatures are an improvement on the actual measurements.”
What a joke!
A very costly joke.
From the article: “Then, just in time for inclusion in this new IPCC report released on Tuesday, all the daily values from each of the 112 weather stations were remodelled and the rate of warming increased to 1.23 °C per century in ACORN-SAT version 2, published in 2018. This broadly accords with the change between the 2014 IPCC report (Assessment Report No. 5), which gave warming of 0.85 °C since 1850, and this new report’s figure of 1.07 °C, an increase of about 26%.”
This is not science. It is Fraud posing as science.
1) More should be done to validate these formulas adjusting/infilling data.
2) Ensure meticulous records are kept of changes & influences to recorded data.
3) Ensure overlap of recording data when sites move.
4) Replicate the old measuring methods with newer equipment & data to compare, e.g. use new data to replicate alternative time-of-day measurements, compare this to data collected using current standards, and compare with the models applied to the old data.
5) Quantify the size of the adjustments & error range/behaviour.