Are Australians heading for the cooker?
Guest essay by Bill Johnston
Elections for Australia’s National (Federal) Parliament are looming, and the carbon tax is a battleground issue.
The incumbent Labor Party has proposed to transition its existing toxic carbon tax to an emissions trading scheme, linked to Europe’s, a year earlier than planned. Supposedly this would save its ‘working families’ about AUD $300/year, for one year. The Liberal opposition party has promised to scrap the tax and ETS altogether and go for “direct action”. The fringe Greens party is wailing from the political sidelines because it doesn’t like anything.
IF climate change is natural, and IF the warming is more hot air than substance, neither plan is likely to achieve anything except increase the cost of living.
To help clarify things, this essay presents a straightforward analysis of Australia’s overall average temperature record, in sufficient detail that it could readily be repeated.
Methods and results.
Average ACORN-SAT temperature anomaly data (1961-1990 climatology) were downloaded month-by-month from Australia’s Bureau of Meteorology (http://www.bom.gov.au/) “Climate Change Tracker” page.
To ensure data were not skewed by the choice of period used for anomaly calculations, they were sorted into a year-by-month array in Microsoft Excel, and a lookup table of the 1961-1990 monthly climatological averages was used to convert anomalies into degrees Celsius.
To prevent December being recognised by Excel as the first month of the following year (which, due to “time slippage”, it otherwise is), time was calculated as a continuous variable of month-centred deciyears using the formula: Deciyear = (month - 0.5)/12 + year.
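In spreadsheet terms this is a single cell formula; the same month-centred conversion can be sketched in Python (a minimal illustration, not part of the original Excel workflow):

```python
def deciyear(year, month):
    """Month-centred decimal year: months 1..12 map to the middle of
    each month, so December stays inside its own year."""
    return year + (month - 0.5) / 12.0

# January 1910 -> ~1910.042; December 1957 -> ~1957.958 (not 1958)
jan_1910 = deciyear(1910, 1)
dec_1957 = deciyear(1957, 12)
```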
In a separate table, Excel functions were used to calculate overall monthly average temperatures from the back-transformed data. Using those as lookup values, temperature data were seasonally adjusted using the standard approach of deducting overall monthly averages from respective monthly values. A new zero-centred anomaly dataset was thus calculated. It was those data that were analysed here.
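The seasonal-adjustment step, deducting each month’s overall average, can be sketched with NumPy on synthetic data; the array shape and values here are hypothetical stand-ins for the Excel year-by-month table:

```python
import numpy as np

# Hypothetical monthly series: rows = years, columns = Jan..Dec
rng = np.random.default_rng(0)
years, months = 20, 12
seasonal = 10 * np.sin(2 * np.pi * (np.arange(months) + 0.5) / 12)
data = seasonal + rng.normal(0, 1, size=(years, months))

# Overall monthly averages (the "lookup table" of column means)
monthly_means = data.mean(axis=0)

# Zero-centred anomalies: deduct each month's overall average
anomalies = data - monthly_means

# Each calendar month now averages exactly zero, removing the
# summer-winter oscillation without sacrificing any data points
assert np.allclose(anomalies.mean(axis=0), 0)
```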
Data were examined graphically in Excel (using Excel fitted trend lines); and statistically using the standalone package PAST (v. 2.17b) from the University of Oslo (http://folk.uio.no/ohammer/past).
Homogeneity is an issue for time series whose data-stream may be a mixture of trend, abrupt changes due to external “shocks”, and cyclic phenomena. Shocks or shifts may change the data’s properties, causing spurious trend inferences if they are ignored.
Two Excel addins were used to investigate homogeneity. They were:
(i) Change Point Analyser (CPA) from www.variation.com (time-limited freeware; nonparametric), which uses bootstrapping to detect changepoints in the mean and standard deviation of an historical data stream. CPA flags data that are not serially independent, and sub-sampling and grouping options are available to handle the problem. Various user options affect the sensitivity of the analysis.
(ii) Sequential t-test Analysis of Regime Shifts (STARS) v. 3.2 (freeware from http://www.beringclimate.noaa.gov/regimes/; parametric), which conducts sequential t-tests along a time series to detect changes in the mean level. It can also test for changes in the variance. STARS presents various options for handling autocorrelation, as well as settings that determine sensitivity.
To minimise detection of statistically non-significant shifts (false detections) and maximise detection of significant ones, a rigorous iterative approach involving combinations of test parameters was used for both procedures.
Target confidence levels were P > 95% (CPA) and P < 0.05 (STARS), meaning less than a 5% chance of detected changes being spurious. (Both CPA and STARS report actual P levels for detected shifts, which allows a stringent interpretation of results.)
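The idea behind t-test-based step detection, comparing the data either side of a candidate change-point, can be illustrated with a toy locator on synthetic data. This is a deliberately simplified sketch, not the STARS algorithm itself, which works sequentially and offers autocorrelation handling:

```python
import numpy as np
from scipy.stats import ttest_ind

def best_changepoint(series, window=120):
    """Slide a split point along the series and keep the split where a
    two-sample t-test between the windows either side of it is most
    significant. Toy illustration only."""
    best_i, best_p = None, 1.0
    for i in range(window, len(series) - window + 1):
        _, p = ttest_ind(series[i - window:i], series[i:i + window])
        if p < best_p:
            best_i, best_p = i, p
    return best_i, best_p

# Synthetic monthly record: a 0.4-unit upward step at index 300 in noise
rng = np.random.default_rng(1)
y = rng.normal(0.0, 0.2, 600)
y[300:] += 0.4
step_at, p_level = best_changepoint(y)  # step_at lands near index 300
```

The 120-month window mirrors the cutoff length quoted later for the STARS runs.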
In addition to basic statistical tests such as for normality and autocorrelation, PAST conducts Lomb periodogram spectral analysis on detrended data and it includes routines for fitting up to 8 optimised sinusoidal regressions (a form of “blind” spectral analysis).
Its linear regression module handles non-normal and potentially autocorrelated data using bootstrapped confidence intervals, with the overall significance of regression indicated by P(unc.), the probability that the data are uncorrelated. PAST will also fit an optimised cubic smoothing spline to noisy data.
PAST conveniently operates on a “cut-and-paste” basis with Excel, allowing them to be used in tandem.
(More information, and references are available at the respective web-sites.)
This study adopted a decomposition approach to the trend problem. Decomposition involves breaking the total signal (trend + noise) into its components, which are deducted before the trend is investigated. The approach is simple, objective, transparent and repeatable.
To leave the trend intact, it is largely the “noise” part of the data (i.e. the detrended data) that are investigated. (Detrended data are the residuals from fitting a least-squares regression relationship to the data and deducting the fit; it is a transformation option in PAST.)
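Detrending as described, fitting a least-squares line and deducting it, looks like this in NumPy (synthetic data; PAST offers this as a built-in transformation):

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(200, dtype=float)
y = 0.01 * t + rng.normal(0, 0.5, t.size)  # trend + noise

# Least-squares fit, then deduct it: the residuals are the detrended data
slope, intercept = np.polyfit(t, y, 1)
detrended = y - (slope * t + intercept)

# The residuals carry no linear trend and are zero-mean by construction
assert abs(np.polyfit(t, detrended, 1)[0]) < 1e-8
```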
Already, without affecting the overall naïve trend (0.0089 °C/yr) or sacrificing data (which happens with annual averaging), deducting monthly averages from values for the respective months removed the annual summer-winter temperature oscillation, which accounts for a considerable portion of the total noise (Figure 1).
Figure 1. Average temperature (left) and monthly anomaly data (right) plotted at the same scale. For the anomaly data, the annual ‘cyclic’ signal has been removed, compressing the apparent data range. The underlying trend remains unaffected (indicated as 0.0089 °C/year, or 0.089 °C/decade).
Visualisation of the anomaly data using LOWESS regression in PAST (smoothing parameter 0.3), and an expanded scatter-plot of the anomaly data in Excel (not shown here), indicated that a shift may have occurred in the series in the 1950s. This was investigated further by constructing a Cusum (or residual mass) plot in Excel. (LOWESS tends to ‘smooth’ its way through data steps, which hides their abrupt nature.)
Cusum values are calculated as the cumulative sum of zero-centred monthly anomaly values.
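The Cusum construction is a one-liner; a sketch on a synthetic low-then-high series shows the V-shaped turning point landing at the step:

```python
import numpy as np

def cusum(anomalies):
    """Cumulative sum of zero-centred values: deduct the overall mean,
    then accumulate."""
    a = np.asarray(anomalies, dtype=float)
    return np.cumsum(a - a.mean())

# A series that runs low then high produces a V-shaped Cusum whose
# turning point (minimum) falls at the step, index 49 here
series = np.r_[np.full(50, -0.3), np.full(50, 0.3)]
c = cusum(series)
```

Because the series is zero-centred first, the Cusum always returns to (near) zero at the end; it is the shape in between that indicates runs above and below the mean.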
The plot (Figure 2) suggested the overall dataset contained several discontinuities, with turning points (inflections) in 1957 and 1979. This indicated the data might not be homogeneous.
(Like a scatter plot, which indicates the linear characteristics of data; a Cusum chart indicates the shape of the data relative to the long-term mean.)
(Consecutive numbers, including completely random ones, show “runs” above and below the data average. Thus a Cusum chart is an indicator, not a statistical test. The test, which comes later, is the probability that such runs represent a process not due to chance (CPA; P > 95%), or that steps in the time-series mean are statistically significant (STARS; P < 0.05).)
Figure 2. The Cusum tracks the behaviour of the data relative to its overall average. In this case, average temperatures tracked below the long-term average until 1957, and above it after about 1979. Thus the data may consist of three distinct segments, each of which could display different characteristics.
It was important to check that step-changes in the data were not confounded with long-period oscillations. This was investigated using PAST.
Spectral analysis of the detrended data found statistically significant (P < 0.05) peaks at frequencies of 0.02297 and 0.00987 cycles/month, corresponding to periods of 3.6 and 8.44 years (Figure 3). Optimally fitted sinusoids detected similar signals (periodicities of 3.7 and 8.5 years). Although the amplitudes were small (0.142 and 0.137 °C respectively) relative to the range of the detrended data, their effects should still be removed.
Figure 3. PAST graphic of the Lomb periodogram of detrended data, indicating spectral peaks at 0.00987 cycles/month (P < 0.05) and 0.02297 cycles/month (P < 0.01), corresponding to periods of 8.44 and 3.6 years respectively. (The red lines represent the 95% and 99% confidence levels.)
(The frequencies correspond roughly to the 8.85-year cycle of lunar perigee and a related quasi-cycle of 4.4 years, which have long been known to affect sea levels (see Haig et al. (2011), doi:10.1029/2010JC006645, for a general discussion). However, there seems to be little written about the possible impact of short-term cycles on terrestrial temperature.)
To remove the underlying cycles, sinusoids fitted to the detrended data were pasted back into Excel and deducted from the monthly anomaly data. Homogeneity of those residuals was investigated using CPA and STARS.
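Fitting and deducting a sinusoid can be sketched with SciPy’s curve_fit. The period and amplitude below are synthetic stand-ins loosely echoing the 8.5-year figure, not the actual ACORN-SAT fit, and curve_fit is my substitute for PAST’s optimised sinusoidal regression:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
t = np.arange(0, 100, 1 / 12)  # 100 years of monthly deciyears

# Synthetic detrended anomalies: one hidden ~8.5-year cycle plus noise
y = 0.14 * np.sin(2 * np.pi * t / 8.5 + 1.0) + rng.normal(0, 0.3, t.size)

def sinusoid(t, amp, period, phase):
    return amp * np.sin(2 * np.pi * t / period + phase)

# Starting guesses must be in the right neighbourhood for convergence
params, _ = curve_fit(sinusoid, t, y, p0=[0.1, 8.4, 0.5])

# Deduct the fitted cycle, leaving "de-cycled" residuals
decycled = y - sinusoid(t, *params)
```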
CPA and STARS both detected highly significant step-changes in the de-cycled data around April 1957, January 1979 and April 2002. STARS detected an additional change in August 2010. (Note that CPA is a less powerful test than STARS, and it is less sensitive to change-points near the end of a series. A major advantage of STARS over most other step-detection techniques is that it is effective for monitoring changes over a full record.)
(Iteratively determined CPA test parameters were: target significance level, 95%; confidence level for inclusion, 99%; CI estimation at 99% (10,000 bootstraps, with replacement; Cusum estimates). For STARS, parameters were: probability, 0.1; cutoff length, 120 months (10 years); Huber parameter, 5 (no outlier adjustment); IPN4 to handle autocorrelation.)
Figure 4 shows the data segments detected by CPA superimposed on its Cusum chart (CPA graphic).
Results of the STARS analysis are shown in Figure 5.
The step-changes identified here need to be interpreted in the light of changes in the broader climate.
STARS analysis of C. Folland’s (Hadley Centre) unfiltered Interdecadal Pacific Oscillation (IPO) index (1871-2007) found significant shifts in 1907 (from a previous strongly positive phase) to a moderately positive phase that lasted until 1947. The index shifted to a negative phase in 1948; a strongly positive phase in 1975; then back to a negative phase in 1999. These sudden and abrupt phase-changes have been linked to the severity and intensity of ENSO cycling by a number of authors.
Remembering that averaging smooths data considerably, and that the temperature data represent continental-scale averages, it was not unexpected that the temperature response to IPO phase-changes lagged by several years. The consistency of the response strongly implicated the IPO (or PDO) as the likely trigger for the temperature step-changes detected here (1957, 1979, 2002; the 2010 shift was outside the IPO data range).
Clearly, a linear trend model is inappropriate for stepped data, as trend analysis could be biased by the choice of start and end dates. Also, confounding steps and trends will certainly lead to spurious results and mis-attribution of cause and effect.
Figure 5. STARS analysis of de-cycled anomaly data, with the highly significant step-changes superimposed. P levels were: April 1957, 4.8E-05; January 1979, 2.01E-05; April 2002, 4.1E-05; and August 2010, 0.0089. (E denotes scientific notation: 4.8E-05 = 4.8 × 10⁻⁵.) It is extremely unlikely that the step-changes were due to chance.
For example, the overall trend in the data (January 1910 to June 2013) is 0.089 °C/decade. The trend from 1950, a popular climate-change starting point, to June 2013 is 0.13 °C/decade. The trend from the start (1910) to the end of the 2000-2010 “hot decade” is 0.1 °C/decade, or, conveniently, about 1 degree for the century. All very quotable statistics; but all quite misleading.
Looking at the individual step-changes, the difference between the period average at the start of the record (-0.275 °C) and at the end (0.132 °C) is 0.41 °C; divided by the number of decades (10.64), the rate is 0.039 °C/decade. But that is not a real rate either; it is simply the difference divided by the elapsed time.
So is there a trend?
Residuals from deducting the step-change means from the anomaly data were pasted into PAST and detrended. They were modelled using an optimised smoothing spline, which tracked much of the noise. After deducting the splined signal from the step-free residuals (in Excel) data were pasted back into PAST and analysed for trend.
No significant trend was detected.
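The spline-then-trend step above can be sketched with SciPy; UnivariateSpline stands in for PAST’s optimised cubic smoothing spline, and the data are synthetic, step-free residuals rather than the actual series:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.stats import linregress

rng = np.random.default_rng(4)
t = np.arange(500, dtype=float)
y = rng.normal(0, 0.25, t.size)  # stand-in for the step-free residuals

# Cubic smoothing spline; the smoothing factor s uses the common
# m * sigma^2 heuristic for noise of known standard deviation
spline = UnivariateSpline(t, y, k=3, s=t.size * 0.25 ** 2)
leftover = y - spline(t)

# Test the de-splined residuals for a linear trend
fit = linregress(t, leftover)  # inspect fit.slope and fit.pvalue
```

A non-significant fit.pvalue (> 0.05) would correspond to the “no significant trend” finding reported for the real data.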
To further underscore the trend problem, the dataset was segmented at each of the changepoints indicated in Figure 5, resulting in five trend clusters, four of which were valid (January 1910 to March 1957; April 1957 to December 1978; January 1979 to March 2002; and April 2002 to July 2010). (The fifth cluster, from August 2010 to January 2013, was not defined by an end-point, so it was too short to test adequately.)
After reducing noise by deducting a cubic smoothing spline fitted to each cluster’s detrended data, only one cluster (April 1957 to December 1978) showed a statistically significant (P(unc.) < 0.01) trend. In that case, the trend was negative (-0.16 °C/decade).
So were the step-changes real or an artefact?
It has been established in Australia’s climate literature that when the IPO (or PDO) is in its positive phase, dry El Niño conditions predominate over much of the continent, and El Niño events are more frequent and severe. When it is in its negative phase, La Niña dominates, bringing generally moist conditions, especially to southern Australia.
This is not to say that droughts, or floods, don’t occur during opposite phases; it is a general statement supported by published studies.
The timing of IPO/PDO shifts is evidenced by other meteorological events. For instance, a time-plot of Australia’s annual rainfall shows extreme values in the early 1950s, 1974, 2000 and 2010/11 (1974 being the most pronounced). The rapid temperature decline after the 2000-2010 “hot decade” shows how quickly the climate changes in response to a major shift.
The impact of shifts in the broader climate system on floods, cyclones, droughts and heatwaves is also corroborated by day-to-day reports in historic newspapers and other documents, including Bureau of Meteorology bulletins, special statements and journal papers.
Thus multiple lines of evidence can be drawn on to support the view that around the time of IPO/PDO shifts, the climate is markedly perturbed. It then takes a year or two for things to settle down, often to a new level. Clearly, step-changes are real and consequential in human terms.
Regardless of their origin, events that impact on data are not trends. They create inhomogeneities in time series, which invalidate trend estimation using least-squares methods. The popular choice of 1950 as a climate-change “starting point” is not a valid one, because the data from 1950 to 1957 come from a pool of lower-than-average values that exert leverage on the trend-line. As indicated earlier, data prior to 1957 were non-trending.
It is important that time-related data are checked for inhomogeneities and other non-trend signals, and that the effects of these are removed. Otherwise the total signal is a confounded one.
- Australia’s averaged temperature data were impacted by climate shifts in the 1950s, the 1970s, 2002 and 2010. After deducting the impact of those natural events, no residual warming trend was evident that could be related to atmospheric CO2 levels.
- Australia’s ‘hot decade’ (2000-2010) was used relentlessly to market global warming by Australia’s Climate Commission, the Bureau of Meteorology, green groups and politicians, in order to stir a sense of catastrophe and climate-fear. However, the fear was unfounded; the drought and associated high temperatures were a temporary aberration caused by El Niño cycles, not global warming.
- The 2010 down-step exposed much of that decade’s climate-grooming as false and deceptive. Deceit continues under the guise of “climate change”. There is no evidence at this time that climate change and CO2 concentration in the atmosphere are related.
- The outcome of Australia’s looming election will make no difference to the climate, or to the likelihood or impact of future climate changes. Ditching the carbon tax together with ‘direct action’ would save the Nation’s taxpayers many billions of AUD, which would be better spent on Nation-building and improving access to services.
Dr. Bill Johnston is a retired natural resources scientist with an interest in climate change issues.