Australia's average temperature

Are Australians heading for the cooker?

Guest essay by Bill Johnston

Elections for Australia’s National (Federal) parliament are looming and carbon tax is a battleground issue.

The incumbent Labour Party have proposed to transition its existing toxic carbon tax to an emissions trading scheme, linked to that in Europe, a year earlier than planned. Supposedly this would save their ‘working families’ about $300 AUD/year for one year. The Liberal opposition party has promised to scrap the tax and ETS altogether and go for “direct action”. The fringe Greens party are wailing from the political sidelines because they don’t like anything.

IF climate change is natural; and, IF the warming is more hot air than substance, neither plan is likely to achieve anything except increase the cost of living.

To help clarify things, this essay presents a straightforward analysis of Australia’s overall average temperature record, in sufficient detail that it could readily be repeated.

Methods and results.

Average ACORN-SAT temperature anomaly data (1961-1990 climatology) were downloaded month-by-month from Australia’s Bureau of Meteorology (http://www.bom.gov.au/) “Climate Change Tracker” page.

To ensure data were not skewed by choice of the period used for anomaly calculations, they were sorted into a year-by-month array in Microsoft Excel and a lookup table of the 1961-1990 monthly climatological averages was used to convert anomalies into degrees Celsius.

To prevent Excel recognising December as the first month of the following year (which, due to “time slippage”, it otherwise does), time was calculated as a continuous variable of month-centred deciyears using the formula: Deciyear = [(digimonth - 0.5)/12 + year]
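A minimal sketch of that calculation in code (Python here; the function name `deciyear` is ours, following the essay’s formula):

```python
def deciyear(year, month):
    """Month-centred decimal year, per the essay's formula:
    Deciyear = (digimonth - 0.5)/12 + year, with month ('digimonth') = 1..12.
    Centring on mid-month keeps December inside its own calendar year."""
    return year + (month - 0.5) / 12.0
```

January 2000 falls at about 2000.042 and December 2000 at about 2000.958, so no monthly value spills into the following year.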

In a separate table, Excel functions were used to calculate overall monthly average temperatures from the back-transformed data. Using those as lookup values, temperature data were seasonally adjusted using the standard approach of deducting overall monthly averages from respective monthly values. A new zero-centred anomaly dataset was thus calculated. It was those data that were analysed here.
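The same seasonal adjustment can be sketched outside Excel (Python/numpy; the function name and array layout are illustrative):

```python
import numpy as np

def seasonally_adjust(temps):
    """temps: 1-D array of monthly temperatures (deg C) starting in January,
    length a multiple of 12. Deducts each calendar month's long-term
    average from that month's values, returning zero-centred anomalies."""
    t = np.asarray(temps, dtype=float).reshape(-1, 12)  # rows = years
    monthly_means = t.mean(axis=0)                      # 12 overall averages
    return (t - monthly_means).ravel()
```

A purely seasonal series adjusts to exactly zero, which is the sense in which the annual summer-winter oscillation is removed.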

Data were examined graphically in Excel (using Excel fitted trend lines); and statistically using the standalone package PAST (v. 2.17b) from the University of Oslo (http://folk.uio.no/ohammer/past).

Homogeneity is an issue for time series whose data-stream may be a mixture of trend, abrupt changes due to external “shocks”, and cyclic phenomena. Shocks or shifts may change the data’s properties, causing spurious trend inferences if they are ignored.

Two Excel add-ins were used to investigate homogeneity. They were:

(i) Change Point Analyser (CPA) from www.variation.com (time-limited freeware) (nonparametric), which uses bootstrapping to detect changepoints in the mean and standard deviation of an historical data stream. CPA flags data that are not serially independent, and sub-sampling and grouping options are available to handle the problem. Various user options affect the sensitivity of CPA analysis.

(ii) Sequential t-test Analysis of Regime Shifts (STARS) v. 3.2 (freeware from: http://www.beringclimate.noaa.gov/regimes/) (parametric). It conducts sequential t-tests along a time series to detect changes in the mean level, and can also be used to test for changes in the variance. STARS presents various options for handling autocorrelation, as well as settings which determine sensitivity.

To minimise detection of statistically non-significant shifts (false detections), and maximise detection of significant ones, a rigorous iterative approach involving combinations of test parameters was used for both procedures.

Target confidence levels were: P > 95% (CPA) and P < 0.05 (STARS), meaning less than a 5% chance of detected changes being spurious. (Both CPA and STARS provide actual P levels for detected shifts, which allows a stringent interpretation of results.)

In addition to basic statistical tests such as for normality and autocorrelation, PAST conducts Lomb periodogram spectral analysis on detrended data and it includes routines for fitting up to 8 optimised sinusoidal regressions (a form of “blind” spectral analysis).

Its linear regression module handles non-normal and potentially autocorrelated data using bootstrapped confidence intervals, with the overall significance of regression indicated by P(unc.), which is the probability that data are not correlated. PAST will also fit an optimised cubic smoothing spline to noisy data.

PAST conveniently operates on a “cut-and-paste” basis with Excel, allowing them to be used in tandem.

This study adopted a decomposition approach to the trend problem. Decomposition involves breaking the total signal (which is: trend + noise) into its components; which are deducted before trend is investigated. The approach is simple, objective, transparent and repeatable.

To leave the trend intact, it is largely the “noise” part of the data (i.e. the detrended data) that are investigated. (Detrended data are the residuals from fitting a least-squares regression relationship to the data and deducting the fit; it is a transformation option in PAST.)
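The decomposition idea can be illustrated with a toy example (Python/numpy; all numbers invented): build a series from a known trend and cycle, deduct the cycle, and a regression recovers the trend.

```python
import numpy as np

t = np.arange(0, 30, 1 / 12)                  # 30 years, monthly steps
trend = 0.009 * t                             # invented trend, deg C/yr
cycle = 0.14 * np.sin(2 * np.pi * t / 8.44)   # invented 8.44-yr cycle
signal = trend + cycle                        # total signal (noise omitted)

# Deduct the non-trend component, then estimate the trend from what is left.
recovered_slope = np.polyfit(t, signal - cycle, 1)[0]
```

Here `recovered_slope` returns the 0.009 used to build the series; with real data the deducted components must be estimated rather than known, which is why the homogeneity and spectral checks matter.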

Already, without affecting the overall naïve trend (0.0089°C/yr), or sacrificing data (which happens with annual averaging), deducting monthly averages from values for respective months removed the annual summer-winter temperature oscillation, which accounts for a considerable portion of the total noise (Figure 1).

Figure 1. Average temperature (left) and monthly anomaly data (right) plotted at the same scale. For the anomaly data, the annual ‘cyclic’ signal has been removed, compressing the apparent data range. The underlying trend remained unaffected (indicated as 0.0089°C/year, or 0.089°C/decade).

Visualisation of the anomaly data using LOWESS regression in PAST (smoothing parameter 0.3), and an expanded scatter-plot of the anomaly data in Excel (not shown here) indicated a shift may have occurred in the series in the 1950’s. This was investigated further by constructing a Cusum (or residual mass) plot in Excel. (The LOWESS tends to ‘smooth’ its way through data steps, which hides their abrupt nature.)

Cusum values are calculated as the cumulative sum of zero-centred monthly anomaly values.
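In code (Python/numpy; function name ours):

```python
import numpy as np

def cusum(anomalies):
    """Residual-mass (Cusum) curve: cumulative sum of anomalies after
    re-centring on the series mean, so the curve ends at zero. Sustained
    runs below the mean slope downward; runs above it slope upward."""
    a = np.asarray(anomalies, dtype=float)
    return np.cumsum(a - a.mean())
```

Turning points in the curve, such as those at 1957 and 1979 in Figure 2, mark where the data stop running below (or above) the long-term mean.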

The plot (Figure 2) suggested the overall dataset contained several discontinuities, with turning points (inflections) in 1957 and 1979. This indicated the data might not be homogeneous.

(Like a scatter plot, which indicates the linear characteristics of data; a Cusum chart indicates the shape of the data relative to the long-term mean.)

(Consecutive numbers, including completely random ones, show “runs” above and below the data average. Thus a Cusum chart is an indicator, not a statistical test. The test, which comes later, is the probability that such runs represent a process that is not due to chance (CPA; P > ~95%) or that steps in the time-series mean are statistically significant (P < 0.05; STARS).)

Figure 2. The Cusum tracks the behaviour of the data relative to its overall average. In this case, average temperatures tracked less than the long-term average until 1957, and tracked higher after about 1979. Thus, the data may consist of three distinct data segments each of which could display different characteristics.

It was important to check that step-changes in the data were not confounded with long-period oscillations. This was investigated using PAST.

Spectral analysis of the detrended data found statistically significant (P < 0.05) peaks at 0.02297 and 0.00987 month⁻¹, corresponding to periods of 3.6 and 8.44 years (Figure 3). Optimally fitted sinusoids detected similar signals (periodicities of 3.7 and 8.5 years). Although the amplitudes were small (0.142 and 0.137°C respectively) relative to the range of the detrended data, their effects ought still to be removed.

Figure 3. PAST graphic of the Lomb periodogram of detrended data, indicating spectral peaks at 0.00987 month⁻¹ (P < 0.05) and 0.02297 month⁻¹ (P < 0.01), corresponding to periods (cycles) of 8.44 and 3.6 years respectively. (The red lines represent the 95 and 99% confidence intervals.)

(The frequencies correspond roughly to the 8.85 yr cycle of lunar perigee and a related quasi-cycle of 4.4 years, which have long been known to affect sea-levels (see Haig et al. (2011) doi:10.1029/2010JC006645 for a general discussion). However, there seems to be little written about the possible impact of short-term cycles on terrestrial temperature.)

To remove the underlying cycles, sinusoids fitted to the detrended data were pasted back into Excel and deducted from the monthly anomaly data. Homogeneity of those residuals was investigated using CPA and STARS.
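A simplified stand-in for that step (Python/numpy): fit a sinusoid at a fixed period by least squares and deduct it. (PAST also optimises the period; here the detected period is supplied directly.)

```python
import numpy as np

def fit_sinusoid(t, y, period):
    """Least-squares fit of A*sin(wt) + B*cos(wt) at a fixed period
    (same time units as t). Returns the fitted cycle; deduct it from
    y to obtain the de-cycled series."""
    w = 2 * np.pi / period
    X = np.column_stack([np.sin(w * t), np.cos(w * t)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef
```

Deducting the fitted 8.44-year and 3.6-year sinusoids from the monthly anomalies mirrors the de-cycling described above.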

CPA and STARS both detected highly significant step-changes in the de-cycled data around April 1957, January 1979 and April 2002. STARS detected an additional change in August 2010. (Note that CPA is a less powerful test than STARS, and is less sensitive to change-points near the end of a series. A major advantage of STARS over most other step-detection techniques is that it is effective for monitoring changes over a full record.)

(Iteratively determined CPA test parameters were: target significance level, 95%; confidence level for inclusion, 99%; CI estimation at 99% (10,000 bootstraps, with replacement; Cusum estimates). For STARS, parameters were: probability, 0.1; cutoff length, 120 months (10 years); Huber parameter, 5 (no outlier adjustment); IPN4 to handle autocorrelation.)

Figure 4 shows the data segments detected by CPA superimposed on its Cusum chart (CPA graphic).

Figure 4. Cusum chart indicating changes detected by CPA.

Results of the STARS analysis are shown in Figure 5.

The step-changes identified here need to be interpreted in the light of changes in the broader climate.

STARS analysis of C. Folland’s (Hadley Centre) unfiltered Interdecadal Pacific Oscillation (IPO) index (1871-2007) found significant shifts in 1907 (from a previous strongly positive phase) to a moderately positive phase that lasted until 1947. The index shifted to a negative phase in 1948; a strongly positive phase in 1975; then back to a negative phase in 1999. These abrupt phase-changes have been linked to the severity and intensity of ENSO cycling by a number of authors.

Remembering that averaging smooths data considerably, and that the temperature data represent continental-scale averages, it was not unexpected that the temperature response to IPO phase-changes lagged by several years. The consistency of the temperature response strongly implicated the IPO (or PDO) as the likely trigger for the temperature step-changes detected here (1957, 1979, 2002; the 2010 temperature shift was outside the IPO data range).

Clearly, a linear trend model is inappropriate for stepped data, as trend analysis could be biased by the choice of start and end dates. Also, confounding steps and trends will certainly lead to spurious results and misattribution of cause and effect.

Figure 5. STARS analysis of decycled anomaly data, with the highly significant step-changes superimposed. P levels were: April 1957, 4.8E-05; January 1979, 2.01E-05; April 2002, 4.1E-05; and August 2010, 0.0089. (E denotes scientific notation; 4.8E-05 = 4.8 × 10⁻⁵.) It was extremely unlikely that the step-changes were due to chance.

For example, the overall trend in the data (January 1910 to June 2013) is 0.089°C/decade. The trend from 1950, which is a popular climate-change starting point, to June 2013, is 0.13°C/decade. The trend from the start (1910) to the end of the 2010 “hot decade” is 0.1°C/decade, or conveniently, about 1 degree for the century. All very quotable statistics; but all quite misleading.

Looking at the individual step-changes, the difference between the period average at the start of the record (-0.275°C) and the end (0.132°C) is 0.41°C; divided by the number of decades (10.64), the rate is 0.039°C/decade. But that is not a real rate either; it is simply the difference divided by the elapsed time.
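The quoted arithmetic checks out directly (values as quoted in the text; the 0.039 figure follows from the rounded 0.41 difference):

```python
start_mean, end_mean = -0.275, 0.132   # period averages, deg C (from text)
decades = 10.64                        # elapsed decades (from text)

diff = end_mean - start_mean           # 0.407, quoted as ~0.41 deg C
rate = round(diff, 2) / decades        # ~0.039 deg C/decade, as quoted
```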

So is there a trend?

Residuals from deducting the step-change means from the anomaly data were pasted into PAST and detrended. They were modelled using an optimised smoothing spline, which tracked much of the noise. After deducting the splined signal from the step-free residuals (in Excel), the data were pasted back into PAST and analysed for trend.

No significant trend was detected.

To further underscore the trend problem, the dataset was segmented at each of the changepoints indicated in Figure 5, resulting in five trend clusters, four of which were valid (January 1910 to March 1957; April 1957 to December 1978; January 1979 to March 2002; and April 2002 to July 2010). (The fifth cluster, from August 2010 to January 2013, was not defined by an end-point, so it was too short to adequately test.)

After reducing noise by deducting a cubic smoothing spline fit to each cluster’s detrended data, only one cluster (April 1957 to December 1978) showed a statistically significant (P(unc) < 0.01) trend. For that case, the trend was negative (-0.16°C/decade).

So were the step-changes real or an artefact?

It has been established in Australia’s climate literature that when the IPO (or PDO) is in its positive phase, dry El Niño conditions predominate over much of the continent, and El Niño events are more frequent and severe. When in its negative phase, La Niña dominates, which brings generally moist conditions, especially to southern Australia.

This is not to say that droughts, or floods, don’t occur during opposite phases; it is a general statement supported by published studies.

The timing of IPO/PDO shifts is evidenced by other meteorological events. For instance, a time-plot of Australia’s annual rainfall shows extreme values in the early 1950’s; 1974; 2000, and 2010/11 (1974 being the most pronounced). The rapid temperature decline after the 2000-2010 “hot decade” evidences how rapidly the climate changes in response to a major shift.

The impact of shifts in the broader climate system on floods, cyclones, droughts and heatwaves is also corroborated by day-to-day reports in historic newspapers and other documents, including Bureau of Meteorology Bulletins, special statements and journal papers.

Thus multiple lines of evidence can be drawn on to support that around the time of IPO/PDO shifts, the climate is markedly perturbed. It then takes a year or two for things to settle down, often to a new level. Clearly step-changes are real and consequential in human terms.

Regardless of their origin, events that impact on data are not trends. They create inhomogeneities in time series, which invalidate trend estimation using least-squares methods. The popular choice of 1950 as a climate change “starting point” is not a valid one because the data from 1950 to 1957 are from a pool of lower than average values that exert leverage on the trend-line. As indicated earlier, data prior to 1957 were non-trending.

It is important that time-related data are checked for inhomogeneities and other non-trend signals, and that the effects of these are removed. Otherwise the total signal is a confounded one.

Conclusions.
1. Australia’s averaged temperature data were impacted by climate shifts in the 1950’s, 1970’s, 2002 and 2010. After deducting the impact of those natural events, no residual warming trend was evident that could be related to atmospheric CO2 levels.
2. Australia’s ‘hot decade’ (2000-2010) was used relentlessly to market global warming by Australia’s Climate Commission, the Bureau of Meteorology, green groups and politicians, in order to stir a sense of catastrophe and climate-fear. However, the fear was unfounded; the drought and associated high temperatures were a temporary aberration caused by El Niño cycles, not global warming.
3. The 2010 down-step exposed much of that decade’s climate-grooming as false and deceptive. Deceit continues under the guise of “climate change”. There is no evidence at this time that climate change and CO2 concentration in the atmosphere are related.
4. The outcome of Australia’s looming election will make no difference to the climate, or to the likelihood or impact of future climate changes. Ditching the carbon tax together with ‘direct action’ would save the Nation’s taxpayers many billions of AUD, which would be better spent on Nation-building and improving access to services.

===================

Dr. Bill Johnston is a retired natural resources scientist with an interest in climate change issues.

July 21, 2013 3:33 pm

Reblogged this on Public Secrets and commented:
The upshot is that, after accounting for natural events, there is no discernible trend toward warming, let alone one that could be attributed to human-generated CO2. In other words, it’s all scare-mongering on the part of the Eco-Left and Green Statists, and the fools think they’re right.

geran
July 21, 2013 3:40 pm

There is a REALLY good linkage between increased CO2 levels and politicians “hot-air” about “climate change”. (Someone should prepare a graph….)
Maybe they should SHUT UP. Wishful thinking, I know.
(Oh look, WUWT’s ENSO meter continues to fall. The ocean heat hides so well it can even lower ocean temps!)

Stuart Elliot
July 21, 2013 3:43 pm

I wish facts were enough. I believe that a collective sense of outrage and betrayal will be required to shatter the grasping smugness of the alarmist / politician complex. But at least facts give the world a place to start the journey back to sanity.

July 21, 2013 3:43 pm

“The incumbent Labour Party have proposed to transition its existing toxic carbon tax to an emissions trading scheme, linked to that in Europe, a year earlier than planned.”
Wow, Goldman Sachs, et al. must be frothing with excitement! The “carbon tax” was basically a ruse to lull Green parliament members into voting for the bill, due to their believers’ inclination to know cap and trade is a financial scam to primarily determine the highest price “polluters” will pay NOT to change their behavior. Now the “trick” has been revealed, or is more openly unavoidable, and rather than hide, the Labour Party glories in pushing the scam faster into operation! “Labour” Party? Nope, Bankster Party

July 21, 2013 3:46 pm

No surprises here – the warm anomaly in Australia could be compared to the warm autumn of 2011 in the northeastern US and eastern Canada while the rest of the world was entering into one of the coldest winters since 1850. Meaningless in any case for the world as a whole.

Anenome Ofglobalgov
July 21, 2013 3:47 pm

We, the People, have been deceived. In the Larry Abraham book ‘The Greening’, 1993, he gives evidence that the world’s ‘Elite’ have always believed that war is necessary to keep the masses under control, that they were seeking an alternative control mechanism for the same purpose, and that the Environment was chosen as the tool. And look how they have succeeded. That is why they are using the global warming fraud, and also UN Agenda 21, to regulate, restrict and control us step by evil step until we are cleared off the land into self-contained highrise ‘transit centres’ with public transport and no cars allowed. To create dependency of We, the People, on our (controlled) ‘governments’, the derivatives markets were created to create debt overload of the banks (by removing the Glass-Steagall Act that separated commercial banks from casino investment banks like Goldman Sachs), which will collapse in the near future, and ‘bail-ins’, the theft of depositors’ money, will take place – all planned decades ahead. Australia and New Zealand in April began working on legislation to legalise the theft of our deposits by the banks that are too big to fail, so-called. There is a push around the world to prevent this catastrophe by bringing Glass-Steagall into legislation; see the Citizens Electoral Council of Australia, and LaRouchePAC too.

SamG
July 21, 2013 3:48 pm

Typo: “Labor” party
While I’m at it, I’d prefer it if nobody votes for min 51% party control over half the country. I don’t like being ruled.

Justthinkin
July 21, 2013 3:50 pm

Save many billions for OZ taxpayers. But, but, but. What about all the poor climate “scientists”? How are they going to keep the multi-million $ houses and Porsches?? Oh, the humanities.

Malcolm Miller
July 21, 2013 3:52 pm

A beautiful analysis, but I fear there will be no change in the politicians’ use of alarmism; it’s too easy to scare people into supporting their wishes for where the money goes.

July 21, 2013 3:57 pm

Australia already has the 4th highest electricity prices in the world. Now they want to increase them further. Australians are apathetic idiots. They have mines, but they tax them out of existence. They have foundries, but they tax them out of existence. They export raw product and import finished product. Meanwhile there are no jobs and no skills. They flood their own country with Chinese immigrants who compete for the few jobs that are out there, but they are willing to work for a fraction of the price and to house share 8 to a house to pay the rent. Australians are fools.

SamG
July 21, 2013 4:05 pm

Adam, do you vote? If so -you are to blame.

Frank Kotler
July 21, 2013 4:28 pm

“The approach is simple, objective, transparent and repeatable.”
Say what?

John B. Lomax
July 21, 2013 4:36 pm

I do not fully comprehend the statistics employed but understand just enough to accept this analysis as fair and accurate. I LIKE the conclusions.

Other_Andy
July 21, 2013 4:38 pm

“IF climate change is natural; and, IF the warming is more hot air than substance, neither plan is likely to achieve anything except increase the cost of living.”
IF climate change is caused by CO2; and, IF the warming is no hot air and is of substance, neither plan is likely to achieve anything except increase the cost of living.
Only one fifth of the countries have signed up to reduce CO2.
80% of all countries are either exempt or do not want to participate.
The top 5 countries, emitting more than 50% of Global CO2, are either exempt or do not participate.
Of the top 30 countries, emitting almost 90% of all Global CO2, only 10 countries (emitting about 13% of global CO2) have signed up to reduce CO2.
Nothing will change either way.

Graham of Sydney
July 21, 2013 4:40 pm

“The incumbent Labour Party…”
Unlike its UK counterpart, ours is the “Labor Party”.
Incidentally, I’m still not sure which is the greatest national disaster.

Henry Galt
July 21, 2013 5:03 pm

CS ≤ 0°C
.

tommoriarty
July 21, 2013 5:04 pm

An analysis of sea level rise around Australia indicates that the sea level orthogonal to ENSO3.4 yields a relative rise rate greater in mid-20th century than at the end of the century.
The graphical result is here…
http://climatesanity.wordpress.com/#jp-carousel-4748
Description of my methods can be found here…
http://climatesanity.wordpress.com/2013/07/17/the-search-for-acceleration-part-6-australia/
and here…
http://climatesanity.wordpress.com/2013/06/17/the-search-for-acceleration-part-1/

Unite Against Greenfleecing
July 21, 2013 5:06 pm

They might as well call it Rudd’s EU bail-out fund. Hard-working Australians’ carbon taxes to go overseas to EU mafias and white-collar criminals. What could be better?

Greg Cavanagh
July 21, 2013 5:07 pm

Stuart Elliot, We know the pollies aren’t sane, and at least 50% of the population are below average intelegence.
I realy don’t think sanity is the norm…

Darren
July 21, 2013 5:08 pm

I wonder also if most people realise that the so-called 398 ppm of CO2 measured at Mauna Loa is actually way higher than almost every other measured location. Where I live it is only 360 ppm (South Australia), with daily variance of plus/minus 20-25 ppm

July 21, 2013 5:17 pm

Bill, I wrote an article on Australian temperatures based on the work of statistician Jonathan Lowe.
http://www.bishop-hill.net/blog/2011/11/4/australian-temperatures.html
In summary, it finds that more than 40% of the increase in Australian average temperatures since the 1950s is an artifact of deriving the average from (min+max)/2. Fixed-time temperature measurements show over 40% less warming.
What has happened is low level aerosols and low level clouds (likely mostly aerosol seeded) have decreased causing earlier and higher minimum temperatures.
Your cumulative anomaly graph essentially shows, as Australia’s population increased, the aerosol (plus BC and OC) load increased, until various clean air initiatives were introduced, starting in the 1960s.

LevelGaze
July 21, 2013 5:20 pm

And Greg Cavanagh is in the 50% of the population with below average spelling ability…

Lord Galleywood
July 21, 2013 5:22 pm

UN Agenda 21 at its best – You know it makes sense, go google it to put your minds at rest – Simples.

RoHa
July 21, 2013 5:26 pm

@ SamG.
If Adam is an Australian citizen and an adult he is required by law to vote. It’s compulsory here.
And depressing.
We have the choice between the Australian Labor* Party (a bunch of half-wits, drunks, and unemployables) or the Coalition of the Liberal Party, the National Party, and the Country Liberal Party (all together another bunch of half-wits, drunks, and unemployables) as major parties, plus minor parties consisting mostly of half-wits, drunks, unemployables, and total loonies. Plus we have the single transferable vote system, which means that even if your first choice is a total loony (and why not?) your vote could trickle down to one of the time wasters of the major parties.
When they put “none of the above” and “hang the lot” on the ballot paper I’ll vote cheerfully.
(*This shows that the party can’t even spell.)

AndyG55
July 21, 2013 5:56 pm

RoHa.
You have to attend a voting booth and get your name crossed off.
There is NOTHING that says you have to vote.
You can just take your ballot paper put a big F’U across it if you like.

AndyG55
July 21, 2013 5:56 pm

ps MANY do just that.

AndyG55
July 21, 2013 6:02 pm

Greenfleecing says, “carbon taxes to go overseas to EU mafias and white collar criminals”
These would be among KRudd’s best friends.
I wonder if he has a deal going somewhere, for retirement.

pat
July 21, 2013 6:13 pm

21 July: ABC America: US drops bombs on Great Barrier Reef Marine Park
http://www.abc15.com/dpp/news/national/us-drops-bombs-on-great-barrier-reef-marine-park
21 July: US Navy set to recover bombs on Barrier Reef
The four bombs – two inert and two high explosive but unarmed – were dropped
from a pair of US Harrier jets on Tuesday.
http://www.brisbanetimes.com.au/environment/us-navy-set-to-recover-bombs-on-barrier-reef-20130721-2qcqt.html
and not a word of outrage from Ove as yet:
Professor Ove Hoegh-Guldberg was granted a fellowship from 2009-2014 to conduct climate change research on the Great Barrier Reef…
The Great Barrier Reef Marine Park Authority (GBRMPA) is a co-sponsor of this research fellowship.
http://elibrary.gbrmpa.gov.au/jspui/handle/11017/158?mode=simple

gopal panicker
July 21, 2013 6:27 pm

‘averaging’ temperature over a large area like Australia is a mistake…even worse is the so called ‘global average temp’….a nonsense concept

thingdonta
July 21, 2013 7:16 pm

“Regardless of their origin, events that impact on data are not trends.”
You don’t know whether the events are part of the overall trend or not, so this statement is statistically invalid.
You can’t filter out step changes when it is convenient. There is no way of knowing whether the step changes are part of the overall trend or not. They may or may not be. But by filtering them out routinely, you have caught the academic disease of ignoring aspects or parts of the data that don’t suit you. This filtering of step changes as you have done is an old trick, and it is invalid.
Step changes can be caused by an overall trend.

geran
July 21, 2013 7:53 pm

thingdonta says:
July 21, 2013 at 7:16 pm
thing, you must have missed the next sentence:
“The popular choice of 1950 as a climate change “starting point” is not a valid one because the data from 1950 to 1957 are from a pool of lower than average values that exert leverage on the trend-line. As indicated earlier, data prior to 1957 were non-trending.”
(I know you don’t want your comment to be “statistically invalid”. Glad to help.)

July 21, 2013 7:54 pm

RoHa says:
July 21, 2013 at 5:26 pm
When they put “none of the above” and “hang the lot” on the ballot paper I’ll vote cheerfully.
==============
legally change your name to “None_of_the_above Zzzz” and run for office.
On the ballot you will appear at the bottom as:
Zzzz, None_of_the_above
You will win by a landslide.

July 21, 2013 7:57 pm

Nice work, but I highly suggest you database this first in Access and use SQL to filter and grab the data you are looking for. What you did took a lot of time in Excel, but is just few seconds in Access. Then you copy the resulting recordset and paste into Excel for graphing.
For example, this SQL will return the Tmax range for every day of the year:
SELECT Temp_Data.Month, Temp_Data.Day, Avg(Temp_Data.MaxTemp) AS AvgOfMaxTemp, Max(Temp_Data.MaxTemp) AS MaxOfMaxTemp, Min(Temp_Data.MaxTemp) AS MinOfMaxTemp
FROM Temp_Data
WHERE (((Temp_Data.ID)='15590') AND ((Temp_Data.MaxTemp)<99999.9))
GROUP BY Temp_Data.Month, Temp_Data.Day
ORDER BY Temp_Data.Month, Temp_Data.Day;
You can also get the TMax record breaking year for any day. Takes a nested SQL to do it, but it’s real simple and fast.
Just a suggestion.

July 21, 2013 8:31 pm

Thanks, Dr. Johnston. A lot of hard work!
But I find an analysis based on P-Values difficult to accept though, ever since I read “Unsignificant Statistics: Or Die P-Value, Die Die Die” (William M. Briggs, 13 June 2013), at http://wmbriggs.com/blog/?p=8295

thingodonta
July 21, 2013 8:37 pm

Geran.
As you probably know, 1950 is often chosen as a starting point because it is from around this date that alarmists chose to believe that the sun no longer had any effect on warming. (There is a paper in AR4 that describes this which is given prominent coverage). They chose this because they assumed that, because the sun was near peak activity more or less around the mid 20th century but solar activity got no higher, no further warming should occur from the sun after this; therefore all warming from around ~1950 should be taken as man-made.
This choice, and the reasoning, was false from the start. The earth continues to warm for a period even if the incoming heat remains constant, same as for a heating pot on a stove, as others have described elsewhere. There is a solar heat lag diurnally, seasonally, and over decades.
1950 also was chosen because it is clean, round number, and moreover more or less coincides with post war growth. All this suits the alarmists.
But I really don’t see how my point is changed from values before or after 1950-1957, or any other date. Step changes, before or after trends, are all relevant and must be included if one does not KNOW what causes the step changes, or ‘before and after averages’, to begin with. You cannot simply filter out step changes assuming they are not part of the system, or overall trend, to begin with. The best example I can think of is any typical 1st year chemistry experiment, where a reaction may proceed in a step-like fashion even if the underlying causative input is constant.

Nick Stokes
July 21, 2013 9:40 pm

This analysis goes wrong right from the start. It takes anomaly data and converts back to the original absolute temperatures. I don’t know why – the absolute data is on the site here. But anyway, it then recreates anomalies by subtracting the monthly averages for each station for the available period of data. This is said to avoid skewing by choice of period.
Well, it does, and it’s much easier too, because for a fixed anomaly period you might have to worry about stations that don’t have data in the period. Not here. But there is a very good reason why scientists don’t do it that easy way. It messes up the trend.
I’ve described why, with diagrams, here. But briefly, here’s an example. Suppose you had two nearby stations, A recording from 1900 to 1975 and B from 1925 to 2000. To simplify, they record identical temperatures, rising 0.02°C/yr from -1°C to 1 °C.
Now calculate anomalies based on 1935-1965. The mean is zero, so there is no change. And the average of the two also has a trend of 0.02°C/yr, as expected.
But now suppose you calculate the anomaly as done here. Station A has a mean of -0.25, so subtract that (from 1900-1975). Station B has a mean of 0.25, so subtract that (from 1925-2000). The average anomaly (of A and B) now goes from -0.75 to 0.75. The trend has been flattened and is now 0.014 °C/yr.
You have to use a fixed period (doesn’t matter what), to get the trend right.
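The two-station example can be checked numerically. Below is a minimal sketch (assuming numpy; the station periods, rates, and fixed 1935-1965 base are as described above) showing the own-period anomaly average flattening the slope to roughly the 0.014 °C/yr figure quoted:

```python
import numpy as np

years = np.arange(1900, 2001)
temps = -1.0 + 0.02 * (years - 1900)   # identical "true" series: -1 to +1 deg C

a_mask = years <= 1975                 # station A records 1900-1975
b_mask = years >= 1925                 # station B records 1925-2000

# Anomalies relative to each station's own period of record
anom_a = temps - temps[a_mask].mean()  # A's own-period mean is -0.25
anom_b = temps - temps[b_mask].mean()  # B's own-period mean is +0.25

# Average whichever anomalies exist in each year
combined = np.where(a_mask & b_mask, (anom_a + anom_b) / 2,
                    np.where(a_mask, anom_a, anom_b))
own_base_slope = np.polyfit(years, combined, 1)[0]

# Anomalies against a common fixed base (1935-1965) keep the trend intact
base = (years >= 1935) & (years <= 1965)
fixed_slope = np.polyfit(years, temps - temps[base].mean(), 1)[0]

print(own_base_slope, fixed_slope)     # ~0.0145 vs 0.02 deg C/yr
```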

July 21, 2013 9:40 pm

[snip way off topic -mod]

SAMURAI
July 21, 2013 10:12 pm

The PDO entered its 30-yr cool phase from 2008, so there will be fewer/weaker El Nino events and more La Nina events for the next 30 years or so, which will tend to generate cooling temperatures in Australia, as is already shown by the step-drop around 2010 seen in the above chart.
Australia is located in the Earth’s largest oceanic heat sink, so it’s only natural its climate is influenced more by El Nino/La Nina events than other countries.
The Island countries to the North and East of Australia (Indonesia, New Guinea, Philippines, New Zealand) also block ocean currents from efficiently removing the tremendous amount of heat dumped into Australia’s surrounding oceans during El Nino events adding to warmer/prolonged temperatures during El Nino events.
A combination of a cooling Pacific and much weaker solar cycles/increased cloud cover in the coming decades will naturally and eventually lower Australia’s ocean/land temperatures.
The only things Australia’s economically destructive CO2 taxes will achieve are: higher unemployment, increased debt, lowered living standards, inflation, reduced capital investment, a weaker currency, business exodus, uncompetitive manufactured products, slower GDP growth, decreased industrial output, decreased exports, increased imports and, oh yeah, perhaps 0.007C lower world temperatures…can’t forget that…
The world has gone completely insane.

Nick Stokes
July 21, 2013 10:13 pm

thingodonta says: July 21, 2013 at 8:37 pm
“As you probably know, 1950 is often chosen as a stating point because it is from around this date that alarmists chose to believe that the sun no longer had any effect on warming.”

No, Gistemp used 1951-1980 for a simple reason: they believed 3 decades were needed, and the index was developed in the ’80s. In fact, Hadcrut and NOAA used 1961-90 for the same reason; they started later.
Once you have an anomaly base, you have printed data sets, and it’s a pain to change the base (and there’s no reason to). The pain is less nowadays, when people routinely download a whole online data set, and changes can be more frequent.

Bill Johnston
July 21, 2013 10:59 pm

Interesting comments so far. The reason for recalculating monthly anomalies was to create a zero-centred dataset. If you sum the raw anomalies they are not zero, thus a Cusum curve, which is quite revealing in a time-series context, does not “close”. We are also not talking about separate climate stations here; we have a dataset essentially made up of temperatures by area weights. It is synthetic; it is an index; and it is true that no place probably experiences Australia’s average temperature.
I understand the point made above by Nick Stokes, but that is not the point. The data itself already has embedded in it the answer to the problem he has raised.
The issue with the step changes is that they represent inhomogeneities in the data stream. Valid least-squares estimation of trend requires that data are homogeneous, otherwise their “trend” is not deterministic. We have a situation here where 4 of the 5 segmented trends were not significant; the other was negative. Putting those together with steps in between does not a real trend make.
To Thingodonta, the step changes were real, attributable and this is not the first time scientists have noted their existence. (See the STARS website). As I’ve made the point, if steps remain in, the trend is confounded; the “real CO2 related trend”, should be ex-the steps; but it is not detectable.
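The Cusum diagnostic described here is straightforward to reproduce. A minimal sketch, assuming numpy, on synthetic data (the step size and change point below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic monthly anomalies: noise with one upward mean shift at index 600
x = rng.normal(0.0, 0.5, 1200)
x[600:] += 0.8

# Zero-centre the series, then accumulate; zero-centring makes the Cusum
# curve "close" (start and end at zero)
x = x - x.mean()
cusum = np.cumsum(x)

# An upward step in the mean produces a V-shaped Cusum; its minimum marks
# the change point
change = int(np.argmin(cusum))
print(change)
```

The directional behaviour Bill mentions is exactly this: runs of same-sign anomalies accumulate into long straight limbs of the Cusum, so a mean shift stands out far more clearly than in the raw scatter.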

gavin pruden
July 21, 2013 11:18 pm

For their own reasons, neither side wants to talk about the most interesting point about direct action: how cheap and easy it will be to shut it down if it snows in Darwin. The left don’t want the unsure to know the plan has a fall-back position, and the right don’t want to admit their doubts about global warming. At worst you end up with some overpriced trees and no money leaves the country. It’s still a waste of money, but you can end it quickly when the global warming scam goes tits up, and in the meantime the wildlife gets some more trees.

Nick Stokes
July 21, 2013 11:23 pm

“I understand the point made above by Nick Stokes, but that is not the point. The data itself already has embedded in it the answer to the problem he has raised. The issue with the step changes is that they represent inhomogeneities in the data stream. “
How so? You’ve spuriously flattened the trend even before you start looking for change points.
But using patchwork anomaly bases also introduces inhomogeneities. In the anomaly average example I cited, there are now large spurious discontinuities at 1925 and 1975 in what should have been a uniform line.
As I said, it’s wrong from the start because of anomaly treatment. But you’ve also concluded that there is insignificant trend, after subtracting what you say are change point effects, which in fact have a trend.

Bill Johnston
July 22, 2013 12:59 am

Nick, there is no spurious flattening of the data. The trend in the original series was 0.009 deg./yr. They like to refer to it as 1 degree per century.
We have step-changes that are due to broad-scale shifts in the climate (as indicated by the IPO/DPO) which are stochastic events; they don’t occur every 20 or 40 years, they simply occur irregularly and unpredictably. We had a major El Nino cluster post 2010. If we leave these in, it is unarguable that it is they that determine the trend.
For your model, trend = CO2 trend + IPO/PDO step-trends + …..
I don’t personally have an issue with such a model, except it is not about CO2. If there was an underlying 200-yr cycle that was low in 1910 and high in 2010, then that would be trend determining wouldn’t it?
You would know that in regression analysis, factors are additive; thus steps like these can be adjusted for by subtraction. It is no different from subtracting the ‘cycle’ by deducting long-term monthly averages; or significant longer-period oscillations. All I’ve done is take the model above and deduct the steps (and noise), that should leave the CO2 trend intact.
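The additive decomposition described above can be illustrated with a toy least-squares fit (assuming numpy; the step dates and magnitudes below are invented, not the ACORN-SAT values):

```python
import numpy as np

years = np.arange(1910, 2011, dtype=float)

# Synthetic series with NO underlying trend, only two upward mean shifts
y = np.where(years >= 1957, 0.4, 0.0) + np.where(years >= 1978, 0.3, 0.0)

# Naive trend fitted straight through the steps
naive_slope = np.polyfit(years, y, 1)[0]

# Additive model: trend + step indicators fitted together, so the steps are
# carried by their own terms rather than loaded onto the slope
X = np.column_stack([years - years.mean(),
                     (years >= 1957).astype(float),
                     (years >= 1978).astype(float),
                     np.ones_like(years)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print(naive_slope, coef[0])   # positive naive slope; residual trend ~0
```

Because the factors are additive, fitting the step indicators alongside the trend is equivalent to subtracting the step means first, which is the adjustment being described.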
The step-changes may have a trend (they do in this case), but their magnitude and timing have not been related to CO2. They don’t always step upwards; the last one was strongly negative.
It would be wrong from the start not to plot the data up and explore its properties. By that I mean, look for constancy in slopes; changes in the data ranges; the Cusum is a very useful tool because it emphasises directionally similar data behaviour. An eye-ball plot of the raw data clearly indicated something changed in the 1950’s. I then explored and analysed that.
I looked at your work but I’m afraid I don’t understand it. As alluded to by von Storch, the message often gets lost in the use of ‘advanced techniques’, which is a problem in climate research. The statistics may be great but the extension approach is often lacking.
What I’ve tried to do here is present an easily understood, uncomplicated approach to a pretty simple problem. The packages I used are in the public domain and anyone can grab the data, (or any other time-related data) grab the software and have a go.
To say it again, a basic tenet of regression analysis is that data are homogeneous – that their properties don’t change through time. These data were inhomogeneous!
Cheers,
Bill

Patrick
July 22, 2013 1:05 am

“AndyG55 says:
July 21, 2013 at 5:56 pm”
That’s typically called a “donkey vote”. And, unfortunately, it’s a wasted vote. Even though “voting is compulsory”, one of the soon-to-be-retiring ALP ministers, who was the lead singer of “Midnight Oil”, never used to “vote” himself.
Rudd is so popular here as opposed to Abbott, but then the LNP has Turnbull, who is, in effect, Rudd (Light).
“gopal panicker says:
July 21, 2013 at 6:27 pm”
The BoM recently changed the way they “work out” what the average temperature is for the whole continent of Australia, but it has not yet released how the adjustments are made – and bear in mind that no one anywhere at any time has ever observed or measured an average; it is totally man-made. ACORN-SAT uses 112 devices nationwide. That’s 1 device per ~68,500 square kilometres.
There are far too many people in Australia who, sadly, accept this garbage!

SamG
July 22, 2013 1:24 am

RoHa
I’m Australian; I understand it’s compulsory, but I want to know if Adam votes and whether he does so enthusiastically. You can still vote informal. You see, most people who complain about government either defer to the state when it benefits them, have erroneous views on voluntary social and economic interaction, and/or believe democracy yields greater prosperity and freedom. It doesn’t.

thingodonta
July 22, 2013 1:37 am

“The only things Australia’s economic-destructive CO2 taxes will achieve are: higher unemployment, increased debt, lowered living standards, inflation, reduced capital investment, a weaker currency, business exodus, uncompetitive manufactured products, slower GDP growth, decreased industrial output, decreased exports, increased imports, lower living standards and oh, yeah, perhaps 0.007C lower world temperatures…can’t forget that…”
You forgot one thing: better financial coffers and feeling good amongst the elite green priests, whilst everybody else pays for them to feel good and have more money, more green programs, etc. The only people who benefit are the elite amongst the new self-proclaimed rulers – same as in Stalin’s day.

July 22, 2013 1:47 am

A number of people are missing the point of averages and anomalies.
I agree the average temperature of Australia has no real meaning, but no one really cares what it is anyway. What people are interested in, is the change in some average (the anomaly) as signal of worldwide climate warming (or otherwise).
If you read my link above, you will see the BoM/CSIRO have wilfully used the min/max dataset because it shows the warming they want (although obviously not enough) and ignored the fixed-time temperature measurements, which are superior in several ways, not least because they are not sensitive to factors that affect the timing of min and max.
If this were a public company, people would go to jail for this kind of selective use of data.

Bill Johnston
July 22, 2013 2:16 am

Not that simple, Philip. For normal observations, liquid-in-glass thermometers are read at 9 AM. The maximum is for the previous 24 hours; it usually occurs sometime the previous day. The minimum is usually the overnight temperature; it is usually recorded in the AM of the day it is read. Some stations adjust for this in calculating averages as (Min+Max)/2 by shifting the max to the day before (some don’t, perhaps?).
Automatic weather stations simply use one probe to record continuously – every minute or so (there is a standard). The maximum in that case is the highest reading in the 24 hours; the minimum is the lowest; the average is still calculated the same.
Before Max and Min thermometers came along, readings were physically taken at a time interval. This usually missed what the actual max and min temps were.
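The gap between the (Max+Min)/2 convention and a true all-readings average is easy to see with a toy day of one-minute probe readings (assuming numpy; the Gaussian afternoon-peak diurnal shape below is hypothetical, not real AWS data):

```python
import numpy as np

# One day of one-minute readings from a single probe; the diurnal cycle is a
# hypothetical Gaussian afternoon peak, chosen to be asymmetric in time
hours = np.arange(0, 24 * 60) / 60.0
temps = 15.0 + 8.0 * np.exp(-((hours - 14.0) ** 2) / 8.0)

midrange = (temps.max() + temps.min()) / 2   # the (Max+Min)/2 convention
true_mean = temps.mean()                     # full-resolution daily average

print(midrange, true_mean)   # midrange sits well above the true mean
```

Because the warm peak is brief and the cool period long, the midrange overstates the day’s average; a symmetric diurnal cycle would make the two agree.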
The point I was making was that if there is a warming trend in Australia’s data, it is well and truly hidden by the natural step-changes.
Cheers,
Bill

July 22, 2013 3:05 am

Minimum is usually the overnight temperature; it usually is recorded in the AM of the day it is read.
Common misconception. Minimum temperatures usually occur in the daytime, after dawn; how long after dawn varies by latitude and season. The error almost everyone makes is to assume that, because minimum temperatures are strongly influenced by overnight temperatures, any change in minimum temperature results from changes in overnight temperatures. This has not been the case over the last 60 years.
Before Max and Min thermometers came along, readings were physically taken at a time interval.
The min/max thermometer was invented in 1782, and has been in widespread use for more than 150 years.
Automatic weather stations simply use one probe to record continuously – every minute or so (there is a standard). The maximum in that case is the highest reading in the 24 hours; the minimum is the lowest; the average is still calculated the same.
That they are recording continuously makes no difference, because the average temperature compilers still use min/max temperatures.
if there is a warming trend in Australia’s data, it is well and truly hidden by the natural step-changes.
Assumption. In order to be convinced of an ocean oscillation effect on land temperatures I’d need to see strong correlations in the raw data.

AndyG55
July 22, 2013 5:08 am

Patrick, You misunderstood what I was saying, I think.
RoHa said voting is compulsory.. It is not.
Turning up at a booth and getting your name crossed off saves you a fine, but what you do with your ballot paper after that is up to you.
And NO !, I will not be doing a donkey vote.
And I do not understand why Rudd is popular at all.
He is an egotistical megalomaniacal prat, who is all talk and scatty ill-conceived plans.. that never work out.
But he seems to have many people fooled, again..
You would think people would remember the first time he was there.. a total mess.

Bill Johnston
July 22, 2013 3:18 pm

Phillip Bradley, you made some important points. Just before sun-up is when frosts fall in winter. Although to an observing eye it is still dark, let’s call it a grey area.
You can download automatic weather station data from BoM and track just when the minimum temperature occurs and it varies through the night (I think of “night” recordings as between 3PM and 9AM). By 9AM temperature is on-the-rise and the minimum recorded temperature has invariably been recorded.
The Bureau has confirmed that where liquid-in-glass and AWS data are available, it is the AWS data that have precedence. Minimum and maximum data are derived from the single-sensor data stream. The average is still calculated the same way – (Max+Min)/2, which is what I indicated.
I have not made an assumption about oscillations in the data. They were objectively detected using 2 approaches. You could easily check that for yourself by repeating my analysis. Because they closely approximate the period of the lunar cycles, that is a likely explanation. Perhaps you could put forward an alternative view. (It is also possible they are simply statistically significant artifacts.) The way to tell is to go the next step and look at cross correlation of the temperature and ocean signals – see if they line up. I have not done that because in the context of the essay it was not needed. (The signals could be lagged.)
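The lagged cross-correlation check suggested here can be sketched as follows (assuming numpy; the synthetic series and the 7-step lag are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ocean index; "temperature" is the same signal delayed by 7 steps
ocean = rng.normal(0.0, 1.0, 500)
lag = 7
temp = np.roll(ocean, lag)
temp[:lag] = rng.normal(0.0, 1.0, lag)   # overwrite the wrapped-around values

def lagged_corr(x, y, k):
    """Correlation of x[t] with y[t + k]."""
    if k > 0:
        return np.corrcoef(x[:-k], y[k:])[0, 1]
    if k < 0:
        return np.corrcoef(x[-k:], y[:k])[0, 1]
    return np.corrcoef(x, y)[0, 1]

# Scan a window of lags and see where the two series line up best
best = max(range(-24, 25), key=lambda k: lagged_corr(ocean, temp, k))
print(best)   # recovers the 7-step lag
```

If the temperature and ocean signals are related but lagged, the correlation peaks at a non-zero lag rather than at zero, which is the check being proposed.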
Don’t confuse the IPO/DPO induced step changes with the oscillation; I tested for and removed the oscillation before testing for homogeneity so the two signals were not confounded.
Cheers,
Bill

Brian H
July 22, 2013 9:47 pm

Greg Cavanagh says:
July 21, 2013 at 5:07 pm
Stuart Elliot, We know the pollies aren’t sane, and at least 50% of the population are below average intelegence.
I realy don’t think sanity is the norm…

Thare speling is crapy, tu. They really lack intelligence.

Garfy
July 24, 2013 11:09 am

Marcel Leroux, a French climatologist, would agree, but nobody ever listened to him.