This is Part 8, essentially a wrap-up; see all other Parts 1-7 here: http://kenskingdom.wordpress.com/
Guest post by Ken Stewart, July 2010
“…getting seriously fed up with the state of the Australian data.”
(Harry the mystery programmer, in the HARRY_READ_ME file released with the Climategate files.)
He’s not the only one. In a commendable effort to improve the state of the data, the Australian Bureau of Meteorology (BOM) has created the Australian High-Quality Climate Site Network. However, the effect of doing so has been to introduce a warming bias of 41.67% into the temperature record. And the climate analyses based on this appear to increase the bias even further, to around 66.67%.
This post is the summation of what I believe is the first ever independent check on the official climate record of Australia. It is also the first ever independent check on the official record of an entire continent.
I will try to keep it simple.
Here is the official version of “the climate trends and variations in the Australian instrumental record” published for the Australian public, the government, and all the world at http://www.bom.gov.au/climate/change/aus_cvac.shtml
Time Series Graph using their handy trend tool:
0.1 degree C per decade, or 1 degree per 100 years.
This explanation appears on the BOM website:
The temperature timeseries are calculated from homogeneous or “high-quality” temperature datasets developed for monitoring long-term temperature trends and variability… Where possible, each station record in these datasets has been corrected for data “jumps” or artificial discontinuities caused by changes in observation site location, exposure, instrumentation or observation procedure. This involves identifying and correcting data problems using statistical techniques, visual checks and station history information or “metadata”.
and
“High-quality” Australian climate datasets have been developed in which homogeneity problems have been reduced or even eliminated.
I have given a very brief summary of this process in http://kenskingdom.wordpress.com/2010/05/12/the-australian-temperature-record-part-1-queensland/
(I should point out that this method was changed somewhat by Della-Marta et al (2004), who also used a distance-weighting method and included some urban stations and stations with much shorter records.)
Torok and Nicholls (1996), authors of the first (published) homogenization, rightly state that
“ A high-quality, long-term surface air temperature dataset is essential for the reliable investigation of climate change and variability.”
Here is the map showing the 100 currently used High Quality stations that supposedly meet this requirement:
Before my first post, I asked BOM to explain some of the odd things I had noticed in the Queensland data. Amongst others, this statement by Dr David Jones, Head of Climate Monitoring and Prediction, National Climate Centre, Bureau of Meteorology, in an email dated 25 April 2010, caught my eye:
“On the issue of adjustments you find that these have a near zero impact on the all Australian temperature because these tend to be equally positive and negative across the network (as would be expected given they are adjustments for random station changes).”
This statement has been the yardstick for this study.
Not having access to the list of stations, the metadata, the software used, or the expertise of BOM, the average citizen would normally accept the published results as they stand. However, I wanted to have a closer look. Surely the results of any adjustments should be easy to compare with the previous record.
I downloaded annual mean maxima and minima for each of the sites from BOM Climate Data Online, calculated annual means and plotted these. Frequently, two or three stations (some closed) were needed to cover the entire record from 1910-2009, and even then there were sometimes gaps in the record: from 1957 to 1964, for example, many stations’ data have not been digitised. (But 8 years of missing data is nothing: many stations have many years of estimated data “filled in” to create the High Quality series.) I also downloaded the annual means from the High Quality page and plotted them. I then added a linear trend for each.
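For anyone wanting to replicate this, the trend calculation is just an ordinary least-squares fit to the annual means, scaled to degrees per decade. Here is a minimal sketch in Python, using an invented station series rather than real BOM data (the real series would come from Climate Data Online):

```python
def linear_trend_per_decade(years, temps):
    """Ordinary least-squares slope of temperature on year, in degrees C per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(temps) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
    den = sum((x - mean_x) ** 2 for x in years)
    return (num / den) * 10  # per-year slope scaled to per-decade

# Hypothetical annual means: a station warming steadily at 0.06 C/decade.
years = list(range(1910, 2010))
annual_means = [20.0 + 0.006 * (y - 1910) for y in years]
trend = linear_trend_per_decade(years, annual_means)  # ~0.06
```

With real station data the series would have gaps and composite splices, so the fit would be run on whatever years are actually available.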
I have exhaustively rechecked data and calculations at all 100 sites before compiling this summation. I have decided to amend only one, Bowen, by creating a splice (reducing early data and omitting some data) so that the trend matches that of the HQ series. This was done with no overlap at all, but it makes the plot lines roughly meet. It is unsatisfactory, and Bowen should really be excluded; the net effect on the Queensland and Australian trends is negligible (0.01 C).
Let’s look at Dr Jones’ assertion for the whole of Australia.
“…a near zero impact on the all Australian temperature …”
WRONG.
We can look at the record in a number of ways. Here is the graph of the average raw and adjusted temperatures for all 100 stations. The discrepancy is obvious.
That’s 0.6 degrees C per 100 years for the raw data. The adjusted trend is 0.85 degrees C per 100 years.
Before anyone complains that anomalies give a more accurate picture of trends across a large region: I also calculated anomalies from the 1961-1990 mean, first for the all-Australia means (0.6 raw increasing to 0.85 HQ)
and then for all 100 stations individually, with a slightly different result: 0.6 raw to 0.9 HQ, a 50% increase.
But the figure BOM publishes is 1.0 C: that’s a two-thirds increase!
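The anomaly method itself is straightforward to reproduce. A sketch of the calculation, using a hypothetical handful of annual means (the 1961-1990 baseline period is the one actually used):

```python
def anomalies(series, base_start=1961, base_end=1990):
    """series: dict of year -> annual mean temperature (C).
    Returns dict of year -> anomaly relative to the base-period mean."""
    base = [t for y, t in series.items() if base_start <= y <= base_end]
    baseline = sum(base) / len(base)
    return {y: round(t - baseline, 3) for y, t in series.items()}

# Invented annual means, purely to show the mechanics.
demo = {1961: 20.0, 1975: 20.5, 1990: 21.0, 2009: 21.5}
anoms = anomalies(demo)  # baseline is 20.5, so 2009 has an anomaly of +1.0
```

Averaging anomalies rather than raw temperatures is what lets stations with different base climates be combined, which is why both versions are shown above.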
We can also look at the average adjustment for each station: +0.23 degrees Celsius. (The table of all 100 stations is too large to include.)
Or we can find the median adjustment (+0.275 C), and the range of adjustments:
So much for “these tend to be equally positive and negative across the network”.
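The summary statistics above are simple to check once the per-station adjustments are tabulated. A sketch with an invented six-value list (chosen only to illustrate the calculation; the real table has 100 entries and different values):

```python
import statistics

# Hypothetical per-station adjustments in degrees C, for illustration only.
adjustments = [-0.1, 0.05, 0.2, 0.3, 0.35, 0.58]

mean_adj = round(statistics.mean(adjustments), 3)      # average adjustment
median_adj = round(statistics.median(adjustments), 3)  # median adjustment
spread = (min(adjustments), max(adjustments))          # range of adjustments
```

If the adjustments really were “equally positive and negative across the network”, both the mean and the median would sit near zero; a clearly positive mean and median is the bias being described.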
We can also look at the “quality” of the High Quality stations.
Urban vs Non-urban:
“Please note: Stations classified as urban are excluded from the Australian annual temperature timeseries and trend map analyses. Urban stations have some urban influence during part or all of their record.” (http://www.bom.gov.au/cgi-bin/climate/hqsites/site_networks.cgi?variable=meanT&period=annual&state=aus)
In Part 1 I showed how 3 Queensland sites listed as urban by Torok and Nicholls (1996) are now non-urban. Della-Marta et al resurrected a number of others in other states.
The full list is: Cairns AMO, Rockhampton AMO, Gladstone MO, Port Hedland AMO, Roebourne, Geraldton AMO, Albany AMO, Alice Springs AMO, Strathalbyn, Mount Gambier AMO, Richmond AMO, Mildura AMO, East Sale AMO, Cashmore Airport, Launceston Airport.
15% of the network consists of sites that BOM is at pains to assure us are not used to create the climate record.
Long records:
“… the number of stations is much smaller if only stations currently operating and with at least 80 years of data are considered. To increase the number of long-term stations available, previously unused data were digitised and a number of stations were combined to create composite records… all stations in the dataset (were) open by 1915.” (Torok and Nicholls)
Torok wanted 80 years of data; Della-Marta et al and BOM have settled for much less. There are six stations with no data before 1930 (80 years ago), but BOM has included them anyway. Some are truly dreadful: Woomera, 1950; Giles, 1957; Newman, 1966.
As well, many of the sites have large slabs of data missing, with the HQ record showing “estimates” to fill in the missing years.
Here is a graph of the number of stations with data available for each year.
Note that only 70% of raw data is available for 1910; 90% by 1930; another drop from 1945 to 1960; and the huge drop off in HQ data this decade!
Data comparison:
“Generally, comparison observations for longer than five years were found to provide excellent comparison statistics between the old and new sites… Comparisons longer than two years, and sometimes between one and two years, were also found to be useful if complete and of good quality… Poor quality comparisons lasting less than two years were generally found to be of limited use.” (Della-Marta et al, 2004)
Wouldn’t “excellent comparison statistics” be essential for such an important purpose? Apparently not. There are many sites with less than five years of overlapping data from nearby stations (up to 20 km apart). A number of sites have no overlap at all.
This results in enormous gaps in the temperature record. Here is the map of the High Quality network, with sites deleted if they are (a) listed as urban in 1996, (b) sites with less than 80 years of observations, or (c) sites with less than 5 years of comparative data overlap (sometimes all of the above!).
The sites left are concentrated in eastern and south-western Australia, with an enormous gap in the centre. Check the (admittedly very approximate) scale.
And finally…
Claims made in the State of the Climate report produced by BOM and CSIRO in March 2010.
Since 1960 the mean temperature in Australia has increased by about 0.7 °C . The long term trend in temperature is clear…
TRUE. But the raw data shows the mean temperature since 1910 has increased by only 0.6 C.
Australian average temperatures are projected to rise by 0.6 to 1.5 ºC by 2030.
REALLY? That would require between 5 and 12 times the rate of warming seen in the raw temperature record, or between 3 and 7.5 times that shown by BOM’s published figures.
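The multiples quoted here follow from simple arithmetic, assuming the projection window is roughly the 20 years from 2010 to 2030:

```python
# Back-of-envelope check of the "5 to 12 times" claim, assuming a 20-year window.
window = 20                 # years, 2010 to 2030
raw_rate = 0.6 / 100        # degrees C per year, raw-data trend
bom_rate = 1.0 / 100        # degrees C per year, BOM's published trend

raw_20yr = raw_rate * window    # warming expected in 20 years at the raw rate: 0.12
bom_20yr = bom_rate * window    # warming expected at the published rate: 0.20

# How many times faster the projected 0.6 to 1.5 C range would need warming to be:
mult_raw = (0.6 / raw_20yr, 1.5 / raw_20yr)  # roughly 5 to 12.5 times the raw rate
mult_bom = (0.6 / bom_20yr, 1.5 / bom_20yr)  # 3 to 7.5 times the published rate
```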
Much of Australia will be drier in coming decades
MAYBE NOT. See http://kenskingdom.wordpress.com/2010/03/20/political-science-101/
Our observations clearly demonstrate that climate change is real.
TRUE- that’s what climate does.
CSIRO and the Bureau of Meteorology will continue to provide observations and research so that Australia’s responses are underpinned by science of the highest quality.
“Highest quality”? REALLY?
Conclusion
This study shows a number of problems with the Australian High Quality Temperature Sites network, on which the official temperature analyses are based. Problems with the High Quality data include:
- It has been subjectively and manually adjusted.
- The methodology used is not uniformly followed, or else is not as described.
- Urban sites, sites with poor comparative data, and sites with short records have been included.
- Large quantities of data are not available, and have been filled in with estimates.
- The adjustments are not equally positive and negative, and have produced a major impact on the Australian temperature record.
- The adjustments produce a trend in mean temperatures that is roughly a quarter of a degree Celsius greater than that of the raw data.
- The warming bias in the temperature trend is 41.67%, and in the anomaly trend is 50%.
- The trend published by BOM is 66.67% greater than that of the raw data.
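The bias percentages in this list follow directly from the trend figures quoted earlier in the post (all in degrees C per 100 years):

```python
# Recomputing the quoted bias percentages from the trend figures in the post.
raw_trend, hq_trend, published_trend = 0.6, 0.85, 1.0   # station-average trends
anom_raw, anom_hq = 0.6, 0.9                            # anomaly-based trends

bias_hq = round((hq_trend - raw_trend) / raw_trend * 100, 2)               # 41.67
bias_anom = round((anom_hq - anom_raw) / anom_raw * 100, 2)                # 50.0
bias_published = round((published_trend - raw_trend) / raw_trend * 100, 2) # 66.67
```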
The High Quality data does NOT give an accurate record of Australian temperatures over the last 100 years.
BOM has produced a climate record that can only be described as a guess.
The best we can say about Australian temperature trends over the last 100 years is “Temperatures have gone down and up where we have good enough records, but we don’t know enough.”
If Anthropogenic Global Warming is so certain, why the need to exaggerate?
It is most urgent and important that we have a full scientific investigation, completely independent of BOM, CSIRO, or the Department of Climate Change, into the official climate record of Australia.
I will ask Dr Jones for his response.
(Thanks to Lance for assistance with downloading data, and janama for his NSW work. Also Jo Nova for her encouragement.)


It could be even worse than we thought. There are divergences between the “raw” data that Ken has downloaded and the CRU2010 data that the Met Office released after Climategate, which is “Based on the original temperature observations sourced from records held by the Australian Bureau of Meteorology” and apparently taken from the CLIMAT reports.
http://www.metoffice.gov.uk/climatechange/science/monitoring/subsets.html
Here are a couple of comparisons of the HQ stations with the files the Met office released.
These are some of the results I got when comparing the CRU stations with the BOM.
http://www.warwickhughes.com/blog/?p=510
Halls Creek
http://members.westnet.com.au/rippersc/hccru2010.jpg
http://members.westnet.com.au/rippersc/hccru2010line.jpg
Note that 2007, 2008 and 2009 are cooler in the Met Office version, and the slope actually cools from 1950 once you drop off the 1899 figure (the only year that Phil Jones used from the old town, which is 12 km away and 63 metres downhill from the new town).
Meekatharra
http://members.westnet.com.au/rippersc/meekacru2010.jpg
http://members.westnet.com.au/rippersc/meekacru2010line.jpg
Note the same thing for 2007-2009, and the lesser slope from the BOM data.
Which lot is the “raw” data? I would find it strange if the CRU/Met Office actually adjusted cooling into the BOM data.
Then there is this, which was done by CSIRO for the timber industry:
http://www.timber.org.au/resources/ManualNo1-ClimateData.pdf
Graeme W says:
July 27, 2010 at 9:55 pm
“As I understand it, old records are manual reads, so a 0.5 degree error is certainly a possibility. The modern readings are down to a 0.1 degree, so the error is probably 0.05 (though I don’t know what the instrument accuracy is).”
Any reasonably skilled met observer can read an old-fashioned standard thermometer to 0.1 degree F (error +/-0.05 degree F), or in Centigrade, with half-degree points marked, to 0.05 degree. The accuracy and stability of those manual thermometers was considerably better than that of most of the digital recorders used today, which have to be “calibrated”, leaving room for endless fudging. You can believe the old pen-and-ink raw data, back several hundred years. By contrast, numbers on a computer can too easily be manipulated or downright falsified.
(Occasionally, a thermometer’s mercury or alcohol thread may snap without being immediately noticed, in which case readings with a bias of a few tenths of a degree are possible. In front of me right now I’ve got an old one of mine that I used to use as the wet bulb, swinging it manually to get the depression, and it has a disconnected 0.2 F bit of thread at the top. It’s easy to correct for once you spot the problem, though.)
I don’t think there’s any doubt that there has been some warming during the satellite era. The positive PDO just happens to run during this period. Also, I think there’s little doubt the 1930s were warmer than present in many parts of the world. The number of record highs is proof. Therefore, we have a situation where many of the changes serve to make older temps lower. That produces a more consistent degree of warming, and a larger amount, than is likely to be the actual case.
In any event, it will all work its way out in the next few years. If Joe Bastardi is right and we see a big La Niña next winter, the warmers will face lots and lots of questions.
As an absolute layman, I was wondering why on earth temperature measurements are so often taken at airports. Doesn’t the mass of fiercely hot air blown out of the aircraft engines mix with the normal air and warm it?
Over a day thousands of gallons of fuel would be burnt launching aircraft into the sky, so why do people think that thermometers at airports wouldn’t record a warmer-than-natural temperature?
Why measure temperatures at airports? Surely the blowtorch effect of jet engines hitting the runway will make the ground absorb significant amounts of heat, enough to skew temperature readings to the warm side. Tens of thousands of litres of jet fuel being burned every day must make the temperature reading false, or am I missing something?
The last 60 years of data (1950-2010) give a warming trend of 0.18 C per decade: 80% more warming than the 0.1 C per decade derived from the first graph. That seems significant. I wonder why it wasn’t highlighted in this article? Surely 60 years of data qualifies as ‘climate’, does it not? Indeed, it’s clear that *all* of the warming over the last 100 years occurred since 1950.
Great analysis. I hope this proves to be another nail in the coffin of ‘panic’!
I’ve spent a little time trying to find a single remote (i.e. not urbanised in recent years) weather station that shows the same warming trends in raw temperature data as in the global anomaly data, without success. If the global ‘anomaly’ trend always exceeds the trend from raw station data, surely any pretence that the global anomaly is a true reflection has to be abandoned, because the pitfalls in going from point measurements to area averages are myriad.
I was fascinated by your thermal image a few days ago, which illustrated the difference in heat signature between asphalt and concrete. No one mentioned the fact that the trees in the pic were much cooler than anything else (which makes sense: they spend the day converting incident solar radiation to sugar, if I recall my school biology; or maybe it’s the lack of CO2 around the leaves that makes them cooler :). If there is any rise in recent temperatures, you’ve got to account for land-use changes. I’d want to discount any stations with any potential for advection from urban areas if you really want to spot trends of 0.1 C per decade.
I would think many cattle stations in Australia have long-term temperature data, and would probably be free of UHI influences. Anyone got any contacts?
The poor science being practised globally by (so-called) climate scientists is spectacular.
I think quite a few of these folks need to go back to college and repeat the tutorials in basic experimental methods.
First: propose a theory (Man-made CO2 is heating up the planet).
Second: design the experiment (measure CO2 levels, both man-made and natural, and temperature at say 1000 randomised points on the globe (also consider multi-altitude global measurements) in a repeatable, calibrated manner over a period of say 1000 years).
Third: collate the data and report, peer review and publish (peer meaning someone of equal standing, not your best mate; review meaning critical examination, not exec summary only).
(Note: just because it is hard to do these things well doesn’t mean you shouldn’t even try to do them!)
No.
First & last: publish, finalising your theory using whatever existing data you can manipulate to fit it. If no data exists, invent it in a “model”.
The networks of weather stations were never designed to gather data for the high-fidelity temperature measurements needed for the above. It’s about time someone stands up and shakes these fools by the shoulders.
The word fraud was used on here the other day – but I don’t actually believe these folks believe they are doing anything wrong. They are just bad scientists.
The Tsunami of Truth – coming to a climate research unit near you!
Ken, what does the full range of daily max and min temperatures look like? In Canada the increase in the average of the yearly mean since 1900 is being caused by cooler maximum summer temps and by minimum winter temps that are not as cold. It would be interesting to see a graph of the entire range of a year’s temperatures, to see if there are any changes in the min and max for each year.
anthony holmes says:
July 28, 2010 at 4:45 am
“As an absolute layman, I was wondering why on earth temperature measurements are so often taken at airports…”
There are almost always met stations at airports because pilots need to know the temperature, pressure, wind direction, wind strength, cloud height and cloud cover over the runway. They are appropriately sited (or cited!) for aviation purposes; but very badly sited (or cited!) for climate purposes.
“anthony holmes says:
July 28, 2010 at 4:45 am
As an absolute layman, I was wondering why on earth temperature measurements are so often taken at airports. Doesn’t the mass of fiercely hot air blown out of the aircraft engines mix with the normal air and warm it?
Over a day thousands of gallons of fuel would be burnt launching aircraft into the sky, so why do people think that thermometers at airports wouldn’t record a warmer-than-natural temperature?”
It is an important factor in safe air travel. Unfortunately, that “data”, along with all the other bullcrap and “adjusted” data etc., has been fed into the AGW propaganda machine and leads to “AGW”.
Does the Department of Defense know about this development? Anyone who can torture data like these guys can and make it talk oughta be involved in anti-terrorist activities, “Talk hamper-head or we’ll ‘adjust’ you! And if you don’t give us what we want, we’ll ‘homogenize’ you! And don’t even get us started on what we’ll do if you are missing! Now spill!”
“Okay okay I’ll talk!…sniffle snuffle…in -sob- 5th grade I…I…”
So . . . you’ve got a Stevenson screen, sitting on a grassy knoll outside some unknown little town, a long long time ago. You diligently record the temperature and, lo and behold, over the last 50 years it’s gotten warmer! (Never mind the fact that the little town this used to sit outside of has now enveloped the screen . . .)
So . . . you make an adjustment, say about half of the total increase over the last 50 years, and move the screen back outside of town. (Funny, I don’t recall where I saw or heard this, but did you know that the temperature in a major metropolitan area can be six degrees warmer during the day than just outside of town, and that the effect actually increases in the overnight hours? In light of recent ‘peer reviewed climate science’ I had to laugh.) Oh, and since you’ve moved the screen you now have to make another adjustment, otherwise your temp record will show a sudden decline! The solution: adjust all of the old temps down (not kidding here)! Now the discontinuity is gone and the record looks fine until, 50 years later, the town has expanded yet again to encircle the (now) MMTS shelter, which sits 15 feet from a building due to a short cable, with no adjustment made for this. Time to move it again, make another adjustment to all of the old data, and so forth.
Repeat ad nauseam and you’ve truly got Mann-made Global Warming! (Though this one effect cannot be said to be his fault.)
The intelligence of mankind is coming under severe scrutiny just now.
jorgekafkazar says:
July 27, 2010 at 8:35 pm
Visual inspection shows that the hottest zones have the fewest sensor locations. I think the next hottest have just slightly more sensors, and so on. I tried to do an overlay, but the maps are not congruent. It’s worse than we thought.
This is also a feature of the global temp rises: the largest rises always seem to be in unpopulated areas, e.g. the North Pole, Siberia etc. Then when you look at areas with long, well-documented temp records and large populations you see no or only slight warming, e.g. Central England, Northern Ireland.
Just doesn’t smell right.
From the looks of the map, there are many areas in the central part of the continent that lack black asphalt airport runways on which to locate thermometers.
anthony holmes says:
July 28, 2010 at 4:50 am
Why measure temperatures at airports? Surely the blowtorch effect of jet engines hitting the runway will make the ground absorb significant amounts of heat, enough to skew temperature readings to the warm side. Tens of thousands of litres of jet fuel being burned every day must make the temperature reading false, or am I missing something?
The large expanses of tarmac are huge heat sinks that distort the temp records upward on sunny days; this will have a greater effect than heat from engines, as air loses temperature much more rapidly than tarmac does.
“The more a person relies on statistical analyses the farther their feet are from the ground.”
Is it any wonder, then, that politicians, bankers, and scientists have been seen flying through the sky in greater numbers? The fault is not entirely their own, however. Their constituents, shareholders, and students have consistently demanded less for more and a bigger bang for the buckeroo too. The first law of human physics: “Ya get what ya vote for, Mate!” (aka “Garbage In, Garbage Out”)
Dixon said (July 28, 2010 at 5:10 am):
The problem with that argument is that satellite-based global temperature series are almost exactly the same as the terrestrial series.
Geoff Sherrington asks
July 28, 2010 at 3:08 am
…Can everyplace really be warming much faster than everyplace else?
Why not? Soviet tractor production continues to soar.
In comparing the unadjusted maximum temperatures for Melbourne from March 7-13 of 1940 with the maximum temperatures for the same period, as reported in The Argus on Friday, March 15, 1940, I found no discrepancies.
It was still the hottest unadjusted March on record.
JohnH: July 28, 2010 at 6:57 am
The large expanses of tarmac are huge heat sinks that distort the temp records upward on sunny days; this will have a greater effect than heat from engines, as air loses temperature much more rapidly than tarmac does.
Concrete has the same effect. Airport temperature data is only valid for the air temperature *on* the airport. Temperature here on our ramp today at 1300 (from an aircraft’s OAT gauge) was 48°C, and the official reading from the tower (500 meters east) was also 48°C.
The temperature in the desert scrub 500 meters *west* at 1300 (measured with the same aircraft OAT gauge) was 43°C.
I think 5-yr trend lines are great! These types of trends are far more valuable for agricultural purposes than a beginning-to-end trend line (data poor, information poor). Icarus, why did the author of your linked graphs not do 5-yr trend lines starting in 1978? Even better, if we actually had an unadjusted, unsmoothed, unhomogenized data set back to 1900 or earlier, 5-yr trend lines would be extremely educational and have far greater potential of being both data- and information-rich.
One more point about the graphs Icarus linked to: too bad the author didn’t do 5-yr trend lines for all the various temperature data sets and then compare these shorter trend lines to each other.
Bill Illis,
I think this is one of your charts. It shows there is nothing to worry about WRT global warming.
JohnH said (July 28, 2010 at 6:53 am):
The CET shows warming of ~0.27C per decade over the last 50 years. Not ‘slight’.