GISS adjustments in Australia

Ken Stewart shows how GISS adjusts temperature records in two adjacent sites in Australia

This issue was also recently covered on the Climategate blog here


Despite its assurances, GISS has adjusted the temperature records of two sites at Mackay to reverse a cooling trend in one and increase a warming trend in another.   This study presents evidence that this is not supportable and is in fact an instance of manipulation of data.

I decided to have a look at the temperature records of the weather stations closest to where I live, near Mackay in North Queensland.  The Bureau of Meteorology lists 3 current stations: Mackay MO, Mackay Aero, and Te Kowai Exp Station, plus the closed station Mackay Post Office.  GISS has a list of nearby stations.  One is “Mackay Sugar Mill Station”.  I had never heard of it.  Te Kowai Exp Station, only a few kilometres from Mackay, is in fact at the same co-ordinates as Mackay Sugar Mill.  I checked on AIS for the GHCN  site, and there is Mackay Sugar Mill on the map.  The co-ordinates given by GHCN put it  in the middle of a cane paddock 600m to the south of Te Kowai Sugar Experiment Station, so that’s definitely it!  (If not, it’s identical in every other way!)  And that is the closest weather station to my home, so I became even more interested.

Te Kowai is an experimental farm for developing new varieties of sugar cane, run by scientists and technicians since 1889.  It has a temperature record of over 100 years with only a couple of gaps.  So in fact it’s an ideal rural station for referencing a nearby urban station, as it should have a similar climate.


I plotted data from BOM for maxima and minima and obtained the means for Te Kowai, all Mackay city stations, all GHCN stations in our 5 x 5 grid, and several other towns and cities with long records (Te Kowai’s starts at 1908). This is because “In our analysis, we can only use stations with reasonably long, consistently measured time records.”

GISS combines GHCN data from all urban stations at the same location, and then homogenises this with data from neighbouring rural stations.  So I then plotted the same-location data and the post-homogenisation data.
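The combine-then-homogenise step can be illustrated with a toy version of the offset-and-average idea: shift each overlapping record so its mean over the common years matches a reference record, then average. This is only a sketch of that idea, not GISS’s actual code; the station names and temperatures below are invented for illustration.

```python
def combine_records(reference, other):
    """Offset `other` so its mean over the years it shares with
    `reference` matches the reference mean, then average the two.
    Records are dicts mapping year -> annual mean temperature (C).
    Illustrative sketch only; not the GISS implementation."""
    overlap = set(reference) & set(other)
    if not overlap:
        return dict(reference)  # no common years: nothing to combine against
    # The "bias" is the mean difference over the common years
    bias = sum(reference[y] - other[y] for y in overlap) / len(overlap)
    shifted = {y: t + bias for y, t in other.items()}
    combined = {}
    for y in set(reference) | set(shifted):
        vals = [rec[y] for rec in (reference, shifted) if y in rec]
        combined[y] = sum(vals) / len(vals)
    return combined

# Hypothetical overlapping records from two co-located stations
mackay_po  = {1950: 22.0, 1951: 22.2, 1952: 22.1}
mackay_met = {1951: 21.7, 1952: 21.6, 1953: 21.8}
merged = combine_records(mackay_po, mackay_met)
print(merged)
```

Note that the choice of reference record sets the level to which the other record is shifted, so the combined series inherits the reference station’s baseline.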

A problem that appeared immediately is that the GISS annual mean runs from December to November, while BOM’s raw data is for calendar years.  Most of the time it matches pretty well, but there are several examples of poor quality data.  Another problem is that BOM does not compute a mean for any year with even one month of data missing, while GISS tolerates several missing months.
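The December-to-November convention, and the difference between BOM’s all-or-nothing rule and a tolerance for missing months, can be made concrete with a small sketch. The function name and the three-month tolerance here are illustrative assumptions, not GISS’s documented rule.

```python
def dec_nov_mean(monthly, year, max_missing=3):
    """Annual mean for a Dec(year-1)..Nov(year) 'meteorological year'.
    `monthly` maps (year, month) -> mean temperature (C), with None
    for missing months. `max_missing` is an illustrative tolerance:
    BOM-style processing would require all 12 months (max_missing=0).
    Returns None if too many months are missing."""
    months = [(year - 1, 12)] + [(year, m) for m in range(1, 12)]
    values = [monthly.get(ym) for ym in months]
    present = [v for v in values if v is not None]
    if 12 - len(present) > max_missing:
        return None
    return sum(present) / len(present)

# Example: a hypothetical station with June 1950 missing
example = {(1949, 12): 24.1, **{(1950, m): 22.0 for m in range(1, 12)}}
example[(1950, 6)] = None
print(dec_nov_mean(example, 1950))                  # tolerant mean of 11 months
print(dec_nov_mean(example, 1950, max_missing=0))   # None, BOM-style
```

With BOM-style strictness a single missing month drops the whole year, while a tolerant rule keeps it; that difference alone can make two annual series built from the same raw data diverge.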

Here are graphs of the results.

Read his entire post here

Phillip Bratby
February 11, 2010 12:50 am

Nothing unexpected here. Has anyone asked GISS how they justify this data manipulation?

February 11, 2010 12:57 am

So what’s to believe anymore? It becomes more and more apparent that the entire temperature record worldwide will need a major rework. I would not trust those agencies to do it either.

Sydney Sceptic
February 11, 2010 1:06 am

I think we need to extend Anthony’s surfacestations project to Australia. Volunteers? I can help with Sydney-based ones.

David Ball
February 11, 2010 1:23 am

What is shocking to me is the unmitigated gall that must be in place to assume that no one would actually look at their “adjustments”. I learned as a child that when you tell a lie you have to tell ten lies to cover up that one lie, and so on. Did it not occur to GISS that people might eventually look at these “adjustments” and start piecing together what was done? I would not want my children and grandchildren to be faced with what “daddy” did to the data, as time would eventually reveal the fabrication. What a legacy to leave behind.

Patrick Davis
February 11, 2010 1:41 am

Many Aussies I blog with about temperature data manipulation don’t appear to be too concerned about the fact the data is fabricated as they believe in MMCC, and “something” has to be done, not for us but for our children.
Go figure!

February 11, 2010 1:43 am

Sadly, if an expected result requires that data be “massaged” so that the “correct” conclusion can be reached, it is hard not to “search” for the means to represent the “truth”.
As we are seeing, a lot of “truth” was being found with ever-greater certainty.
Fortunately, reality intruded just in time.

Neil McEvoy
February 11, 2010 1:45 am

This is a definite “smoking gun”.
A quick précis for those who haven’t clicked through and read to the end:
The nearest rural station is not used in the adjustment of the urban readings for Mackay (pop 35,000). If it were, the trend for Mackay would be substantially reduced. The reason for not using it is that it is also classified as urban – despite being in the middle of sugar cane fields.
The smoking gun is that the record for that station was ended in 1992 – when records for many other rural stations ceased. So, it seems it was once classified as rural and mysteriously reclassified as urban. Convenient.

John Whitman
February 11, 2010 1:48 am

Great post.
GISS now has to realize that the blogosphere is eventually going to look at virtually all of their data treatments for temperature sensing locations… the bias in their treatments is being revealed. It is hard not to conclude, as we get more and more of these posts, that they manipulated the data to show it is warmer. At the same time they advocate in support of an AGW agenda. Tic, toc, tic, toc… blog post by blog post, showing the biased GISS data treatment.

February 11, 2010 1:59 am

Do they also extrapolate the Mackay temps to the south island of New Zealand? OK, just being flippant now 😉

February 11, 2010 1:59 am

Great piece of work, Ken, which shows GISS messing with the data again. It seems every time someone makes the effort to see what’s happening in their area, upwards adjustments are found.
CAGW is being hit hard by the high-impact Climategate and IPCC revelations of fraud – now it’s suffering the death of a thousand cuts from concerned people across the world.

Peter of Sydney
February 11, 2010 2:03 am

How long do we have to wait before those responsible are charged with fraud and if found guilty are put behind bars? Someone must be brought to account. Otherwise this sort of data manipulation will continue unabated.

February 11, 2010 2:09 am

I was interested to note this post as I too have been puzzled by GISS population estimates for the sites used in its data set. I know Warwick Hughes has done some excellent work in this area.
I have only examined the population estimates of a few GISS sites but some were only half the recent census population estimates for the same areas.
Suspicions aroused I tried to see if the same applied to CRU data. I corresponded with Phil Jones who was both polite and prompt in his response. He referred me to the Brohan et al (2006) paper.
Populations were not included in CRU data for reasons explained in that 2001 paper. Rather, Jones (and his CRU colleagues) appears to accept the Folland et al (2001) assumption of a 0.0055C/decade global temperature increase due to urbanisation. In our email exchange Jones confirmed this corresponded to an increase in global 20th century temperatures of 0.055C: I would suggest of statistically no importance.
My interpretation of this is simple: CRU data dismisses urbanisation as having any influence on 20th century global temperatures. This premise stands or falls on the Folland et al (2001) paper. Am I correct in saying that if new research concluded that urbanisation had a positive 0.055C/decade effect on global temperatures, equating to 0.55C for the 20th century, then the idea of 20th century global warming would virtually collapse?
As it stands, the theory of 20th century global warming rests on the validity of the Folland et al (2001) paper. Is this a correct deduction, or am I missing something?
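The rate-to-century arithmetic behind these figures is easy to check. This is a minimal sketch: the function name is ours, and a constant linear rate sustained over ten decades is the assumption being tested, not an established result.

```python
def century_effect(rate_per_decade_c):
    """Total 20th-century temperature effect, assuming a constant
    linear rate (in C per decade) sustained over ten decades."""
    return rate_per_decade_c * 10

# Folland et al (2001) urbanisation assumption: 0.0055 C/decade
print(century_effect(0.0055))   # ~0.055 C over the century
# The commenter's what-if figure: 0.055 C/decade
print(century_effect(0.055))    # ~0.55 C over the century
```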

February 11, 2010 2:12 am

How GISS calculates a station’s temperature increase:
Think of a number;
Add 0.8;
Take away the number you first thought of;
Surprise, surprise, you end up with 0.8.

February 11, 2010 2:14 am

Apologies for a typo in the post above. It should read:
“Populations were not included in CRU data for reasons explained in that 2006 paper.”
Apologies for the confusion.

John Whitman
February 11, 2010 2:14 am

I am in Taiwan and will see if I can get some of my Taiwan national associates to do a little investigative work on GISS treatments for temperature sensing locations here. : ) It gives me something to do during the upcoming Chinese Lunar New Year holiday week other than shooting off fireworks and drinking Kinmen guaolian (phonetics are mine).

February 11, 2010 2:16 am

I read the guest blog post at this site, where professor Ravetz made a razor-sharp analysis of why it could go so wrong. Link here. He described the extended peer community, the driving force from the blogosphere, as the only force that could correct science that had become as corrupted as current climate science. (It is very much worth reading if you haven’t.)
It is just amazing to see how this extended peer community works. The example in this post is another of who knows how many errors that are exposed which would otherwise have gone unnoticed. Good work!

February 11, 2010 2:22 am

The link in my previous post should be this one. :-/

February 11, 2010 2:23 am

There is a choice on the GISS graphing site that lets you change the year to be a calendar year. It is at the very top of the ‘drop down’ menu (that for me in fact opens upward… a ‘drop up’? 😉 making the option very easy to not notice)
FWIW, GISS never justifies their changes. GIStemp has a process built into it, so no human chooses the changes. All you will get are pointers to published papers that claim the method is fine… (“The Reference Station Method”). But it does an odd thing where all the stations averaged together to make the ‘reference’ have their “bias removed” by having their data shifted to keep their mean matching the very first station (which is not so moved). It just looks to me like the first station is allowed to SET the bias, then the others are conformed to it. Unfortunately, I’ve not had time to work over that bit of code enough to determine whether it is or isn’t a bug…
Bulldust (01:59:40) : Do they also extrapolate the McKay temps to the south island of New Zealand? OK just being flipant now 😉
Well, a single thermometer can look up to 1000 km away to get missing values ‘filled in’. It can then in the UHI section use data from up to 1000 km away to adjust the UHI of a station. AND that data used may have been filled in in the prior step. Now I doubt that it happens very often, but it is POSSIBLE for that 1000 km UHI “reach” to be picking up a 1000 km “fill in” reach.
But wait, there’s more…
In the Grid / Box step that ‘homogenized and UHI adjusted’ thermometer can fill in a Grid Box up to 1200 km away… So… It’s theoretically possible (though IMHO unlikely) for a 1200 km grid reach from a 1000 km UHI reach from a 1000 km fill in reach to put some impact from a thermometer 3200 km away into a grid box. (There are 80 “grids” of 100 “boxes” each on the planet).
Now it will be all nicely averaged in with loads of other thermometer data, so it would not be standing out there all naked on its own (most of the time…) but with fewer than 1500 thermometers in GHCN for the world and with 8000 grid boxes there are going to be some boxes with only one thermometer doing the work…
And that is why Madagascar has temperature anomalies reported even though their last thermometer record was from 2005 and the few years before that had holes in the data. It’s also why both Panama and Bolivia have nice red patches of anomaly over them even though they have not reported temperatures since 1980…
After all, you don’t really need temperatures. Just anomalies. And both can be filled in if you really need some… after all, there’s got to be a thermometer or two within a couple of thousand kilometers somewhere we can use…
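The “reach” mechanism described in this comment can be made concrete with a toy sketch: estimate a missing value from neighbouring stations within a cutoff radius, with weights falling linearly to zero at the cutoff. The 1000 km cutoff and linear weighting follow the comment’s description of GIStemp; the function names, coordinates and values are invented for illustration.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))

def fill_in(target, stations, cutoff_km=1000.0):
    """Estimate a missing value at `target` (lat, lon) from
    `stations`, a list of (lat, lon, value) tuples, with weights
    falling linearly from 1 at zero distance to 0 at the cutoff.
    Returns None if no station lies within the cutoff."""
    num = den = 0.0
    for lat, lon, value in stations:
        d = haversine_km(target[0], target[1], lat, lon)
        if d < cutoff_km:
            w = 1.0 - d / cutoff_km
            num += w * value
            den += w
    return num / den if den else None

# A co-located station dominates; one ~2200 km away contributes nothing
print(fill_in((0.0, 0.0), [(0.0, 0.0, 20.0), (0.0, 20.0, 30.0)]))
```

Chaining such steps (fill-in, then UHI adjustment, then gridding, each with its own radius) is what produces the multi-thousand-kilometre worst-case “reach” the comment describes.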

Ken Coffman
February 11, 2010 2:35 am

This is interesting…
I don’t know why they bother with heaters. Why don’t they just let the CO2 forcing and water vapor feedback increase the heat by up to 9 degrees? Oops, was that catty?

February 11, 2010 3:12 am

Great post. There are very many examples of GISS doing a ‘wrong way’ correction for UHI, adjusting rural stations, etc. Having a pair of stations like this is really valuable, but it takes local knowledge. We need more examples like this.

February 11, 2010 3:21 am

Do they also extrapolate the Mackay temps to the south island of New Zealand? OK, just being flippant now 😉
There is a New Zealand connection here somewhere.
The name of the station (Te Kowai) is very obviously from New Zealand Maori, Te Kowhai, which means “yellow.” Kowhai is a New Zealand tree with yellow flowers.
Someone up Mackay way must have been in New Zealand when that locale was named.

A C Osborn
February 11, 2010 3:37 am

Ken Coffman (02:35:02) :
It’s called a Greenhouse! LOL

Tony Hansen
February 11, 2010 3:57 am

David Ball (01:23:01) :
What is shocking to me is the unmitigated gall that must be in place to assume that no one would actually look at their “adjustments”.
Well I was kind of thinking that if I had done it and I had been getting away with something for twenty years, and if it seemed like I was on a good thing … why would I change?
Mr. Ball, perhaps you just can’t help letting your integrity get in the way… not your fault at all… I’m guessing you were either born or brought up that way.
And more credit to you.

Adam Ruth
February 11, 2010 4:02 am

Somewhat O/T. This story reminded me of a funny article I read a couple of years ago about that very same sugar mill.

Sam the Skeptic
February 11, 2010 4:08 am

Did it not occur to GISS that people might eventually look at these “adjustments” and start piecing together what was done?
No, David, it didn’t. Because when they started doing these things nobody was faintly interested in what they were doing. A very arcane subject was “climatology” in those days and if you had asked the man in the street he would have said it was something to do with weather forecasting (and he would have been right, up to a point).
Whether (and if so when) they started “improving” the figures we don’t know but the end result is that they have been able to present climate change to us as a fully-fledged philosophy and we have had to play catch-up because we all know that scientists are trustworthy, don’t we? And anyway who would have the time or the inclination to go hunting out the sites and even if they did and found something amiss they didn’t have a way of bringing it to public notice, did they?
It is absolutely true that the internet has made sure that science (along with a lot of other things) will never be the same again.
A little O/T – Richard North has a worthwhile piece today (“A Wolf in Sheep’s Clothing”).
I especially like his description of Hulme and Parry as “expert in climate change impacts”. Since I’m not aware that there have as yet been any impacts for them to be expert in, it makes one wonder what they are doing and how we have let them get away with it all these years!

February 11, 2010 4:11 am

Could someone explain to me what the Earth-shattering ground-breaking purpose of NASA-GISS is that blends space seamlessly with the terrasphere?
Can’t we just petition Congress to dump GISS Climate and put the money into more worthy projects, like space probes and landers?
We already have NOAA, and they contend that they understand all there is to know about everything there is to know about climate.
Or maybe we could use the GISS Climate funding to re-establish the rural network.
And then there is NCDC.

February 11, 2010 4:35 am

OT – Many IPCC scientists say it’s impressive that so far only four errors have been found in 986 pages of the second report, with the overwhelming majority of the findings correct and well-supported.
Only 4 errors and they want to totally revamp the system. That’s another big lie. These guys can’t stop lying.

February 11, 2010 4:36 am

For Harry:
Well, the Portland article made great claims “..In the 5 years since he first published his results, not one peer review has come back disproving his theory..
But the journalist was taking her cue from the physicist. She obviously didn’t check – and how would she be able to?
Probably 10 or 100 papers have been published in that period which would disprove his theory. Maybe no one actually mentioned his theory – why would they, if no one takes it seriously?
At the time, I saw that article and in the comments I cited one peer-reviewed paper, which I happened to have open on my PC at the time. After all, the article said “not one..”
Since then I’ve seen many others. The paper I mentioned was Dessler’s paper which showed that relative humidity stayed constant (while temperature changed) over a period of 10 years.
One day his paper might be covered on Science of Doom, but why not understand mainstream science properly first?
CO2’s impact on the atmosphere is not so hard to grasp. There is a huge gap between the simple aspects of how CO2 absorbs and re-emits energy and the doom predictions that we read about.
Check out CO2 – An Insignificant Trace Gas? Part One

February 11, 2010 4:48 am

btw I was just scrolling down the page and noticed white precip on one of the weather maps. Is that another snow storm in Texas?? The jet stream has moved really, really far south this year.

February 11, 2010 5:05 am

I think in essence the paper is restating the argument that cloud coverage is a significant determinant of climate.
I don’t think the consensus voices want to hear this argument.

Henry chance
February 11, 2010 5:12 am

Home of James Hansen? Manipulating data? He wouldn’t do it unless it needed manipulation.
What a group of thugs.

February 11, 2010 5:20 am

TanGeng (04:48:40) :
btw I was just scrolling down the page and notice white precip on one of the weather maps. Is that another snow storm in Texas??

We’ve gotten a few inches overnight; not a blizzard since there is no wind, just a covering that is very pretty as it sits on trees, grass, etc (north of Dallas) …

dave ward
February 11, 2010 5:31 am

Somewhat O/T, but the UEA have just announced a “New scientific assessment of climatic research publications” in a further attempt “to win back the hearts and minds of the public on the issue of climate change”
The local paper’s story:
And the UEA’s own announcement:

Peter Plail
February 11, 2010 5:33 am

Or was it 3,000 pages, or even 6,000 pages? Both figures I have heard quoted in the context of the errors-per-page count. Can they not even get the number of pages in the report right?

February 11, 2010 5:38 am

It’s still not clear to me why adjustments are made. Oh, I’ve read the specifics, unfortunately with less than a full understanding, but they seem to make sense at the time.
Unfortunately, it is like Car Salesman Double-talk: sounds great at the time, but when you think about it later it just does not make sense.
I believe the ‘adjustments’ and ‘filler temperatures’ are useful to get a broad picture of localized weather for use by, say, airports, farmers and the general populace who need to know if it’s going to be below zero or a scorcher. This gives those users an indication of how the localized weather will affect their specific application.
From my experience, based on the two temperature readings in my local area (an airport reading and a coastal reading) my actual temperature where I live can be independently higher or lower than either, depending on the temperature, time of year and wind patterns (Cold front? Warm front? Occluded front?).
In addition, temperatures can vary by as much as 60 degrees in a single 24-hour cycle (more in some areas, less in others, also depending on time of year).
I won’t get back into my ‘pet peeve’ of calibration certificates, but suffice it to say, these data points represent temperatures where sensors don’t necessarily exist, which aren’t calibrated, which have a dubious measurement technique, with suspect siting, yet supposedly produce long-term temperatures accurate to a tenth of a degree.
It’s not surprising that any adjustment is called into question when the adjustment produces a trend which mates quite nicely with a political agenda (which AGW is).

February 11, 2010 5:41 am

Keep it up! Keep collecting a mountain of evidence and proof. It is easy to see why these snake oil salesmen don’t want you to look at their data; nobody wants to volunteer something that will discredit them. But thanks to the internet, the evidence is mounting so that only con artist “scientists”, politicians, and the eco-communists will believe in climate change. When that happens, they will make another crisis. I’ve already seen reports of the water crisis. They will keep changing the scam until the goal is achieved.

February 11, 2010 5:47 am

Followed this link, and then another at that site by Warwick Hughes, to an email exchange with Hansen in 2001. Hansen basically says this kind of exercise is cherry picking, and you need a larger area before the overall impact of these adjustments becomes rational.
To me, this implies there must be other sites where “cooling” is introduced by these adjustments. . . but I can’t recall those being held up by the skeptic community. So, has any effort been made to test Hansen’s claims, if not globally, at least in a large enough area to satisfy Hansen’s “large area” parameter, and was there significant cooling introduced in other sites vs the ones where warming was introduced?

February 11, 2010 5:54 am

Re: “another snow storm in Texas??”
Meteorological/weather conditions in North Tejas have even warranted their own mesoscale discussion this morning:

VALID 111251Z – 111645Z

February 11, 2010 5:59 am

As a note to my previous post, I’m an engineer: If I want to know how fast my car is going on the interstate, I measure it. I don’t use the speeds of all the cars around me to ‘guess’ if I’m going over the speed limit.
While it’d be true I’d be going about 70mph (or whatever), it won’t hold water in court for that 72mph speeding ticket.
When even a few hun’red bucks are at stake, the rules and regulations about even a simple measurement are very well defined. The temperature measurements will amount to billions/trillions of dollars of money being spent.
In summary, my question to any scientist (er, statistician) using these temperature readings to ‘prove a point’ is: how do you know these temperature readings are accurate?

February 11, 2010 6:00 am

From BBC:
A panel of independent experts has officially begun its inquiry into the “Climategate” affair.
The experts, headed by Sir Muir Russell, will investigate how e-mails from the UK’s Climatic Research Unit (CRU) appeared on the web.
They will also consider if the e-mail exchanges between researchers show an attempt to manipulate or suppress data “at odds” with scientific practice.
The panel hopes to present “preliminary conclusions by spring 2010”.
Speaking at the launch of the inquiry, Sir Muir, who is chairman of the Judicial Appointments Board for Scotland, said: “We are free to pursue and follow any line of inquiry that we wish.”
Climate sceptics suggest that the affair shows that either human activities are not affecting the planet’s climate system, or that the impacts are not as bad as many climate scientists suggest.
The panel’s investigation will:
• “Examine the hacked e-mail exchanges, other relevant e-mail exchanges and any other information held at CRU to determine whether there is any evidence of the manipulation or suppression of data which is at odds with acceptable scientific practice.”
• “Review CRU’s policies and practices for acquiring, assembling, subjecting to peer review and disseminating data and research findings.”
• “Review CRU’s compliance or otherwise with the university’s policies and practices regarding requests under the Freedom of Information Act.”
• “Review and make recommendations as to the appropriate management, governance and security structures for CRU and the security, integrity and release of the data it holds.”
However, the panel will not review the past scientific work of the CRU, as this will be re-appraised by a UEA-commissioned study, which will involve the Royal Society.
The other members of the inquiry, which is being funded by UEA, are Geoffrey Boulton, general secretary of the Royal Society of Edinburgh; Dr Philip Campbell, editor-in-chief for Nature journal; Professor Peter Clarke of the University of Edinburgh; David Eyton, head of research and technology at BP; and Professor Jim Norton, vice president for the Chartered Institute for IT.

February 11, 2010 6:12 am

I need help replying to this author (Unity)
He states:
“This is Hansen’s recent review of the 2009 global temperature record, which you’ll note not only discusses the differences between his methods and those used by HADCRUT but also includes references to the FIVE published papers in which Hansen documents his methods.
From memory, there are AT LEAST FOUR separate review papers that set out the methodology used by NOAA to homogenise the station data used in the US surface temperature record, all of which can downloaded from NOAA’s website.
Try reading the literature instead of Anthony Watts’ website, you might actually learn something of value for a change.”
What is the best link to post there in response?

Dr. Gerhard Loebert
February 11, 2010 6:44 am

Climate-Gate, Climate Change and New Physics
Climate Change is a Fact, Its Supposed Anthropogenic Cause is Fiction
Dr. Gerhard Löbert*, Munich
o There is no direct connection between CO2 emission and climate warming. This is shown by the fact that these two physical quantities have displayed an entirely different time behaviour in the past 150 years. Whereas the mean global temperature varied in a quasi-periodic manner (mean period = 70 years), with temperature maxima in 1870, 1940 and 2006 (see Fig. 2.1), the CO2 concentration – after having essentially remained constant for centuries – increased exponentially with the onset of massive hydrocarbon burning in the 1950s.
In contrast, there is a close correlation between mean global temperature and the geomagnetic aa-index which reflects the effect of energetic solar eruptions on the Earth’s magnetic field. The solar activity leads the terrestrial temperature by some 6 years.
This proves that climate change is not man-made but is driven by solar activity.
o The extremely close correlation between the changes in the mean global temperature and the small changes in the rotational velocity of the Earth in the past 150 years (see Fig. 2.2) – two, within present teaching, unrelated physical quantities – clearly shows that
a) these two physical quantities are driven by a common extraterrestrial agent, and
b) a new physical theory is required to identify this common agent.
Note that rotational velocity leads terrestrial temperature by some 6 years.
o The author has developed a new theory of gravitation that not only covers the well-known Einstein effects but also shows up a number of post-Einstein effects that are substantiated by geophysical and astrophysical observations. This new physical theory is, in contrast to Einstein’s theory, based on quantum mechanics. It is called Seaon Theory and is explained in the post of September 19, 2008.
o The following paragraphs give a physical explanation for the strong correlation between fluctuations of the rotational velocity of the Earth and solar eruptions on the one hand and the corresponding retarded changes of the mean surface temperature of the Earth on the other hand.
Seaon Theory, a new theory of gravitation based on quantum mechanics that was developed eight decades after Einstein’s corresponding Theory of General Relativity, not only covers the well-known Einstein-effects but also shows up half a dozen post-Einstein effects that occur in nature. From a humanitarian standpoint, the most important super-Einsteinian physical phenomenon is the generation of small-amplitude longitudinal gravitational waves by the motion of the supermassive bodies located at the center of our galaxy, their transmission throughout the Galaxy, and the action of these waves on the Sun, the Earth and the other celestial bodies through which they pass. These vacuum density waves, which carry with them small changes in the electromagnetic properties of the vacuum, occur in an extremely large period range from minutes to millennia.
On the Sun, these galactic waves modulate the intensity of the thermonuclear energy conversion process within the core which is highly sensitive to small changes in the permittivity of the vacuum, and this has its effect on all physical quantities of the Sun (this is called solar activity). This in turn has its influences on the Earth and the other planets. In particular, the solar wind and the solar magnetic field strength are modulated which results in large changes in the intensity of the cosmic radiation reaching the Earth. Cosmic rays produce condensation nuclei so that the cloud cover of the atmosphere and the Earth albedo also change. A mere 1% reduction in cloud cover explains most of the temperature increase of the past 150 years.
On the Earth, the steady stream of vacuum density waves produces parts-per-billion changes in a large number of geophysical quantities. The most important quantities are the radius, circumference, rotational velocity, gravitational acceleration, VLBI baseline lengths, and axis orientation angles of the Earth, as well as the orbital elements of all low-earth-orbit satellites. All of these fluctuations have been measured. The modulations of the Earth’s circumference (in the decimeter range) trigger large earthquakes. By closely monitoring the foregoing geophysical quantities, the approach of a devastating vacuum density wave can be detected in time for a global earthquake warning to be issued. Such a global earthquake warning system has to be supplemented with local earthquake alarm systems and suitable human protection devices.
*Physicist. Recipient of the Needle of Honor of German Aeronautics.
Conveyor of a super-Einsteinian Theory of Gravitation that explains the continual climate changes and the associated, closely correlated fluctuations of the rotational velocity of the Earth, as well as the triggering of devastating earthquakes as the result of the action of galactic vacuum density waves on the Sun and the Earth.

February 11, 2010 6:51 am

The adjustment algorithm is really simple. They call Rudd and ask him what he wants it to be, and they make it so.

February 11, 2010 7:00 am

As E.M. Smith stated above:
“Well, a single thermometer can look up to 1000 km away to get missing values ‘filled in’.”
Ladies and gentlemen, that just blows my mind. I can see no justification for this “filling in” data charade. Since when are thermometers allowed to “look” anywhere except right where they’re at? No, it isn’t the thermometer “looking” at all – it’s some dimwitted programmer who had a harebrained idea and made an arbitrary decision.
Why isn’t the “looking” distance 1200, 500, 2, or 17,000 km away? Was 1000 km selected because somebody likes round numbers? And why not look up (into the sky) or down (into the ground) instead of horizontal? It makes just as much sense!
Has some climatologist or programmer somewhere got such an obsession about numbers he has to go hunting for data? Are they so insecure in their personal or professional lives that they can’t bear the thought of a “missing” value?
In the scientific and engineering fields I’ve worked in for decades, there is no justification for “filling in” data. A missing data point is acceptable, even preferable. It simply means you have less data to work with, but it’s far better than inventing or borrowing data. “Filling in” is another term for falsifying.
(“Hey… you don’t have that check amount listed in your check register–let me put one in for ya! Oh, you don’t like $1,589? But that’s close to what this other check was written out for. I’m just “filling in” like the climatologists do. Why do you have a problem with that? You think you’re smarter than a climatologist??”)
No wonder the profession has such a devious reputation.
My solution? Record the temperature at the station and if the thermometer is broke, FIX IT!
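For readers who want to see concretely what the criticised “look up to 1000 km” infilling amounts to, here is a toy sketch. All station names, coordinates and temperatures below are hypothetical, and this is only a nearest-neighbour caricature; the actual GISS reference-station method weights and adjusts multiple neighbours rather than copying the closest value.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def fill_missing(target, neighbours, max_km=1000.0):
    """Replace None readings in `target` with the reading from the
    nearest neighbour station within `max_km` that has one."""
    lat, lon, temps = target
    filled = []
    for month, t in enumerate(temps):
        if t is not None:
            filled.append(t)
            continue
        best, best_km = None, max_km
        for nlat, nlon, ntemps in neighbours:
            if ntemps[month] is None:
                continue
            d = haversine_km(lat, lon, nlat, nlon)
            if d <= best_km:
                best, best_km = ntemps[month], d
        filled.append(best)  # stays None if no neighbour is in range
    return filled

# Hypothetical stations: (lat, lon, monthly mean temps; None = missing)
mackay = (-21.1, 149.2, [26.4, None, 25.1])
te_kowai = (-21.2, 149.1, [26.1, 25.8, 24.9])     # roughly 15 km away
far_station = (-31.0, 150.0, [20.0, 19.5, 18.0])  # roughly 1100 km away

print(fill_missing(mackay, [te_kowai, far_station]))  # [26.4, 25.8, 25.1]
```

With only the distant station available, the gap simply stays missing, which is the behaviour the comment argues for.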

Jean Parisot
February 11, 2010 7:02 am

I remain unconvinced that the “surface temperature record” can ever be more than a measure of local weather. Extrapolating point measurements that have significant incidental variation over a wide area cannot be done without introducing a significant error component. The mechanism for “gridding” these data points concerns me, and I can’t find a decent reference for the spatial statistics used to accomplish it.

February 11, 2010 7:03 am

A bid to win back the hearts and minds of the public on the issue of climate change has been launched today with news that the Norwich university at the centre of the “climategate” scandal has asked for an independent review of the research in question.
The University of East Anglia has announced that experts identified by the prestigious Royal Society will look again at key publications by the Climatic Research Unit (CRU), including the work of Prof Phil Jones.
Experts recommended by the Royal Society, ummm.

February 11, 2010 7:17 am

The chief executive of the U.S. approves of the actions of GISS. He currently wants to expand their efforts to measuring the rising seas. The fact that the data is manipulated to show global warming is a requirement of the job. The AGW agenda continues from the top down and has not changed even with all of the current discoveries of falsified data, because politics is in charge, not science. Has anyone noticed the huge deficits? Has anyone ever wondered where the money is going while around 10% of the population is still unemployed? Have these numbers been adjusted as well?

Ken Coffman
February 11, 2010 7:28 am

OT: Speaking of our friends at RC, here is a clip from a recent comment by Kevin McKinney…
A different example would be the “we don’t need no stinking greenhouse effect” meme which comes out in various blog discussions. This is the idea that “a non-conducting N2/O2 atmosphere” will by itself raise planetary temperatures by somehow “keeping heat in.” (It seems intuitively obvious to some, apparently, but those folks must be overlooking the fact that zero heat can leave the atmosphere–any atmosphere– by conduction in the first place. “Nothing from nothing leaves nothing,” and this line of er, thought, clearly gives us “bupkes.”)
I think it’s interesting…they apparently believe a CO2 molecule will radiate, but a nearby N atom at exactly the same temperature will not. Why do they keep talking about greenhouses when a hot water bottle in a blanket is a better analogy?

Steve Oregon
February 11, 2010 7:42 am

This reminds me of arguments I used to have with my cell phone company. When I complained about various excessive charges that appeared on my bill, the helpful representative would explain that some were glitches. This glitch problem was attributed to wireless reliability not being as good as a land line.
I was told that over and over again.
Naturally I then wondered why there were never any glitches which resulted in under-charging me.
They were always over charging glitches.
And any and all corrections were always related directly to my level of complaining. The more aggressively I complained, the more they took off the bill.
My conclusion was they were deliberate glitches and all of the people who never complained meant millions for the company.
So if there are no CRU and/or GISS glitches which showed mistaken cooling, then the glitches may not be so innocent.

February 11, 2010 7:46 am
Haven’t seen this comic linked yet — sorry if it’s a repeat 🙂

February 11, 2010 7:55 am

TerryBixler (07:17:53):
“Has anyone noticed the huge deficits, has anyone ever wondered where the money is going while there is around 10% of the population still unemployed. Have these numbers been adjusted as well?”
The numbers aren’t adjusted, just cherry-picked. Since 0bama became president, unemployment has increased substantially. U-6 unemployment, which includes those who can’t find work and who have stopped looking, has risen to well over 17%. The media generally report only the headline U-3 rate, which excludes those discouraged workers.

February 11, 2010 7:59 am

UK Now on WMD inquiry Number 3
For those who are not familiar with the saga of WMD in the UK, the public were lied to about the “unequivocal evidence” and real and imminent threat of WMD, and that is what convinced most people who supported the war against Saddam to assent to the war.
It was then found that the evidence had been “sexed” up in the “dodgy dossier”, and we’ve since had a string of inquiries each failing to convince the public that they were serious.
On Weather of Mass Destruction (the use of ordinary destructive weather events to terrorise the public into believing in global warming), we’ve had an internal review by Muir Russell, the UK parliament is holding an inquiry, and now the university is calling in academics. Outside the UK, there’s been the inquiry at Penn State University, and no doubt there are more going on or planned.
Now where can we find a dodgy dossier which has been sexed up to suggest the evidence was unequivocal, real and imminent … any ideas?

February 11, 2010 8:09 am

Dave Ward:- They can announce what they like but if the premise is flawed as it seems to be then it means less than nothing. Can they provide proof of what they are postulating besides the projections of computer models?
I will answer that myself and that is no. The theory of global warming via CO2 is just that and remains unproved and untestable.

February 11, 2010 8:09 am

If we keep searching long enough, we’ll find the one thermometer that all fill-in and homogenization data comes from, located in Jim Hansen’s back yard.

February 11, 2010 8:17 am

Dr. Gerhard Loebert (06:44:37) said:
“o There is no direct connection between CO2 emission and climate warming. This is shown by the fact that these two physical quantities have displayed an entirely different time behaviour in the past 150 years. Whereas the mean global temperature varied in a quasi-periodic manner (mean period = 70 years), with temperature maxima in 1870, 1940 and 2006 (see Fig. 2.1), the CO2 concentration – after having essentially remained constant for centuries – increased exponentially with the onset of massive hydrocarbon burning in the 1950s.”
I can’t say much about his post, not having studied that subject, but I keep wondering if the increase in CO2 is exclusively due to the increase in hydrocarbon burning – or if this is a coincidence which has been accepted because it fits the ‘A’-bit of perceived GW.
How do we know if this doesn’t show the CO2 increase following the MWP, lagging the then temperature rise by ca 800 years, which has been observed in ice core data of previous ice age/interglacial periods?
I find this really puzzling – any explanations gratefully received!

February 11, 2010 8:29 am

Viv Evans (08:17:26):
“How do we know if this doesn’t show the CO2 increase following the MWP, lagging the then temperature rise by ca 800 years, which has been observed in ice core data of previous ice age/interglacial periods?
“I find this really puzzling – any explanations gratefully received!”
The planet naturally emits CO2: click
Those are the IPCC’s own figures. As we can see, for every 34 CO2 molecules emitted, only one comes from human activity. The other 33 come from natural processes, such as decaying vegetation, termite emissions, etc.
Human CO2 emissions could double, or cease entirely, and the difference would have no significance.
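Taking the comment’s 1-in-34 split at face value (the figure is the commenter’s, not independently verified here), the arithmetic behind the closing claim can be checked directly:

```python
# Rough check of the "1 in 34" claim above, using the commenter's split.
human = 1     # CO2 molecules emitted per year from human activity
natural = 33  # from natural processes (decaying vegetation, termites, etc.)
total = human + natural

human_share = human / total
print(f"human share of annual emissions: {human_share:.1%}")  # about 2.9%

# If human emissions doubled, the total annual flux would change by:
doubled_total = natural + 2 * human
change = (doubled_total - total) / total
print(f"change in total flux if human emissions double: {change:.1%}")
```

On these numbers, doubling the human contribution moves the total annual flux by about 3 percent, which is the sense in which the comment calls the difference insignificant.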

dave ward
February 11, 2010 8:37 am

Further to my (and other) posts re the UEA’s independent review, Bishop Hill has more:

February 11, 2010 8:38 am
February 11, 2010 8:42 am

Viv Evans, in the early days of Climate “science”, when they augered the Camp Century ice, they thought they found a periodicity (I think it was 80 and 150 years).
Using that periodicity, we got the first prediction: GLOBAL COOLING, and that was seriously being considered in the early 1970s. And although it was the “orthodoxy” as global warming is now, it was behind all the cooling scares and e.g. the setting up of climatic research institutions.
But …. it didn’t cool, so some enterprising researchers in the 1970s decided to explain the FAILURE of the Camp Century cycle predictions by ADDING to this effect the CO2 warming (see the first paper on “global warming”: Broecker’s 1975 paper entitled “Climate Change: Are We on the Brink of a Pronounced Global Warming?”).
And, coincidentally, for the next two decades it warmed, and those who had dreamt up the idea gained a lot of credibility for their marvellous ability to predict two decades.
But, in reality, everything we see in the climate signal can be explained as pure and simple noise. The fact is that the climate has long-term variation that is much bigger than short-term variations, so it has long-term upward/downward “thrusts” (trends) which are pure noise but which will appear to many people to be some kind of “thing” happening. If you get a series of natural ups interspersed with downs, the result is that the signal will look as if there is some kind of cycle, but follow it back or wait for it to recur and it will disappear into the noise it is.
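The “trends out of pure noise” point can be illustrated with a toy random walk. The numbers below are entirely hypothetical; the point is only that the running sum of independent shocks routinely shows a visible slope even though nothing real is driving it.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# White noise: 600 independent monthly "shocks" (hypothetical units).
noise = [random.gauss(0.0, 1.0) for _ in range(600)]

# Integrated noise (a random walk): the running sum of the shocks.
walk = []
total = 0.0
for shock in noise:
    total += shock
    walk.append(total)

def ols_slope(series):
    """Least-squares slope of a series against its index."""
    n = len(series)
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    num = sum((i - xbar) * (v - ybar) for i, v in enumerate(series))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

# The raw shocks have essentially no trend; the walk typically shows a
# clear "trend" anyway, purely as an artefact of integrating noise.
print(f"noise slope: {ols_slope(noise):+.5f} per month")
print(f"walk slope:  {ols_slope(walk):+.5f} per month")
```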

February 11, 2010 8:42 am

Identical to the NIWA alterations. Likely a conspiracy.

February 11, 2010 8:44 am

dave ward (05:31:17) :
Somewhat O/T, but the UEA have just announced a “New scientific assessment of climatic research publications” in a further attempt “to win back the hearts and minds of the public on the issue of climate change”
It’s big news but I’m not so cynical. They can’t do anything without including the bloggers.
“Scandal university climate science to be probed,”

Harold Vance
February 11, 2010 8:45 am

This post clearly demonstrates the fact that the GIStemp analysis produces garbage.
I feel sorry for the guys at Clear Climate Code. Their porting of the software to Python, while noble, will produce the same nonsensical results. If only they could free themselves from Reto Ruedy’s tractor beam.
Data In, Garbage Out. It’s the GISS way.

February 11, 2010 8:48 am

Anthony, this is way off topic, but given your interest in solar power, etc. I thought you’d like to see this. And I don’t know of any other way to contact you. This sounds like a game changer for small installations on homes etc., where solar is a reasonable alternative.
IBM researchers are developing a solar cell with an eye towards what’s in the ground.
Researchers on Wednesday published a technical paper in the journal Advanced Materials that describes a solar cell made of abundant materials with relatively high efficiency. The cell can convert 9.6 percent of solar energy into electrical energy, a 40 percent boost over current methods.
That level of efficiency is already far exceeded in commercial silicon-based cells and even beat by thin-film solar cells, which are cheaper to make than silicon cells but are less efficient. But IBM researchers set out to make a cell that uses materials that are relatively abundant elements–copper, zinc, tin, and sulfur, or selenium (CZTS). The availability of materials for existing solar technologies limits their long-term potential, according to IBM.

p.g.sharrow "PG"
February 11, 2010 9:03 am

Great work Ken, 2 boots on the ground are worth a hundred on desks 6,000 miles away.
More evidence of man made (hansenized) global warming.

John from CA
February 11, 2010 9:06 am

Fascinating comment Dr. Loebert
Here’s a link to a study done in 2003 by the Food and Agriculture Organization of the United Nations titled Climate change and long-term fluctuations of commercial catches — The possibility of forecasting.
Conclusions are in the y2787e10.pdf but an interesting chart appears on page 50 which shows a 55 year oscillation of the atmospheric circulation index and temperature range. Page 51 indicates the global temperature anomaly related to “Global Warming” concerns.

John from CA
February 11, 2010 9:18 am

This is from the document:
Figure 9.1 shows the results of modelling the detrended global dT and zonal ACI based on a 55-year period of climate oscillations. The figures indicate that a harmonic with this period length is in good agreement with the past oscillations of both dT and ACI, and suggest that similar cyclic changes are likely to continue during the future 30–60 years.
Figure 9.2 shows the temperature dynamics reconstructed from the Greenland ice cores data for the last 400 years combined with the detrended dT dynamics, calculated from the time series of directly measured temperature for the last 150 years (Figure 9.3). It is clear that the dynamics of the measured temperatures for 1861–1975 coincide with the reconstructed Ice Core dT dynamics. With a 55-year period length, the projected model curve is in good agreement with the observed fluctuations of both reconstructed and measured dT.

Ken Harvey
February 11, 2010 9:37 am

Thank you dearly, whoever it was above that led me to this site and its follow-ups.
It sounds good to me – but is it? If I can believe it then I can forget about climate change and go back to trying to educate so-called bankers as to what they did wrong that assured a banking collapse, and which they have not yet grasped or changed. The author of the original paper has been around for some years. Why haven’t I heard of his theory till now?

February 11, 2010 10:37 am

One thing about all the temp adjustments and thermometer movement: they must know that this can only bring a decade or so of warming charts – after that it will normalize at the ‘higher’ levels. I guess they are only looking for a 10-year run before moving on to another impending catastrophe.

February 11, 2010 10:39 am

Ken Harvey, does a Crookes radiometer rotate with white or black surface forward? The theory says that black emits, so … or is it that black absorbs … or is it that black gets hotter so the few molecules bounce off with more energy.
The only thing I can remember is that when I tried to predict the direction using what seemed an “obvious” theory, it proved to be wrong, and after a bit of thought I realised that there were other possible explanations.
That’s why in real science, you make predictions, and then you validate the science against those predictions. Like e.g. if you predict in the dodgy dossier of 2001 that the world will warm by 1.4-5.8C/century, and you know that long-term noise is greater than short-term (i.e. short term is easier to predict), and the world doesn’t heat up at all but cools, at a rate of -0.8C/century
… you shut up, go away and think where you got the science wrong!
That’s the marvellous thing about science – it is based on what happens, on the evidence not on wild speculation of the dodgy dossier nor opinion polls of scientists.

February 11, 2010 10:40 am

The fact that GISS make the past cooler suggests to me they are avoiding making the present warming trend steeper, so their results still match UAH MSU data reasonably well. So instead they cool the past, where we have limited means of checking the work; this alone, to me, is an admission of guilt!

A C Osborn
February 11, 2010 10:56 am

Smokey (08:29:59) :
Viv Evans (08:17:26):
I can understand Viv’s question, if the climate was very warm 800 years ago, wouldn’t we be seeing CO2 rising in response to it as shown by the Ice Cores?

A C Osborn
February 11, 2010 10:57 am

Smokey (08:29:59) :
Viv Evans (08:17:26):
I should have said CO2 rising now 800 years later.

George E. Smith
February 11, 2010 11:26 am

On a related issue, is the global CO2 ever going to change from 388.09 ppm ?
It seems to have been stuck on that number for a long time; I would think even the natural, non-man-made increase would have raised it by now.
Maybe the gizmo needs to be tapped on the dial to unstick the needle.

Peter Plail
February 11, 2010 12:08 pm

As I understand it, most anthropogenic CO2 occurs as a consequence of some sort of heat-generating activity – e.g. coal, oil and gas burning for heat, propulsion and electrical generation, or as a by-product of biological activity (food converted to energy to warm and power living beings, with CO2 given off). I have never seen any discussion of whether the quantity of heat produced is significant in the context of global temperature.
For example, I think the average human alone gives off about 60 to 100 watts, so for a 7 billion population we are talking about 420 to 700 GW. I have no idea where to start calculating heat output from transportation, industry and power stations, let alone cooking fires.
All told that’s a lot of extra heat energy given a doubling of world population in the last 50 years.
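A quick back-of-envelope check of the figures above, with a rough comparison against absorbed sunlight added for scale. The solar numbers are my own round figures, not from the comment, and everything here is order-of-magnitude only:

```python
population = 7e9               # commenter's figure
w_low, w_high = 60.0, 100.0    # metabolic heat per person, watts

total_low = population * w_low    # 420 GW
total_high = population * w_high  # 700 GW

# For scale (rough round figures, not from the comment): Earth absorbs
# about 240 W/m^2 of sunlight averaged over its ~5.1e14 m^2 surface.
solar_absorbed = 240.0 * 5.1e14   # roughly 1.2e17 W

print(f"metabolic heat: {total_low / 1e9:.0f} to {total_high / 1e9:.0f} GW")
print(f"upper bound as a share of absorbed sunlight: "
      f"{total_high / solar_absorbed:.1e}")
```

On these rough numbers, direct metabolic heat is a few parts per million of the sunlight the planet absorbs, which is why direct waste heat is usually treated as negligible on the global scale (local urban effects are a separate question).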

February 11, 2010 12:51 pm

Ken Harvey – you asked about the link. It is authentic. I corresponded with the author a month ago.
What country do you live in?

February 11, 2010 1:22 pm

Peter Plail (12:08:33) : Peter, you raise an interesting point, but in my limited knowledge, may I suggest that the answer is Energy Conservation; i.e. the energy we give out = the energy we take in. Regards, Bob.

February 11, 2010 1:39 pm

Looking at the monthly data from BOM, it is evident that there is much more variability in the earlier data.
The end of the 1930 hump in 1932 is evidenced by a measurement change of 4 deg C in 1 month (not impossible, but other April-May periods show no similar change).
The beginning of the hump is less pronounced with many gradual changes.
Surely it is correct for GISS to correct this?
On another matter the copyright notice for the data is:
The copyright for any data is held in the Commonwealth of Australia and the purchaser
shall give acknowledgement of the source in reference to the data. Apart from dealings
under the copyright Act, 1968, the purchaser shall not reproduce, modify or supply
(by sale or otherwise) these data without written permission

i.e. although I paid nothing for the data I have no right to pass it on – just like CRU!

February 11, 2010 2:26 pm

Curiousgeorge (08:48:03) :
Anthony, this is way off topic, but given your interest in solar power, etc. I thought you’d like to see this. And I don’t know of any other way to contact you. This sounds like a game changer for small installations on homes etc., where solar is a reasonable alternative.

That article reads in a somewhat confused manner; you need to see this abstract to follow it better.
The 40% improvement is only over other solar cells made from “copper, zinc, tin, and sulfur, or selenium (CZTS).” Currently, commercial silicon-based and thin-film cells beat it. The major things are that these use more common materials, and that they are thin-film cells made with a “printing” process that does not use vacuum technology. With improvements these might go from the current 9.6% efficiency to perhaps 12%. At that point the “watts per (area times cost)” factor makes them look attractive. Except, on small home installations you only have so much area to work with, so that’s a built-in limiting factor.
Now, over at Uni-Solar Ovonic they “print” flexible cells the size of football fields, that get cut into rolls that can be deployed like roofing material. They’ll work nice over a tin roof. They are not as efficient as a “normal” solar panel. However, as detailed in this paper, they work with diffuse light, they do not require bright direct sunlight. Thus they yield more electricity than more-efficient cells. And since they can cover a roof nicely, they sure look more aesthetically-pleasing than traditional panels with far less installation hassles.
It is nice to want to use less-rare materials. However, for practical solar electric power that people will prefer to use, there are other factors to consider.
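The arithmetic implied by the quoted “40 percent boost” can be made explicit. The roof area and insolation below are illustrative assumptions of mine, not figures from the article:

```python
czts_eff = 9.6   # percent, from the article
boost = 0.40     # the quoted "40 percent boost" over earlier CZTS cells

earlier_eff = czts_eff / (1 + boost)
print(f"implied earlier CZTS efficiency: {earlier_eff:.1f}%")  # about 6.9%

# Peak DC output for a hypothetical 30 m^2 roof array at a nominal
# 1000 W/m^2 insolation (illustrative numbers, not from the article):
area_m2, insolation_w = 30.0, 1000.0
for eff in (czts_eff, 12.0):  # today's cell vs the hoped-for 12%
    print(f"at {eff:.1f}% efficiency: "
          f"{area_m2 * insolation_w * eff / 100:.0f} W peak")
```

The jump from 9.6% to 12% is worth a few hundred watts on a small roof, which is why the fixed-area constraint mentioned above matters so much for home installations.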

Peter Plail
February 11, 2010 2:34 pm

Bob (Sceptical Redcoat) (13:22:43) :
The energy input of the body is food but the energy out is useful power and heat accompanied by CO2; the energy input of a power station is coal, or gas, or oil and the output is useful power and heat accompanied by CO2, etc (I’m ignoring waste products for simplicity). I picked the human example as the easiest given that the figure for heat output is generally accepted and pretty consistent. I suspect that other human activities generate greater quantities of heat per capita, certainly in colder latitudes.
The point I was trying to make though, is that all the processes that generate CO2 also generate heat. I lack the skills to do the physics, but it seems to me that there is a possibility that at least part of the temperature rise attributed to man-made CO2 might actually have arisen from the heat produced at the same time that the CO2 is produced. If this is the case then that would make CO2 even less potent.

Dave N
February 11, 2010 3:35 pm

Sydney sceptic: I can possibly do Adelaide and surrounds, not that there’s many stations within cooee that GISS use.

JP Miller
February 11, 2010 4:01 pm

Please, please send this out to a list of climate scientists in the US and Australia and ask them what they think is going on and whether they continue to be willing to use GISS data from before 1979 despite example after example of raw data that looks to be “tweaked” to suit the AGW hypothesis.
In fact, Anthony, it would be great if someone could compile a list of, what, a few hundred climate scientists who would get a short email and a link every time this site serves up data that questions whether climate science is on sound footing with its dependent variable. After all, these analyses do not show up in the “peer reviewed” literature, so are likely to be overlooked by “respectable” climate scientists.
Oh, by the way, isn’t it interesting that the upward adjustments stop at about 1979, when satellite temp data starts becoming available such that it would have become obvious if there had been GISS temp fudging after that date?

February 11, 2010 6:53 pm

TanGeng (04:48:40) :
btw I was just scrolling down the page and noticed white precip on one of the weather maps. Is that another snow storm in Texas??

So far, an all-time record for snowfall at DFW airport has been recorded: 8.7 inches –
– breaking the old record of 7.8 inches.
Just copied live off-the-air from WFAA-TV CH 8 here in Dallas, Tejas.

February 11, 2010 7:11 pm

Please, people, let’s get some perspective here; consider the nature of the actual “raw” data, per NOAA:

Maximum temperature. This is the highest temperature (°F) recorded for the calendar day.
Minimum temperature. This is the lowest temperature (°F) recorded for the calendar day.
Average temperature. The sum of the previous two columns, divided by 2, and rounded, gives the value for this column.

therefore it’s completely synthetic and made up; it has no basis in reality, so it doesn’t matter how it’s adjusted or homogenized. Ground-station thermometers have no scientific validity in climatology and were never intended to be used for that purpose. At least if they consistently used Tmax or Tmin, we would have a number that was actually measured in the real world.
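The quoted definition can be sketched directly, along with a hypothetical day showing why the max/min midpoint is not a true time-weighted daily mean (all temperatures below are invented for illustration):

```python
def daily_average(tmax_f, tmin_f):
    """The quoted definition: (max + min) / 2, rounded to a whole degree F."""
    return round((tmax_f + tmin_f) / 2)

# Two hypothetical days with very different afternoons share one "average":
print(daily_average(90, 60))  # 75
print(daily_average(80, 70))  # 75

# A hypothetical day that is 60 F for 18 hours and 90 F for 6 hours:
hourly = [60.0] * 18 + [90.0] * 6
true_mean = sum(hourly) / len(hourly)               # 67.5
midpoint = daily_average(max(hourly), min(hourly))  # 75
print(f"true hourly mean {true_mean}, reported 'average' {midpoint}")
```

The midpoint overstates this invented day by 7.5 degrees, which is the sense in which the comment calls the reported “average” synthetic.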

February 11, 2010 7:15 pm

NWS Ft. Worth office announcement:
808 PM CST THU FEB 11 2010

F. Ross
February 11, 2010 7:37 pm

Re; GISS adjustment …
“Why …even Father Lonegan had a mother!”
“What did you expect?”

February 11, 2010 8:02 pm

The world will be focusing on Vancouver Canada soon.
The warmers will be making a big to do over snow depths and banging on how it’s due to GW.
So to any BC readers, a local examination of GISS adjustments will be worth a ton of snow in Washington DC.

February 11, 2010 9:43 pm

NASA applies an urban correction of its GISS temperature index in the wrong direction in 45% of the adjustments. Instead of eliminating the urbanization effects, these wrong-way corrections make the urban warming trends steeper. My article discusses Steve McIntyre’s audit of the GISS corrections at:

Jim Masterson
February 12, 2010 2:34 am

Years ago (before 2003 and after 1998), I became interested in desert temperatures (specifically Death Valley). One of the predictions of greenhouse theory is that dry regions, like deserts and polar regions, will show the effects of CO2 warming more than other areas. This is because CO2 effects are masked by water vapor, so dry regions are the “canary in the mine” signal of GW. Unfortunately, during the hot year of 1998, Death Valley had a cold year–third coldest in fact. I stored my data away and didn’t check Death Valley temperatures until recently. The current data show that 1998 is still a cool year, but something has changed. The temperatures now shown for Death Valley weren’t as I remembered them. So I pulled out my old data and checked. Below is a comparison of these datasets. The first graph is the pre-2003 plot of my saved data. The second plot is the current GISTEMP values. In the third plot, I overlay the two datasets. Apparently Hansen’s been busy “correcting” these temperature values during the last few years.

The linear trend slope of the pre-2003 data is 0.0143 °C/year and the current data has a linear trend slope of 0.0192 °C/year.
Have fun trying to figure out the temperature modification algorithm. I tried to check the original B91 forms and that’s a lot of work. Too bad there isn’t a fancy OCR program that will scan these forms. The two years that I checked don’t match either dataset.
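Taking the two quoted slopes at face value, the size of the change between the datasets works out as follows. The 90-year record length used below is an assumption for illustration, not a figure from the comment:

```python
pre2003_slope = 0.0143   # deg C / year, as quoted above
current_slope = 0.0192   # deg C / year

diff = current_slope - pre2003_slope
print(f"slope difference: {diff * 100:.2f} C per century")  # 0.49

# Over a nominal 90-year record (assumed length, for illustration only),
# that slope difference alone amounts to:
print(f"extra warming implied: {diff * 90:.2f} C")  # 0.44
```

In other words, the revision to the Death Valley series steepens the trend by roughly half a degree per century, comparable in size to the warming signal being debated.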

Jim Masterson
February 12, 2010 7:18 am

My picture link works fine on my site and on other sites, but it doesn’t work on this site. You might as well delete my previous post as it doesn’t make sense without that link.
[WordPress doesn’t support picture links. ~dbs, mod.]

Robert of Ottawa
February 12, 2010 7:22 am

You gotta adjust for lack of UHI, come on!

Baa Humbug
February 12, 2010 8:32 am

Re: Jim Masterson (Feb 12 02:34),
Hi Jim. Wasn’t a new weather station added to death valley? I’m checking now at the late John L Daly web site.
Yes HERE it is. Badwater was set up to try and obtain the world all-time highest temp record. The original station was at Furnace Creek. Fascinating read. Daly actually traveled there and took photos.
Seems NASA isn’t new to “tricking” temp data.

Jim Masterson
February 12, 2010 12:25 pm

Baa Humbug (08:32:07) :
Hi Jim. Wasn’t a new weather station added to death valley? I’m checking now at the late John L Daly web site.
Yes HERE it is. Badwater was set up to try and obtain the world all-time highest temp record. The original station was at Furnace Creek. Fascinating read. Daly actually traveled there and took photos.
Seems NASA isn’t new to “tricking” temp data.
I reposted my previous posting on John Brignell’s Number Watch web site, which, unlike WordPress, supports picture links.
Let’s see–moving a weather station changes all of the historical values? I think that’s a little too much homogenizing.
(John Daly’s Death Valley temperature plot matches my pre-2003 data, so there’s some confirmation.)

Ken Stewart
February 12, 2010 4:03 pm

G’day folks!
Well, the game’s up, and I can’t rely on Queenslander! for anonymity anymore.
Thanks all for your very interesting comments. I’ve had modem problems (it died) this week so I’m using the local library…
Neil McEvoy:
Thank you- that’s it in a nutshell. They make out a farm to be a town.
Thanks for the hint about the drop-down box; I’ll look for it! But it really makes very little difference unless December or January has an extreme up or down. Most of the data matches pretty well. I’ve been urged to contact you, so will shortly, if you don’t mind.
Yes, Te Kowai is a Maori name. Possibly from the blackbirding days when 1000s of Pacific islanders were brought to work in the canefields.
geo: Actually, there are instances of GISS correcting UHI properly. I show this in my Postscript post: Rockhampton and Townsville were both adjusted down, which is nice. So why Mackay and Te Kowai?
JP Miller:
They do make some adjustments after 1979- see Rocky and Townsville. It’s not as straightforward as we might think. THEY cherry pick!
Thanks Anthony for giving me a go!

February 13, 2010 12:28 am

Funny how the warmists are pointing to snow as proof of AGW when only a short time ago that CRU guy (Onions?) said that snow would be increasingly rare and children wouldn’t know what it was.

February 13, 2010 6:30 am

An interesting document from BOM:
Note this table:
This shows no heatwaves corresponding to the 1915 to 1932 hump. I would have expected the high monthly average temperatures to have been caused by heat! But the table shows an absence of anything outstanding during this period.

February 13, 2010 8:45 am

Jim Masterson (12:25:05) :

I reposted my previous posting on John Brignell’s Number Watch web site, which, unlike WordPress, supports picture links.

Jim, let me take a stab at posting an HTML ‘href’ link to the image (as opposed to posting an HTML ‘img’ link):
“… I pulled out my old data and checked. Below is a comparison of these datasets. The first graph is the pre-2003 plot of my saved data. The second plot is the current GISTEMP values. In the third plot, I overlay the two datasets. Apparently Hansen’s been busy “correcting” these temperature values during the last few years.”
Three data plots as follows:
1) Pre-2003 data plot
2) Current GISTEMP plot
3) overlay of the two above
“The linear trend slope of the pre-2003 data is 0.0143 °C/year and the current data has a linear trend slope of 0.0192 °C/year.”
Now, I click SUBMIT COMMENT and see what happens …

February 14, 2010 4:47 pm

You must write to your local member to stop the madness on so-called global warming. This will not end until Rudd is out of office and there is no climate minister. Penny Wong is getting a wage on a false pretence and must be stripped of that title and office. Then and only then will the debate be over. Legal action must then be taken to recover monies spent on a false premise, and the people who propagated the deceit need to go to jail or be fined.

Jim Masterson
February 14, 2010 5:40 pm

_Jim (08:45:52) :
Jim, let me take a stab at posting an HTML ‘href’ link to the image (as opposed to posting an HTML ‘img’ link):
Thanks _Jim. If I had remembered that img links didn’t work on WordPress, I probably would have used an href link instead.
