By Dr. David Whitehouse, the Global Warming Policy Foundation
Today’s Times says, “Nasa analysis showing record global warming undermines the skeptics.” However, a closer look at the information on which the Times bases its headline shows that a combination of selective memory and scientific spin played a large role in arriving at it.
The conclusion is based on a new paper written by James Hansen and submitted to Reviews of Geophysics. The paper released by Hansen has not been peer reviewed, and he admits that some of the newsworthy comments it contains may not make it past the referees.
Hansen claims that, according to his Gisstemp database, the year from April 2009 to April 2010 has a temperature anomaly of 0.65 deg C (based on a 1951–1980 average), making it the warmest year since modern records began. It is fractionally warmer than 2005, he says, although an important point is that, statistically speaking, taking into account the error of measurement and the scatter of previous datapoints, it is not a significant increase.
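The significance point is easy to sketch with made-up numbers: if recent annual anomalies scatter around a constant mean, a new value only deserves to be called a record if it sits outside that scatter. The figures below are illustrative, not the actual Gisstemp values.

```python
import numpy as np

# Illustrative annual anomalies for a 2001-2009 plateau (made-up values,
# not the actual Gisstemp record).
plateau = np.array([0.54, 0.63, 0.62, 0.54, 0.68, 0.64, 0.66, 0.54, 0.64])
new_point = 0.65  # the claimed new "record"

# How many standard deviations above the plateau mean is the new point?
mean, sd = plateau.mean(), plateau.std(ddof=1)
z = (new_point - mean) / sd
print(f"z-score of the new point: {z:.2f}")  # well inside +/- 2 sigma
```

A point less than two standard deviations above the mean of the preceding years is indistinguishable from the scatter, which is exactly the argument being made here.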
The Nasa study said: “We conclude that there has been no reduction in the global warming trend of 0.15-0.20 deg C per decade that began in the late 1970s.”
This is a selective use of a trend line that joins a datapoint in the late 1970s to the most recent one, ignoring the details in the data in between. The fact is that one could have taken a datapoint a decade ago, tied it to the same point in the late 1970s, and deduced an even greater rise in temperature per decade. So another way of describing the data is that the rate of increase has actually declined.
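The endpoint-sensitivity argument can be demonstrated with synthetic data: a series that warms quickly and then plateaus yields a different “trend per decade” depending on where you end the fit. This is an illustrative sketch, not the real Gisstemp series.

```python
import numpy as np

def decadal_trend(years, anomalies, start, end):
    """Ordinary least-squares slope over [start, end], in deg C per decade."""
    mask = (years >= start) & (years <= end)
    slope = np.polyfit(years[mask], anomalies[mask], 1)[0]
    return slope * 10.0

# Synthetic anomaly series: steady warming to 2000, then a near-flat plateau.
years = np.arange(1978, 2011)
anoms = np.where(years <= 2000,
                 0.02 * (years - 1978),                           # 0.2 C/decade
                 0.02 * (2000 - 1978) + 0.005 * (years - 2000))   # ~0.05 C/decade

full = decadal_trend(years, anoms, 1978, 2010)   # late 70s to the latest point
early = decadal_trend(years, anoms, 1978, 2000)  # late 70s to a decade ago
print(f"1978-2010: {full:.2f} C/decade, 1978-2000: {early:.2f} C/decade")
```

Ending the fit a decade earlier gives the larger per-decade rate, which is the sense in which the rate of increase can be said to have declined.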
Another point is that an increase of 0.2 deg C per decade, if it is real and sustained, is 2.0 deg C per century, an increase that is not unprecedented in the climatic record of the past 10,000 years, and substantially less than the widespread predictions of a higher increase.
In the Times article, the Met Office in the form of Vicky Pope, said that their data showed that the past year was “just below” the 12-month record achieved in 1998. Remember, 2009 annual temperature was, according to the Met Office, statistically indistinguishable from every year between 2001–2008.
Vicky Pope then says that Nasa might be right because the Met Office had underestimated the recent warming detected in the Arctic! There are few weather stations in the Arctic and the Met Office, unlike Nasa, does not extrapolate where there are no actual temperature readings. It is curious to hear this given the criticism that Met Office scientists have expressed in the past about the way the Gisstemp dataset is pieced together this way!
Vicky Pope does say however that, “the Met Office continues to predict that 2010 is more likely than not to be the warmest calendar year on record, beating the 1998 record.” This is also a curious statement since she adds that Met Office analysis showed that the four months to the end of April were probably the third warmest for that time of year.
Only in the past few weeks, however, the Met Office was saying something different.
In the Sunday Times of May 23rd Vicky Pope said that 2010 could be the hottest year on record due to the current El Nino. She also said that the 2010 January–April temperature was the seventh warmest on record, meaning that most of the past ten years (allowing for the 1998 El Nino) were warmer during the January–April period, though not statistically so.
In the Sunday Times article Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research in Colorado, adds what is missing from the article mentioned earlier: “We have seen rapid warming recently, but it is an example of natural variation that is associated with changes in the Pacific rather than climate change.”
In the Times article poor journalism is compounded with scientific spin from James Hansen’s paper to give a misleading impression of the state of the science and what the data actually show. It will be interesting to see if 2010 breaks any records in the Gisstemp or Met Office datasets. If it does, the next question to ask is whether the record is statistically significant, since one would expect the occasional high point when errors of measurement scatter the datapoints around a constant mean (the case post-2001). It would be highly misleading and scientifically fraudulent to look at one datapoint that is higher than the rest yet within the error bars of the previous years and say, “look, a record.” This would undermine not the skeptics but science itself.
Feedback: david.whitehouse@thegwpf.org

So, based upon the overwhelming consensus here (which is of course very important, just like the consensus of the overwhelming majority of climate “scientists”), the climate shows no sign of this warming, which means either:
1. The temperature network (whether thermometer or satellite) is accurate but reading something entirely irrelevant, according to our perception of climate
2. The temperature network is reading the right stuff, but getting it wrong
3. Some lying barstewards are making it all up.
4. All of the above
What do you reckon?
UK Sceptic says:
June 3, 2010 at 1:10 pm
Must be something wrong with my internal thermometer then because I shivered through most of April.
__________________________________________________________________________
I am in North Carolina, where we usually do not see snow, and we had a low of 35F (1.6C) in MAY!!! Not to mention FIVE snow storms this winter. We only had four days with highs of 91F (33C) for the entire month of May this year, compared to 17 days over 90F (32C) in 2004, including two days with a high of 98F (36.6C). (The Wunderground data goes no further back.)
Heck, I have not even bothered to turn on the A/C yet because it has been so cool the house has not really warmed up.
What warming? Our part of Vancouver Island is still four degrees lower than the seasonal average as of last night, and it’s been 2-4 degrees below the posted average for the area since before Christmas. We are promised that temperatures will be ‘average’ next week.
Hey, was that a derisive snort just then or me sneezing? Must be the weather, not the climate. /sarc
First week of June, and the lake/pond outside my office window is nicely fringed with daffodils. This is the first warm spell of the year. Globally it’s only going to go downhill for the rest of the year. Hansen’s peculiar choice of April-April suggests he senses that fact – this time-frame centers on the (now faded) El Nino, hence the hasty publication before the cooling sets in. Perhaps in his mind he can’t separate climate and taxes, thus his choice of the financial year for climate.
Peter Hearnden says:
June 3, 2010 at 2:08 pm
….But the message is confusing for us basic farm folk who can’t comprehend it. Do we rear hardier cold-loving beasts in the future, or make ready for hardier species to raise for a warming world? But the science, we are told, is not the reality, and it can be so confusing for us folk down in Devon who still live quite isolated from the modern world…..
____________________________________________________________________________
So Peter, it must be you folks that the USDA was referring to in their manual when they tell staff to address farmers at *Sixth Grade Level*.
It’s really due to gravity. It pulls the mercury lower in the thermometers than it should be, so the ‘experts’ have to make adjustments to correct it. In Colorado, when we are 10-15 degrees below average, it’s just weather, but 5 deg above and CAGW is proven! (of course, ‘average’ hereabouts is meaningless. That’s what it won’t be.)
All this “warmest year ever” crap is based desperately on manipulated data from GISS, the ONLY place to turn to nowadays for record heat.
Must we forget the MOST SNOWPACK in the US in, like, 50 years?
The only reliable resources to look at are the satellites put up in the 70’s during the fear of global cooling – year 1979, right when the PDO and AMO turned warm, which created this temperature rise. This is the LAST year that “record” warmth will be discussed; we’re about to turn a huge corner. I’d hope we don’t get colder globally than the 70’s, but it looks like it will end up that way.
What’s going on here is the oscillations of the PDO and AMO. First the PDO goes cold, and temps level off and slowly fall; then the AMO goes cold, and we’re back to the 70’s – and we still have no idea what the sun has to do with it, or at least we’ll find out. CO2, a trace fraction of the atmosphere….. You noticed, back in the 90’s we were talking about the ozone layer thinning, then it slowly died off? The global warming idea will slowly fade, but the money, the “green”, the taxes, big government, and NOAA will find a way to keep something going, whether it be the next ice age, global drought…. some way, they’ll find a way.
It’s about the MONEY, not the science. If it were about the science, and if it were settled, then we skeptics WOULDN’T be winning the debate, as we are.
This AGW myth is horse sh*t, I’m out.
Peter,
I’m headed for your neck of the outback this weekend – a break from a research trip. I’ve been having trouble with my internet connections at a B&B in Southampton, and hope things aren’t as bad as you paint them in Devon. Hope the government comes through for your pigs and hogs.
Totally off topic – the last time I went on a research trip (the archival kind) Climategate broke. I suppose it’s too much to hope for something similar to happen again? I won’t hold my breath, and at least the memory is a grand one.
Thanks for your analysis, Dr. Whitehouse. I’m glad people like you are keeping a weather eye on alarmist lies and exaggerations, and deconstructing the message for us.
Juraj V. says:
June 3, 2010 at 11:05 am
Once again, look how GISTEMP runs warmer compared to all remaining datasets, including HadSST:
http://www.woodfortrees.org/plot/gistemp/from:1998/plot/hadcrut3vgl/from:1998/plot/rss/from:1998/plot/uah/from:1998/plot/hadsst2gl/from:1998
=====
You forgot the offsets that put the four on a comparable base (-0.24 for gisstemp and -0.15 for hadcrut). Do that and the differences shrink.
http://www.woodfortrees.org/plot/gistemp/from:1998/offset:%20-.24/plot/hadcrut3vgl/from:1998/plot/rss/from:1998/plot/uah/from:1998
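For readers who want to reproduce the offset adjustment themselves: re-baselining an anomaly series is just subtracting its mean over the new baseline period. The sketch below uses a synthetic series; the actual -0.24 and -0.15 offsets come from the real datasets, not from this toy example.

```python
import numpy as np

def rebaseline(years, anoms, new_base):
    """Re-express anomalies relative to the series mean over new_base=(y0, y1)."""
    m = (years >= new_base[0]) & (years <= new_base[1])
    offset = anoms[m].mean()
    return anoms - offset, offset

# Synthetic series built on a 1951-1980 baseline (zero mean over that period).
years = np.arange(1951, 2011)
anoms = 0.015 * (years - 1965.5)  # steady trend, zero-mean over 1951-1980

# Shift to the satellite-era 1979-1998 baseline used by UAH and RSS.
shifted, offset = rebaseline(years, anoms, (1979, 1998))
print(f"offset applied: -{offset:.3f} C")
```

Because a warming series has a higher mean over a later baseline window, a series on an early baseline (like GISTEMP’s 1951–1980) must be shifted down to be comparable with the satellite series, which is why the published offsets are negative.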
Chris says:
June 3, 2010 at 9:52 am
Jon P
Here is the monthly GISS station txt:
http://data.giss.nasa.gov/work/gistemp/STATIONS//tmp.425722900004.1.1/station.txt
davidmhoffer says:
June 3, 2010 at 9:58 am
If I may, I think I just found a gigantic hole in Hansen’s data. I was running trend lines from various scenarios back to 1880 using the GISS/TEMP data from the GISS map site. There’s a most interesting problem. When you run a trend from 1880 to 2009 there are 1,060 cells (of 16,200) which are used to calculate the trend, but which have no data to calculate against in 1880.
—…—…
1) Has Hansen released the raw US temperature data from 1880?
Or are we allowed to look at only his corrupted (er, corrected) and adjusted (upwards) temperature data?
2) I understand (from an old WUWT thread in 2008) that some of his program code has been released. Apparently he “re-analyzes” and then “re-calculates average temperatures” for EVERY temperature record ever received, each month. Rather than, say, actually leaving the historical record intact and unedited, and comparing only the “new” data to the 1970-based 0.0 reference point – a reference very conveniently located right at the valley between the 1940’s warm years and the 2000–2010 warm years.
3) Further, I understand that he adjusts upward all data from remote sites within 1200 km (max, if 600 km data is not available while correcting for UHI hot spots) but refuses to release the actual “remote sites” he uses for this purpose.
4) I understand that “new” remote site temperature records are used every month for each GISS run based on NASA’s light index. How often is this index updated?
5) What effect does this update (a new light index pattern) have on previously calculated “remote site” corrections? That is, if every month a new light index is used, what happens to old record corrections that are now being compared against different remote sites? Also, if every month a different remote site could be “found” by the calculator, then wouldn’t Boston’s Logan Airport be “corrected” by Ontario’s temperature one month, some parking lot in rural south PA the next, Appalachian WV the next, and North Carolina the month after?
6) Hansen’s GISS “remote site correction selector” sub-routine appears to accept and reject remote sites based on longevity (years of comparable records available at that given month) and “circular radius” only. If in one or more months a previously valid remote site drops out, he is now (automatically and without checking!) using an entirely different baseline temperature to “correct” his temperature history – for how many sites?
7) Hansen’s “circular radius” selection criterion accepts all “remote site” records regardless of latitude changes (Washington DC to southern Canada are acceptable “remote sites” to adjust all east coast cities). Is this valid? Who decided this is acceptable, and when – other than Hansen himself in his 1987 papers?
Not at all sure why we pay attention to this [snip] (stuff). Assuming things cool off, (from the last few months) will Hansen be pointing to the Dec(09) / Dec(10) slope as proof of dangerous cooling?
Just how many wrong guesses does Jim get before Politicians and the Public wake up?
Perhaps all that missing “hidden heat” is now to be found on the far side of the Moon eh Jim …
Well, you have to hand it to Hansen for consistency. Consistently shovelling sh*t, that is.
Hansen’s rubbish has also hit the headlines of the SMH:
http://www.smh.com.au/environment/climate-change/the-warmest-year-yet-says-nasa-20100603-x7f5.html
… funny how the expanding Pacific Islands don’t rate a mention.
El Nino is over and the 2010 El Nino didn’t surpass the 1998 El Nino. If La Nina starts (and there are indications it could be in effect soon) then temperatures will begin to drop quickly.
Joe Bastardi thinks temperatures will begin to drop quickly by September and could be in negative anomaly for a time in 2011.
3:33 video at link:
http://www.accuweather.com/video/89141767001/major-drop-in-global-temps-is-around-the-corner.asp?channel=vblog_bastardi
p.s., moderators, could you suggest to Anthony that a guest post from Joe Bastardi would be uber cool?
davidmhoffer says:
June 3, 2010 at 9:58 am
If I may, I think I just found a gigantic hole in Hansen’s data. I was running trend lines from various scenarios back to 1880 using the GISS/TEMP data from the GISS map site. There’s a most interesting problem. When you run a trend from 1880 to 2009 there are 1,060 cells (of 16,200) which are used to calculate the trend, but which have no data to calculate against in 1880.
1) Has Hansen released the raw US temperature data from 1880?
I thought that he had, but when I pick the “unadjusted” from the web site it just gives me an error. The unadjusted data says it is available up to 1999. Doesn’t seem to be there.
My first crack at it (see post above) was astounding. To have over 1,000 data points in the calculation that are in error is astounding. That they appear to have been hand crafted is astounding. That they are slid in here and there in places where they would be hard to spot is astounding. They are on average about double the actual anomalies for nearby data – more astounding still. See for yourself if you don’t believe me.
http://knowledgedrift.wordpress.com/2010/06/03/the-most-interesting-gisstemp-errors-can-this-be-an-accident/
I was going to re-examine the data for the same date span as in his article above, but it is turning out to be a bit of a challenge. Showing that the bogus data from 1880 to 2009 had an increasing effect on the temperature trend was a challenge, but not insurmountable, and all the evidence is on my blog. Trying to do the same for just the last part of the data is another thing.
I ran 1970 to 2009 and plugged it into the same spreadsheet. In that time period, there are 43 cells out of 12,721 that are in error. But there are 13,446 with data. What happened to them? Turns out the problem is in reverse. There are 768 cells with data at the beginning of the time period but no corresponding data at the end of the time period. This is the case where the average from the zone is supposed to be used in order to complete the calculation, but it looks like that has not been done. So while the first problem was easy to back out of the numbers, this one isn’t. Since I can’t estimate the size of the missing numbers, I would have to figure out how the other numbers are or are not in the calculation in order to back them out or adjust for them. I’m not certain the info is there to do that, but I am looking at it.
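The consistency check described above – counting cells that have data at one endpoint of the trend period but not the other – can be sketched as follows. The grids and counts here are toy values for illustration, not the actual GISTEMP grid.

```python
import numpy as np

def endpoint_mismatch(grid_start, grid_end):
    """Count cells with data at one endpoint of a trend period but not the other.

    Both inputs are 2-D anomaly grids with NaN marking missing cells.
    Returns (missing_at_start, missing_at_end).
    """
    start_ok = ~np.isnan(grid_start)
    end_ok = ~np.isnan(grid_end)
    missing_at_start = int(np.sum(end_ok & ~start_ok))  # data in end year only
    missing_at_end = int(np.sum(start_ok & ~end_ok))    # data in start year only
    return missing_at_start, missing_at_end

# Toy 4x4 grids standing in for the full gridded anomaly fields.
g1880 = np.array([[0.1, np.nan, 0.2, np.nan],
                  [np.nan, 0.0, 0.1, 0.3],
                  [0.2, 0.1, np.nan, 0.0],
                  [0.1, 0.2, 0.1, 0.1]])
g2009 = np.full((4, 4), 0.5)
g2009[3, 3] = np.nan

print(endpoint_mismatch(g1880, g2009))
```

Any cell counted by either category contributes to an endpoint-to-endpoint trend without a matching observation at the other end, which is the kind of gap being flagged in the comment above.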
Dr A Burns:
June 3, 2010 at 5:27 pm
If the temperature anomaly stays the way it was in Jan–Apr throughout the rest of the year, then 2010 will be the hottest on record. But James Hansen knows full well, I’m sure, that the high temperature anomaly in the first four months of this year was caused by the El Nino. And I am sure he also knows the El Nino ended quickly at the beginning of May. The anomaly will not stay as high as it was. But most of us here know there are odd things going on with James Hansen’s data set, GISTemp. If these odd things continue throughout the year, then it is possible that his data set will show 2010 as the highest ever on record. But that record will only be a fact of life in James Hansen’s pretend world.
If La Nina sets in and temps cool but NASA says it’s a record hot year then the average person will know something is up at NASA GISTemp. And I will enjoy that.
David Oppenhaimer says:
June 3, 2010 at 2:25 pm
Hansen was never right, so why does anybody listen to him?
the four letters N-A-S-A attached to his name is the reason
If the words ‘environmental activist’, which is what he really is, were attached to his name then most everyone would be like you and say ‘why would I listen to him?’
As Wren points out, the temperature data have offsets of as much as 0.24 degrees in this temperature region. The satellite data from UAH and RSS should serve as a standard to which all the others ought to be compared. And if you really want to get information out of such curves, do not average but use a magic marker instead. I have long known that these offsets are devices to show that warming exists when there actually is no warming.

Let’s look at Hansen’s approach using satellite data as a guide. The first thing to take note of is that in 1988, when he testified to Congress that warming had started, there was no warming. Global temperatures just oscillated, up and down by half a degree, for twenty years, and real warming did not start until 1998. That is ten years after he claimed that warming was here. The oscillations I spoke of were not noise but real, and were caused by the alternation of warm El Nino and cool La Nina phases of the ENSO system in the Pacific. ENSO has existed since the Isthmus of Panama rose from the sea and is guaranteed to exist for the foreseeable future.

But this period in the eighties and nineties shows up in the NASA, NOAA, and Met Office curves as a warming period. Comparing their data to the satellites, we see that this is achieved simply by raising low La Nina temperatures between El Nino peaks, creating a rising temperature curve from an originally horizontal one. Clearly this system was already in place when Hansen testified. Even so, this only got them a 0.1 degree rise per decade, not 0.15 or 0.2 degrees, for that temperature segment. The larger numbers come mostly from luck: the super El Nino of 1998, which was not a product of the greenhouse effect, helped them hugely, and so did its aftermath, the twenty-first century high. But even this did not satisfy them, and the highest offsets are found in that last warm plateau. This plateau included six warm years where the temperature stayed near the El Nino maximum.
It ended with the La Nina cooling of 2008. And this was in turn followed by the present El Nino, which has just peaked. I expect that the oscillating climate we had in the eighties and nineties is back and should deliver us another La Nina this year. To show the correct average temperature for the last thirty years, you should start with a horizontal straight line that ends at the beginning of the super El Nino of 1998. Another horizontal straight line from 2002 to the present belongs to the twenty-first century high that followed it. They are disconnected and must not be statistically combined. The temperature difference between these two horizontal lines is 0.3 degrees. The transition that includes the super El Nino and the climb up to the twenty-first century high should not be averaged. The super El Nino itself was produced by a storm surge that dumped warm water at the start of the equatorial countercurrent near New Guinea. The countercurrent carried it to South America, where it ran ashore, spread out, and produced the warm spike in our records. The most important conclusion from all these temperature curves is this: anthropogenic global warming has never been observed, because Hansen’s warming is imaginary and the real warming was not carboniferous.
Mango says:
June 3, 2010 at 10:03 am
Am I mistaken, or should a paper being made ready for publication not be under a press embargo until accepted?
It is a time-honored practice to send out ‘preprints’ to friends and colleagues when the paper is submitted. A general press release at that point would be frowned upon.
2010 may or may not end up being the warmest year on record. While you all complain about GISS, your own Dr. Roy Spencer’s analyses show the same very warm first five months of the year – in complete agreement with GISS and NOAA. In fact, Spencer’s data show that 2010 so far matches the highest daily temps of the past 20 years.
While you all threaten Hansen with arrest, the modelers at the Blackboard can’t understand why the GISS temperatures run so much lower than theirs from the same data set: http://rankexploits.com/musings/2010/the-great-gistemp-mystery/
Manfred says:
June 3, 2010 at 1:33 pm
Recent 2010 temperatures were boosted by a weather phenomenon, an El Nino, contributing several tenths of a degree.
2005 was not.
There is no way to regard 2010 as warmer than 2005 from a climate science perspective.
Realclimate is very quick to adjust temperatures upwards during La Ninas, but here they, Hansen and Pope are silent again.
This is further evidence that the wrong people are employed at the top of climate science institutions.
========== ===========================
NOAA claimed “Conditions are favorable for a transition to La Niña conditions during June – August 2010.”
So it is official now.
Thanks to Wren for the giss and hadley offsets. As you can see: http://www.woodfortrees.org/plot/gistemp/from:1985/offset:%20-.24/plot/hadcrut3vgl/from:1985/offset:-.15/plot/rss/from:1985/plot/uah/from:1985
the GISS, HADCRUT, RSS, and UAH methods produce essentially the same temperature record – the agreement is nothing short of phenomenal. What’s all the fuss about?
Quoted from the post by Arno Arrak :
June 3, 2010 at 6:24 pm
“As Wren points out the temperature data have offsets as much as 0.24 degrees in this temperature region. The satellite data from UAH and RSS should serve as a standard to which all the others ought to be compared. And if you really want to get information out of such curves do not average but use a magic marker instead. I have long known that these offsets are devices to show that warming exists when there actually is no warming …………”
==============
Perhaps I’m missing your point, but I don’t see how offsets could “show that warming exists when there is no warming.”
Offsets are simply adjustments that are necessary if we are comparing the four temperature anomalies, since they use three different baseline periods. UAH and RSS both use Jan 1979 – Dec 1998 as a baseline period, while HADCRUT uses Jan 1961 – Dec 1990, and GISTEMP uses Jan 1951 – Dec 1980.
The offset adjustments (-0.15 for HADCRUT and -0.24 for GISTEMP) put these two series on a common baseline period with UAH and RSS. You can find out more about this, and see the effects of the adjustments, at
http://www.woodfortrees.org
Quoted from a post by matt v. says:
June 3, 2010 at 2:17 pm
“How can there be any credibility in any of the global temperature dialogue when there is such a wide spread in just the last ten years between the various temperature data sets?
Least square trend line slopes per Wood for Trees data
JAN 2000 to April 30/2010[124 months or about 10 years]
HADCRUT 3vgl 0.00428C/YEAR
RSS 0.00895C/YEAR
UAH 0.01289C/YEAR
GISS 0.01556C/YEAR
The difference is almost 3.6 times between low and high. The first thing the scientific community needs to do is to fix the data sets…….”
=====
After I put the four temperature anomaly series on a common baseline over at woodfortrees.org, the trends from 2000 look pretty similar to me.
http://www.woodfortrees.org/plot/gistemp/from:2000/offset:%20-.24/plot/hadcrut3vgl/from:2000/plot/rss/from:2000/plot/uah/from:2000
I wouldn’t expect the temperature changes shown by the different series to be exactly the same, since they aren’t measuring exactly the same thing. However, their long-term trends are about the same.
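One way to see why the common baseline matters for levels but not for trends: subtracting a constant offset from a series leaves its least-squares slope unchanged, so any remaining slope differences between the datasets are not caused by baseline choice. A quick sketch with synthetic monthly data:

```python
import numpy as np

# Synthetic monthly anomalies from 2000: a small trend plus noise.
rng = np.random.default_rng(0)
t = np.arange(2000, 2010.5, 1 / 12)
series = 0.01 * (t - 2000) + 0.1 * rng.standard_normal(t.size)

# Least-squares slope before and after applying a constant baseline offset.
slope = np.polyfit(t, series, 1)[0]
slope_shifted = np.polyfit(t, series - 0.24, 1)[0]

print(f"slope: {slope:.5f} C/yr, slope after offset: {slope_shifted:.5f} C/yr")
```

The two slopes agree to rounding, which is why re-baselining makes the curves overlap visually without altering any of the trend figures quoted in the table above.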