Guest post by Paul Homewood
The HADCRUT data has now been released for September, so we can have a look at the latest figures for the four main global temperature datasets. I have now switched to HADCRUT4, although the Hadley Centre are still producing numbers for HADCRUT3.
|  | RSS | HADCRUT4 | UAH | GISS |
| --- | --- | --- | --- | --- |
| September 2012 anomaly | 0.38 | 0.52 | 0.34 | 0.60 |
| Increase/decrease from last month | +0.12 | -0.01 | +0.14 | +0.03 |
| 12-month running average | 0.16 | 0.42 | 0.11 | 0.50 |
| Average 2002-11 | 0.26 | 0.47 | 0.19 | 0.55 |

Global Temperature Anomalies – Degrees Centigrade
The pattern is similar across all datasets, with September temperatures above both the long-term and 12-month averages, although, interestingly, both satellite sets have picked up a bigger spike than the other two. We are currently seeing the lagged effect on temperature of the mild El Niño, which began in April and has now pretty much fizzled out, as can be seen below. Purely thinking aloud, but is this an indication that atmospheric warming is slower to dissipate than surface warming?
http://www.esrl.noaa.gov/psd/enso/mei/
My guess is that temperatures will settle back slightly by the end of the year. If ENSO conditions remain fairly neutral over the next few months, we should get a good indication of underlying temperatures for the first time in a while.
The following graphs show 12-month running averages for each set. As I mentioned before, we often get fixated on calendar-year figures, which obviously change a good deal from year to year. It therefore seems much more sensible to look at 12-month averages on a monthly basis, rather than wait till December.
In all cases, the 12-month averages are lower than they were at the start of the year.
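For anyone who wants to reproduce a 12-month running average of this kind, here is a minimal sketch; the monthly values in it are made up for illustration and are not taken from any of the datasets above.

[sourcecode]
# Minimal sketch: trailing running mean of a list of monthly anomalies.
# The values below are illustrative only, not real data.
monthly = [0.29, 0.21, 0.34, 0.51, 0.52, 0.50, 0.47, 0.53, 0.52]

def running_mean(values, window=12):
    """Return the trailing mean for each month once a full window is available."""
    return [sum(values[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(values))]

# With only nine illustrative months, use a shorter window to see some output:
print([round(x, 3) for x in running_mean(monthly, window=3)])
[/sourcecode]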
Finally, I mentioned last month that UAH had just brought out a new Version 5.5, which corrected for spurious warming from the Aqua satellite. (Roy Spencer has the full technical stuff here). The new version is now incorporated and backdated in my analysis above. I have also plotted the difference between the old and new versions below.
As can be seen, the divergence really started to be noticeable towards the end of last year, and has steadily grown wider over the last few months.
Remember that all monthly updates can be accessed on my “Global Temperature Updates” page, at
http://notalotofpeopleknowthat.wordpress.com/category/global-temperature-updates/
Sources
http://nsstc.uah.edu/climate/index.html
http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/download.html#regional_series
I am still, after all this time, not convinced that averages like this are not being given far too much attention. It seems to me that many others are attaching far greater importance to them than is deserved or truly useful. No matter what arguments I have seen, none demonstrate any connection to any place or region. If it isn’t affecting you or me, then I think the claimed usefulness is bogus.
Phil Jones claimed an increase of 0.6°C in 120+ years in the 2001 IPCC Report and it became, with the hockey stick, a major part of the proof of human induced warming. The error range of ±0.2°C was overlooked.
The numbers presented here show that there is a 0.28°C difference between HadCRUT4 and UAH and a 0.36°C difference between GISS and UAH in just 9 years.
All this with temperatures read to a precision of, at best, 0.5°C.
These numbers are the modern equivalent of the medieval argument about number of angels on the head of a pin.
The claim that there has been no (statistically significant) warming since 1997 is borne out. If there has been any, it is not detectable. I can’t see any reason to get worried about cooling yet, either. The mystery is why there have been so few major storms hitting the USA in recent years and why Sandy did not develop into one of the powerful monsters that have hit the same region in the past.
WUWT?
My local weather station here in the English Midlands has us running at -0.4 degrees Centigrade so far this year against a thirty-year average.
Facts and figures here: http://bws.users.netlink.co.uk/
Twas the cooling that caused Sandy, mostly.
I blame you and the satellites for not picking it up. If I can see it why cannot you? Try looking at maxima.
@Tim Ball
The numbers presented here show that there is a 0.28°C difference between HadCRUT4 and UAH and a 0.36°C difference between GISS and UAH in just 9 years.
Don’t forget, Tim, they are all based on different baseline periods, so cannot be directly compared. Since 1979, though, GISS show 0.15C more warming than RSS, with the other two in the middle (based on 12-month averages).
http://notalotofpeopleknowthat.wordpress.com/2012/10/08/hadcrut-update-august-2012/
The September maximum was 3 degrees colder than last year in the UK, the minimum was 1.7 degrees colder, probably due to all the wet cloudy weather we have had.
I know the use of anomalies has been explained before but I have to admit to not ‘getting’ it. What exactly are anomalies? Are they the excursion above a baseline average by the daily average? I don’t really see that tells us a lot about daily temperatures. For example, if there IS some sort of warming trend that encourages slightly higher daily maxima OR minima that would cause the daily average to increase but it would not necessarily mean that temps overall have increased surely?
That is, if the only effect was a short spike in daily temps (e.g. at 4 AM) but the rest of the day was largely normal, we’d still see a difference in the anomaly, wouldn’t we? What do the daily actuals show when plotted over time? Is it possible to see the daily temp range for specific long-term stations plotted against the same day for, say, 100 years? If we don’t have the data to show temp ranges hourly for each day, then how can we really say what is happening?
I am not discounting the concept, I just don’t quite see that it is really telling us anything about climate…
Dennis Nikols, P. Geo says:
“I am still, after all this time, not convinced that averages like this are not being given far too much attention.”
I became interested in how much the temp drops every night.
Here are 60 years (1950-2010) of the Northern Hemisphere difference between how much today’s temp goes up and how much it drops tonight.
http://www.science20.com/files/images/1950-2010%20D100_0.jpg
Based on the NCDC’s summary of days data set (~110m samples).
2012 in Perspective so far on Six Data Sets
Note the bolded numbers for each data set: the lower bolded number is the highest anomaly recorded so far in 2012 and the higher one is the all-time record. There is no comparison.
With the UAH anomaly for September at 0.34, the average for the first nine months of the year is (-0.13 -0.13 + 0.05 + 0.23 + 0.18 + 0.24 + 0.13 + 0.20 + 0.34)/9 = 0.123. If the average stayed this way for the rest of the year, its ranking would be 10th. 1998 was the warmest at 0.42. The highest ever monthly anomaly was in April of 1998 when it reached 0.66. With the adjustments, the 2010 value is 0.026 lower than 1998 instead of 0.014 as was the case before.
With the GISS anomaly for September at 0.60, the average for the first nine months of the year is (0.32 + 0.36 + 0.45 + 0.55 + 0.67 + 0.55 + 0.46 + 0.57 + 0.60)/9 = 0.503. This would rank 10th if it stayed this way. 2010 was the warmest at 0.63. The highest ever monthly anomalies were in March of 2002 and January of 2007 when it reached 0.88.
With the Hadcrut3 anomaly for September at 0.520, the average for the first nine months of the year is (0.217 + 0.194 + 0.305 + 0.481 + 0.475 + 0.477 + 0.446 + 0.513 + 0.520)/9 = 0.403. This would rank 10th if it stayed this way. 1998 was the warmest at 0.548. The highest ever monthly anomaly was in February of 1998 when it reached 0.756. One has to go back to the 1940s to find the previous time that a Hadcrut3 record was not beaten in 10 years or less.
With the sea surface anomaly for September at 0.453, the average for the first nine months of the year is (0.203 + 0.230 + 0.241 + 0.292 + 0.339 + 0.352 + 0.385 + 0.440 + 0.453)/9 = 0.326. This would rank 10th if it stayed this way. 1998 was the warmest at 0.451. The highest ever monthly anomaly was in August of 1998 when it reached 0.555.
With the RSS anomaly for September at 0.383, the average for the first nine months of the year is (-0.059 -0.122 + 0.072 + 0.331 + 0.232 + 0.338 + 0.291 + 0.255 + 0.383)/9 = 0.191. If the average stayed this way for the rest of the year, its ranking would be 11th. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857.
With the Hadcrut4 anomaly for September at 0.524, the average for the first nine months of the year is (0.288 + 0.209 + 0.339 + 0.514 + 0.516 + 0.501 + 0.469 + 0.529 + 0.524)/9 = 0.432. If the average stayed this way for the rest of the year, its ranking would be virtually tied for 10th. 2010 was the warmest at 0.54. The highest ever monthly anomaly was in January of 2007 when it reached 0.818. The 2011 anomaly at 0.399 puts 2011 in 12th place and the 2008 anomaly of 0.383 puts 2008 in 14th place.
On all six of the above data sets, a record is out of reach.
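For what it is worth, the year-to-date figures above are just straight averages of the monthly anomalies; a quick sketch using the UAH numbers as quoted:

[sourcecode]
# Year-to-date average of monthly anomalies, using the UAH values quoted above
# for January-September 2012.
uah_2012 = [-0.13, -0.13, 0.05, 0.23, 0.18, 0.24, 0.13, 0.20, 0.34]
ytd = sum(uah_2012) / len(uah_2012)
print(round(ytd, 3))  # prints 0.123, matching the figure in the comment
[/sourcecode]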
On all data sets, the period over which the slope is at least very slightly negative ranges from 11 years and 9 months to 15 years and 9 months, but note the * against UAH below.
1. UAH: (*New update not on woodfortrees yet)
2. GISS: since January 2001 or 11 years, 9 months (goes to September)
3. Combination of 4 global temperatures: since November 2000 or 11 years, 10 months (goes to August)
4. HadCrut3: since March 1997 or 15 years, 6 months (goes to August)
5. Sea surface temperatures: since February 1997 or 15 years, 8 months (goes to September)
6. RSS: since January 1997 or 15 years, 9 months (goes to September)
RSS is 189/204 or 92.6% of the way to Santer’s 17 years.
7. Hadcrut4: since December 2000 or 11 years, 10 months (goes to September.)
See the graph below, which shows it all.
http://www.woodfortrees.org/plot/hadcrut3gl/from:1997.16/trend/plot/gistemp/from:2001.0/trend/plot/rss/from:1997.0/trend/plot/wti/from:2000.8/trend/plot/hadsst2gl/from:1997.08/trend/plot/hadcrut4gl/from:2000.9/trend
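For readers who want to check such slopes without woodfortrees, here is a rough sketch of an ordinary least-squares trend on a monthly anomaly series. The short series in the example is invented, so take the method from it, not the numbers.

[sourcecode]
# Hedged sketch: ordinary least-squares slope of a monthly anomaly series,
# expressed in degrees C per decade. The example series is invented.
def slope_per_decade(anomalies):
    n = len(anomalies)
    t = [i / 12.0 for i in range(n)]           # time in years
    t_mean = sum(t) / n
    a_mean = sum(anomalies) / n
    cov = sum((ti - t_mean) * (ai - a_mean) for ti, ai in zip(t, anomalies))
    var = sum((ti - t_mean) ** 2 for ti in t)
    return 10.0 * cov / var                    # degrees C per decade

print(round(slope_per_decade([0.20, 0.25, 0.18, 0.22, 0.19, 0.21, 0.17, 0.20]), 3))
[/sourcecode]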
Thank you Tim Ball !
The oft-quoted two-decimal-point accuracy of the supposed global anomaly is ridiculous – if I did this sort of thing – and I did, because I’m a precision freak – I’d fail the test I was sitting for.
I’ve seen many reporting stations’ equipment in many parts of Queensland, and to give an accuracy of +/-0.5 degrees C is a guess at best.
How the hell does anyone take thousands of readings with at best 0.5 degree accuracy and come up with a global average of 0.55 or whatever ??
It is simply inconceivable that this is “science” !!
Perhaps global wine production makes better thermometers than trees. There was a small article in this morning’s LA Times business section titled “Wine levels worldwide shrinking to 37-year low” that I traced to this link:
http://www.goerie.com/article/20121101/BUSINESS05/311019911/Wine-levels-worldwide-shrinking-to-37-year-low
“Uncooperative weather has damaged grapes worldwide, causing global wine production to shrivel 6.1 percent to its lowest point since 1975, according to a Paris trade group.”
@Graeme M
I know the use of anomalies has been explained before but I have to admit to not ‘getting’ it. What exactly are anomalies?
Put simply, it is the difference between the monthly temperature for a particular year and the average for the same month over a given baseline period.
So, for instance, GISS work on a baseline of 1951-80. The average temperature, for say January, during those years at each station they monitor is compared to January this year. If this year is higher, it is presented as a positive anomaly.
The reason for using anomalies is that temperature changes can be measured, as opposed to absolute values.
So, for instance, if station x has temperature records from 1900-1950, and station y (100 miles away) has records from 1940-2012, you should be able to construct a long-term temperature trend from 1900-2012 for the area, which you could not do just by looking at absolute temps.
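To make that concrete, here is a toy example of the calculation (all numbers invented for illustration):

[sourcecode]
# Toy example of a monthly anomaly: compare this January with the average of
# Januaries over a baseline period. All numbers are invented.
baseline_januaries = [4.1, 3.8, 4.5, 4.0, 3.9]   # January means in the baseline years
baseline_mean = sum(baseline_januaries) / len(baseline_januaries)

this_january = 4.7
anomaly = this_january - baseline_mean
print(round(anomaly, 2))   # positive means warmer than the baseline January
[/sourcecode]

The splicing point above works the same way: each station is expressed relative to its own baseline, so overlapping stations can be joined even though their absolute temperatures differ.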
re: Dr. Tim Ball, 1Nov12 at 10:42 am: — “These numbers are the modern equivalent of the medieval argument about number of angels on the head of a pin.”
I agree. Your point is fundamental. Maybe viral marketing would be a way to communicate the essence of your point in a very brief message. Just a thought …
How the hell does anyone take thousands of readings with at best 0.5 degree accuracy and come up with a global average of 0.55 or whatever ??
The sum of independent errors grows only as the square root of the sample size, so the error on the mean shrinks as 1/sqrt(N). If you measure 1000 data points with individual errors of 0.5, the error on the mean is 0.5 / sqrt(1000), or about 0.016.
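A quick numerical check of that claim, assuming the individual errors really are random and independent (which is exactly the assumption being disputed in this thread):

[sourcecode]
# Monte Carlo check: 1000 readings, each with an independent random error of
# 0.5 (1 sigma), give a mean whose spread is close to 0.5/sqrt(1000) ~ 0.016.
import math
import random

true_value, sigma, n, trials = 10.0, 0.5, 1000, 2000
means = []
for _ in range(trials):
    readings = [true_value + random.gauss(0.0, sigma) for _ in range(n)]
    means.append(sum(readings) / n)

spread = math.sqrt(sum((m - true_value) ** 2 for m in means) / trials)
print(round(spread, 3), round(sigma / math.sqrt(n), 3))  # both around 0.016
[/sourcecode]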
Paul Homewood says: “The reason for using anomalies is that temperature changes can be measured, as opposed to absolute ones.”
The main reason for using anomalies is that it takes out the “average” seasonal variation. This leaves a (mostly) non-seasonal record. Filters would do this better, but a filter needs a window of data to work on and thus cannot run up to the end of the data. (I’m not sure how your plots run up to the end of the year; you probably have not centred the data and thus have a six-month offset in your results.) Anomalies are often preferred since they do run up to the last date available.
Paul Homewood says: ” It therefore seems much more sensible to look at 12 month averages on a monthly basis, rather than wait till December.”
That is a good approach. It would be even better if you used a proper filter instead of a runny mean. Runny means distort the data as much as they filter it. Just look at the amount of sub-annual detail you have on something that you ran a 12m filter on.
Here’s why:
http://oi41.tinypic.com/nevxon.jpg
Runny means are crappy filters and let through large amounts of what you intended to get rid of. Worse, every second lobe is in fact negative. So not only does it get through, it gets inverted!!
NOT A LOT OF PEOPLE KNOW THAT 😉
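For anyone who wants to see those negative lobes in numbers rather than on the graph, here is a small sketch of the frequency response of a 12-month running mean (the standard moving-average gain formula, nothing specific to these datasets):

[sourcecode]
# Gain of a 12-point running mean at various periods (in months). Alternate
# side lobes of this response are negative, i.e. those frequencies come
# through with their sign flipped.
import math

def boxcar_gain(period_months, window=12):
    w = 2.0 * math.pi / period_months             # radians per month
    return math.sin(window * w / 2.0) / (window * math.sin(w / 2.0))

for period in (24, 12, 8, 6, 4.8):
    print(period, round(boxcar_gain(period), 3))
# 12- and 6-month cycles are removed, but an 8-month cycle comes through
# at roughly -0.22, inverted rather than suppressed.
[/sourcecode]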
Here is the UAH TLT data with a clean gaussian filter.
http://i46.tinypic.com/dy0oyb.png
Even the 3m gaussian is smoother than the 12m runny mean. If you look at the 12m filter there is no visible detail on a scale of less than a year.
Since these data are already deseasonalised by being anomalies, a light filter should be enough if it is a proper filter.
If WordPress does not mangle it, here is a simple awk script that will run a gaussian filter:
[sourcecode]
#!/bin/awk -f
# pass input through 3 sigma gaussian filter where sigma, if not given, is 2 data points wide
# usage : ./gauss.awk filename <sigma=2> <scale_factor=1>
# optional scale_factor simply scales the output
# use OFMT="%6.4f"
# sigma can be compared to the period of the -3dB point of the filter
# result is centred, ie not shifted. dataset shortened by half a window at each end
# check whether data is continuous !!
BEGIN { OFMT="%6.4f"
# ARGV[1]=filename; argv[0] is script name, hence ARGC>=1
pi= 3.14159265358979323846
if ( ARGC >3 ) {scaleby=ARGV[3];ARGV[3]=""} else {scaleby=1};
if ( ARGC >2 ) {sigma=ARGV[2];ARGV[2]=""} else {sigma=2};
print "filtering "ARGV[1]" with gaussian of sigma= ",sigma
root2pi_sigma=sqrt(2*pi)*sigma;
two_sig_sqr=2.0*sigma*sigma;
gw=3*sigma-1; # gauss is approx zero at 3 sigma, use 3 sig window
# eg. window=2*gw+1 : 5 pts for sigma=1; 11 pts for sigma=2; 17 pts for sigma=3
# calculate normalised gaussian coeffs
for (tot_wt=j=0;j<=gw;j++) {tot_wt+=gwt[-j]=gwt[j]=exp(-j*j/two_sig_sqr)/ root2pi_sigma};
tot_wt=2*tot_wt-gwt[0];
tot_wt/=scaleby;
for (j=-gw;j<=gw;j++) {gwt[j]/=tot_wt};
# strip off last .xxx part of file name
# improve this (doesn’t work with paths like ../filename)
split(ARGV[1],fn,".");
basename=fn[1]
gsfile=basename"-gauss"sigma".dat";
print "# ",gsfile >gsfile;
ln=-1;
}
($0 !~ /^#/)&&($0 != ""){
xdata[++ln]=$1;
ydata[ln]=$2;
if (ln>2*gw)
{
gauss=0
for (j=-2*gw;j<=0;j++) {gauss+=ydata[ln+j]*gwt[j+gw]}
print NR,xdata[ln-gw],gauss
print xdata[ln-gw],gauss >> gsfile;
}
else
{
# print $1,$2;
}
}
END {
print "#gausssian window width = "gw+gw+1",done"
print "#output file = "gsfile
}
[/sourcecode]
[Looks like WordPress mangled it. Sorry. — mod.]
Joe Bastardi estimated that the UK Met Office HadCRUT 3/4 was about 0.2°C higher than the satellite values because of the averaging period. That makes the UAH, RSS and CRU anomalies about the same.
[Looks like WordPress mangled it. Sorry. — mod.]
No, it looks alright. If you hover over the code text, some flash gadgets pop up and you can click on “view source”. This seems to be an intact copy, just scanning by eye.
awk (or gawk or nawk…) is available on all major platforms, so anyone capable of using a computer beyond just reading MSN and Facebook should be able to use it.
@Stephen Richards
Joe Bastardi estimated that the UK Met Office HadCRUT 3/4 was about 0.2°C higher than the satellite values because of the averaging period. That makes the UAH, RSS and CRU anomalies about the same.
Yes, you cannot directly compare the four sets, as they all have different baselines.
RSS – 1979-98
HADCRUT – 1961-90
UAH – 1981-2010
GISS – 1951-80
If you compare the current 12-month averages with Dec 1979, you get an increase in temps of :-
RSS + 0.25C
HADCRUT4 + 0.36C
UAH + 0.28C
GISS + 0.40C
(NB HADCRUT3 was 0.31 a month ago)
(So 37% of the warming claimed by GISS since 1979 is not reflected by RSS)
I’ll run some figures with them all on a 1981-2010 baseline next time.
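In the meantime, here is a rough sketch of what that re-baselining involves, assuming you have the full series for each set; the numbers below are invented, and the point is only that each series has its own mean over the chosen period subtracted.

[sourcecode]
# Hedged sketch: shift an anomaly series onto a common baseline by subtracting
# its own mean over that period. Values are invented for illustration.
def rebaseline(series_by_year, base_start, base_end):
    base = [v for y, v in series_by_year.items() if base_start <= y <= base_end]
    offset = sum(base) / len(base)
    return {y: round(v - offset, 3) for y, v in series_by_year.items()}

example = {1981: 0.32, 1990: 0.43, 2000: 0.40, 2010: 0.71}   # invented anomalies
print(rebaseline(example, 1981, 2010))
[/sourcecode]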
Rosco says:
“How the hell does anyone take thousands of readings with at best 0.5 degree accuracy and come up with a global average of 0.55 or whatever ??”
Simple, plug the temperatures into a calculator and divide by the number of observations.
Using a basic calculator, you can get precision to 8 to 10 decimal places by completely ignoring the concept of “significant figures”.
Well now, to me (as a layman), this Climastrology thing is no more than a joke. After looking at these so-called ‘graphs’, with no data points, I see the scam that they are using.
Anyone who has used an oscilloscope knows that you can expand or compress the amplitude (y axis) and the timebase (x axis) to suit your needs. Think about it: stretch the amplitude and you can make 0.00000001 degrees Centigrade look scary; compress the timebase, scarier still. Reverse the controls, drop the amplitude, expand the timebase, and there is nothing to see but a wavy or straight line (depending on background hum).
Well, I hope anyone in electronics can see where I’m coming from on this climate ‘scam’.
Can we start up a scare story about how a voltage variation of 5 V on the grid will cause a catastrophe? Or 0.5 V, or 0.05 V? You know where I’m coming from on this B.S.
Electronic engineers can become the new world saviours; jump on the bandwagon now.
Why not? Everyone else is. SHOW YOUR GRAPHS.
@Rosco says:
>How the hell does anyone take thousands of readings with at best 0.5 degree accuracy and come up with a global average of 0.55 or whatever ??
As my brother the historian used to say, it is not quite as simple [to dismiss it] as that. The precision cannot be improved, that is true, so there are error bars of a fixed size above and below. However, the confidence about where the middle point is located can be improved by having a larger number of readings. The location is the accuracy; the error bars are the precision. They are different. The location of the middle point can be assigned a confidence level (like 95%). If the deviation is less than the precision, there is no statistically significant difference between readings. Basically, that is what the ‘no warming in 16 years’ message contains.
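A small sketch of that last point, with invented readings: two sets of noisy measurements only differ “significantly” if the gap between their means is bigger than the combined uncertainty of those means.

[sourcecode]
# Invented readings: the gap between the two means is compared with roughly
# twice the combined standard error, the usual 95% yardstick.
import math

def mean_and_se(readings):
    n = len(readings)
    mean = sum(readings) / n
    var = sum((r - mean) ** 2 for r in readings) / (n - 1)
    return mean, math.sqrt(var / n)            # standard error of the mean

a = [10.0, 10.4, 9.7, 10.2, 9.9, 10.1, 10.3, 9.8]
b = [10.1, 10.5, 9.9, 10.4, 10.0, 10.2, 10.6, 9.9]
(ma, sa), (mb, sb) = mean_and_se(a), mean_and_se(b)
gap, yardstick = mb - ma, 1.96 * math.sqrt(sa ** 2 + sb ** 2)
print(round(gap, 3), round(yardstick, 3), gap > yardstick)
[/sourcecode]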
It’s AGC – Anthropogenic Global Calming. Over the long term, humans have been moderating the weather and climate. Sandy is a failure of that process, probably caused by excessive pollution controls, and reductions in CO2 output. Such policies should be reversed, forthwith! Bring back that old time AGC!!
Tim Ball says:
November 1, 2012 at 10:42 am
These numbers are the modern equivalent of the medieval argument about number of angels on the head of a pin.
There was no medieval argument about the number of angels on the head of a pin. It was invented in the 1800s as either sarcasm or slander (I don’t know which) against the Catholic Church.