A challenge to Steig et al, on Antarctic warming

Jeff Id of the Air Vent has offered me this study he just completed for the consideration of WUWT readers. Given the wide audience we have here, I’m sure it will get a rigorous review and inspection.  Let’s see how well it stands up. – Anthony


Closest Station Antarctic Reconstruction

In my last alternate reconstruction of Antarctic temperature I used the covariance of satellite information to weight surface stations. While that reconstruction was reasonable, I found that it distributed the trends too far from the stations. This prompted me to look for a way to weight stations by area as well as I can. The algorithm I employed uses only surface station data laid onto the 5509 grid cell locations of the Steig satellite reconstruction.

This new reconstruction was designed to provide the best possible correlation versus distance and the best possible area weighting of trend. It doesn't make a good-looking picture, but for the first time we can see the spatial limitations of the data. The idea was to manipulate the data as little as possible, so that where the trend comes from is as clear, simple and properly weighted as possible.

The algorithm I came up with works like this.

Calculate the distance from each of the 42 surface stations to the 5509 satellite grid points and store the results in a 5509 x 42 matrix.

For each of the 5509 points, find the closest station and copy its data to that location. If there are missing values, infill them from the next closest station, looking farther and farther out until all NAs are filled.

This is what the spatial distribution of trends looks like.

id-recon-spatial-trend-by-distance-weight-1956-2006

Figure 1

You can see how the trends from each surface station are copied to the points of its polygon. There's quite a bit of noise in the graph, but similar temperatures seem to be grouped together reasonably well.

The code for the plot above takes about 20 minutes to run.

#calc distance from surface stations to sat grid points
dist = array(0, dim = c(5509, 42))
for (i in 1:42)
{
  dist[, i] = circledist(lat1 = Info$surface$lat[i], lon1 = Info$surface$lon[i],
                         lat2 = sat_coord[, 2], lon2 = sat_coord[, 1])
}

Circledist is Steve McIntyre's great circle distance function with slight modifications.

circledist = function(lat1, lon1, lat2, lon2, R = 6372.795)
{
  pi180 = pi / 180
  # wrap the longitude difference into [0, 180]
  y = abs(lon2 - lon1)
  y[y > 180] = 360 - y[y > 180]
  y[y <= -180] = 360 + y[y <= -180]
  delta = y * pi180
  fromlat = lat1 * pi180
  tolat = lat2 * pi180
  # haversine formula for the central angle, then arc length
  theta = 2 * asin(sqrt(sin((tolat - fromlat) / 2)^2 +
                        cos(tolat) * cos(fromlat) * (sin(delta / 2))^2))
  circledist = R * theta
  circledist
}
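A quick sanity check on the function (my illustration, not part of the original post): two points on the same meridian separated by 10 degrees of latitude should be roughly R * 10 * pi/180, or about 1112 km, apart.

```r
# sanity check: 10 degrees of latitude along one meridian
# should be about 6372.795 * 10 * pi/180 ~= 1112 km
circledist(lat1 = -90, lon1 = 0, lat2 = -80, lon2 = 0)
```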

Then I wrote a function that returns the closest station at a distance greater than a value 'mindist' I pass. On the first call for grid cell 'ind', mindist is set to zero and the closest station is returned. If the closest station has missing data, I infill what it does have, pass the distance to that closest station as mindist, and get the second closest station on the next call. The process is repeated until all values are filled.

getnextclosestdistance = function(ind = 0, mindist = 0)
{
  tdist = dist[ind, ]
  # drop every station at or within mindist of this grid cell
  while (min(tdist) <= mindist)
  {
    tdist = tdist[-(which(tdist == min(tdist), arr.ind = TRUE)[1])]
  }
  # return the column index of the closest remaining station
  g = which(dist[ind, ] == min(tdist), arr.ind = TRUE)[1]
  g
}

This is the loop that fills the array.

recon = array(NA, dim = c(600, 5509))
recon = ts(recon, start = 1957, deltat = 1/12)

for (i in 1:5509)
{
  lastdist = 0
  # keep pulling from the next closest station until no NA's remain
  while (sum(is.na(recon[, i])) > 0)
  {
    dd = getnextclosestdistance(i, mindist = lastdist)
    lastdist = dist[i, dd]
    mask = is.na(recon[, i])
    recon[mask, i] = anomalies$surface[mask, dd]
    print(paste(i, lastdist))
  }
}
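For reference, the continent-wide number behind Figure 2 can be obtained by averaging the 5509 cells into one monthly series and fitting a line. This is a minimal sketch of that idea, not the plotting code credited below, and it assumes equal weighting of the grid cells:

```r
# average all 5509 grid cells into a single monthly anomaly series
meanseries = ts(rowMeans(recon, na.rm = TRUE), start = 1957, deltat = 1/12)

# least-squares trend; the slope comes out in C/year
fit = lm(meanseries ~ time(meanseries))
trend = coef(fit)[2] * 10  # convert to C/decade
```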

After that, all that's left is the plotting code by RomanM, SteveM and Jeff C, which I've shown before.

The next graph is the trend calculated from all 5509 grid points.

id-recon-total-trend-by-distance

Figure 2

The trend is again positive, at 0.052 C/decade; this time it is at the outer edge of the stated 95% confidence interval of Steig09, 0.12 +/- 0.07 C/decade.

Like before, I also looked at the trend from 1967 – 2007.

id-recon-spatial-trend-by-distance-weight-1967-2006

Figure 3

id-recon-trend-closest-station-1967-2007

Figure 4

So from this reconstruction, temperatures have dropped since 1967 at an average rate of 0.031 C/decade. These results are similar to my previous reconstruction, which looks like this.

The Antarctic, an engineer's reconstruction.

id-recon-total-trend

Figure 5

id-recon-spatial-trend-1956-2006

Figure 6

And from 1967 – 2007

id-recon-trend-1967-2007

Figure 7

id-recon-spatial-trend-1967-2006

Figure 8

While I was initially happy with the engineer's reconstruction, I found that station trends were not well localized by linear correlation weighting (the correlation vs distance was poor). While peninsula station information stayed localized, trends from the rest of the continent spread widely.

The trends shown match my last reconstruction reasonably well, but in my opinion these are of superior quality.

Certainly Antarctic temperatures have been flat, or cooling/warming insignificantly, over the last 40 years, while lower temperatures recorded 50 years ago produce a very slight upslope in the 50 year trend. This is consistent with the growth in sea ice over the last 30 years, among other observations.

The Steig 09 result seems to be an artifact of the mathematics more than an actual trend. Amundsen-Scott is the South Pole station; its surface record is visually clean and shows a downtrend over the full length of the data. This cooling is represented by the blue polygon at the center of the Antarctic in this reconstruction.

TCO keeps asking whether I'll post a trend higher than Steig's. Every reconstruction I've done has reduced the trend relative to Steig 09. Every change, no matter how small, has resulted in a trend reduction, and even my attempt to match Steig 09 produced a slight reduction. I'll say it now for the first time: in my opinion the paper is flawed, with an exaggerated warming trend due to bad mathematics. The temperature distributions on the continent are a result of artifacts in RegEM and are not supported by the natural weather patterns as they were presented.

Here is a clear example. Steig's paper shows warming across the entire Antarctic, yet here is a plot of the ground data at the South Pole.

south-pole-temp-1957-2007

Figure 9

A reconstruction cannot ignore a trend this strong.  So TCO, it isn’t up to me. As Gavin likes to say, the data is the data. This data just cannot support Steig’s conclusions.

E.M.Smith
Editor
April 12, 2009 9:53 pm

Jack Century (19:49:42) : most oilpatch professionals are rigid in their denial positions. The question is, WHY ?
Perhaps because they have had to be right during their careers or millions of dollars would be wasted on dry holes; and that means they were careful and competent scientists. Secondarily, geology gives you the perspective to understand that the Holocene stability is a fluke in the global history of climate and that there isn’t a darned thing people can do to influence the earth and its changes. They have a clue and they’re willing to use it…

April 12, 2009 10:02 pm

Jeff: First of all, good work and thank you for the considerable effort you and the others you mention have put in on this topic. Have you been in touch with Steig et al about this? Have you had any response or is it all quiet? Will they be in denial mode and hope it all goes away?

April 12, 2009 10:06 pm

Several reasons why deniers of anthropogenic global warming are so rigid and uninformed are: (1) they have little understanding of the differences between geologic and human time scales and processes, (2) Svante Arrhenius proved over 100 years ago that CO2 changes in the atmosphere are the driving mechanisms of global temperature changes and (3) Norman Newell proved in 1987 there is a near perfect statistical correlation of 0.9985 between CO2 measured in the atmosphere on top of Mauna Loa and the growth of human population, rigorously calculated by others. There are no current, natural phenomena that come remotely close to Newell’s statistical proof. We are now living in the Anthropocene Epoch as defined by the Geological Society of America. Period. Full stop. End of discussion.

John F. Hultquist
April 12, 2009 10:56 pm

Anthony, no need to post this but Jack’s post and your response sent me looking for information. You probably already know this:
From:
http://www.rpsgroup.com/Canada/News/RPS-Canada-hosts-CSPG-Heavy-Oil.aspx
In 2008 (Jack) Century became an official Presenter of An Inconvenient Truth along with 30 Albertans selected for a weekend of training by TCP-Canada.

Matt Bennett
April 12, 2009 11:29 pm

They’re a bit like that round here Jack. Never ones to let reality get in the way of a good pre-determined position. The fact that all the top climatologists in the world have findings that go against their viewpoint generally amounts to nought.
Oh, and a seismic shift in energy policy to renewables (which any sensible visitor from elsewhere in the galaxy would undoubtedly recommend as mandatory for prolonged planetary survival) necessarily means a collapse of life as we know it, according to them. This despite the fact that evolution’s long since worked that way and despite the findings of multiple economic reviews which point to the longterm savings achieved by going this route.
But good luck to ya!!
REPLY: “all” the top climatologists in the world?
Gosh, and all this time I thought Dr’s Pielke Sr., Pielke Jr., Spencer, Christy, Lindzen, and several others I know (but you probably wouldn’t) were skeptical climatologists.
But we would not want to “let that get in the way of a good pre-determined position”. Matt are you one of Gore’s presenters too? If not, perhaps you should sign up.
Cheers!
– Anthony

F. Ross
April 12, 2009 11:29 pm

In a previous WUWT review of Steig wasn’t there considerable question of the validity of some of the temperature measurements because of buried [snow] sensors and other possibly faulty data sources?
If so, were these issues ever successfully resolved or at least taken into account?
If not, would this affect the current study?

Fluffy Clouds (Tim L)
April 12, 2009 11:45 pm

wattsupwiththat anthony, to soon to post
typo/error here 0.31 C/Decade thats 3c per 100 years
the chart looks like .03
So from this reconstruction temperatures have dropped since 1967 at an average rate of 0.31 C/Decade. These results are similar to my previous reconstruction which looks like this.
any how Jeff how about weighting the peninsula less as well do to it’s being smaller than the rest of the Antarctic?
let the good times role
leave out the R code tooo lol

Jerry
April 13, 2009 12:00 am

Jeff,
Good stuff. I must agree with CodeTech, though:
CodeTech (18:53:44) : “I hate straight lines…They always will remind me of Homer Simpson buying Pumpkin futures… they were going up and up (just before halloween), and ‘if this trend continues, we’ll be rich in 2 years!’ Nature doesn’t make straight lines.
A straight trend line is invariably inappropriate on such graphs as we have for temperature records. They quite obviously display temporal variation on all scales and the lines are normally drawn to the ends of the data, giving far too much influence to the chosen end-points. Such trend lines are statistically illiterate.

Manfred
April 13, 2009 12:15 am

“So from this reconstruction temperatures have dropped since 1967 at an average rate of 0.31 C/Decade”
“Certainly the Antarctic temperatures have been flat or insignificantly cooling/warming in general for the last 40 years.”
i wouldn’t call a downtrend of 3.1C/century flat or insignificant.
i think it is also quite unpleasant, that the poorest junkscience of mann, steig and altri received the highest publicity.
but i think it has been analyzed that this is a general defunction of mass media, not only related to fields as primitive as climate science.

AlanG
April 13, 2009 12:22 am

So, you get a warming trend if you cherry pick a start date of 1957 and a cooling trend by starting in 1967 instead. As fine an example of cherry picking as you’ll get. Of course the temperature change from 1957 to 1967 makes no contribution to any period outside that date range so a straight line plot is probably not appropriate here. It’s also impossible to reconcile the south pole temperature chart in Figure 9 with the red in Figure 6, but never mind.
I’m going to run an idea past people here, including Anthony, that looking for trends in daily data over a time period is flawed. The temperature on 1st January doesn’t contribute to the temperature on 1st July in the same year or in any year. Instead of averaging daily temperatures we should be averaging the CHANGE in temperatures from 1st Jan to 1st Jan, 2nd Jan to 2nd Jan and so on for all the days in the year. In other words we should be looking for a DISTRIBUTION of temperature change.
This might be important when Anthony publishes his analysis of temperature changes from the best of the US surface stations. If you find the same signature from a small number of widely scattered, top quality, stations then that signature is the real deal.

Dave Wendt
April 13, 2009 12:30 am

Jeff Id:
Another impressive piece of work, but it seems to me that you are wasting a fair bit of your valuable time trying to construct rational challenges to Steig et al when the only logical response to that tripe is a universal derisive horselaugh. From the numbers I’ve seen the land area of Antarctica is 75% greater than the area of the continental US. If Steig et al had selected the continental US for their subject, grabbed data from south Florida and the Keys, amalgamated it with data from a handful of least well sited stations from Anthony’s Surfacestations project from the other 47 states, reduced it to three PCs and published their projections for the long term temperature trend of the nation they’d have been laughed out the room. Admittedly, the temps down there are much less variable than here in the States, but neglecting the mammoth scale of the place gives much more respect to the possibility of the task they were attempting,even if they used the best available methods, than it deserves. I would suggest amending the graphics for your post to include a to scale outline of the lower 48 so folks are reminded exactly how large the cells you assembled really are.

Dave Wendt
April 13, 2009 12:46 am

Sorry, I missed John Hultquist’s comment that already addressed my point, but it’s nice to know I’m not the only one.

Alan B
April 13, 2009 12:48 am

I feel daunted by all that has gone before but can I ask a simple question?
There is a well established statistical process for interpolation and smoothing of data used by geologists and other earth scientists called the semi-variogram, often followed by kriging to allow the plotting of contours. I used it myself to try to assess a multitude of data points on a cylindrical object.
IIRC the semi-variogram stage gives an idea of how far away from each other points need to be to still have a correlation. Is this not something like people are doing for the Antarctic data?
No doubt this is all old hat …

Malcolm
April 13, 2009 12:57 am

The rules for Mannian data reconstructions are;
1. Pick the proxies that give the required result.
2. Pick the methodologies that give the required result.
3. Utilise group think when referencing authors and when dealing with the peer review process.
4. Don’t archive data sets.
5. Prevent publication of the methodologies.
6. Deny all wrong doing.

AlanG
April 13, 2009 1:30 am

John F. Hultquist (21:37:04) Well spotted sir …a majority [of surface stations] were along or near the coast with only a few at higher elevations in the interior. One source has the highest elevation there as 4,897 meters…
Of course any infilling or interpolation of temperatures should be altitude adjusted. Ice in Antarctica at 4,897 meters is never going to melt. Ever.

Robert Bateman
April 13, 2009 2:05 am

I have often wondered when looking at Antarctic stations, why they don’t build permanent camps by drilling into some of those mountains that are bare rock.

jmrSudbury
April 13, 2009 2:26 am

I agree with Fluffy Clouds (Tim L) (23:45:35) who wrote about the 0.31 number. In the graph, the slope is labeled 0.031 C/dec, but the text below increases that by a factor of 10. That 0.31 should be per century. — John M Reynolds

April 13, 2009 2:57 am

.
>>Oh, and a seismic shift in energy policy to renewables
Unfortunately, it is renewable energy that will destroy nations, not Global Warming (or cooling).
It is an undeniable fact that all renewable energy is intermittent. It is another undeniable fact that there is no viable storage medium that can bridge the energy outage gaps. And THIS is what happens when the electrical supply goes off:
http://en.wikipedia.org/wiki/2003_North_America_blackout
As I have often said before, the Green agenda is to take us all back to the Dark Ages (literally), an era that could only support a world population of a few tens of millions.
.

Perry Debell
April 13, 2009 3:13 am

To Matt Bennett and Jack Century,
In two years time, global temperatures will be lower than they are now. The data do seem to indicate that humans are to be faced with suboptimal climate conditions for growing food.
Which also means that in spite of your beliefs, CO2 emissions will be shown NOT to be the unmitigated disaster for humans that you are convinced that they are. In fact, we need to give plants as much CO2 as we can, because we need to eat to live.
Thus Jack, in an incredibly short geologic time, the flawed works of both Svante Arrhenius and Norman Newell will be consigned to the “department of unusually stupid ideas”.
Now, I know that you two chaps will resist acknowledging, the reality that is just about 700 days away. The point is that whatever you both believe, will affect neither the weather, not the climate. They are independent of human requirements. They both changeable and rather like stock markets, the figures must rise and fall. Like the credit crunch that financial market computer modelling did not foresee, the decline in global temperatures was not predicted by computers models used by AGWarmists. What you believe will not make it so. It’s getting colder and if we are really unlucky, it’s going to be very much colder.
If only humans could make the planet warmer.

Perry Debell
April 13, 2009 3:15 am

Nor, not “not” in line 14.

April 13, 2009 3:46 am

Thanks to everyone for the nice comments. I think it’s pretty telling when Anthony can present simplified reconstructions which demonstrate a 50 yr warming trend here without any adverse reaction from a crowd of skeptics.
There are several comments about fitting a line to the data, I agree this data is much better represented with higher order curves. The line only has the purpose of showing a general trend for comparison to the trend of Steig 09. As most people here know the trend is sensitive to the ends of the curves.
Dave Wendt said,
“Another impressive piece of work, but it seems to me that you are wasting a fair bit of your valuable time trying to construct rational challenges”
Actually, I didn’t know if the paper would be quickly verified and the Antarctic was warming, I don’t have the experience in this subject Anthony and some of the others here do so it’s an exploration of the data for me as much as anything. Now that I understand the paper, the big problem I have with it is the lack of verification by Steig’s coauthors that trends were appropriately distributed across the continent.
steptoe fan,
The language is R, it’s freeware statistical software. Actually all the variable initialization (minus the download) is in the code as used. It doesn’t have formal declaration lists like C or other software and the variables are actually vectors and matrices. I’ve got several readers of the Air Vent who regularly run their own code to see how things work. If you’re interested in running some software, drop a request on my blog.
http://noconsensus.wordpress.com/2009/04/12/closest-station-antarctic-reconstruction/
I’ll stop back in a few hours to answer more of the questions.

Larry Sheldon
April 13, 2009 3:57 am

John F. Hultquist made mention of Mt. Whitney and suddenly dawn on me what it is that has been bothering me about all this (re)construction, infilling, and other manipulations (that tend to give flesh to them notion of “imaginary” numbers).
It is this (by illustration) that bothers me about all of the methods I have seen discussed.
Suppose you had good solid observations from a weather station at Mount Whitney, California. And suppose that you had equally good observations for Furnace Creek, California.
Now, suppose you needed to make predictions, using those records, for Panamint Springs, for which you have incomplete and perhaps poorly measured data.
Do you use one? The other? The averages of the two?
Being a non-scientist, non-engineer it looks to me like the correct answer is to place the problem in the “Too Hard” box.
And if you absolutely must do the problem, use the Panamint data, perhaps discarding data that exceeds the limits established from the other two sets.

Larry Sheldon
April 13, 2009 3:59 am

I wish the typos would appear when I proof read–honest, I did.
John F. Hultquist made mention of Mt. Whitney and suddenly IT dawnED on me….