Jeff Id of the Air Vent has offered me this study he just completed for the consideration of WUWT readers. Given the wide audience we have here, I’m sure it will get a rigorous review and inspection. Let’s see how well it stands up. – Anthony
Closest Station Antarctic Reconstruction
In my last alternate reconstruction of Antarctic temperature I used the covariance of satellite information to weight surface stations. While that reconstruction was reasonable, I found that it distributed the trends too far from the stations. This prompted me to think of a way to weight stations by area as best I can. The algorithm I employed uses only surface station data laid on the 5509 grid cell locations of the Steig satellite reconstruction.
This new reconstruction was designed to give as good a correlation vs. distance as possible and the best possible area weighting of trend. It can’t make a good-looking picture, but for the first time we can see the spatial limitations of the data. The idea was to manipulate the data as little as possible, so that where the trend comes from is as clear, simple and properly weighted as possible.
The algorithm I came up with works like this:
1. Calculate the distance from each of the 42 surface stations to each of the 5509 satellite grid points and store the results in a 5509 x 42 matrix.
2. For each of the 5509 points, find the closest station and copy its data to that location. If there are missing values, infill those from the next closest station, looking farther and farther out until all NA’s are infilled.
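Before the full script, here is how those two steps behave on made-up numbers (a toy sketch of mine; the station values and distances are invented, and sorting each row once with order() stands in for the repeated closest-station search):

```r
# 3 stations, 4 grid cells, 5 monthly values, with a couple of missing readings
station_anom = cbind(s1 = c(0.1, 0.2, NA, 0.4, 0.5),
                     s2 = c(1.0, 1.1, 1.2, 1.3, 1.4),
                     s3 = c(-0.5, NA, -0.3, -0.2, -0.1))
# toy distance matrix: rows = grid cells, cols = stations (invented km values)
dists = rbind(c( 10, 500, 900),
              c(480,  20, 700),
              c(940, 710,  30),
              c(300, 310, 320))

recon = matrix(NA, nrow = 5, ncol = 4)
for (i in 1:4)
{
  ord = order(dists[i, ])          # stations from closest to farthest
  for (s in ord)                   # copy the closest first, patch gaps from the next
  {
    mask = is.na(recon[, i])
    if (!any(mask)) break
    recon[mask, i] = station_anom[mask, s]
  }
}
```

Each grid cell ends up carrying its closest station’s series, with gaps patched from the next closest, which is exactly what the 5509-cell version below does at scale.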
This is what the spatial distribution of trends looks like.
You can see how the trends are copied to the points of each polygon from each surface station. There’s quite a bit of noise in the graph, but similar temperatures seem to be grouped together reasonably well.
The code for the above plot takes about 20 minutes to run.
#calc distance from each surface station to every sat grid point
dist = array(0, dim = c(5509, 42))   # rows = grid cells, cols = stations
for (i in 1:42)
{
  dist[, i] = circledist(lat1 = Info$surface$lat[i], lon1 = Info$surface$lon[i],
                         lat2 = sat_coord[, 2], lon2 = sat_coord[, 1])
}
Circledist is Steve McIntyre’s great-circle function with slight modifications.
circledist = function(lat1, lon1, lat2, lon2, R = 6372.795)
{
  pi180 = pi/180
  y = abs(lon2 - lon1)
  y[y > 180] = 360 - y[y > 180]   # always take the short way around
  delta = y * pi180
  fromlat = lat1 * pi180
  tolat = lat2 * pi180
  # haversine great-circle angle between the two points
  theta = 2 * asin(sqrt(sin((tolat - fromlat)/2)^2 +
                        cos(tolat) * cos(fromlat) * sin(delta/2)^2))
  R * theta   # distance in km
}
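As a quick sanity check of the distance function (my addition, with circledist restated so the snippet runs on its own), a quarter and a half of a great circle along the equator should come out to (pi/2)*R and pi*R km:

```r
# circledist restated (a trimmed copy of the function above) so this check is standalone
circledist = function(lat1, lon1, lat2, lon2, R = 6372.795)
{
  pi180 = pi/180
  y = abs(lon2 - lon1)
  y[y > 180] = 360 - y[y > 180]
  delta = y * pi180
  fromlat = lat1 * pi180
  tolat = lat2 * pi180
  theta = 2 * asin(sqrt(sin((tolat - fromlat)/2)^2 +
                        cos(tolat) * cos(fromlat) * sin(delta/2)^2))
  R * theta
}
quarter = circledist(0, 0, 0, 90)    # a quarter of the equator
half    = circledist(0, 0, 0, 180)   # antipodal points on the equator
```

Both come out as exact fractions of the circumference of a sphere with R = 6372.795 km, which is what the formula should give.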
Then I wrote a function to get the closest distance greater than a value ‘mindist’ I pass. On the first call for grid number ‘ind’, mindist is set to zero and the closest station is returned. If the closest station has missing data, I infill what it does have, then pass the distance to that closest station as mindist and get the second closest station returned. The process is repeated until all values are filled.
getnextclosestdistance = function(ind = 0, mindist = 0)
{
  tdist = dist[ind, ]
  # drop every station at a distance of mindist or less
  while (min(tdist) <= mindist)
  {
    tdist = tdist[-(which(tdist == min(tdist), arr.ind = TRUE)[1])]
  }
  # index of the closest remaining station
  g = which(dist[ind, ] == min(tdist), arr.ind = TRUE)[1]
  g
}
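A side note on the design: the same “next closest” lookup can be had without repeatedly deleting the minimum, by sorting each row of the distance matrix once with order(). A sketch on a made-up distance row (my illustration, not part of the script above):

```r
# One toy row of a distance matrix: one grid cell to 4 stations (invented km values)
drow = c(250, 40, 990, 310)
ord = order(drow)   # station indices sorted from closest to farthest
# ord[1] is the closest station, ord[2] the next closest, and so on
```

Walking down ord gives the same sequence of stations that repeated calls to getnextclosestdistance produce, at the cost of a single sort per grid cell.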
This is the loop function that fills the array.
recon = array(NA, dim = c(600, 5509))
recon = ts(recon, start = 1957, deltat = 1/12)
for (i in 1:5509)
{
  lastdist = 0
  while (sum(is.na(recon[, i])) > 0)
  {
    dd = getnextclosestdistance(i, mindist = lastdist)   # next closest station
    lastdist = dist[i, dd]
    mask = is.na(recon[, i])
    recon[mask, i] = anomalies$surface[mask, dd]         # infill the remaining NA's
    print(paste(i, lastdist))
  }
}
After that, all that’s left is the plotting algorithms by RomanM, SteveM and Jeff C, which I’ve shown before.
The next graph is the trend calculated from all 5509 grid points.
The trend is again positive, at 0.052 C/decade; this time it is on the outer edge of Steig09’s stated 95% confidence interval of 0.12 +/- 0.07 C/decade.
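For readers who want to reproduce the trend arithmetic, a continent-wide C/decade figure falls out of a recon matrix by averaging the grid cells into one monthly series and regressing it on time. A self-contained sketch on synthetic data with a known slope (my sketch, not the real data; the actual script runs the same regression over all 5509 columns):

```r
# Synthetic stand-in for the reconstruction matrix:
# 600 months x 10 grid cells built around a known slope of 0.05 C/decade
set.seed(1)
months = 600
ncell  = 10
time_yrs = 1957 + (0:(months - 1))/12
true_slope = 0.005                      # C per year, i.e. 0.05 C/decade
offsets = runif(ncell, -1, 1)           # a constant offset per grid cell
recon = outer(true_slope * (time_yrs - 1957), rep(1, ncell)) +
        matrix(rep(offsets, each = months), months, ncell)

mean_anom = rowMeans(recon)             # continent-wide average series
fit = lm(mean_anom ~ time_yrs)          # least-squares trend
trend_per_decade = unname(coef(fit)[2] * 10)
```

Because the synthetic series is exactly linear plus per-cell offsets, the regression recovers the 0.05 C/decade slope it was built from.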
Like before I also looked at the trend from 1967 – 2007.
So from this reconstruction, temperatures have dropped since 1967 at an average rate of 0.31 C/decade. These results are similar to my previous reconstruction, which looks like this.
The Antarctic, an engineer’s reconstruction.
And from 1967 – 2007
While I was initially happy with the engineer’s reconstruction, I found that station trends were not well localized by linear correlation weighting (the correlation vs. distance was not good). While peninsula station information stayed localized, the rest of the continent’s stations spread their influence widely.
The trends shown match my last reconstruction reasonably well but in my opinion these are of superior quality.
Certainly Antarctic temperatures have been flat, or cooling/warming insignificantly, in general for the last 40 years, while 50 years ago lower temperatures were recorded, producing a very slight upslope in the 50-year trend. This is consistent with the fact that sea ice has grown during the last 30 years, among other observations.
The Steig 09 result seems to be an artifact of the mathematics more than an actual trend. Amundsen-Scott is the South Pole station. Its surface measurement is visually clean and shows a downtrend for the full length of the record. This cooling is represented by the blue polygon in the center of the Antarctic in this reconstruction.
TCO keeps asking me if I’ll post a trend higher than Steig. Every reconstruction I’ve done has reduced the trend from Steig 09. Every change no matter how small has resulted in a trend reduction from Steig 09, even the attempt to match Steig 09 has resulted in a slight trend reduction. I’ll say it now for the first time. In my opinion the paper is flawed and has an exaggerated warming trend due to bad mathematics. Temperature distributions on the continent are a result of artifacts in RegEM and not supported by the natural weather patterns as they were presented.
As a clear example: Steig’s paper shows warming across the entire Antarctic. Here’s a plot of the ground data at the South Pole.
A reconstruction cannot ignore a trend this strong. So TCO, it isn’t up to me. As Gavin likes to say, the data is the data. This data just cannot support Steig’s conclusions.
OT but…..
Found this article by Dr. Roy Spencer:
http://www.drroyspencer.com/2009/04/a-global-warming-cookbook-what-causes-temperature-to-change/
It’s very basic and it will be of interest to the Liberal Arts majors amongst us.
Regards,
Steamboat Jack
Since the Steig paper came out, I have wondered why no one (for example, the anonymous reviewers for the journal) has called for a test of this method on another continent, where the answer is known. Why not try the Steig method on, say, North America (see Dave Wendt (00:30:04)) or on Australia? Take a sparse collection of readings, mostly around the periphery, and see how easy it is to nail the temperature trend for the whole continent going back a few decades. I suppose someone might argue that the method can only work for Antarctica because it is relatively homogenous. I don’t buy it: the temperatures there vary tremendously both spatially and seasonally.
I think this is the best way to average the temperatures using basic interpolation. This method is very similar to the gridding pattern Reservoir Simulators use: breaking up the rock volume, or in this case the atmosphere, into cells is the right way to handle it. Some way of estimating the microclimates or micro weather due to elevation and/or proximity to mountain ranges and/or the ocean would need to be considered. All in all, great work if you are just looking for a delta temperature over time, assuming that these microclimates would average out and probably account for the variability shown in the range of data. Nice work.
On a slightly less technical note, can someone explain how the West Antarctic Ice Sheet (WAIS) got its name? What is it ‘West’ of?
From the South Pole, everything is North.
East of the WAIS is either sea or more Antarctic.
Good stuff Jeff Id as usual.
The most important point is that every variation of the analysis you did had a lower warming trend than Steig (et al et Mann).
That means just like in all the temperature reconstructions done by the pro-AGW crowd, every assumption, every adjustment, every little innocent wrong-station data error, is structured to Maximize the warming trend.
Just like in the previous thread where the quote from Lindzen questions why all the adjustment goes toward helping the AGW proposition.
I’m sure there are many different ways to reconstruct Antarctica’s temperature trends, but why did they end up with the maximum trend possible?
Every time I dig into the base data behind some study or some issue, I find that the base data used does not support the claims made or what the abstract says. I think the researchers collect the data, find it is not quite what they expected, but they have to publish anyway. And, to keep getting invited to all the great global warming parties, they still have to publish “the data supports dangerous global warming” even though they know it doesn’t.
Bias is understandable I guess. And so is questioning the analysis given that bias.
OT:
Scientific American, April 8:
Is Global Warming a Myth? How to respond to people who doubt the human impact on the climate
http://www.sciam.com/article.cfm?id=is-global-warming-a-myth
New Scientist, April 8, 2009 11:08 AM
Has global warming really stopped?
http://www.newscientist.com/blogs/shortsharpscience/2009/04/has-global-warming-really-stop.html
I would also like to question the data fill-in methodology. As was mentioned previously, I would think an interpolation method would be better than tying the grid to progressively more distant stations. A normal interpolation is essentially linear, but I wonder if a different sort of weighting for interpolation would be better, i.e. something non-linear to account for something like the Colorado effect Jack mentioned. Just speculating off the top of my head, but I would like to see a comparison between the method used and a regular interpolated fill-in.
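For what it’s worth, the distance-weighted fill-in this comment describes could look something like inverse-distance weighting, sketched below on invented numbers (my sketch; the power p = 2 is an assumption, and nothing here comes from Jeff’s code):

```r
# Inverse-distance-weighted estimate at one grid cell (toy numbers throughout)
idw = function(values, distances, p = 2)
{
  w = 1/distances^p            # closer stations get more weight
  sum(w * values)/sum(w)       # weighted average of the station values
}
est = idw(values = c(0.2, 0.6, -0.4), distances = c(100, 400, 900))
```

With equal distances it collapses to a plain mean, and in general the estimate is pulled toward the nearest station, which is the behavior the comment is after.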
Indeed it will soon be time to write a comment on Steig et al. 2009 to Nature…
I don’t see how much of anything derives from the proposition “We are now entering the Anthropocene Epoch.” I haven’t read that many scientific papers, but my impression was that they usually involve data that lead to a conclusion.
Saying we are now entering the Anthropocene Epoch is a very broad generalization that doesn’t get you anywhere in the way of specific predictions.
Could someone reference me to a study on the effects of underwater volcanism’s contribution to the Wilkins ice shelf’s demise, if it has contributed to it at all. I remember reading a study a few years ago but am having trouble locating it. I’m a Globullwarming skeptic and would like to counter a few Alarmists’ accusations that there is no such thing as volcanism in the western Antarctic ocean.
re: my comment on straight lines:
Yes, I’m aware that the straight lines represent a trend, but my point was that most (yes, most) people who don’t understand the concept of a trend are easily convinced that a given trend will continue forever.
The biggest problem with straight lines is they are, by nature, inflexible, and completely dependent upon start and end points. Thus, straight lines are often used for the purpose of showing an intended result rather than illuminating reality from a dataset.
What follows from this is that we simply do not have a long enough run of reliable data to determine ANY trend, or curve, since prior to the ice age scare of the 70s people were sensible enough to not divert a sizable portion of the scientific community’s resources toward weather.
Funny, any cooling is “weather” but any warming is “climate” to these people, and any warming trend is “AGW” while any cooling trend is “cherry picking”. Go figure.
Reality is a very difficult concept for Warmongers to accept. They prefer their comfy fantasies, theories and beliefs, their models and reconstructions, their holier than thou preachers and saints telling us how bad we are and how we must repent our climate sins.
They just can’t accept reality. They chose to believe the models rather than the data.
They believe. They Believe. It makes them feel better. I am not a religious person, but I understand how Believing is comforting to them. It is their warm comfy fur.
A note on Arrhenius from Wiki: “Svante Arrhenius was also actively engaged in the process leading to the creation in 1922 of The State Institute for Racial Biology in Uppsala, Sweden, which had originally been planned as a Nobel Institute. Arrhenius was a member of the institute’s board, as he had been in The Swedish Society for Racial Hygiene (Eugenics), founded in 1909. Swedish racial biology was world-leading at this time, and the results formed the scientific basis for the Compulsory sterilization program in Sweden, as well as inspiring the Nazi eugenics in Germany.”
Eugenics was all the “consensus” in those days.
Jack Century (04:14:26) :
To Perry Debell. Re: “Thus Jack, in an incredibly short geologic time, the flawed works of both Svante Arrhenius and Norman Newell will be consigned to the ‘department of unusually stupid ideas,’ ” The Geological Society of America has published well-balanced reasons why we’re now living in the Anthropocene Epoch. Since Arrhenius and Newell are both dead, are you claiming the world’s most prestigious geological society is also stupid?
Perry never claimed that Arhennius and Newell were stupid – only that certain ideas of theirs might be considered so. Ideas have a “life” independent of those who create them – whether they are stupid or not does not change once the man dies. What that has to do with the GSA and epochs, I’m not sure. Do you know what an epoch means or what the GSA specifically means with their suggestion of the likeliness of an “Anthropocene” epoch?
I’ll give you a hint: the epoch they describe has very little to do with our use of fossil fuels, other than the fact that they have allowed us to realize our current level of industrialization. If by some miracle we were able to convert to 100% renewable energy sources overnight, we would STILL continue to create the defining geologic (stratigraphic) signature they describe.
Furthermore, as it pertains to the GSA and climate change, the GSA acknowledges changes in current climate as do most. They do NOT claim that man is the primary cause of these changes.
Feeding the troll?
I can appreciate Jack Century’s sermon on AGW as the loyal alarmist’s condescension.
But his lack of any credentials relating to climate, lack of any peer reviewed published climate work and lack of current science to bolster his position leaves him standing in the AGW choir chanting demands.
His comments on this thread are the boilerplate kind of decreed certainty that avoids the thorough exploration, definition and progress of science.
With his approach, he is essentially recommending a limit to the scrutiny of science.
His brand of scientific curiosity appears to be that which only produces questions that are already “conveniently” answered.
Or something like that?
Clearly, for AGW research, we can spend billions of dollars sending up satellites, billions of dollars on data expeditions, and billions of dollars on computer modeling, all to justify shifting (risking) trillions of dollars of our economy on “going green”.
But spend 50 million dollars to place remote satellite monitored weather stations every 100 miles gridding Antarctica?
Can’t do it old man.
Jack Century: are you claiming the world’s most prestigious geological society is also stupid ?
This same society made a fool of itself at least once before when it tried to trash Alfred Wegener’s continental drift theory. Dear me.
Jeff. I admire your analytic ability and insight. As Peat pointed out, compared to the 48 states, the relative scarcity of accurate long-term data is challenging. Given the problems Anthony has detailed in Surfacestations, it seems likely the accuracy is even more questionable in Antarctic data. ISTM, by using flawed data, interpolations, etc., we are allowing the AGW crowd to set the terms of the debate. Rather than debate the “trend”, I believe showing the lack of credibility of the data, as Anthony is doing with Surfacestations, is preferable. Then again, I lack your ability to do the type of analysis you presented. Thank you. fm
The overall trend, up or down, is very small. How much of this could be attributed to siting problems? Granted, you won’t find many barbeques down there, but what of stations buried in snow? With only 42 ‘observers’, it won’t take much to distort the results.
With the separation and elevations questions raised by others, I question whether we can derive much of anything from a limited dataset.
From the Telegraph article here:
http://www.telegraph.co.uk/earth/earthnews/5116352/Arctic-will-be-ice-free-within-a-decade.html
“It could be several hundred thousand years ago the last time we were ice free, it was certainly seven to eight hundred years since we have had close to conditions like we have now,” added Dr Meier.
Is Dr. Meier acknowledging the MWP?
Jack Century,
GSA is prestigious, so its pronouncements have the force of verity? Where were they on the subject of “Continental Drift” in the 1920s? Another prestigious organization, the AAPG (I’m sure you will agree it is prestigious), at the time stated (I paraphrase): “If we are to believe that the continents drifted as suggested by Wegener, we would have to consign to the waste basket all that we have learned over the past 100 years.”
They resisted for another 30 years before they had to make the consignment, and when the evidence overwhelmed, it was accepted as a new theory with someone else’s name on it and with a new (ugly) name, plate tectonics, which I thought at the time was an orthodontist’s term.
Prestige doesn’t cut it for an old timer. By the way, I suppose we could say of the GSA and AAPG at the time that they were “just geologically uninformed”.
I am still wondering if we don’t have this all Back Asswards. If there is strong agreement between ground stations’ local temps and the satellite data for those locales, why would this not simply prove the accuracy of the satellite data and remove any need to interpolate at all?
Sorry for not being clearer on my question. I sadly do not have copious amounts of free time to give this question serious analysis given my numerous NASA projects (day job) and my blog (night job) – not to mention family commitments.
But it seems to me that all ground stations can do is confirm the satellite data, not override it with expanding error bars from extreme interpolation.
Maybe I just don’t see things right!
To Gary Pearse, as one old-timer to another. RE: The Geological Society of America (GSA) and the American Association of Petroleum Geologists (AAPG) on (1) Alfred Wegener’s theory of continental drift and (2) anthropogenic global warming and climate change. The AAPG made its first monumental geoscience blunder by denying Wegener’s theory of continental drift, not based on science, but because it was a “foreign ideology.” Sound familiar, folks?
I was there as an undergraduate geology student at the University of Illinois
and saw and heard this nonsense first hand, from geology professors who were card-carrying AAPG members. The AAPG made its second monumental blunder by continuing to deny plate tectonics and AGW for many decades. When will the AAPG, or the Canadian Society of Petroleum Geologists (CSPG), the second largest petroleum geology society in the world, ever learn? I’m an Emeritus member of both the AAPG and the CSPG. The truth will eventually make you free, Gary.
Just the usual comment from Maynard.
In the same way that the IPCC and the Hockey Team managed to ignore all of the historical evidence for a MWP, Steig simply ignored the GISS station at the South Pole. If the warming is global, then why does it not appear in the observations from there? I have not checked the quality of the data from this site but would assume, bearing in mind its, er, pivotal nature, that it might be one that passes muster. Anthony, has your Surface Stations project checked it out?
Off thread, but I suggest all readers here look at the Heartland Proceedings.
Cheers
Paul