GISS Polar Interpolation

By Steve Goddard

There has been an active discussion going on about the validity of GISS interpolations. This post compares GISS Arctic interpolation vs. DMI measured/modeled data.

All data uses a baseline of 1958-2002.

The first map shows GISS June 2010 anomalies smoothed to 1200 km. The green line marks 80N latitude. Note that GISS shows essentially the entire region north of 80N up to four degrees above normal.

The next map is the same, but with 250 km smoothing. As you can see, GISS has little or no data north of 80N.
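For readers unfamiliar with what the 1200 km and 250 km figures mean: GISS fills each grid cell from station anomalies within that radius, with weights that fall off with distance. A minimal sketch of the idea – not GISTEMP's actual code, and with made-up station positions and anomalies:

```python
import math

def arc_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km (haversine)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def fill_cell(cell_lat, cell_lon, stations, cutoff_km):
    """Distance-weighted anomaly for one grid cell, or None if no station
    lies within cutoff_km.  Weight falls linearly to zero at the cutoff."""
    wsum = asum = 0.0
    for lat, lon, anom in stations:
        d = arc_km(cell_lat, cell_lon, lat, lon)
        if d < cutoff_km:
            w = 1.0 - d / cutoff_km
            wsum += w
            asum += w * anom
    return asum / wsum if wsum > 0 else None

# Hypothetical coastal stations: (lat, lon, June anomaly in deg C)
stations = [(71.3, -156.8, 2.1), (80.0, 11.9, 1.4), (73.5, 80.4, 3.0)]

# At 1200 km the pole cell is "filled" from a single station ~1100 km away;
# at 250 km it stays empty, which is why the 250 km map is blank north of 80N.
print(fill_cell(90.0, 0.0, stations, 1200))  # only the 80.0N station contributes
print(fill_cell(90.0, 0.0, stations, 250))   # None
```

GISS's actual weighting and gridding differ in detail; the point is only that a 1200 km cutoff lets a single distant coastal station set the anomaly for an entire polar cell that a 250 km cutoff would leave blank.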

Now let’s compare the GISS 1200 km interpolation with the DMI data for June 2010.

Daily mean temperatures for the Arctic area north of the 80th northern parallel, plotted with daily climate values calculated from the period 1958-2002.

http://ocean.dmi.dk/arctic/meant80n.uk.php

DMI shows essentially the entire month of June below the 1958-2002 mean. GISS shows it far above the 1958-2002 mean. Yet GISS has no data north of 80N.

Conclusion: GISS Arctic interpolations are way off the mark. If they report a record global temperature by 0.01 degrees this year, this is why.

——————————————————————

Straight from the horse’s mouth.

the 12-month running mean global temperature in the GISS analysis has reached a new record in 2010…. GISS analysis yields 2005 as the warmest calendar year, while

the HadCRUT analysis has 1998 as the warmest year. The main factor is our inclusion of estimated temperature change for the Arctic region.

– James Hansen

In other words, the GISS record high is based on incorrect, fabricated data. Why did Hansen apparently choose to ignore the DMI data when “estimating” Arctic temperatures? GISS Arctic anomalies are high by as much as 4 degrees, and yet he claims a global record measured in hundredths of a degree. As Penn and Teller would say …. well I guess I can’t say that here.

154 Comments
Julienne Stroeve
July 28, 2010 12:48 pm

stevengoddard says:
July 28, 2010 at 12:07 pm
Julienne,
If temperatures at the North Pole had been between 3-5C during June – as GISS seems to be implying – what would that have done to the ice there?
Steve, the NCEP reanalysis shows 925 mbar monthly mean temperatures for June at the North Pole around 1C, with anomalies of 3-5C. It makes sense that temperatures in June were anomalously warm given the SLP pattern in June. I haven’t looked in detail at the buoy data in that region, but I can talk with Don Perovich when I see him and see if he has any new information about melt rates near the pole from his mass balance buoys.

July 28, 2010 12:53 pm

Jim G,
I don’t know how many data points DMI uses, but there are quite a few buoys up there.

pwl
July 28, 2010 12:54 pm

Michael Schaefer’s suggestion is an excellent idea; I’ve had the same idea myself.
It would be an excellent and natural extension of the Surface Stations project to start putting up its own temperature stations.
What say you all? What say you, Anthony Watts?
Also, this one for Steve and Anthony, what is an appropriate “grid resolution” for temperature stations? 10km? 20km? 25km? 50km? 100km? 200km? 250km? R km? What radius R is appropriate for ONE temperature station to obtain accurate results in “interpolating” temperature data? Why would 10km be better than 250km? Why not 1km if we want real accuracy?
Wouldn’t geography matter? Near a river would be cooler? Up a mountain? In a valley?
In visiting the Island of Hawaii I learned that it had something like 21 of 22 climate zones, all except Arctic. You can even see them as you drive around, one side of the road is desert like while the other side is lush vegetation jungle like. Wouldn’t all these climate zones need their own (at least one) temperature stations?
Would all the climate zones across the planet need to be mapped out and temperature stations put there, at the same scale as needed for an island such as Hawaii? Would this actually be an irregular grid depending on these climate zones?
What are these climate zones anyhow? An article about them and how they can be quite small and next to each other with varying temperatures would be interesting and we might learn something about how to measure temps.
By the way, Hawaii’s best climate zone is molten rock! ]:)]
So for a Surface Stations project you’d pick the GPS coordinates where a station needs to be put, and someone would obtain the equipment node for that location and put it there. It’s kind of like the game GeoCaching – in this case it’s a temperature station being cached at the location. Someone will need to attend the location from time to time to pick up the data, replace the equipment, perform maintenance, or make sure it’s still there and hasn’t been poached. So the first step is to draw up a GPS grid map for where stations are to be located, scout the sites with Google Earth for suitability and legal access, and make GPS adjustments as needed. Then obtain the temperature station equipment in bulk and get the volunteers started deploying it. Of course it’s best to bill the local governments for the equipment – after all, we’re doing their job!
What say you all?
Oh, another question, what about all the TV and radio stations that broadcast weather reports? Can’t those be aggregated and used for temperature data? Would that fill in some of the global grid?
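pwl's question about an appropriate radius R per station can at least be bounded with arithmetic: Earth's surface area divided by the area one station is assumed to represent. A rough sketch – a lower bound only, since real coverage needs overlapping discs, and the radii below are just illustrative:

```python
import math

EARTH_SURFACE_KM2 = 4 * math.pi * 6371.0 ** 2  # ~5.1e8 km^2

def min_stations(radius_km):
    """Lower bound on stations needed if each represents a disc of
    radius_km.  Real coverage needs more, because discs must overlap
    to tile a sphere; this just divides total area by disc area."""
    return math.ceil(EARTH_SURFACE_KM2 / (math.pi * radius_km ** 2))

for r in (10, 50, 250, 1200):
    print(r, min_stations(r))  # e.g. 1200 km -> 113 stations, 250 km -> 2598
```

So a 1200 km radius lets roughly a hundred stations "cover" the planet, while a 10 km radius would demand over a million – which is the trade-off behind the whole interpolation debate.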

July 28, 2010 12:55 pm

Turboblocke
So you believe that you can accurately gauge the earth’s temperature within 0.01 degrees using 1200 km extrapolations?
That idea is scientifically farcical beyond comprehension.

Caleb
July 28, 2010 1:19 pm

In response to:
“GeoFlynx says:
July 28, 2010 at 10:15 am”
That was a neat movie. However it ended before the recent re-freezing spell.
One thing I wish I could see more clearly is the piling up of that mountain-range pressure-ridge in the far distance. The thing to remember is for every foot those things rise, a root grows nine feet downwards. They were something subs had to avoid, (though I suppose Tom Clancy would have a sub hide behind one.)
The lead between the near buoy and the far buoy is also interesting, for it appeared quite early, but never grew very wide. Some grow wide enough for subs to surface in open water right at the pole, but this one didn’t. The far buoy drifts over to one side of the near buoy, and then back to the other side, which shows the lead didn’t act like the San Andreas fault and represent a grinding fracture between two major chunks of ice.
The “internal temperature” of the camera is likely like the air temperature inside your car on a summer day; it tends to be higher, especially in the bright sunshine.
There must be an external thermometer among all the gizmos they had set up. I wonder why the heck GISS doesn’t use it.
I think DMI does use data from buoys. When you are looking at a model, you need to think whether actual data is being put in. I respect models that have actual data, without “adjustments.” Otherwise it is garbage you are putting in, and you know what you can expect to come out.
In the end, the value of a model boils down to whether or not it verifies.

Lichanos
July 28, 2010 1:20 pm

When I hear the statement, “AGW – it’s basic physics!” this is what I think of. I don’t think physicists would be very happy with this sort of interpolated data. Not a good foundation for a theory of how nature works. I find it depressing that this sort of data processing, not too dissimilar to what a novice GIS user might do without applying much thought, is the basis for statements like this.
As an engineer who uses GIS extensively, I run into this all the time. We have a more or less sparse data set – pollution samples, bathymetry, rainfall, whatever – and someone needs a map or a chart or a picture to convey the information. Or they need to do calculations.
Fair enough, but then the question always comes back, “Is this accurate?”
Accurate enough for you, perhaps. After all, where there is no data, there is really no way to know. Go out and check. If we interpolate, we invent data. Nothing wrong with that, and it’s quite useful in many instances, but you have to be straightforward in your explanation and make sure the users know what they are looking at!
Estimated data values are simply that, and nothing more. And what are they worth? No way to know until you get direct observations.

Lichanos
July 28, 2010 1:32 pm

Thanks for the link to the Hansen paper. I found these two passages in it that seem of interest:
(4) The cool weather anomalies in the United States in Jun-Jul-Aug 2009 and in both the United States and northern Eurasia in the following Dec-Jan-Feb are close to the cool extreme of the range of seasonal temperatures that are now expected (Figure 17) given the warming of the past few decades. Although comparably cool conditions could occur again sometime during the next several years, the likelihood of such event is low in any given year and it will continue to decrease as global warming continues to increase.
I thought this was interesting because it’s nothing more than a polemical talking point.
(5) we suggest a new procedure for use of satellite SST data …We adjust the satellite data by a small constant such that the monthly temperature anomalies of satellite and in situ data are equal over their common area.
I don’t get this. It makes sense if you are trying to keep your satellite data in sync with your surface data, but it doesn’t tell you anything about which one is more accurate. Given the questions about surface stations that have been raised, this could be no more than the blind data set leading the blind…

Dave Wendt
July 28, 2010 1:44 pm

stevengoddard says:
July 28, 2010 at 12:53 pm
Jim G,
I don’t know how many data points DMI uses, but there are quite a few buoys up there.
Here’s a map from the IABP that shows their currently deployed buoys:
http://iabp.apl.washington.edu/maps_daily_tracknsidc.html
I don’t think it’s comprehensive, as there are nonparticipating organizations that also have buoys deployed.

James Sexton
July 28, 2010 1:47 pm

Chris G says:
July 28, 2010 at 10:51 am
“I don’t know; maybe it’s because DMI is estimating the temperature of the sea and GISS is estimating the temperature of the air.”
The difference, of course, is that DMI uses real thermometers in close proximity to deduce the temperatures, while GISS makes up numbers based on thermometers hundreds of miles away.
James Sexton says:
July 28, 2010 at 9:12 am
“You linked back to a graph of 1200km smoothing. Goddard makes the argument that the global record, if a new record is set, will be the result of said smoothing and interpolation over 80 north, (and I suppose 80 south as well). What I’m asking for is some indication that the calculations used to arrive at the global mean are using data from the 1200km smoothing rather than the original data used to produce that smoothing. You haven’t provided that any more than Goddard has given us a calculation showing what the result would be without the model-generated data.”
Sorry, misunderstood the question. While I don’t have any first-hand knowledge about whether they use the extrapolated data in determining the global mean or not, Jim Hansen’s statement in Stephen’s post (scroll up to close to the top) seems to indicate that they do indeed use the manufactured data.
“the 12-month running mean global temperature in the GISS analysis has reached a new record in 2010…. GISS analysis yields 2005 as the warmest calendar year, while
the HadCRUT analysis has 1998 as the warmest year. The main factor is our inclusion of estimated temperature change for the Arctic region.
So, from the head of GISS, he seems to think GISS declared 2005 the warmest year because they included the “estimated” temps from the Arctic. So, not only are they using the manufactured data, it apparently carries significant weight in determining the global mean.
Hope that clears things up for you,
James

July 28, 2010 1:54 pm

Part of the problem is that Dr. James “coal fired power plants are factories of death” Hansen and his tribe of data corruptors keep getting away with cooking the temperature books in the MSM. Fortunately, the MSM keeps losing audience, and web sites like WUWT continually monitor and analyze the bilge coming out of NASA GISS.
The other part of the problem is the intoxication created by computer presentations, which convert boring climate data into mesmerizing patches of color. It’s like CAD drawings: they look impressive, but are they correct?

Steven mosher
July 28, 2010 1:55 pm

CE
“carrot eater says:
July 28, 2010 at 8:51 am
If you don’t like the Arctic interpolation in GISS, then use CRU. This is the reason for the slight difference between the two. CRU just leaves the Arctic blank, along with any other empty grid cells. Given that the stations that ring the Arctic indeed are warming faster than the rest of the world, that probably leaves CRU trending too low.”
As more often than not, Carrot gets this one right.
The simple facts are these. You have a data source (GHCN) that represents temperature in a given way: a monthly MEAN that is the result of (tmax + tmin)/2,
recorded at a standard time of the day or ADJUSTED via TOBS to a standard time of the day, midnight.
For the Arctic region there is a dearth of stations that report data in this fashion.
You have various choices.
1. Don’t extrapolate or interpolate over this region (CRU)
2. Smooth data to fill in the hole (GISS)
3. Use alternative data sources.
If you want to use DMI then you have some work ahead of you: floating, moving buoys (I’ve looked at that data) or reanalysis data based on readings at 0Z and 12Z. Not sure you get anything better than just the observation that
CRU says X.
GISS says X+a bit.
Anyways, I’ll take a look at DMI
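Mosher's description of the GHCN monthly mean – the daily midrange (tmax + tmin)/2 averaged over the month – can be sketched as follows (illustrative only; real GHCN processing also applies quality control and the TOBS adjustment he mentions):

```python
def monthly_mean(daily_tmax, daily_tmin):
    """GHCN-style monthly mean: average of the daily midrange
    (tmax + tmin) / 2 over the days of the month."""
    if len(daily_tmax) != len(daily_tmin) or not daily_tmax:
        raise ValueError("need matching, non-empty daily series")
    midranges = [(hi + lo) / 2 for hi, lo in zip(daily_tmax, daily_tmin)]
    return sum(midranges) / len(midranges)

# Three illustrative days: midranges are -1, +1, -1, so the mean is about -0.33
print(monthly_mean([2.0, 3.0, 1.0], [-4.0, -1.0, -3.0]))
```

Note that this midrange is not the same quantity as a mean of readings taken at fixed synoptic hours (0Z and 12Z), which is part of why splicing DMI-style reanalysis data into a GHCN-based product is not straightforward.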

July 28, 2010 1:58 pm

Julienne
I check the North Pole webcams almost every day, and there were only a few days in June with any visible signs of melting.
As I am sure you are aware living in Colorado, snow/ice melts pretty fast on a sunny day at 5C.

Lichanos
July 28, 2010 2:07 pm

@GeoFlynx:
What’s with this movie? Very cool, but a picture isn’t always worth a thousand words. Am I to simply assume that those puddles of water are extraordinary? What evidence does the film present that this is unusual in any way? It wasn’t exactly open water from what I could see.

Richard M
July 28, 2010 2:10 pm

Obviously, both methods of measuring the temperature have their own strengths and weaknesses. It is true that temperatures measured above ice will be held low in the summer months. However, is that a good reason to throw out the numbers?
Clearly, GISS thinks so. And, by throwing them out, they can create a larger warming trend. But one has to go back and ask the basic question … what are we trying to measure? I thought it was surface temperature and the changes over time. Given that goal, using actual surface temperatures above the ice is the correct choice. Anything else is not measuring the surface temperature. Period.
Now, there may be questions about the accuracy of DMI, but there is no question that GISS is using the wrong approach.

Billy Liar
July 28, 2010 2:35 pm

Chris G says:
July 28, 2010 at 11:18 am
Interesting, average temp for the north pole is around 0 C in July. Recently, it is +11. (Well, that’s in the camera, which could be above ambient air, but then, it is cloudy.) And I do see meltwater in the view.
If you looked more carefully (or more frequently at the pictures from that web cam) you would see that the snow on the left hand side of the pond slopes gently and merges with the frozen surface of the pond. Whereas, in this picture:
http://www.arctic.noaa.gov/npole/2010/images/noaa2-2010-0716-191646.jpg
The pond is not frozen and the snow around the edge has been affected by the ripples caused by the wind on the pond.
If you now examine all the webcam pictures for June and July you will see that not only does it snow quite frequently but also that sometimes the pond is frozen or half frozen and sometimes it is unfrozen.
This leads me to think that DMI have a more accurate theory of what’s going on up there. If a huge pancake of ice is melting, how can the temperature in the boundary layer above the ice be anything other than zero degrees C, or close to that? Try it with your next gin and tonic: lots of crushed ice in the glass, suspend a thermocouple close to the surface and measure away as the ice melts.
Furthermore you will note that the webcam takes 3 or 4 pictures each time it switches on; take a look at the camera temperature. It rapidly increases because, funnily enough, it uses power. The temperature sometimes rises 3C in the 30 seconds between shots.
It’s amazing how much a webcam can tell you isn’t it?
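Billy Liar's gin-and-tonic experiment is the latent-heat argument: while ice remains, added heat goes into melting it (about 334 J/g) rather than warming the water, so the mixture sits near 0 C. A toy sketch, with made-up heat inputs:

```python
LATENT_FUSION = 334.0   # J per gram of ice melted
SPECIFIC_HEAT = 4.186   # J per gram-kelvin of liquid water

def mixture_temp(ice_g, water_g, heat_j):
    """Final temperature of an ice/water mix starting at 0 C after
    adding heat_j joules: pinned at 0 C until all ice is gone, and
    only then does the water warm."""
    melt_energy = ice_g * LATENT_FUSION
    if heat_j <= melt_energy:
        return 0.0  # ice still present: temperature stays at 0 C
    return (heat_j - melt_energy) / ((ice_g + water_g) * SPECIFIC_HEAT)

print(mixture_temp(100.0, 200.0, 20000.0))  # -> 0.0 (33.4 kJ needed to melt all the ice)
print(mixture_temp(100.0, 200.0, 40000.0))  # warms only once the ice is gone (~5.3 C)
```

The numbers are arbitrary; the shape of the result is the point, and it is the same physics that pins the air in contact with melting sea ice near 0 C.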

John Finn
July 28, 2010 2:54 pm

The first map shows GISS June 2010 anomalies smoothed to 1200 km. The green line marks 80N latitude. Note that GISS shows essentially the entire region north of 80N up to four degrees above normal.
This is a bit misleading. Only about a quarter of the region (the red bit) is between 2 and 4 deg. At least half is only between 0.5 and 1 deg. I doubt the anomaly for the region (based on the map) is much above 1.5 deg. The UAH NoPol anomaly for June is 0.81 deg, so while it’s not exactly in close agreement there’s nothing to suggest that GISS is “way out”.

Chris G
July 28, 2010 2:54 pm

James,
“…it apparently carries significant weight in determining the global mean. ”
Apparent to whom? I’m still waiting for what the difference is between the GISS global data with and without the inclusion of areas between 80-90 north and south.
BTW, You guys claiming it can’t be 4 C above normal there, please follow my link and note the 11 C temperature and the meltwater.

Steven mosher
July 28, 2010 3:05 pm

Richard M
“Now, there may be questions about the accuracy of DMI, but there is no question that GISS is using the wrong approach.”
That is not entirely correct. The approach will either OVERESTIMATE the warming or underestimate the warming, since no estimate is perfect. Methods have bias and error. It’s an open question as to whether they overestimate the warming or underestimate the warming. Personally, I view GISS as an overestimate of the warming in the Arctic and CRU as an underestimate.
That level of uncertainty is not an issue. At some point CRU and GISS will learn to put error bars on all their charts and they will stop making silly claims about “hottest” without explaining the uncertainty in that claim.

Julienne Stroeve
July 28, 2010 3:08 pm

Steve, more interesting than the North Pole webcam will be the actual mass balance buoy data. From that you can see how much surface versus basal melt is occurring.
Once melt starts, the temperatures remain near 0.
Some interesting numbers that I just looked at, the difference between the maximum ice extent in winter and the most recent ice extent (i.e. July 27th):
1979-2000: 6.79 million sq-km
2007: 7.91 million sq-km
2008: 7.41 million sq-km
2009: 7.97 million sq-km
2010: 8.09 million sq-km
These numbers illustrate that 2010 is continuing the trend of large seasonal ice loss, which is a result of thermodynamics (e.g. surface, lateral and basal melting) and ice dynamics (e.g. compaction, deformation and ice export). Melt onset fields derived from passive microwave do reveal early melt onset this year, which hints at warmer than normal air temperatures.
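Stroeve's figures can be tabulated to make the comparison explicit (values copied from the comment above; the arithmetic only confirms that 2010 shows the largest winter-to-late-July loss of the years listed):

```python
# Maximum winter extent minus July 27 extent, million sq km (from the comment)
seasonal_loss = {
    "1979-2000": 6.79,  # climatological average
    "2007": 7.91,
    "2008": 7.41,
    "2009": 7.97,
    "2010": 8.09,
}

baseline = seasonal_loss["1979-2000"]
for year in ("2007", "2008", "2009", "2010"):
    loss = seasonal_loss[year]
    print(f"{year}: {loss:.2f} ({loss - baseline:+.2f} vs 1979-2000 average)")

largest = max(seasonal_loss, key=seasonal_loss.get)
print(largest)  # -> 2010
```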

Billy Liar
July 28, 2010 3:16 pm

Reference says:
July 28, 2010 at 12:16 pm
Recent 2010 Atmospheric Data near the North Pole from the North Pole Environmental Observatory
http://psc.apl.washington.edu/northpole/PAWS_atmos_recent.html

Thanks for the link! I plotted the data and surprise, surprise, it bears an uncanny resemblance to the DMI plot. OK, it may suffer from a ‘variety of errors’ but it’s probably a lot better than GISS’s guesses.

Michael Schaefer
July 28, 2010 3:21 pm

pwl –
I’m happy you support my idea.
Yes, I think it should be possible to build an independent grid – or swarm – of stations collecting and transferring temperature and humidity data from otherwise inaccessible areas to a website.
These stations must be affordable, rugged, maintenance-free over a long time, and should be able to transmit data automatically via cellphone or the like, so as to avoid extensive care and maintenance to fetch the data.
One could even try to fund this independent, global surface climate grid the same way the AGW proponents keep funding their expensive projects: sell it to donors as a project to monitor “Climate Change” – which, in fact, it does…
Beat them with their own bats, I say!

Julienne Stroeve
July 28, 2010 3:23 pm

Steve, BTW… I don’t have a problem with you showing the DMI temperatures; we do a similar thing with NCEP data, and these data are useful, but they have their limitations, and users of the data need to understand that and use the data in the way they are intended.
The only problem I have is the comparison you make between DMI and GISS and your use of it to prove GISS data are invalid. They are not the same thing, and I’m not even clear that the same atmospheric level is being used in this comparison. More importantly, though, you are comparing two completely different methodologies for filling in missing pixels (extrapolation/interpolation versus modeling), each method having its own biases and accuracy problems. While such intercomparisons can be useful in helping to see if similar data sets reveal similar seasonal and interannual variability, they are not going to tell you which data set is the most accurate without also comparing with actual in situ data. I work regularly with reanalysis temperature data (not with GISS data), but I tend not to use the surface fields because of accuracy problems.

July 28, 2010 3:30 pm

Julienne,
I agree that temperatures over the ice can never get much above 0C because of thermodynamic limitations. That is why I do not find the GISS data showing 3-5C at the North Pole to be credible.

John Finn
July 28, 2010 3:31 pm

Re: my earlier post
John Finn says:
July 28, 2010 at 2:54 pm
I checked the 1958-2002 zonal anomalies above 80N. They are
81.00000000 0.9818134904
83.00000000 1.277962089
85.00000000 1.277962089
87.00000000 1.277962089
89.00000000 1.277962089
So my 1.5 eyeball estimate was a bit high – but not bad.
I also checked the 1979-1998 anomalies to check against UAH. They are
81.00000000 0.8285245299
83.00000000 1.092934608
85.00000000 1.092934608
87.00000000 1.092934608
89.00000000 1.092934608
UAH has an anomaly of 0.81 so I reckon GISS is close enough at 1 deg or thereabouts.
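Finn's zonal numbers can be combined into a single >80N figure by weighting each band by its area, which scales as the cosine of latitude. A quick check, using his 1958-2002 values and assuming equal-width bands:

```python
import math

# Zonal-mean anomalies (deg C) by band-center latitude, from the comment
zones_5802 = [
    (81.0, 0.9818134904),
    (83.0, 1.277962089),
    (85.0, 1.277962089),
    (87.0, 1.277962089),
    (89.0, 1.277962089),
]

def area_weighted_mean(zones):
    """Weight each equal-width latitude band by cos(latitude),
    proportional to the band's surface area on the sphere."""
    w = [math.cos(math.radians(lat)) for lat, _ in zones]
    return sum(wi * a for wi, (_, a) in zip(w, zones)) / sum(w)

print(round(area_weighted_mean(zones_5802), 2))  # -> 1.17
```

The area-weighted average lands near 1.17 deg, consistent with Finn's conclusion that his 1.5 eyeball estimate was a bit high but in the right ballpark.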

James Sexton
July 28, 2010 3:33 pm

Chris G says:
July 28, 2010 at 2:54 pm
James,
“…it apparently carries significant weight in determining the global mean. ”
Apparent to whom? I’m still waiting for what the difference is between the GISS global data with and without the inclusion of areas between 80-90 north and south.
BTW, You guys claiming it can’t be 4 C above normal there, please follow my link and note the 11 C temperature and the meltwater.
Chris, look at HadCRUT and then GISS; Hansen said that was the difference. I mean, if you really want me to do the leg work for you and find the un-extrapolated data, compile it, average it etc., I’d be more than happy to, but my contract services aren’t very cheap. Let me know if/when I need to get started and I’ll send a contract right away! But even without the extra money, look at the maps here http://wattsupwiththat.com/2010/07/26/giss-swiss-cheese/#more-22599 . Those show extrapolated vs. non-extrapolated data. Look at the extrapolated areas and guess how much of the globe they cover in percentage terms (I’m guessing 15%, just eyeballing). Then look at the colors that represent a value. Oh, look! The extrapolated areas are almost exclusively showing a warming anomaly. This necessarily raises the global anomaly quoted by GISS. How much exactly? I don’t really know.
I don’t think anyone is claiming it can’t be +4 in the Arctic, I’m asserting, declaring, avowing that there is absolutely no freaking possible way GISS can know it is +4 with the thermometers they use in their data. It is a number they pulled out of their posterior and used that soiled number in their formula to determine the global mean. PNS at its best.