Guest post by Steven Goddard
I have been noticing in recent weeks that NSIDC extent is much closer to their 1979-2000 mean than NANSEN is to their 1979-2007 mean. This is counterintuitive, because the NANSEN mean should be relatively lower than NSIDC's, as NANSEN's mean includes the low-extent years of the 2001-2007 period. Those low years should pull the mean down, so I would expect the current NANSEN extent to be equal to or above the 1979-2007 mean.
I overlaid the NANSEN graph on top of the NSIDC graph below, and it is easy to see how large the discrepancy is. In fact, the NSIDC mean sits at about one standard deviation below the NANSEN mean – which makes little sense given their base time periods. It should be the other way around.
(Note – the NANSEN and NSIDC measuring systems are not identical, and I had to make a shift along the Y-axis to line them up. However, the X and Y scales are identical for both graphs in the overlay image.)
Nansen uses a different algorithm to calculate the sea ice extent. The algorithms differ in the way they combine the raw data to estimate extent. As long as one uses the same algorithm consistently, the stories are all the same, but the details can differ, more so at certain times of year. One such time is when the ice edge is diffuse and broken up and melt is starting.
I suspect the Bering Sea is probably the region responsible for most of the difference. While our algorithm shows the region as mostly “ice-covered”, the ice cover there is very fragmented, broken up, and thin.
….
The other thing that’s important to mention is that I was referring simply to the discrepancy in how close the current lines are to climatology. However, there is also generally an “offset” between algorithm outputs – a bias or mean difference between the algorithms that is fairly consistent throughout the record. That is why NSIDC’s climatology is different than the Nansen climatology.
The important thing to remember is that there is a good consistent record from the passive microwave data as long as you consistently use the same algorithm and the same processing. But you can’t mix and match products.
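Dr. Meier’s point about algorithm differences near a diffuse ice edge can be sketched numerically. Extent is conventionally the total area of grid cells whose retrieved concentration is at least 15%, so two algorithms that disagree only slightly about marginal cells can disagree noticeably about extent. This is a toy illustration with invented cell values, not real satellite data:

```python
# Sketch of why different concentration algorithms can yield different
# extents: extent is conventionally the total area of grid cells whose
# sea-ice concentration is >= 15%. Cell areas and concentrations below
# are invented illustration values, not real data.

THRESHOLD = 0.15       # the common 15% concentration cutoff for "extent"
CELL_AREA_KM2 = 625.0  # a hypothetical 25 km x 25 km grid cell

def extent_km2(concentrations, threshold=THRESHOLD):
    """Total area of cells at or above the concentration threshold."""
    return sum(CELL_AREA_KM2 for c in concentrations if c >= threshold)

# Two algorithms retrieving the same diffuse ice edge: they agree on
# solid pack and open water, but disagree on marginal cells near 15%.
algo_a = [0.95, 0.80, 0.40, 0.17, 0.12, 0.02]
algo_b = [0.93, 0.78, 0.35, 0.13, 0.10, 0.01]

print(extent_km2(algo_a))  # 4 cells pass the threshold -> 2500.0
print(extent_km2(algo_b))  # 3 cells pass the threshold -> 1875.0
```

The same six cells produce a 25% difference in “extent” purely from which marginal cells cross the cutoff – which is why the two records are internally consistent but cannot be mixed.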

Using the 1979 to 2000 period doesn’t make much of a difference – it’s just an average to compare against.
Is a 30 year period better? I would think that would only be true if it encompassed a full cycle of arctic ice variability. Since we don’t know what that period is, 1979 to 2000, arbitrary as it clearly is, is as good as any other.
For all we actually know, the cycle could be 40 years, or 20, or 60. My guess would be 60, based on the PDO. But that’s still a guess and irrelevant since we don’t have 60 years of data.
Len van Burgel (20:35:20)
Thanks for reposting that response. This is one of those FAQs that will keep coming up from time to time as new readers come to WUWT. It will doubtless keep coming up in the months ahead, probably any time there’s a discussion about sea ice.
Yes, as Dr. Meier points out, it is poor science to “mix and match products” in making a graph. However, it is good science to check a variety of sources and approaches to see if your trend is robust over a number of analyses.
Why does the NSIDC continue to calculate the mean up to 2000? This makes the later trend lines look farther from it than if they calculated the mean up to 2008.
Paul (14:16:49) : Just out of curiosity, why doesn’t NSIDC use the 1979-2007 average?
An international agreement has most countries reporting “normals” or averages using 30 years of data with the final year ending in “0”, such as 1990, 2000, 2010.
Adolfo Giurfa (14:38:48) : why the need of an algorithm?
The camera is a sensor that records a “signature” in numbers based on the wavelengths reflecting off of the surfaces (note the plural). Something has to convert all those numbers into something recognizable and reportable as ice, water, tundra or whatever based on what you are taking a “picture” of. That something is an algorithm.
Before someone says that I’m rejecting correlation graphs, I must say that correlation graphs are useful instruments for identifying and interpreting variables, because graphs are visual representations of real data. Unfortunately, if the databases or the algorithms are flawed, the correlation graphs will be flawed also. The latter has been demonstrated many times here on WUWT.
As the days, weeks, months, years go by they keep “adjusting” the data but the temperatures keep falling. What are they trying to achieve?
Just look at current NH snow cover. I forecast a cool summer and a colder winter than last year.
Has anyone ever wondered about the choice of years for averages or normals of climatic variables? In case you have but haven’t found the answer, here is one:
“Climatologists define a climatic normal as the arithmetic average of a climate element such as temperature over a prescribed 30-year interval. The 30 year interval was selected by international agreement, based on the recommendations of the International Meteorological Conference in Warsaw in 1933. The 30 year interval is sufficiently long to filter out many of the short-term interannual fluctuations and anomalies, but sufficiently short so as to be used to reflect longer term climatic trends. Currently, the 30-year interval for calculating normals extends from 1971 to 2000.”
http://www.aos.wisc.edu/~sco/normals.html
Also, here: http://ams.confex.com/ams/pdfpapers/26747.pdf
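The quoted definition is simple arithmetic: a normal is just the mean of a variable over a prescribed 30-year window, and an anomaly is a departure from that mean. A minimal sketch, using made-up yearly values (the function and numbers are illustrative, not any agency’s actual code):

```python
# A climatic "normal" is the arithmetic mean of a variable over a
# prescribed 30-year window. All values below are invented.

def climate_normal(values_by_year, start, end):
    """Mean over the inclusive window [start, end]."""
    window = [values_by_year[y] for y in range(start, end + 1)]
    return sum(window) / len(window)

# Hypothetical annual-mean temperatures for 1971-2000, rising slightly
temps = {year: 10.0 + 0.01 * (year - 1971) for year in range(1971, 2001)}

normal_1971_2000 = climate_normal(temps, 1971, 2000)
anomaly_2009 = 10.5 - normal_1971_2000  # a hypothetical 2009 observation
print(round(normal_1971_2000, 3))
print(round(anomaly_2009, 3))
```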
What the heck is going on? This was posted by S Goddard just last week
http://eva.nersc.no/vhost/arctic-roos.org/doc/observations/images/ssmi1_ice_area.png. Quoting this graph, he said that NH ice was back within normal limits (1 SD). So is it now a fact that NORSEX has again lowered the whole 2008 and 2009 graphs, or raised the mean graph, to make it appear that NH ice is still below normal? (As I warned many times already and recorded here:
http://mikelm.blogspot.com/2007/09/left-image-was-downloaded-from.html
) If this is so, NORSEX must be a 100% junk site, because neither AMSR, CT, NSIDC nor DMI have changed. It appears to be run by one person, so check before you believe.
If you go north of the Arctic Circle during the NH winter there is very little sunlight, and for much of it none. So whether or not the Sun is in a “lull” would seem to make no difference. This lack of sunlight within the Arctic extends back in time for each and every year – so the average is not going to be directly affected. The Arctic basin is like a gigantic toilet bowl with rotation, tides, currents, and wind. When these things combine in certain ways the flush is on, and the ice clears out and melts at lower latitudes. When rotation, tides and so on do not effectively clear the ice out, it stays and thickens and grows older. Just because Gore and Waxman and others think the ice melts/evaporates and exposes the tundra doesn’t mean we skeptics should.
An average is easily skewed by a few extreme values. Example: the average wealth of folks in King County, Washington State – Bill Gates and other Microsoft billionaires live there. Most folks are well below the average and have no hope of getting close. That’s why some things are better compared to a mode or a median. If you like playing with numbers, look at each of the years of ice extent and see whether some were unusually large. That would explain why the current year is still below average.
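The King County point is easy to demonstrate with a few lines of code. The wealth figures here are invented purely for illustration:

```python
# How a single extreme value skews a mean but barely moves a median.
# Wealth figures are invented for illustration only.
from statistics import mean, median

wealth = [50_000, 60_000, 70_000, 80_000, 90_000, 50_000_000_000]

print(mean(wealth))    # dominated by the one billionaire: billions
print(median(wealth))  # still reflects the typical resident: 75000.0
```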
Bill Illis (19:00:43) :
Thank you for the numbers and the graphs Bill. I didn’t know data was available back to 1972. We always hear 1979. I can see there isn’t much of a difference between the 70’s and now.
I liked reading your view on the PDO yesterday too. You said you were hesitant to write about it. I don’t know what anyone else thinks about it. But I just want truth. I was fine with reading what you had to say. It was interesting – did you like Columbo in the 70’s? 😉
[snip, while there are few limits on what you can say about Hansen, please be more respectful of Dr. Meier who contributes his time to posting here]. ~ charles the moderator
[blockquote]
Paul James:
“nearly large enough to place California inside”
YIKES !!!
Ring the alarm bells !!!!
Has anyone told Mr Waxman that CA was moved North and then apparently melted ? Or did it evaporate ? I always have trouble keeping those two in order.
[/blockquote]
Well, that would explain why my condo is under water.
>>Heat or no heat the maximum winter ice extent has no real
>>significance in my opinion as the arctic is constrained by landmass.
But thickness and continuity may be a factor. If most ice is destroyed by wind and flushing, rather than melting, then the ice’s resistance to wind and tide may be a crucial factor.
If the ice is contiguous and strong, it may resist wind flushing. If it is weak, it may be flushed easily. So you might have a situation where the thickness reaches a ‘tipping point’ (sic) of weakness where great swathes of it are flushed out of the Arctic ocean (2007 and 2008), whereas if it were a little more contiguous and robust it would resist this (2009) and much, much more ice would stay in the Arctic.
This would result in huge differences in summer ice extent, for only small changes in temperature.
Ralph
>>Question– did they mean the “feet” thickness instead of “meters”?
The Germans using ‘feet’? You do jest, surely?
No, this was 4 meters thick – quite thick indeed.
Ralph
Sorry, Charles the moderator, but you should have just snipped out the Mx#!?r part and not the whole post. These Hansen/Gore alarmists are scaring people to gain power. Consider editing my post to remove my mention of the nice guy Mx#!?r. It’s obvious with this Ice Extent manipulation that these groups are working the answer backwards to a conclusion that fits the story that CO2 is the problem.
Reply: I considered that, but it would have required an editorial rewriting of your post to make sense and that is not really an option. ~ charles the moderator.
Jack Green:
Just so you won’t feel singled out, I’m going to bed, so any further comments may not be approved until one of our East Coast moderators comes online or one of our early morning old folks on the West Coast.
4) Wouldn’t removing any 9 year period from calculating the mean make that mean a poor representation and basis for comparison?
If the last 9 years have cooled, would it make sense to include them if the objective was to continue to parade the old data which effectively portrayed rapid warming?
It’s also a good way to hide the cyclic nature of the ice.
Perhaps the algorithm has an inbuilt routine to keep the extent out of the STD region. It was not doing its job – a quick tweak will soon fix that. Can’t have people thinking there is anything “normal” going on up there!!
vg,
NANSEN tracks both ice extent and area. They show area in the normal range, but not extent.
One thing this does tell us is that soot landing in the Arctic apparently isn’t too significant. Dr. Meier was right on that point, I admit. And Hansen is therefore wrong again, alas: I had been thinking that was the one thing he managed to get right. But no! Clearly much of what is called climate science is mere guesswork.
>Nansen uses a different algorithm to calculate the sea ice extent.
Ok, am I missing something here? This is purported to be science, right?
If you change the algorithm for one year, you need to recalculate every year to get a new average. If you keep an average computed with an old algorithm for 27 years and then begin a new algorithm, you cannot even pretend the new year’s data will be relevant to compare to any previous year.
I’m sorry, I’m just beside myself. I do not know how to comprehend this stupidity.
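The objection above can be made concrete. If a new algorithm carries even a constant offset relative to the old one, an anomaly computed against a climatology built with the old algorithm absorbs that offset as a spurious “change” in the ice. A toy sketch, with all extents (million km²) invented for illustration:

```python
# Toy demonstration that mixing algorithms corrupts anomalies.
# Suppose the "new" algorithm reads a constant 0.3 million km^2 lower
# than the "old" one for identical ice conditions (numbers invented).

OFFSET = -0.3  # systematic bias of new algorithm relative to old

old_record = [14.0, 13.8, 13.9, 14.1]        # old algorithm, baseline years
climatology = sum(old_record) / len(old_record)

true_extent_today = 13.95                    # same ice as the baseline years
new_reading = true_extent_today + OFFSET     # what the new algorithm reports

mixed_anomaly = new_reading - climatology    # old baseline, new reading
consistent_anomaly = true_extent_today - climatology

print(round(mixed_anomaly, 2))       # -0.3 appears, purely from the switch
print(round(consistent_anomaly, 2))  # 0.0 with a consistent algorithm
```

The ice didn’t change at all, yet the mixed-and-matched comparison manufactures a 0.3 million km² “loss” – exactly the kind of artifact that reprocessing the whole record with one algorithm avoids.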
So, here are today’s numbers.
Day 119 – April 29th, 2009 – 13,160,000 km^2 from Jaxa
– 12th out of 38 years.
– highest extent in 8 years.
– 2001 was the last year above 2009 and it is substantially above, won’t catch up for a long time.
– 2009 is above 1989 however, 20 years ago.
– 431,000 km^2 below the 1972 – 2009 Average (versus 428,000 yesterday).
– 481,000 km^2 below the 1979 – 2000 Average (NSIDC’s chart looks to be about half this number but I don’t know what they are using now – thanks for telling us. This number, however, is closer to the Cryosphere Today’s number and Nansen’s chart.)
– 82,000 km^2 below the Standard Deviation of the 1979-2000 Average.
http://img407.imageshack.us/img407/8504/day119april29sei.png
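Bill’s daily statistics are straightforward to reproduce from a series of yearly extents for the same calendar day: rank the current value against prior years, then express it as an anomaly against a baseline mean and standard deviation. A minimal sketch with invented values (not the real JAXA numbers):

```python
# Ranking today's extent against other years for the same calendar day,
# and computing its anomaly against a baseline mean and standard deviation.
# All extents (million km^2) are invented for illustration.
from statistics import mean, stdev

extent_by_year = {1979: 14.2, 1989: 13.0, 2001: 13.6, 2007: 12.8, 2009: 13.16}

today = extent_by_year[2009]
rank = sorted(extent_by_year.values(), reverse=True).index(today) + 1

baseline = [extent_by_year[y] for y in (1979, 1989, 2001, 2007)]
anomaly = today - mean(baseline)   # negative = below the baseline average
sigma = stdev(baseline)            # spread of the baseline years

print(rank)                        # 3rd highest of the 5 years here
print(round(anomaly, 2))           # -0.24 in this toy example
```

The same arithmetic, fed the full 1972-2009 daily series, yields the “12th out of 38 years” and “431,000 km² below average” style of figures above.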
Early morning old folks out West!?!?!?!?!?!? I resemble that remark.
I also agree about the PDO versus other oscillations regarding Arctic ice events. The PDO forms the lion’s share of on-shore weather pattern variation from West to East. With a shifted jet stream North, (and possibly more volatile and loopy), the PDO keeps me up to date on possible weather in coming months. As for Arctic ice events, it seems to me that the AMO couples more with Arctic ice events.