Arctic Sea Ice Reports: who to believe?

We’ve all seen that Arctic sea ice area and extent have expanded and are back to normal. NANSEN Arctic ROOS just got their web page plots back online yesterday after an outage, and there’s a bit of a surprise when compared to NSIDC’s plot.

Arctic Sea ice extent - NANSEN Arctic ROOS left, NSIDC right - click for larger image

Sources:

http://arctic-roos.org/observations/satellite-data/sea-ice/observation_images/ssmi1_ice_ext.png

http://nsidc.org/data/seaice_index/images/daily_images/N_timeseries.png

Here’s a magnified view with the NANSEN graph zoomed and set to match the NSIDC scale, done with the help of my graphics program:

Arctic Sea ice extent - NANSEN Arctic ROOS left, NSIDC right - click for larger image

Both datasets use the SSMI satellite sensor, and both plot extent using a 15% concentration threshold. Yet there are significant differences in the output, which would seem to point to methodology. Note that in the magnified view, NANSEN has a peak “normal” of ~15.25 million square kilometers while NSIDC’s “normal” is higher at ~15.75 million square kilometers. You’d think there would be a standard for deciding what the “normal” baseline is, wouldn’t you? [Note: The NSIDC average is for 1979-2000, NANSEN’s is for 1979-2006] Maybe the scientists can hammer this out at the next ice conference.

Regarding the plots above:

NANSEN says:

“Ice extent is the cumulative area of all polar grid cells of the Northern Hemisphere that have at least 15% sea ice concentration, using the NORSEX algorithm. Ice area is the sum of the grid cell areas multiplied by the ice concentration for all cells with ice concentrations of at least 15%. Ice extent and ice area are calculated for a grid resolution of 25 km.”

NSIDC says:

“Extent defines a region as “ice-covered” or “not ice-covered.” For each satellite data cell, the cell is said to either have ice or to have no ice, based on a threshold. The most common threshold (and the one NSIDC uses) is 15 percent, meaning that if the data cell has greater than 15 percent ice concentration, the cell is considered ice covered; less than that and it is said to be ice free.”

NSIDC also says:

“Other researchers and organizations monitor sea ice independently, using a variety of sensors and algorithms. While these sources agree broadly with NSIDC data, extent measurements differ because of variation in the formulas (algorithms) used for the calculation, the sensor used, the threshold method to determine whether a region is “ice-covered,” and processing methods. NSIDC’s methods are designed to be as internally consistent as possible to allow for tracking of trends and variability throughout our data record.”
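The two quoted definitions are easy to sketch in code. This is a toy illustration only: it assumes equal-area 25 km cells and uses invented concentration values, whereas real polar stereographic grids have varying cell areas. The point is just to show why extent is always at least as large as area under these definitions.

```python
# Toy sketch of the two quoted definitions on a small concentration grid.
# Assumes equal-area 25 km x 25 km cells (625 km^2 each); real polar grids
# vary in cell area, so the numbers are illustrative only.

CELL_AREA_KM2 = 25 * 25  # 625 km^2

# Fractional ice concentration per grid cell (invented values)
concentrations = [0.95, 0.80, 0.40, 0.16, 0.10, 0.0]

# NSIDC-style extent: any cell above the 15% threshold counts in full
extent = sum(CELL_AREA_KM2 for c in concentrations if c > 0.15)

# NORSEX-style area: qualifying cells contribute cell area * concentration
area = sum(CELL_AREA_KM2 * c for c in concentrations if c > 0.15)

print(extent, round(area, 2))  # 2500 1443.75
```

Four of the six toy cells clear the threshold, so extent counts them whole (2500 km²) while area weights them by concentration (about 1444 km²). Extent can never be smaller than area, which matters for a comment further down the thread.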

Given that both NANSEN and NSIDC use the same SSMI sensor data and calculate extent from the same 15% concentration threshold, that half-million-square-kilometer difference in the “normal” sure seems significant in the context of the magnified extent view NSIDC presents. A half million here, a half million there, and pretty soon we are talking about real ice extent differences.
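How much can the choice of baseline years alone move the “normal” line? Here is a back-of-the-envelope sketch. The extent figures below are invented round numbers, not real NSIDC or NANSEN data; only the 1979-2000 vs. 1979-2006 averaging windows match the two organizations’ stated baselines.

```python
# Hypothetical peak-extent values (million km^2) by year -- invented numbers
# chosen only to mimic the general decline after 2000.
extent_by_year = {
    1979: 15.8, 1985: 15.7, 1990: 15.6, 1995: 15.5,
    2000: 15.4, 2003: 15.1, 2006: 14.9,
}

def baseline(data, start, end):
    """Mean peak extent over the inclusive year range [start, end]."""
    vals = [v for yr, v in data.items() if start <= yr <= end]
    return sum(vals) / len(vals)

nsidc_style = baseline(extent_by_year, 1979, 2000)   # excludes the low 2001-2006 years
nansen_style = baseline(extent_by_year, 1979, 2006)  # includes them, so it comes out lower

print(f"1979-2000 baseline: {nsidc_style:.2f} M km^2")   # 15.60
print(f"1979-2006 baseline: {nansen_style:.2f} M km^2")  # 15.43
```

Same data, same algorithm, and the two windows still disagree by nearly 0.2 million km² in this toy case. An anomaly is only as meaningful as the baseline it is measured against.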

Another interesting difference is that NANSEN plots Arctic Sea ice area in addition to the extent. Here’s that graph:

click to enlarge

Arctic Sea ice area is above the “normal” line as defined by NANSEN. As far as I know, NSIDC does not offer an equivalent plot. If I am in error and somebody knows where to find NSIDC’s area plot, please let me know and I will include it here.

One final thing to note about the difference between NANSEN and NSIDC. I don’t recall the director of NANSEN/Arctic ROOS ever coming out and saying something like “Arctic ice is in a death spiral” or making any sort of press announcements at all. They seem content to just present the data and let the consumer of the data decide.

In contrast, NSIDC has a whole section that addresses sea ice in the context of global warming. I haven’t found a comparable section on NANSEN Arctic ROOS.

Of course we know that NSIDC director Mark Serreze is very active with the press. Perhaps some of our media friends reading this should seek out someone at NANSEN for the next sea ice story so that there’s some balance.

The differences in the way each organization presents its data and views to the public might explain the differences in the way the output is calculated. One might take a “glass half full” approach while the other takes a “glass half empty” approach. Or it may have a basis in science that I’m not privy to yet. The point is that there are significant differences in the public presentation of sea ice data between the two organizations. One showed sea ice extent as normal; the other took a sharp right turn just before it was expected to happen.

I welcome input from both of these organizations to explain the difference.

In related news, Steve Goddard writes:

NSIDC seems to confirm the WUWT 12 Month Ice Forecast. Twelve months ago, WUWT forecast that 3-year-old ice would increase during the next year, and explained why. NSIDC confirmed the accuracy of the forecast with their most recent Sea Ice News.

Source: http://nsidc.org/images/arcticseaicenews/20100406_Figure6.png

Note that 3+ year old (>2 year) ice has increased from 10% to about 14% during the past year, shown with the two black horizontal lines near the bottom. That is an impressive relative growth of 40% over last year.
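For readers who want to check the relative-growth arithmetic, it is a one-liner:

```python
# A rise from 10% to 14% of the ice pack, measured relative to the old share:
old_share, new_share = 0.10, 0.14
relative_growth = (new_share - old_share) / old_share
print(f"{relative_growth:.0%}")  # prints 40%
```

The absolute change is only 4 percentage points, but relative to last year's 10% share that is indeed a 40% increase.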

Ice older than one year has also increased by a substantial amount over 2008 and 2009. The implication is that ice thickness has been increasing for the last two years. Older ice is thicker ice.

So we will leave it up to the readers to do the math.  Thickness has increased.  Area has increased. What does that tell us about volume?  What does that tell us about the “Arctic Death Spiral“?

Don’t be fooled though. “Decreasing ice is climate. Increasing ice is weather.”

kim

I was a Teen Age Werebear.
===============

Anthony.
I really enjoy your website.
It has become a standard read of mine!
Thanks

“Mistakes” happen all the time, but why are all the errors in favor of AGW???
Strange !!!

Eric Anderson

Great catch. I’ve been talking with some colleagues about sea ice recently, and this apparent discrepancy is an important one. Thanks.

Frederick Michael

The NSIDC average is for 1979-2000, NANSEN’s is for 1979-2006. Walt Meier has explained their choice to stop at 2000 many times (they want an unchanging reference). You can disagree with his choice but it is a choice and nothing’s fishy.

“hide the decline”

De Rode WIllem

What a fuss about nothing !!!! Both datasets give about the same absolute amount of sea ice. The one gives just a normal amount and the other an almost ignorable shortage compared to normal.
I don’t see any reason to make noise about these data. Gosh….How low do we need to go to make a scene about something normal ?….If this normal situation actually has a high news value….then there is something wrong ! If we had stayed under normal for the last 8 years, then it would be logical and normal to conclude something is terribly wrong with the Arctic sea ice.

pat

These folks homogenize temperatures all day long, why can’t they homogenize 15% ice extent? This is absurd, and if Anthony did not do such a careful explanation, appears calculated. I do note, though, the trend line still differs.

Steve Goddard

By tomorrow or Saturday, the NSIDC and Nansen graphs should look more similar. NSIDC’s averaging algorithm creates some lag.
Also, NSIDC uses a higher baseline from 1979-2000, so their average is higher.

R. de Haan

For me it’s a simple choice.
I am with Nanson Arctic Roos.
I don’t carry the concept of death spirals and I detest the the scaremongering tactics in support of power and money grabbing politicians.
There is nothing wrong with the climate or the Arctic, we won’t experience any dramatic effects in the rise of ocean levels.
The only thing we have to fear are the carbon scam artists that are after our money.
That’s it.

Henry chance

Good news. In Europe they are launching another satellite today. It will measure freeboard. It can then calculate how much ice is submerged. Then they can toss in some proxies from Morocco and we will have more data. I am shocked that they can now create a trend from a single point.

Bruce Cobb

NSIDC has raised the bar, to make it easier for Serreze’s “death spiral” to limbo under.

Leon Brozyna

The most obvious difference between the two is that NSIDC defines normal based on the period 1979-2000, while NANSEN uses a period of 1979-2006. Those extra six years would have brought the average down; if they add in 2007 to the average next year, it should be even lower.
The really interesting figure should be the level of melt reached this September. We’ll see then how fares the dreaded death spiral…

rbateman

All this suggests to me that the cycle is headed for the opposite end of the spectrum: Above normal ice pack conditions. The pendulum swings back.

Paal

The director of NERC sent out this just after the Gore-Støre report. May be of interest here:
Press Release
Nansensenteret (http://www.nersc.no), Bergen
15. December 2009
Gore – Gahr Støre report; “Melting snow and ice, a call for action”;
http://www.regjeringen.no/upload/UD/Vedlegg/klima/melting_ice_report.pdf
The report prepared by the Norwegian Polar Institute after a “closed” conference in April 2009, is superficial. The authors, according to Gahr Støre, are “world leading scientists”, yet refer to few published works and often to themselves. For example, the ice chapter has only 10 published articles – of which only 8 deal with the ice in the Arctic and two with the ice in the Antarctic. Of the 8 articles, just 4 refer to articles written by 2 of the 5 authors. Other important published articles are not mentioned.
Several statements in the ice chapter are incorrect. In the introduction (page 34) it says “Sea ice extent in the Arctic has shrunk by almost 40% since 1979”; the correct number is 4.1% per decade, or 12.3% since 1979 – not 40%. The 4.1% per decade figure comes from the Nansen Center ice information system, http://arctic-roos.org.
Furthermore, it is not mentioned that the ice extent has great natural variability. In the period 1915-1935, for example, during the natural warming of the Arctic, the ice decreased by 0.6 mill km2, while in the summer of 1996 the ice increased by 1.6 mill km2 – caused by the North Atlantic Oscillation. The connection between the increase in CO2 and the decrease in ice extent is not mentioned either – see for example Johannessen 2008 (enclosed).
The chapter about the Greenland ice sheet also has few references, only 7 articles, which are referred to by the authors – this is not good enough. Regarding the rise of the ocean level, Gore and Gahr Støre have over-interpreted the report. They say that the ocean level will rise by 1 to 2 m. The highest value mentioned in the report is in the range of 0.5 m – 1.5 m, and with great uncertainty regarding a rise of 1.5 m in this century.
For more information or questions please contact;
Ola M. Johannessen mobile: +4790135336.

Tero-Petri Ruoko

What’s the problem here? They both clearly use different baselines.
Nansen uses a 79-06 average and NSIDC uses a 79-00 average, which wholly explains the difference in their normal lines. Same measuring system + same “ice-covered” definition + different baseline == different normal value…
REPLY: Sure the baselines differ. I’m pointing out that the public presentations differ significantly and who defines “normal”? Normal seems to be in the eye of the beholder of the data. Essentially it is an anomaly, and you can make an anomaly look like anything you want with a simple choice of defining the baseline. – Anthony

Mike Bryant

” Frederick Michael (09:51:36) :
The NSIDC average is for 1979-2000, NANSEN’s is for 1979-2006. Walt Meier has explained their choice to stop at 2000 many times (they want an unchanging reference). You can disagree with his choice but it is a choice and nothing’s fishy.”
That doesn’t explain the decline compared to the incline…

John Galt

De Rode WIllem (09:52:14) :
What a fuss about nothing !!!! Both datasets give about the same absolute amount of sea ice. The one gives just a normal amount and the other an almost ignorable shortage compared to normal.
I don’t see any reason to make noise about these data. Gosh….How low do we need to go to make a scene about something normal ?….If this normal situation actually has a high news value….then there is something wrong ! If we had stayed under normal for the last 8 years, then it would be logical and normal to conclude something is terribly wrong with the Arctic sea ice.

It shows how the artificially chosen baseline measurement affects the reported anomaly.
What is “normal” for the Arctic sea ice? We’ve only measured it by satellite for slightly over 30 years. How do we know what normal really is?

bill-tb

Sea ice forecasts are now made out of fudge … Climate changes, it gets hot, then it gets cold. And cold is much worse for life on earth than is hot.

Does anyone know where the “Ice by Age” data comes from? I tried searching for the IceBridge data from NASA but apparently I am a little useless. I did find a great picture of a crack in the ice, though, with an alarming proclamation about open water in the NW Passage.
The images are setting off my BS detector, and I would like to see the raw data and any algorithms used to create the graphics.
Since most around here are smarter than me, has anybody got this?

paulo arruda

There is a very interesting review of the NSIDC area record at American Thinker: http://www.americanthinker.com/2010/04/was_the_arctic_ice_cap_adjuste.html

Jack Maloney

According to the NSIDC web page, “Fossil fuel burning is responsible for climate change”
Does this mean that “fossil fuel burning” has been going on for 4.5 billion years of continuous climate change on Earth?

KevinM

I bet if extent increases above normal, the +/- 2 sigma shading stretches out to +/- 3 sigma, and/or the base period is adjusted to increase the mean.
If there were an options contract on it, I’d be all in. C’mon ProShares, I want a climate market!

Ibrahim

There is nothing unusual; all of this has happened before.
Read (a book from 1943):
http://www.archive.org/stream/arcticice00zubo#page/444/mode/2up

I don’t think that NSIDC does use the same sensors now. Following the problems with the F13 satellite last summer, NSIDC switched to F17 after cross-calibration. Arctic-ROOS did something different and came online with a product that has varied from the other sources ever since; NSIDC, AMSR-E/JAXA and MODIS appear to be self-consistent while Arctic-ROOS is the outlier.
REPLY: Could be, the point I’m making here is that the public presentation differs significantly. Who’s got the “right” presentation? I don’t know. – A

KC

A bit off topic, but has anyone taken a look at the arctic death spiral propaganda being perpetrated by EDF?
“This is the story of a fictional polar bear family — Aakaga and her cubs Qannik and Siku — as they make their perilous journey in a melting Arctic world.”
http://www.edf.org/page.cfm?tagID=53590
This is hysterical!

paulo arruda

NSIDC uses extent because it shows a greater decline than area. But area is more important, no? “Hide the decline”

Steve Goddard

paulo arruda (10:18:34) :
I have looked at that American Thinker article and believe the author has made a critical error. Concentration at the pole hole is much higher than 15% – actually closer to 100%. Using the correct concentration numbers, the NSIDC area and extent trends correlate correctly.

But if we assume that the pole hole is only 15% ice (the low end of what is assumed), then the downward trend is only 0.1% per decade, which is not statistically significant. (The corresponding downward trend for “extent” was 2.6% per decade.)

REPLY: I agree. Send me the results and I’ll send it to the American Thinker with a suggestion that they check their work and make a correction if indeed they made that error. -Anthony

frederik wisse

A new fly in the ointment is appearing today. In order to study the alarming results of climate change (the publicity tune of the day), the European Space Agency launched a new satellite, CryoSat, intended to take hourly pictures of both icecaps, with spectacular results expected within half a year’s time. A new radar will be able to look under the ice and is surely going to deliver proof that AGW is resulting in an impoverished ice floor even when the real ice extent is growing. All ice will almost certainly be classified as unfit for human travel, and the proof of this may come even earlier from the Catlin insurance group or a crazy individual trying to conquer the North Pole on his own. Anyway, we are living in fascinating times, and any real capitalist will applaud the hype as long as he is able to make money out of it! You are a winner, baby Al, on the way to becoming a zillionaire!

AlexB

The difference in direction in the recent trend (i.e., still increasing for NANSEN vs. decreasing for NSIDC – and the NSIDC revisions, documented here, for the last days of March) seems a bit incongruous, but the methodologies are somewhat different. Both eliminate any cells with less than 15% cover. NSIDC counts cells at >15% as all ice. NANSEN multiplies those same cells (that is, if the cells are gridded the same) by the ice concentration within the cell (as I understand the description). So NSIDC should show a greater extent on any given day. I can’t tell from the graphs whether that is the case, but would be interested in that comparison if anybody wants to put out the effort. Regarding the difference in “Normal”, NSIDC is based on 1979 – 2000 while NANSEN is based on 1979 – 2006. So the lower NANSEN normal makes sense, since extent in 2001 – 2006 was lower than in 1979 – 2000. But it’s interesting to see that we do seem to be returning to a more “normal” condition (as normal may be defined), at least for now. It will be interesting to watch the trend in the months to come as we go through the “Straits of June”, as one commenter recently coined it.

geo

I’m still sitting on 6.0-6.2M km2 for summer extent minimum.
But apparently I need to call out specifically that this is based on NSIDC’s and JAXA’s historical graphs.
Take a look at the extent minimums on NANSEN as well. It’s not just the max where they disagree with NSIDC. For 2009 NANSEN shows an extent minimum at ~6M km2, whereas NSIDC was a hair shy of 5.3M km2.
Okay, what’s going on here? They both claim to be doing 15%.
Are they using different grid cell sizes, perhaps? If NSIDC’s grid cell size is significantly bigger than NANSEN’s 25km, that could explain it. A bigger grid cell would have a harder time reaching the 15% threshold.
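geo’s grid-size speculation is easy to illustrate with toy numbers: averaging several fine cells into one coarse cell can drop a partially ice-covered region below the 15% threshold. (These are invented concentrations, not either organization’s actual grid.)

```python
# Four hypothetical 25 km cells: one at 30% concentration, three ice-free.
fine = [0.30, 0.0, 0.0, 0.0]

# At 25 km resolution, one cell exceeds 15% -> counted as ice-covered
fine_extent_cells = sum(1 for c in fine if c > 0.15)

# Aggregated into a single 50 km cell, mean concentration is 7.5% -> ice-free
coarse_conc = sum(fine) / len(fine)
coarse_extent_cells = 1 if coarse_conc > 0.15 else 0

print(fine_extent_cells, coarse_extent_cells)  # 1 0
```

So a coarser grid can report zero extent where a finer grid reports one full cell of ice, and the effect can also run the other way (scattered low-concentration fine cells can average above 15% in a coarse cell).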

PeterB in Indianapolis

De Rode Willem,
You have to understand what “normal” actually means in context. NSIDC uses a baseline period of 1979-2000 to calculate “normal”. NANSEN uses a baseline period of 1979-2006 to calculate “normal”.
What this means is that the NSIDC uses 1979 and the early 1980’s in their calculation of normal (all years with very HIGH ice cover), but does NOT use 2000-2006 (some of these years had very LOW ice cover). NANSEN, on the other hand, does include 2000-2006, so their calculation of “normal” is somewhat lower than the NSIDC number for “normal”.
In order to know what is truly normal (as in, what is a typical average ice-cover area for the arctic), we would probably have to have an absolute minimum of 100 years of data, and it would be far better to have, say, 10,000 years of data. That would give us a MUCH better idea of what is truly normal as opposed to some arbitrary “normal” based on a very small dataset.
Unfortunately, we did not begin getting satellite data for this until 1979, and 1979 and the early 1980s had very high ice-cover. In a 30-year database, 4 or 5 years of very high ice-cover is going to bias “normal” to the high side.
So, it is important to remember that “normal” has nothing to do with “the expected average” in this case. In this case “normal” simply means either the 21-year or 27-year average ice-cover in the arctic. Whether these 21 or 27-year averages resemble “normal” in any meaningful way is basically anyone’s guess until we get a LOT more years of data.

Steve Goddard

Extent has to decline faster than area, because extent is greater than area. So the extent slope has to be steeper than the area slope.
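One way to read that claim with toy numbers: if both series declined by the same fraction per decade, the series with the larger base value (extent) would fall by more square kilometers per decade. The values below are invented for illustration.

```python
# Illustrative base values (million km^2) and an assumed common decline rate
extent_base, area_base = 15.0, 13.0  # extent >= area by definition
decline_per_decade = 0.03            # 3% per decade, applied to both

extent_slope = extent_base * decline_per_decade  # M km^2 lost per decade
area_slope = area_base * decline_per_decade

print(extent_slope > area_slope)  # True
```

This only shows that equal *percentage* declines imply a steeper absolute slope for extent; whether the two series actually decline at equal percentages is a separate empirical question.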

I’m flipping out over this news! That increase is going to increase albedo during the summer months, and cause rapid cooling of the planet! We’ve reached a tipping point, and we’re going to plummet into a new Ice Age! And it’s all from ice formed from the gases released by mankind’s industrialization! AHHHHHHH!!!
*dons sackcloth*
And don’t accuse me of overreacting. The science is settled!
*sets hair on fire*

Ben Kellett

Interesting to see temps in the Arctic seem to have taken a massive jump recently! Any info on this, anyone?

Frederick Michael

Mike Bryant (10:10:14) :
” Frederick Michael (09:51:36) :
The NSIDC average is for 1979-2000, NANSEN’s is for 1979-2006. Walt Meier has explained their choice to stop at 2000 many times (they want an unchanging reference). You can disagree with his choice but it is a choice and nothing’s fishy.”
That doesn’t explain the decline compared to the incline…

The current value from NSIDC is 15 million km. sq. here:
http://nsidc.org/data/seaice_index/images/daily_images/N_stddev_timeseries.png
The current NORSEX value is about 14.7 million km. sq. Here (look at the extent graph on the right):
http://arctic-roos.org/observations/satellite-data/sea-ice/ice-area-and-extent-in-arctic
My guess is that the NSIDC 5-day moving average is shorter than the smoothing method used by NORSEX. Thus, the NSIDC plot shows a decline already (but from a higher peak) while the smoother NORSEX plot is just beginning to slide.
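Frederick Michael’s smoothing hypothesis can be sketched with a toy series: a longer trailing average lags a turning point more than a shorter one, so two plots of the same underlying data can briefly point in opposite directions near a peak. The series and window lengths below are invented; NSIDC documents a 5-day average, and NORSEX’s smoothing method is the unknown here.

```python
# Toy daily-extent series (million km^2) that peaks on day 6 then declines
series = [14.0, 14.2, 14.4, 14.6, 14.8, 15.0, 14.9, 14.8, 14.7]

def moving_average(xs, window):
    """Trailing moving average; only defined from day `window` onward."""
    return [sum(xs[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(xs))]

short = moving_average(series, 3)  # short window: tracks the peak quickly
long_ = moving_average(series, 7)  # long window: still rising at the end

# The short average has already turned down while the long one is still going up
print(short[-1] < short[-2], long_[-1] > long_[-2])  # True True
```

If NORSEX smooths over a longer window than NSIDC, its plot would still show the post-peak rise for a few extra days, exactly the kind of discrepancy discussed above.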

Fred

I love it when a scientific consensus comes together 🙂

Alan S. Blue

I’m still interested in seeing the actual raw data or pictures for the “maximum ever” Arctic ice extents. When you compare the “better instrument” AMSR-E data (2002 to now) to the average, the difference is a whole lot of ice. And half the data has to lie on the other side of the average.
The actual Arctic is still freezing pretty solid – the “missing” ice is precisely in the areas that (1) should be well documented by shipping reports, and (2) just inherently seem unlikely to freeze from a current perspective.

Steve Goddard

Frederick Michael (10:48:39) :
Looking at DMI’s data, which does not do any averaging, you can see that the trend since mid-March is still upwards. NSIDC’s short tail should turn upwards tomorrow or Saturday.
http://ocean.dmi.dk/arctic/icecover.uk.php

Brian in Bellingham

I don’t understand why they don’t use all years for the average. Isn’t that the way it is supposed to be calculated? We all know that we really don’t know what the “average” is supposed to be, since we have relatively limited data, but why not recalculate it every year? They do that with baseball statistics. A player’s season or career average changes every day, though the career average will barely change day to day, and even later in the season his seasonal average will not change that much day to day.
But why would they not calculate recent years in order to figure out what “average” is? Is it because they assume the ice is not supposed to be low, therefore the last decade or so should not be counted? I would think every baseball player would love to have their bad years thrown out too!
Someone said Meier wants an unchanging reference, but averages do change over time, don’t they? Someone else could use the last 10 years as the average baseline, and we would see that ice is now WAY above average, and we must do something to stop it!

John Egan

Pardon?
If NANSEN uses a baseline from 1979-2006 while NSIDC uses one from 1979-2000, then it is pretty obvious where the difference most likely lies. 2001 through 2006 were considerably warmer years with lower sea ice concentrations. Thus, not only does NANSEN have a lower “peak”, but the entire curve is lower – hence 2010’s line intersecting the NANSEN curve, but not intersecting the NSIDC curve.
That said, NSIDC needs to extend its baseline past 2000.
REPLY: I agree, and I find it interesting that NSIDC does not point out this baseline difference in the explanation about differences with other presentations I cited from their web page above. – Anthony

The main item of note here is that NANSEN seems to show sea ice extent expanding for the first week of April while NSIDC shows it decreasing, but if you look at NANSEN’s raw data plot (dashed red), you can see that it too shows decreasing ice for the first week of April. It’s half obscured by the solid red homogenized line, but clearly pops up above the homogenized line at the end of March, then descends from there.
If the solid line is a smoothed line, then the April decrease in the raw data is consistent with NSIDC. If the solid line involves data processing that makes it improper to look at the raw data on its own, then there may be an ongoing mystery as to how the two data sets show opposite April directions.

Steve Goddard

In 2007, Mark Serreze appeared ready to open up The Northwest Passage as a commercial route. Hopefully no one invested too heavily in that idea.
http://www.livescience.com/environment/070914_northwest_passage.html


A fabled sea route above North America linking the Pacific and Atlantic Oceans has become a reality thanks to global warming.
Scientists have confirmed that in August, Arctic sea ice shrank to its lowest levels since satellite measurements began monitoring the region nearly 30 years ago. One consequence of this is that the Northwest Passage has opened up much earlier than expected.
“We’re several decades ahead of schedule right now,” said Mark Serreze, a senior scientist at the University of Colorado’s National Snow and Ice Data Center, which monitors the region.
The premature opening of the passage does not mean that climate models are unreliable, only that their predictions have been far too conservative, Serreze said. “They’re getting the right trajectory, but they’re too slow,” he said.

Tom_R

As others have pointed out, the ‘normal’ lines will be different because of the different baseline years. There’s no mystery there. I wonder what the ‘normal’ curve from 1940 to 2010 would look like.
The slight differences in the current curves can be explained by their using different smoothing algorithms. The NSIDC curve could be explained by their using the ‘average’ line (or last-year’s line) to estimate future measurements in order to smooth the tail end of the current line. It would be interesting to see the raw data to find out if those differ.
Where did the information on ice ‘age’ come from? I can’t imagine that satellites could tell whether a section of ice was two, three, or four years old.
REPLY: The point is the difference in public presentations. Who defines “normal” baselines? It would seem to me that that the public interest is not served by having conflicting presentations, one above and one below, “normal”. – A

Wayne

Why does NSIDC show 2010 area at or below 2007 for most of Jan-Feb while Nansen 2010 is always above 2007? Watt am I missing?

John

NSIDC does not have a rational explanation for the fixed average they use. I could speculate that they saw a decline and decided to fix a time range to reference it to. Since we are seeing an increase, your point is well taken. They should add each successive year into the average, much as they do when they perform their regression analysis.

geo

@Mike Bryant (10:10:14) :
“but it is a choice and nothing’s fishy”.
I’m sorry, but I can’t agree with that. You’ve stopped your analysis one level short of truth.
Here’s what NSIDC says on their website on this issue in their FAQ:
“Why do you use the 1979–2000 average for comparisons?
NSIDC scientists use the 1979 to 2000 average because it provides a consistent baseline for year-to-year comparisons of sea ice extent. Scientists call this long-term average over a data series a “climatology.” If we were to recalculate the climatology every year to incorporate the most recent year of data, we couldn’t meaningfully compare between recent years. To borrow a common phrase, we would be comparing apples and oranges.
The problem with relying on a sliding average becomes clear over time, when we try to compare new years of data with previous years. For example, if we rely on a standard, unchanging climatology like 1979 to 2000, we can easily and clearly compare September 2007 and September 2008 with each other. However, if we were to use the sliding climatology of 1979 to 2006 for September 2007, and the sliding climatology of 1979 to 2007 for September 2008, we would no longer be comparing “apples to apples” when we compared the two years to climatology.
Finally, some scientists point out that since 2000, sea ice has declined precipitously. While you can do an average over any period, it is better to do so over a stable period, either a period of relatively flat change or cyclical change with little overall trend. If you include a strong increasing or decreasing trend when you calculate an average, you probably will not have a representative average.
That said, NSIDC has recently considered revisiting the 1979 to 2000 average. We now have thirty years of Arctic sea ice data. A thirty-year time series is a widely accepted scientific standard for a climatology because it is long enough to encompass most cyclical patterns of natural variation. The problem, however, is that we would have to deal with the potential confusion caused any time that a standard is changed. The graphs would look different to the general public and would require a great deal of explanation.”
Consider that final paragraph, which is the important one for our discussion on this topic. NSIDC admits that 30 years is “a widely accepted scientific standard”, and yet they have not made the move because of “potential confusion”.
Potential confusion to who? Pielke, Jr? Gavin Schmidt?
No, of course not. In fact, they say “the general public” is their concern re confusion.
Well now, isn’t that interesting. That, friends, is in fact the definition of politics vs. science right there.
So I have to disagree. It is indeed “fishy”.
I could go on at length about what I think they really mean by confusing the general public, but while I have a great deal of confidence in my analysis, it would still be speculative. And frankly, I shouldn’t even need to in this context, because NSIDC just admitted in public that they are engaging in politics instead of science by not going to a 30 year baseline, and frankly that should be enough for me to win my case.

Anu

Brian in Bellingham (10:57:38) :
http://nsidc.org/arcticseaicenews/faq.html#1979average
Why do you use the 1979–2000 average for comparisons?
NSIDC scientists use the 1979 to 2000 average because it provides a consistent baseline for year-to-year comparisons of sea ice extent. Scientists call this long-term average over a data series a “climatology.” If we were to recalculate the climatology every year to incorporate the most recent year of data, we couldn’t meaningfully compare between recent years. To borrow a common phrase, we would be comparing apples and oranges.
The problem with relying on a sliding average becomes clear over time, when we try to compare new years of data with previous years. For example, if we rely on a standard, unchanging climatology like 1979 to 2000, we can easily and clearly compare September 2007 and September 2008 with each other. However, if we were to use the sliding climatology of 1979 to 2006 for September 2007, and the sliding climatology of 1979 to 2007 for September 2008, we would no longer be comparing “apples to apples” when we compared the two years to climatology.
Finally, some scientists point out that since 2000, sea ice has declined precipitously. While you can do an average over any period, it is better to do so over a stable period, either a period of relatively flat change or cyclical change with little overall trend. If you include a strong increasing or decreasing trend when you calculate an average, you probably will not have a representative average.
That said, NSIDC has recently considered revisiting the 1979 to 2000 average. We now have thirty years of Arctic sea ice data. A thirty-year time series is a widely accepted scientific standard for a climatology because it is long enough to encompass most cyclical patterns of natural variation. The problem, however, is that we would have to deal with the potential confusion caused any time that a standard is changed. The graphs would look different to the general public and would require a great deal of explanation.
For those who are interested in comparing the thirty-year decline in Arctic sea ice extent to something different than the 1979 to 2000 average, the NOAA Arctic Report Card 2008: Sea Ice offers a graph showing groups of five-year averages from 1979 to 2008. What one immediately notices is that the overarching story remains the same: Arctic sea ice is rapidly declining over the satellite record, no matter how you calculate the averages.

Neo

… but Anthony the “science is settled”

It has already been mentioned that the Cryosat-2 satellite was launched today.
One of its missions is measuring ice thickness. Unfortunately it seems it will have a limited life span.
The other possibly unfortunate part might be that it’s European. The EU is very keen on promoting GW business. Let’s wait and see.