It looks as if we are about to see the turn in Arctic sea ice, and if so it will be earlier than last year. But right at that same time, JAXA has decided to switch horses mid-stream.
They say timing is everything, and this timing couldn’t be more wrong. You’d think they would have waited until after the minimum had been recorded, so that there would be no questions or issues with the timing. But for some reason, JAXA has decided that now is the opportune time, right when everyone is watching. An update on their Arctic Sea-Ice Monitor page dated September 6th shows that they are switching from Version 1 to Version 2 and revising 2012. Of course, the revision is for less ice:
In Sep. 2012 the Arctic sea ice extent set a new record low in the observational history, but as a result of Version 2 using AMSR2 data for 2012, the minimum sea ice extent became 3.18×10⁶ km², a value 0.3×10⁶ km² smaller than the Version 1 result using WindSat.
Here is what they display. On the plus side, at least they are keeping Version 1 in place until September 30th:
I have overlaid the two graphs, and it looks like all of a sudden about 250,000 square kilometers of ice has disappeared.
Note: I don’t have issues with their methodology, which is to remove uncertainty/noise related to the land mask boundary, which is always a good thing. But, the timing is certainly odd.
=============================================================
From their update page:
With the version upgrade of the AMSR-E Level 1 brightness temperature data, geolocation errors were reduced from ±10 km to ±1 km.
The Version 2 sea ice extent was calculated by analyzing the Arctic sea ice concentration derived from the upgraded AMSR-E brightness temperature data.
In addition, observational data from the other satellites (the 1980s, 1990s, 2000s and 2010s averages of SMMR, SSM/I and WindSat) were used to calculate sea ice concentration after adjusting each sensor’s brightness temperatures against AMSR-E as the reference standard, and the sea ice concentration threshold used to count sea ice extent was adjusted to be consistent with the AMSR-E sea ice extent.
The processing changes resulting from the improved geometric precision of the AMSR-E Level 1 brightness temperature data are described below.
With Version 1, sea ice could be falsely detected along coasts due to contamination of ocean pixels by the passive microwave emission of land (“false sea ice”). To reduce this false sea ice, we applied the “land expanded mask” (see Fig. 1).
With the improvement in AMSR-E geometric precision reducing the false sea ice, we no longer apply the land expanded mask in the Version 2 processing.
Compared to Version 1, Version 2 sea ice extent has increased.
To eliminate the false sea ice near the coast, the land expanded mask treats as land any pixel horizontally or vertically adjacent to a land pixel, i.e., within the 3×3 box centered on that land pixel.
Version 1 used the land-ocean mask provided for SMMR and SSM/I, but for Version 2, owing to the AMSR-E geometric precision improvement, we made a new land-ocean mask adjusted for the footprint size of the AMSR-E 18 GHz band (IFOV: 16×27 km) and applied it to the analysis of sea ice concentration.
Compared to Version 1, the sea ice extent of Version 2 has decreased.
In Version 2, the false sea ice near the coast has been reduced by the improved geometric precision of AMSR-E, but it still cannot be removed completely, so we applied the land filter proposed by Cho (1996). When at least one pixel of the 3×3 box is land, the central pixel is considered to be affected by land spill-over, inflating its sea ice concentration, and is replaced with the minimum value within the 3×3 box.
By applying this land filter, the sea ice extent of Version 2 has decreased in the melting period compared to Version 1. [Both 3×3 operations are sketched in code just after the figures below.]
After AMSR-E halted observations, the sea ice extent was calculated from WindSat in Version 1; in Version 2, it is calculated from AMSR2 data since July 2012.
In Sep. 2012 the Arctic sea ice extent set a new record low in the observational history, but as a result of Version 2 using AMSR2 data for 2012, the minimum sea ice extent became 3.18×10⁶ km², a value 0.3×10⁶ km² smaller than the Version 1 result using WindSat.
Furthermore, the ranking of successive sea ice extents is not changed by this upgrade.
Fig.4 Arctic Sea Ice Extent during the minimum period
(Left:Ver.1, Right:Ver.2) – click to enlarge



![fig1-ii-1-SIC_AMSE_N_PS12_20030301_05diff_only-cncl[1]](http://wattsupwiththat.files.wordpress.com/2013/09/fig1-ii-1-sic_amse_n_ps12_20030301_05diff_only-cncl1.png?w=640&resize=640%2C672)

![fig2-1-Sea_Ice_Extent_ver1[1]](http://wattsupwiththat.files.wordpress.com/2013/09/fig2-1-sea_ice_extent_ver11.png?w=300&resize=300%2C187)
![fig2-2-Sea_Ice_Extent_ver2[1]](http://wattsupwiththat.files.wordpress.com/2013/09/fig2-2-sea_ice_extent_ver21.png?w=300&resize=300%2C187)
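The two 3×3 operations described in the update are simple neighbourhood filters. Here is a minimal sketch in Python/NumPy of how they might look, assuming a boolean land mask and percent concentrations on a regular grid; the function names, the 15% extent threshold, and the grid layout are illustrative assumptions, not JAXA’s actual code:

```python
import numpy as np

def expand_land_mask(land):
    """Version 1 style 'land expanded mask': any ocean pixel horizontally
    or vertically adjacent to a land pixel is also treated as land."""
    expanded = land.copy()
    expanded[1:, :] |= land[:-1, :]   # land pixel above
    expanded[:-1, :] |= land[1:, :]   # land pixel below
    expanded[:, 1:] |= land[:, :-1]   # land pixel to the left
    expanded[:, :-1] |= land[:, 1:]   # land pixel to the right
    return expanded

def cho_land_filter(conc, land):
    """Version 2 style land filter (after Cho, 1996): if any pixel of the
    3x3 box around an ocean pixel is land, assume land spill-over has
    inflated its concentration, and replace it with the minimum
    concentration among the ocean pixels in that box."""
    filtered = conc.copy()
    ny, nx = conc.shape
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            box = (slice(i - 1, i + 2), slice(j - 1, j + 2))
            if not land[i, j] and land[box].any():
                filtered[i, j] = conc[box][~land[box]].min()
    return filtered

def extent_km2(conc, land, pixel_area_km2, threshold=15.0):
    """Sea ice extent: total area of ocean pixels at or above the
    concentration threshold. 15% is the conventional cutoff; the excerpt
    says only that JAXA adjusted their threshold, not what it is."""
    return ((conc >= threshold) & ~land).sum() * pixel_area_km2
```

Note that the Version 2 land filter is a rank (minimum) filter rather than a linear operation, a point that comes up again in the comments below.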
I don’t have a big problem with this provided they use the same methodology across the entire data set, which it appears they have.
That said, the graphs track each other pretty closely except for two rather sharp divergences in 2012: a small one at the end of November and a larger one at the end of December. I’d be interested in any explanation of why those exist.
I have been expecting the switch to the Japanese-hosted replacement AMSR2 from the stopgap US Navy WindSat data. They have been bringing the systems up for some time.
Anthony:
I’m more concerned about what seems to have happened to the winter maxima than the summer minima. Even though their technical changes seem to be about reducing the counting of false ice, which should reduce extents, the winter maxima all seem to have increased substantially – more so for the 80’s than the others, and more for the 90’s than the 00’s. In fact, the 80’s totals seem to exceed the 90’s totals for the whole year by about the same amount. Suspicious, what?
But they have reduced the summer minima. Why should a change that reduces summer minima also increase winter maxima?
Again, suspicious, in that full-year ice totals now are shown decreasing steadily over the decades, better according with the warmist line than before.
@Jeffrey, yes your suspicions are confirmed:
see this image:
http://sunshinehours.files.wordpress.com/2013/09/jaxa-2013-difference-v2-minus-v11.png
Looking for an explanation of the anomalous two-week straight line during mid-August in the DMI link at WUWT, I see that DMI is revising their charting as well:
Old: http://ocean.dmi.dk/arctic/plots/icecover/icecover_current.png
New: http://ocean.dmi.dk/arctic/plots/icecover/icecover_current_new.png
Some questions seem to be raised:
http://ocean.dmi.dk/arctic/plots/icecover/osisaf_nh_iceextent_monthly-09_en.png
Harold Ambler says:
September 6, 2013 at 9:46 am
You really can fool some of the people all of the time, as Lincoln said, but not all of the people all of the time.
As Lincoln also said: “The worst thing about the internet is that not everything you read there is true”
The new algorithm creates a downshift bias that is instantly apparent in the curves. In particular, it makes little difference in years with comparatively high sea ice at minimum, and a lot of difference in years with a comparatively low sea ice minimum. This fairly obviously makes sense: if the entire Arctic were ice coast to coast, the correction wouldn’t do anything at all. The more the coverage is marginal or broken up, the more opportunities there are for the correction to kick in.
The bad news for the new algorithm is that it is self-evident proof that their algorithm, corrected or not, is either terribly converged and hence inaccurate from the beginning, or else has artifacts in it both before and after of unknown sign. There is no way a correction like this should have this large an effect unless coastal coverage is nearly fractal at minimum, or unless there is a large systematic bias in the old way it was computed (which seems unlikely: nearly any averaging with sufficiently small pixel resolution for the average itself to be meaningful should not be seriously affected by repixellating the boundary). The two should scale like perimeter to area, that is, quadratically, and with anything like adequate pixellation only a small fraction of the pixels should be coastal in the first place. For example, if the entire water area were circular and the pixel size was $\Delta r$, the ratio of coastal pixels to open water (unaffected) pixels should be $2\pi R \Delta r / \pi R^2 = 2\Delta r / R$. In general, the ratio of the coastal area to the interior area should be proportional to $2\Delta r / R$, where $\Delta r$ is the size of the coarse-grained area element (pixel) and $R$ is the length scale associated with the total area. For non-smooth boundaries the constant of proportionality might well be larger than 2, but it should not in general be as much as an order of magnitude greater, unless the area itself is terribly resolved because $\Delta r$ is much too large to get meaningful results in the first place.
Let’s see if this is the scaling of the results above. At first glance it is unreasonable. To put it in simple terms, to lose 250,000 square kilometers from the perimeter of a 16 million square kilometer square with sides 4000 kilometers in length, you have to remove a strip of width 250,000/16,000 ≈ 16 km from the whole thing. That is, we can estimate $\Delta r$ as being of order 10 km. Are the pixels used in the sea ice estimate really that large? That’s terrible resolution. But it is a lot worse than this: in order to lose this much, the area exposed to be lost has to be much larger, because not every coastal pixel will suddenly convert from being covered with sea ice to not being covered with sea ice. In fact, if the previous resolution were close to adequate, the only places where one could lose pixels in a biased way are at boundaries between open water and sea ice right up to the coast, where the open water can reasonably shift a pixel one way or the other around the perimeter. If the prior granularity of ice on the entire boundary was one pixel (every other pixel alternating sea ice and open water) and every sea ice pixel disappeared, only half of the pixels would have been ice to begin with, so the pixel size would have to be at least twice the estimate above, some 30 km, and the entire coastal boundary would have to be covered by an alternating checkerboard of ice and open water 30 km on a side. Obviously this is almost certainly not the case; it would be surprising if the granularity were less than ten times this scale, exposing only 1/10th of the pixels to “sudden” conversion from sea ice to open water. Again, even if 100% of those pixels lost the conversion-algorithm decision, this means we’re up to pixels hundreds of kilometers across, where I would have expected them to be of order one kilometer across at worst. If the granularity of coastal ice-covered regions were of order as little as ten kilometers (where hundreds is a lot more reasonable), there is simply no way to lose 250,000 square kilometers of sea ice from any algorithmic conversion, even one biased to lose every contested pixel!
Unless, of course, my Fermi-estimate arithmetic sucks (entirely possible), but it is pretty simple stuff, and enough to make me wonder if they’ve done sufficient common-sense checking to ensure that their algorithm change produced a reasonable result, or if the shift is the result of some artifact in the code. Or maybe the pixels from which they make the estimate are huge AND there is a bias; that would do it too, but in that case they really need to include an equally huge per-point error estimate in the original graph, because the empirical resolution of the actual data is directly related to the number of pixels that can convert in a shift like this. The shift from a simple unbiased algorithm change SHOULD HAVE been on the order of the uncertainty in the curves themselves, and should not really have been systematic. To me it looks like the algorithm is taking lots of pixels squarely in the middle of coastal regions that are fairly solidly ice covered and just randomly flipping them to ice-free, in a way that would not make sense if one smoothed from the ocean side as well.
rgb
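rgb’s Fermi estimate is easy to rerun. A quick sketch in Python using only the numbers assumed above; nothing here is measured data:

```python
# Rerunning rgb's Fermi estimate with the numbers assumed above:
# 16 million km^2 of ice as a square patch, 250,000 km^2 lost at the
# perimeter. Pure arithmetic; nothing here is measured data.
area_km2 = 16.0e6            # assumed total ice area
lost_km2 = 0.25e6            # ice "lost" going from Version 1 to Version 2
side_km = area_km2 ** 0.5    # side of an equivalent square: 4000 km
perimeter_km = 4 * side_km   # 16,000 km

# Width of the perimeter strip that would have to vanish entirely:
strip_km = lost_km2 / perimeter_km
print(f"implied strip width: {strip_km:.1f} km")   # ~15.6 km

# If only a fraction f of perimeter pixels can plausibly flip from ice
# to open water, the implied pixel scale grows as strip / f:
for f in (1.0, 0.5, 0.1):
    print(f"fraction flipped {f:.0%}: pixel scale ~{strip_km / f:.0f} km")
```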
I’ve always thought JAXA reports too low anyway. I prefer the DMI 30% chart, now sadly removed from Anthony’s Sea Ice page, but still at http://ocean.dmi.dk/arctic/old_icecover.uk.php . It looks like it’s reached minimum, maybe. The 30% chart is produced by DMI’s own people, unlike the 15% OSISAF chart which shows lots of spurious wobbles. Maybe DMI could be encouraged to keep it going.
The Land Expanded Mask and Land Filter reference images (dated 1996) appear to be an application of the well-known Convolution Matrix. These techniques are heavily used in image processing and DSP (digital signal processing).
Very brief intro: http://stackoverflow.com/questions/2219386/how-does-a-convolution-matrix-work
As such, a close relative of what JAXA is doing here (convolutions are commonly used for “edge-detection”, in images) has already received intense scrutiny. The ups, downs and artifacts of the mathematical methodology have been extensively explored & characterized.
Indeed, the citation visible in the second Land Mask (image convolutions are also referred to as masks; the JAXA case may be a 2-D DSP function) is dated 1996 … so this has been around long enough to be a ‘known-known’, in the literature.
There is something resembling the Laws of Thermodynamics at work, with such digital operations. You can trade or convert one form of noise to/from another form … but entirely “disappearing” stuff you don’t like, usually breaks a fundamental principle.
The devil is in the specific side-effects of the specific manipulation employed.
And this, of course, is just an incompetence issue, nothing to do with intentions?
Apparently both versions will be available until 30th Sept. But the timing is indeed stupid. There was no urgency, and blurring the lines just when everyone is focused on it is not good.
It will lead to less confidence in JAXA.
They could hardly have failed to think of this, so someone has a reason. Undoubtedly some loon at the Guardian will manage to compare this year’s V2 to last year’s V1 and claim another record-breaking ice melt for 2013.
Then in two months, when no one is interested any more, there will be a discreet correction. (Or maybe not.)
Alarmists now know the data and the science are a lost argument, so they are going for cynical misrepresentation to cash in politically before there’s time to correct it, e.g. 97% of lies…
“The Land Expanded Mask and Land Filter reference images (dated 1996) appear to be an application of the well-known Convolution Matrix. These techniques are heavily used in image processing and DSP (digital signal processing).”
You can have “a” convolution matrix, but there is no such thing as the Convolution Matrix.
While it does look a bit like a small 2D kernel, what they are doing is not convolution (nor edge detection, which would use something like a difference-of-Gaussians kernel). They just take the lowest cell value in the window if there is land in the frame.
It all looks rather kludgey, so it’s good it’s being removed. It just should have been done later.
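A toy illustration of that distinction, assuming nothing about JAXA’s actual grids: a 3×3 minimum filter is a nonlinear rank operation, and no choice of convolution kernel weights reproduces it.

```python
# Toy data: a 3x3 patch of sea ice concentration (%) whose center has
# been inflated by land spill-over. Not real sea ice data.
import numpy as np
from scipy import ndimage

conc = np.array([[80., 80., 80.],
                 [80., 95., 80.],
                 [80., 60., 80.]])

# Rank (minimum) filter: each pixel becomes the 3x3 minimum. This is
# what the Cho-style land filter does, modulo the land test.
print(ndimage.minimum_filter(conc, size=3))

# Convolution: a weighted *linear* sum over the same 3x3 window. No
# choice of kernel weights can reproduce min(), because min() is
# nonlinear.
kernel = np.full((3, 3), 1.0 / 9.0)   # e.g. a simple box blur
print(ndimage.convolve(conc, kernel))
```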
All the more reason to pay attention to the alternative method of estimating sea ice extent. The microwave sensors have always missed ice underneath melt water. The National Ice Center (NIC) has the mission of ensuring safe navigation of ships in icy water, and their analysts supplement the NASA products with other data, especially satellite imagery and observations from vessels.
NIC charts show ice extent hovering around 6.0 M sq. km for the last several days. The 8/10ths portion (packed ice) of the ice extent is 4.6 to 4.7 M sq. km. In 2012 the extent at this date was 5.1 M, and the 8/10ths part was 2.8 M.
Over the past several years, the minimum has occurred on day 265 +/- 1 day (Sept. 21 to 23). For example, last year on Sept. 21, NIC showed ice extent at 4.2 M. Sq. Km., with the 8/10ths portion at 3.3 M.
“@Jeffrey, yes your suspicions are confirmed:”
Yeah, more BS adjustments that just ‘happen’ to go in their favor. I don’t buy it.
A blink comparator picture of version 1 and 2 would be nice.
Is it possible they hope this will prevent the coming ice age?
“@Jeffrey, yes your suspicions are confirmed:”
>>>>>>>>>>>>>>>
OK, now I have a problem with it. Thanks for drawing attention to the issue.
They installed two new things… that both reduce extent… and at the same time jacked up the past to make it look like there was more ice
…who’s crazy enough to think they didn’t do that on purpose?
The satellite data is much beholden to the orientation of its polarizing lens, which is turned to suit conditions at the pole of interest — causing the Arctic & Antarctic ice growth to appear to be anti-correlated.
The winter maximums increase and the summer minimums decrease. All due to a land mask change? This change will further exaggerate the annual swings shown for the most recent years and “adjust” the minimums lower. The measurement methods and data analysis processes continue to make any current data comparison to historic data an apples vs. oranges endeavor.
One other remark, after reading a few of the latter comments. Yes, the artifact could have been in the prior version, but even so the pixel resolution and bias have to be pretty staggeringly poor to have a 250,000 square kilometer (or more — Anthony is eyeballing and to my eyeballs it looks more like 300,000) effect.
The second comment is even simpler. Why doesn’t anybody ever bother to validate this at the local level? Surely one can afford to fly a plane along the coastline and take high-resolution photos that allow the correct classification of every coastal pixel to be unambiguously resolved. In fact, the misclassification of water-covered ice can similarly be addressed by Monte Carlo sampling of actual pixels from the interior area. If I were writing ANY sort of classifier algorithm, I wouldn’t just be saying “oh, this algorithm sucks, we need a new one”; I’d be saying “let’s get the actual, precisely correct data and compare the algorithms available to see which one best approximates it.”
It doesn’t SOUND like this has been done, although I’m sure that there is a lot of literature out there beyond the mere announcement. If the algorithm has been verified by comparing its accuracy at actually getting the right answer as determined by looking at the crib sheet over a year or so, then while the disappearing ice is still puzzling and difficult to make sense of via scaling arguments, accurate as in agreement with direct observation is accurate as in agreement with direct observation. Accurate as in a supposedly “better algorithm” doesn’t mean anything at all without direct validation.
rgb
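The validation rgb describes is straightforward to set up. A sketch of such a Monte Carlo harness; the classifier and ground-truth callables are hypothetical stand-ins, and no real data or actual JAXA algorithm is involved:

```python
import math
import random

def misclassification_rate(algorithm, ground_truth, pixels,
                           n_samples=1000, seed=0):
    """Estimate an ice/water classifier's error rate on a random sample
    of pixels, with a simple binomial standard error. `algorithm` and
    `ground_truth` are stand-in callables mapping a pixel to True (ice)
    or False (water); `pixels` is any list of pixel identifiers."""
    rng = random.Random(seed)
    sample = rng.sample(pixels, n_samples)
    errors = sum(algorithm(p) != ground_truth(p) for p in sample)
    rate = errors / n_samples
    stderr = math.sqrt(rate * (1.0 - rate) / n_samples)
    return rate, stderr
```

Run it over coastal pixels for Version 1 and Version 2 separately, and the better algorithm is simply the one with the lower validated error rate.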
@rgbatduke “…staggeringly poor to have a 250,000 square kilometer (or more — Anthony is eyeballing and to my eyeballs it looks more like 300,000) effect.”
My first eyeball SWAG was 1/3 of a million, but I decided to be conservative; your estimate is probably closer.
The changes increase the amplitude of the seasonal cycle of the ice extent.
The Feb/March maximum is up 300K to 400K (in all years, though the size and timing of the change vary somewhat from year to year). And the September minimum is lower by 300K to 400K (with 2012 probably having the biggest drop).
It was not the right time of year to bring this change in (but then, it would have been even worse to do it in late September, for example).
They have been having problems lately with the data from the WindSat satellite they were using, and now they have switched to the AMSR2 satellite and revised all the data to what AMSR2 would have produced.
Friends:
rgbatduke said at September 6, 2013 at 12:46 pm
An important and irrefutable statement repeated here in case anybody missed it.
Richard
Anthony Watts says:
@Jeffrey, yes your suspicions are confirmed:
http://sunshinehours.files.wordpress.com/2013/09/jaxa-2013-difference-v2-minus-v11.png
===
What we need to see is the same thing for at least 10 years.
Is this data available as an ice total? I gave up on JAXA because it meant processing a complex grid structure, area weighting, etc.
Greg Goodman (September 6, 2013 at 12:29 pm) protests:
“You can have ‘a’ convolution matrix, but there is no such thing as the Convolution Matrix.”
The relatively accessible Wikipedia article uses it in its intro: “Computing the inverse of the convolution operation is known as deconvolution.” https://en.wikipedia.org/wiki/Convolution
The relatively formal Wolfram MathWorld also uses “the” extensively. http://mathworld.wolfram.com/Convolution.html
I’d have to see more of the JAXA literature to say whether they are actually doing a matrix transform or not. My first-blush assumption was that they are.
It would be natural and effective to use an edge-detection-related transform (matrix) to deal with land-ocean interface effects.
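For reference, the operation the commenters are debating: a discrete 2D convolution forms a weighted linear sum over the window, whereas the land filter in the JAXA excerpt takes a minimum, a rank operation. With $f$ the concentration field and $h$ a 3×3 kernel:

$$(f * h)(i,j) = \sum_{m=-1}^{1}\sum_{n=-1}^{1} f(i-m,\, j-n)\, h(m,n), \qquad g(i,j) = \min_{|m|\le 1,\;|n|\le 1} f(i+m,\, j+n).$$

No choice of weights $h$ makes the linear sum on the left equal the min on the right, which is the substance of the objection above.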