Guest post by Steven Goddard
I have been noticing in recent weeks that NSIDC extent is much closer to their 1979-2000 mean than NANSEN is to their 1979-2007 mean. This is counter-intuitive, because the NANSEN mean should be relatively lower than NSIDC – as NANSEN’s mean includes the low extent years of the 2001-2007 period. Those low years should have the effect of lowering the mean, and as a result I would expect the NANSEN current extent to be equal to or above the 1979-2007 mean.
I overlaid the NANSEN graph on top of the NSIDC graph below, and it is easy to see how large the discrepancy is. In fact, the NSIDC mean sits at about one standard deviation below the NANSEN mean – which makes little sense given their base time periods. It should be the opposite way.
(Note – the NANSEN and NSIDC measuring systems are not identical, and I had to make a shift along the Y-axis to line them up. However, the X and Y scales are identical for both graphs in the overlay image.)
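The arithmetic behind the expectation above can be sketched with made-up numbers: adding the low 2001-2007 years to the baseline pulls the 1979-2007 mean down, so the same current extent should sit relatively higher against it. (All values below are hypothetical, purely to illustrate the point.)

```python
# Illustrative only: hypothetical September extents (million km^2),
# roughly flat through 2000, then lower in 2001-2007.
extents = {year: 7.0 for year in range(1979, 2001)}
extents.update({year: 6.0 for year in range(2001, 2008)})

mean_1979_2000 = sum(extents[y] for y in range(1979, 2001)) / 22
mean_1979_2007 = sum(extents[y] for y in range(1979, 2008)) / 29

current = 6.5  # a hypothetical "current" extent

# The low 2001-2007 years pull the longer baseline down...
assert mean_1979_2007 < mean_1979_2000
# ...so the same current extent sits closer to (or above) the 1979-2007 mean:
print(current / mean_1979_2000)  # ratio against a 1979-2000 style baseline
print(current / mean_1979_2007)  # higher ratio against a 1979-2007 style baseline
```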
Nansen uses a different algorithm to calculate sea ice extent. The algorithms differ in the way they combine the raw data to estimate extent. As long as one consistently uses the same algorithm, the stories are all the same, but the details can differ, more so at certain times of year. One such time is when there is a diffuse, broken-up ice edge and melt is starting.
I suspect the Bering Sea is probably the region responsible for most of the differences. While our algorithm shows the region as mostly “ice-covered”, the ice cover there is very fragmented, broken up, and thin.
….
The other thing that’s important to mention is that I was referring simply to the discrepancy between how close the current lines are to climatology. However, there is also generally an “offset” between algorithm outputs – a bias or mean difference between the algorithms that is fairly consistent throughout the record. That is why NSIDC’s climatology is different than the Nansen climatology.
The important thing to remember is that there is a good consistent record from the passive microwave data as long as you consistently use the same algorithm and the same processing. But you can’t mix and match products.



Why would you accept just 15% coverage as full coverage? Maybe 85% sea ice would qualify as coverage, but 15% ice coverage is open water to me.
George
Considering that 85% of ice is below water, 15% coverage sounds reasonable. I wouldn’t want to captain the Titanic through an ocean with 15% sea ice.
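For what it’s worth, the 15% figure is the conventional cutoff used to turn gridded concentration data into an “extent” number. A minimal sketch of the distinction between extent and area (the grid values below are hypothetical):

```python
# Sketch: sea ice *extent* counts the full area of every grid cell whose
# ice concentration is at or above a cutoff (commonly 15%); *area* instead
# weights each qualifying cell by its concentration.
CUTOFF = 0.15

def extent_and_area(concentrations, cell_area_km2):
    """concentrations: fraction of each grid cell covered by ice (0..1)."""
    extent = sum(cell_area_km2 for c in concentrations if c >= CUTOFF)
    area = sum(c * cell_area_km2 for c in concentrations if c >= CUTOFF)
    return extent, area

cells = [0.0, 0.10, 0.20, 0.60, 0.95, 1.0]  # hypothetical grid cells
ext, area = extent_and_area(cells, cell_area_km2=625.0)  # ~25 km x 25 km cells
print(ext, area)  # extent >= area, since sub-cell open water is ignored
```

A cell that is only 20% ice contributes its whole 625 km² to extent but only 125 km² to area, which is why the two numbers (and different algorithms’ thresholds) can diverge most at a diffuse ice edge.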
John F. Hultquist (22:18:51) :
I know it perfectly well: a massaging algorithm is one that adjusts reality to match wishes. Chances are they squeeze it as much as possible in order not to be branded as deniers by the people who pay their salaries.
I don’t understand Dr. Meier’s explanation. If the ice extent means calculated by NSIDC and NANSEN differ so much from what would be expected because the algorithms used and the treatment of data differ between the two groups, why does the current 2009 extent estimate match perfectly to this point? Is this just a freaky year in which the different algorithms etc. happen to produce the same results, or am I missing something?
vg (22:53:23) :
What the heck is going on? This was posted by S Goddard just last week
http://eva.nersc.no/vhost/arctic-roos.org/doc/observations/images/ssmi1_ice_area.png. Quoting this graph, he said that NH ice was back within normal limits (1 SD). So is it now a fact that Norsex has again lowered the WHOLE 2008 and 2009 graphs, or raised the mean graph, to make it appear that NH ice is still below normal? (as I warned many times already and recorded here
http://mikelm.blogspot.com/2007/09/left-image-was-downloaded-from.html
Unfortunately for your agenda it didn’t happen! What actually happened to the Nansen graph was that because of an error the data on their lower graph where it is plotted along with the mean was not correct, this was noticed because the upper graph (which should be exactly the same) was different. The error was corrected and the data plotted on the lower curve then matched the correct data that had always been on the site!
Ever wonder why Goddard only showed the plot that changed and not the other one? Another guy with an agenda.
Here’s the change in the actual data curve that should have accompanied the one shown by Goddard above:
http://wattsupwiththat.files.wordpress.com/2008/12/nansen_sea_ice_area1-520.gif
ralph ellis (01:23:57) :
.
>>Heat or no heat the maximum winter ice extent has no real
>>significance in my opinion as the arctic is constrained by landmass.
But thickness and continuity may be a factor. If most ice is destroyed by wind and flushing, rather than melting, then the ice’s resistance to wind and tide may be a crucial factor.
If the ice is contiguous and strong, it may resist wind flushing. If it is weak, it may be flushed out easily. So you might have a situation where the thickness reaches a ‘tipping point’ (sic) of weakness where great swathes of it are flushed out of the Arctic Ocean (2007 and 2008), whereas if it were a little more contiguous and robust it would resist this (2009) and much more ice would stay in the Arctic.
So far this year the outflow through the Fram has been strong, if anything stronger than at the same time in 2007.
Examples:
http://i302.photobucket.com/albums/nn107/Sprintstar400/20090309-20090315.png
http://i302.photobucket.com/albums/nn107/Sprintstar400/Drift-1.jpg
This is one reason why the current extent is not dropping very fast: there is so much ice flowing out into the Atlantic. If this is correct, then the ice remaining in the Arctic is being weakened, with significant consequences for later in the year.
Charles: I’m just picking on you. You guys do a great job. Dr Meier is a real pro and it’s nice to have his attention. At least he’s using the “peer review” that we all provide. Thanks. I’ll think before I post again. This Ice data “algorithm” language is just another “forcing function” adjustment factor.
Phil,
You are the one with the agenda, and you made no attempt to explain the point of this article, which is why the ratio of NANSEN current/mean is significantly lower than the ratio of NSIDC current/mean – when it should be the other way around.
You also missed the fact that in Anthony’s original article (linked to in this one) Stein Sandven at NANSEN gave an explanation for why they changed the graph. His explanation did not give any insight into the question I am raising in this article, and your accusatory nature and non-sequitur response also does nothing to shed any light.
I have generally observed a discrepancy between the sum of the Arctic ice regional areas and the total area reported on the site “Cryosphere Today”. The summed areas are about 300,000 to 400,000 km² larger than the official total area. Could someone explain that to me?
vg (22:53:23):
“It appears to be run by one person, so check before you believe.”
Repost of my posting at the “Bad news from NSIDC” thread:
[I would like to point you to this article I found today in the net pages of the biggest Norwegian dead trees daily; VG:
http://www.vg.no/nyheter/utenriks/klimatrusselen/artikkel.php?artid=542650
From the article (my translation):
“Climate Scientist: Ice Free Arctic by 2100
BERGEN (VG): The experienced climate scientist Ola M. Johannessen (70) was baffled when he calculated when the Arctic Ocean will be ice free year round.
“- It shows, if we put the numbers into that formula, that we are going to have an ice free Arctic – summer as well as winter – already in this century”, says Johannessen.
Now, it is not just any kind of formula the Research Director at the Nansen Center for Climate Research has developed.
He has compared the annual ice extent in the Arctic Ocean with the annual concentration of CO2 in the atmosphere.
AND THE RESULT IS FRIGHTENING:
“In the beginning of the century we have some natural variations we cannot account for. But the last five decades there is a very strong statistical correlation between the measured CO2 concentration and the actual measured ice extent.”
BAFFLED
“Yes, if we put the graphs of the ice extent and the CO2 content on top of each other – then the connection is apparently striking.”
“- I was certainly quite baffled when I saw it,” says Johannessen to VG.
His analysis shows that the increase in CO2 alone may account for as much as 90 per cent of the ice decline in the Arctic.
On this basis, he has simply been able to construct a formula which suggests how much a given increase in CO2 content in the air affects the ice extent. Thereby he can simply enter both values into the formula on his PC and look at when the formula says there is no more ice left:
“- If we use my statistic formula, all ice will be gone, even in winter, when CO2 concentration in the atmosphere reaches 765 parts per million (ppm). Today the concentration is about 385, but 765 will most probably be reached by year 2100, if we don’t execute drastic cuts.”
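As a sketch of what such a statistical extrapolation involves (this is not Johannessen’s actual formula or data; every number below is made up), one can fit ice extent linearly against CO2 concentration and then solve for the concentration at which the fitted line reaches zero extent:

```python
# Toy extrapolation: fit annual ice extent against CO2 concentration with
# ordinary least squares, then solve for the CO2 level at which the fitted
# straight line predicts zero ice. All numbers are hypothetical.
co2 = [315, 330, 345, 360, 375, 390]          # ppm
extent = [7.5, 7.2, 6.9, 6.6, 6.3, 6.0]       # million km^2

n = len(co2)
mx = sum(co2) / n
my = sum(extent) / n
slope = sum((x - mx) * (y - my) for x, y in zip(co2, extent)) / \
        sum((x - mx) ** 2 for x in co2)
intercept = my - slope * mx

# CO2 level at which the straight line reaches zero extent:
zero_co2 = -intercept / slope
print(round(zero_co2))  # prints 690 for these made-up numbers
```

The point of the sketch is that the zero crossing lies far outside the fitted data range; the extrapolation step, not the fit itself, is what carries the enormous uncertainty.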
Photo captions (top): “Arctic – without ice. If the shocking calculations of research veteran Ola M. Johannessen are correct, the Arctic will be without ice by year 2100. Summer as well as winter.” (bottom): “WARMER: During the last ten years sea level has risen by about 3 centimeters.”
Anthony, I don’t know if this is the right thread, but I think this deserves some attention. It looks like we have another contender for the prize of the boldest ice prediction for the Arctic.
Previously we had Dr. Serreze with his prediction of an ice free North Pole in the summer of 2008. Then we have big Al with his prediction of an ice free Arctic by 2013.
However, I think Johannessen is in a class of his own. Please notice, he not only predicts an ice free Arctic in summer, he predicts an ice free Arctic in WINTER.
It is conceivable that an ice free Arctic in summertime may occur if the atmosphere and the oceans warm by a few degrees C. This has probably also happened earlier in the Holocene, according to archeological evidence. However, in the wintertime there is bitter cold and darkness 24/7 all over the Arctic for the good part of 6 months. Thanks to the invaluable research efforts of the Catlin Arctic Survey team, we now know that temperatures in the Arctic, even in March/April, are between -25 and -40 degrees C. How is a doubling of CO2 concentration going to increase temperatures enough to avoid freezing of sea water in the Arctic winter?
According to reasonably accepted science, a doubling of CO2 will increase temperatures by about 1.2 C, give or take a few tenths. Even if we accept the baseless and highly unlikely assumption of a climate sensitivity of 3, the temperature increase will not be more than 3.6 C. So how exactly is this temperature rise going to stop water from freezing in the Arctic during the 6 months of winter temperatures below -20 or -30 degrees?
What this exercise in statistical extrapolation shows is how absurdly far from reality you can end up if you just extend short-term trends to infinity.
Anyone who thinks he can up the predictions even more?
Do we have a winner?]
The article presents this guy as the Research Director at the Nansen Center, but whether he is responsible for the NORSEX graphs I don’t know.
What do you think, just another guy with an agenda?
Looking at amsr-e, it strikes me that for a good while in May last year 2008 was better than even 2002 and 2003, but then tanked in August and September, in part no doubt due to the “first year ice” phenomenon.
Right now, 2009 is even better than 2008, 2002, and 2003. Right now.
I think it is going to be very interesting to see how it does in August and September this year. If the NSIDC (much maligned) claims about “second year ice” hold true, then I would expect the 2009 summer minimum to end up better than 2008 but not as good as 2005. On the other hand, if 2009 improves on the 2005 minimum, that would be a real milestone worth a little chest-pounding, and it would be quite interesting to see what tack the AGWers would take in explaining it.
And if 2009 can beat 2003, then yippee and stfu about arctic ice in “crisis” for awhile.
But I’m somewhat conservative. I think second year ice does matter at least a little. Not as much as first year ice, but some. I think we’re going to end up well above 2008, and in the ballpark of the 2005 minimum, but I’m not willing to predict if it will be a little above or a little below.
Very interesting!
JAN, this is utterly unbelievable. Whatta scientific method! I saw a graph relating global temperatures to Iceland’s population, and its regression was maybe R = 0.98.
Recently I read a pretty cool sentence: “mainstream science is on the verge of being overturned by the efforts of a group of dedicated amateurs” (The Australian Financial Review, April 23) – so let’s roll!
S Goddard my silly… should check myself LOL
Sorry, vg
No pun intended!
What is the scientific value of the Catlin expedition compared with the value of the very recent Alfred Wegener Institute DC-3 flight across the polar area? In a relatively short time the DC-3 made a huge number of measurements compared with the restricted number of measurements from Catlin, and the distance/area surveyed by AWI was much longer/bigger. The fact that the DC-3 measurements were done in a very short time gives me the feeling that they give at least a better idea of the general ice thickness situation, because Catlin needs too much time, and in that long time too many natural changes can happen in the ice situation.
Conclusion’ cadre displaying the vindication of that’s proven.
The scientific value of Catlin? Scientists got paid. You can’t have scientists without figuring out ways to get them paid. Grants got granted. Paper shufflers got to shuffle paper. Websites were hit and newspapers were sold. Heck, a grand old time was had by many!
Seriously, re that comparison it would be interesting to know the lead times on the two projects as to planning and approval. Probably the Catlin people had no idea of the other project.
The Catlin expedition had the potential to be a very useful adjunct means to calibrate/verify the Wegener Institute airborne study. Any time you can ground truth an airborne survey, you greatly increase the confidence and value of the data collected from the air.
Imagine having a 300 km or greater transect of the ice thickness using ground-penetrating radar, calibrated with numerous auger measurements along the line. If you flew an airborne line along the exact same path as the surface transect, you would have unimpeachable calibration data to supplement the interpretation of the EM results.
It seems that providing useful quantitative scientific data was not the primary mission of the Catlin expedition. That is truly unfortunate.
It is interesting to see the 1979–2007 average. But it would also be interesting to see the entire range of positions as a time series.
h.oldeboom (13:17:59) :
What is the scientific value of the Catlin expedition compared with the value of the very recent Alfred Wegener Institute DC-3 flight across the polar area? In a relatively short time the DC-3 made a huge number of measurements
The radar measurements are very inaccurate:
http://www.awi.de/fileadmin/user_upload/Research/Research_Divisions/Climate_Sciences/Sea_Ice_Physics/pdf_poster/EM-Bird.pdf
Top right corner of PDF gives comparison
Errors are often 25 cm, and up to 1 metre in the short comparison given. Would you trust this data?
bill,
Neither of these instruments uses radar. The graph you refer to compares thickness calculations for the airborne vs. the “ground based” electromagnetic conductivity instruments. There is no ice auger thickness measurement to decide whether one or both of the measurements are in error.
The size of the EM bird vs. the ground-based EM31 accounts for some of the observed difference. One is measuring thickness over an approximately 2 m diameter circle, whereas the other is measuring thickness over a 4 m diameter circle. Variance between the two instruments is to be expected.
The good correlation with the EM31 does vouch for the methodology and instrumentation. If I were reviewing a report or publication, I would insist on seeing several calibrations against ice of known thickness. The difference between the two instrument measurements suggests that there may be variations in conductivity within the ice, such as cracks or inclusions of brine or seawater.
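The footprint effect can be illustrated with a toy example (the thickness profile below is entirely hypothetical): averaging over a larger footprint smooths local variation, so two instruments with different footprints will disagree over rough ice even if both are accurate.

```python
# Toy illustration: the same thickness profile seen through two footprint
# sizes. The larger footprint averages over more ice, so it reports a
# smoother (lower-variance) series of thicknesses.
import statistics

profile = [1.0, 1.2, 2.5, 0.8, 1.1, 3.0, 0.9, 1.0, 2.2, 1.3]  # metres, hypothetical

def footprint_means(profile, width):
    """Mean thickness seen by a footprint spanning `width` samples."""
    return [statistics.mean(profile[i:i + width])
            for i in range(len(profile) - width + 1)]

small = footprint_means(profile, 2)  # e.g. a ~2 m footprint
large = footprint_means(profile, 4)  # e.g. a ~4 m footprint

# The wider footprint smooths out ridges and cracks:
print(statistics.pstdev(small), statistics.pstdev(large))
```

Over smooth, level ice the two series would converge, which is one reason calibration against ice of known, uniform thickness matters.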
Shouldn’t the ice be above normal given the lull that the Sun has been in and the low(er) temperatures experienced across the region this past winter? It seems weird that given those factors it’s still below the average.
Things take time to heat up and cool off. Big things take longer.
Heat transfer in a nut shell.
Earle Williams (20:15:13) :
Not sure why I wrote radar! but thanks for the correction.
On this page the sledge-pulled electromagnetic (EM) induction sounding device is compared against bored holes. There are still errors of over a metre shown.
http://www.awi.de/en/research/research_divisions/climate_science/sea_ice_physics/subjects/ice_thickness_measurements/
Pardon me, but on Cryosphere aren’t we seeing the same weird stretches of open water we were seeing when the satellites were screwing up? How is it we can trust the data we are getting?
pkatt (16:56:10) :
Pardon me, but on Cryosphere aren’t we seeing the same weird stretches of open water we were seeing when the satellites were screwing up?
No.
http://arctic.atmos.uiuc.edu/cryosphere/NEWIMAGES/arctic.seaice.color.000.png