The Ice Who Came In From The Cold

Guest post by Willis Eschenbach

A few days ago, Steve Goddard put up a post called “Does PIOMAS Verify?” In it, he compared the PIOMAS computer model estimate of the Arctic ice volume with the SIDADS satellite measured Arctic ice area. He noted that from 2007 on, the two datasets diverge.

Intrigued by this, I decided to compare the PIOMAS ice volume dataset with the Cryosphere Today (CT) Arctic ice area dataset.  Here is that data:

Figure 1. Arctic ice area (red line) from Cryosphere Today. Black line is a 6 year Gaussian average.

When I compared the two datasets, I expected to find something curious happening with the PIOMAS dataset. Instead, I found a puzzle regarding the CT dataset.

I compared the CT area dataset with the PIOMAS dataset, and I found the same thing that Steve Goddard had found. The datasets diverge at about 2007. So I took a hard look at the two datasets. Instead of a problem with the PIOMAS volume dataset, I found the CT area dataset contained something odd. Here is a plot of the CT daily data with the daily average variations removed:

Figure 2. Cryosphere Today daily ice area anomaly. Average daily variations have been removed.

The oddity about the data is what happens after 2007. Suddenly, there is a strong annual signal. I have put in vertical black lines to highlight this signal. The vertical lines show the end of September of each year. Before 2007, there is only a small variation in the data, and it does not have an annual signal. After 2007, the variation gets large, and there is a clear annual aspect to the signal. The area in September (the time of minimum ice) is smaller than we would expect. And the area in March (the time of maximum ice) is larger than we would expect.

I considered this for a while, and could only come to the conclusion that there was some kind of error in the CT dataset. So I decided to look at another dataset, the NOAA SIDADS dataset.

Again, I removed the monthly signal, leaving only the anomaly. Here is that result:

Figure 3. SIDADS monthly ice area anomaly. Monthly variations have been removed.

Again we see the same oddity after the start of 2007, with a large annual variation where none existed before 2007. In the SIDADS dataset the variation is even more pronounced than in the CT data.
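The anomaly computation behind Figures 2 and 3 — subtracting the average seasonal cycle so that only departures remain — isn't spelled out in detail in the post, but the general technique can be sketched as follows. This is only an illustration on synthetic data; the actual CT/SIDADS processing may differ:

```python
import numpy as np

def remove_seasonal_cycle(values, period=365):
    """Subtract the mean seasonal cycle from a regularly sampled
    series, leaving the anomaly.  `values` is a 1-D array whose
    index i corresponds to position i % period within the cycle
    (e.g. day of year for daily data, month for monthly data)."""
    values = np.asarray(values, dtype=float)
    pos = np.arange(len(values)) % period
    # Climatology: the mean of all observations sharing a position
    # in the cycle (e.g. the average of every "March 1st").
    clim = np.array([values[pos == p].mean() for p in range(period)])
    return values - clim[pos]

# Toy example: four years of a pure seasonal cycle plus a slow trend.
t = np.arange(365 * 4)
series = 10.0 + 3.0 * np.sin(2 * np.pi * t / 365) + 0.001 * t
anom = remove_seasonal_cycle(series)
```

After the subtraction, the repeating seasonal swing is gone and only the trend and irregular variations survive — which is why a strong annual signal *reappearing* in an anomaly plot, as in Figures 2 and 3, is so odd: it means the post-2007 seasonal cycle differs in shape or amplitude from the long-term average cycle that was removed.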

So that is the puzzle. What has changed? Are they using a new satellite? If so, has the changeover been done properly? Since the smallest of the data has gotten smaller and the largest of the data has gotten larger, is the average data still valid? Just what the heck are we looking at here?

Despite searching, I have not been able to find the answer to this question. However, I have great faith that the assembled masses of the WUWT readership will find it very quickly. (And then some of the readers will likely tell me that this shows I am a layman and a fool, and that I should have been able to find the answer easily on my own … so sue me.)

June 1, 2010 2:16 pm

Willis Eschenbach says:
June 1, 2010 at 1:24 pm
899 says:
June 1, 2010 at 3:12 am
So Willis, what was the Sun doing in 2007?
“Taking a nap?”
______________________________________________________
That was Maunder, yes?
Sunspots and coronal holes do not get on well. In Dec 2006, Feb 2007 and June 2007, we can see a drop in SSN; this is precisely when the solar wind from coronal holes becomes stronger and provides more warming: C23/24:
http://www.solen.info/solar/solcycle.html
Note the higher SSN in Nov 2006 and lower solar wind speed, then see it pick up at the above dates:
http://www.solen.info/solar/coronal_holes.html
If you look at the velocity in the last 2 yrs, it has generally been very low; this is not the norm for a solar minimum with a lower SSN count. Typically, minimums with higher SSN are more likely to have colder winters, due to suppressed coronal hole activity, and maximums with higher SSN are more likely to have colder winters. Last winter had some significant gaps in coronal hole activity, and a quick tally will show that yearly hole totals have been well down from the norm over the last 2 years. I use N. Hemisphere winter as a reference, as this is the largest global temp variable.
The “modern winters” that we had a run of until recently can all be seen to have much higher solar wind velocity occurring through them; scorching summer months again can be seen occurring at high solar wind speed times. It’s global warming in a nutshell if you think about it.

phlogiston
June 1, 2010 2:23 pm

And Tsonis predicted this several years ago (I don’t have the paper to hand), unlike the Asterix and soothsayer “this I had also foreseen” type of retro prophecy from the AGW camp.

Richard M
June 1, 2010 2:25 pm

There was an article awhile back that mentioned an ice arch that had melted and let the wind blow more ice into the open ocean. Clearly, if this arch was a blocking mechanism and it suddenly disappeared, that could change the entire dynamics of the ice melt. If this is true, then don’t expect any changes unless the arch reforms.

Enneagram
June 1, 2010 2:32 pm

Willis Eschenbach:
In response to this, NASA created a new algorithm and has used it to synthetically create channel 4 data from October 1st, 2007 onward.
Wow! Why not replace NOAA, GISS, Hansen, Mann et al. with an algorithm, and perhaps IT would openly provide its invented data.
Just imagine!: Instead of a Global Government, an algorithm; but that would be an Al- Gore-rithm.
We, third world’ers, better just watch. These things will become dangerous in the near future.
We’ ll need a psychiatrist to post here in WUWT.

₳ɳʊ
June 1, 2010 2:45 pm

I don’t think many people predicted the record melt rates of Arctic sea ice this Spring:
http://www.ijis.iarc.uaf.edu/seaice/extent/AMSRE_Sea_Ice_Extent_L.png
The fastest melt rate in the decade of deepest summer melt.
They need more information on the warm ocean waters underneath the sea ice – the minus 20° C air temperatures right above some sea ice has little to do with the melting from below.
The US Navy realizes the amount of summertime ice coverage has decreased by half over the past 50 years, and the icecap is also about 50 percent thinner, resulting in greater seasonal variations. The climate changes are expected to cause several weeks of annual “ice-free” conditions, meaning there will be less than one-tenth ice coverage. This would likely bring a flotilla of trans-Arctic container shipping, fleets of fishermen and even the ill-advised thrill seekers.
And all these will meet in a resource-rich region buried beneath disputed claims, untested treaties and amidst five nations vying for their share of sovereignty.
Indeed, the Arctic is opening. The fundamental question is not if, but when?
And the answer is, sooner than you may think.

http://www.afji.com/2010/03/4437078
“The Arctic is changing, and it is changing rapidly,” said Rear Adm. David W. Titley, oceanographer of the Navy.
As few as four years ago, leading experts anticipated ice-free summers by the end of the century. Now, such conditions are expected in the 2030s — and many key scientists say those may be conservative estimates; some put the earliest ice-free summer at 2016.
With rights to much of this resource-rich region in dispute, and previously inaccessible areas open to exploration for their abundant reserves of oil and natural gas, the Arctic has been likened to a 21st century gold rush.
Diminishing ice fields mean Russia has a new active border to protect, and one that is close to many of its key oil and gas fields. As such, Russian Bears have been flying frequent patrols in the region and missile tests have been conducted near the North Pole.
“This opening of the Arctic may lead to increased resource development, research, tourism, and could reshape the global transportation system,” the road map says. “These developments offer opportunities for growth, but also are potential sources of competition and conflict for access and natural resources.”
Even four to six weeks of ice-free conditions would offer lucrative resource extraction, fishing and commercial shipping. That means the Navy must prepare itself for potential mission requirements in the far north.
But as the Navy looks to map and forecast oceanographic conditions, all eyes are not on multibillion-dollar submarines or satellites, but instead on a few small, relatively inexpensive gliders. These $110,000 gliders collect salinity, depth and temperature data and can measure optical properties such as water clarity. The information is then fed real-time into ocean models.
Such data is invaluable to naval operations in the Arctic. For example, a glider released from the Healy in 2009 recorded temperatures dropping from 3.5 degrees Celsius at the surface, to -1.5 degrees at 100 meters then rising to 0.5 degrees in deeper waters.

The glider fleet will expand from 20 to 170 by 2015.

These gliders are very interesting – and they are built by the University of Washington.
Yup, that University of Washington:
http://psc.apl.washington.edu/
The Navy is getting serious about understanding the Arctic sea ice, and so is collaborating with the experts.

Jbar
June 1, 2010 4:22 pm

These annual cycles happen to coincide with a sharp decrease in the annual arctic sea ice minimum beginning in 2007.
Willis E says: “And why would it appear just when the average area started to rise?”
Offhand, I would say it appears precisely because the average area started to rise at the same time sea ice was making record September lows.
Arctic ice minimum has been decreasing faster than the ice maximum on average for 30 years, from 9.5 MM sq km to about 10.5 MM more recently, and there is no sign of this differential rate of change abating, so I wouldn’t be surprised if this annual cycle becomes a regular feature of Arctic ice. The lines have to get from those deepening minima up to the lagging maxima somehow. On the OTHER hand, the 2007/8 winter was a sharp La Niña cooling phase of ENSO, which may explain why winter ice increased so much over 2006 even though 2007 summer ice was a record low.
So who knows?! Interesting observation though.
As long as people are making predictions, arctic summer ice will reach zero between 2040 and 2050. Hope to still be here.

3x2
June 1, 2010 4:37 pm

₳ɳʊ says: June 1, 2010 at 2:45 pm
They need more information on the warm ocean waters underneath the sea ice – the minus 20° C air temperatures right above some sea ice has little to do with the melting from below
Not sure that minus 20 even begins to describe the Arctic. Imagine open (relatively ice free) water being whipped up by 100mph winds into an energy loss frenzy. Imagine the same water “isolated” by a few feet of “calming” ice. A SWAG of the differing energy loss I will leave to others.

Jbar
June 1, 2010 4:39 pm

Wayne,
Might have been Wikipedia, http://en.wikipedia.org/wiki/Ixtoc_I

Malaga View
June 1, 2010 5:28 pm

If you want to get a handle on the veracity of satellite data then I would suggest taking a look at the Magic Java site: http://magicjava.blogspot.com
The bottom line is that the quality of “the data” is unknown because it cannot be verified.
chopbox says:
June 1, 2010 at 8:17 am
Thanks to toby (June 1, 2010 at 6:27 am) for providing that clip of Dr. David Barber of the U. of Manitoba. What I found most interesting about that clip is confirmation from a scientist in the field that the satellites may be producing data that should not be trusted.

So the validity of satellite data is moving from UNKNOWN towards FALSE.
http://magicjava.blogspot.com/2010/04/three-valued-logic-and-irreproducible_29.html
A concrete example of the spreading of Unknown results in published research is provided by NASA’s claims of increased yield due to synthetic channel 4 data. We’ll assume that these claims are True and that yields are in fact increasing. However, even with this assumption, we cannot demonstrate that yields should be increasing. Because it cannot be verified that the synthetic channel 4 data is valid, we cannot verify that the synthetic data causes bad data to pass QA or good data to fail QA. The quality of the data in these increased yields is Unknown. This is because the quality of the synthetic data is Unknown.
This cascading of the Unknown value continues through anything that uses the data from these increased yields. In practice, it turns out that all processes referred to by NASA as “Level 2” or higher that use Aqua AMSU data will be infected by the Unknown values. That is, all such data sets have an Unknown truth value themselves due to their dependence upon the increased yield data. These “Level 2” products include:
● Temperature profile from 3 mbar (45 km) to the surface.
● Water vapor profiles.
● Snow and ice coverage.
● Cloud liquid water.
● Cloud-cleared IR radiances.
● Rain Rate.
● Ozone.
● Carbon Dioxide Support Products.

3x2
June 1, 2010 5:36 pm

Jbar says: June 1, 2010 at 4:22 pm
These annual cycles happen to coincide with a sharp decrease in the annual Arctic sea ice minimum beginning in 2007.
Willis E says: “And why would it appear just when the average area started to rise?”
Offhand, I would say it appears precisely because the average area started to rise at the same time sea ice was making record September lows.

Will try my previous comment on a different form …
(A) By Sept 07 we have record low ice area (for whatever reason) [the initial shock/disturbance]. This exposes (?) 2.5 million Km2 more of the incoming “warm” water to Arctic conditions (storms). Energy loss (water to atmosphere) is fast and large (see the Bob Tisdale SST graph for that period and view any increase as energy loss rather than a warming).
(B) Having lost energy so quickly, the following freeze is rapid and by mid (no sun) season there is plenty of ice. Ice has two aspects at this time of year, insulation and isolation. Water under the ice is protected from storms so severe that they would kill any exposed life.
(C) Come sunrise (08), water under the ice is “warmer” and that, together with insolation (rapidly reducing albedo), means the ice melt over “summer” is rapid.
(D) Sept 08 – ice back toward Sept 07 level though “exposed water” has reduced to (say) 2 Million Km2. Same annual process (above) only slightly reduced.
(E) Sept 09 – back through same cycle though “exposed water” has reduced to (say) 1.5 Million Km2. Same annual process ….
(F) Sept 10 – back through same cycle though “exposed water” has reduced to (say) 1 Million Km2. Same annual process …
We progress from the original 07 “shock” back, year on year, to an ever narrower freeze/thaw range until the next “shock”.

June 1, 2010 6:17 pm

Willis Eschenbach says:
June 1, 2010 at 4:42 pm
What is so different between 1990-1995 and 2004-2010, apart from a big +ve spike in 2007, and a -ve spike mid 1992 (and 0.1C base line raise)?
They look rather similar otherwise.
http://wattsupwiththat.com/2010/06/01/the-ice-who-came-in-from-the-cold/#comment-401337

D Gallagher
June 1, 2010 6:22 pm

Perhaps I am really slow, but when someone talks about an anomaly, I assume that they are talking about a departure from normal, and I expect to see a graph that is centered around zero (average or normal). I am looking at this graph and seeing an anomaly running around 8-11 million km^2 for years on end. That’s not really an anomaly.
You say that you removed the average daily variation; I’m not sure what that means exactly, but I would expect a much greater daily variation at the time of the minimum. Variation can only occur at the linear edge of the ice that’s open to water. When the entire Arctic basin is frozen solid in the winter there isn’t very much variation at the boundaries; the shoreline is static. During the minimum, there is much greater boundary to add variability.
Question – what was the purpose of backing out the average variability (monthly or daily)? It seems to me that you have mixed apples and oranges here. I suspect that the data is showing more very short term variability starting in 2007 which has nothing to do with climate, but everything to do with a change in methodology (different sensor or algorithm) that has more short term noise. Perhaps the daily data is more accurate, and therefore is capable of showing greater variability from day to day.
As for this “being what you would expect”, nonsense. There is just more noise at the edges, and there is more edge or less edge on a seasonal basis. I own a large pond and when it is frozen solid there is no daily variation in the area, period; it only varies when it is melting.

June 1, 2010 7:08 pm

Willis Eschenbach says:
June 1, 2010 at 4:42 pm
“Nature just doesn’t seem to work that way, whether the records are of rainfall, or of temperature, or of any other natural variable. If you have another example of that happening, please produce it.”
These clusters in particular are following the 17yr coronal hole cycle. You can track them backwards on CET for centuries without many exceptions. Look at 4/5yr chunks from 2005, 1988, 1971, etc.: clusters of warmer winters every 17yrs, and yearly temps usually are higher as a result. It breaks down through the Dalton and Maunder minima though:
http://climexp.knmi.nl/data/tcet.dat
There are many monthly strings in temperature series, but the 17yr is the most robust. Looking at every 17th July in CET, going back from 2006 and restarting at 1852, tracks a string of very hot Julys over 350yrs. This is one the cicadas are following.
There is a 23yr string: look on CET from Nov 2006 back every 23rd Nov for a hot string.
10 and 20yr strings in yearly temps and precipitation can also be seen.
I won’t bore you with the astronomical heliocentric syzygies driving these at this point.

Joe Lalonde
June 1, 2010 7:26 pm

Carl McIntosh says:
June 1, 2010 at 10:12 am
Not even beavers?
At least the water that beavers disrupt is still in the evaporation system.
90% of what man does takes it right out of the evaporation system being held hostage or dumped into the ground for oil. But a great amount per day is taken out of the evaporation cycle.

D Gallagher
June 1, 2010 7:28 pm

Willis,
OK, I have read your reply at 4:42, and I think I understand what you did to produce the graph. If I’m reading it correctly, you calculated an average monthly (or daily) departure from the annual average (I’m not sure whether that was each year’s average or the average over the entire data set) and subtracted that from each month’s or each day’s total. The chart represents a running annual average with the average monthly or daily departure from average removed. Or another way of seeing it is that it’s the anomaly with the annual average added back in; whether that’s each year’s average or the whole period’s isn’t clear.
When I look at the sea ice extent chart (assume area is similar) I am struck by the fact that there is some small amount of variability from year to year at the maximum extent and it’s very noisy; there is very little variability during the periods of the melt and the freeze, but there is still short term noise. Only during the minimum is there an opportunity for great variability, and only in the last three years has there been any great departure of the monthly averages for those months.
I think that the annual signal is there due to the nature of the enclosed arctic basin. There is no real opportunity for large departures from the monthly averages except for the 6 weeks either side of the minimum. For years there wasn’t much departure from the monthly averages but in 2007, the ice bridges didn’t form between Greenland and the Canadian islands and a great deal of ice was flushed out into the Atlantic.
The annual signal will disappear when the summer minimum returns to normal, which is likely this year since the ice has thickened up again.

Steven mosher
June 1, 2010 10:46 pm

Dunno, looks like an issue for the null hypothesis. Ice cycling like it’s never cycled before.

Frank
June 1, 2010 10:48 pm

It might be worth noting that changes in sea-ice extent are occurring in very different locations at different times of the year. In January, the Arctic Ocean (except the area north of Norway warmed by the Gulf Stream) is always frozen. The January differences between high and low coverage may be occurring mostly in Hudson Bay, the Bering Sea, off Labrador and off Kamchatka. At other times of the year, differences between high and low coverage are occurring in the Arctic Ocean itself, potentially thousands of miles apart.
You are correct in pointing out that anomaly plots can exaggerate the importance of changes by providing no reference to the absolute magnitude of the signal. However, anomalies should probably always be reported so that the average anomaly is zero. You can, however, report the vertical scale as a percentage of average annual sea ice coverage. An anomaly change with a vertical scale of +/-20% is instantly recognizable as being more significant than one with a vertical scale of +/-2%.
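Frank’s percentage suggestion is simple to apply; a minimal sketch, with hypothetical numbers (the ~10 million km² mean is illustrative, not taken from either dataset):

```python
def anomaly_percent(anomaly_km2, mean_area_km2):
    """Express an ice-area anomaly as a percentage of the long-term
    mean area, so that plots from different datasets are instantly
    comparable regardless of their absolute scales."""
    return 100.0 * anomaly_km2 / mean_area_km2

# Hypothetical: a 0.5 million km^2 anomaly against a 10 million km^2 mean
pct = anomaly_percent(0.5e6, 10.0e6)  # -> 5.0
```

Scaling by the mean this way preserves the shape of the anomaly curve; it only changes the axis units, which is exactly what makes a +/-20% swing immediately distinguishable from a +/-2% one.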

Eric Anderson
June 1, 2010 10:49 pm

David Gould,
Is there any way I can get in on your bet with Willis? In fairness, I think you said the Arctic would be “effectively ice free” by the end of the 2014 melt season but I’m willing to go with Willis’ 1M sq. km. number and plunk down my $100. If you are game to make it a bit more interesting, I’m happy to add another zero on there as well.
Eric

June 1, 2010 11:45 pm

Eric Anderson,
Probably already too rich for my blood. 🙂 $100 is what I can afford to lose, and Willis got in first.