Tisdale on Liu and Curry's 'Accelerated Warming' paper

On Liu and Curry (2010) “Accelerated Warming of the Southern Ocean and Its Impacts on the Hydrological Cycle and Sea Ice”

Image above courtesy Dr. Judith Curry

The Liu and Curry (2010) paper has been the subject of a number of posts at Watts Up With That over the past few days. This post should complement Willis Eschenbach’s post Dr. Curry Warms the Southern Ocean by providing a more detailed look at the availability of the source data used by the Hadley Centre and NCDC in their SST datasets and by illustrating SST anomalies for the periods used by Liu and Curry. I’ve also extended the data beyond the cutoff year used by Liu and Curry to show the significant drop in SST anomalies since 1999.

Preliminary Note: I understand that Liu and Curry illustrated the principal component from an analysis of the SST data south of 40S, but there are two primary objectives of this post as noted above: to show how sparse the source data is and to show that SST anomalies for the studied area have declined significantly since 1999.

On the “Georgia Tech on: the paradox of the Antarctic sea ice” thread at WUWT, co-author Judith Curry kindly linked to a copy of the paper in manuscript form:

http://www.eas.gatech.edu/files/jiping_pnas.pdf

Liu and Curry use two Sea Surface Temperature datasets, ERSST and HADISST. They clarify which of the NCDC ERSST datasets they used with their citation of Smith TM, Reynolds RW (2004) Improved Extended Reconstruction of SST (1854-1997). J. Clim. 17:2466-2477. That’s the ERSST.v2 version. The first question some readers might have: if ERSST.v2 was replaced by ERSST.v3b, why use the old version? Don’t know, so I’ll include both versions in the following graphs.

Liu and Curry examine the period of 1950 to 1999. Sea surface temperature data south of 40S is very sparse prior to the satellite era. The HADISST data began to include satellite-based SST readings in 1982. Since NCDC deleted the satellite data from their ERSST.v3 dataset (making it ERSST.v3b), that dataset and ERSST.v2 continue to rely on very sparse buoy- and ship-based observations. ICOADS is the ship- and buoy-based SST dataset that serves as the source for both the Hadley Centre and NCDC. Figure 1 shows typical monthly ICOADS SST observations for the Southern Hemisphere, south of 40S. The South Pole stereographic maps are for the Januarys of 1950, 1960, 1970, 1980, 1990 and 2000. Since I wanted to illustrate locations and not values, I set the contour levels outside the range of the data. I used January because it is a Southern Hemisphere summer month and might see more ship traffic along shipping lanes.

http://i37.tinypic.com/x1wtvm.jpg

Figure 1
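For readers who want to see how observation-location maps like these can be drawn, below is a minimal Python sketch of the general idea: plot only the positions of the ship- and buoy-based observations south of 40S on a South Pole stereographic projection. This is not the KNMI Climate Explorer workflow used for Figure 1, and the obs DataFrame with its lat/lon columns is an assumption for illustration only.

import matplotlib.pyplot as plt
import cartopy.crs as ccrs
import pandas as pd

def plot_obs_locations(obs: pd.DataFrame, title: str) -> None:
    """Mark observation locations (not values) south of 40S for one month."""
    ax = plt.axes(projection=ccrs.SouthPolarStereo())
    ax.set_extent([-180, 180, -90, -40], crs=ccrs.PlateCarree())
    ax.coastlines()
    south = obs[obs["lat"] <= -40.0]                 # keep only points south of 40S
    ax.scatter(south["lon"], south["lat"], s=4,
               transform=ccrs.PlateCarree())         # plot positions only, no values
    ax.set_title(title)
    plt.show()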

As Figure 1 shows, there is very little data for the Hadley Centre and NCDC to start from, but they do manage to infill the SST data using statistical tools. Refer to Figure 2. It shows that the three SST datasets provide complete coverage in 1950 and 1999, which are the start and end years of the period examined by Liu and Curry. For more information on the ERSST and HADISST datasets, refer to my post An Overview Of Sea Surface Temperature Datasets Used In Global Temperature Products.

http://i34.tinypic.com/j5jhp5.jpg

Figure 2

A question some might ask: why did Liu and Curry end the data in 1999? Dunno.

As noted above, Liu and Curry illustrate data for the latitudes south of 40S. There are differences of opinion about what makes up the northern boundary of the Southern Ocean. Geography.com writes about the Southern Ocean, “A decision by the International Hydrographic Organization in the spring of 2000 delimited a fifth world ocean – the Southern Ocean – from the southern portions of the Atlantic Ocean, Indian Ocean, and Pacific Ocean. The Southern Ocean extends from the coast of Antarctica north to 60 degrees south latitude, which coincides with the Antarctic Treaty Limit. The Southern Ocean is now the fourth largest of the world’s five oceans (after the Pacific Ocean, Atlantic Ocean, and Indian Ocean, but larger than the Arctic Ocean).”

But isolating the Southern Ocean for climate studies really isn’t that simple. The Antarctic Circumpolar Current (ACC) is said to isolate the Southern Ocean from the Atlantic, Indian and Pacific Oceans. Unfortunately, the northern boundary of the ACC varies as it circumnavigates the ocean surrounding Antarctica. Refer to the University of Miami Antarctic CP current webpage.

In this post, I’ll illustrate the SST anomalies of the area south of 40S that was used by Liu and Curry. Those latitudes capture additional portions of the ocean within the Antarctic Circumpolar Current (and also small areas north of the ACC). I’ll identify that data as the Mid-to-High Latitudes of the Southern Hemisphere (90S-40S).

I’ll also illustrate the SST anomalies of the Southern Ocean as defined above (south of 60S), because those latitudes are the ones most influential on, and most influenced by, sea ice. Let’s look at that data first.
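As an aside, for anyone computing these latitude-band averages from gridded anomaly data themselves, the sketch below shows one common approach: weight each grid cell by the cosine of its latitude before averaging, since cells shrink toward the pole. It is only an illustration; the xarray variable and coordinate names are assumptions, not the datasets’ own conventions.

import numpy as np
import xarray as xr

def band_mean(sst_anom: xr.DataArray, lat_min: float, lat_max: float) -> xr.DataArray:
    """Area-weighted mean SST anomaly for a latitude band (assumes ascending lat coordinate)."""
    band = sst_anom.sel(lat=slice(lat_min, lat_max))
    weights = np.cos(np.deg2rad(band["lat"]))            # grid cells shrink toward the pole
    return band.weighted(weights).mean(dim=("lat", "lon"))

# For example:
# southern_ocean = band_mean(sst_anom, -90, -60)   # 90S-60S
# mid_to_high    = band_mean(sst_anom, -90, -40)   # 90S-40S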

THE SOUTHERN OCEAN (90S-60S) SST ANOMALIES

Figure 3 compares the three versions of the Southern Ocean (90S-60S) SST anomalies from January 1950 to December 1999, the same years used by Liu and Curry. Included are ERSST.v2, which is used in Liu and Curry, ERSST.v3b, which is the current version of that dataset, and the HADISST data, also used in Liu and Curry. All three datasets are globally complete, and as shown in Figure 1, the Hadley Centre and NCDC have to do a significant amount of infilling to create spatially complete data for those latitudes. The data has been smoothed with a 13-month running-average filter to reduce the noise. Also shown are the linear trends. Again, this is not the full area of the Southern Hemisphere SST data used by Liu and Curry; I’ve provided it because it presents data that is more impacted by (and has more of an impact on) sea ice. The linear trend of the ERSST.v2 data is almost twice that of the HADISST data. Note also the change in the variability of the HADISST data after the late 1970s: HADISST has included satellite data since 1982, and this helps capture the variability of the Southern Ocean SST anomalies.

http://i36.tinypic.com/287ejkg.jpg

Figure 3
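For anyone who wants to apply the same smoothing and trend calculation to their own series, here is a minimal Python sketch: a centered 13-month running mean plus an ordinary least-squares linear trend. The series variable is an assumption (a pandas Series of monthly anomalies); this is illustrative only, not the exact calculation behind the figures.

import numpy as np
import pandas as pd

def smooth_and_trend(series: pd.Series):
    """Return the 13-month smoothed series and the linear trend in deg C per decade."""
    smoothed = series.rolling(window=13, center=True).mean()       # 13-month running average
    months = np.arange(len(series))
    slope, _intercept = np.polyfit(months, series.to_numpy(), 1)   # deg C per month
    return smoothed, slope * 120.0                                 # 120 months per decade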

Figure 4 shows the Southern Ocean SST anomalies for the ERSST.v2, ERSST.v3b, and HADISST from January 1950 to December 2009, with the data smoothed with a 13-month filter. The HADISST data peaked in the early 1990s and has been dropping since. This was not easily observed with the shortened dataset. The two ERSST datasets peaked in the early 1980s. For all three datasets, the recent declines in the SST anomalies have caused their linear trends to drop sharply from the values presented in Figure 3. In fact, the HADISST is now basically flat.

http://i36.tinypic.com/snea10.jpg

Figure 4

MID-TO-HIGH LATITUDES OF SOUTHERN HEMISPHERE

Figure 5 is a comparison graph of the SST anomalies for the latitudes (90S-40S) and years (1950-1999) used by Liu and Curry. Note the two distinct periods of sharp rises in SST anomalies: from 1966 to 1970 and from 1974 to 1980. Then, from 1980 to 1999, the SST anomalies for the mid-to-high latitudes of the Southern Hemisphere flattened considerably. The HADISST data flattened more than the ERSST datasets.

http://i34.tinypic.com/4kwc51.jpg

Figure 5

In Figure 6, I’ve included the data through December 2009. Note the significant drops in the SST anomalies in all three datasets. All three peaked in 1997 (curiously before the peak of the 1997/98 El Niño), and have been dropping sharply since then.

http://i37.tinypic.com/3447dk7.jpg

Figure 6

CLOSING

The title of Liu and Curry (2010), “Accelerated Warming of the Southern Ocean and Its Impacts on the Hydrological Cycle and Sea Ice”, is contradicted by the SST anomalies of the latitudes used in the paper. The SST anomalies are not warming; they have been cooling for more than a decade.

SOURCE

The maps were created using, and the data is available through, the KNMI Climate Explorer:

http://climexp.knmi.nl/selectfield_obs.cgi?someone@somewhere



91 Comments
david
August 19, 2010 10:49 pm

When I look at Willis’s Southern Ocean graph divided into 6 latitude bands (from the very source used in this paper), I see cooling at all latitudes from about 1985 to the present.
If you go to the Southern Hemisphere sea ice anomaly chart (1979-2010) at Cryosphere Today, you will see this is the period of greatest rise in sea ice.
So again: the main focus of the paper was postulating expanding sea ice in a warming environment, when in fact the environment was cooling during most of the sea ice expansion, 1985 to current. Please tell me how this is incorrect.

August 19, 2010 11:26 pm

“Why does a study about current climate written in 2010 end its data series in 1999?”
Statistical Santerism.

nevket240
August 19, 2010 11:31 pm

“A question some might ask, why did Liu and Curry end the data in 1999? Dunno.”
“I’ve also extended the data beyond the cutoff year used by Liu and Curry to show the significant drop in SST anomalies since 1999.” )))
Everybody now ..
CANCUN, CANCUN, CANCUN….. we saw the same media saturation of AGW BS before Copenhagen, did we not. This upcoming FundFest is no different. In fact I think they are losing control of the public’s perception of the ‘catastrophe’.
regards

TimG
August 20, 2010 12:25 am

Judith,
I assume the 1999 limit was because the model had no forcing data (i.e. aerosols) beyond that point, so running it beyond that point would not be meaningful even if it were possible to crunch the numbers with a computer.
That said, can you comment on what effect the recent flatline trend might have on your conclusions, especially if it continues for another 10-15 years? If your answer is ‘the conclusions won’t change’, can you explain why not, because that would be quite counterintuitive.

Brian H
August 20, 2010 12:27 am

Perhaps she meant “Accelerated Negative Warming”? 🙂

phlogiston
August 20, 2010 12:43 am

Judith Curry says:
August 18, 2010 at 1:31 pm (previous thread)
David, we chose 1999 as the end date because that is the end date of the 20th century climate model simulations. Again, the main point of this paper was not so much to document the temperature trend, but rather interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation
This is rather cute – the paper appears to have “points” (or be trying to “score points”) on different levels. Dr Curry says “the main point of this paper was not so much to document the temperature trend”; but a headline starting with “Accelerated warming of the southern ocean …” will create unambiguously warming-alarmist mood music for the MSM. The vast majority who see the title and do not read further will take away an alarmist message of accelerating Southern Ocean warming that, as Bob Tisdale clearly shows, is the exact opposite of the reality. Tucking away the “ending in 1999” part in the small print is consistent with a scam. Perhaps it is a bone thrown to the climate community to try to atone for her recent statements questioning the climate AGW consensus? Is this backtracking in the face of some pretty ugly intimidation? If accuracy was the only intention, then the title should have started “Historic warming in the Southern Ocean …”
Further, the use of GCM climate models as a source of “data” on surface temperatures, precipitation, and evaporation, etc., is scientifically very weak and may also be political. Trying to mend some fences? – or maybe just doing all of this at gunpoint?

Tom
August 20, 2010 12:51 am

I’m curious about the infilling methods used to generate complete coverage in those maps of 1950 and 1999. A quick eyeball seems to show a pretty good correlation between places where a sample is available and places showing a non-zero anomaly. I guess this means that they’ve just assumed that grid cells with no sample available have the base-period average temperature, then applied some smoothing so that measured samples ‘spread’ a bit into surrounding cells.
Is this sound? I’d guess probably not. It assumes that there is minimal correlation of temperature anomalies across the region at a given time – i.e. that a measured temperature in one place tells you nothing about the temperature more than a few grid cells away.
Eyeballing the 1950 and 1999 charts above again, this assumption doesn’t seem justified, at least for the ERSST data sets. The measured anomaly in each seems to be either almost entirely positive or almost entirely negative, suggesting that the assumption that unmeasured places have the base period mean temperature will significantly overstate the geographical variability and understate the time-domain variability. The HADISST data, OTOH, does show a significant mix of positive and negative anomalies – curious.
Can someone who knows something about southern ocean temperatures comment? Is the extrapolation based on a “we know nothing” model, or is there some physical knowledge of ocean patterns involved as well?

Gnomish
August 20, 2010 12:54 am

Excessive anthropogenic data warming has caused us to reach a tipping point.
The economic climate has been disrupted and catastrophic global science defunding can no longer be avoided.
It’s worse than they thought.
If they don’t get cap and tax, they are out of a job pdq.

August 20, 2010 1:17 am

rbateman says:
August 19, 2010 at 7:39 pm
Judith Curry says:
August 19, 2010 at 5:55 pm
It is a trivial matter to renumber the years if the program does not accept input past 1999.
The software would not know the difference between 1999 as 1999, or 2009 as 1999.
I’m getting the hint of some old hardware that has a bonafide y2k glitch never fixed.
############################################
The start dates and stop dates for the simulations are in the simulation plan. The teams get together and they decide what they will do:
a 20th century simulation, a 2100, and a 2300. So they decided to stop the 20th century sims at 1999.
Sheesh guys, go read the plans for AR5! They are already posted.
Psst. Some models only have 360 days in their years. Betcha didn’t know that.

Mac
August 20, 2010 1:39 am

Infilling, cherry-picking and hiding the decline.
This paper highlights that climate scientists have not learnt any lessons from Climategate.
Judith Curry will be welcomed with open arms back into the Warmist Club.

August 20, 2010 2:02 am

Bob, this is a superbly clear and well-written post. Why don’t you write a short paper or comment? The combination of fig 6 with the misleading title of Liu and Curry is just amazing.
The excuse given by Judith Curry for hiding the decline in the latest data
(“we chose 1999 as the end date because that is the end date of the 20th century climate model simulations“) is feeble, in fact non-existent, since the paper itself includes simulations into the 21st century.

Judith Curry
August 20, 2010 4:21 am

Anthony et al. This has been an interesting and useful thread. We will be continuing our research on the Southern Ocean and Antarctic sea ice. During the period 2000-2010 we actually have good satellite datasets, but work needs to be done to verify the satellite retrievals with what little data we do have in the region. In particular, there were some very useful observations made during the recent International Polar Year. Further, the IPCC AR5 is making some simulations on decadal time scales, so we will have those simulations to look at also.
In our paper we discussed extensively the main mode of natural variability in the Antarctic. ENSO also has an impact, see our previous paper at http://curry.eas.gatech.edu/currydoc/Liu_GRL31.pdf
I think this is a fascinating topic, and I hope our paper plays a role in stimulating additional research on this.
I am just about to leave for travel and will have limited time and internet access for the next 4 days. I will check back when I can. Thanks to all for your participation and input.

Jane Coles
August 20, 2010 4:40 am
TomVonk
August 20, 2010 4:51 am

Just one general comment and several comments about the use of the EOF method.
I will not examine the question of data availability, accuracy and relevance, which has been amply discussed above.
1) General comment
The title “Accelerated Warming of the Southern Ocean and Its Impacts on the Hydrological Cycle and Sea Ice” is misleading.
It suggests that there is an accelerated warming in the 1950-1999 data and that the paper studies its influence on the hydrological cycle both in the past and in the future.
This is not the case. There is no accelerated warming in the data, as even J. Curry agreed. The correct title should have been:
“Simulation of the SST of the Southern Ocean with the CCSM and GFDL models under variable CO2 emission scenarios, and a proposition of a physical explanation of the models’ behaviour.”
I agree that this title is less sexy, but it describes what the paper tries to do.
2) Comments about the EOF method
EOF is PCA for all practical purposes. Mathematically it is simple linear algebra, and it takes 4 Matlab lines to produce an EOF decomposition from a given data matrix.
The difficulty is with the interpretation of the results.
Quoting von Storch & Navarra: “Such methods are often needed to find a signal in a vast noisy phase space, i.e. the needle in the haystack. But after having the needle in our hand, we should be able to identify the needle by simply looking at it. Whenever you are unable to do so, there is a good chance that something is rotten in the analysis.”
The “needle” in the paper is Figs. 1a and 1b, which show the first EOF of the data.
It doesn’t appear that this spatial structure looks like anything significant.
Figs. 1c through 1m are just the first EOFs of the models and correspond to an emission scenario analysis.
3) The relevance of EOF1 for real data
It only explains one fourth of the variance (28% and 29% are closer to one fourth than to one third, contrary to what is said in the paper).
This is a very small part of the variance explained, and the following EOFs must be used too.
Considering that the spatial structure of the SST can be significantly represented by the first EOF doesn’t seem reasonable. The first EOF is (almost) sufficient for the models, but that is not data. On the contrary, it suggests that the models are overwhelmed by something that doesn’t exist in the real data.
4) Spatial resolution
The small-scale structures in Fig. 1a are smaller than the resolution given by the data.
They may be artefacts of the method, which creates data at places where no data exist.
This criticism doesn’t apply to the artificial model data, because the models compute at their own resolution, which is much finer than the real data resolution.
5) First EOF in real data vs first EOF in the models (Figs. 1e and 1f)
While the real EOF is approximately unimodal (radial symmetry), the model EOFs, especially CCSM3’s, are asymmetrical and bimodal.
This suggests that the models don’t reproduce the spatial structure of reality correctly.
6) Sensitivity to domain definition
The worst trap in the use of EOFs is sensitivity to the domain definition.
This effect happens when the study domain is cut in two and an EOF analysis of each of the new domains destroys the spatial structures found in the first analysis.
When that happens then, to quote Björnsson, “In this case any attempt at a physical explanation for the EOFs is difficult or plain foolish.”
The paper doesn’t say what kind of verification, if any, has been made of the domain-shape dependency. When the original data matrix used for the study becomes available, I am sure that the first test done in the blogosphere will consist of cutting the disk into two halves and performing an EOF analysis on each half.
7) Sampling problems
These problems happen when the eigenvalues are closely spaced. The paper references the right text (North, in Ref. 26), but the eigenvalues and tests are not given.
As the first EOF explains only a small part of the variance, it can be supposed that the eigenvalues could be closely spaced.
8) SVD
EOF is used to detect spatial structures (i.e. standing waves) in a single scalar field.
When one wants to look for correlations between two scalar fields, the SVD method is used.
The paper looks at the SST vs P-E (precipitation minus evaporation) fields.
The first SVD modes are shown in Figs. 3a and 3b.
The first problem is that the link to the P-E data (Ref. 18) doesn’t work.
After visiting the ECMWF site, it appears that the “data” provided are the subject of a “reanalysis” (by models). Specifically, the 1957-1971 “data” are the result of a reanalysis of NCAR data.
The maps provided at the ECMWF site do not have the spatial resolution that Fig. 3b shows. The Antarctic is in white (i.e. no data available) and the surrounding area is flat (i.e. no variation).
Not having seen the data, and the paper not mentioning how they were obtained and where they come from, I cannot say anything definitive.
However, I suspect that as there surely is no real P-E data for most of the domain, ECMWF has computed it from the model and other available data.
As that must make heavy use of SST (the principal driver of P-E), the resulting computed P-E is necessarily correlated with SST. In that case, the finding that P-E correlates with SST would just be a tautology.
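For readers following TomVonk’s point 2, below is a minimal Python sketch of the EOF/PCA decomposition he describes: an SVD of the (time x space) anomaly matrix gives the leading spatial pattern, its principal component, and the fraction of variance it explains. The data matrix X is purely illustrative; this is not the decomposition used in Liu and Curry.

import numpy as np

def leading_eof(X: np.ndarray):
    """X has shape (n_time, n_space); returns EOF1, PC1 and the variance fraction explained."""
    Xc = X - X.mean(axis=0)                    # remove the time mean at each grid point
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    eof1 = Vt[0]                               # leading spatial pattern
    pc1 = U[:, 0] * s[0]                       # corresponding time series (principal component)
    explained = s[0] ** 2 / np.sum(s ** 2)     # e.g. ~0.28 would mean 28% of the variance
    return eof1, pc1, explained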

August 20, 2010 5:07 am

Jane Coles,
Thank you for that fascinating Proxmire obit. I wonder what the Senator would say about today’s profligate waste of taxpayer funds?
Where people were shocked and the government was ridiculed and embarrassed for wasting thousands of taxpayer dollars, today $20 billion is handed out to organizations like ACORN to facilitate their election shenanigans, over the loud protests of citizens, and with the silent concurrence of the major media. Today’s Democrat party would have a predictable effect on Sen Proxmire.

Glenn
August 20, 2010 5:40 am

TomVonk says:
August 20, 2010 at 4:51 am
“There is no accelerated warming in the data, as even J. Curry agreed. The correct title should have been:
Simulation of the SST of the Southern Ocean with the CCSM and GFDL models under variable CO2 emission scenarios, and a proposition of a physical explanation of the models’ behaviour.
I agree that this title is less sexy, but it describes what the paper tries to do.”
Not sure she agreed, but the abstract starts with:
“The observed sea surface temperature (SST) in the Southern Ocean shows a substantial warming trend for the second half of the 20th century.”

August 20, 2010 6:00 am

TomVonk: Thanks for your insights. I’m writing an introductory series of posts on SST-based ocean indexes–AMO, ENSO, and PDO–for those new to discussions of climate. The first two are done and I’m about to start putting together the one on the PDO. I’ve been trying to find simple ways to explain EOF and the “leading principal component” to readers. I’m thinking of using something a little different than what’s been written in the past, maybe the “dominant pattern” or something to that effect. Any ideas of how to present this for readers without science backgrounds?

david
August 20, 2010 6:35 am

The effect of snow on ice and sea water is postulated as a reason for greater sea ice in a warming ocean. This effect is seasonal, as the vast majority of Antarctic sea ice is first-year ice.
HadISST shows a decline in the Antarctic sea ice anomaly from 1973 until about 1980 and an increasing trend after 1980. This decline in sea ice from 1973 until 1980 correlates perfectly with an increase in sea surface temperatures over the same period. From 1980 until 1999 (actually 2010) there is a steady increase in sea ice which correlates well with a decrease in SST over the same period.
WHERE is the PARADOX? (sorry to shout)
The IPCC AR4 (IPCC 2007) concludes that warming of the climate system is “unequivocal” and that “sea ice is projected to shrink in both the Arctic and Antarctic under all future emissions scenarios.” Wrong in the Southern Hemisphere for the past 30 years!

david
August 20, 2010 6:38 am

Still looking for graphs of sea ice anomaly prior to 1973?

david
August 20, 2010 6:41 am

I would really like to find an SST chart overlaid on a sea ice anomaly chart for the period from 1950 until the present.

Alan the Brit
August 20, 2010 6:53 am

Call me Mr Picky, but eyeballing those extended graphs gives a slow temperature decline in the Southern Ocean from about 1977-2010! Tried every way I can, but I get the same downward trend. Sorry! I agree the trend is upwards to 1999, but hey, that was nigh on 11 years ago. Dr Vicky Pope of the “Wet Office” UK did the same thing with temperature into this century by choosing an end point towards the end of 2007, showing global temps still rising, but she did that in 2009! It’s all about start points & end points & chartmanship.
Would Ms Curry like to borrow my 1925 Pocket OED? Simulation: feign, pretend, wear the guise of, act the part, counterfeit, shadowy likeness of, mere pretence. (I didn’t include all the definitions on the last post thread). Essentially, the word means unreal! Come on you guys, learn to use words that really do convince us of some credibility. Quit using representation, simulation, sophisticated, etc., etc. They are measly snivelling words that have more holes than a colander! Stop using may, possibly, could, perhaps, but of course you can’t! I do hope computer models weren’t used in the space race; man would never have got to the moon.

Richard M
August 20, 2010 7:27 am

IMO, this paper demonstrates the pitfalls of group-think and confirmation bias. I don’t think there was any attempt to produce a paper that could be classified as “not even wrong”. But that is what appears to have happened. From the dialog at WUWT I get a strong feeling that neither Curry nor Liu was trying to produce a propaganda paper. I think the blame is simply that there are no skeptics among the reviewers. They are also almost non-existent at universities. So, how does anyone get a critical review of their work? Pretty much impossible within the current scientific framework in climate.
Of course, this paper is not the only one to suffer from this disease. We’ve seen one example after another of this same kind of material. The field of climate science has a serious disease. It’s time the field itself recognized this problem and attempted to do something about it.

pyromancer76
August 20, 2010 7:41 am

Excellent, clear discussion, Bob. Thanks again and again.
“A question some might ask, why did Liu and Curry end the data in 1999? Dunno”
Her answer is so innocent. Doesn’t wash in today’s “climate” of claims of massive global warming. It would be fine if there was a caveat that the oceans did not continue to warm. The paper, after all, is published in the present. And maybe it will be used as part of the run-up to the next conference the goal of which is to strangle the energy development of the developed (democratic) world?

August 20, 2010 9:00 am

Bob Tisdale, nice post. But I think you know that already. : )
Having an open dialog on this paper in a blog that has wide public access provides for education of a broader demographic/audience outside of academia. I thank Curry and Liu for their willingness to do so. I sincerely hope her example will show other scientists that society can benefit by such openness.
Having seen a sampling of papers on climate science over the last 2 years, the Curry and Liu paper seems to me to fit within broad patterns of other studies that are clearly postulated upon the idea of net significant (compared to natural causes) AGW by CO2. This genre of papers does not try to justify the postulate. They openly assume it, openly show some biasing toward it and are not trying to hide it. Modeling tools which are used in such studies are openly built on it. The critical comments on this thread and the previous 2 WUWT threads are consistent with the comments on critical reviews of similarly patterned studies. Perhaps others have noticed this?
Although this J & L paper was interesting it is not at the root of the climate science debate but a consequence. I personally, would like to see more posts on the fundamentals of mainstream climate science that are the essential basis of their conclusion of “net significant (compared to natural causes) AGW by CO2”.
Anthony, Anthony, you do know how to give us all jolts of intellectual delight! Thanks again.
John

George E. Smith
August 20, 2010 10:20 am

This is slightly OT, but still the best place to put it. I was checking the ice page today for the JAXA extent, and then I looked at the North Polar view from DMI. A quick check with a real world map confirmed that the perimeter of the DMI polar map is indeed at +60 deg Lat; so that means this map is truly a map of “The Arctic”, as distinct from the Curry and Liu pictures from the other end, which went all the way out to 40 deg S Lat (mebbe for good reasons).
But the punchline is that the DMI north polar picture makes it quite clear that there really is MORE LAND in the Arctic than water; and that land can be, and often is, the repository for lots of snow even when the Arctic Ocean is a lot of open water; and given that the land is more southerly than the water, the albedo contribution from snow/ice on the land can be more than what the sea ice gives.
At the Antarctic end, the land and water are much closer to being equal; but there still is more water than land, although a lot of that water is under the Ross Ice Shelf and the like.
So keep that in mind next time you want to win yourself a beer at the bar; there really is more land in the Arctic than in the Antarctic, and by a good amount.
