Image above courtesy Dr. Judith Curry
The Liu and Curry (2010) paper has been the subject of a number of posts at Watts Up With That over the past few days. This post should complement Willis Eschenbach’s post Dr. Curry Warms the Southern Ocean, by providing a more detailed glimpse at the availability of source data used by Hadley Centre and NCDC in their SST datasets and by illustrating SST anomalies for the periods used by Liu and Curry. I’ve also extended the data beyond the cutoff year used by Liu and Curry to show the significant drop in SST anomalies since 1999.
Preliminary Note: I understand that Liu and Curry illustrated the principal component from an analysis of the SST data south of 40S, but there are two primary objectives of this post as noted above: to show how sparse the source data is and to show that SST anomalies for the studied area have declined significantly since 1999.
On the “Georgia Tech on: resolving the paradox of the Antarctic sea ice” thread at WUWT, co-author Judith Curry kindly linked a copy of the paper in manuscript form:
http://www.eas.gatech.edu/files/jiping_pnas.pdf
Liu and Curry use two Sea Surface Temperature datasets, ERSST and HADISST. They clarify which of the NCDC ERSST datasets they used with their citation of Smith TM, Reynolds RW (2004) Improved Extended Reconstruction of SST (1854-1997). J. Clim. 17:2466-2477. That’s the ERSST.v2 version. The first question some readers might have: if ERSST.v2 was replaced by ERSST.v3b, why use the old version? I don’t know, so I’ll include both versions in the following graphs.
Liu and Curry examine the period of 1950 to 1999. Sea surface temperature data south of 40S are very sparse prior to the satellite era. The HADISST data began to include satellite-based SST readings in 1982. Since NCDC deleted the satellite data from their ERSST.v3 data (making it ERSST.v3b), that dataset and their ERSST.v2 continue to rely on very sparse buoy- and ship-based observations. ICOADS is the ship- and buoy-based SST dataset that serves as the source for both the Hadley Centre and NCDC. Figure 1 shows typical monthly ICOADS SST observations for the Southern Hemisphere, south of 40S. The South Pole stereographic maps are for the Januarys of 1950, 1960, 1970, 1980, 1990 and 2000. Since I wanted to illustrate locations and not values, I set the contour levels so that they were out of the range of the data. I used Januarys because January is a Southern Hemisphere summer month and might see more ship traffic along shipping lanes.
http://i37.tinypic.com/x1wtvm.jpg
Figure 1
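To make the sparseness concrete, ship and buoy reports like those mapped above can be binned into 5-degree boxes and counted per cell; in most months, most cells south of 40S are simply empty. The following is a rough illustrative sketch, not the actual ICOADS or KNMI processing; the function name, the 5-degree box size, and the input format are my assumptions:

```python
import numpy as np

def obs_count_grid(lats, lons, box=5.0):
    """Count ship/buoy observations per box-degree cell south of 40S.

    Binning reports this way makes the sparseness obvious: most cells in
    any given month contain zero reports. Points north of 40S fall
    outside the latitude edges and are dropped by the histogram.
    """
    lat_edges = np.arange(-90.0, -40.0 + box, box)   # -90 ... -40
    lon_edges = np.arange(-180.0, 180.0 + box, box)  # -180 ... 180
    counts, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges])
    return counts
```

A map like Figure 1 is essentially a plot of which of these cells are non-zero in a given month.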
As you can see, there is very little data as a starting point for Hadley Centre and NCDC, but they do manage to infill the SST data using statistical tools. Refer to Figure 2. It shows that the three SST datasets provided complete coverage in 1950 and 1999, which are the start and end years of the period examined by Liu and Curry. For more information on the ERSST and HADISST datasets refer to my post An Overview Of Sea Surface Temperature Datasets Used In Global Temperature Products.
http://i34.tinypic.com/j5jhp5.jpg
Figure 2
A question some might ask: why did Liu and Curry end the data in 1999? I don’t know.
As noted above, Liu and Curry illustrate data for the latitudes south of 40S. There are differences of opinion about what makes up the northern boundary of the Southern Ocean. Geography.com writes about the Southern Ocean, “A decision by the International Hydrographic Organization in the spring of 2000 delimited a fifth world ocean – the Southern Ocean – from the southern portions of the Atlantic Ocean, Indian Ocean, and Pacific Ocean. The Southern Ocean extends from the coast of Antarctica north to 60 degrees south latitude, which coincides with the Antarctic Treaty Limit. The Southern Ocean is now the fourth largest of the world’s five oceans (after the Pacific Ocean, Atlantic Ocean, and Indian Ocean, but larger than the Arctic Ocean).”
But isolating the Southern Ocean for climate studies really isn’t that simple. The Antarctic Circumpolar Current (ACC) is said to isolate the Southern Ocean from the Atlantic, Indian and Pacific Oceans. Unfortunately, the northern boundary of the ACC varies as it circumnavigates the ocean surrounding Antarctica. Refer to the University of Miami Antarctic CP current webpage.
In this post, I’ll illustrate the SST anomalies of the area south of 40S that was used by Liu and Curry. Those latitudes capture additional portions of the ocean within the Antarctic Circumpolar Current (and also small areas north of the ACC). I’ll identify that data as the Mid-to-High Latitudes of the Southern Hemisphere (90S-40S).
I’ll also illustrate the SST anomalies of the Southern Ocean, as defined above (south of 60S), because they capture the Sea Surface Temperature anomalies of the Southern Ocean most influential on and influenced by Sea Ice. Let’s look at that data first.
THE SOUTHERN OCEAN (90S-60S) SST ANOMALIES
Figure 3 compares the three versions of Southern Ocean (90S-60S) SST anomalies from January 1950 to December 1999, the same years used by Liu and Curry. Included are ERSST.v2, which is used in Liu and Curry, ERSST.v3b, the current version of that dataset, and the HADISST data, also used in Liu and Curry. All three datasets are spatially complete, and as shown in Figure 1, the Hadley Centre and NCDC have to do a significant amount of infilling to create spatially complete data for those latitudes. The data have been smoothed with a 13-month running-average filter to reduce the noise. Also shown are the linear trends. Again, this is not the full area of the Southern Hemisphere SST data used by Liu and Curry; I’ve provided it because it presents the data that is most impacted by (and has the most impact on) sea ice. The linear trend of the ERSST.v2 data is almost twice that of the HADISST data. Note also the change in the variability of the HADISST data after the late 1970s: HADISST has used satellite data since 1982, and this helps capture the variability of the Southern Ocean SST anomalies.
http://i36.tinypic.com/287ejkg.jpg
Figure 3
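For readers who want to reproduce the smoothing and trend calculations behind figures like this, a minimal sketch follows. This is my own illustrative code, not the code used to produce the graphs; the function names and the per-decade trend units are assumptions:

```python
import numpy as np

def smooth_13mo(series):
    """Centered 13-month running mean; trims 6 months at each end."""
    kernel = np.ones(13) / 13.0
    return np.convolve(np.asarray(series, dtype=float), kernel, mode="valid")

def trend_per_decade(monthly_series):
    """Least-squares linear trend of a monthly series, in units per decade."""
    y = np.asarray(monthly_series, dtype=float)
    t_years = np.arange(y.size) / 12.0          # time axis in years
    slope_per_year = np.polyfit(t_years, y, 1)[0]
    return slope_per_year * 10.0
```

Applied to a monthly anomaly series, `smooth_13mo` reproduces the noise-reduced curves and `trend_per_decade` the straight trend lines shown in the graphs.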
Figure 4 shows the Southern Ocean SST anomalies for the ERSST.v2, ERSST.v3b, and HADISST from January 1950 to December 2009, with the data smoothed with a 13-month filter. The HADISST data peaked in the early 1990s and has been dropping since. This was not easily observed with the shortened dataset. The two ERSST datasets peaked in the early 1980s. For all three datasets, the recent declines in the SST anomalies have caused their linear trends to drop sharply from the values presented in Figure 3. In fact, the HADISST is now basically flat.
http://i36.tinypic.com/snea10.jpg
Figure 4
MID-TO-HIGH LATITUDES OF SOUTHERN HEMISPHERE
Figure 5 is a comparison graph of the SST anomalies for the latitudes (90S-40S) and years (1950-1999) used by Liu and Curry. Note the two distinct periods of sharp rises in SST anomalies: from 1966 to 1970 and from 1974 to 1980. Then, from 1980 to 1999, the SST anomalies for the mid-to-high latitudes of the Southern Hemisphere flattened considerably. The HADISST data flattened more than the ERSST datasets.
http://i34.tinypic.com/4kwc51.jpg
Figure 5
In Figure 6, I’ve included the data through December 2009. Note the significant drops in the SST anomalies in all three datasets. All three peaked in 1997 (curiously before the peak of the 1997/98 El Niño), and have been dropping sharply since then.
http://i37.tinypic.com/3447dk7.jpg
Figure 6
CLOSING
The title of Liu and Curry (2010) “Accelerated Warming of the Southern Ocean and Its Impacts on the Hydrological Cycle and Sea Ice” contradicts the SST anomalies of the latitudes used in the paper. The SST anomalies are not warming. They are cooling and have been for more than a decade.
SOURCE
The maps were created using, and the data is available through, the KNMI Climate Explorer:
http://climexp.knmi.nl/selectfield_obs.cgi?someone@somewhere

BTW, it would be good for WUWT to have a place where links to all the recommended papers could be found, for the benefit of all those global warmers who are beginning to doubt…
Tallbloke and Bob: here is the SOI graph from an Australian weather site:
http://www.weatherzone.com.au/climate/indicator_enso.jsp?c=soi
Is it just me, or is La Niña coming on stronger than the last go-around?…
I feel that in Dec/Jan Judith and Liu will have to explain the cooling…
Along with the sparse spatial coverage of the Southern Ocean, ably demonstrated here by Tisdale, there is also very sparse temporal coverage. When ships’ observations are made 4 times daily in accordance with WMO guidelines, it requires a bare minimum of 365 × 4 = 1460 obs each year to obtain an intact time series, assuming no overlaps from two or more ships reporting from the same square at the same time. In most of the Marsden squares involved here, we’re lucky to get 15% of that, with some overlaps. The anomaly series in the standard datasets prior to the advent of satellites are largely the product of data manufacture rather than measurement. And even the actual measurements are quite crude. Only those who habitually wax academic dare say anything about the long-term temperature regime of the Southern Ocean.
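The arithmetic in that comment can be made explicit: four reports per day for 365 days is 1460 per year, and 15% of that is roughly 219 reports. A trivial sketch of the calculation (the function names are mine; the numbers follow the comment, not ICOADS itself):

```python
def required_obs_per_year(obs_per_day=4, days=365):
    """Reports needed for an intact one-square series under WMO guidance."""
    return obs_per_day * days

def actual_obs(coverage_fraction, obs_per_day=4, days=365):
    """Approximate reports actually received at a given coverage fraction."""
    return round(coverage_fraction * required_obs_per_year(obs_per_day, days))
```

At 15% coverage, a Marsden square sees on the order of 219 reports a year against the 1460 an intact series would need.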
One of Dr Curry’s comments on the other thread is this:
Judith Curry says:
August 18, 2010 at 5:29 pm
“….We pose a hypothesis about how the upper ocean, sea ice, and hydrological cycle interaction in the Southern Ocean. There are many ways that hypotheses can arise: inferences from observations, identification of patterns, imagination, etc. You then test the hypothesis with available information. In this case it is observations and model simulations. The evidence that is available supports our hypothesis. If say the SST has greater uncertainty than we think it does, well then there is less support for our hypothesis.
When you say garbage in garbage out, you are not understanding the scientific process. We posed a hypothesis, we tested it using data and model simulations, which support the hypothesis. If the data are bad and the model is wrong, that doesn’t falsify the hypothesis, it reduces the support for the hypothesis. So it doesn’t make any sense to say that our hypothesis is incorrect because there are holes in the sea surface temperature data set.”
However, is the hypothesis proved wrong by the 2000 to 2010 data showing no increase in SST? She states elsewhere that what she is testing is a “warming” scenario.
How much precip does it take to hide the increase, so to speak?
And since precip is spotty, how can this possibly be considered a viable mechanism?
I think it should be clear by now that the reason for 1999 was because that was where the models they were using had ended their analysis – nothing to complain about there. It was all they had to work with within the scope of what they were trying to do.
It is probably reasonable to critique the use of models, but as I noted on the other thread, high-powered computer models are the “best” tools that modern science has. I really do think that this issue of modelling is probably the core issue over which the establishment just shakes its head at so-called skeptics.
Critics of AGW point out that computer models are virtual worlds, not the real world. But scientists have reached a point where models are the only way they have of viewing the real world – so they cannot but report their results as if they are real.
I had noticed this with respect to Gavin’s arguments about the Tiljander data: he said quite emphatically that the historic data was virtually noise in the argument, and that the case was almost all about the physics. What he didn’t say, but what comes out as we explore further, is that the physics are loaded into the models, and it is the models that “tell” us how the world works.
That is why they have absolute confidence that the world will warm and that this will accelerate.
It is only by acknowledging our ignorance and dragging the models back into the real world by rigorous common-sense testing that the models can be used without being misused. That means there must be a determination to utterly reject their future predictions until they really can have a track record of future success [not past success – as that only shows the skill of multiple parameter adjustment – not the skill of real understanding]
Now I can hear voices of protest already: that climate is a chaotic system and no model can ever reflect this, and that means we can never know what the future will bring unless we rely on these models. Well, I am sorry, but the determination to plot the future should not override our realistic ability to plot that future. One of the problems with forensic science is the idea that it will always lead to a criminal, and we have seen a number of cases where forensics have misled.
By all means study using the models – but by no means ascribe real predictive power to them.
As Pielke Jnr has been saying for a long time now – there are better reasons to develop new technologies than the drive to save the planet.
Back in the context of this thread: it now appears that the paper was concerned to show a mechanism whereby Antarctic ice could increase in a warming world. Since we can see the ice has been increasing in a cooling environment, not a warming one, it would be reasonable to ask the authors whether their modelling can show that this also happens. That would be a real-world test of their proposal. Does the mechanism they are describing cause ice to increase in both a cooling and a warming mode?
With regards to the period chosen for the study, 1950-1999, I addressed this issue on the other thread. To sort out the natural variability from any forced change, you need decades. The SST data in the Southern Ocean is definitely unreliable before 1950. Because of the data inadequacies, we used the 20th century climate model simulations (from the IPCC AR4). The 20th century ends in 1999. So that is the end point of our analysis.
The AR5 is doing 30 year simulations starting in 2000. It will be very interesting to redo this study for the first decade of the 21st century, a period for which we have very good data, much natural variability in the Southern Ocean, and little to no temperature trend.
Hi Judith, the question I have then is, why would the 20th century climate model simulations (from the IPCC AR4) not work on 21st century data?
Data is data, centuries are a construct of man, and mathematics knows nothing of it nor cares.
I just wonder what reputable peer-reviewed journal would print such out-of-date data without comment.
“…the main point of this paper was not so much to document the temperature trend, but rather interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation.”
Still doesn’t answer the question of why it uses data only to 1999, unless it was interested in a warming period only.
Actually, it makes sense to study ice during a period of warming rather than conflating it with a cooling period. This study is useful for observing ice behavior during a warming period. (Perhaps a separate study of the 2000-2010 period should be done.)
When I study surface stations (re. siting) I find it far more illuminating to observe behavior of poor vs. good sites from 1979 – 1998 and then from 1998 – 2008 as opposed to lumping it all together into 1979 – 2008. Poor sites appear to react differently during warm phases vs. cool phases, so why mix them? It just muddies the observation.
People carrying out studies which require the use of numbers should not be innumerate (I’m not saying Dr. Curry is; more likely the “scientists” who ended a model run at the end of 1999 because that was the end of the twentieth century).
Fact: The twentieth century ended at 2400 on 31 Dec 2000.
PS Could everybody please stop using the word “methodology” when they mean “method”. Methodology is not another, cleverer word, used by smarter, more knowledgeable people, that means method. It does NOT mean method. The “ology” is an unmistakable clue to the true meaning. Ta.
Curry states: Again, the main point of this paper was not so much to document the temperature trend,
If what you are saying here is true, Judith, then you have some serious explaining to do about why the very title of your paper includes the headline “accelerated warming.” That phrase connotes a temperature trend. A title of a paper is where you wish to call someone’s attention to the primary purpose of you writing that paper. I find your post-facto explanation in direct opposition to the very title of your paper. And that makes me angry.
Judith Curry says:
August 19, 2010 at 5:55 pm
………………The 20th century ends in 1999. So that is the end point of our analysis………..
—————————–
Uhmmmmm, I do not think so. The 20th century ended at 11:59:59 PM and change 🙂 on Dec 31st, 2000.
To add to my comment above, it looks as though for your research you’ve used a model which was: 1. not Y2K-ready, 2. somewhat outdated 🙂 (I think at least 11+ years old), but 3. perfectly designed to support CAGW hysteria at that time.
Taking a historical perspective using 1950 to 1999 and publishing a ‘current’ paper in 2010 relying on limited data sets seems to be pushing the envelope for even warmist “science”. One must ask why this has been done now, and to what end. Maybe this is related to the 500,000 people that just filed for unemployment: a diversionary tactic to show that things are worse than we thought, but right on target if you ignore the last 10 years.
Judith Curry says:
August 19, 2010 at 5:55 pm
It is a trivial matter to renumber the years if the program does not accept input past 1999.
The software would not know the difference between genuine 1999 data and 2009 data relabeled as 1999.
I’m getting the hint of some old hardware with a bona fide Y2K glitch that was never fixed.
Ms Curry says (above) “Because of the data inadequacies, we used the 20th century climate model simulations (from the IPCC AR4).”
It feels to me like a studied insult that Ms Curry would so blithely use backward simulations to fill in her data set without showing how well her simulations fit forward data. If the models deviate as significantly from the real data over 1999 to 2010 as I suspect, the conclusions she draws from the backward simulations have no validity whatsoever. The phrase “garbage in, garbage out” does, on the surface of it, seem appropriate in that case.
Brian Eglinton, very well-thought out points. Thank you.
tallbloke says: “I notice from Bob’s final plot that the much touted ’1976 global climate shift’ took place in 1967 in the southern ocean. Hmmm.”
And then look to see when it appears to show up in the South Atlantic:
http://i42.tinypic.com/1z4ah6q.png
From this post:
http://bobtisdale.blogspot.com/2010/05/200910-warming-of-south-atlantic.html
Ray Hudson says: August 19, 2010 at 7:05 pm
“And that makes me angry.”
Hello Ray
There is no need for anger, please let your facts speak for themselves. It is important that we maintain civility in this debate, such that all parties can focus on the facts at hand and those with divergent view points feel comfortable presenting them for consideration on WUWT.
Nothing is gained on these issues when minor questions sidetrack the discussion. When the century ends is not in doubt, and as Watts says, it isn’t important. If Liu & Curry meant to work in the same time frame as modelers who stopped at 1999, so be it.
Having said that I would be more forgiving had the title of the paper been The Hydrological Cycle and Sea Ice in the Southern Ocean – What we Know and Don’t Know from 1950 to 1999 Data.
Where things really went wrong was with this statement:
“We wanted to understand this apparent paradox so that we can better understand what might happen to the Antarctic sea ice in the coming century with increased greenhouse warming,” said Jiping Liu, a research scientist in Georgia Tech’s School of Earth and Atmospheric Sciences.
http://wattsupwiththat.com/2010/08/16/georgia-tech-on-resolving-the-paradox-of-the-antarctic-sea-ice/
There is no paradox (comment at 10:37 pm on that post) and “the coming warming” is a wild guess and very likely wrong.
“I used Januarys because it is a Southern Hemisphere summer month and might get more ship traffic along shipping lanes.”
For most shipping traffic, I would suggest the Decembers — with the build-up to Christmas. Certainly that is when the ports are busiest.
Thank you, Mr. ‘Latitude’, for your perspicacity.
Judith once made a very nice gesture to my kid, and my mother is a graduate of GA Tech, so I’ll try not to give her too hard of a time here… I usually think this AGW stuff is such a joke… and I think this paper is a joke.
On the previous thread I gave a physical mechanism for why this paper’s conclusions will be all wrong (to which I’ll add that increased offshore flow will also increase cold-water upwelling… a feedback that would slow warming). But one thing did stick in the back of my mind while reading the posts and comments… haven’t southern ocean temps been falling? Then when I read Bob’s post I went into total belly-laugh mode. I didn’t realize that things had cooled that much down there. Thanks Bob, you really made me laugh after a long, crappy day.
For those who keep harping about the accelerated warming, that part is not in the observed temperatures, but in the future… somewhere around 2090 if I recall… which is a good reason to read the paper… a good laugh is another reason to read the paper…
But Judith, as a consolation, my computer also showed accelerated warming… but that’s only because I programmed Pac-Man to eat the sea ice, then the little alien space ship from asteroids blew up Vostok, and Godzilla and Godzuki boiled the ACC with their fire breath and smoke rings… hint: you can tune the timescale with this method (wink wink).
I noticed on a different thread that some statisticians had done a rebuttal of the Michael Mann hockey-stick data. The paper is due to be published very soon, I believe, but their data and computer models are already available. I wonder if Dr Curry would be so kind as to release the data and code used in this paper; then someone could maybe add the last 10 or 11 years’ worth of data and see if the model had indeed produced a correct, observable and verifiable prediction.
I am sure there would be a few volunteers, and Dr Curry and her team would welcome the confirmation of their predictions.