Tisdale on Liu and Curry's 'Accelerated Warming' paper

On Liu and Curry (2010) “Accelerated Warming of the Southern Ocean and Its Impacts on the Hydrological Cycle and Sea Ice”

Image above courtesy Dr. Judith Curry

The Liu and Curry (2010) paper has been the subject of a number of posts at Watts Up With That over the past few days. This post should complement Willis Eschenbach’s post Dr. Curry Warms the Southern Ocean, by providing a more detailed glimpse at the availability of source data used by Hadley Centre and NCDC in their SST datasets and by illustrating SST anomalies for the periods used by Liu and Curry. I’ve also extended the data beyond the cutoff year used by Liu and Curry to show the significant drop in SST anomalies since 1999.

Preliminary Note: I understand that Liu and Curry illustrated the principal component from an analysis of the SST data south of 40S, but there are two primary objectives of this post as noted above: to show how sparse the source data is and to show that SST anomalies for the studied area have declined significantly since 1999.

On the "Georgia Tech on resolving the paradox of the Antarctic sea ice" thread at WUWT, co-author Judith Curry kindly linked to a copy of the paper in manuscript form:

http://www.eas.gatech.edu/files/jiping_pnas.pdf

Liu and Curry use two Sea Surface Temperature datasets, ERSST and HADISST. They clarify which of the NCDC ERSST datasets they used with their citation of Smith TM, Reynolds RW (2004) Improved Extended Reconstruction of SST (1854-1997). J. Clim. 17:2466-2477. That's the ERSST.v2 version. A first question some readers might have: if ERSST.v2 was replaced by ERSST.v3b, why use the old version? I don't know, so I'll include both versions in the following graphs.

Liu and Curry examine the period of 1950 to 1999. Sea surface temperature data south of 40S is very sparse prior to the satellite era. The HADISST data began to include satellite-based SST readings in 1982. And since NCDC deleted satellite data from their ERSST.v3 data (making it ERSST.v3b), that dataset and ERSST.v2 continue to rely on very sparse buoy- and ship-based observations. ICOADS is the ship- and buoy-based SST dataset that serves as the source for both the Hadley Centre and NCDC. Figure 1 shows typical monthly ICOADS SST observations for the Southern Hemisphere, south of 40S. The South Pole Stereographic maps are for the Januarys of 1950, 1960, 1970, 1980, 1990 and 2000. Since I wanted to illustrate locations and not values, I set the contour levels out of the range of the data. I used Januarys because January is a Southern Hemisphere summer month and might see more ship traffic along shipping lanes.

http://i37.tinypic.com/x1wtvm.jpg

Figure 1
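For readers who would like to reproduce this kind of observation-location map from a KNMI Climate Explorer download, here is a minimal Python sketch. It is not the method I used (I simply set the contour levels out of range in the Climate Explorer itself), and the file name ("icoads_sst.nc") and variable names ("sst", "time", "lat", "lon") are placeholders for whatever the downloaded NetCDF actually contains.

# Sketch: map the grid cells that contain ICOADS SST observations for one
# January, south of 40S. File and variable names are hypothetical placeholders.
import numpy as np
import xarray as xr
import matplotlib.pyplot as plt
import cartopy.crs as ccrs

ds = xr.open_dataset("icoads_sst.nc")            # hypothetical ICOADS download
field = ds["sst"].sel(time="1950-01").squeeze()  # one month of gridded SST

# Keep only latitudes south of 40S; cells with no observation are NaN.
field = field.where(field["lat"] <= -40.0)

# 1 where an observation exists, NaN elsewhere -- locations only, not values.
has_obs = xr.where(np.isfinite(field), 1.0, np.nan)

# South Pole Stereographic map, like the maps in Figure 1.
ax = plt.axes(projection=ccrs.SouthPolarStereo())
ax.set_extent([-180, 180, -90, -40], crs=ccrs.PlateCarree())
ax.coastlines()
ax.pcolormesh(field["lon"], field["lat"], has_obs,   # assumes (lat, lon) ordering
              transform=ccrs.PlateCarree(), cmap="Greys", vmin=0, vmax=1)
ax.set_title("ICOADS SST observation locations, January 1950 (sketch)")
plt.savefig("icoads_jan1950_locations.png", dpi=150)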

As you can see, there is very little data for the Hadley Centre and NCDC to start from, but they do manage to infill the SST fields using statistical tools. Refer to Figure 2, which shows that the three SST datasets provide complete coverage in 1950 and 1999, the start and end years of the period examined by Liu and Curry. For more information on the ERSST and HADISST datasets, refer to my post An Overview Of Sea Surface Temperature Datasets Used In Global Temperature Products.

http://i34.tinypic.com/j5jhp5.jpg

Figure 2

A question some might ask, why did Liu and Curry end the data in 1999? Dunno.

As noted above, Liu and Curry illustrate data for the latitudes south of 40S. There are differences of opinion about what makes up the northern boundary of the Southern Ocean. Geography.com writes about the Southern Ocean, “A decision by the International Hydrographic Organization in the spring of 2000 delimited a fifth world ocean – the Southern Ocean – from the southern portions of the Atlantic Ocean, Indian Ocean, and Pacific Ocean. The Southern Ocean extends from the coast of Antarctica north to 60 degrees south latitude, which coincides with the Antarctic Treaty Limit. The Southern Ocean is now the fourth largest of the world’s five oceans (after the Pacific Ocean, Atlantic Ocean, and Indian Ocean, but larger than the Arctic Ocean).”

But isolating the Southern Ocean for climate studies really isn’t that simple. The Antarctic Circumpolar Current (ACC) is said to isolate the Southern Ocean from the Atlantic, Indian and Pacific Oceans. Unfortunately, the northern boundary of the ACC varies as it circumnavigates the ocean surrounding Antarctica. Refer to the University of Miami Antarctic CP current webpage.

In this post, I'll illustrate the SST anomalies of the area south of 40S that was used by Liu and Curry. Those latitudes capture additional portions of the ocean within the Antarctic Circumpolar Current (and also small areas north of the ACC). I'll identify that data as the Mid-to-High Latitudes of the Southern Hemisphere (90S-40S).

I'll also illustrate the SST anomalies of the Southern Ocean as defined above (south of 60S), because that region captures the Sea Surface Temperature anomalies most influential on, and most influenced by, Sea Ice. Let's look at that data first.

THE SOUTHERN OCEAN (90S-60S) SST ANOMALIES

Figure 3 compares the three versions of Southern Ocean (90S-60S) SST anomalies, from January 1950 to December 1999, the same years used by Liu and Curry. Included are ERSST.v2, which is used in Liu and Curry; ERSST.v3b, the current version of that dataset; and the HADISST data, also used in Liu and Curry. All three datasets are globally complete, and as shown in Figure 1, the Hadley Centre and NCDC have to do a significant amount of infilling to create spatially complete data for those latitudes. The data has been smoothed with a 13-month running-average filter to reduce the noise. Also shown are the linear trends. Again, this is not the full area of the Southern Hemisphere SST data used by Liu and Curry. I've provided it because it presents data that is more impacted by (and has more of an impact on) Sea Ice. The linear trend of the ERSST.v2 data is almost twice that of the HADISST data. Note also the change in the variability of the HADISST data after the late 1970s. HADISST has used satellite data since 1982, and this helps capture the variability of the Southern Ocean SST anomalies.

http://i36.tinypic.com/287ejkg.jpg

Figure 3
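For anyone who wants to check the processing behind Figures 3 through 6, here is a minimal sketch of the steps, assuming a gridded monthly SST-anomaly file downloaded from the KNMI Climate Explorer. The file name ("ersst_anom.nc") and variable names are placeholders, and the Climate Explorer will do the area averaging for you if you prefer; this only outlines the arithmetic (cosine-of-latitude weighting, 13-month centered smoothing, least-squares trend).

# Sketch: area-weighted Southern Ocean (90S-60S) SST anomalies, 13-month
# running average, and linear trend. Names are hypothetical placeholders.
import numpy as np
import xarray as xr

ds = xr.open_dataset("ersst_anom.nc")                   # hypothetical anomaly grid
anom = ds["sst_anom"].sel(time=slice("1950-01", "1999-12"))

# Mask everything north of 60S, then weight each cell by cos(latitude).
band = anom.where(anom["lat"] <= -60.0)
weights = np.cos(np.deg2rad(band["lat"]))
series = band.weighted(weights).mean(dim=("lat", "lon")).to_series()

# 13-month centered running average, as used in the graphs.
smoothed = series.rolling(window=13, center=True).mean()

# Least-squares linear trend, converted from per month to per decade.
valid = series.dropna()
slope, intercept = np.polyfit(np.arange(len(valid)), valid.values, 1)
print(f"Linear trend: {slope * 120:+.3f} deg C per decade")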

Figure 4 shows the Southern Ocean SST anomalies for the ERSST.v2, ERSST.v3b, and HADISST from January 1950 to December 2009, with the data smoothed with a 13-month filter. The HADISST data peaked in the early 1990s and has been dropping since. This was not easily observed with the shortened dataset. The two ERSST datasets peaked in the early 1980s. For all three datasets, the recent declines in the SST anomalies have caused their linear trends to drop sharply from the values presented in Figure 3. In fact, the HADISST is now basically flat.

http://i36.tinypic.com/snea10.jpg

Figure 4

MID-TO-HIGH LATITUDES OF SOUTHERN HEMISPHERE

Figure 5 is a comparison graph of the SST anomalies for the latitudes (90S-40S) and years (1950-1999) used by Liu and Curry. Note the two distinct periods of sharp rises in SST anomalies: from 1966 to 1970 and from 1974 to 1980. From 1980 to 1999, the SST anomalies for the mid-to-high latitudes of the Southern Hemisphere flattened considerably. The HADISST data flattened more than the ERSST datasets.

http://i34.tinypic.com/4kwc51.jpg

Figure 5

In Figure 6, I’ve included the data through December 2009. Note the significant drops in the SST anomalies in all three datasets. All three peaked in 1997 (curiously before the peak of the 1997/98 El Niño), and have been dropping sharply since then.

http://i37.tinypic.com/3447dk7.jpg

Figure 6

CLOSING

The title of Liu and Curry (2010) “Accelerated Warming of the Southern Ocean and Its Impacts on the Hydrological Cycle and Sea Ice” contradicts the SST anomalies of the latitudes used in the paper. The SST anomalies are not warming. They are cooling and have been for more than a decade.

SOURCE

The Maps were created using, and the data is available through, the KNMI Climate Explorer:

http://climexp.knmi.nl/selectfield_obs.cgi?someone@somewhere

91 Comments
tallbloke
August 19, 2010 2:59 pm

Why does a study about current climate written in 2010 end its data series in 1999?
Hide the decline???? AGAIN????
Has Judith Curry made any comment on this?

rbateman
August 19, 2010 3:12 pm

I think competition amongst models is a good thing.
Appreciate the time you took to offer an alternative by using data beyond 1999.
If the same movies came out each year, the audience would evaporate.

August 19, 2010 3:12 pm

I agree with tallbloke’s question.
If one is going to create a paper that implies that something is happening “now”, then why wouldn’t it include data through “now”. (“Now” being the latest data.)
To only include data from “then”, and extrapolate to “now”, extrapolating over data that you already have, gets a “dunno” from me, too.

david
August 19, 2010 3:13 pm

Yes she did in response to my question. I will have to find it in the previous thread.
[reply] Thanks David. RT-mod

Tenuc
August 19, 2010 3:19 pm

Strange how the sparse, in-filled (modelled) pre-early-'80s temperature anomaly data, which is stated to an accuracy of 0.1K, shows a slight warming trend, but with the inclusion of satellite data shows a declining trend. WUWT???

August 19, 2010 3:20 pm

That’s what I was thinking, tallbloke, as obviously the data was available to them.
Hate to ascribe an ulterior motive to people I don’t know much about, but…

david
August 19, 2010 3:21 pm

Judith Curry says:
August 18, 2010 at 1:31 pm
David, we chose 1999 as the end date because that is the end date of the 20th century climate model simulations. Again, the main point of this paper was not so much to document the temperature trend, but rather interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation.

latitude
August 19, 2010 3:22 pm

“A question some might ask, why did Liu and Curry end the data in 1999? Dunno.”
“I’ve also extended the data beyond the cutoff year used by Liu and Curry to show the significant drop in SST anomalies since 1999.”
==================================================
You know, this might have been as innocent as the wind driven snow on the surface of the ocean…
…But they worked on this paper long enough to have noticed that.

tallbloke
August 19, 2010 3:29 pm

david says:
August 19, 2010 at 3:21 pm
Judith Curry says:
August 18, 2010 at 1:31 pm
David, we chose 1999 as the end date because that is the end date of the 20th century climate model simulations. Again, the main point of this paper was not so much to document the temperature trend, but rather interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation.

Thanks again David.
If the temperature data was that sparse, I wonder how good their precipitation and evaporation data is.
Good post Bob.

david
August 19, 2010 3:29 pm

So my question was how can you properly "interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation" if you are not concerned with "documenting the trend", which is actually down (from the peak) for the southern oceans since about 1985 at all latitudes, and this is the period when the sea ice has steadily increased in a cooling, not a warming environment?

cleanwater
August 19, 2010 3:41 pm

The most damaging thing to this paper is that they projected into the future with that dirty crystal ball called the Greenhouse gas effect. I have sent more than enough technical papers to Judy that she should know better than believing in the 200-year-old Fairy-tale Hypotheses of the GHG effect and Mann-made global warming.
For any real scientist to refer to H2O, CO2, or CH4 as the fictitious greenhouse gas is beyond me – these gases should be called IR-absorbing gases (IRag) or IR-absorbing materials (IRam), as water exists as vapor, liquid and solid in the atmosphere.

August 19, 2010 3:41 pm

david says:
August 19, 2010 at 3:21 pm
Judith Curry says:
August 18, 2010 at 1:31 pm
David, we chose 1999 as the end date because that is the end date of the 20th century climate model simulations. Again, the main point of this paper was not so much to document the temperature trend, but rather interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation.

Um, in that case, why isn’t the sea ice’s behavior meaningful when the trends are in the cooling direction?
More “dunno”.

pat
August 19, 2010 3:41 pm

The interior has been cooling for about 30 years, although the rate is marginal and sparsely measured. As the world warmed up a bit over the last 30 years, I would have expected more of an impact even in the interior of the continent.

August 19, 2010 3:42 pm

If I remember right, the reason Dr. Curry and Dr. Liu cut off the date at 1999 was because that is when the model runs ended. Yup, found it in the other thread:

Judith Curry says:
August 18, 2010 at 1:31 pm
David, we chose 1999 as the end date because that is the end date of the 20th century climate model simulations. Again, the main point of this paper was not so much to document the temperature trend, but rather interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation

http://wattsupwiththat.com/2010/08/17/dr-curry-warms-the-southern-ocean/#comment-460780

david
August 19, 2010 3:46 pm

If the main exploration of the paper was to “interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation” would you not include the last ten years of the best data for all three factors? Apparently only because that is the “end date of the 20th century climate model simulations.”
So the best data, extended from 1985 to 2010, all appear to me to run contrary to the postulated conclusions of the paper (expanding sea ice in a cooling environment).
Of course (by definition) 1999 is the end date of 20th century climate model simulations. But did they (who is they) really not extend the models into the 21st century?

david
August 19, 2010 3:52 pm

Sorry for the dyslexia. (I hear ten out of two people suffer from this.)
The main focus of the paper was postulating expanding sea ice in a warming environment, when in fact the environment was cooling during most of the sea ice expansion, 1985 to current.

latitude
August 19, 2010 3:55 pm

Since when is ‘Accelerated Warming’ not “document the temperature trend”?

1DandyTroll
August 19, 2010 3:56 pm

Could I use Julius Caesar's death toll data today, for general purposes, or are those, like, outdated or whatever?

david
August 19, 2010 3:58 pm

In my view, the fact that the environment studied was cooling during most of the sea ice expansion, 1985 to current, outweighs the scarcity of the measurements, which is also very cogent criticism of this study.

August 19, 2010 4:08 pm

Judith Curry says:
August 18, 2010 at 1:31 pm
David, we chose 1999 as the end date because that is the end date of the 20th century climate model simulations.
============================================================
The 20th century didn’t end in 1999, it ended at the start of 2001 where the trend line clearly shows a downward trend.
Even if we want to use the common belief that 2000 was the end of the 20th century, well, in the above graph 2000 also starts to show a downward decline.
Ending in 1999 seems selective to me.

tallbloke
August 19, 2010 4:09 pm

I notice from Bob’s final plot that the much touted ‘1976 global climate shift’ took place in 1967 in the southern ocean. Hmmm.

Slabadang
August 19, 2010 4:18 pm

Always HALF the truth!! Grrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
A comment at the end of the article concerning the period afterwards should have been
non-controversial. I can't understand why they insist on putting fuel on the fire of mistrust!

EthicallyCivil
August 19, 2010 4:32 pm

It takes real confidence and optimism to “infill” when the “infill” area is at least an order of magnitude larger than the known area.
On top of that, we will then use this data as boundary conditions for stiff, non-linear differential equations. Then we will create derived properties from the results of these models.
Yes. Optimism. Lots and lots of optimism.

rbateman
August 19, 2010 4:51 pm

Why can’t the model be run past 1999?
Dunno wants to know, but suspects a system (hardware/software) with a bonafide Y2k problem.

Enneagram
August 19, 2010 4:53 pm

Judith Curry says:
August 18, 2010 at 1:31 pm
David, we chose 1999 as the end date because that is the end date of the 20th century climate model simulations

Then forget it and back to real work! BTW, after 1999 a large number of far better computer games have appeared!
As for the climate, don't worry, we already know what you don't know about it. Just read WUWT every day and you'll know it too, believe me!

Enneagram
August 19, 2010 4:57 pm

BTW it would be good for WUWT to have a place where links to all advisable papers could be found, for the welfare of all those gwrs who are beginning to doubt….

Douglas Dc
August 19, 2010 5:04 pm

Tallbloke and Bob-here is the SOI graph from an Australian weather site:
http://www.weatherzone.com.au/climate/indicator_enso.jsp?c=soi
Is it me or is Nina coming on stronger than the last go-around?…
I feel that in Dec/Jan Judith and Liu will have to explain the cooling…

sky
August 19, 2010 5:11 pm

Along with the sparse spatial coverage of the Southern Ocean, ably demonstrated here by Tisdale, there is also very sparse temporal coverage. When ships' observations are made 4 times daily in accordance with WMO guidelines, it requires a bare minimum of 365×4 = 1460 obs each year to obtain an intact time series, assuming no overlaps from two or more ships reporting from the same square at the same time. In most of the Marsden squares involved here, we're lucky to get 15% of that, with some overlaps. The anomaly series in the standard datasets prior to the advent of satellites are largely the product of data manufacture rather than measurement. And even the actual measurements are quite crude. Only those who habitually wax academic dare say anything about the long-term temperature regime of the Southern Oceans.

Gail Combs
August 19, 2010 5:19 pm

One of Dr Curry’s comments on the other thread is this:
Judith Curry says:
August 18, 2010 at 5:29 pm
“….We pose a hypothesis about how the upper ocean, sea ice, and hydrological cycle interaction in the Southern Ocean. There are many ways that hypotheses can arise: inferences from observations, identification of patterns, imagination, etc. You then test the hypothesis with available information. In this case it is observations and model simulations. The evidence that is available supports our hypothesis. If say the SST has greater uncertainty than we think it does, well then there is less support for our hypothesis.
When you say garbage in garbage out, you are not understanding the scientific process. We posed a hypothesis, we tested it using data and model simulations, which support the hypothesis. If the data are bad and the model is wrong, that doesn’t falsify the hypothesis, it reduces the support for the hypothesis. So it doesn’t make any sense to say that our hypothesis is incorrect because there are holes in the sea surface temperature data set.”

However, is the hypothesis proved wrong by the 2000 to 2010 data showing no increase in SST? She states elsewhere that what she is testing is a "warming" scenario.

hunter
August 19, 2010 5:40 pm

How much precip does it take to hide the increase, so to speak?
And since precip is spotty, how can this possibly be considered a viable mechanism?

Brian Eglinton
August 19, 2010 5:45 pm

I think it should be clear by now that the reason for 1999 was because that was where the models they were using had ended their analysis – nothing to complain about there. It was all they had to work with within the scope of what they were trying to do.
It is probably reasonable to critique the use of models, but as I noted on the other thread, high powered computer models are the “best” tools that modern science has. I do really think that this issue of modelling is probably the core issue around which the establishment just shakes their head at so-called skeptics.
Critics of AGW point out that computer models are virtual worlds, not the real world. But scientists have reached a point where models are the only way they have of viewing the real world – so they cannot but report their results as if they are real.
I had noticed this with respect to Gavin's arguments about the Tiljander data – he said quite emphatically that historic data was virtually noise in the argument – that the case was almost all about the physics. What he didn't say, but comes out as we explore further, is that the physics are loaded into the models and it is the models that "tell" us how the world works.
That is why they have absolute confidence that the world will warm and that this will accelerate.
It is only by acknowledging our ignorance and dragging the models back into the real world by rigorous common-sense testing that the models can be used without being misused. That means there must be a determination to utterly reject their future predictions until they really can have a track record of future success [not past success – as that only shows the skill of multiple parameter adjustment – not the skill of real understanding]
Now I can hear voices of protest already – that climate is a chaotic system and no model can ever reflect this. And that means we can never know what the future will bring unless we rely on these models. Well I am sorry – but the determination to plot the future should not override our realistic abilities to plot that future. One of the problems with Forensic Science is the idea that it will always lead to a criminal – and we have seen a number of cases where forensics have misled.
By all means study using the models – but by no means ascribe real predictive power to them.
As Pielke Jnr has been saying for a long time now – there are better reasons to develop new technologies than the drive to save the planet.
Back in the context of this thread – it now shows that the paper was concerned to show a mechanism whereby Antarctic ice could increase in a warming world. Since we can see the ice has been increasing in a cooling environment – not a warming one – it would be reasonable to ask the authors whether their modelling can show that this also happens. That would be a real-world test of their proposal. Does the mechanism they are describing cause ice to increase in both a cooling as well as a warming mode?

Judith Curry
August 19, 2010 5:55 pm

With regards to the period chosen for the study, 1950-1999, I addressed this issue on the other thread. To sort out the natural variability from any forced change, you need decades. The SST data in the Southern Ocean is definitely unreliable before 1950. Because of the data inadequacies, we used the 20th century climate model simulations (from the IPCC AR4). The 20th century ends in 1999. So that is the end point of our analysis.
The AR5 is doing 30 year simulations starting in 2000. It will be very interesting to redo this study for the first decade of the 21st century, a period for which we have very good data, much natural variability in the Southern Ocean, and little to no temperature trend.

A C of Adelaide
August 19, 2010 6:01 pm

I just wonder what reputable peer-reviewed journal would print such out of date data without comment.

Dave N
August 19, 2010 6:29 pm

“…the main point of this paper was not so much to document the temperature trend, but rather interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation.”
Still doesn’t answer the question of why it uses data only to 1999, unless it was interested in a warming period only.

Evan Jones
Editor
August 19, 2010 7:03 pm

Actually, it makes sense to study ice during a period of warming and not conflate it with a cooling period. This study is useful for observing ice behavior during a warming period. (Perhaps a separate study of the 2000-2010 period should be done.)
When I study surface stations (re. siting) I find it far more illuminating to observe behavior of poor vs. good sites from 1979 – 1998 and then from 1998 – 2008 as opposed to lumping it all together into 1979 – 2008. Poor sites appear to react differently during warm phases vs. cool phases, so why mix them? It just muddies the observation.

acementhead
August 19, 2010 7:04 pm

People carrying out studies which require the use of numbers should not be innumerate (not saying Dr Curry is; more likely the "scientists" who ended a model run at the end of 1999 because that was the end of the twentieth century).
Fact: The twentieth century ended at 2400 on 31 Dec 2000.
PS Could everybody please stop using the word "methodology" when they mean "method". Methodology is not another, cleverer word, used by smarter, more knowledgeable people, that means method. It does NOT mean method. The "ology" is an unmistakable clue to the true meaning. Ta.

Ray Hudson
August 19, 2010 7:05 pm

Curry states: Again, the main point of this paper was not so much to document the temperature trend,
If what you are saying here is true, Judith, then you have some serious explaining to do about why the very title of your paper includes the headline “accelerated warming.” That phrase connotes a temperature trend. A title of a paper is where you wish to call someone’s attention to the primary purpose of you writing that paper. I find your post-facto explanation in direct opposition to the very title of your paper. And that makes me angry.

ML
August 19, 2010 7:10 pm

Judith Curry says:
August 19, 2010 at 5:55 pm
………………The 20th century ends in 1999. So that is the end point of our analysis………..
—————————–
Uhmmmmm, I do not think so. The 20th century ended at 11:59:59 and change 🙂 PM
on Dec 31st 2000.

ML
August 19, 2010 7:23 pm

To add to my comment above, it looks like for your research you've used a model which was: 1. not Y2K ready, 2. somewhat outdated 🙂 (I think at least 11+ years old), but
3. perfectly designed to support CAGW hysteria at that time

trbixler
August 19, 2010 7:26 pm

Taking a historical perspective using 1950 to 1999 and publishing a ‘current’ paper in 2010 relying on limited data sets seems to be pushing the envelope for even warmism “science”. One must ask why has this been done now and to what end. Maybe this is related to the 500,000 people that just filed for unemployment. A diversionary tactic to show that things are worse than we thought but right on target if you ignore the last 10 years.

rbateman
August 19, 2010 7:39 pm

Judith Curry says:
August 19, 2010 at 5:55 pm
It is a trivial matter to renumber the years if the program does not accept input past 1999.
The software would not know the difference between 1999 as 1999, or 2009 as 1999.
I’m getting the hint of some old hardware that has a bonafide y2k glitch never fixed.

A C of Adelaide
August 19, 2010 7:42 pm

Ms Curry says (above) “Because of the data inadequacies, we used the 20th century climate model simulations (from the IPCC AR4).”
It feels to me like a studied insult that Ms Curry would so blithely use backward simulations to fill in her data set without showing how well her simulations fit forward data. If the models deviate as significantly from real data over the 1999 to 2010 period as I suspect, the conclusions she draws from the backward simulations have no validity whatsoever. The phrase "garbage in, garbage out" does, on the surface of it, seem appropriate in that case.

Eric Anderson
August 19, 2010 8:00 pm

Brian Eglinton, very well-thought out points. Thank you.

August 19, 2010 8:08 pm

tallbloke says: “I notice from Bob’s final plot that the much touted ’1976 global climate shift’ took place in 1967 in the southern ocean. Hmmm.”
And then look to see when it appears to show up in the South Atlantic:
http://i42.tinypic.com/1z4ah6q.png
From this post:
http://bobtisdale.blogspot.com/2010/05/200910-warming-of-south-atlantic.html

Editor
August 19, 2010 8:33 pm

Ray Hudson says: August 19, 2010 at 7:05 pm
“And that makes me angry.”
Hello Ray
There is no need for anger, please let your facts speak for themselves. It is important that we maintain civility in this debate, such that all parties can focus on the facts at hand and those with divergent view points feel comfortable presenting them for consideration on WUWT.

John F. Hultquist
August 19, 2010 9:02 pm

Nothing is gained on these issues when minor questions sidetrack the discussion. When the century ends is not in doubt and as Watts says it isn’t important. If Liu & Curry meant to work in the same time frame as modelers that stopped at 1999 so be it.
Having said that I would be more forgiving had the title of the paper been The Hydrological Cycle and Sea Ice in the Southern Ocean – What we Know and Don’t Know from 1950 to 1999 Data.
Where things really went wrong was with this statement:
"We wanted to understand this apparent paradox so that we can better understand what might happen to the Antarctic sea ice in the coming century with increased greenhouse warming," said Jiping Liu, a research scientist in Georgia Tech's School of Earth and Atmospheric Sciences.
http://wattsupwiththat.com/2010/08/16/georgia-tech-on-resolving-the-paradox-of-the-antarctic-sea-ice/
There is no paradox (comment at 10:37 pm on that post) and “the coming warming” is a wild guess and very likely wrong.

Mooloo
August 19, 2010 9:09 pm

I used Januarys because it is a Southern Hemisphere summer month and might get more ship traffic along shipping lanes.
For most shipping traffic, I would suggest the Decembers — with the build-up to Christmas. Certainly that is when the ports are busiest.

Gnomish
August 19, 2010 9:49 pm

thank you mr. ‘latitude’, for your perspicacity.

MikeC
August 19, 2010 10:21 pm

Judith once made a very nice gesture to my kid and my mother is a graduate of GA Tec so I’ll try not to give her too hard of a time here… I usually think this AGW stuff is such a joke… and I thnk this paper is a joke.
On the previous thread I gave a physical mechanism of why this paper’s conclusions will be all wrong. (to which I’ll add that increased offshore flow will also increase cold water up-welling… feedback to slow warming) But one thing did stick in the back of my mind while reading the posts and comments… haven’t southern ocean temps been falling? Then when I read Bob’s post I went into total belly laugh mode. I didn’t realize that things had cooled that much down there. Thanks Bob, you really made me laugh after a long, crappy day.
For those who keep harping about the accelerated warming, that part is not in the observed temperatures, but in the future… somewhere around 2090 if I recall… which is a good reason to read the paper… a good laugh is another reason to read the paper…
But Judith, as a consolation, my computer also showed accelerated warming… but that’s only because I programmed Pac-Man to eat the sea ice, then the little alien space ship from asteroids blew up Vostok, and Godzilla and Godzuki boiled the ACC with their fire breath and smoke rings… hint: you can tune the timescale with this method (wink wink).

Roy UK
August 19, 2010 10:45 pm

I noticed on a different thread that some statisticians had done a rebuttal of the Michael Mann Hockey stick data. The paper is due to be published very soon I believe, but their data and computer models are already available. I wonder if Dr Curry would be so kind as to release the data and code used in this paper; then someone could maybe add the last 10 or 11 years' worth of data and see if the model had indeed produced a correct, observable and verifiable prediction.
I am sure there would be a few volunteers, and Dr Curry and her team would welcome the confirmation of their predictions.

david
August 19, 2010 10:49 pm

When I look at Willis’s southern ocean graph divided into 6 latitudes, (from the very source used in this paper) I see cooling at all latitudes from about 1985 to the present.
If you go to the southern hemisphere sea ice anomaly chart (1979 – 2010) at cryosphere today you will see this is the period of greatest rise in sea ice.
So again: "The main focus of the paper was postulating expanding sea ice in a warming environment, when in fact the environment was cooling during most of the sea ice expansion, 1985 to current." Please tell me how this is incorrect.

August 19, 2010 11:26 pm

“Why does a study about current climate written in 2010 end its data series in 1999?”
Statistical Santerism.

nevket240
August 19, 2010 11:31 pm

“A question some might ask, why did Liu and Curry end the data in 1999? Dunno.”
“I’ve also extended the data beyond the cutoff year used by Liu and Curry to show the significant drop in SST anomalies since 1999.” )))
Everybody now ..
CANCUN, CANCUN, CANCUN….. we saw the same media saturation of AGW BS before Copenhagen, did we not? This upcoming FundFest is no different. In fact I think they are losing control of the public's perception of the 'catastrophe'.
regards

TimG
August 20, 2010 12:25 am

Judith,
I assume the 1999 limit was because the model had no forcing data (i.e. aerosols) beyond that point so running it beyond that point would not be meaningful even if it was possible to crunch the numbers with a computer.
That said, can you comment on what effect the recent flatline trend might have on your conclusions – especially if it continues for another 10-15 years? If your answer is 'the conclusions won't change', can you explain why not, because that would be quite counterintuitive.

Brian H
August 20, 2010 12:27 am

Perhaps she meant “Accelerated Negative Warming”? 🙂

phlogiston
August 20, 2010 12:43 am

Judith Curry says:
August 18, 2010 at 1:31 pm (previous thread)
David, we chose 1999 as the end date because that is the end date of the 20th century climate model simulations. Again, the main point of this paper was not so much to document the temperature trend, but rather interpret the behavior of the sea ice in the context of surface temperatures, precipitation, and evaporation
This is rather cute – the paper appears to have "points" (or be trying to "score points") on different levels. Dr Curry says "the main point of this paper was not so much to document the temperature trend"; but a headline starting with "Accelerated warming of the southern ocean …" will create unambiguously alarmist mood music for the MSM. The vast majority seeing the title and not reading further will take away an alarmist message of accelerating Southern Ocean warming that, as Bob Tisdale clearly shows, is the exact opposite of the reality. Tucking away the "ending in 1999" part in the small print is consistent with a scam. Perhaps it is a bone thrown to the climate community to try to atone for her recent statements questioning the climate AGW consensus? Is this backtracking in the face of some pretty ugly intimidation? If accuracy was the only intention then the title should have started "Historic warming in the Southern Ocean …"
Further, the use of GCM climate models as a source of “data” on surface temperatures, precipitation, and evaporation, etc., is scientifically very weak and may also be political. Trying to mend some fences? – or maybe just doing all of this at gunpoint?

Tom
August 20, 2010 12:51 am

I’m curious about the infilling methods used to generate complete coverage in those maps of 1950 and 1999. A quick eyeball seems to show a pretty good correlation between places where a sample is available and places showing a non-zero anomaly. I guess this means that they’ve just assumed that grid cells with no sample available have the base-period average temperature, then applied some smoothing so that measured samples ‘spread’ a bit into surrounding cells.
Is this sound? I’d guess probably not. It assumes that there is minimal correlation of temperature anomalies across the region at a given time – ie. that a measured temperature in one place tells you nothing about the temperature more than a few grid cells away.
Eyeballing the 1950 and 1999 charts above again, this assumption doesn’t seem justified, at least for the ERSST data sets. The measured anomaly in each seems to be either almost entirely positive or almost entirely negative, suggesting that the assumption that unmeasured places have the base period mean temperature will significantly overstate the geographical variability and understate the time-domain variability. The HADISST data, OTOH, does show a significant mix of positive and negative anomalies – curious.
Can someone who knows something about southern ocean temperatures comment? Is the extrapolation based on a “we know nothing” model, or is there some physical knowledge of ocean patterns involved as well?
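A toy illustration of the kind of "spread from the samples" infilling Tom describes: set unobserved cells to the base-period mean (zero anomaly) and smooth, so the few sampled values bleed into neighbouring cells. To be clear, this is not the actual Hadley Centre or NCDC reconstruction method (both use EOF-based statistical reconstructions); the grid size and observation count below are made up purely to show how sparse input dominates an infilled field.

# Toy infilling sketch -- NOT the Hadley Centre or NCDC method.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
field = np.full((50, 180), np.nan)                 # toy 50-lat x 180-lon grid
rows = rng.integers(0, 50, size=60)                # ~60 scattered "observations"
cols = rng.integers(0, 180, size=60)
field[rows, cols] = rng.normal(0.0, 0.5, size=60)

filled = np.where(np.isnan(field), 0.0, field)     # unobserved cells -> zero anomaly
smoothed = gaussian_filter(filled, sigma=3.0)      # spread the observations outward

print("Observed cells:", int(np.isfinite(field).sum()), "of", field.size)
print("Mean of infilled field:", round(float(smoothed.mean()), 4))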

Gnomish
August 20, 2010 12:54 am

Excessive anthropogenic data warming has caused us to reach a tipping point.
The economic climate has been disrupted and catastrophic global science defunding can no longer be avoided.
It’s worse than they thought.
If they don’t get cap and tax, they are out of a job pdq.

August 20, 2010 1:17 am

rbateman says:
August 19, 2010 at 7:39 pm (Edit)
Judith Curry says:
August 19, 2010 at 5:55 pm
It is a trivial matter to renumber the years if the program does not accept input past 1999.
The software would not know the difference between 1999 as 1999, or 2009 as 1999.
I’m getting the hint of some old hardware that has a bonafide y2k glitch never fixed.
############################################
The start dates and stop dates for the simulations are in the simulation plan. The teams get together and decide what they will do:
a 20th century simulation, a 2100, and a 2300. So they decided to stop the 20th century sims at 1999.
Sheesh guys. Go read the plans for AR5! They are already posted.
Psst, some models only have 360 days in their years. Betcha didn't know that.

Mac
August 20, 2010 1:39 am

Infilling, cherry-picking and hiding the decline.
This paper highlights that climate scientists have not learnt any lessons from Climategate.
Judith Curry will be welcomed with open arms back into the Warmist Club.

August 20, 2010 2:02 am

Bob, this is a superbly clear and well-written post. Why don’t you write a short paper or comment? The combination of fig 6 with the misleading title of Liu and Curry is just amazing.
The excuse given by Judith Curry for hiding the decline in the latest data
(“we chose 1999 as the end date because that is the end date of the 20th century climate model simulations“) is feeble, in fact non-existent, since the paper itself includes simulations into the 21st century.

Judith Curry
August 20, 2010 4:21 am

Anthony et al. This has been an interesting and useful thread. We will be continuing our research on the Southern Ocean and Antarctic sea ice. During the period 2000-2010 we actually have good satellite datasets, but work needs to be done to verify the satellite retrievals with what little data we do have in the region. In particular, there were some very useful observations made during the recent International Polar Year. Further, the IPCC AR5 is making some simulations on decadal time scales, so we will have those simulations to look at also.
In our paper we discussed extensively the main mode of natural variability in the Antarctic. ENSO also has an impact, see our previous paper at http://curry.eas.gatech.edu/currydoc/Liu_GRL31.pdf
I think this is a fascinating topic, and I hope our paper plays a role in stimulating additional research on this.
I am just about to leave for travel and will have limited time and internet access for the next 4 days. I will check back when I can. Thanks to all for your participation and input.

Jane Coles
August 20, 2010 4:40 am
TomVonk
August 20, 2010 4:51 am

Just one general comment and several comments about the use of the EOF method.
I will not examine the question of data availability, accuracy and relevance, which has been amply discussed above.
1) General comment
The title "Accelerated Warming of the Southern Ocean and Its Impacts on the Hydrological Cycle and Sea Ice" is misleading.
It suggests that there is an accelerated warming in the 1950-1999 data and that the paper studies its influence on the hydrological cycle both in the past and in the future.
This is not the case. There is no accelerated warming in the data, as even J. Curry agreed. The correct title should have been:
Simulation of the SST of the Southern Ocean with CCSM and GFDL models in variable CO2 emission scenarios, and a proposition of a physical explanation of the models' behaviour.
I agree that this title is less sexy, but it describes what the paper tries to do.
2) Comments about the EOF method
– EOF is PCA for all practical purposes. Mathematically it is simple linear algebra, and it takes 4 Matlab lines to produce an EOF decomposition from a given data matrix.
The difficulty is with the interpretation of the results.
Quoting von Storch & Navarra: "Such methods are often needed to find a signal in a vast noisy phase space, i.e. the needle in the haystack. But after having the needle in our hand, we should be able to identify the needle by simply looking at it. Whenever you are unable to do so there is a good chance that something is rotten in the analysis."
The "needle" in the paper is Fig 1a and Fig 1b, which are the first EOF of the data.
It doesn't appear that this spatial structure looks like anything significant.
Fig 1c through m are just the first EOF of the models and correspond to an emission scenario analysis.
3) The relevance of the EOF1 for real data
It only explains one fourth of the variance (28% and 29% are closer to one fourth than to one third, contrary to what is said in the paper).
This is a very small part of the variance explained, and the following EOFs must be used too.
Considering that the spatial structure of SST can be significantly represented by the first EOF doesn't seem to be reasonable. The first EOF is (almost) sufficient for the models, but that is not data. On the contrary, it suggests that the models are overwhelmed by something that doesn't exist in the real data.
4) Spatial resolution
The small-scale structures in Fig 1a are smaller than the resolution given by the data.
They may be artefacts of the method, which creates data at places where no data exist.
This criticism doesn't apply to artificial model data, because the model computes at its own resolution, which is much smaller than the real data resolution.
5) First EOF in real data vs first EOF in model (Fig 1e and 1f)
While the real EOF is approximately unimodal (radial symmetry), the model EOF, especially CCSM3, is asymmetrical and bimodal.
This suggests that the model doesn't reproduce the spatial structure of the reality correctly.
6) Sensitivity to domain definition
The worst trap in EOF use is the sensitivity to domain definition.
This effect happens when the study domain is cut in two and an EOF analysis of each of the new domains destroys the spatial structures found in the first analysis.
When that happens then, to quote Björnsson: "In this case any attempt at a physical explanation for the EOFs is difficult or plain foolish."
The paper didn't say what kind of verification, if any, has been made of domain shape dependency. When the original data matrix used for the study is available, I am sure that the first test done in the blogosphere will be to cut the disk in two halves and perform an EOF analysis on each half.
7) Sampling problems
These problems happen when the eigenvalues are closely spaced. The paper references the right text (North in Ref 26), but the eigenvalues and tests are not given.
As the first EOF explains only a small part of the variance, it can be supposed that the eigenvalues could be closely spaced.
8) SVD
EOF is used to detect spatial structures (i.e. standing waves) in a single scalar field.
When one wants to look for correlations between 2 scalar fields, the SVD method is used.
The paper looks at SST vs P-E (precipitation minus evaporation) fields.
The first SVD modes are shown in Fig 3a and Fig 3b.
The first problem is that the link to the P-E data (Ref 18) doesn't work.
After visiting the ECMWF site, it appears that the "data" provided are subject to "reanalysis" (by models). Specifically, the 1957-1971 "data" are the result of a reanalysis of NCAR data.
The maps provided at the ECMWF site do not have the spatial resolution that Fig 3b shows. The Antarctic is in white (i.e. no data available) and the surrounding area is flat (i.e. no variation).
Not having seen the data, and the paper not mentioning how they were obtained and where they come from, I cannot say anything definitive.
However I suspect that, as there surely is no real P-E data for most of the domain, the ECMWF has computed them from the model and other available data.
As that must heavily use SST (the principal driver of P-E), then necessarily the resulting computed P-E is correlated to SST. In that case the finding that P-E correlates to SST would just be a tautology.
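For readers who have not met EOF/PCA before, here is a minimal sketch of the SVD-based decomposition Tom refers to (his "4 Matlab lines"), written in Python on made-up toy data. A real analysis would first remove the time mean, weight the anomalies by the square root of the cosine of latitude, and deal with missing grid cells; the variance fraction printed here is only meaningful for the random toy matrix.

# Sketch: EOF (PCA) decomposition of an anomaly matrix via SVD.
# X has one row per month and one column per grid point (toy data here).
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((600, 2000))        # 50 years x 2000 grid points
X = X - X.mean(axis=0)                      # remove the time mean at each point

U, s, Vt = np.linalg.svd(X, full_matrices=False)
eofs = Vt                                   # spatial patterns; EOF1 = eofs[0]
pcs = U * s                                 # principal-component time series
var_frac = s**2 / np.sum(s**2)              # fraction of variance per mode

print(f"Variance explained by EOF1: {var_frac[0]:.1%}")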

August 20, 2010 5:07 am

Jane Coles,
Thank you for that fascinating Proxmire obit. I wonder what the Senator would say about today’s profligate waste of taxpayer funds?
Where people were shocked and the government was ridiculed and embarrassed for wasting thousands of taxpayer dollars, today $20 billion is handed out to organizations like ACORN to facilitate their election shenanigans, over the loud protests of citizens, and with the silent concurrence of the major media. Today’s Democrat party would have a predictable effect on Sen Proxmire.

Glenn
August 20, 2010 5:40 am

TomVonk says:
August 20, 2010 at 4:51 am
“There is no accelerated warming in the data as even J.Curry agreed . The correct title should have been :
Simulation of the SST of the Southern Ocean with CCSM and GFDL models in variable CO2 emission scenarios and a proposition of a physical explanation of the model’s behaviour .
I agree that this title is less sexy but it describes what the paper tries to do .”
Not sure she agreed, but the abstract starts with:
“The observed sea surface temperature (SST) in the Southern Ocean shows a
substantial warming trend for the second half of the 20th century.”

August 20, 2010 6:00 am

TomVonk: Thanks for your insights. I'm writing an introductory series of posts on SST-based ocean indexes – AMO, ENSO, and PDO – for those new to discussions of climate. The first two are done and I'm about to start putting together the one on the PDO. I've been trying to find simple ways to explain to readers EOF and the "leading principal component". I'm thinking of using something a little different than what's been written in the past, maybe the "dominant pattern" or something to that effect. Any ideas of how to present this for readers without science backgrounds?

david
August 20, 2010 6:35 am

The effect of snow on ice and sea water is postulated as a reason for greater sea ice in a warming ocean. This effect is seasonal, as the vast majority of Antarctic sea ice is first-year ice.
HadISST shows a decline in the Antarctic sea ice anomaly from 1973 until about 1980
and an increasing trend after 1980. This decline in sea ice from 1973 until 1980 correlates perfectly with an increase in sea surface temperatures over the same period. From 1980 until 1999 (actually 2010) there is a steady increase in sea ice, which correlates well with a decrease in SST over the same period.
WHERE is the PARADOX? (sorry to shout)
The IPCC AR4 (IPCC 2007) concludes that warming of the climate system is
'unequivocal' and that sea ice is projected to shrink in both the Arctic and Antarctic under all future emissions scenarios. Wrong in the Southern Hemisphere for the past 30 years!

david
August 20, 2010 6:38 am

Still looking for graphs of sea ice anomaly prior to 1973?

david
August 20, 2010 6:41 am

I would really like to find an SST chart overlaid onto a sea ice anomaly chart for the period of 1950 until current.

Alan the Brit
August 20, 2010 6:53 am

Call me Mr Picky, but eyeballing those extended graphs gives a slow temperature decline in the Southern Ocean from about 1977-2010! Tried every way I can but I get the same trend downwards. Sorry! I agree the trend is upwards to 1999, but hey, that was nigh on 11 years ago. Dr Vicky Pope @ the "Wet Office" UK did the same thing with temperature into this century by choosing an end point towards the end of 2007, showing global temps still rising, but she did that in 2009! It's all about start points & end points & chartmanship.
Would Ms Curry like to borrow my 1925 Pocket OED? Simulation: Feign, pretend, wear the guise of, act the part, counterfeit, shadowy likeness of, mere pretence. (I didn't include all the definitions on the last post thread.) Essentially, the word means unreal! Come on you guys, learn to use words that really do convince us of some credibility. Quit using representation, simulation, sophisticated, etc etc. They are measly snivelling words that have more holes than a colander! Stop using may, possibly, could, perhaps, but of course you can't! I do hope computer models weren't used in the space race, or man would never have got to the moon.

Richard M
August 20, 2010 7:27 am

IMO, this paper demonstrates the pitfalls of group-think and confirmation bias. I don't think there was any attempt to produce a paper that could be classified as "not even wrong". But that is what appears to have happened. From the dialog at WUWT I get a strong feeling that neither Curry nor Liu was trying to produce a propaganda paper. I think the blame is simply that there are no skeptics among the reviewers. They are also almost non-existent at universities. So, how does anyone get a critical review of their work? Pretty much impossible within the current scientific framework in climate.
Of course, this paper is not the only one to suffer from this disease. We’ve seen one example after another of this same kind of material. The field of climate science has a serious disease. It’s time the field itself recognized this problem and attempted to do something about it.

pyromancer76
August 20, 2010 7:41 am

Excellent, clear discussion, Bob. Thanks again and again.
“A question some might ask, why did Liu and Curry end the data in 1999? Dunno”
Her answer is so innocent. Doesn’t wash in today’s “climate” of claims of massive global warming. It would be fine if there was a caveat that the oceans did not continue to warm. The paper, after all, is published in the present. And maybe it will be used as part of the run-up to the next conference the goal of which is to strangle the energy development of the developed (democratic) world?

August 20, 2010 9:00 am

Bob Tisdale, nice post. But I think you know that already. : )
Having an open dialog on this paper in a blog that has wide public access provides for education of a broader demographic/audience outside of academia. I thank Curry and Liu for their willingness to do so. I sincerely hope her example will show other scientists that society can benefit by such openness.
Having seen a sampling of papers on climate science over the last 2 years, the Curry and Liu paper seems to me to fit within broad patterns of other studies that are clearly postulated upon the idea of net significant (compared to natural causes) AGW by CO2. This genre of papers does not try to justify the postulate. They openly assume it, openly show some biasing toward it and are not trying to hide it. Modeling tools which are used in such studies are openly built on it. The critical comments on this thread and the previous 2 WUWT threads are consistent with the comments on critical reviews of similarly patterned studies. Perhaps others have noticed this?
Although this J & L paper was interesting, it is not at the root of the climate science debate but a consequence. I personally would like to see more posts on the fundamentals of mainstream climate science that are the essential basis of their conclusion of "net significant (compared to natural causes) AGW by CO2".
Anthony, Anthony, you do know how to give us all jolts of intellectual delight! Thanks again.
John

George E. Smith
August 20, 2010 10:20 am

This is slightly OT; but still the best place to put it. I was checking the ice page today for the JAXA extent and then I looked at the North Polar view from DMI. A quick check with a real world map confirmed that the perimeter of the DMI polar map is indeed at +60 deg Lat; so that means this map is truly a map of "The Arctic" as distinct from the Curry and Liu pictures from the other end; which went all the way out to 40 deg S Lat (mebbe for good reasons).
But the punchline is that the DMI north polar picture makes it quite clear that there really is MORE LAND in the Arctic than water; and that land can be, and often is, the repository for lots of snow even when the Arctic ocean is a lot of open water; and given that the land is more southerly than the water, the albedo contribution from snow/ice on the land can be more than what the sea ice gives.
At the Antarctic end, the land and water are much closer to being equal; but there still is more water than land; although a lot of that water is under Ross Ice shelves and the like.
So keep that in mind next time you want to win yourself a beer at the bar; there really is more land in the Arctic than in the Antarctic; and by a good amount.

jason
August 20, 2010 11:34 am

Dr Curry in a recent interview:
“Some people were getting their papers rejected because they disagreed with the IPCC.”
Sounds to me like something worth pressing her on, the implications if true are astonishing.

August 20, 2010 12:10 pm

Dr. Curry and Dr. Liu; I thank you for this venture in collaboration and especially your determination to see the venture through!
As for your paper; I heartily agree with John Whitman’s summary and Tom Vonk’s commentary and the opening posts of Bob Tisdale and Willis Eschenbach. Why?
I tried (and am still trying) to separate the paper into assumptions, observations and findings. I expected firm statements somewhere in the document defining each category clearly. Instead I had to work hard in chasing down terms, where they came from and what they meant. Because of this, the paper comes across to me as a very parochial document meant for distribution within a close knit group. A group where terms are exchanged so frequently that they become familiar and their real meaning is “understood” by insiders. No, not climate science researchers; more like a subset group within climate science. One example of this is the use of the term snowfall in the document. Snowfall as used is a subset of the increased precipitation described. Only by chasing down all references to snowfall and precipitation did I determine that snowfall is derived from a model (20C3M of CCSM3) and (A1B of CCSM3). Silly me! I was assuming that snowfall was an observation; nope it is an assumption.
There are some really confusing statements, again perhaps because the document is meant for distribution within friends. A real concern exhibited by Willis and seconded by many people concerned the data quality prior to the 1970s. Dr. Curry explained this away by stating that she had confidence (my words) in the 1950-1999 data. OK? Well, no! Once I started trying to follow ideas and thoughts through the document I kept getting struck in the face by the lack of information about the specific years used by ALL model runs. Another assumption, I suppose, but one that leaves me uncomfortable. This discomfort level is heightened by the use of PIcntrl. What the freak is a pre-industrial simulation doing in a study where data before the 1970’s is dismally poor at best? Is this because data is so sparse that a simulation is needed to buttress the data infilling assumptions and show an unnatural warming? I don’t know, or I dunno as others have pointed out.
I am really puzzled why pre-industrial model runs were used for the 1950-1999 period, but no one involved with this document bothered to even mention recent 2000-2010 observations as confirming, refuting, or showing that the hypothesis needed further study.
My understanding of science may not be your understanding. When hearing a proposal for a hypothetical rationale to a (mostly) computer-generated problem, I personally would be more interested in proofs and the methods to obtain those proofs. Instead I spent two days deconstructing sentences into subject lines, modifiers and qualifiers and finally figured out that assumptions were made, models were built and simulations were run using data that should not have been used.
I am also confused by the chosen title for the paper. The term accelerated, does that originate from the GCM models? As far as I can find in the document this acceleration is caused by increased GHG in the GCM models. Warm begets warm waters, right? So your model uses GHG-caused exponential warming as a base assumption. What I find really confusing is the second part of the title "Its Impacts on the Hydrological Cycle and Sea Ice". Based on that second title line my assumption would normally be that this was a fact-finding study detailing the effects caused by warming to date. My bad, I couldn't have been more wrong in that thought.
Which brings us to;
1. Judith Curry says:
August 18, 2010 at 8:18 am
“…Our paper compares model simulations with available observations (we consider two different data sets) in an effort to unravel the physical mechanisms that determine Antarctic sea ice extent in response to climate variability and change.
We have identified a plausible physical mechanism that seems to make sense. Science is about trying understand how things work.
We have made no extravagant claims in either the paper or the press release. …… It talks about the increase of Antarctic sea ice, which is hardly a talking point for alarmists
Yes, climate models are imperfect and there are deficiencies in SST data sets particularly in the first two decades of the period that we examine. So we have imperfect tools to test our hypothesis. Others will examine this problem from different angles. Eventually we will have better data sets and better models to work with. That is how science works.
This paper raises an issue that climate researchers should pay more attention to. Since the climate model simulations of Antarctic sea ice generally agreed with observations, climate researchers would say “consistent with” without really understanding the mechanism. And we definitely need more and better data in the Southern Ocean…”
I doubt that truer words could ever be spoken in climate science, and we should certainly respect that a co-author would know better than us the pitfalls inherent in the study. Honesty with us is terrific; honesty with the world would have been better. Your simple summary should be in the opening statement and parts of it repeated in the findings.
From a personal perspective, if I were handed a paper written like this to review (in any discipline) I would have redlined large sections and sent it back for additional work. Papers should clearly introduce the audience to the paper, explain and document the data, tools, methods and findings, then exit on a succinct statement that leaves the reader satisfied they grasped the content. This goes even for documents targeting specific audiences. Again, I emphasize that this document assumes every reader is an inner-circle initiate. Please, please, please explain terms and phrases that are not generally understood. Snowfall indeed; I’m thinking white crystalline stuff and the writer is thinking simulated millimeters.
I hope you have a wonderful and productive trip Dr. Curry! I’m looking forward to your return and future participation.

Editor
August 20, 2010 1:16 pm

Jason
That was a telling quote. Here is the interview with Judith Curry.
She’s a brave person and I respect her courage in speaking out. I’m not impressed, though, with the practice of writing papers using data that doesn’t exist or is so sparse it’s almost invisible.
http://blogs.chron.com/sciguy/archives/2010/08/judith_curry_on_antarctic_ice_climategate_and_skep.html

PhilH
August 20, 2010 7:34 pm

For the layman, like me, I suppose the best description of this paper is that it is not a “What is” study but a “What if.” The media, not surprisingly, considering its title, chose the What is.

Agile Aspect
August 21, 2010 12:08 am

When I look at the plots of the data, a linear fit isn’t what jumps into my mind.
If one fits a nonlinear function with a linear function, then the fit is a lagging indicator.
That is, the trend is looking backwards – the curve has no predictive power.
Where is the original data?
Smoothed experimental data without an indication of the uncertainty in the data is junk science.

JFD
August 21, 2010 12:33 pm

Judith, I am a fan of yours. When you enter a discussion, it becomes much more professional and the data flows readily. It reminds me of my years spent in industry wrestling with problems and situations that had never been addressed before. Without bias and politics, technical problem solving is much more fun and exciting. Liu also entered this discussion so he is to be commended as well.
The fact that I have sat you on a personal pedestal wearing a golden crown does not impact my professional opinion of your paper, however. You got your technical butt kicked hard and often by some real pros, led by Willis Eschenbach and Bob Tisdale. Your paper raised many, many good rebuttals and a bit of heat, but your excellent standing at WUWT and a bit of nudging from Anthony kept the rebuttals on a high plane. I am sure that you know without me saying that you are highly regarded and respected at WUWT.
May I offer a suggestion for you and Liu? Pull your paper, carefully read the three discussions of your paper, find a quiet time and go sit on a stump in the woods and reflect on the discussions, then do the mini-studies suggested and finally draft a follow-up paper detailing the overall findings based on the new studies and insights.
Next, privately ask Willis, Bob, Anthony and a few others that offered meaningful rebuttals to your first paper, to review your draft. Revise the paper in light of these reviews and start the publication process over.
And watch the title!
JFD

Editor
August 21, 2010 3:10 pm

JFD
I would like to endorse all the comments and sentiments in your post.
tonyb

Ralph Dwyer
August 21, 2010 7:26 pm

I’m taking credit for this. I haven’t seen it before, but I think it can very easily be summed up as GMIGO: “grant money in, garbage out”!

Paul Vaughan
August 22, 2010 6:58 am

TomVonk
You raise interesting points. I’ve no doubt that if someone took a very serious look into the EOF analysis (on this dataset) they would find several ways to rip it to shreds.
A problem that arises with PCA, EOF, & factor analyses more generally:
People not only don’t do thorough diagnostics – they don’t even know how – worse still: they don’t even know they should. The preceding comment isn’t aimed at anyone in particular. It is an observation arising from hanging around both stats departments at universities & stats consultants.
I also strongly object to defining the Southern Ocean as 40-90S. Even just plain 60-90S is a simplification. For preliminary investigations I recommend using the Antarctic Convergence and tectonic plate boundaries as guides for spatial analysis of the Southern Ocean – (the latter will include the Southeast Pacific & a Humboldt Current component).
I agree very strongly with Dr. Curry that anything that stimulates research on the maritime deep south (not Antarctica alone) is both welcome & due. (I would include the Southeast Pacific beyond just 60-90S.) The excessive focus on the northern hemisphere (and even on the northern portion of the southern hemisphere) is unacceptably smothering.
Continental patterns differ dramatically from maritime patterns. This is not something that simply splits symmetrically over the equator. The ‘dividing line’ does not even follow a line of latitude. (See definitions suggested in the previous paragraph for preliminary investigations of the Southern Ocean — investigators like Bob Tisdale help refine our focus further…)

Paul Vaughan
August 22, 2010 9:11 pm

Correction:
not “tectonic” — rather, this is what I had in mind:
Earthquake Map:
http://earthquakes.usgs.gov/research/data/plate15.pdf
Using the Southern Ocean / Southeast Pacific boundary suggested by that map (for bounding SST geographically), one finds interesting coherence with stratospheric aerosol optical thickness (it raises more questions than it answers, which is exactly what makes it interesting).

TomVonk
August 23, 2010 4:31 am

Bob Tisdale
Any ideas of how to present this for readers without science backgrounds?
Yes.
The most basic notion, and one that is necessary for all PCA, EOF, etc. methods, is the notion of coordinates.
So you must always begin by presenting the idea that there is an infinity of possible coordinate systems, among which you choose one arbitrarily.
Then I generally follow with a rugby ball.
If you first choose a coordinate system and then put a rugby ball in those coordinates, its description is complicated. There are no symmetries and you might actually be unable to recognize that you have a rugby ball.
So the idea is to pick, among the infinity of coordinate frames, a unique special one which is chosen by observing the symmetries of the rugby ball.
Clearly, if you choose the symmetry axis of the ball as one direction and a plane orthogonal to that axis as the two other directions, you obtain a new frame of reference where the description of the rugby ball is obvious and simple.
The projections of the ball on the orthogonal plane are just circles, and its projections on a plane containing the symmetry axis are just ellipses. The rugby ball can be recognised in this frame by simply looking at it (hence the notion of “recognizing the needle” in my post above).
Now when you collect some data and visualise them in some arbitrary coordinate frame with N dimensions, those data points will just be some N-dimensional rugby ball. What the PCA, EOF, etc. method does is find a new coordinate system in which the rugby ball becomes obvious. For example, the coordinate axis which goes along the most stretched direction of the rugby ball is the most significant (technically, it “has the highest eigenvalue”) and will show the best correlations.
So it’s all just a method of changing an arbitrary coordinate frame into a special one.
Of course, if your data don’t really form a rugby ball but a badly deformed potatoid, then changing the coordinate frame doesn’t help much. It still stays a badly deformed potatoid and you recognize nothing.
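In code, that coordinate-change idea looks roughly like the following. This is a purely illustrative Python sketch with a made-up point cloud; it is not anything from the paper, just the textbook mechanics.

    # Illustrative only: PCA as a change of coordinates on a "rugby ball".
    import numpy as np

    rng = np.random.default_rng(0)

    # An elongated 2-D cloud, then rotated so its symmetry axis is not
    # aligned with the original (arbitrary) coordinate axes.
    cloud = rng.normal(size=(1000, 2)) * np.array([5.0, 1.0])
    theta = np.deg2rad(30.0)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    data = cloud @ R.T

    # PCA: the eigenvectors of the covariance matrix define the special
    # frame aligned with the symmetries of the "ball".
    data = data - data.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(data, rowvar=False))
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    # In the new coordinates the long axis of the ball is the first axis,
    # and the eigenvalues tell you how stretched each direction is.
    rotated = data @ eigvecs
    print(eigvals / eigvals.sum())

Run on a real gridded anomaly field instead of a toy cloud, the eigenvectors would be the EOF maps and the rotated data their time series.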
.
Atheok
Yes, that is what I tried to say in my post. The biggest issue with this draft is that it is impossible to tell what is data and what is just some computer program.
This is most striking in the P-E case (precipitation minus evaporation). It’s already bad enough that the link given in the reference doesn’t work. But it takes an unholy amount of time to realize that these numbers are NOT data but again some computer simulation.
Actually, when you look at the figures there are only THREE that deal with real data: 1a, 2a and 3a. Everything else, that is 27 figures, is just computer simulation!
That means that the draft is dedicated 90% to the behaviour of computer models without saying that it is so.
The summit is reached in figures 1j, 1l and 1m.
In those figures all structure is destroyed; the Antarctic is not only represented by EOF1, it IS EOF1.
That means that the phase space became one-dimensional!!
For the computer the whole of the Antarctic can be explained by only ONE variable, and as the domain is a homogeneous red blob, the relationship is linear.
What is this variable? Well, CO2 concentration, because it is the only thing that varies between Fig 1a (the reality) and Fig 1l (the computer).
The CO2 has overwhelmed and crushed all non-linear dynamics in such a way that for the computer it became the only thing that matters; forget ENSO, PDO and all complex spatial structures.
This is clearly not what happens in the real world.
Strangely, the authors, instead of saying that the computer behaves in an unrealistic way and that the results should be rejected, seem to treat the fact that the phase space became one-dimensional as having as much credibility as the real (measured) data that obviously contradict it.
This is a much too common feature of “climate” papers. People simply stop making a distinction between data and computer runs. You have no data about snowfall?
It’s not a problem; you run a computer model 100 times, average, and voilà! You have all the “data” you want. This is not good science in my opinion.
.
Paul Vaughan
I’ve no doubt that if someone took a very serious look into the EOF analysis (on this dataset) they would find several ways to rip it to shreds.
I agree. I strongly doubt that the draft will stay in its present form.
I am pretty sure that there is a sampling problem, because the first eigenvalue is much too low. That means that the second is not far from the first and/or that the eigenvalues are closely spaced.
I am also pretty sure that the size of the domain is too big. That means that the classical test of cutting the domain in two and running the EOF analysis on each of the two parts (I mean here on the real data and not on computer simulations) will show a significant sensitivity to the domain definition.
And if it does, that is enough to destroy the whole idea.
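That split-domain check is easy to set up. Here is a rough Python sketch (the anomaly matrix is invented and simply stands in for the real gridded SST data, so treat it as illustration only):

    # Illustrative only: compare the leading EOF of each half of the domain
    # with the leading EOF of the full domain.
    import numpy as np

    def leading_eof(field):
        # field: (time, space) anomaly matrix; SVD gives the leading spatial
        # mode and the fraction of variance it explains.
        field = field - field.mean(axis=0)
        u, s, vt = np.linalg.svd(field, full_matrices=False)
        return vt[0], (s**2 / np.sum(s**2))[0]

    rng = np.random.default_rng(1)
    field = rng.normal(size=(600, 200))   # stand-in: 600 time steps, 200 grid cells

    eof_full, frac_full = leading_eof(field)
    eof_a, frac_a = leading_eof(field[:, :100])    # first half of the domain
    eof_b, frac_b = leading_eof(field[:, 100:])    # second half of the domain

    # If the mode is robust, each half-domain EOF should resemble the
    # corresponding half of the full-domain EOF (up to sign).
    print(frac_full, frac_a, frac_b)
    print(abs(np.corrcoef(eof_full[:100], eof_a)[0, 1]))
    print(abs(np.corrcoef(eof_full[100:], eof_b)[0, 1]))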

George E. Smith
August 23, 2010 2:31 pm

Well just as an aside, I just (this minute) bounced EOFs off a high powered PhD Mathematician (Indian chap) who sits right adjacent to me. He spends his whole day wrapped up in analysis and filtering of signals of all manner; has Morlet Wavelets for breakfast.
He nearly had a heart attack on the mention of “empirical” with relation to “orthogonal functions” and pleaded complete ignorance (like me).
So I guess I’m in good company; in that he is very good at what he does; but evidently both of us have been leading charmed lives to have never run across EOFs.
He wiki’d it up and took a look; and confirmed that it was not within his ken; but that he could probably get up to speed PDQ (I’d agree on that (for him; not for me)).
So you blokes who use it routinely are out there in hyperspace where some of us have never been.
But when my colleague gets aboard; he will probably be able to explain it to me.
In the meantime, am I correct in that EOFs are NOT a method of synthesizing some arbitrary function in a representation that permits recovery of the original function without loss of information, but may bring insight into the arbitrary function that was not previously perceived??
It was the apparent replacement of a host of information with a dearth of information in another form, in apparent contradiction of the Nyquist Sampling Theorem, that had my alarm bell ringing.

August 23, 2010 3:52 pm

Thanks, Anthony, and thanks, Judith Curry.

TomVonk
August 24, 2010 2:11 am

George E. Smith
In the meantime, am I correct in that EOFs are NOT a method of synthesizing some arbitrary function in a representation that permits recovery of the original function without loss of information, but may bring insight into the arbitrary function that was not previously perceived??
The short answer is yes.
As I have repeatedly written on this thread, EOF is PCA for all practical purposes.
I don’t know the origin of the “EOF” name and it is rather irrelevant. It established itself in geophysics when scalar spatiotemporal fields were analysed (temperatures, pressures, densities).
The purpose is to find spatial patterns (standing waves) that capture the variability in the data.
So, to use your vocabulary, we have some unknown function f(x,t) that has been measured. Then a matrix M(xi,ti) is formed where the columns are locations and the rows are times. For example, one column is the time series of SST at Ushuaia. Etc.
You then generally remove the temporal mean from each column and form the covariance matrix.
The EOF method will give you spatial variability modes (i.e. EOFs) which explain most of the variance in the data.
Normally a graphical representation of an EOF is a contour map that tells you how much of the local (measured) variance is explained by the EOF considered.
At one place you may see 0.9, which means that this particular EOF explains almost all of the variability at that place.
At another place you may see 0.1, which means that this particular EOF explains almost none of the variance at that place.
Let us note that this is not what the figures in the paper show, even though they should.
Of course you can reconstruct all the original data behind the covariance matrix, i.e. f(x,t), from all the EOFs and their eigenvalues.
However, as the purpose of all PCA-like methods is to simplify, you will always want to truncate.
I don’t know how many eigenvectors (EOFs) there are in the matrix considered here.
Let’s suppose 50.
As you see, the authors kept only ONE EOF, which explains one fourth of the variance.
Clearly you can’t reconstruct the data with this one EOF, because three fourths of the variability are NOT explained by it.
As I have already written in a previous post, the purpose of PCA/EOF is to identify spatial modes (as they are orthogonal, they are spatially uncorrelated with each other) which explain a big part of the variability.
So no, you don’t get insight into f(x,t) itself, but you do get insight into its spatial modes of variability.
This is the easy part.
The hard part begins when you try to connect a contour map of an EOF (I remind you that the components of the EOF vector are generally correlations, i.e. dimensionless coefficients) to a physical process.
This is what von Storch called “recognizing the needle by just looking at it”.
If you can’t do that, then any PCA/EOF method is physically useless even if it is mathematically correct.
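For those who prefer code to prose, here is a bare-bones Python sketch of that recipe. The space-time matrix is invented and only stands in for the real SST field; the point is just the order of operations described above.

    # Illustrative only: the EOF recipe, with columns = locations, rows = times.
    import numpy as np

    rng = np.random.default_rng(42)
    M = rng.normal(size=(600, 50))     # stand-in for the measured field f(x,t)

    # 1. Remove the temporal mean from each column (each location).
    A = M - M.mean(axis=0)

    # 2. Form the covariance matrix between locations.
    C = np.cov(A, rowvar=False)

    # 3. Eigen-decompose: eigenvectors are the EOFs (spatial patterns),
    #    eigenvalues measure how much variance each EOF explains.
    eigvals, eofs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    eigvals, eofs = eigvals[order], eofs[:, order]
    explained = eigvals / eigvals.sum()

    # 4. Principal components: project the data onto the EOFs to get their
    #    time series; keeping only EOF1 keeps only explained[0] of the
    #    total variance, the rest is discarded by the truncation.
    pcs = A @ eofs
    print("EOF1 explains", explained[0], "of the variance")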
I hope it helps.

August 24, 2010 2:59 am

TomVonk: Thanks.