NASA Aqua Sea Surface Temperatures Support a Very Warm January, 2010
by Roy W. Spencer, Ph.D.
When I saw the “record” warmth of our UAH global-average lower tropospheric temperature (LT) product (warmest January in the 32-year satellite record), I figured I was in for a flurry of e-mails: “But this is the coldest winter I’ve seen since there were only 3 TV channels! How can it be a record warm January?”
Sorry, folks, we don’t make the climate…we just report it.
But, I will admit I was surprised. So, I decided to look at the AMSR-E sea surface temperatures (SSTs) that Remote Sensing Systems has been producing from NASA’s Aqua satellite since June of 2002. Even though the SST data record is short, and an average for the global ice-free oceans is not the same as global, the two do tend to vary together on monthly or longer time scales.
The following graph shows that January, 2010, was indeed warm in the sea surface temperature data:
But it is difficult to compare the SST product directly with the tropospheric temperature anomalies because (1) they are each relative to different base periods, and (2) tropospheric temperature variations are usually larger than SST variations.
So, I recomputed the UAH LT anomalies relative to the SST period of record (since June, 2002), and plotted the variations in the two against each other in a scatterplot (below). I also connected the successive monthly data points with lines so you can see the time-evolution of the tropospheric and sea surface temperature variations:
As can be seen, January, 2010 (in the upper-right portion of the graph) is quite consistent with the average relationship between these two temperature measures over the last 7+ years.
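For anyone who wants to reproduce that step, here is a minimal sketch of the re-baselining and scatterplot in Python. The file names and the simple text-file loading are placeholders, not the actual UAH or AMSR-E product formats:

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical, aligned monthly series covering June 2002 onward.
# lt_anom:  UAH lower-tropospheric anomalies (original base period)
# sst_anom: AMSR-E ice-free-ocean SST anomalies (their own base period)
lt_anom = np.loadtxt("uah_lt_monthly.txt")      # placeholder file names
sst_anom = np.loadtxt("amsre_sst_monthly.txt")

# Re-express the LT anomalies relative to the SST period of record by
# subtracting the LT mean over that common period.
lt_rebased = lt_anom - lt_anom.mean()

# Scatterplot with successive months joined so the time evolution is visible.
plt.plot(sst_anom, lt_rebased, "-o", markersize=3)
plt.xlabel("AMSR-E SST anomaly (deg C)")
plt.ylabel("UAH LT anomaly, re-based to Jun 2002 onward (deg C)")
plt.show()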
[NOTE: While the tropospheric temperatures we compute come from the AMSU instrument that also flies on the NASA Aqua satellite, along with the AMSR-E, there is no connection between the calibrations of these two instruments.]


Tom in snow-free Florida: You replied, “So are you saying that simply charting a monthly SST anomaly doesn’t tell you anything specific about the direction of global temperature trends?”
No. If the SST anomaly trend is positive, sea surface temperatures are rising over the period used to determine the trend, and vice versa.
And in my earlier reply, I probably gave you an overly complicated answer when a simple one would have done.
Regards
lgl (11:36:42) : You asked, “Anyway, the problem is it doesn’t drop back enough before the next step. Because of an underlying trend?”
No. The warm water that had been below the surface of the Pacific Warm Pool before the El Nino is still on the surface after the El Nino and this is, of course, during the La Nina. In other words, during the La Nina events that followed the 1986/87/88 and 1997/98 El Ninos, the warm water may not be in the NINO3.4 or Cold Tongue Index regions any longer, but it is still on the surface in the western Pacific and eastern Indian Oceans, still releasing heat, still affecting atmospheric circulation.
Bob Tisdale (07:38:43) : “Have you searched the scholarly literature written about this and the RSS MSU TLT anomaly dataset to see if your questions have been addressed?”
I am a practising engineer and left academia some years ago. Although I don’t have ready access to the literature, I can search the likes of Google Scholar; those searches have not answered my questions.
I don’t want to rely on Google Scholar alone (I don’t know how complete it is), so I try to get input from others on sites like this. If anybody can give a reference addressing the points I raised above, I’d be very interested to look into it.
Right now, the issue of design of sampling scheme and avoidance of aliasing does not appear to have been addressed. There is a good deal of analysis of statistical sampling error, but this is a question of discrete systems theory.
Tom P (05:38:41) : “Any potential issue here should already have been answered in the scholarly literature. ..Indeed it has.”
Not by that paper Tom.
The paper recognises deficiencies in the spatial and temporal temperature data. But the analysis goes absolutely nowhere near the sampling theorem. Doesn’t even mention it.
There is a long discussion of gridding the globe, and then constructing grid averages or inferring grid data using a 1200 km “horizon”. The analysis is statistical, and consideration is given to statistical sampling error.
The finest time resolution is monthly average. If the starting point is that the data is very likely to be aliased, this paper would be just messing around with ruined data.
Sorry to be so dismissive – come back if there is anything you can pull from the paper which you think is a demonstration that the sampling theorem is satisfied. Even better if you can show that it has been more-than-satisfied by a factor of 10.
Tom: “The theorem states that if signals are sampled at twice the maximum frequency present in a signal, that signal can be perfectly determined.”
That’s fine if you have an absolutely band limited signal (amplitude of zero above a certain frequency), and infinite time to gather your samples. But we have neither of these in practical systems and certainly not when it comes to climate.
So the question is how much we need to exceed the minimum theoretical frequency (time) and wavelength (space) in order to get answers in a period which meets the requirements of the AGW question. Given that we have 150 years of inadequate data, there is every possibility that we haven’t even started to gather data at the necessary resolution. (Now there’s a thought.)
Tom: “It does not say anything about the average of that signal”
Agreed! But the point being made was that the central limit theorem cannot be relied upon to dig us out of the hole that aliased data leaves us in. If data is aliased and therefore meaningless, an average value adds no more meaning.
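To make that concrete, here is a minimal sketch (synthetic numbers only, not any real instrument) of how sampling below the Nyquist rate folds a signal into a spurious low frequency, and why averaging the samples cannot undo it:

import numpy as np

f_signal = 10.0   # "true" input frequency, Hz
fs = 11.0         # sampling rate, below the Nyquist rate of 2 * 10 = 20 Hz
t = np.arange(0, 5, 1.0 / fs)             # five seconds of samples
samples = np.sin(2 * np.pi * f_signal * t)

# The samples trace out a slow oscillation: the 10 Hz input folds down to
# |10 - 11| = 1 Hz (with inverted phase). Nothing in the sampled record
# reveals that the underlying signal was really 10 Hz.
print("apparent frequency in the sampled record:", abs(f_signal - fs), "Hz")

# Averaging does not rescue aliased data: the mean of the samples is just the
# mean of the spurious 1 Hz wave, not a better estimate of the 10 Hz signal.
print("mean of aliased samples:", samples.mean())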
And yet Europe, Russia, the Far East AND North America are experiencing one of their most severe winters ever!
Kinda makes you wonder “what” any historical records might be measuring???
El Nino is fading fast.
Can one imagine the decline in “Global” TEMPS that we will be looking at the rest of this year?
Jordan: You wrote, “If anybody can, please give a reference to the points I raised above and I’d be very interested to look into it.”
I have not seen the topic raised before, here or at any other climate-related blog.
With respect to my earlier comment about emailing Dr Spencer, the climate scientists I have contacted by email (not including any of those who made headlines recently) have always been very forthcoming in their replies to my questions, and most often they provide more insight than was asked for. I understand your wish to keep the dialogue about your concerns out in the open, but, unfortunately, I don’t believe you’re going to find your answers on this thread. As you can see, it’s only one day old and the number of comments has already dropped drastically. Your discussion is also off topic. But you might get lucky, and someone who has researched it might come along.
Or consider writing a guest post.
Just had a thought. Have you tried Lubos Motl? Here’s a link to his website.
http://motls.blogspot.com/
Regards
sky: The hotspot in the central mid latitudes of the South Pacific has been there since at least November:
http://i47.tinypic.com/2mq9idk.png
While the SST anomalies in that area are unusually high during this El Nino, it is not unusual for the warm spot to occur during an El Nino. Refer to:
http://bobtisdale.blogspot.com/2010/01/south-pacific-hot-spot.html
Also, the cooler waters surrounding it have been at least partially offsetting the hotspot. Here’s a graph of SST anomalies for the South Pacific thru December 2009:
http://i49.tinypic.com/2edcdnt.png
Here’s another post that deals with the hotspot indirectly. It shows up during El Nino events from 1951 to present:
http://bobtisdale.blogspot.com/2010/01/south-pacific-sst-patterns.html
Regards
Kelvin waves demonstrate what happens with upwelling and downwelling. Downwelling is warm, upwelling is cold. The lack of strong near surface Easterly winds keeps the colder waters under the warmer skin of water (from shortwave solar radiation). When the Easterlies kick up, the warm skin is blown away, causing more mixing of colder waters below with the warm turbulent skin.
Mooloo (16:30:33) :
> It has plainly been a cold winter, on average, in the land
> portions of the Northern Hemisphere. To deny that is to deny
> what is plainly in front of your face. And it makes anyone reading
> think “what other facts is this person prepared to wave away”.
Not true in Canada, at least for January. It’s been well above the 1971..2000 normals. Go to the website http://www.climate.weatheroffice.gc.ca/prods_servs/cdn_climate_summary_e.html and select the month of January, 2010 and “Text” output. I downloaded the text output, ran it through “grep” (under linux) and picked off all sites with temperature deltas, and 0 or 1 or 2 missing days (i.e. fewer than 3) during the month. The filter consisted of the command…
grep "^.\{57\} [012]....\."
Of the 236 sites that matched, only 4 had negative deltas (colder than normal monthly temperatures for January).
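For what it’s worth, a rough Python equivalent of that filter could look like the sketch below; the column positions and file name are guesses rather than the real Environment Canada layout, so treat it purely as an illustration of the logic:

# Count stations with fewer than 3 missing days, and how many of those ran
# colder than normal. Column positions are assumed for illustration only.
matched = 0
colder = 0
with open("january_2010_summary.txt") as f:      # placeholder file name
    for line in f:
        fields = line.split()
        try:
            delta = float(fields[-2])    # assumed departure-from-normal column
            missing = int(fields[-1])    # assumed days-missing column
        except (ValueError, IndexError):
            continue                     # skip headers and malformed lines
        if missing <= 2:
            matched += 1
            if delta < 0:
                colder += 1

print(matched, "sites with <3 missing days;", colder, "colder than normal")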
Bob Tisdale (18:28:53) :
sky: The hotspot in the central mid latitudes of the South Pacific has been there since at least November
Westward, on the NZ offshore plateau, there has also been a spectacular and persistent phytoplankton bloom in the same timeframe and latitudinal coordinates, e.g.
Oct:
http://earthobservatory.nasa.gov/IOTD/view.php?id=40924
Jan:
http://www.eosnap.com/?p=13327
This dropped the local surface SST. It would also have depleted the eastward regions of nutrients (one of the main arguments against Fe fertilization) and made the waters more optically transparent, producing the hyperoligotrophic waters (purple patches) that are observable on SeaWiFS.
There is a good paper by Morel et al. on this, on the Easter Island anticyclonic gyre.
Abstract
Optical measurements within both the visible and near ultraviolet (UV) parts of the spectrum (305–750 nm) were recently made in hyperoligotrophic waters in the South Pacific gyre (near Easter Island). The diffuse attenuation coefficients for downward irradiance, Kd(λ), and the irradiance reflectances, R(λ), as derived from hyperspectral (downward and upward) irradiance measurements, exhibit very uncommon values that reflect the exceptional clarity of this huge water body. The Kd(λ) values observed in the UV domain are even below the absorption coefficients found in current literature for pure water. The R(λ) values (beneath the surface) exhibit a maximum as high as 13% around 390 nm. From these apparent optical properties, the absorption and backscattering coefficients can be inferred by inversion and compared to those of (optically) pure seawater. The total absorption coefficient (atot) exhibits a flat minimum (~0.007 m⁻¹) around 410–420 nm, about twice that of pure water. At 310 nm, atot may be as low as 0.045 m⁻¹, i.e., half the value generally accepted for pure water. The particulate absorption is low compared to those of yellow substance and water and represents only ~15% of atot in the 305–420 nm domain. The backscattering coefficient is totally dominated by that of water molecules in the UV domain. Because direct laboratory determinations of pure water absorption in the UV domain are still scarce and contradictory, we determine a tentative upper bound limit for this elusive coefficient as it results from in situ measurements.
From the introduction.
Apart from the inserted quotation mark, the title of this article is identical to that of an article published some 25 years ago by Smith and Baker (1981). This means that perhaps the purest natural waters are not yet discovered and that the subject is still topical. This article is motivated by recent (2004) observations in the exceptionally clear, blue-violet waters of the anticyclonic South Pacific gyre, west of Rapa Nui (Easter Island), and its aims are somewhat similar to those of the Smith and Baker study. Namely, the following questions are addressed: (1) What are the optical properties of these extremely clear natural waters and how can they be explained? (2) Is it possible from their apparent optical properties (AOP) to derive some upper bound limits for the absorption coefficient of pure water? Indeed, laboratory measurements of pure water absorption, particularly in the violet and ultraviolet (UV) part of the spectrum, are scarce and contradictory. Somehow paradoxically, it seems that natural waters, in certain conditions (those of extreme oligotrophy), may be purer than those prepared in a laboratory, despite enormous cautions to avoid trace organic impurities.
Re: Jordan (Feb 5 13:40),
If data is aliased and therefore meaningless
This is the first time I have met this concept; I had to look up “aliased”. Nyquist we had been introduced to by George more than a year ago, but aliasing is new.
From what I can gather, it means that spurious signals appear because the sampling frequency required by the theorem was not used.
I object to the term “meaningless”. Distorted, yes; complicated, yes; confusing, yes. Meaningless, no: as your video shows, if one suspects a Nyquist deviation, one could unscramble the spurious from the signal enough to make decisions.
Now, in the context of this post, why would one consider satellite data unable to fulfill the Nyquist theorem? The weather does not change every minute, nor the terrain correlation every hundred meters or so.
UncertaintyRunAmok (16:02:02) :
so in ‘uncertainty’ does the tree make a sound? and is the cat in the box when the lid is on?
Jordan (14:18:25) :
You are confusing spatial and temporal frequencies. Evidently you don’t understand that the spatial correlations in the paper I referenced are relevant to your concerns about aliasing.
You’re not alone, though. George E. Smith appears to be making a career out of misunderstanding the basics of sampling.
Bob Tisdale (18:11:05)
Generally fair comments and thanks for your suggestions – I’ll think about it.
You may appreciate how it is necessary to wait for the right time to get attention for an issue like this. There was not much point in trying to spark up a discussion about aliased signals when the AMSU data trend had no particular upward direction for the last decade or so. But January changed that.
I would not have thought this matter is OT on a thread which is concerned with reasons for the January 2010 spike.
Anyway, you are now well aware of the sampling theorem and aliasing with particular regard to these putative climate signals. These points are well founded in the mathematical theory of sampling and discrete signal processing. If anybody wants to create a reconstruction of an analogue signal, this is the field they are working in.
As a couple of posters have mentioned, climatology (as a field) may have made the mistake of treating this as solely a matter of statistics, and possibly completely missed the point.
It would only take a reference to the scholarly literature to give the assurance that the matter has been conclusively analysed and resolved, and that the climate system raises no such issues. With that in mind, all data acquisition systems should be referring to such work in order to satisfy the community that their design avoids aliasing.
I haven’t seen any of that, and neither have you Bob.
It is certainly not even entertained in Jones’s 150-year global average temperature reconstruction. The paper suggested above by TomP does a good job of showing how the spatial coverage has varied over the years, and how sparse the data acquisition system has been over its whole history. These works make no attempt to show how they avoid aliasing, and that puts them straight into my “aliased junk” folder.
Right now, I suggest others do the same.
Regards
Tom P (01:57:32) : “You are confusing spatial and temporal frequencies. ”
Nope! I have been at pains to keep them separate. And I have also expressed a view that the sparseness of the spatial sampling system is the one which gives me most concern.
But Tom, let’s not have that debate. I asked for a reference to the handling of the sampling theorem in the literature. You gave me a paper which treats the issue as an exercise in statistics.
Can you refer to a paper which demonstrates the temporal and spatial bandwidths and addresses the issue of how the sample data systems for the global temperature field will avoid aliasing in space or time?
Jordan: You wrote, “I would not have thought this matter is OT on a thread which is concerned with reasons for the January 2010 spike.”
It isn’t OT. Poor choice of words on my part.
Does the topic deserve a thread of its own? Right now this discussion is lost among 150+ other comments, which leads me to the guest post suggestion again.
anna v (22:30:14) : “I object to the term “meaningless”.”
I was trying to be measured. In fact an aliased signal can be not only meaningless, it can be downright misleading.
In a cyclic system, aliased data will add low frequency signals where none exist. If a researcher is looking for long term climate patterns which (due to their duration and intervening factors) are difficult to trace back to physical processes, how can they also demonstrate that detected cycles in sampled data are not just the artefacts of aliasing?
I think the only sure way is to ensure the data acquisition system is designed to avoid aliasing. And to do that, you need to start off with an analysis of the physical system in order to determine sampling frequency (time) and/or sample “wavelength” (space).
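As a sketch of that design step, here is one way it might be done with a synthetic record, illustrative numbers, and a crude 99%-of-power definition of the band limit; characterising the real climate system this way would be far harder:

import numpy as np

def required_sampling_rate(reference, dt, power_fraction=0.99, margin=10.0):
    """Estimate a sampling rate from a densely sampled reference record.

    reference      -- finely sampled record of the quantity of interest
    dt             -- spacing of the reference samples (time or distance)
    power_fraction -- treat the frequency holding this fraction of the
                      spectral power as the effective band limit
    margin         -- design factor above the bare Nyquist rate
    """
    power = np.abs(np.fft.rfft(reference - reference.mean())) ** 2
    freqs = np.fft.rfftfreq(len(reference), d=dt)
    cumulative = np.cumsum(power) / power.sum()
    f_max = freqs[np.searchsorted(cumulative, power_fraction)]
    return margin * 2.0 * f_max    # required samples per unit of dt's dimension

# Synthetic example: a slow cycle plus a faster, weaker one.
dt = 0.01
t = np.arange(0, 100, dt)
record = np.sin(2 * np.pi * 0.05 * t) + 0.3 * np.sin(2 * np.pi * 0.5 * t)
print(required_sampling_rate(record, dt))   # about 10 * 2 * 0.5 = 10 per unit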
In a less-than-orderly system (such as the climate), aliasing can suddenly produce apparent step changes. Does that not have some appeal when we consider what we just observed in January 2010?
Anna: “one could unscramble the spurious from the signal enough to make decisions.”
I don’t think so Anna. The underlying issue of aliasing is a failure to collect sufficient data to be able to reconstruct the signal. Lost data is lost forever. There is too much loss of information between the samples and trying to fit lines (or using models to infer data) between the samples will not be reliable enough to recover the true shape.
“Distorted, yes, complicated, yes, confusing, yes. Meaningless no”
Sorry to disagree again.
Anthony’s helicopter appears to fly with stationary main rotor blades. If somebody knew nothing about helicopter flight and saw that video, they might wonder what those stationary blades are supposed to be doing. They certainly don’t appear to have any part to play in the changes of direction.
Similarly, if all we had was the video of that propeller, we might calculate the average distance of the detached aerofoils from the hub, we might also carry out calculations of the average speed of each aerofoil as it departs from the hub, and the rate of loss of mass as each aerofoil disappears. We might wonder what happens to the mass as the detached aerofoils disappear and the attached aerofoils gain mass. All completely wasted effort, as the real issue is the inadequacy of the observations.
Anna: “Now in the context of this post, why would one consider satellite data not to be able to fulfill the Nyquist theorem? The weather does not change every minute nor the terrain correlation every hundred meter or so”
Yep – that’s the question. I suspect the temperature reconstructions are warped by aliased data. Perhaps in time, but I am much more concerned about spatial sampling – especially the oldest, most sparse data.
It would be nice to have that formal reported survey and analysis which addresses the issue and puts forward specifications for spatial and temporal sampling.
AS WITH ANY MAJOR CLIMATE RECORD ACHIEVEMENT…THESE PRELIMINARY RECORDS WILL BE QUALITY CONTROLLED BY NOAA’S NATIONAL CLIMATIC DATA CENTER OVER THE NEXT SEVERAL WEEKS
So I guess that we have just had the least amount of snow in 114 years?
Re: Jordan (Feb 7 03:26),
Well, I also would be interested to see whether the Aqua scanning fulfills the Nyquist criteria, so it might be good to see a thread on this.
This January peak is in line with the peak of 1998; it is an El Nino year. I think the dissonance comes from the ad hoc assignment to air temperatures of the role of controlling the climate, instead of their being one of the measures that show the energy flows of the earth system.
I think that your video shows that there exists information, not that there is no information. It is the interpretation of cause that is in doubt.
Seems to me that deterministic chaos is the answer to this Nyquist conundrum.
Jordan (02:28:40) :
As you fail to understand the relationship between sampling and correlation in the paper I referenced, there really is very little more I can offer.
Anna
“This January peak is in line with the peak of 1998, it is an el Nino year.”
Or perhaps all we have is a distortion and exaggeration of some events like ENSO. Just depends how the thermal field varies, and how it interacts with the measurements we happen to have made. If we change the sampling system, do we get a different impression of a particular event?
“Seems to me that deterministic chaos is the answer to this Nyquist conundrum.”
Don’t see why. This should only be a matter of design of the sampling system to make sure our observations and reconstructions are not an artefact of the samples we happen to have gathered.
It is a basic requirement of practical applications, such as digital control. We cannot control something unless we can observe it.
If temperature trends were nothing more than sampling artefacts, it kinda puts that 2 deg C target into perspective, doesn’t it.
Tom P (04:50:33) : “As you fail to understand the relationship between sampling and correlation in the paper I referenced, there really is very little more I can offer.”
If you’d like to press on with this Tom, then I would ask you to look again at the first video I posted above and the first example of the aliased signal where we observe a step-wise sine wave at a fraction of the frequency of the input signal.
Now, here’s the catch: all you have is the aliased series in the lower half of the oscilloscope display. You have nothing to tell you that the input signal is really 10 hertz.
From the aliased signal, please explain how an assessment of the correlation between neighbouring samples (or any other form of correlation analysis) will inform us:
firstly, that the sampled data is an aliased mis-representation of the input signal, and
secondly, that a sampling rate of 100 hertz is required to provide an accurate reconstruction of the input signal within a reasonable time scale.
(If you are still concerned about spatial versus temporal sampling, just substitute spatial dimensions for temporal dimensions in the video.)
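A minimal sketch of the first point, with purely illustrative frequencies: once the sampling is too slow, two very different inputs can yield exactly the same record, so nothing in the samples identifies the true input.

import numpy as np

fs = 12.0                 # sampling rate, Hz
t = np.arange(48) / fs    # four seconds of samples

# A 10 Hz cosine and a 2 Hz cosine produce identical samples at 12 Hz:
# cos(2*pi*10*n/12) = cos(2*pi*n - 2*pi*2*n/12) = cos(2*pi*2*n/12)
s_10hz = np.cos(2 * np.pi * 10.0 * t)
s_2hz = np.cos(2 * np.pi * 2.0 * t)

print(np.allclose(s_10hz, s_2hz))   # True: the sampled records coincide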
Think about it Tom P. This could be another example of climatology trying to develop novel techniques in areas already covered by other disciplines (in this case, discrete systems theory). What do you think are the chances that other disciplines avoid those techniques for perfectly good reasons?
Re: Jordan (Feb 7 06:28),
I had said:
“Seems to me that deterministic chaos is the answer to this Nyquist conundrum.”
you said:
Don’t see why. This should only be a matter of design of the sampling system to make sure our observations and reconstructions are not an artefact of the samples we happen to have gathered.
I found the following by googling “chaos theory and Nyquist”
http://vestnik.mstu.edu.ru/v11_3_n32/articles/02_pryg.pdf
Interestingly enough in section 3.1 he tackles the ice ages, and yes, there is chaos and Nyquist.
“The results have great importance for a general chaos theory, since a transformation between the base theoretical types of the chaos structure as result of a real processes’ evolution has been achieved for the first time.”
The weather system is chaotic and the climate too, after all.
Jordan (11:32:55) :
The spacings of the stations analysed in the Hansen paper are down to just a few kilometres. If there were serious problems with aliasing, there would not be a high level of temporal correlation between anomalies measured at neighbouring stations. The fall off in that correlation can be used to determine a reasonable sampling distance.
Climate scientists are very well aware of issues in sampling temperature data and use standard approaches to ensure that their sampling is adequate. What you consider to be a potential flaw in their way of determining temperature changes was first analysed over twenty years ago. Aliasing is not an unresolved issue.
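For what it is worth, the kind of check being described can be sketched with synthetic anomalies generated under an assumed decorrelation scale of about 1000 km; the numbers are illustrative, not a claim about the real temperature field:

import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly anomalies at stations along a line, generated with an
# assumed exponential spatial correlation (decorrelation scale ~1000 km).
n_stations, n_months, scale_km = 30, 240, 1000.0
x = np.sort(rng.uniform(0.0, 5000.0, n_stations))      # station positions, km
dist = np.abs(x[:, None] - x[None, :])
anoms = rng.multivariate_normal(np.zeros(n_stations),
                                np.exp(-dist / scale_km), size=n_months)

# Correlation of each station pair's time series versus their separation.
corr = np.corrcoef(anoms.T)
pairs = sorted((dist[i, j], corr[i, j])
               for i in range(n_stations) for j in range(i + 1, n_stations))

# The separation at which correlation drops below ~1/e suggests how closely
# stations must be spaced to resolve the anomaly field.
for d, r in pairs:
    if r < 1.0 / np.e:
        print("correlation falls below 1/e near", round(d), "km separation")
        break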
Anna
Thanks for the reference to the paper. I’m sorry, but I do not understand it as I have no particular background in chaos theory. As far as I can tell, it is not addressing the issue of whether the sampled data has been sampled correctly. The analysis appears to rest on an assumption that there is no possibility of aliasing in the sampled data series (i.e. that the data acquisition system complies with the sampling theorem). Indeed, if the data had been corrupted by sampling, I don’t think the authors’ analysis would have any meaning.
Tom P.
No demonstration of how a statistical analysis can provide a test of aliasing in the above aliased sinusoid? No statistical analysis of an aliased series which points us to the required sampling rate then?
That comes as no surprise. No new mathematics here folks, move along.
Re: Jordan (Feb 7 14:32),
Of course it would comply with the Nyquist theorem if it wants to prove the connection.
If you want to show that statistical mechanics morphs at another level into thermodynamics, which it does, you use the correct statistical mechanics to show that thermodynamic quantities are arrived at in the limits. From then on you can use the thermodynamic quantities in the knowledge that the information gained is based on solid microdynamics. You do not need to follow the microscopic processes to measure temperatures.
In a similar way using deterministic chaos dynamics you do not need to follow the underlying level that produces the chaotic dynamics. IMO of course.
Now, if the question is whether the data gathered obeys the Nyquist condition, we come to the point that it has to obey it if we want to reproduce the original correctly in detail. The question then becomes: what original? If we are talking of molecular statistical ensembles then it would be impossible, since the frequencies would be tiny, molecular distances and times. If we are talking of a complete temperature map, the distances and times become of the order of meters and hours (which is what you are inquiring after). If we look at it as a dynamical chaotic system, the distances become much larger and so do the times.
That is why I have been saying that dynamical chaos tools have to be developed for climate study (as in the paper by Tsonis et al.).