Spencer: Record January warmth is mostly sea

NASA Aqua Sea Surface Temperatures Support a Very Warm January, 2010

by Roy W. Spencer, Ph.D.

When I saw the “record” warmth of our UAH global-average lower tropospheric temperature (LT) product (warmest January in the 32-year satellite record), I figured I was in for a flurry of e-mails: “But this is the coldest winter I’ve seen since there were only 3 TV channels! How can it be a record warm January?”

Sorry, folks, we don’t make the climate…we just report it.

But, I will admit I was surprised. So, I decided to look at the AMSR-E sea surface temperatures (SSTs) that Remote Sensing Systems has been producing from NASA’s Aqua satellite since June of 2002. Even though the SST data record is short, and an average for the global ice-free oceans is not the same as a global average, the two do tend to vary together on monthly or longer time scales.

The following graph shows that January, 2010, was indeed warm in the sea surface temperature data:

[Figure: AMSR-E-SST-thru-Jan-2010 (AMSR-E sea surface temperature anomalies through January 2010)]

But it is difficult to compare the SST product directly with the tropospheric temperature anomalies because (1) they are each relative to different base periods, and (2) tropospheric temperature variations are usually larger than SST variations.

So, I recomputed the UAH LT anomalies relative to the SST period of record (since June, 2002), and plotted the variations in the two against each other in a scatterplot (below). I also connected the successive monthly data points with lines so you can see the time-evolution of the tropospheric and sea surface temperature variations:

[Figure: UAH-LT-vs-AMSR-E-SST-thru-Jan-2010 (scatterplot of UAH LT anomalies vs. AMSR-E SST anomalies through January 2010)]

As can be seen, January, 2010 (in the upper-right portion of the graph) is quite consistent with the average relationship between these two temperature measures over the last 7+ years.
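
For anyone who wants to reproduce this kind of comparison, a minimal sketch in Python follows. The file names and loading step are hypothetical (the actual UAH and AMSR-E products are distributed in their own formats); the point is simply the re-baselining to a common period and the connected scatterplot:

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical inputs: two time-aligned monthly anomaly series covering
    # June 2002 through January 2010 (92 values each).
    uah_lt = np.loadtxt("uah_lt_monthly.txt")        # lower-tropospheric anomalies
    amsre_sst = np.loadtxt("amsre_sst_monthly.txt")  # ice-free-ocean SST anomalies

    # Re-express both series as departures from their own June 2002 - Jan 2010
    # mean, so the two anomaly sets share a common base period.
    uah_rebased = uah_lt - uah_lt.mean()
    sst_rebased = amsre_sst - amsre_sst.mean()

    # Scatterplot of LT versus SST, with lines joining successive months to
    # show the time evolution.
    plt.plot(sst_rebased, uah_rebased, "-o", linewidth=0.7, markersize=3)
    plt.xlabel("AMSR-E SST anomaly (deg C, rebased)")
    plt.ylabel("UAH LT anomaly (deg C, rebased)")
    plt.title("UAH LT vs. AMSR-E SST, June 2002 - January 2010")
    plt.show()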

[NOTE: While the tropospheric temperatures we compute come from the AMSU instrument that also flies on the NASA Aqua satellite, along with the AMSR-E, there is no connection between the calibrations of these two instruments.]


189 thoughts on “Spencer: Record January warmth is mostly sea”

  1. I notice that the BBC have already jumped on this by closing their latest item about the growing skepticism of the British towards AGW with the following statement:
    …and scientists in the USA have announced that this January has been recorded as the warmest ever.

  2. I’ve been following the Arctic ice cap progress this year, and my impression is that although temperatures were frigid and the ice cap has grown as usual, warm ocean water has invaded and melted ice in a number of places.

  3. Very interesting. It will be even more interesting to see if this trend continues throughout 2010. With temps from two different instruments showing record warmth from the sea surface up through about 46,000 ft., the year is starting out exactly as it would need to in order to become the warmest year on record. We are also seeing arctic sea ice for January near the low levels we saw in the disasterous year 2007 for sea ice extent…and less sea ice means of course even more warming for the oceans…

  4. Why would Spencer be surprised? His website has been showing Channel 5 in record territory all month. Even with the caveats about drift there’d be something seriously wrong if this wasn’t reflected by the Aqua results.

  5. Is the terminology “sea surface temperature” reading the water temp on the surface, or the air temp above the surface? I’ve heard that the satellite only reads IR down to the lower troposphere and not the water surface.
    Thanks,
    Dan

  6. “Sorry, folks, we don’t make the climate…we just report it.”
    Absolutely no need to apologise. The whole reason for this site is, to my mind, to get accurate and reliable data on which to base our hypotheses, no matter where that might lead us. It would be the height of hypocrisy to justifiably attack NOAA and CRU for their manipulation of the data and then to try and do the same to reliable data when it is in our hands.
    There is no right or wrong, good or bad raw data. There are just right and wrong, good or bad ways of handling it. As long as contributors continue to be absolutely open with their data and how they have analysed it then they shouldn’t feel they have anything to apologise for.

  7. One question by the way. How does this measurement tie in with the previous thread about dropping ocean heat content? My first instinctive reaction as someone who is ignorant of this subject is to think the two conflict with each other. Is this right or am I misinterpreting what the ocean heat content measurement is saying?

  8. “There is no right or wrong, good or bad raw data.”
    rereading that (with an eye to the whole question of station data) I realise that it is not quite true but I hope people get the drift of what I meant.

  9. OT but sort of temperature related.
    Forecasters seem to be missing the mark lately. On the supposed to be sunny days we are still overcast with low clouds and corresponding cooler temperature. Local forecasters are usually pretty good but the past couple of weeks they have really botched forecasts on the clouds and amount of rain.
    Cosmic rays having an effect?

  10. The scatter plot is quite clear. The difference in the two swings along that slope. Would be nice to have several more previous decades to see the overall long term trends. But since we do not, there is nothing one can imply with this except that climate is far more complex than we, or anyone else, thinks.

  11. So if the oceans are giving off heat/cooling and the heat is going to the atmosphere where it in turn will give off heat/cool to space, should we be getting ready for a prolonged cooling period?
    Buy long underwear.

  12. OT, but continuing on the BBC theme.
    http://www.bbc.co.uk/blogs/thereporters/richardblack/2010/02/much_has_been_written.html
    I never thought I’d see this statement get through the BBC firewall (and there are several other comments like it too)
    ….
    The mounting evidence is that climate scientists and their cheerleaders are simply looking for a problem where there isn’t one. When significant warming is now shown to be an Urban Heat Island effect and when data sets have clearly been manipulated in a way that creates more warming, and scientists hide data and refuse requests for data, then we skeptics see a huge red flag! With each revelation, the skeptics shout from the roof tops – see we were right – there really is something fishy here and the IPCC, Western Governments and media are all “jumping to conclusions”.

  13. What about the impact of El Nino? What role does it play? Would it suggest that January, 2010 should be warmer or cooler than the data indicate? I understand that the current El Nino is, while stronger than usual, weaker than the record one in 1997-1998.

  14. R. Gates (10:58:18) :
    …… We are also seeing arctic sea ice for January near the low levels we saw in the disasterous year 2007 for sea ice extent…
    You can also see that currently the ice is just above the 2005 level which ended up being comparable to 2009!!

  15. R. Gates (10:58:18) :
    Very solid argument…. to extrapolate for 2010 from a single data point…
    By the way, 2006 was the year with the lower sea-ice extent but was way up for the summer sea-ice extent.

  16. The words “record” (confused with ‘on record’) and “ever” make great headlines in the media but are utterly fallacious in earth science, especially if the “record” refers to… the last few decades. There is a lot of confusion in the media and the population about the relative time scale of everything related to anthropology, geology and climate and it seems to be worsening.

  17. This is natural climate variability. It is one measurement of a complex system. Nothing that is happening is out of the ordinary or unusual.
    The basic alarmist hypothesis, that a rise in the trace gas CO2 will cause runaway global warming and climate catastrophe, remains as baseless as ever.

  18. R. Gates (10:58:18) : “We are also seeing arctic sea ice for January near the low levels we saw in the disasterous year 2007 for sea ice extent…”
    Out of curiosity, what “disasters” were caused by the 2007 decrease in arctic ice extent?

  19. @Dan in California
    “Is the terminology “sea surface temperature” reading the water temp on the surface, or the air temp above the surface? I’ve heard that the satellite only reads IR down to the lower troposphere and not the water surface. ”
    We must NEVER lose sight of the initial point of all this data gathering, which is to confirm the CO2 AGW hypothesis. Warmists will try to prove that the world is warming, and then treat that as confirmation. The land and the sea may indeed grow warm or cold, but the key question still remains – Is this provably because of increased CO2 concentrations?
    Remember to make this point in all external discussions – the warmists’ arguments will simply get stronger if we have a hot summer otherwise….

  20. The previous post is telling me the ocean heat content is going down. This post tells me near record temperatures are all about SST. Every other comment on this site is about the cold winter being experienced or that the technology measuring these temperatures cannot possibly be accurate. Other posts are telling me that climate scientists have less moral fibre than British politicians and the journalists who report on them, yet lapping up some of their quotes. What is a poor, ignorant but worried person to do? Perhaps I should start reading the agony aunt blogs before I go completely insane.

  21. “john pattinson (11:40:11) :
    The previous post is telling me the ocean heat content is going down. This post tells me near record temperatures are all about SST.”
    These two are not contradictory. The ocean pumps energy into the atmosphere via El Nino, El Nino is warm upwelling, thus causing a high SST and warming the atmosphere. After this phenomenon you end up with a lower OHC.

  22. The earth is mostly sea, so it goes without saying that the majority influence of the “global temperature” is always from the sea.

  23. Ray said:
    “Very solid argument…. to extrapolate for 2010 from a single data point…”
    The whole month of January is hardly a single data point. Now, if it had been a single day that was warm, that would be different. January had 31 data points as days. Last I checked the year was made up of 12 months, and now we are more than 1/12 of the way through the year and temperatures are still high throughout the troposphere from sea level up to 46,000 ft. Now that’s a pretty big “single” data point…
    Gary said:
    “Out of curiosity, what “disasters” were caused by the 2007 decrease in arctic ice extent?”
    Disasterous used in a metaphoric sense, meaning when compared to previous sea ice extents. That’s how I meant it, but from another standpoint I suppose the argument could be made that the record low sea ice was bad news: Record low ice means a positive feedback is being established whereby increased solar energy can be absorbed by the ocean causing more warming causing more melting and more warming and more melting, etc. Eventually, if this continues, even in our lifetimes perhaps, the arctic will be ice free in the summer. Great for shipping goods via the northwest or northeast passage perhaps, but potentially very bad for the ecological balance of the planet we’ve enjoyed since our ancestors first came down out of the trees.

  24. Like Smokey wrote, let’s not forget that the ocean is very deep with lots of currents and although the temperature at the surface is rising (due to insolation?), the bulk of the ocean can certainly be cooling at the same time.

  25. I am so confused. I just finished reading a very long line of comments on the January record high global UAH anomaly. It seems there were several explanations for this:
    Some comments seemed to indicate that this isn’t surprising; that in fact the Earth has been warming for the last 150 years. Is this true? Isn’t this what many climate scientists have been saying?
    Other comments seemed to indicate that the warming of the atmosphere was due to heat transferred to the atmosphere from the oceans, but that this meant the heat levels in the oceans were falling. Now this post claims the oceans seem to be heating even faster than the atmosphere. Which is true? Are the oceans cooling as the heat is transferred to the atmosphere?
    Finally, the UAH anomaly not only broke the record for January, it smashed it. The previous high of 0.59 was set only three years ago, and now this January the anomaly hit 0.72. Can we explain this big spike up?

  26. john pattinson (11:40:11) :
    The whole point is that declaring January 2010 the warmest ever in light of recent events is asking for trouble.
    Most unwise and carrying an unwarranted risk with it.
    For what?

  27. Substantially more investment in research into natural climate variations is needed so we can get off these ridiculous political distortion cycles that track every bump & dip in climate.
    The climate, like many of us, is neither left nor right. The common ground shared by alarmists & nonalarmists is a lack of deep understanding of natural climate variations — there is a common priority upon which to capitalize via agreement on the channeling of serious research funding.
    Ongoing instability is not the answer …not even politically, as it might not line up with election cycles the way most might prefer – so this is a control risk until we understand natural variation better — note to hyperpartisan clowns: sensible folks might recommend investment in true understanding to supplement the goofy spin.

  28. I would also say that this non-intuitive local weather vs Global temperature raises a question.
    Since the oceans have such a large heat storage capacity, wouldn’t using surface station data as a proxy for global temperature, for the years before numerous open-ocean temperature readings were available, be unreliable to say the least?

  29. Steve Goddard (11:49:38) :
    So basically, the UHI effect has no (or insignificant) impact on global temperatures?

  30. Ray,
    UHI has almost no effect on satellite temperatures. Cities make up a small percentage of the planet. They even make up a small percentage of California.

  31. R. Gates (12:03:15) :
    “Great for shipping goods via the northwest or northeast passage perhaps, but potentially very bad for the ecological balance of the planet we’ve enjoyed since our ancestors first came down out of the trees.”
    The ecological balance you speak of has swung from where I am being under a mountain of ice to warming to the point where the Sahara became lakes and grassland. Much of that change (back and forth) long after man’s ancestors left the trees.

  32. As far as I can tell about the ice, the JAXA graph seems to not be adding new data for the past day or so, as it’s remaining at the same number, which would apparently be impossible for ice extent.
    As for UHI not having a significant effect, the Earth’s surface is 3/4ths water and there is a sense that the majority of the UAH record is because of the oceans.

  33. Ray,
    UHI does obviously have an impact on ground based temperatures which are increasingly based in cities. That is one reason why GISS has trended upwards over the last decade, while satellite temperatures have gone down.

  34. R. Gates (12:03:15) :
    Try again…. just not convincing… and flawed.
    January represents only about 8.4 % of the total 2010 data. It is statistically insignificant.

  35. I’m not really surprised by this result. There has been a huge hot area in the southern Pacific, beyond the ongoing El Nino. And the southern Atlantic has been on the hot side as well.
    http://www.osdpd.noaa.gov/data/sst/anomaly/2010/anomnight.2.4.2010.gif
    Even though the El Nino is weakening, there is a 3 to 4 month delay on global temperature. And that hot area in the southern Pacific is not weakening, as far as I can tell. I think that we have to expect February and March to be fairly warm also. The good news is that the ocean should be dumping a lot of heat into space right now. When that cycle completes, I’m guessing that we’ll see a nice La Nina start up some time in the second half of this year.

  36. Ray (12:18:49) :
    What he said was that the majority of the effect on global climate is caused by the activity of the oceans. UHI skews the perceived temperature trend in land-based measurements; it was never actually claimed to alter global temperatures.

  37. R. Gates:

    We are also seeing arctic sea ice for January near the low levels we saw in the disasterous year 2007 for sea ice extent

    A time-lapse video posted on this site sometime last year shows quite clearly that the 2007 loss of sea ice had more to do with the wind than anything else.
    BTW there’s no ‘e’ in disastrous.

  38. Just seen the BBC poll results. Now is it just me or is someone at the BBC trying to ‘confuse the decline’ with that top graph?
    The top graph shows an 83% to 75% drop in global warming believers – or claims to.
    But actually it doesn’t. Take a look lower down and the numbers that have been used are from the lower poll results which asks a rather different question – is climate change happening? The text also says climate change. They have conflated warming and change to build the graph.
    The lower graph seems to be the source of the 83 and 75 (except I make 41+32+8 = 81 and 26+38+10 = 74 so they are still exaggerating!)
    But open the pdf for the real eye-opener. You see a ‘net yes’ (Huh?) and some altogether more interesting numbers – like just 26% of all respondents believing global warming is man made.

  39. Am interested if you can comment on NH vs. SH January temperatures. Were both warmer or did one offset the other? thank you for your work.

  40. Does this mean that the El Nino is causing sea temperatures to go up?
    If it does then I wonder what will happen when it ends since temperatures over the land seem not to be very high and if the sea temperature drops then my imagination tells me we are going to get even colder weather in the near future.

  41. Just as I suspected. Dr Roy, what is up with that warm pool in the south Pacific. That is really unusual…

  42. Ray (12:18:49) :
    No, UHI does not have any significance to global temperatures.
    It does have a significant impact on surface temperature readings that are used to estimate the surface temperature, multiplying the errors over and over.

  43. “Are we to assume from this that the oceans are dumping a lot of energy?”
    That is exactly what I was thinking….

  44. RSS has 0.64 globally for January:
    year  mon   -70.0/82.5   -20.0/20.0   20.0/82.5   -70.0/-20.0   60.0/82.5   -70.0/-60.0   Cont. USA   0.0/82.5   -70.0/0.0
    2010    1        0.640        0.758       0.760         0.375       1.532         0.285       0.040      0.800       0.472
    — John M Reynolds

  45. Will someone please make the definite statement that when SST goes higher it is either:
    1. due to the release of heat from the oceans and they are cooling
    or
    2. due to the accumulation of heat in the ocean and they are warming
    Which is it?

  46. As I see it, there are three possible causes of “warmer” sea surface temperatures:
    1) The heat was already in the ocean, and the warm water became mixed – via anomalous ocean currents, or sub-sea volcanism, etc
    2) Fewer clouds allowed more insolation than previous periods
    3) Ocean currents slowed, allowing more solar heating in a comparable period of time.
    There’s no reason to suspect that reduced cloud cover in the ocean would not be complemented by reduced cloud cover over the continents.
    There’s no reason to suspect sub-sea volcanism, or an anomalous salinity gradient caused surface warm water to mix. [Except for the sub-sea activity that must have accompanied the earthquake off the coast of Haiti.]
    3) is consistent with the interaction of the NAO with the PDO, slowing North Atlantic water giving rise to the Gulf Stream from the Equatorial currents.
    Opposing views are most welcome

  47. OT: U.N. Climate Chief: Critics Should Rub Their Faces With Asbestos
    http://www.foxnews.com/scitech/2010/02/05/climate-chief-critics-rub-faces-asbestos/

    Rajendra Pachauri, the besieged head of the U.N.’s International Panel on Climate Change, told the Financial Times on Wednesday that he is the victim of a “carefully orchestrated” campaign to block climate change legislation.
    “I would say [there are] nefarious designs behind people trying to attack me with lies, falsehoods,” he told the paper, swatting away allegations that his India-based climate institute, TERI, has benefited from decisions made by the IPCC, which he also chairs.

  48. john pattinson (11:40:11) :
    The previous post is telling me the ocean heat content is going down. This post tells me near record temperatures are all about SST.
    John, just ask questions. There are a lot of very bright people on this blog.
    Also you will see different opinions and theories here, not a cohesive group dedicated to pushing “The Theory of Climate Change”. As skeptics we say we do not know everything, but if you have a new theory, PROVE IT. That means you will see people throw out new ideas on this blog to be “peer reviewed” and dissected, or new papers to be discussed.

  49. Archonix (12:28:19) :
    Scott Covert (12:36:14) :
    I know the impact of UHI on the temperature records and certainly how they use it to make their point that the earth is warming, but my point was: why are they then using land-based temperature measurements if, in the end, the oceans are in fact the heat capacitors/storage of the earth?

  50. Tom,
    SSTs are largely influenced by clouds. If the sky is clear the sun warms up the upper few meters.
    Circulation is also important, as La Nina occurs when cold water gets dragged up from depth along the west coast of South America.

  51. Ray,
    UHI obviously has some effect on land temperature measurements
    But 70% of the earth’s surface is water, so SSTs obviously have a considerable effect on overall temperature of the planet.
    So how do you derive a “global temperature” when you are possibly comparing apples with pears?

  52. Need Answers:
    None of the above.
    The Spencer post doesn’t show the ocean heating faster than the atmosphere. What it mostly shows is that (surprise!) the atmosphere and ocean are linked, so if one goes up, then it will either go back down, or the other will come up to meet it (as happened now, with Jan 2010).
    As far as reasons for it… you won’t find those answers here. What you’ll find here is wild speculation by ill informed amateurs, or random comments like “but this is the coldest winter in memory so the satellite data must be lying.”
    And, by the way, this hasn’t been that cold a winter. People just like to say that it is. I guess it makes them feel good.

  53. Will this “spike” in global temperature lead to an “upward bump” in longer term temperature (as the 1998 El Nino was followed by temperatures a couple tenths higher than the previous norm). We’ll see over the next decade.
    To me “global warming” is a separate issue from “anthropogenic global warming”. Evidence of warming is not, per se, evidence of AGW.
    In any case, I want the globe to warm a bit — overall it’s too cold now.

  54. my prognosis is that 2010 will be warm, possibly the warmest since 1934, but including a subtraction of an el nino correction in line with the stagnation during the last decade.
    sea ice will continue to recover due to ocean currents.
    glaciers will recover in regions world wide due to heavy snowfalls and cold land temperatures. data bases will not be updated.

  55. MattN:
    “what is up with that warm pool in the south Pacific.”
    That’s a dandy, isn’t it. Combine that with the current El Nino, and it’s almost like we are getting two El Nino’s at the same time. And that southern Pacific hot area is twice as large as the US.
    Not to worry. It’s one of those things that comes and goes.

  56. Just to be clear, UHI does affect the global temperature. Very little in satellite measurements, but more so in ground based readings like GISS and Had-Crut.
    My point was that the majority of influence in the calculated “global satellite temperature” is from the ocean.

  57. As of 2002, geophysicists have realized that deep-ocean (bathymetric) volcanism since the mid-19th Century has been progressively heating ocean basins, leading to accelerating evaporation as warmer water rises to shallow continental shelves. Since evaporation is a cooling process, “natural air conditioning”, Earth should experience cyclical temperature fluctuations inducing flooding rains in summer, blizzard snows in winter.
    The 500-year Little Ice Age that ended c. 1880 – ’90 has gone through four such cycles, diminishing from fifty years (1890 – 1939) to forty (1940 – ’79) to thirty (1980 – 2009). On this basis, we project a 20-year impending cool-phase from c. 2010 -2029. Now as a “dead sun” Solar Cycle 24 may presage a 70-year Maunder Minimum (which last occurred in 1645 – 1715), it seems Earth’s climatic thermostat may crash, for twenty years is no time-frame at all.
    Over 1.8-million years, the Pleistocene Era has exhibited well-defined periodic glaciations averaging 102,000 years, interspersed with median 12,250-year remissions. Adjusting for the 1,500-year Younger Dryas “cold shock” that ended c. BC 7300, our current Holocene Interglacial Epoch was due to end c. AD 2000 + (12,250 – 12,300) = AD 1950. Sixty years later, various astronomical and geophysical cycles may reinforce to tip Earth to Ice Time again.
    Since long-term climatic episodes are driven mostly by astronomical, geophysical, and plate tectonic factors, atmospheric convection currents are symptoms of temperature change, not cause. Moreover, as a classic “complex dynamic system” Earth’s atmosphere is subject to Chaos Theory, which forbids linear extrapolations of any kind, while fundamental thermodynamic Conservation Laws render “greenhouse effect” hypotheses equivalent to positing perpetual motion.
    Given that geophysical eras from the Cretaceous/Tertiary (K/T) Boundary 65-million years ago have averaged some 14 – 16+ million years, cyclical Pleistocene glaciations probably have at least 12 – 14 million years to run. As overdue Ice Time envelops Gaia once again, why should anyone but peculating Climate Cultists and their fellow-travelers be surprised?

  58. Scotland records coldest winter
    The past two months have entered the record books
    Scotland has suffered some of the coldest winter months in almost 100 years, the Met Office has confirmed.
    http://news.bbc.co.uk/2/hi/uk_news/scotland/8492333.stm
    Near record cold and snow in South Dakota
    http://media.www.sdsucollegian.com/media/storage/paper484/news/2010/02/03/News/No.End.Of.Snow.In.Sight-3863555.shtml
    AccuWeather Inc. is on record as saying this could be the coldest winter in 25 years in the United States.
    http://www.philly.com/philly/news/homepage/81082317.html
    Coldest winter for 200 years!
    http://www.halifaxcourier.co.uk/nostalgia/Coldest-winter-for-200-years.5979547.jp
    Jan. breaks record for consecutive days below freezing
    http://www.gainesville.com/article/20100201/ARTICLES/2011005/1002
    Parts of Europe, Asia report record cold, snow this winter
    http://www.spokesman.com/stories/2010/jan/21/parts-of-europe-asia-report-record-cold-snow-this/

  59. Tom in Florida (12:41:14) :
    Will someone please make the definite statement that when SST goes higher it is either:
    1. due to the release of heat from the oceans and they are cooling
    or
    2. due to the accumulation of heat in the ocean and they are warming
    Which is it?

    The first one. I’ve been saying it for a year now. The ocean heat content is dropping, because the amount of solar energy going into it has been falling since 2003, and cloud has been increasing.
    The ocean responds by releasing the heat built up over the long run of high solar cycles, keeping the global temperature up. The cold air mass over the continents isn’t being swept over the ocean because global atmospheric angular momentum is low. This is linked to regular changes in the motions of the planets and the moon.
    Have a read around on my site by clicking on my name.

  60. Boy, this is confusing!
    But if the heat is a result of an El Nino – what’s that got to do with CO2? The heat is released by warming in certain parts of the oceans (and which then subsequently warms the atmosphere) – but what has that got to do with AGW? Is anyone saying CO2 causes El Ninos?

  61. Cold weather causes Russian visitors to Finland to seek visa extensions as their cars refuse to start Temperatures down to -35°C have become almost commonplace in Eastern Finland; icy winds from Siberia look set to continue into February
    http://www.hs.fi/english/article/Cold+weather+causes+Russian+visitors+to+Finland+to+seek+visa+extensions+as+their+cars+refuse+to+start/1135252416377
    DOZENS OF people have died as bitterly cold weather again grips central and eastern Europe. At least 20 people perished in recent days in Poland, where more than 200 have died during a particularly harsh winter.
    http://www.irishtimes.com/newspaper/world/2010/0127/1224263206165.html

  62. R. Gates (12:03:15) :

    “Out of curiosity, what “disasters” were caused by the 2007 decrease in arctic ice extent?”
    Disasterous used in a metaphoric sense, meaning when compared to previous sea ice extents. That’s how I meant it, but from another standpoint I suppose the argument could be made that the record low sea ice was bad news: Record low ice means a positive feedback is being established whereby increased solar energy can be absorbed by the ocean causing more warming causing more melting and more warming and more melting, etc. Eventually, if this continues, even in our lifetimes perhaps, the arctic will be ice free in the summer. Great for shipping goods via the northwest or northeast passage perhaps, but potentially very bad for the ecological balance of the planet we’ve enjoyed since our ancestors first came down out of the trees.

    Given the rapid recovery of the Arctic ice over the last two winters, I don’t believe there is a significant positive feedback through decreased albedo. Here in Scotland (57N) the winter sun is so low and weak that it couldn’t warm a piece of toast, let alone an ocean. And as for the North West Passage, please. I followed the blogs of the 5 or 6 or so boats that made it through last year; they were all fairly small, and some were extremely fortunate to have made it at all – e.g. http://www.yachtfiona.com/northwestpassage2009/newsletter1.html (The best effort and achievement was definitely the Arctic Mariners – who only made it half way – but they were in a 17.5ft open boat which they had to row/drag across the ice and sail against the wind for much of their commendable adventure; see http://www.arcticmariner.org/#iceCharts)
    The Arctic ice has had significant periods of retreat in recent history which suggests that the 2007 retreat was nothing to be concerned about. Especially since a lot of ice disappeared down the Fram Strait with the wind, not warming. See http://wattsupwiththat.com/2009/06/20/historic-variation-in-arctic-ice/ and
    http://wattsupwiththat.com/2009/11/04/arctic-warming-goes-with-the-floe/

  63. I worked for ten years at a company that designed sensors for the weather satellites used to measure atmospheric temperatures. What these satellites actually measure directly is the heat-generated radiation coming from the relatively hot (compared to absolute zero!) atmospheric gasses as they look down on the earth from orbit. Pick a temperature profile, going from the ground to the top of the earth’s atmosphere, and from it you can calculate the heat radiation you expect the satellite to see at different radiation frequencies. This is what you do when you want to design a satellite instrument, to predict more or less what it could be seeing as it looks down at the earth. It is a tedious but straightforward calculation, done using straightforward computer programs.
    Now, once the satellite is in orbit, what you get from it is the heat radiation at different frequencies. Can you take this heat radiation and from it calculate, using computers, the temperature profile up through the atmosphere? It turns out this is what applied mathematicians call an “ill-posed” problem, meaning that small errors in your radiation measurements — and no sensor is ever perfect — result in big, random-looking errors in your temperature measurements.
    The satellite people decades ago came up with a solution to this difficulty, and it involves forcing the computer programs calculating the atmospheric temperatures to take into account what we expect the atmospheric temperatures to be. The computer tries to find the closest match to the radiation data from the satellite while making small and reasonable changes away from the average “expected” atmospheric temperatures at the place and season it is looking at. By now I’m sure you can see what the problem is… if someone in a position of authority changes what the computer’s expected atmospheric temperatures are, the satellite measurements will produce different temperature estimates for the same measured heat radiation.
    If M&M want to keep the satellite guys honest, the way they made (eventually) the CRU scientists come clean, they should be asking for the raw sensor data coming down from the satellites, year after year, and check to see whether any trends exist in it. As long as the temperatures produced from the satellite data come from a statistical constraint on the original ill-posed problem, I would take those temperature values with a grain of salt…
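    The “pull toward an expected profile” described above is essentially a regularized inversion. Below is a toy sketch (not the actual UAH/RSS processing; the forward operator, noise level, and prior profiles are invented) showing how changing the first-guess profile shifts the retrieved temperatures for the same simulated radiances:

      import numpy as np

      # Toy regularized retrieval, for illustration only.  A smooth, ill-conditioned
      # operator K maps a 10-level temperature profile to 3 noisy radiance-like
      # measurements; the solution is pulled toward a first-guess profile x_prior.
      rng = np.random.default_rng(0)
      levels = np.linspace(0.0, 1.0, 10)
      K = np.array([np.exp(-((levels - c) ** 2) / 0.08) for c in (0.2, 0.5, 0.8)])

      x_true = 280.0 - 60.0 * levels                      # invented "true" profile (K)
      y = K @ x_true + rng.normal(0.0, 0.1, K.shape[0])   # simulated noisy measurements

      def retrieve(y, x_prior, lam=1.0):
          # Minimize |K x - y|^2 + lam * |x - x_prior|^2 (Tikhonov regularization).
          A = K.T @ K + lam * np.eye(K.shape[1])
          return np.linalg.solve(A, K.T @ y + lam * x_prior)

      x_hat_a = retrieve(y, x_prior=280.0 - 60.0 * levels)  # prior close to the truth
      x_hat_b = retrieve(y, x_prior=282.0 - 60.0 * levels)  # prior shifted 2 K warmer
      print("mean change in retrieved profile (K):", np.mean(x_hat_b - x_hat_a))

    With only three measurements constraining ten levels, the parts of the profile the measurements cannot see simply follow the prior, which is the dependence being warned about.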

  64. john pattinson:

    The previous post is telling me the ocean heat content is going down. This post tells me near record temperatures are all about SST

    The oceans are huge and deep, with giant heat transport mechanisms in action. So it’s quite possible to have a large warm surface area, even though the heat content of the oceans is lower. Just like in the atmosphere, and even more so in the coupled ocean/atmosphere system – where we’re experiencing very cold weather in most of the NH, which is balanced out by the Pacific SST (and parts of the Arctic, and perhaps a small part of Australia) being a bit above average.
    No great mystery there.

  65. Need answers (12:05:49) :
    There are a number of metrics one can look at.
    Sea Surface temperature doesn’t necessarily have much to do with ‘Ocean Heat Content’ as the surface is affected by evaporation, which is affected by winds.
    Our understanding of how Oceans give up their heat isn’t very complete. The Argo buoys have only been operational for a few years.
    In all the discussions about ‘global warming’ there has always been a problem with ‘The Missing Heat’.

  66. What does the January 2010 AMSR_E Global sea surface temperature anomaly for 60 degrees S to 90 degrees S look like?
    Mike Ramsey

  67. Gary Hladik (11:36:53) :
    “”Out of curiosity, what “disasters” were caused by the 2007 decrease in arctic ice extent?””
    I think I broke a nail.
    R. Gates (12:03:15)
    “”but potentially very bad for the ecological balance of the planet we’ve enjoyed since our ancestors first came down out of the trees.””
    You do realize that spring can come early, or late.
    Summers can be wet one year, dry the next, extremely hot one year, and cool the next.
    Winters can be long or short, warm or extremely cold.
    One year at a time, several years together, etc.
    Got it R. Gates?
    Our “ecological balance” is not that delicate.
    We’ve been through a few mini ice ages, mini heat strokes, since we came out of the trees.
    Even with the worst most hysterical disaster predictions,
    we have a few centuries to back away from the water.
    This is not Goofy Gores movie.

  68. SJones (13:03:56) :
    “Boy, this is confusing!
    But if the heat is a result of an El Nino – what’s that got to do with CO2? The heat is released by warming in certain parts of the oceans (and which then subsequently warms the atmosphere) – but what has that got to do with AGW? Is anyone saying CO2 causes El Ninos?”

    That is the point.
    CO2 caused Global warming is only one of several theories. It is just that the Mass Media, Politicians and Money grubbers have ignored the other theories because they are not as useful when trying to herd the human cattle in the direction they want (more taxes, more regulations, forced purchase of expensive products)

  69. I still say that spatial aliasing is the chief suspect here.
    An aliased sample signal can look highly plausible; trying to interpret aliased data means you are most likely fooling yourself. See these two short videos I posted on the earlier thread:

    http://www.youtube.com/watch?v=LVwmtwZLG88&feature=related
    The first video makes the very important point that in a practical system, you have to sample at 5 to 10 times the absolute theoretical minimum. And if a practical system is sampled at only a small multiple of the theoretical minimum, we are still likely to get a highly misleading impression (referred to as a “modulated signal” in the video).
    But maybe I’m wrong, and it is easy to show me I’m wrong.
    Before we sample a system for a real practical purpose (like controlling it) we carry out a survey to assess its bandwidth. That allows us to design a sampling strategy which avoids aliasing.
    So please refer me to the detailed survey of the temporal and spatial field of the Earth’s surface temperature (or at whatever elevation) which would allow me to conclude that aliasing is not a problem.
    If nobody can do that, we are wasting our time trying to interpret trends in this type of series (not just the AMSU series, but all of them).
    This is not a question of statistical sampling error. It is a question of discrete signal processing.
    REPLY: This is a good point. Sat sampling is a different time domain from surface sampling. FYI this video is even more amazing

    – Anthony

  70. Now you got me. How does this square with the graph of Ocean Heat Content in the previous post, and especially the “Annual Ocean Heat Content (0-700) & Linear Trends”, aka Fig. 2? I assume temperature is a measure of heat content.

  71. Many people are now wondering how much colder this global warming can go. Install an auxiliary fireplace? Buy additional heaters? Personally bought two for this winter and am glad I did! OT: And who is the guy in the back room at Houston Control fiddling with the voltage and calibration knobs? 🙂
    Seriously, good job Dr. Spencer, enjoy your posts. Seems to me you just receive the data sent down from the satellites, you don’t control the satellites, that is by another NASA division, right? People wonder such things.

  72. PETER:
    Peter (12:28:20) :
    “R. Gates:
    We are also seeing arctic sea ice for January near the low levels we saw in the disasterous year 2007 for sea ice extent
    A time-lapse video posted on this site sometime last year shows quite clearly that the 2007 loss of sea ice had more to do with the wind than anything else.
    BTW there’s no ‘e’ in disastrous.”
    Peter, you’re right on both counts. Spelling, yes. Wind, yes. The year 2007 was the first since 1934 that Russia was unable to use the North East Passage to service its arctic ports (Archangel et al). Reason? The wind had blown more ice into the White Sea and adjacent areas than icebreakers could handle.

  73. roger (12:47:42) :
    Can we rely implicitly on satellite instrument recordings? For example AMSR-E was last updated on 1st Feb and has had other problems in the recent past.
    For those Trolls gloating over the AMSR-E trace of 2009/2010 ice rebuild, try this link which compares 2006/2007.
    http://igloo.atmos.uiuc.edu/cgi-bin/test/print.sh?fm=02&fd=05&fy=2007&sm=02&sd=04&sy=2010
    Roger, is it my imagination or does it look like the ice this year is much more concentrated than in 2006/2007? I also randomly checked several other years from the past. I didn’t find any year that showed the ice so concentrated. Do you see the same thing?

  74. What many people don’t seem to consider is that the thermal capacity of water is orders of magnitude greater than that of air. So it’s quite feasible to have very cold temperatures over most of the earth’s land area, together with a relatively small ‘pool’ of slightly warmer sea surface, with little or no change in the amount of energy in the whole system.
    That’s just one of the things that’s wrong with the whole concept of global average temperatures.

  75. Roy W. Spencer.
    “But, I will admit I was surprised.”
    I don’t know which regions you should look at first, but take a look at ‘ocean surface relative humidity’. NH January = low air temp = little water vapour support. The ocean finds it hard to evaporate into a saturated atmosphere. Another thing to check would be the diurnal timing of the data capture.
    Best regards, suricat.

  76. Roy Spencer – “Record January warmth is mostly sea”
    Ole Humlum – Climate4You
    “The storm Grote Mandrenke (Great Drowning of Men) strikes the Netherlands in January 1362. Hurricane-force winds with enormous waves and a considerable sea level rise (a storm surge) due to the combined action of push by the wind and lifting of the sea surface because of low air pressure flooded extensive areas of the Netherlands, killing at least 25,000 inhabitants. This number should of cause be seen in relation to the much smaller population at that time than now. The storm also flooded and eroded large land areas in western Slesvig, Denmark, whereby sixty parishes is said to have disappeared totally. Also southern England was severely hit by the storm, with much damage on buildings and infrastructure.
    “The 1362 storm resulted in severe coastal erosion, contributing to the opening of a pre-existing topographical low in the Netherlands towards the North Sea. This process was already initiated by previous storms, and after a disastrous flood in 14 December 1287 (St. Lucia’s flood) the name Zuiderzee came into general usage for this 120 km long pocket-like extension of the North Sea. The 1287 flood is the fifth largest flood in recorded history, and is believed to have drowned somewhere between 50,000 and 80,000 people.” *
    * http://www.climate4you.com/ClimateAndHistory%201300-1399.htm#1362: Grote Mandrenke and the opening of the Zuiderzee in the Netherlands
    ———-
    In the 14th Century we didn’t have a watch on global temperatures. And it was “cooling” then too.

  77. Hi,
    Just a little nitpick. Sorry to be pedantic, Dr Spencer, but when you said in your second paragraph “Sorry, folks, we don’t make the climate…we just report it.”, didn’t you mean to say “weather”?
    Gaz.

  78. Peter (14:22:55) :

    What many people don’t seem to consider is that the thermal capacity of water is orders of magnitude greater than that of air. So it’s quite feasible to have very cold temperatures over most of the earth’s land area, together with a relatively small ‘pool’ of slightly warmer sea surface, with little or no change in the amount of energy in the whole system.
    That’s just one of the things that’s wrong with the whole concept of global average temperatures.

    I have brought this up several times in the past. Leif asked if I would still hold the view that the overall energy hadn’t changed much if the temps dropped.
    I actually said, truthfully, yes.
    What I didn’t say is I might crow about the temp drop to annoy the alarmists.
    DaveE.

  79. So the ocean has belched some heat and is therefore cooler as a result (at least a little anyway). This heat is now in the air, but will be quickly and largely lost to space (I assume), or at least melt some of that record snow. So the nett result for February is the earth has less heat/is cooler(?). Where will we see this, in the air temperatures? Will we see a fall to negative anomalies this month, that would show no nett heating would it not?

  80. “”” Jordan (13:40:58) :
    I still say that spatial aliasing is the chief suspect here.
    An aliased sample signal can look highly plausible, trying to interpret aliased data means you are most likely fooling yourself. See these two short videos I posted on the earlier thread: “””
    Very cool video Jordan. Of course sampling theory predicts that if you sample at the correct Nyquist rate, that you can (in theory) exactly reconstruct the original continuous function. But that really requires that the original samples be of zero time duration, and also that the correct impulse response be used for the replacement of the samples, in the reconstruction. Your use of a 10 Fn sampling simplifies that reconstruction process.
    You should do this video experiment again at around half Nyquist or slightly less to show people that at half Nyquist, the aliasing errors now corrupt the zero frequency signal, which of course is the average of the continuous function; so you only have to infringe Nyquist by a factor of two to make the average unrecoverable.
    This happens in the twice daily min/max temperature readings for these Stevenson screen stations and the like. With the 24 hour diurnal temperature cycle, twice a day sampling is only adequate if the daily temperature cycle is a pure sinusoid; but if it has a fast morning rise, and a slow evening fall, indicating at least a 12 hour second harmonic component, then min/max recording fails the Nyquist criterion.
    And of course the spatial sampling is a cruel joke, compared to what is required.
    But it seems that a lot of “climatologists” are really statisticians, and know nothing of the general theory of sampled data systems.
    And unfortunately the central limit theorem, will not buy one a reprieve from a Nyquist violation.
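    A quick numerical check of that min/max point (the amplitudes below are synthetic, chosen only for illustration): for a pure sinusoidal diurnal cycle, (Tmin + Tmax)/2 recovers the true daily mean exactly, but adding a 12-hour harmonic that breaks the symmetry biases it:

      import numpy as np

      t = np.linspace(0.0, 24.0, 100000)                   # hours, finely sampled
      fundamental = 8.0 * np.sin(2 * np.pi * t / 24.0)     # 24-hour diurnal cycle
      harmonic    = 2.5 * np.cos(4 * np.pi * t / 24.0)     # 12-hour second harmonic

      T_pure = 10.0 + fundamental              # pure sinusoid
      T_skew = 10.0 + fundamental + harmonic   # asymmetric diurnal shape

      for name, T in (("pure sinusoid", T_pure), ("with 2nd harmonic", T_skew)):
          print(name, "true mean:", round(T.mean(), 2),
                "(min+max)/2:", round(0.5 * (T.min() + T.max()), 2))
      # Output (approx.): the pure sinusoid gives 10.0 both ways, while the
      # skewed cycle has a true mean of 10.0 but (min+max)/2 near 7.6.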

  81. Richard Tyndall (11:14:08) : You asked, “How does this measurement tie in with the previous thread about dropping ocean heat content? My first instinctive reaction as someone who is ignorant of this subject is to think the two conflict with each other. Is this right or am I misinterpreting what the ocean heat content measurement is saying?”
    An El Nino releases heat from the tropical Pacific. The tropical Pacific Ocean Heat Content does show a drop this year, as one would expect:
    http://i49.tinypic.com/2nut183.png
    The drop during this El Nino so far is not as significant as comparably sized El Nino events of 1986/87/88 and 1991/92. It may wind up being more in line with the quick drop in response to the 2002/03 El Nino, which was also about the same magnitude.

  82. Re: Dave in Canada (Feb 5 13:27), Yes Dave, I do, as well. As catastrophe insurance underwriters, we had a crude database of hurricanes going back to about 1900, and for many years we were worried by the idea that with AGW, which we then believed in, hurricane activity would increase because of the greater extent of warm seas in the Atlantic/Gulf areas. What appears to have happened is that hurricane damage was worse in the 1950s when global temperatures were flat to down, and again in 2004/5, but the 1980s and 1990s were very quiet years.
    My guess is that if there is any truth in the GCMs then the temperature gradient between the tropics and the poles flattens as the earth warms, and this leads to fewer intense storms. However there are interesting correlations with the El **** deleted as commercially sensitive….

  83. Fred from Canuckistan (11:20:20) : You asked, “So if the oceans are giving off heat/cooling and the heat is going to the atmosphere where it in turn will give off heat/cool to space, should we be getting ready for a prolonged cooling period?”
    Nope. An El Nino (the discharge phase) can be followed by a La Nina event (the recharge phase).

  84. Putting aside some valid points about sampling and drift in satellite instrumentation (in which a phrase such as “we qualified and accepted it to make sure four types of things wouldn’t go wrong, then we launched it, and found a fifth” comes up a lot), has anyone calculated the amplification factor between the troposphere and the surface?
    I believe one of the ‘fingerprints’ of AGW is an increasing amplification trend, hence the heated debate about the Santer v Douglass papers.

  85. “FYI this video is even more amazing”
    Sure. We don’t expect helicopters with stationary main rotor blades to fly. And they tend to make less noise than that one in your video. These are physically implausible observations. To an uninformed observer, it appears that they do, but fortunately we all know where the problem lies.
    Equally, we know that propellers do not continually shed solid aerofoil blades – as one of the videos I posted seems to suggest. So we are fortunate enough to get straight to the root of the problem in that case.
    Now take a leap away from those cases. We are trying to observe global temperature trends from sampled data systems.
    What are we to conclude when we see a significant step change in global average temperature? Not just the principle of such a step change, but a step change in a *monthly average* value!
    How many petajoules appear to be suddenly “changing hands” in the atmosphere when before there were none? What physical short-term mechanisms do we believe can produce such a change? Is this feasible within the variability we can normally expect from the atmospheric system?
    If we have surveyed the characteristics of the global temperature field and designed a “sampling strategy” with that in mind, we can take comfort that we are not observing an artefact of aliasing (this is more a reference to spatial aliasing).
    It is quite easy to show that the absolute maximum spacing between temperature samples is about 1000 km for a “weatherless” climate where atmospheric temperature continuously varies between the poles and equator. But the climate has weather patterns at much finer resolution. So the spatial distribution of samples must be much finer than 1000 km to avoid aliasing (or a “modulated signal”).
    I don’t have enough information (the “spectral analysis”) of the spatial climate system to suggest a minimum spacing between samples to capture the variations in the climate across the global temperature field (at a particular elevation). This requires a detailed survey of the physical system prior to sampling.

  86. George: “And unfortunately the central limit theorem, will not buy one a reprieve from a Nyquist violation.”
    So true.

  87. George: “Of course sampling theory predicts that if you sample at the correct Nyquist rate, that you can (in theory) exactly reconstruct the original continuous function. ”
    I agree, but I think the difference is that the theory has the luxury of infinite time to gather information about the signal.
    That’s probably the reason why practical systems need to exceed the absolute theoretical minimum by a factor of 5 or 10. Otherwise we can get that hideously distorted “modulated signal” in the first video.

  88. It would possibly be more enlightening to see the total atmospheric column charted in the same manner (hint, hint, Dr. Spencer). If the troposphere is warmer but the stratosphere is cooler, is the net change null? Hmmm. Cooler gasses occupy less volume?, pressure increases slightly?, nacreous clouds form further south?, and all that jazz.
    BTW, a lot of the instrument descriptions and tech specs (including calibration and validation methods), as well as the calculations leading to the final data product, are readily available (although a little difficult to sort through) from within the AIRS website and links there. I’m sure they are doing the best they can considering the challenges involved.
    And yes, they are not actually measuring “temperature” from over 600 km in orbit, they are measuring “channel brightness” (or peak line intensity in my field) of O2 emissions in far IR and microwave frequencies. Considering everything that could have an impact on those measurements, I’m glad my lab is safely enclosed in a climate-controlled facility on the surface of the planet, where most of the variables can be comfortably controlled.
    And I will not stop saying THIS about CO2 radiative emissions, because some people simply refuse to quit believing that it is a gas species with magical properties – absent collision, the only molecule that can absorb at CO2 emission wavelengths is ANOTHER CO2 molecule. If this were not true, it would not be possible to measure molecular emission lines from a satellite that is orbiting the planet at nearly 700 km. Luckily, the ground I walk on is not composed of CO2. No, CO2 “reradiation”, or “back-radiation”, or whatever they are calling it today, CANNOT heat the surface.

  89. 1997 was a solar minimum followed by a temperature spike in 1998 due to el nino; 2009 was a solar minimum followed by a temperature spike in 2010 due to el nino.
    i’m not a scientist, but…

  90. Jordan (15:51:08) :
    …”It is quite easy to show that the absolute maximum spacing between temperature samples is about 1000 km for a “weatherless” climate where atmospheric temperature continuously varies between the poles and equator. But the climate has weather patterns at much finer resolution. So the spatial distribution of samples must be much finer than 1000 km to avoid aliasing (or a “modulated signal”).
    I don’t have enough information (the “spectral analysis”) of the spatial climate system to suggest a minimum spacing between samples to capture the variations in the climate across the global temperature field (at a particular elevation). This requires a detailed survey of the physical system prior to sampling.

    As our non-linear climate displays deterministic chaos, the scale of sampling to get any sort of accuracy of global temperature would need to be incredibly small. I’ve seen estimates that say that even if a 3-D 1m grid over the entire volume of the atmosphere across the globe was to be used, with a 1 minute sampling rate, the accuracy would still only be around 0.1 of a degree C.
    Not sure what this means in terms of Nyquist frequency?

  91. ps I should add that the videos I attached to my posts are not *my* videos. I searched YT for them – credits due to their authors

  92. “It turns out this is what applied mathematicians call an “ill-posed” problem, meaning that small errors in your radiation measurements — and no sensor is ever perfect — result in big, random-looking errors in your temperature measurements. The satellite people decades ago came up with a solution to this difficulty, and it involves forcing the computer programs calculating the atmospheric temperatures to take into account what we expect the atmospheric temperatures to be. The computer tries to find the closest match to the radiation data from the satellite while making small and reasonable changes away from the average “expected” atmospheric temperatures at the place and season it is looking at. By now I’m sure you can see what the problem is… if someone in a position of authority changes what the computer’s expected atmospheric temperatures are, the satellite measurements will produce different temperature estimates for the same measured heat radiation.”
    Thank you D. Ch.
    Do I interpret this correctly if I say that when a certain climate anomaly happens where the troposphere temperature is for one reason or another warmer than the surface actually is, this creates a disconnect between satellite temperature and surface temperature?
    So if we have this exception in place, “making small and reasonable changes away from the average “expected” atmospheric temperatures at the place and season it is looking at”, this algorithm might not adapt to the situation and end up producing warmer surface temperatures that might not represent the actual surface temperatures?
    So the satellites really measure the troposphere and use some kind of static formula which should give the surface temperature if standard / expected conditions exist?
    I’m by no means trying to criticize the satellite measurements; I’m merely trying to understand better the complexity of doing these conversions and the situations which might lead to a mismatch between satellite data and surface data.
    If so, would I be correct to interpret this as meaning that satellite data is quite accurate information about the temperature of the troposphere, but, while certain conditions exist, it might not be an accurate representation of surface temperatures?

  93. And, by the way, this hasn’t been that cold a winter. People just like to say that it is. I guess it makes them feel good.
    If you want to persuade people, then admitting the obvious is a first step. It has plainly been a cold winter, on average, in the land portions of the Northern Hemisphere. To deny that is to deny what is plainly in front of your face. And it makes anyone reading think “what other facts is this person prepared to wave away”.
    Just as you don’t want the non-alarmists to wave away what you regard as important data, you cannot do it either. Accept that it has been a cold winter on land in the NH, and deal with it, if you want us to accept that it has been a record warm one at sea.
    (BTW it has, from my point of view, been a fairly cool summer. It would also help your cause if you did not leave half the world out of any arguments.)

  94. Btw: About the data flow: Do universities get the raw satellite data, or has it been pre-processed by NASA?

  95. From my specialty (hot water heating).
    A one-inch copper water line can transport about 60,000 BTUH with a 20° delta T.
    Compare that with forced air, the duct size needed to transport the same amount of energy with air is about 8″ X 14″.
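    [A quick back-of-envelope check of the figure above, assuming the usual hydronic rule of thumb Q ≈ 500 × gpm × ΔT (ΔT in °F) and a typical ~6 gpm flow for a one-inch line – the flow rate is an assumption, not the commenter’s number:]

        # Rule-of-thumb check (500 ~= 8.33 lb/gal * 60 min/h * 1 BTU/(lb.degF));
        # the 6 gpm flow for a 1-inch copper line is an assumed typical value.
        def btu_per_hour(gpm, delta_t_f):
            return 500.0 * gpm * delta_t_f

        print(btu_per_hour(6, 20))   # 60000.0 -- consistent with the 60,000 BTUH quoted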

  96. George E. Smith (15:10:38) :

    But it seems that a lot of “climatologists” are really statisticians, and know nothing of the general theory of sampled data systems.

    I would say that all “climatology” is statistics AND that they are practising statistics without a license!
    Dave.

  97. Jordan,
    I come to these blogs to learn things and you taught me something today – thank you. I agree completely: whatever we think we are seeing, it is certainly affected by spatial aliasing.
    If Dr Spencer is out there – or anyone else for that matter. Is there a response to this issue?

  98. D. Ch.
    “The satellite people decades ago came up with a solution to this difficulty, and it involves forcing the computer programs calculating the atmospheric temperatures to take into account what we expect the atmospheric temperatures to be. The computer tries to find the closest match to the radiation data from the satellite while making small and reasonable changes away from the average “expected” atmospheric temperatures at the place and season it is looking at. By now I’m sure you can see what the problem is… if someone in a position of authority changes what the computer’s expected atmospheric temperatures are, the satellite measurements will produce different temperature estimates for the same measured heat radiation.”
    —–
    Fascinating. While I am prepared to believe that this January’s UAH lower tropospheric temperatures are the hottest on record, D. Ch.’s post reminded me of how NASA’s Nimbus 7 satellites failed to detect the ozone hole over the Antarctic. The British Antarctic Survey’s head of the Geophysical Unit, Joseph Farman, made Dobson spectrophotometer readings of the Arctic atmosphere from the time of IGY (International Geophysical Year 1957-58) onward, at Halley Bay, and detected a sharp drop in ozone in 1981. His first instinct was to distrust his instrument, so he ordered a new one. Even with his new Dobson spectrophotometer, which arrived in 1982, the ozone hole was evident. Still skeptical, he travelled to McMurdo station 1,000 miles away to take another series of readings in 1983. He would not publish his results until he eliminated all the possibilities. He refused to accept his results partly on account of how anomalous they were compared with the ozone readings from NASA’s Nimbus 7 satellites. It turned out that NASA scientists programmed the Nimbus computers to analyze only ozone data that fell between 180 and 650 Dobson units; atmospheric models had indicated that values outside that range were unrealistic. Data falling outside the parameters were considered to be due to instrument malfunctions – and put aside (but fortunately not discarded). This led to a non-detection of the ozone hole by scientists monitoring the most sophisticated technology of the day, while Farman and his more empirical approach using older technology was able to detect the anomaly and in the end draw attention to it.
    I can see parallels, but doubt that anything similar has occurred with the UAH satellites. Still, it’s worth thinking about asking to see the raw data, as you recommend to Steve McIntyre.
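    [A toy sketch of the quality-control trap described above – the 180–650 Dobson unit acceptance window is the one quoted in the comment; the daily readings below are invented purely to show the mechanism:]

        # Range-based QC can silently discard a real anomaly.
        # Acceptance window (180-650 DU) is from the comment above; readings are made up.
        readings_du = [320, 305, 290, 150, 140, 135, 310]

        kept      = [r for r in readings_du if 180 <= r <= 650]
        set_aside = [r for r in readings_du if not (180 <= r <= 650)]

        print("kept:", kept)            # the "plausible" values
        print("set aside:", set_aside)  # the ozone-hole-like values, flagged as sensor error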

  99. Aagh – I should proofread more carefully! Joseph Farman worked only in the Antarctic, not in the Arctic as indicated in one of the sentences of my previous post.

  100. MattN (13:56:37) : You wrote, “About that warm pool in the south Pacific. Might this have something to do with it,” and posted this link:
    http://www.iceagenow.com/Undersea_volcanic_eruption_in_Tonga_heating_the_water.htm
    There’s a link at the bottom of the page to the full post with comments here at WUWT:
    http://wattsupwiththat.com/2009/03/19/undersea-volcanic-eruption-in-tonga/
    If you were to run through the thread looking for my name you’d find my 13:14:06 comment on 3/19 with the following map, pointing to the location of Tonga. Steve missed the mark by a thousand miles or so.
    http://s5.tinypic.com/25kohg1.jpg
    The big red blob (scientific term) in the South Pacific is a common response during El Nino events.
    http://bobtisdale.blogspot.com/2010/01/south-pacific-hot-spot.html
    It also occurs at other times.

  101. Dave in Canada (13:27:54) : You wrote, “Anyone else find it curious that hurricanes down, ocean temps up?”
    El Nino events suppress Atlantic hurricanes, so, in effect, it’s the other way around.

  102. SJones (13:03:56) : You wrote, “But if the heat is a result of an El Nino – what’s that got to do with CO2?”
    Not a thing. Climate scientists misrepresent the effects of El Nino events. Refer to the following posts that also ran here at WUWT:
    http://bobtisdale.blogspot.com/2009/01/can-el-nino-events-explain-all-of.html
    And:
    http://bobtisdale.blogspot.com/2009/01/can-el-nino-events-explain-all-of_11.html
    And:
    http://bobtisdale.blogspot.com/2009/06/rss-msu-tlt-time-latitude-plots.html
    You wrote, “The heat is released by warming in certain parts of the oceans (and which then subsequently warms the atmosphere) – but what has that got to do with AGW?”
    Not a thing. There is no evidence that anthropogenic greenhouse gases warm the oceans. Refer to these posts:
    http://bobtisdale.blogspot.com/2009/09/enso-dominates-nodc-ocean-heat-content.html
    And:
    http://bobtisdale.blogspot.com/2009/10/north-atlantic-ocean-heat-content-0-700.html
    And:
    http://bobtisdale.blogspot.com/2009/12/north-pacific-ocean-heat-content-shift.html
    Is anyone saying CO2 causes El Ninos?
    They try on occasion.
    If you have any questions, feel free to ask here or at my website.

  103. “While the tropospheric temperatures we compute come from the AMSU instrument that also flies on the NASA Aqua satellite, along with the AMSR-E, there is no connection between the calibrations of these two instruments.”
    But since they fly together there must be a close correlation with the surface track they are observing. So some level of connection might be expected?

  104. Bob, I don’t get it. Is the whole planet warmer this month than it has been for a few decades? Because the sun has been in a funk.

  105. Tom in Florida (12:41:14) : Will someone please make the definite statement that when SST goes higher it is either:
    1. due to the release of heat from the oceans and they are cooling
    or
    2. due to the accumulation of heat in the ocean and they are warming
    Which is it?
    ************
    It depends on the time frame. If you’re talking about year-to-year variations, the rises are typically associated with El Nino events. During El Nino events, the tropical Pacific releases heat into the atmosphere, and through changes in atmospheric circulation, the El Nino events cause sea surface temperatures to warm outside of the tropical Pacific. The release of heat in the tropical Pacific also reduces the ocean heat content there.
    Then there are multiyear time frames. There are also multiyear aftereffects of strong El Nino events that can cause significant step increases in lower troposphere temperature anomalies of the Northern Hemisphere. Discussed here:
    http://bobtisdale.blogspot.com/2009/06/rss-msu-tlt-time-latitude-plots.html
    There are also multiyear aftereffects of strong El Nino events in a significant part of the global oceans. Discussed here:
    http://bobtisdale.blogspot.com/2009/01/can-el-nino-events-explain-all-of.html
    and here:
    http://bobtisdale.blogspot.com/2009/01/can-el-nino-events-explain-all-of_11.html
    If the El Nino causes a multiyear La Nina event, Ocean Heat Content will rise in the tropical Pacific and in many other ocean basins as well during that multiyear La Nina:
    http://bobtisdale.blogspot.com/2009/09/enso-dominates-nodc-ocean-heat-content.html
    And then there are multidecadal timespans, and the major variable is the Atlantic Multidecadal Oscillation.
    http://en.wikipedia.org/wiki/Atlantic_multidecadal_oscillation
    Some try to say the Pacific Decadal Oscillation (PDO) has a similar effect but the PDO is an aftereffect of the El Nino and La Nina events.

  106. “Is anyone saying CO2 causes El Ninos?”
    RealClimate has all but said those exact words. I believe the context was they did not think the Pacific would ever go back to a negative phase.

  107. JP I’m glad you found the post useful. It has been long enough since I worked in this area that I couldn’t confidently say how to get the raw data — just asking for it today may well alarm people given the political issues involved. It may need an official FOI request to pry the raw sensor data loose and also to find out how the computer programs have changed over the years.
    One other point occurred to me after writing the previous post – that by relying on an expected average set of temperatures for the atmosphere at a given time and place, the satellite measurements in effect assume a steady climate (climate being broadly defined as average weather) and use that assumption to estimate each day’s temperatures as a relatively small deviation, mathematically speaking, from the expected climate. This not only makes the satellite temperatures vulnerable to changes in the computer programs’ assumptions about the climate, it also makes the satellite temperatures a relatively bad way to measure climate change of any sort, since the computer programs are always, until told otherwise, assuming **exactly** no climate change. I guess this means that, from the climate alarmists’ point of view, satellite temperatures could be expected to show less of a temperature rise over the last several decades than the surface data does.
    From the point of view of the rest of us, it would be nice if CRU had not lost the raw data and if NOAA had not decided over the last 15 years or so to stop taking measurements from weather stations located at high altitudes and above the arctic circle. From an objective point of view, it may still make sense to look for trends in the satellites’ raw heat-radiation data, and find out whether the satellite software has changed over the last thirty years. Vigilantfish’s story about how the satellites missed the developing polar ozone hole is also very relevant, because (although I’ve not worked with ozone detection) the mathematics of measuring ozone from its effect on the radiation leaving the earth’s atmosphere would almost certainly be another type of ill-posed problem — so it too would have to be solved by starting out with some restrictive assumptions about the sort of ozone layers the satellite is looking at.
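    [For readers who want to see the shape of the constrained inversion being described, here is a minimal Tikhonov-style sketch in Python. It is emphatically not the actual AMSU/UAH processing: the weighting functions, prior profile and noise level are all invented for illustration.]

        import numpy as np

        # Generic regularized retrieval: find the profile closest to a prior/expected
        # profile that still matches the measured radiances.  All inputs are invented.
        rng = np.random.default_rng(0)
        n_levels, n_channels = 20, 4

        K = rng.random((n_channels, n_levels))        # made-up channel weighting functions
        K /= K.sum(axis=1, keepdims=True)

        T_true  = 250.0 + 30.0 * np.linspace(1.0, 0.0, n_levels)   # made-up "true" profile
        T_prior = T_true + rng.normal(0.0, 2.0, n_levels)          # the "expected" climatology
        y = K @ T_true + rng.normal(0.0, 0.1, n_channels)          # noisy radiance-like data

        lam = 0.5                                     # strength of the pull toward the prior
        A = K.T @ K + lam * np.eye(n_levels)
        T_hat = T_prior + np.linalg.solve(A, K.T @ (y - K @ T_prior))

        # Change T_prior and T_hat changes too -- which is exactly the concern raised above.
        print(np.round(T_hat - T_true, 2))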

  108. tallbloke: You wrote, “…and cloud has been increasing.”
    And if memory serves me well, the complaint about the early ISCCP cloud amount data was the noise created by volcanic aerosols. Is there something to combat that dataset now that comes to mind?

  109. It’s a good idea to step back and look at what the numbers actually mean; and in this case, they are departures from the mean over a previous, longer period. (I hate the term “anomaly” because it has connotations of wrongness.) So the January figure tells us only that it’s warmer than what one would expect if there were no change.
    Instead of an actual increase in mean temperature, such a “warming” could simply mean that it’s not cooled as much as the mean. Several factors can explain such a “lack of cooling”; the most likely that springs to mind is increased cloud cover.
    One can’t know for sure without reference to the absolute temperatures instead of departures from the mean in conjunction with the global insolation data for the relevant periods.

  110. Paul Vaughan (12:17:43) : “Substantially more investment in research into natural climate variations is needed so we can get off these ridiculous political distortion cycles that track every bump & dip in climate.”
    Hear hear.
    Chris
    Norfolk, VA, USA

  111. Symon (18:18:17) : You wrote, “Bob, I don’t get it. Is the whole planet warmer this month than it has been for a few decades? Because the sun has been in a funk.”
    First, a clarification: The record lower troposphere temperature (TLT) anomaly is for the month of January. According to the preliminary reading (0.724 deg C), it is not an “all-time” record TLT anomaly. February and March 1998 readings were higher at 0.76 deg C.
    Second, other bloggers are claiming the extended solar minimum is causing the oceans to release heat and warm the atmosphere. Not me. I can very simply point to the fact that the 1997/98 El Nino caused an upward step in the TLT anomalies of much of the Northern Hemisphere. I’ve discussed it here:
    http://bobtisdale.blogspot.com/2009/06/rss-msu-tlt-time-latitude-plots.html
    A similar but smaller upward step occurred as a result of the 1986/87/88 El Nino.

  112. Bob Tisdale (17:49:26) : [Dr. Spencer] .. would be the person with the answers to your questions.
    I don’t wish to make this a question for the AMSU series or any particular researcher.
    I’d prefer to ask questions of the science in the open and to receive open replies. Any potential issue here should already have been answered in the scholarly literature.
    Methods for removing the possibility of artifacts due to aliasing should be addressed in statements of methodology, and perhaps analysis included in the SI to relevant papers.
    But we do have an absolute requirement set out for us in the sampling theorem, and this will limit the scope for any suggestions of novel approaches, or sweeping inadequate data into statistical averages and hoping that everything will be OK.
    Jones et al presents a temperature series going back some 150 years. His series has been elevated to the status of the “instrumental record” and claimed to be superior to proxy data. But his analysis seems to be of a statistical nature, and there is no obvious statement about how he has complied with the sampling theorem for the last 150 years. Not just achieving the theoretical minimum sampling condition, but exceeding it by a factor of around 10 in order to be able to accurately reconstruct climate patterns.
    Not to forget that the shape of those series must be attributable to real physical processes. If fluctuations are swept under the carpet as “random variations”, you should look upon them as equivalent to those backward-spinning wagonwheels in old movies.
    Aliasing must be designed-out of the system as a principal objective of the sampling strategy. If you do not feel satisfied that this has been achieved, the safest thing to do is to ignore the shape of these putative climate signals.

  113. The brightness anomaly for January –
    http://www.remss.com/msu/msu_data_monthly.html
    shows a large red blob over China, including the sea ice off the coast, which is set to get worse soon –
    http://www.china.org.cn/environment/2010-02/05/content_19372251.htm
    It is obvious that when it reads sea ice as a red blob where previously this did not exist, even within the life of the satellite, there is something wrong with the calculations made regarding snow and ice.
    Australia temp anomalies are incorrect too: they should show cooling in the top half, but only show warming in the lower.
    The problem of snow/ice should not influence Australia, so the question is why?
    I could understand that, because AMSU frequencies are microwave and snow and ice are rather transparent to them compared with water, snow or ice may be miscalculated due to different types of snow crystals etc., and snow where there was none before, or even sea ice where there normally is none – but why is Australia showing no cooling? Base period perhaps?

  114. Jordan (02:24:34) :
    “Any potential issue here should already have been answered in the scholarly literature.”
    Indeed it has. First, there are some serious misunderstandings by commentators on this site about the Nyquist theorem. The theorem states that if a signal is sampled at twice the maximum frequency present in it, that signal can be perfectly determined. It does not say anything about the average of that signal, and that can be very well, but not perfectly, determined from far fewer points: you don’t need to know every point on the ocean floor to determine its mean depth of 12,400 ft to the nearest 100 ft.
    Dr Roy Spencer’s measurements are on a 2.5 x 2.5 deg grid, or at a spacing of about 250 km. Hansen and Lebedeff looked at the variation in temperature anomalies with spatial frequencies back in 1987:
    http://pubs.giss.nasa.gov/docs/1987/1987_Hansen_Lebedeff.pdf
    and concluded that sampling at 1000 km was sufficient to plot the anomaly variation.
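    [A small numerical illustration of the distinction drawn above between reconstructing a signal and estimating its mean – the “field” below is invented (14.0 plus a short-wavelength wiggle), purely to show the point:]

        import numpy as np

        # Mean estimation survives undersampling that would badly alias the waveform.
        x = np.linspace(0.0, 1000.0, 100_001)              # fine grid: the "true" field
        field = 14.0 + 2.0 * np.sin(2 * np.pi * x / 7.0)   # wiggles every 7 units

        coarse = field[::1000]          # one sample every 10 units: far too coarse to
                                        # resolve the 7-unit wiggle (it will alias)
        print("true mean:  ", field.mean())     # ~14.0
        print("coarse mean:", coarse.mean())    # also ~14.0, despite the aliasing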

  115. “Sorry, folks, we don’t make the climate…we just report it.”
    Not to nitpick but one month of temperature data at 14,000 ft is hardly climate.
    The CO2 hypothesis suggests that the oceans are warmed by the troposphere which traps heat near the surface. So it should be noted that any explanation showing that warming of the troposphere in January is caused by ocean heat does not support this hypothesis, and can be written off as natural variability due to an El Nino effect.
    Why the ocean warming (SST) translates into higher temperatures at 14,000 ft and not over land is beyond me. In the NH I believe oceans (excluding sea ice) cover not much more than 60% of the surface, and it would seem the higher amounts of snow cover in January would reflect more sunlight back to space, and perhaps to clouds at 14,000 ft?

  116. Since global temperature data only goes back a few years,
    one more little tidbit from Prof. Ole Humlum’s (University of Oslo) website at Climate4You:
    The Spanish Armada (Jul-Aug 1588) and the Winter of 2007 –
    “From an official English political point of view the outcome was a major triumph for the English Navy and for Sir Francis Drake. In reality it was a climate-induced disaster for Spain and King Philip II, who rightfully complained that he had sent his ships to fight the English, not the elements. The immediate political effect was the survival of the Kingdom of England and the gradual transfer of world sea dominance to the British Navy: Rule Britannia, Britannia rules the waves.
    “From a meteorological point of view the strong westerly and north-westerly winds suggest a major storm centre travelling across England, in response to a relatively southerly position of the Polar Jet Stream. Presumably the meteorological situation was much alike that bringing about the wet, windy and cold summer of 2007 in NW Europe.”
    http://www.climate4you.com/ClimateAndHistory%201500-1599.htm

  117. Symon: You wrote, “I’d prefer to ask questions of the science in the open and to receive open replies. Any potential issue here should already have been answered in the scholarly literature.”
    Have you searched the scholarly literature written about this and the RSS MSU TLT anomaly dataset to see if your questions have been addressed?

  118. pft (06:08:04) said:

    The CO2 hypothesis suggests that the oceans are warmed by the troposphere which traps heat near the surface. So it should be noted that any explanation showing that warming of the troposphere in January is caused by ocean heat does not support this hypothesis, and can be written off as natural variability due to an El Nino effect.

    The Sun inputs some 1600 W/m2 of energy into the oceans (although averaging reduces this). How much can the atmosphere input?
    Secondly, what is the heat capacity of the oceans vs the heat capacity of the atmosphere? Which has the greater thermal inertia?
    Frankly, the atmosphere has little ability to impact the temperature of the oceans, and as someone else said, perhaps on another thread, any temperature excursion of the lower atmosphere is going to be dragged back to the temperature of the sea surface in short order with little impact on ocean temperatures.
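    [A rough sketch of the heat-capacity comparison, using standard textbook values rather than anything from the satellite data under discussion:]

        # Column heat capacity of the whole atmosphere vs. one metre of ocean.
        g, P0 = 9.81, 1.013e5           # m/s^2, surface pressure in Pa
        cp_air, cp_sw = 1004.0, 3990.0  # J/(kg K): dry air, seawater (approximate)
        rho_sw = 1025.0                 # kg/m^3

        atmos_column = (P0 / g) * cp_air      # J/(m^2 K), entire air column
        ocean_per_metre = rho_sw * cp_sw      # J/(m^2 K), per metre of ocean depth

        print(atmos_column / ocean_per_metre) # ~2.5: a few metres of ocean match the
                                              # heat capacity of the whole atmosphere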

  119. R. Gates (12:03:15) :
    The whole month of January is hardly a single data point. Now, if it had been a single day that was warm, that would be different. January had 31 data points as days. Last I checked the year was made up of 12 months, and now we are more than 1/12 of the way through the year and temperatures are still high throughout the troposphere from sea level up to 46,000 ft. Now that’s a pretty big “single” data point.
    I must have missed something. I don’t seem to recall R. Gates posting back in June that there were 30 days making a “big” data point. Of course, the fact that the anomaly was .003 may have been the reason. Now, if we extrapolate from that point we’d already be starting the next ice age.
    Clearly, R. Gates does not want to discuss science. You’d think the other warmers would jump on him, since he’s making warmers in general look bad.

  120. Picking up on mobihci’s (04:58:38) point, we do seem to be seeing major discrepancies from satellites, even in simple photos. As I have posted before, satellites do not appear to be showing non-Arctic ice in the NH.
    I also agree that “Sorry, folks, we don’t make the climate…we just report it.” was not a completely truthful remark; they are not even reporting weather, just an interpretation of electrical signals.
    I would prefer to rely on mercury or alcohol.
    Do we still have Weather Balloon readings being taken?

  121. The sudden rise of the UAH LT anomaly for the month of January 2010 was unexpected to me. Let us concentrate on the Northern Hemisphere, where we ascertained an anomaly of +0.841 deg C!
    We all had the impression that last January was much colder than normal. Is it possible that the satellite didn’t notice the cold layers of air above land? I don’t believe it (unless the reflection by the snow plays a role). So, this means that the rise of temperature of the sea surface must be very high. How is this possible on a hemisphere with 60 percent sea and 40 percent land? If the temperature anomaly on land is negative, then the anomaly above sea has to be very high. Example: (-0.24 * 0.4) + (+1.56 * 0.6) = 0.84 deg C.
    Is it possible that the temperature anomaly above sea in the Northern Hemisphere was +1.56 deg C in January?
    I think scientists have hardly any notion of the partition of temperature in and above the ocean. Do they know exactly what the satellite is measuring and at what height? I agree with the statement that there is a relation between SST and tropospheric global temperature, but it seems to me that the temperature above land hardly plays a significant role.
    I also have questions about the high fluctuations of the temperature anomaly ascertained by UAH (or RSS). Certainly, El Nino is a factor in the discussion, but not exclusively. In the years 2002 to 2007, we ascertained high anomalies with a moderate El Nino. Even in the month of December 2009, the value of the ONI (ERSST v2) was 1.8, not the 2.5 of December 1997.
    We can hope that this time again, the sudden rise will be followed by a proportional decline. Only, it is possible that the expected decline will not happen in the first months. The average anomaly (UAH) for the month of February from 1998 to 2009 is +0.352 deg C (for the month of January, the average from 1998 until now is +0.336 deg C). Wait and see… but an anomaly of this size, as in January 2010, has to be kept in mind.
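    [The partition above can also be turned around: given the NH anomaly and a guessed land anomaly, the implied ocean anomaly follows from the same 60/40 weighting. The numbers are the commenter’s illustrative ones, not measurements:]

        # Solve the weighted-average example above for the implied ocean anomaly.
        def implied_ocean_anomaly(hemisphere, land, ocean_fraction=0.6):
            land_fraction = 1.0 - ocean_fraction
            return (hemisphere - land * land_fraction) / ocean_fraction

        print(implied_ocean_anomaly(0.841, -0.24))   # ~1.56 deg C, as in the example above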

  122. R. Gates (12:03:15) :
    Don’t worry about ice loss at the poles. The sun never gets past the Bragg angle of 57° and so heating, whether of water or ice, is minimal.

  123. This could also simply mean that winds over the ocean were lower than the mean for January. SSTs as I understand it are strongly affected by evaporative cooling due to average winds.
    In theory the prevailing winds are driven by temperature differentials between the equatorial ocean and the poles. If that differential temperature narrows, prevailing winds would decline in strength and duration on average. That reduced windiness would reduce evaporative cooling. Reduced evaporation would lower average humidity and cloudiness.
    It would be interesting to see a tabulation of average cloudiness and winds over the oceans during this same period to see if that might explain warmer than average oceans while the northern hemisphere land masses were cooler than average.
    Larry

  124. Bob Tisdale (18:35:41) : “Tom in Florida (12:41:14) : Will someone please make the definite statement that when SST goes higher it is either:
    1. due to the release of heat from the oceans and they are cooling
    2. due to the accumulation of heat in the ocean and they are warming
    Which is it?
    ************
    It depends on the time frame.”
    So are you saying that simply charting a monthly SST anomaly doesn’t tell you anything specific about the direction of global temperature trends?

  125. R. Gates (12:03:15) :

    Gary said:
    “Out of curiosity, what “disasters” were caused by the 2007 decrease in arctic ice extent?”
    Disastrous used in a metaphoric sense, meaning when compared to previous sea ice extents. That’s how I meant it, but from another standpoint I suppose the argument could be made that the record low sea ice was bad news: Record low ice means a positive feedback is being established whereby increased solar energy can be absorbed by the ocean causing more warming causing more melting and more warming and more melting, etc. Eventually, if this continues, even in our lifetimes perhaps, the arctic will be ice free in the summer. Great for shipping goods via the northwest or northeast passage perhaps, but potentially very bad for the ecological balance of the planet we’ve enjoyed since our ancestors first came down out of the trees.

    R.Gates,
    I don’t see how anyone can state with any degree of confidence that a diminished, or even missing, arctic ice cap would be good or bad or anything for the ecological balance of the planet.
    For example, what was the status of the northern ice cap when the Vikings were raising wheat and cattle in Greenland? I don’t know the ice extent, of course no one does, at that time. But I think it would be reasonable to conclude the arctic ice extent was much smaller in 1000 AD than today.
    Was there suffering on the part of the human family during the medieval warm period? Of course not. It was a time of great prosperity, with food surpluses and spare time for society as a whole. As people were of a religious bent in those days, we saw the building of the Gothic style cathedrals.
    It was only when things got cold everyone began to suffer. Apparently the last of the Vikings died in Greenland around 1500 AD.
    No doubt when it began to get colder, the Arctic ice cap expanded.

  126. tallbloke (13:02:48) :

    Tom in Florida (12:41:14) :
    Will someone please make the definite statement that when SST goes higher it is either:
    1. due to the release of heat from the oceans and they are cooling
    or
    2. due to the accumulation of heat in the ocean and they are warming
    Which is it?


    The first one. I’ve been saying it for a year now. The ocean heat content is dropping, because the amount of solar energy going into it has been falling since 2003, and cloud has been increasing.
    The ocean responds by releasing the heat built up over the long run of high solar cycles, keeping the global temperature up.

    Why, when it’s cooling, would it release more heat? Surely some sort of equilibrium would be established between the ocean and the atmosphere. If the oceans were cooling relative to the atmosphere, surely there would be less heat transferred upwards.
    I genuinely don’t understand your point.

  127. There was a huge, very intense warm pool in the Southern Ocean, halfway between New Zealand and Tierra del Fuego, that persisted throughout most of January. Methinks that’s what spiked the monthly anomaly. It has little to do with the faux El Nino (aka Modoki) and even less to do with all the other causes speculated here. Underwater volcanism, anyone?

  128. Tom in snow free Florida: You replied, “So are you saying that simply charting a monthly SST anomaly doesn’t tell you anything specific about the direction of global temperature trends?”
    No. If the SST anomaly trend is positive, sea surface temperatures are rising over the period used to determine the trend, and vice versa.
    And in my earlier answer, I probably provided you with an overly complicated answer when one wasn’t required.
    Regards

  129. lgl (11:36:42) : You asked, “Anyway, the problem is it doesn’t drop back enough before the next step. Because of an underlying trend?”
    No. The warm water that had been below the surface of the Pacific Warm Pool before the El Nino is still on the surface after the El Nino and this is, of course, during the La Nina. In other words, during the La Nina events that followed the 1986/87/88 and 1997/98 El Ninos, the warm water may not be in the NINO3.4 or Cold Tongue Index regions any longer, but it is still on the surface in the western Pacific and eastern Indian Oceans, still releasing heat, still affecting atmospheric circulation.

  130. Bob Tisdale (07:38:43) : “Have you searched the scholarly literature written about this and the RSS MSU TLT anomaly dataset to see if your questions have been addressed?”
    I am a practising engineer and left academia some years ago. Although I don’t have ready access to the literature, I can search on the likes of google scholar. My questions have not been answered from that.
    I don’t want to rely on google scholar (don’t know how good it is), so I try to get input from others on sites like this. If anybody can, please give a reference to the points I raised above and I’d be very interested to look into it.
    Right now, the issue of design of sampling scheme and avoidance of aliasing does not appear to have been addressed. There is a good deal of analysis of statistical sampling error, but this is a question of discrete systems theory.
    Tom P (05:38:41) : “Any potential issue here should already have been answered in the scholarly literature. ..Indeed it has.”
    Not by that paper Tom.
    The paper recognises deficiencies in the spatial and temporal temperature data. But the analysis goes absolutely nowhere near the sampling theorem. Doesn’t even mention it.
    There is a long discussion of gridding the globe, and then constructing grid averages or inferring grid data using a 1200 km “horizon”. The analysis is statistical, and consideration is given to statistical sampling error.
    The finest time resolution is monthly average. If the starting point is that the data is very likely to be aliased, this paper would be just messing around with ruined data.
    Sorry to be so dismissive – come back if there is anything you can pull from the paper which you think is a demonstration that the sampling theorem is satisfied. Even better if you can show that it has been more-than-satisfied by a factor of 10.
    Tom: “The theorem states that if signals are sampled at twice the maximum frequency present in a signal, that signal can be perfectly determined.”
    That’s fine if you have an absolutely band-limited signal (amplitude of zero above a certain frequency), and infinite time to gather your samples. But we have neither of these in practical systems and certainly not when it comes to climate.
    So the question is how much we need to exceed the minimum theoretical frequency (time) and wavelength (space) in order to get answers in a period which meets the requirements of the AGW question. Given that we have 150 years of inadequate data, there is every possibility that we haven’t even started to gather data at the necessary resolution. (Now there’s a thought.)
    Tom: “It does not say anything about the average of that signal”
    Agreed! But the point being made was that the central limit theorem cannot be relied upon to dig us out of the hole that aliased data leaves us in. If data is aliased and therefore meaningless, an average value adds no more meaning.

  131. And yet Europe, Russia, the Far East AND North America are experiencing
    one of their most severe winters ever!
    Kinda makes you wonder “what” any historical records might be measuring???
    El Nino is fading fast.
    Can one imagine the decline in “Global” TEMPS that we will be looking at
    the rest of this year?

  132. Jordan: You wrote, “If anybody can, please give a reference to the points I raised above and I’d be very interested to look into it.”
    I have not seen the topic raised before, here or at any other climate-related blog.
    With respect to my earlier comment about emailing Dr Spencer, the climate scientists I have contacted via email (did not include any of those who made headlines recently) have always been very forthcoming in their replies to my questions; most often they provide more insight than was required. I understand your wish to maintain a dialogue about your concerns out in the open, but, unfortunately, I don’t believe you’re going to find your answers on this thread. As you can see, it’s only one day old, and the number of comments has dropped drastically. Your discussion is also off topic. But you might get lucky and someone might come along who has researched it.
    Or consider writing a guest post.
    Just had a thought. Have you tried Lubos Motl? Here’s a link to his website.
    http://motls.blogspot.com/
    Regards

  133. sky: The hotspot in the central mid latitudes of the South Pacific has been there since at least November:
    http://i47.tinypic.com/2mq9idk.png
    While the SST anomalies in that area are unusually high during this El Nino, it is not unusual for the warm spot to occur during an El Nino. Refer to:
    http://bobtisdale.blogspot.com/2010/01/south-pacific-hot-spot.html
    Also, the cooler waters surrounding it have been at least partially offsetting the hotspot. Here’s a graph of SST anomalies for the South Pacific thru December 2009:
    http://i49.tinypic.com/2edcdnt.png
    Here’s another post that deals with the hotspot indirectly. It shows up during El Nino events from 1951 to present:
    http://bobtisdale.blogspot.com/2010/01/south-pacific-sst-patterns.html
    Regards

  134. Kelvin waves demonstrate what happens with upwelling and downwelling. Downwelling is warm, upwelling is cold. The lack of strong near surface Easterly winds keeps the colder waters under the warmer skin of water (from shortwave solar radiation). When the Easterlies kick up, the warm skin is blown away, causing more mixing of colder waters below with the warm turbulent skin.

  135. Mooloo (16:30:33) :
    > It has plainly been a cold winter, on average, in the land
    > portions of the Northern Hemisphere. To deny that is to deny
    > what is plainly in front of your face. And it makes anyone reading
    > think “what other facts is this person prepared to wave away”.
    Not true in Canada, at least for January. It’s been well above the 1971..2000 normals. Go to the website http://www.climate.weatheroffice.gc.ca/prods_servs/cdn_climate_summary_e.html and select the month of January, 2010 and “Text” output. I downloaded the text output, ran it through “grep” (under linux) and picked off all sites with temperature deltas, and 0 or 1 or 2 missing days (i.e. fewer than 3) during the month. The filter consisted of the command…
    grep "^.\{57\} [012]....\."
    Of the 236 sites that matched, only 4 had negative deltas (colder than normal monthly temperatures for January).

  136. Bob Tisdale (18:28:53) :
    sky: The hotspot in the central mid latitudes of the South Pacific has been there since at least November
    Westward in the NZ offshore plateau there has also been a spectacular and persistent phytoplankton bloom in the same timeframe and latitudinal coordinates, e.g.
    Oct
    http://earthobservatory.nasa.gov/IOTD/view.php?id=40924
    Jan
    http://www.eosnap.com/?p=13327
    This dropped the local surface SST. This would have depleted the eastward regions of nutrients (one of the main arguments against Fe fertilization) and made the waters more optically transparent – the hyperoligotrophic waters (purple patches) that are observable on SeaWiFS.
    There is a good paper by Morel et al. on this, on the Easter Island anticyclonic gyre.
    Abstract
    Optical measurements within both the visible and near ultraviolet (UV) parts of the spectrum (305–750 nm) were recently made in hyperoligotrophic waters in the South Pacific gyre (near Easter Island). The diffuse attenuation coefficients for downward irradiance, Kd(λ), and the irradiance reflectances, R(λ), as derived from hyperspectral (downward and upward) irradiance measurements, exhibit very uncommon values that reflect the exceptional clarity of this huge water body. The Kd(λ) values observed in the UV domain are even below the absorption coefficients found in current literature for pure water. The R(λ) values (beneath the surface) exhibit a maximum as high as 13% around 390 nm. From these apparent optical properties, the absorption and backscattering coefficients can be inferred by inversion and compared to those of (optically) pure seawater. The total absorption coefficient (a_tot) exhibits a flat minimum (~0.007 m^-1) around 410–420 nm, about twice that of pure water. At 310 nm, a_tot may be as low as 0.045 m^-1, i.e., half the value generally accepted for pure water. The particulate absorption is low compared to those of yellow substance and water and represents only ~15% of a_tot in the 305–420-nm domain. The backscattering coefficient is totally dominated by that of water molecules in the UV domain. Because direct laboratory determinations of pure water absorption in the UV domain are still scarce and contradictory, we determine a tentative upper bound limit for this elusive coefficient as it results from in situ measurements.
    From the introduction.
    Apart from the inserted quotation mark, the title of this article is identical to that of an article published some 25 years ago by Smith and Baker (1981). This means that perhaps the purest natural waters are not yet discovered and that the subject is still topical. This article is motivated by recent (2004) observations in the exceptionally clear, blue-violet waters of the anticyclonic South Pacific gyre, West of Rapa Nui (Easter Island), and its aims are somewhat similar to those of the Smith and Baker study. Namely, the following questions are addressed: (1) What are the optical properties of these extremely clear natural waters and how can they be explained? (2) Is it possible from their apparent optical properties (AOP) to derive some upper bound limits for the absorption coefficient of pure water? Indeed, laboratory measurements of pure water absorption, particularly in the violet and ultraviolet (UV) part of the spectrum are scarce and contradictory. Somehow paradoxically, it seems that natural waters, in certain conditions (those of extreme oligotrophy), may be purer than those prepared in a laboratory, despite enormous cautions to avoid trace organic impurities.

  137. Re: Jordan (Feb 5 13:40),
    If data is aliased and therefore meaningless
    This is the first time I have met this concept; I had to look up “aliased”. Nyquist we had been introduced to by George more than a year ago, but alias is new.
    From what I can gather, it means that spurious signals appear due to not sampling at the frequency the theorem requires.
    I object to the term “meaningless”. Distorted, yes; complicated, yes; confusing, yes. Meaningless, no: as your video shows, if one suspects a Nyquist deviation, one could unscramble the spurious from the signal enough to make decisions.
    Now in the context of this post, why would one consider satellite data not to be able to fulfill the Nyquist theorem? The weather does not change every minute nor the terrain correlation every hundred meters or so.

  138. UncertaintyRunAmok (16:02:02) :
    so in ‘uncertainty’ does the tree make a sound? and is the cat in the box when the lid is on?

  139. Jordan (14:18:25) :
    You are confusing spatial and temporal frequencies. Evidently you don’t understand that the spatial correlations in the paper I referenced are relevant to your concerns about aliasing.
    You’re not alone, though. George E. Smith appears to be making a career out of misunderstanding the basics of sampling.

  140. Bob Tisdale (18:11:05)
    Generally fair comments and thanks for your suggestions – I’ll think about it.
    You may appreciate how it is necessary to wait for the right time to get attention to an issue like this. There was not much point in trying to spark up a discussion about aliased signals when the AMSU data trend had no particular upward direction for the last decade or so. But January changed that.
    I would not have thought this matter is OT on a thread which is concerned with reasons for the January 2010 spike.
    Anyway, you are now well aware of the sampling theorem and aliasing with particular regard to these putative climate signals. These points are well founded in the mathematical theory of sampling and discrete signal processing. If anybody wants to create a reconstruction of an analogue signal, this is the field they are working in.
    As a couple of posters have mentioned, climatology (as a field) may have made the mistake of treating this as solely a matter of statistics. Possibly completely missed the point.
    It would only take a reference to the scholarly literature to give the assurance that the matter has been conclusively analysed and resolved, and that the climate system raises no such issues. With that in mind, all data acquisition systems should be referring to such work in order to satisfy the community that their design avoids aliasing.
    I haven’t seen any of that, and neither have you Bob.
    It is certainly not even entertained in Jones’s 150-year global average temperature reconstruction. The paper suggested above by Tom P does a good job of showing how the spatial coverage has varied over the years, and how sparse the data acquisition system has been over its whole history. These works make no attempt to show how they avoid aliasing, and that puts them straight into my “aliased junk” folder.
    Right now, I suggest others do the same.
    Regards

  141. Tom P (01:57:32) : “You are confusing spatial and temporal frequencies. ”
    Nope! I have been at pains to keep them separate. And I have also expressed a view that the sparseness of the spatial sampling system is the one which gives me most concern.
    But Tom, let’s not have that debate. I asked for a reference to the handling of the sampling theorem in the literature. You gave me a paper which treats the issue as an exercise in statistics.
    Can you refer to a paper which demonstrates the temporal and spatial bandwidths and addresses the issue of how the sample data systems for the global temperature field will avoid aliasing in space or time?

  142. Jordan: You wrote, “I would not have thought this matter is OT on a thread which is concerned with reasons for the January 2010 spike.”
    It isn’t OT. Poor choice of words on my part.
    Does the topic deserve a thread of its own? Right now this discussion is lost among 150+ other comments. Leading me to the guest post suggestion again.

  143. anna v (22:30:14) : “I object to the term “meaningless”.”
    I was trying to be measured. In fact an aliased signal can be not only meaningless, it can be downright misleading.
    In a cyclic system, aliased data will add low frequency signals where none exist. If a researcher is looking for long term climate patterns which (due to their duration and intervening factors) are difficult to trace back to physical processes, how can they also demonstrate that detected cycles in sampled data are not just the artefacts of aliasing?
    I think the only sure way is to ensure the data acquisition system is designed to avoid aliasing. And to do that, you need to start off with an analysis of the physical system in order to determine sampling frequency (time) and/or sample “wavelength” (space).
    In a less-than-orderly system (such as the climate), aliasing can suddenly produce apparent step changes. Does that not have some appeal when we consider what we just observed in January 2010?
    Anna: “one could unscramble the spurious from the signal enough to make decisions.”
    I don’t think so Anna. The underlying issue of aliasing is a failure to collect sufficient data to be able to reconstruct the signal. Lost data is lost forever. There is too much loss of information between the samples and trying to fit lines (or using models to infer data) between the samples will not be reliable enough to recover the true shape.
    “Distorted, yes, complicated, yes, confusing, yes. Meaningless no”
    Sorry to disagree again.
    Anthony’s helicopter appears to fly with stationary main rotor blades. If somebody knows nothing about helicopter flight and saw that video, they might wonder what those stationary blades are supposed to be doing. Certainly don’t appear to have any part to play in the changes of direction.
    Similarly, if all we had was the video of that propeller, we might calculate the average distance of the detached aerofoils from the hub, we might also carry out calculations of the average speed of the aerofoil as it departs from the hub, and the rate of loss of mass as each aerofoil disappears. We might wonder what happens to the mass as the detached aerofoil disappears and the attached aerofoils gain mass. All completely wasted effort as the real issue is the inadequacy of the observations.
    Anna: “Now in the context of this post, why would one consider satellite data not to be able to fulfill the Nyquist theorem? The weather does not change every minute nor the terrain correlation every hundred meter or so”
    Yep – that’s the question. I suspect the temperature reconstructions are warped by aliased data. Perhaps in time, but I feel much more concerned about spatial sampling – especially the oldest, most sparse data.
    It would be nice to have that formal reported survey and analysis which addresses the issue and puts forward specifications for spatial and temporal sampling.

  144. AS WITH ANY MAJOR CLIMATE RECORD ACHIEVEMENT…THESE PRELIMINARY
    RECORDS WILL BE QUALITY CONTROLLED BY NOAA’S NATIONAL CLIMATIC DATA
    CENTER OVER THE NEXT SEVERAL WEEKS
    So I guess that we have just had the least amount of snow in 114 years?

  145. Re: Jordan (Feb 7 03:26),
    Well, I also would be interested to see whether the Aqua scanning fulfills the Nyquist criteria, so it might be good to see a thread on this.
    This January peak is in line with the peak of 1998; it is an El Nino year. I think the dissonance comes from the ad hoc assignment to air temperatures of the role of controlling the climate, instead of their being one of the measures that show the energy flows of the earth system.
    I think that your video shows that there exists information, not that there is no information. It is the interpretation of cause that is in doubt.
    Seems to me that deterministic chaos is the answer to this Nyquist conundrum.

  146. Jordan (02:28:40) :
    As you fail to understand the relationship between sampling and correlation in the paper I referenced, there really is very little more I can offer.

  147. Anna
    “This January peak is in line with the peak of 1998; it is an El Nino year.”
    Or perhaps all we have is a distortion and exaggeration of some events like ENSO. Just depends how the thermal field varies, and how it interacts with the measurements we happen to have made. If we change the sampling system, do we get a different impression of a particular event?
    “Seems to me that deterministic chaos is the answer to this Nyquist conundrum.”
    Don’t see why. This should only be a matter of design of the sampling system to make sure our observations and reconstructions are not an artefact of the samples we happen to have gathered.
    It is a basic requirement of practical applications, such as digital control. We cannot control something unless we can observe it.
    If temperature trends were nothing more than sampling artefacts, it kinda puts that 2 deg C target into perspective, doesn’t it.

  148. Tom P (04:50:33) : “As you fail to understand the relationship between sampling and correlation in the paper I referenced, there really is very little more I can offer.”
    If you’d like to press on with this Tom, then I would ask you to look again at the first video I posted above and the first example of the aliased signal where we observe a step-wise sine wave at a fraction of the frequency of the input signal.
    Now, here’s the catch: all you have is the aliased series in the lower half of the oscilloscope display. You have nothing to tell you that the input signal is really 10 hertz.
    From the aliased signal, please explain how an assessment of the correlation between neighbouring samples (or any other form of correlation analysis) will inform us:
    firstly, that the sampled data is an aliased mis-representation of the input signal, and
    secondly, that a sampling rate of 100 hertz is required to provide an accurate reconstruction of the input signal within a reasonable time scale.
    (If you are still concerned about spatial versus temporal sampling, just substitute spatial dimensions for temporal dimensions in the video.)
    Think about it Tom P. This could be another example of climatology trying to develop novel techniques in respect of other disciplines (in this case, discrete systems theory). What do you think are the chances that other disciplines do not use those techniques for perfectly good reasons?
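    [A numerical version of the oscilloscope example, for anyone who wants to reproduce it; the 11 samples-per-second rate is a choice made for illustration, not taken from the video:]

        import numpy as np

        # A 10 Hz sine sampled at only 11 samples per second (below the 20 samples/s
        # Nyquist rate) is indistinguishable, at the sample points, from a slow 1 Hz sine.
        f_signal = 10.0                      # Hz, the true input
        fs = 11.0                            # samples per second
        t = np.arange(0.0, 5.0, 1.0 / fs)    # 5 seconds of sample instants

        samples = np.sin(2 * np.pi * f_signal * t)
        # Folding maps 10 Hz to |10 - 11| = 1 Hz (with a phase flip for a sine):
        alias = np.sin(2 * np.pi * (f_signal - fs) * t)

        print(np.max(np.abs(samples - alias)))   # ~0 (float rounding): the samples alone
                                                 # cannot tell the two signals apart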

  149. Re: Jordan (Feb 7 06:28),
    I had said:
    “Seems to me that deterministic chaos is the answer to this Nyquist conundrum.”
    you said:
    Don’t see why. This should only be a matter of design of the sampling system to make sure our observations and reconstructions are not an artefact of the samples we happen to have gathered.

    I found the following by googling “chaos theory and Nyquist”
    http://vestnik.mstu.edu.ru/v11_3_n32/articles/02_pryg.pdf
    Interestingly enough in section 3.1 he tackles the ice ages, and yes, there is chaos and Nyquist.
    “The results have great importance for a general chaos theory, since a transformation between the base
    theoretical types of the chaos structure as result of a real processes’ evolution has been achieved for the first time.”
    The weather system is chaotic and the climate too, after all.

  150. Jordan (11:32:55) :
    The spacings of the stations analysed in the Hansen paper are down to just a few kilometres. If there were serious problems with aliasing, there would not be a high level of temporal correlation between anomalies measured at neighbouring stations. The fall off in that correlation can be used to determine a reasonable sampling distance.
    Climate scientists are very well aware of issues in sampling temperature data and use standard approaches to ensure that their sampling is adequate. What you consider to be a potential flaw in their way of determining temperature changes was first analysed over twenty years ago. Aliasing is not an unresolved issue.

  151. Anna
    Thanks for the reference to the paper. I’m sorry but I do not understand it as I have no particular background in chaos theory. As far as I can tell, it is not addressing the issue of whether the sampled data has been sampled correctly. The analysis appears to rest on an assumption that there is no possibility of aliasing in the sampled data series (i.e. that the data acquisition system complies with the sampling theorem). Indeed, if the data had been corrupted by sampling, I don’t think the authors’ analysis would have any meaning.
    Tom P.
    No demonstration of how a statistical analysis can provide a test of aliasing in the above aliased sinusoid? No statistical analysis of an aliased series which points us to the required sampling rate then?
    That comes as no surprise. No new mathematics here folks, move along.

  152. Re: Jordan (Feb 7 14:32),
    Of course it would comply with the Nyquist theorem if it wants to prove the connection.
    If you want to show that statistical mechanics morphs at another level into thermodynamics, which it does, you use the correct statistical mechanics to show that thermodynamic quantities are arrived at in the limits. From then on you can use the thermodynamic quantities in the knowledge that the information gained is based on solid microdynamics. You do not need to follow the microscopic processes to measure temperatures.
    In a similar way using deterministic chaos dynamics you do not need to follow the underlying level that produces the chaotic dynamics. IMO of course.
    Now if the question is whether the data gathered obeys the Nyquist condition, we come to the point that it has to obey it if we want to reproduce the original correctly in detail. The question then becomes: what original? If we are talking of molecular statistical ensembles then it would be impossible since the frequencies would be tiny, molecular distances and times. If we are talking of a complete temperature map the distances and times become of the order of meters and hours (which is what you are inquiring after). If we look at it as a dynamical chaotic system the distances become much larger and so do the times.
    That is why I have been saying that dynamical chaos tools have to be developed for climate study ( as in the paper by Tsonis et al).

  153. Jordan (14:32:48) :
    “No demonstration of how a statistical analysis can provide a test of aliasing in the above aliased sinusoid? No statistical analysis of an aliased series which points us to the required sampling rate then?”
    Your sinusoid was purely in the time domain – it has limited relevance to the spatial sampling of temporal signals. The Hansen analysis is to look at temporal correlations in the time domain as a function of sample spacing. If there was a problem of aliasing you would see the correlations falling off and then increasing as the spacing was increased. In fact the correlations in temperature anomalies fall off gently, and the decrease is comparable to the scatter in the data out to about 1000 km. This confirms there is not an aliasing problem.
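    [A sketch of the correlation-versus-separation diagnostic being discussed, on a purely synthetic field (an AR(1)-in-space toy with an assumed 1000 km correlation length), intended only to show the calculation rather than to mimic real station data:]

        import numpy as np

        # Build station anomaly time series with a known spatial correlation length,
        # then measure how inter-station correlation falls off with separation.
        rng = np.random.default_rng(2)

        dx, n_x, n_t = 50.0, 60, 600          # station spacing (km), stations, months
        L = 1000.0                            # km, assumed spatial correlation length
        r = np.exp(-dx / L)                   # correlation between neighbouring stations

        field = np.empty((n_t, n_x))
        field[:, 0] = rng.normal(size=n_t)
        for j in range(1, n_x):
            field[:, j] = r * field[:, j - 1] + np.sqrt(1 - r**2) * rng.normal(size=n_t)

        corr = np.corrcoef(field.T)           # 60 x 60 station-pair correlation matrix
        for sep_km in (100, 500, 1000, 2000):
            k = int(sep_km / dx)
            print(sep_km, round(float(np.mean(np.diag(corr, k=k))), 2))
        # A smooth decay (roughly exp(-separation / 1000 km)) with no rebound at larger
        # separations is what an adequately sampled field looks like in this test.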

  154. 2009 12 +0.288 +0.329 +0.246 +0.510
    2010 01 +0.724 +0.841 +0.607 +0.757
    So in ONE month the anomaly jumped by 0.436 over the December 2009 value of +0.288. What is the significance of these temperature measurements from a climatic viewpoint? Zip.

  155. I think this data should be looked at very closely. How has it been corrected or manipulated? The lesson of the previous few months is, do not take ANYTHING at face value.
    My gut feeling is that the anomalously high snow coverage of the northern land areas is biasing the satellite data somehow.

  156. kzb (10:20:25) :
    I think this data should be looked at very closely. How has it been corrected or manipulated? The lesson of the previous few months is, do not take ANYTHING at face value.
    My gut feeling is that the anomalously high snow coverage of the northern land areas is biasing the satellite data somehow.

    However the tropics and SH are both high too, which is unlikely to be due to snow cover.
    YR MON GLOBE NH SH TROPICS
    2009 01 +0.304 +0.443 +0.165 -0.036
    2009 02 +0.347 +0.678 +0.016 +0.051
    2009 03 +0.206 +0.310 +0.103 -0.149
    2009 04 +0.090 +0.124 +0.056 -0.014
    2009 05 +0.045 +0.046 +0.044 -0.166
    2009 06 +0.003 +0.031 -0.025 -0.003
    2009 07 +0.411 +0.212 +0.610 +0.427
    2009 08 +0.229 +0.282 +0.177 +0.456
    2009 09 +0.422 +0.549 +0.294 +0.511
    2009 10 +0.286 +0.274 +0.297 +0.326
    2009 11 +0.497 +0.422 +0.572 +0.495
    2009 12 +0.288 +0.329 +0.246 +0.510
    2010 01 +0.724 +0.841 +0.607 +0.757

  157. Tom P: “Your sinusoid was purely in the time domain – it has limited relevance to the spatial sampling of temporal signals.”
    OK, but I did suggest the sinusoid could be in space or in time (just a change of units on the oscilloscope). That said, perhaps you’re saying you believe there is a method based on correlation when there are multiple dimensions.
    I have my doubts, and here are some reasons…
    There is no way to separate the components of the samples into those labelled “genuine” and those labelled “folded” (aliased). As George Smith correctly says, the central limit theorem does nothing to help get us out of aliasing (statistics are also unable to separate genuine from aliased). I’d be extremely reluctant to rely on any statistical aggregate without assurance that aliasing is out of the question.
    Hansen and Lebedeff do not even address the issue of aliasing. It is not the subject of their paper – they don’t mention the sampling theorem or aliasing. The only person who links this paper to the sampling theorem appears to be you, Tom.
    If Hansen and Lebedeff had addressed the question of aliasing, they ought to have referred to the relevant mathematics and literature, and then explained why a different method was required. They make no attempt to do so.
    The method you appear to be suggesting is completely silent when it comes to specifying a required sampling rate (in time or space), or any criterion for determining that a required sampling rate has been achieved. It is odd that the method would not even come up with an answer.
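    To be concrete about what such a criterion looks like, here is a trivial sketch (the 1000 km wavelength is an arbitrary illustration, not a claim about the real temperature field): the sampling theorem fixes the maximum allowable station spacing once the shortest wavelength to be resolved is stated.

        def max_station_spacing_km(shortest_wavelength_km):
            # Nyquist criterion: at least two samples per shortest wavelength to be resolved
            return shortest_wavelength_km / 2.0

        print(max_station_spacing_km(1000.0))   # 500.0 km: the widest spacing that resolves 1000 km features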
    The Hansen and Lebedeff paper is concerned with gridding the surface, carrying out grid averages and, where there is no data, in-filling using data from distant sources. The correlation calculation is a method seeking to justify the use of data from “neighbouring” sites. There is no theoretical basis to suggest whether a given correlation is acceptable or not:
    “The 1200-km limit is the distance at which the average correlation coefficient of temperature variations falls to 0.5 at middle and high latitudes and 0.33 at low latitudes. Note that the global coverage defined in this way does not reach 50% until about 1900; the northern hemisphere obtains 50% coverage in about 1880 and the southern hemisphere in about 1940. Although the number of stations doubled in about 1950, this increased the area coverage by only about 10%, because the principal deficiency is ocean areas which remain uncovered even with the greater number of stations. For the same reason, the decrease in the number of stations in the early 1960s (due to the shift from Smithsonian to Weather Bureau records) does not decrease the area coverage very much. If the 1200-km limit described above, which is somewhat arbitrary, is reduced to 800 km, the global area coverage by the stations in recent decades is reduced from about 80% to about 65%.”
    I could have some sympathy with this analysis – the authors are trying to make the best of a bad lot (past temperature measurements that just happen to have been gathered). But that doesn’t make it correct.
    There is a straightforward answer to all of this, Tom: a detailed survey of the global temperature field and a design specification for sampled data acquisition systems.
    We expect no less from the digital control systems in a chocolate factory – this gives you the assurance of consistent quality when you buy a bar of chocolate. Why not the same for climate data?
    If the design tells us that past data is of inferior quality, at least we will know whether or not to “buy” those temperature reconstructions.

  158. Jordan (12:09:41) :
    Hansen’s paper is specifically justifying the grid spacing required to accurately determine the Earth’s temperature anomaly variation. It might not frame the problem in a way you like, but that does not make its analysis any less valid.
    But I’m relieved you think there’s no Nyquist problem for chocolate bars – I certainly don’t support any undersampling in that case.

  159. Bob Tisdale (18:28:53):
    It’s precisely the great intensity of the now-fading South Pacific warm pool (~4 K at its peak development in January) that makes me suspect a genesis quite apart from ENSO. It’s certainly far greater than anything seen in the equatorial zone during the same time period. Do you have your own speculations why this is so?
    anna v (22:40:13):
    You’re quite correct! Aliasing does not make the data series “meaningless,” any more than the individual snapshots in the videos here are devoid of information. It’s just that the information can be highly misleading when strung together in a series. Aliasing can change the entire spectral structure of the underlying analog signal, but only in an analytically prescribed way. With adequately long series, the average value and the total variance of a random or chaotic signal are not affected in the general case, as shown by the Parseval theorem. Only in highly contrived examples (sampling a pure sinusoid at exactly its own frequency, instead of at least twice that) does it produce quite meaningless results.

  160. sky: “Aliasing can change the entire spectral structure of the underlying analog signal”
    True. And it can introduce a systematic error which means we do not have the comfort that the “error” signal meets the requirements of the central limit theorem.
    An example: when a signal is aliased, any frequency component at the sampling rate (or at multiples of the sampling rate) will appear as a standing offset. We always “hit” those components at the same point in their cycle, although we do not know which point. Aliasing therefore introduces a bias into the calculation of the mean value, and we can only calculate the maximum size of the bias (if we know the signal’s spectral density).
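    A short numerical sketch of this effect (the sampling rate and phase are arbitrary choices for illustration): a component at exactly the sampling rate folds to zero frequency, so it shows up as a constant offset whose size depends on the unknown phase.

        import numpy as np

        fs = 1.0                                                # sampling rate (arbitrary units)
        n = np.arange(1000)                                     # sample indices
        phase = 0.7                                             # the unknown point in the cycle we always "hit"

        component = np.sin(2 * np.pi * fs * (n / fs) + phase)   # component at exactly the sampling rate
        print(component.mean())                                 # ~ sin(0.7) = 0.64: a standing offset, not zero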
    Setting that aside for a moment (but not forgetting it): where frequency components are quite close to the sampling rate (or its multiples), we get those “modulated waveforms” seen in the above video. These are systematic errors which will eventually tend to zero, but a much longer series is required for them to do so.
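    A small sketch of that slow “beating” (illustrative numbers only: a component 0.01 Hz above a 10 Hz sampling rate): the samples trace out an apparent 0.01 Hz oscillation, so its contribution to the mean only averages out over records much longer than one apparent cycle.

        import numpy as np

        fs = 10.0                                   # sampling rate, Hz
        t = np.arange(20000) / fs                   # 2000 s of samples
        x = np.sin(2 * np.pi * (fs + 0.01) * t)     # component just above the sampling rate

        # The samples follow an apparent 0.01 Hz oscillation (period 100 s), so short
        # averages are systematically offset while long ones tend to zero.
        print(x[:250].mean())    # average over 25 s: well away from zero
        print(x.mean())          # average over 2000 s (20 full apparent cycles): ~0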
    Does a chaotic system mitigate this in any way? I don’t know – we’d need to have the spectrum of the chaotic system to assess that question.
    I appreciate your point about taking a series of snapshots. However, if those were analogous to snapshots of the global temperature field, my concern is that the snapshots are aliased and there are large gaps in the image being used to construct the putative climate signals.
    Tom P: Basically, I think the H&L paper is trying to make too much of inadequate data. You’ve got to have doubts when they consider use of “neighbouring” data on the grounds of a correlation coefficient of only 0.33 (although 0.95 would not address my concerns about aliasing), and when they are forced either to infer data from up to 1200 km away or to accept only 65% coverage of the surface. A paper to pop in the wastebasket, methinks.

  161. Jordan (00:07:52) :
    You don’t understand a paper, so you throw it in the bin. I suppose it’s one way to avoid having to consider your own faulty analysis.

    Phil: the single biggest temperature anomaly on your list is +0.841 degrees for the NORTHERN HEMISPHERE (2010-01). This is so much in conflict with the weather we have experienced all over Europe, China and the US this month that we have got to wonder about it.

  163. Tom P: ” It might not frame the problem in a way you like, but that does not make its analysis any less valid.”
    There is an established methodology for dealing with the issues of data sampling and discrete signal processing. It actually does give us theoretically grounded criteria for determining when data is corrupted by inadequate sampling.
    The H&L paper fails to discuss why it doesn’t use those techniques. It doesn’t even acknowledge that they exist or that aliasing could be an issue. And you want to talk about faulty analysis?
    So the paper plunges into a new technique without referring its techniques back to the relevant theory and literature. Perhaps you could provide the missing references to the mathematical theory which supports your assertions with respect to correlation of the sampled data as a mechanism to alert us to aliasing, and to produce criteria for sampling rate. So far you have failed to do so.
    If not, then we could all agree that this would be another example of climatology venturing into the realms of inventing new techniques without interacting with the relevant specialist communities (ref Wegman).
    Besides, Tom, I did not want to get into this debate. All I have said is that there appears to be no evidence of the survey and sampling design study which would have determined the necessary sampling strategy for the global temperature field. Such a study would have become the “handbook” for all attempts to produce these temperature reconstructions.
    Sorry, Tom, but you are simply defending an unhelpful distraction in a paper which pretty well confirms the issue I raised. Unless you can deliver a convincing reference to the missing mathematics, I see no further reason to entertain you.

  164. Jordan (5:07:32):
    It’s only with strictly periodic signals (i.e., line spectra) that aliasing affects the mean value, because the phase of such signals, no matter how complex, is permanently fixed, thus leading to a systematic bias. With chaotic or stochastic signals, such as those encountered in nature, the phase takes on random values in the long run, which results in asymptotically unbiased estimates of the mean.
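    A hedged numerical sketch of that distinction (record length and ensemble size are arbitrary): with the phase fixed, the folded-to-DC component biases the sample mean of every record; with the phase random, the bias averages to zero across realizations, although any individual record can still be well off.

        import numpy as np

        rng = np.random.default_rng(0)
        n = np.arange(500)                            # one record of 500 samples

        offsets = []
        for _ in range(10000):                        # ensemble of records with random phase
            phase = rng.uniform(0.0, 2.0 * np.pi)
            record = np.sin(2.0 * np.pi * n + phase)  # component at exactly the sampling rate
            offsets.append(record.mean())             # each record's mean is biased by sin(phase)

        print(np.mean(offsets))                       # ~0: unbiased on average over random phase
        print(np.std(offsets))                        # ~0.71: any single record can still be far off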
    There’s no way that the very high January SST anomaly is due to aliasing. The satellite anomalies are much too coherent with the surface record. Although we can all speculate on the spectacular warm pool in the Southern Ocean, what produced it is not clearly known. The fact that it remained quite stationary for more than a month, rather than drifting with the strong winds in the “roaring forties,” argues for a stationary source of heat, unrelated to ENSO-induced current changes.

  165. sky: These are good points.
    I’m not very familiar with arguments about chaos. I’m more familiar with signal components we might refer to as “noise”, or perhaps as “error” in statistics.
    There may well be an issue with the stationarity and ergodicity of the climate. That might cause problems when we assess the frequency spectra with a finite data sample. But surely the same problem would apply to statistical analysis over finite intervals?
    It is appealing to say that climate processes are not stationary ergodic processes, and I can see where you are coming from. But surely that means that statistical analysis of the data we have collected over finite sample intervals would have no particular significance?
    Returning to the topic of noise – this can be characterised by a frequency spectrum. Red noise is concentrated at the lower frequencies, but every frequency has a finite component, so we could sample red noise and get an aliased signal. For example, zero-mean noise could be sampled to produce a bias (offset) due to the coincidence between certain frequency components and the sample rate.
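    A sketch of the red-noise point under stated assumptions (an AR(1) series with lag-one coefficient 0.9 standing in for “red noise”, and a decimation factor of 10, both arbitrary): naive subsampling keeps all of the variance, including the high-frequency part, which is folded into the frequencies the decimated record can represent; averaging each block before decimation acts as a crude anti-alias filter and removes it.

        import numpy as np

        rng = np.random.default_rng(1)
        n, phi, k = 100000, 0.9, 10                  # series length, AR(1) coefficient, decimation factor

        x = np.zeros(n)
        for i in range(1, n):                        # AR(1) "red" noise
            x[i] = phi * x[i - 1] + rng.standard_normal()

        decimated = x[::k]                                        # naive subsampling: aliases
        block_avg = x[: n - n % k].reshape(-1, k).mean(axis=1)    # crude anti-alias filter, then downsample

        print(decimated.var())   # ~ the full variance; the high-frequency power is folded into low frequencies
        print(block_avg.var())   # noticeably smaller: the high-frequency power was removed before downsampling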
    I understand a stochastic process as being one with a random component, but also with a deterministic element. If we don’t have the latter, then I would struggle to see any scope for making predictions. And if we don’t have the latter, would there not be an issue in trying to give meaning to past observations?
    If we were to conclude that climate signals are basically unobservable, there could well be a lack of logic in seeking to explain changes like January 2010.
    “The satellite anomalies are much too coherent with the surface record.”
    I’m not too sure that this can be used to argue that aliasing does not exist in any one, or all, of the temperature series. Bear in mind that aliasing can produce plausible-looking signals, and statistical analysis (which is invalidated by aliasing) could well latch onto those appealing similarities.

  166. Is there any chance that the +0.841 degree anomaly for the northern hemisphere in January 2010 is actually a transcription error? I wonder if it should be MINUS 0.841 degrees.
    Frankly, I just can’t believe this is the largest positive anomaly on the list.
