UAH global temperature anomaly – hitting the slopes

Mathematician Luboš Motl takes on the new UAH data (source here) and some current thinking about slopes in global climate by adding his own perspective and analysis. Be sure to visit his blog and leave some comments for him – Anthony

UAH: June 2009: anomaly near zero

Global mean temperature according to UAH MSU for the first 8.5 years, i.e. 102 months, of this century. Linear regression gives a hefty cooling trend of -1.45 °C per century over this interval. So if someone tells you that the trend is “of course” positive as long as we omit the year 1998, you may be very certain that he or she is not telling you the truth.

UAH MSU has officially released their June 2009 data. This time, they’re faster than RSS MSU. The anomaly was +0.001 °C, meaning that the global temperature was essentially equal to the average June temperature since 1979. June 2009 actually belonged to the cooler half of the Junes since 1979.

Global warming is supposed to exist and to be bad. Sometimes, we hear that global warming causes cooling. In this case, global warming causes global averageness. In all three cases, it is bad news. The three main enemies of environmentalism are warm weather, cool weather, and average weather.

It is not a coincidence that these enemies are very similar to the four main enemies of communism. The four main enemies that were spoiling the success of communism were Spring, Summer, Fall, and Winter. 🙂 See Anthony Watts’ blog for additional discussion.

Bonus: trends over different intervals

You may have been intrigued by my comment that the cooling trend during the last 8.5 years is -1.45 °C. What is the result if you choose the last “N” months and perform the linear regression?

You may see that cooling trends dominate for most intervals shorter than 110 months; the trend over the last 50 months is around -6 °C per century. Only when the period gets longer than 150 months, i.e. 12.5 years (but stays shorter than 31 years), does the trend become uniformly positive, around 1.2 °C per century for intervals whose length is close to 30 years.
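For readers who want to reproduce this kind of windowed regression themselves, here is a minimal Python sketch of the "trend over the last N months" calculation. The series below is a synthetic stand-in with an invented trend and noise level, not the actual UAH data:

```python
import numpy as np

def trend_per_century(anoms, n_months):
    """Least-squares slope over the last n_months of a monthly
    anomaly series, expressed in degrees C per century."""
    y = np.asarray(anoms[-n_months:], dtype=float)
    t = np.arange(n_months) / 12.0          # elapsed time in years
    slope_per_year = np.polyfit(t, y, 1)[0]
    return slope_per_year * 100.0

# Synthetic stand-in series: a mild 1.2 C/century trend plus noise.
rng = np.random.default_rng(0)
t_years = np.arange(360) / 12.0             # 30 years of months
anoms = 0.012 * t_years + rng.normal(0.0, 0.2, 360)

for n in (50, 102, 150, 360):
    print(n, round(trend_per_century(anoms, n), 2))
```

With the real UAH anomaly file substituted for `anoms`, the short windows wander far from the long-run slope, which is exactly the behaviour the graphs above display.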

Note that those 12.5 years – where you still get a vanishing trend – run from January 1997 to June 2009. If you consider the UAH mid-troposphere data instead (relevant for the part of the atmosphere where the greenhouse warming should be most pronounced, according to both proper atmospheric science and the IPCC report, page 675), all the trends are shifted downwards:

You need to consider time periods longer than 180 months, i.e. 15 years (reaching back at least to the summer of 1994) – but shorter than 31 years – to see a uniformly positive warming trend. And the trend that you can calculate from those 30+ years is just 0.4 °C per century, and chances are that this 30+-year trend will actually drop below zero again in a few years. At any rate, the blue graph makes it clear that, in the right context, the longer-term warming trend converges to zero to a very good accuracy.

According to the IPCC, the surface warming trend should be around 3 °C per century which should translate to a 4-5 °C warming per century in the mid troposphere where the greenhouse effect has the strongest muscles. You see that according to the last 30 years of the data, the IPCC overestimates the warming trend by one order of magnitude!

Because the mid troposphere is the dominant locus of the greenhouse “fingerprint”, this is the most appropriate method to check the validity of the IPCC predictions. Their order-of-magnitude error is equivalent to the mistake of a biologist who confuses squirrels and elephants.

To be more specific about a detail, half of the Earth’s surface is between 30°S and 30°N – because, as Sheldon Cooper said in TBBT, sine of 30 degrees is exactly 1/2. But the mid-troposphere warming (8 km above the surface) is faster than the surface at least between 40°S and 40°N, i.e. on the majority of the surface, so it is likely that even when you take the global averages of both quantities, the mid-troposphere should see a faster warming than the surface.

Someone may argue that those 30 years represent too short an interval and that the trend will be higher in 100 years. But such reasoning is wishful thinking. Moreover, periods longer than 30 years don’t really belong to the present generation. In 30 years, most of the population of the Earth won’t remember the year 2009 – and they shouldn’t be affected by stupid fads of those mostly dumb people from 2009.

129 Comments
July 7, 2009 9:30 am

The scientific data and anecdotal stories (“cooler and wetter than previous years”, etc.) reinforce each other. Same pattern in general between the temperate and tropical countries.

noaaprogrammer
July 7, 2009 9:50 am

Is there a formal meteorological/geological definition for the number of years encompassed by the term “climate”? Roughly how many years does it take to shift from one set of climate averages to another, and how is that related to the duration of the “stable” climate periods? Whenever the Earth does undergo a climate shift, what are some statistics that could be used during the shift to measure or project its magnitude? For example, the number of temperature, precipitation, hurricane, etc. records being broken when compared to the previous stable period. It would be nice to have some formal definitions in the area of climate change using statistics and the mathematical theory of chaos.

Ivan
July 7, 2009 9:52 am

Dear Anthony,
Lubos is not a mathematician but a physicist, working on string theory.

Stefan
July 7, 2009 10:05 am

noaaprogrammer (09:50:13) : Is there a formal meteorological/geological definition for the number of years encompassed by the term, “climate?”
I’ve often wondered this. Why is 30 years defined as climate? Why not 3 or 300 or 3000 or 30000? I keep wondering as a lot seems to ride on the number chosen. To be fair someone did once post a reply as to why, but I didn’t understand the answer.

philw1776
July 7, 2009 10:09 am

Add me to those experiencing an unusually cold and wet May-early July here in New England. What I gather from the informative post is that to project a warming trend, one must carefully select the particular time period over which to compute the moving average. I doubt that such a cherry-picking data-selection methodology exhibits statistical significance. Another great guest science post in a fine science blog.
Given that CO2 levels are higher this decade than the decades of the last half of the 20th Century, should we not be seeing readily measurably higher temperatures according to IPCC dogma?

Adam from Kansas
July 7, 2009 10:21 am

IceAgeNow has a story on predicted possible frosts for Canada’s Avalon Peninsula in Newfoundland.
http://www.weatheroffice.gc.ca/warnings/report_e.html?nl11
If we lived up there we’d have to take our ferns inside and cover the flowers during a time which is supposed to be the hottest part of the year. Speaking of flowers, I noticed a few foolish sunflowers here blooming more than a month early when it’s supposed to get to 100 degrees in a few days O.o.
I’m almost surprised temperatures aren’t shooting upward in response to SST’s by now; perhaps the PDO, AMO, Sarychev, and the sun are all acting as a drag.

July 7, 2009 10:22 am

The 0.4C rise is the same as that suggested by Dr Roy Spencer & “The Diatribe Guy” in their blogs.
Three identical results from 3 different methods, better than the IPCC models can do.

Rob
July 7, 2009 10:48 am

philw1776 (10:09:51) :
“Given that CO2 levels are higher this decade than the decades of the last half of the 20th Century, should we not be seeing readily measurably higher temperatures according to IPCC dogma?”
I’ll bet they are measurably higher. Average the global temps for the decades around 1995 and 2005 and see what you get. Then try it for a few more decades back.

dennis ward
July 7, 2009 11:00 am

I am surprised temperatures are not much cooler than they are, given that the sun has been remarkably quiet of late. Surely one would have thought that temperatures should be much lower than the 1979 average by now?

Paul revere
July 7, 2009 11:18 am

Who cares what your facts say, the Gorical has spoken and the debate is over. Who can deny the Gorical!!!

Steven Kopits
July 7, 2009 11:33 am

Not to belabor the point, but a squirrel to an elephant is about three orders of magnitude. A horse to an elephant would be about right.

Adam from Kansas
July 7, 2009 11:59 am

A meteorologist on Accuweather is saying the quiet sun could be contributing to the relative coolness of the Northeast, especially given that there have been no Tambora-sized eruptions recently.
http://www.accuweather.com/regional-news-story.asp?region=eastusnews
I wonder if it’s true for the Northwest as well; they’ve gotten some relatively cool days. Also, it seems summer has been AWOL for a number of days in the Canadian city of Edmonton, for which Intellicast is forecasting highs in the low 60s and below for several days.

Jos
July 7, 2009 12:13 pm

# Stefan.
Cited from H.H. Lamb, ‘Climate, history and the modern world’, 1995, page 11:
” … A step in the direction of standardization was taken at the 1935 conference of the International Meteorological Organization (forerunner of the present World Meteorological Organization) when use of the observations of the years 1901-1930 for all climatic purposes was recommended as the so-called ‘climatic normal period’. Choice of the word ‘normal’ turned out to be unfortunate, but it has persisted in climatological practice. It spread the impression that nature recognizes such a norm and that conditions should continually return to the regime of the chosen period. We now know that 1901-1930 was a highly abnormal period, though it was surpassed by the following thirty years, 1931-1960, which were in due course substituted as the ‘new normal period’. Globally these were probably the warmest, and in many regions the moistest, periods of such length for centuries past. …”
See more here.
http://books.google.nl/books?id=0Nucx3udvnoC&dq=climate+definition+thirty+years&source=gbs_navlinks_s
With a special thanks to William Kininmonth – former head of Australia’s National Climate Centre, who explained this to me earlier this year.

rbateman
July 7, 2009 12:14 pm

“Global warming is supposed to exist and to be bad. Sometimes, we hear that global warming causes cooling. In this case, global warming causes global averageness. In all three cases, it is bad news. The three main enemies of environmentalism are warm weather, cool weather, and average weather.”
Weather is bad for you. Weather is part of the environment. The environment is bad for you. Weather is part of the Climate. The Climate is bad for you.

steven mosher
July 7, 2009 12:15 pm

I could think of two reasons to say that “climate” is a 30 year period. One would be that with a sample of 30 years of data you could say the sample was “large” in statistical terms, but really this isn’t a physically motivated rationale. The other reason could be the existence of underlying physical cycles that had length of 15 years.

Jos
July 7, 2009 12:25 pm

# Stefan, Noaaprogrammer
There are several scientists active in climate research who argue that there is no unique period that can be used to define “climate”. The reason is that climate appears to be non-stationary, i.e. after subtracting a mean, the remaining variability still contains non-random variations (for example, long-term persistence).
It appears that such variations occur on all sorts of timescales, although the climate system does prefer certain periods, like ENSO (typically about 3-5 years), the PDO (30-60 years), Dansgaard-Oeschger variations (roughly 1470 years), or the ice ages.
Nevertheless, the question is valid, and you can defend that there is no unique climate-timescale. I guess it all depends on the timescales and response times of various sub-processes that play a role in climate.
Still, for practical purposes – like a weather forecast – it is nice to be able to say that ‘for the time of the year it is warmer or colder than usual’ as we are so used to for example the ever changing seasons.

James H
July 7, 2009 12:25 pm

Summer isn’t AWOL in the Phoenix, AZ area. We’ve been a bit below normal lately in the mid 100’s, but the forecast for Saturday is 116. That’s somewhere between the normal and the record for that day. This is the time of year where we expect more 115-118F days. Where are these frost warnings again? Is there gainful employment available? 🙂

rbateman
July 7, 2009 12:33 pm

dennis ward (11:00:12) :
Time spent in minimum conditions plus latency plus prevailing patterns (noise) determines who gets what, when, and how much. Ask not for whom the cold comes; it comes for you. Your only advantage in this is that, unlike previous civilizations, you have records of what has been. And that’s about all the warning you are ever likely to get, seeing that the present crystal ball of AGW is no better at predicting the future than those who read tea leaves or entrails, or those who forecast endless prosperity in the CDO swap market.
It all seems so simple until the bottom falls out.
Choose your poison. Long or short, boiling or freezing. If you guess at it, your chances are 50-50. If you happen to be observant and pick out the right indicators, you may do better. If you take somebody else’s word for it, you are still guessing.

Gary Crough
July 7, 2009 12:34 pm

noaaprogrammer (09:50:13) : Is there a formal meteorological/geological definition for the number of years encompassed by the term, “climate?
I think 30 years is the minimum used by researchers in the field. I also thought the years 1951-1980 were the baseline used in the RSS and UAH temperature graphs provided by this site? Can someone confirm or correct this assumption?
For example the 1st point plotted, Jan 1979 is ~ -.15 C (on the UAH chart). I thought that indicated it was .15 C below the 1951 – 1980 average global temperature? If not what is it indicating?

Jos
July 7, 2009 12:44 pm

#Gary Crough
According to the UAH website:
http://vortex.nsstc.uah.edu/data/msu/t2lt/readme.06Jul2009
“Note that the base period for the mean annual cycle for
t2lt is now 1979-1998, or 20 years instead of the previous
1982-1991 ten years.”
So it is the anomaly compared to the 20-year mean for 1979-1998.
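As a small illustration of what "anomaly compared to a base-period mean" works out to in practice, here is a hedged Python sketch on toy numbers (the seasonal cycle and noise level are invented, not UAH's; the first 20 years stand in for the 1979-1998 base period):

```python
import numpy as np

def anomalies(series, base_years=20):
    """For each calendar month, subtract that month's mean over the
    first base_years years (standing in for the 1979-1998 base)."""
    grid = np.asarray(series, dtype=float).reshape(-1, 12)  # years x months
    base = grid[:base_years].mean(axis=0)                   # 12 monthly means
    return (grid - base).ravel()

# Toy absolute temperatures with a seasonal cycle, Jan of year 1 onward.
rng = np.random.default_rng(1)
months = np.arange(31 * 12)
temps = 14.0 + 10.0 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.3, 31 * 12)

a = anomalies(temps)
print(round(a[: 20 * 12].mean(), 6))   # ~0 over the base period, by construction
```

Subtracting a per-calendar-month mean, rather than one overall mean, is what removes the seasonal cycle so that a June can be compared with "the average June".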

Bob Kutz
July 7, 2009 12:47 pm

I am pretty sure those graphs representing the trend over various periods look a lot like the graph of trends in a random data set:
It flails about wildly on a small sample set, switching signs less and less frequently as the sample size increases; then, somewhere between 50 and 200 events, it picks a sign and diminishes toward zero over about a thousand events, where x = n and y = RND(+1 or -1).
Try it out in Excel if you want; it’s a simple statistical experiment. I am reasonably confident there’s some interpretation to be made, depending on the number of y-intercepts and the sample size, but I haven’t looked much deeper than that.
Sorry if that is too pedestrian for the fully engaged scientific mind, but I found it useful.
Hmmm. . . . Statistics 201 to the rescue!
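Bob's "simple statistical experiment" is easy to run outside Excel too. Here is one possible Python version (the seed and sample sizes are arbitrary choices, not anything from the thread): fit a trend line to the first n points of pure ±1 noise and watch the fitted slope collapse toward zero as n grows.

```python
import numpy as np

rng = np.random.default_rng(42)
y = rng.choice([-1.0, 1.0], size=1000)   # y = RND(+1 or -1), x = n

# Fitted slope over the first n points, for growing n.
ns = range(10, 1001, 10)
slopes = [np.polyfit(np.arange(n), y[:n], 1)[0] for n in ns]

# Early slopes flail about; late slopes hug zero.
print(round(max(abs(s) for s in slopes[:5]), 3))
print(round(abs(slopes[-1]), 5))
```

Counting the sign changes in `slopes` reproduces the "switches sign less and less often" observation.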

oMan
July 7, 2009 12:50 pm

Maybe the 30 year period chosen for climate is based on the underlying political-cultural cycle. Which would be based on the average term of ambitious young charlatans who could rise by peddling nonsense because their elders and betters were retiring. Roughly 30 years?

Bob Kutz
July 7, 2009 1:00 pm

What’s truly remarkable about that is when you introduce a trend of 1 per 100 (i.e. 1 ‘degree’ per century, or .01 per ‘n’) there are two intercepts at less than 10 events, and after about 35 events there is no doubt that it’s not trending toward zero, or even remaining constantly in the vicinity. If there’s a trend, it shows up for real over any large data set. I imagine there’s a probability distribution somewhere here as well.
There’s something in this line of thinking that could prove disastrous for the AGW crowd if they want to claim 1 or 2 or even 5 degrees of warming per century. (Although the proprietor of this website and the regulars here have provided more than a few disasters for the warmists already.)
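The same experiment with a deliberate 0.01-per-step trend mixed into the ±1 noise shows the effect Bob describes: over a long enough run the fitted slope locks onto the injected trend. A possible sketch (seed and sizes again arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1000
t = np.arange(n)
y = 0.01 * t + rng.choice([-1.0, 1.0], size=n)   # trend + coin-flip noise

# The estimated slope homes in on 0.01 as the sample grows.
for m in (35, 100, 1000):
    print(m, round(np.polyfit(t[:m], y[:m], 1)[0], 4))
```

One caveat on the "about 35 events" figure: with ±1 noise the slope estimate at 35 points still has a standard error larger than the 0.01 trend itself, so how quickly the sign settles depends on the noise amplitude; with the full thousand points the estimate is reliably close to 0.01.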

TJA
July 7, 2009 1:12 pm

One day, when we fully understand what drives climate, we will have a good number to define a period associated with climate. Until then, we may as well be arguing about angels on pinheads. Eleven years is good for now, because it encompasses one complete sunspot cycle, on average. At least it has some basis in physical reality and balances the effects of a sunspot cycle. Maybe 33 yrs will turn out to be right.

JT
July 7, 2009 1:13 pm

“Is there a formal meteorological/geological definition for the number of years encompassed by the term, “climate?” Roughly how many years does it take to shift from one set of climate averages to another, and how is that related to the duration of the “stable” climate periods? ”
Benoit Mandelbrot, who is a mathematician, and who knows something about chaotic systems, considered that question, and there is a summary of his conclusions at Climate Audit. To over-simplify: it’s Weather, all the way down.
http://www.climateaudit.org/?p=396

TJA
July 7, 2009 1:14 pm

At least multiples of 11 have the advantage that they defeat cherry picking from within a solar cycle.

Juls
July 7, 2009 1:20 pm

With the 10 years moving average used in most publications, the trend is still positive, and we can unfortunately expect 10 years more before the AGW hypothesis is definitely discarded.

bluegrue
July 7, 2009 1:53 pm

Here’s a plot of linear 8.5-year “trends” of the UAH data, where the slope is plotted versus the time of the end of the 102-month period, so e.g. the data at 1995 represent the span from mid-1986 to the beginning of 1995.
http://i27.tinypic.com/20k1y87.png
In mid-1987 the “trend” was -2.2°C/century and in mid-1995 the “trend” was down to -1.8°C/century. In that respect the current dip in 8.5 year linear “trend” is not really exceptional. Also note, that the 8.5 year “trend” was above zero most of the time and above +2°C/century for extended periods of times, whereas it only dipped down to comparable negative slopes for short periods of time.
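bluegrue's rolling-window plot can be reproduced with a few lines. A sketch of the calculation follows, run here on a toy flat-plus-noise series rather than the real UAH file:

```python
import numpy as np

def rolling_trends(anoms, window=102):
    """Least-squares slope, in deg C per century, of every trailing
    `window`-month span, indexed by where the span ends."""
    y = np.asarray(anoms, dtype=float)
    t = np.arange(window) / 12.0            # window time axis, in years
    return np.array([np.polyfit(t, y[end - window:end], 1)[0] * 100.0
                     for end in range(window, len(y) + 1)])

# On trendless noise, the 8.5-year "trends" swing well away from zero
# in both directions, just as short windows invite.
rng = np.random.default_rng(3)
trends = rolling_trends(rng.normal(0.0, 0.2, 360))
print(round(trends.min(), 1), round(trends.max(), 1))
```

Feeding in the actual monthly anomalies instead of noise gives the slope-versus-end-date curve in bluegrue's plot.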

botosenior
July 7, 2009 2:02 pm

And another great prediction from the Met Office: precipitation during summer will be below average in eastern Europe!
If you take a look, in most of these regions more rain has already fallen by today than during an average meteorological summer (June-September).
So we have seen once again that these medium- (and long-) range forecasts from the Met Office have been very wrong.
There is nothing else to say!

July 7, 2009 2:12 pm

Lubos: “global warming causes global averageness” is a classic. My vote for the quote of the week.
Great post. Thanks.

July 7, 2009 2:16 pm

dennis ward (11:00:12) :
I am surprised temperatures are not much cooler than they are, given that the sun has been remarkably quiet of late. Surely one would have thought that temperatures should be much lower than the 1979 average by now?

There is still a lot of residual heat in the oceans left over from the run of big-amplitude solar cycles. It ain’t going to raise temperatures, but it does leave some lag in the system and will mean the fall of temperatures will be slow for a while. If solar cycle 24 doesn’t get its act together in the next 2 years, then you’ll see how much the sun affects climate more clearly.
If the realclimatescientists understood how much bigger the ocean is in terms of thermal capacity than the atmosphere is, you’d probably already be aware of this.

George E. Smith
July 7, 2009 2:27 pm

“”” Stefan (10:05:32) :
noaaprogrammer (09:50:13) : Is there a formal meteorological/geological definition for the number of years encompassed by the term, “climate?”
I’ve often wondered this. Why is 30 years defined as climate? Why not 3 or 300 or 3000 or 30000? I keep wondering as a lot seems to ride on the number chosen. To be fair someone did once post a reply as to why, but I didn’t understand the answer. “””
Well there are two aspects of “climate”. The most obvious to people is what causes it to be warm and humid, and rainy in the Amazon, but dry and hot in the interior of Australia; ie local climate; which is as much a function of geography as it is of atmospheric/oceanic Physics.
The other aspect is the global long term equivalent to weather. Evidently climate is formally defined as the long term average of weather; which brings in your long term query.
Actually climate is no such thing as the long term average of weather; it is much more accurate to say it is the long term integral of weather. Because any change in the status quo, by whatever means, must clearly start from the status quo. Weather changes do not operate on the long term average of anything. If you want to superimpose a hurricane onto an isothermal earth that has a temperature of +15 deg C all over, you can’t expect to end up where we end up after a real hurricane, because you didn’t start at the place where the hurricane started.
The earth on the other hand takes weather as it happens by the nanosecond, and religiously integrates it for all future time to get to someplace else. So what should we say about a science that can’t even properly define what it is about?
The other thing is that climatologists don’t seem to want to talk about real world variables like temperature. They have to make up fictitious ones like “anomaly”, which is something that isn’t what it was supposed to be; and what it was supposed to be – that miraculous zero anomaly that we are nearly celebrating today – depends on what period of time (when we couldn’t measure the real variables either) we want to take the average of, to set as a baseline for where things should be according to climate consensus.
If the data were any good, what difference does it make where you set the baseline? How about setting the baseline at zero deg C, or even zero Kelvins if you like; something that is recognizable as real science.
And then there are the “Forcings”, another mythical creation of consensus science; not to be confused with any real physical variables of mainstream science.
Add to that, a lavish dose of statistical mathematical prestidigitation, so you can create information out of nonsense.
But don’t ever expect to see one of these anomaly graphs plotted on the same scales as the actual real world physical temperatures that you might actually measure on any northern summer day. Try plotting that first UAH graph above on a scale from -90 deg C to +60 deg C, to get a real world view of how significant climate change really is.
George

David
July 7, 2009 2:30 pm

Unbelievable denialism here. Those of you who are actually still considering the issue, rather than wedded to one side of it than the other, please consider the following paragraph from the Hadley Climate Research Unit for a moment before resuming this counter-constructive online banter:
The time series shows the combined global land and marine surface temperature record from 1850 to 2008. The year 2008 was tenth warmest on record, exceeded by 1998, 2005, 2003, 2002, 2004, 2006, 2001, 2007 and 1997. This time series is being compiled jointly by the Climatic Research Unit and the UK Met. Office Hadley Centre. The record is being continually up-dated and improved (see Brohan et al., 2006). This paper includes a new and more thorough assessment of errors, recognizing that these differ on annual and decadal timescales. Increased concentrations of greenhouse gases in the atmosphere due to human activities are most likely the underlying cause of warming in the 20th century.
The 1990s were the warmest complete decade in the series. The warmest year of the entire series has been 1998, with a temperature of 0.546°C above the 1961-90 mean. Thirteen of the fourteen warmest years in the series have now occurred in the past fourteen years (1995-2008). The only year in the last fourteen not among the warmest fourteen is 1996 (replaced in the warm list by 1990). The period 2001-2008 (0.43°C above 1961-90 mean) is 0.19°C warmer than the 1991-2000 decade (0.24°C above 1961-90 mean).
Analyses of over 400 proxy climate series (from trees, corals, ice cores and historical records) show that the 1990s is the warmest decade of the millennium and the 20th century the warmest century. The warmest year of the millennium was likely 1998, and the coldest was probably (but with much greater uncertainty) 1601.

We should all remember having read that, when we consider how we will justify our inaction to our grandchildren.

Tom in Florida
July 7, 2009 2:40 pm

George E. Smith (14:27:40) :
“Try plotting that first UAH graph above on a scale from -90 deg C to +60 deg C, to get a real world view of how significant climate change really is.”
Exactly. Also try drawing a vertical line for 380 PPM on an 8×11 piece of paper, using the bottom edge of the paper as 0 and the top edge as 1,000,000.

a jones
July 7, 2009 2:47 pm

Well if I were you I would read that very carefully yourself and note both the non sequiturs and the inherent contradictions. Not to mention the mays, the might-bes, and the unsupported assertions.
Kindest Regards

Ron de Haan
July 7, 2009 2:52 pm

David (14:30:01) :
“Unbelievable denialism here”.
From your side and especially from the side of the Hadley Climate Research Unit you refer to.
Hadley is into the AGW/Climate Change Scam up to their neck.
Your problem is that you still believe this Old English Institution can be trusted.
Unfortunately you can’t.
I know this comes as a real shocker and I had to walk the same road you have entered today.
Look for earlier postings about Hadley at WUWT, icecap.us and other honest and objective blogs. Most of them can be found via WUWT.
It’s your children’s future they intend to steal by closing down our economies and taxing the hell out of the hard working people, all based on an absolute hoax.
We can adapt to the climate as we have done for hundreds of thousands of years.
Adapting to the biggest hoax in history will be a lot more difficult.
So please go the whole nine yards and learn what we have learned.

Jordan
July 7, 2009 3:02 pm

steven mosher: “One would be that with a sample of 30 years of data you could say the sample was “large” in statistical terms”
Only if 30 years’ AMSU data is a statistically representative sample. But it is not – it does not give us the randomness/diversity required to capture all the information required to summarise climate behaviour. If this data is used to produce measures of significance, such measures would need to be expanded to account for the shortcomings of the inferior sample.
Juls: “.. we can unfortunately expect 10 years more before the AGW hypothesis is definitely discarded.”
Right now, the AMSU LT series has a large spike in 1998. As this sits toward the right hand end of the series, it contributes to the apparent positive trend.
If the new data over coming decade or so does not challenge the 1998 peak, the spike will transition to the middle, and then onto the left-hand end of the growing series. Without large new positive anomalies, 1998 will act as a pivot, and the apparent trend would reduce to zero and eventually turn negative. If this happens, it will show that 30 years is arbitrary and will be a practical demonstration of statistical insignificance.
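Jordan's pivot argument can be demonstrated on a synthetic series: put a single spike near the right-hand end of a flat record, then append more flat data and watch the OLS slope change sign as the spike's position crosses the series midpoint. A minimal sketch (the numbers are invented, not UAH data):

```python
import numpy as np

def ols_slope(y):
    """Ordinary least-squares slope of y against its index."""
    y = np.asarray(y, dtype=float)
    return np.polyfit(np.arange(len(y)), y, 1)[0]

flat = np.zeros(240)            # 20 flat "years" of monthly zeros
flat[220] = 2.0                 # one large spike near the right end (a toy 1998)

print(ols_slope(flat) > 0)      # spike right of centre pulls the trend up

grown = np.concatenate([flat, np.zeros(240)])
print(ols_slope(grown) < 0)     # appending flat data moves the spike left
                                # of centre, and the fitted trend flips sign
```

The sign flip follows directly from OLS leverage: a point's pull on the slope is proportional to its distance from the mean of the time index, so the same spike tilts the fit up while it sits to the right of centre and down once new data push it to the left.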

Boudu
July 7, 2009 3:03 pm

“Sometimes, we hear that global warming causes cooling. In this case, global warming causes global averageness. In all three cases, it is bad news.”
This has to be Quote of the week !

Bob Kutz
July 7, 2009 3:07 pm

David (14:30:01) :
Here’s the underpinning of the information you just regurgitated;
http://www.ncdc.noaa.gov/img/climate/research/ushcn/ts.ushcn_anom25_diffs_urb-raw_pg.gif
Can you see what that is? Do you understand what it means? The raw data shows no (or only very slight) warming. This is all a facade.
So go ahead, give us that old argumentum ad misericordiam sham argument; do it for the children! In lieu of any real evidence to the contrary: we don’t know, so just to be safe, we’d better stop using fuel! Talk about a way to screw up our kids’ lives!
The powers that be have chosen to ignore the facts, have subverted the science, and would appreciate it greatly if we would now surrender our freedom.
You go first.

Russ R.
July 7, 2009 3:16 pm

David:
“counter-constructive online banter” ????????
I guess we need “banter police” to determine the relative merit of online banter, and to stop the evil counter-constructive variety.
I am sure we could trust the banter police to only give us the information we need, and not expose us to the ideas, that might make us mistrust the banter police. I bet Hadley wished they had set up the right kind of information regulation, so they wouldn’t look like they had an agenda, other than providing the public with non-biased data.

Gerry
July 7, 2009 3:17 pm

When Dr. Roy Spencer released the June UAH results, I was intrigued by his statement, “The decadal temperature trend for the period December 1978 through June 2009 remains at +0.13 deg. C per decade.” This was nicely explained by Motl’s regression analysis. The positive decadal trend results from a little bump on Motl’s regression curve (second figure) between 113 months (9.4 years) and 123 months (10.2 years).
Also, as Motl observed, “You see that according to the last 30 years of the data, the IPCC overestimates the warming trend by one order of magnitude!”
I would add to Motl’s comments that there is a clear warming trend over the last 310 years. Why? Because the year 1699 was near the end of the Maunder Minimum, in the frozen depths of the Little Ice Age!

July 7, 2009 3:21 pm

David
Would you like to confirm your understanding of the number of accurate weather stations that were used at the start of the series in 1850, and also how the marine surface record back to then was compiled? Also you might like to find out how often the reporting stations have changed in number and location.
Once you know that background we can all have a sensible discussion on the data you have referenced.
Tonyb

Ron de Haan
July 7, 2009 3:22 pm

dennis ward (11:00:12) :
I am surprised temperatures are not much cooler than they are, given that the sun has been remarkably quiet of late. Surely one would have thought that temperatures should be much lower than the 1979 average by now?
Be patient Dennis,
You are not the only warmist who is “surprised”!
We live on a resilient planet that takes its time to warm up and cool down.
It’s because of all that water, you know.
I personally am not disappointed at all.
We are cooling fast and nobody can stop it.
Looking forward to the coming winter which will start about 2 months earlier.
From Seablogger comment by Steve Sadlov:
“Sunday, a dry cold front passed through. I thought, maybe, just maybe … this is our harbinger of climatic autumn, in this region. The earliest ever. Of course, there is no way to know this now. Only retrospectively, perhaps 2 months from now, will we know for sure. Interestingly, and, hauntingly, a northerly flow has settled in. The marine layer is once again mixed out. Cold air from aloft has worked its way to the surface. We’ve been flirting with the upper forties the past two mornings. I will neither rule in nor out the possibility that climatic autumn has arrived here. If true, it is indeed a year without a summer”.
WATER VAPOR IMAGERY IS INDC A DEEPENING TROUGH OFF THE PACIFIC NW COAST. THIS TROUGH WILL MOVE TO THE COAST DURING THE WEEK…BRINGING NEAR TO SLIGHTLY BELOW NORMAL TEMPS. HAVE KEPT TEMPS UP A BIT IN THE NORTH BAY VALLEYS AS THE NORTHERLY FLOW SHOULD INCR…LEADING TO SOME DOWNSLOPE WARMING. THIS TROUGH WILL REMAIN ALONG THE COAST THROUGH THE WEEKEND. THE LARGE RIDGE IN THE SOUTHERN PLAINS WILL BUILD A LITTLE TO THE NORTHWEST…HOWEVER UNLIKE EARLIER MODEL RUNS IT WILL NOT PUSH INTO THE DISTRICT. THEREFORE…ANY WARMING OVER THE WEEKEND WILL BE MODEST. ANOTHER TROUGH DEEPENS OFF THE PACIFIC NW COAST EARLY NEXT WEEK FOR SOME ADDITIONAL SLIGHT COOLING.
=========================================
If our “typical summer pattern” is not established soon, it will never be established, and “the fall pattern” will be the innate default value.
http://www.intelliweather.net/imagery/intelliweather/templine_nat_640x480_img.htm
and
http://www.iceagenow.com/Record_low_temperatures_in_46_states_during_June.htm
No Dennis, I am not disappointed.

July 7, 2009 3:36 pm

There is much discussion here as to what constitutes a period long enough to be termed ‘climate,’ thirty years being considered the norm.
Some time ago I did the calculation for an average person's 70-year lifetime, which is probably a more meaningful figure. Apologies to those who have seen it before, but it is relevant in this thread.
“Being at a loose end, I set my dedicated team of climate researchers here in the UK on the task of graphing Hadley CET temperatures back to 1660, so we could demonstrate to the misinformed the realities of indisputable and catastrophic climate change, and get our large research budgets increased.
Unfortunately the 'adjustments and smoothing interpolator' was away on holiday and the 'trend line coordinator' was absent at a wedding, so I must apologise that the data shown below from 1660 is 'unadjusted' and looks nowhere near as pretty and nicely ordered as we have become used to.
http://cadenzapress.co.uk/download/beck_mencken_hadley.jpg
One of our staff is a former actuary and thought she would amuse herself by working systematically through the records back to 1660, to see for herself the alarming warming trend over the centuries. Obviously she had seen the Gore film and was wearing the T-shirt:
“Catastrophic Climate Change-stop it now! Ask me How!”
Living near the coast she thought about the cycle of the tides, and whilst realising that the climate cycle was different (inasmuch as it is however long we want it to be, and starts from whatever point necessary to maximise our funding) thought it would be fun to use this idea of a regular cycle.
Consequently she based her calculations on a three score year and ten life span as she worked out the average annual mean temperature enjoyed by ‘British Everyman’ through each year of each decade. This assumed he was born at the start of a decade and died the last year of the decade seventy years later. Of course we urged her to call this mythical person ‘everywoman’ but as a woman was likely to live longer, as an actuary she thought this would only complicate matters, so 70 years it is. These are her calculations;
Someone born in Britain in 1660 and living to 70- Average annual temp 8.87c
Someone born in 1670 and living to 70 Average annual temp 8.98
1680 9.01
1690 9.05
1700 9.19
1710 9.21
1720 9.17
1730 9.14
1740 9.04
1750 9.03
1760 9.08
1770 9.10
1780 9.07
1790 9.12
1800 9.15
1810 9.13
1820 9.14
1830 9.12
1840 9.10
1850 9.14 (Start of the famously reliable Hadley global temperatures)
1860 9.17
1870 9.21
1880 9.30 Official end of the Little Ice Age
1890 9.39
1900 9.40
1910 9.46
1920 9.497
1930 9.60
1940 9.70 (projected to 2009)
1950 9.76 Extrapolating current trends (our favourite phrase)
1960 9.79 Using advanced modelling techniques to create a robust scenario.
The actuary has a poetic turn of mind and decided to call the people born in the period from 1660 to 1880 'LIA Everyman', inasmuch as they lived part or all of their lives during the Little Ice Age. She called those born from 1890 to the present day 'UHI Everyman'. She assures me that no adjustments have been made to correct UHI Everyman's unfair reputation for exaggerating his (or her) temperatures.
It was at this point that the accountants, who were in auditing our accounts to ensure we were spending our grants wisely, became really interested. They're at a bit of a loose end, as they are the group who audit the annual EU accounts (they've refused to endorse them for 12 years in a row now) and say it's so easy to spot the fraud that it's not a full-time job anymore!
Consequently they hope to get some work with the IPCC, as they see them as a rapidly growing enterprise as fond of throwing meaningless and unsubstantiated (some might unkindly say fraudulent) numbers around as the EU is.
After examination of the data the accountants reluctantly agreed that the temperatures were remarkably consistent, and the increase of a fraction of a degree in mean average temperatures during Everyman’s lifetime over a period of 350 years, was so well within natural variability it was difficult to make any useful analogy (other than it was the sort of increase in average warmth that would pass by completely unnoticed if we weren’t looking very very hard for it).
The fractional temperature difference was unlikely to have any effect on Everyman’s choice of clothes, or the day they might attempt to have their first swim of the year in the sea. (Wearing approved buoyancy aids of course)
The Accountants were particularly intrigued by the fact that the very slight rise in overall temperatures was almost entirely due to the absence of cold winters depressing overall temperatures, rather than hotter summers. At this point the actuary mentioned that warmer winters were good, as statistically, fewer people died.
Someone mused that the modern temperatures seemed rather too close for comfort to those experienced during the LIA, and another murmured as to what the temperature variance would show if we did this exercise for the MWP, or the Roman warm period. I quickly pointed out that it was just a Little Ice age and not the real thing, and that Dr Mann had told us all that the MWP was an outdated concept, and as I had never heard of the Romans they couldn’t exist, and neither could their allegedly warm period.
Another Accountant mentioned that if UHI was stripped out, the already tiny increase in temperature since the Little Ice Age would disappear. I reminded them who was paying their bills and to stop that sort of Contrarian talk immediately.
Of course I fired the actuary when she confessed that the almost indistinguishable blue line along the bottom of her original graph represented total man made co2 since 1750. Obviously she was some sort of closet right wing tool of Big Oil out to cause trouble.
I’m undecided whether to turn this report over to our adjustments and smoothing interpolator for remedial work or merely to lose it. Or burn it.
TonyB
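The 70-year "Everyman" average described above is straightforward to compute. A minimal sketch follows; the series below is fabricated for illustration, and the real calculation would of course run over the annual Hadley CET record:

```python
# Fabricated annual series standing in for Hadley CET, 1660 onward:
# a constant 9.0 C plus a tiny linear rise, purely for illustration.
annual = [9.0 + 0.001 * k for k in range(350)]

def lifetime_averages(series, start_year, span=70, step=10):
    """Mean annual temperature over `span`-year lifetimes, one per decade of birth."""
    out = {}
    for i in range(0, len(series) - span + 1, step):
        out[start_year + i] = sum(series[i:i + span]) / span
    return out

means = lifetime_averages(annual, 1660)
print(round(means[1660], 4), round(means[1940], 4))
```

Each "lifetime" overlaps the next by 60 years, which is why the resulting decade-by-decade figures change so gently compared with the annual record they are drawn from.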

July 7, 2009 3:37 pm

Anthony,
too many zeros in “The anomaly was +0.001…” ? The original article shows only +0.01…
Bob

bluegrue
July 7, 2009 3:54 pm

@ Bob Kutz (15:07:03) :

Can you see what that is? Do you understand what it means?

Yes, pretty clearly. About 0.4°F, or about 0.2°C, of the 0.7°C warming of the contiguous US48 since about 1970 is due to the necessary correction for time of observation bias (TOBS) and changes in station location (SHAPS). Your point being?

Adam from Kansas
July 7, 2009 4:30 pm

You must be pulling my leg Ron de Haan, I do not doubt what that honest person says about what cooling is in store weather-wise, but please tell me it won’t mean an early freeze here in Wichita despite some sunflowers at my transition school blooming, the reason is we have a tomato and cucumber plant growing in two upside-down planters and I hope my parents and others can enjoy a bumper crop of them before the first freeze kills them, we’re just about to get a big wave of little tomatoes and cucumbers in addition to the 5 already seen, so it’d be nice if the first freeze waited until…………oh should I say the average date somewhere in October so we have time to actually pick a bunch.

Michael Hauber
July 7, 2009 4:32 pm

‘Is there a formal meteorological/geological definition for the number of years encompassed by the term, “climate?” ‘
Or would a more relevant question be 'How many years would it take for CO2 warming to become greater than the normal amount of variation in the climate due to other factors?'
Hansen in 1981 said about 20.
http://pubs.giss.nasa.gov/abstracts/1981/Hansen_etal.html
So nearly 30 years ago Hansen predicted:
a) There will be a warming trend due to CO2 emerging sometime in the next couple of decades.
b) Natural variation could cancel out this CO2 warming for periods of up to 20 years.

Joel Shore
July 7, 2009 5:28 pm

Lubos is wrong when he says that what he calls the “mid-tropospheric data” (and is technically known as T2) should show a higher trend than the T2LT data that one usually talks about. The problem with T2 is that, while the weighting function is centered in the mid-troposphere, it has a considerable tail going into the stratosphere. Since the stratospheric cooling trend is considerably larger in magnitude than the tropospheric warming trend, this causes a large contamination of the trend in T2 from the stratospheric cooling.
Spencer and Christy created the synthetic T2LT channel to try to eliminate this contamination. There is still some argument about how successfully their T2LT channel does this and whether there is a better way to do this (see http://www.ncdc.noaa.gov/oa/climate/research/nature02524-UW-MSU.pdf and http://ams.allenpress.com/perlserv/?request=get-abstract&doi=10.1175%2FJTECH1840.1 ), but I don’t think anybody claims that T2 itself gives a good measure of tropospheric temperature trends because of the issue of considerable stratospheric contamination.

TJA
July 7, 2009 5:28 pm

“Unbelievable denialism here.” – David
David, have you ever heard of the "Drunkard's Walk"? If a system has memory (and I am assuming that you are not maintaining that the planet has no memory of its preceding temperature), then if it takes an excursion in one direction or another, the time periods around that excursion, before and after, will also be clustered around the outlying number.
In other words, even if every word of the Hadley Centers description of past temperature fluctuations is accurate, it proves exactly nothing. This is not denial, it is recognition of the inexorable logic of mathematics. If you can’t understand this argument, don’t be surprised if nobody takes you seriously.
“Lubos is not a mathematician, but physicist, working on string theory.” – Ivan
Ivan,
I am curious what your point is?
Speaking of "deniers", here is an interview with Dr. Richard Lindzen, MIT climatologist, on Howie Carr's radio show in Boston, where he says that he is no longer a skeptic but in fact is now a full-throated denier, considering that the evidence against AGW is now overwhelming.
http://audio.wrko.com/m/audio/24111309/richard-lindzen-global-warming-denier.htm?q=lindzen

George E. Smith
July 7, 2009 6:00 pm

“”” David (14:30:01) :
Unbelievable denialism here. Those of you who are actually still considering the issue, rather than wedded to one side of it than the other, please consider the following paragraph from the Hadley Climate Research Unit for a moment before resuming this counter-constructive online banter: “”””
Should take a dose of your own advice David.
For a start, there isn't any reliable global temperature data dating back before the deployment of the Argo oceanic buoys, which established in 2001 that oceanic water temperatures and air temperatures are not correlated (why on earth would anyone expect them to be?). That means the believable climate-data era begins about 1980, around the same time that polar-orbit satellites were launched.
Secondly, the Data reported by the Hadley Climate Research Unit; comprises only the data gathered by the Hadley Climate Research Unit; much as the GISStemp reports only the data gathered by the GISStemp network of sensors.
Neither one of those data gathering networks complies with the Nyquist Sampling Theorem on Sampled Data Systems; either in spatial sampling or in temporal sampling, and in both cases the violation is by factors large enough to corrupt even the zero frequency value of the recovered function, which is the average being sought. So garbage in; garbage out !
Thirdly, it is widely acknowledged that since the IGY in 1957/58, and more recently in the post-cooling frenzy of the mid 1970s, there was a very brief period of warming lasting till about the mid 1990s, save for that anomalous, short-lived 1998 El Nino; but ever since, and certainly since 2000, the trend has gone cooler, and decidedly so in the late 2000s. So the data shows we have definitely gone through a climate temperature peak, which is now behind us.
Mathematicians have a habit of placing all of the higher values of any function in the vicinity of a peak; and conversely, by convention, we place the lower values in a region around the minimum; it sort of works out more fairly that way.
So you are merely stating the obvious in pointing out that 2008 was one of the warmest recent years; somewhat akin to saying that the higher elevations on earth can generally be found up in the mountains; we like it that way.
But the causal CO2 of course has not ceased its inexorable climb; nor even paused.
But we have gone from cold scare to cold scare in a mere 30 years; passing through a hot scare on the way.
Don’t you think you are just over dramatising things a bit much ?
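George's Nyquist point, that undersampling can corrupt even the recovered mean (the zero-frequency value), can be illustrated with a toy signal. Everything below is an arbitrary assumption for illustration, not a model of any actual station network's spatial or temporal sampling:

```python
import numpy as np

def sampled_mean(freq, sample_rate, duration):
    """Mean of sin(2*pi*freq*t) sampled at `sample_rate` over `duration`."""
    t = np.arange(0.0, duration, 1.0 / sample_rate)
    return np.sin(2 * np.pi * freq * t).mean()

# Sampled well above the Nyquist rate, the cycle averages out to ~0.
print(round(sampled_mean(1.0, 24, 50), 4))
# Sampled just above the signal frequency, the cycle aliases into a slow
# drift, and the mean over a finite record is visibly biased away from 0.
print(round(sampled_mean(1.0, 1.05, 50), 4))
```

The bias in the second case is not noise that averages away with more of the same samples; it is a systematic artifact of the sampling rate itself, which is the substance of the garbage-in, garbage-out complaint.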

TJA
July 7, 2009 6:21 pm

By the way, in that radio interview from last week, Howie points out that Dr Lindzen is wearing a down vest, in July. I know, it is weather. The lake here is almost 10 degrees below normal, and the longest days are over, so I wonder when it is going to freeze this year? This past year the bays froze very early, around Thanksgiving, the same time the ground froze, and the cold was incredible this past winter. I will be curious what happens this year.

Stuart Nachman
July 7, 2009 6:31 pm

The 30 year period is consistent with the Akasofu Wave Theory with oscillations of approximately 30 years since about 1880.

July 7, 2009 6:36 pm

If, as some scientific theories assume, the world is billions of years old, a 30-year baseline is preposterous. Why is ‘normal’ not calculated on the basis of all known accurate records? The longest time span I’ve heard for such records is about 150 years. So if climate norms are based on the longest, best set of data, what does the trend show?

July 7, 2009 6:39 pm

Bob K. (15:37:19) : said:
“Anthony,
too many zeros in “The anomaly was +0.001…” ? The original article shows only +0.01…”
Bob, these two numbers are from different files on the UAH website. They are both global lower troposphere temperature anomalies for June 2009.
I asked:
“On the UAH website the file
http://www.nsstc.uah.edu/public/msu/t2lt/tltglhmam_5.2
shows the June global temperature anomaly at 0.001, but the file http://www.nsstc.uah.edu/public/msu/t2lt/uahncdc
shows the June global temperature anomaly at 0.01.
Could you please explain why these files show different values?”
John Christy, Director, Earth System Science Center, UAH responded:
“The uahncdc.XX files are produced from gridded maps which interpolate farther into the polar regions (to 86.25) where as tXXglhmam_5.X are produced from programs that deal only in zonal anomalies (higher signal to noise) and go only to 83.75. Also, the grid-based datasets have some screening for high elevation which are then interpolated, but the zonals do not have this done (i.e. high elevations included). Also too, uahncdc.XX files are rounded to 0.01 and the zonals are rounded to 0.001. The differences between the two are well below any significance values and can be used interchangeably.”

starzmom
July 7, 2009 6:45 pm

Just to note, the National Weather Service, for all their faults, uses a 30-year rolling average to calculate average temperatures at any given location. We are still in the 1971-2000 30-year period. In 2 years we will jump to 1981-2010. Average temps around the country change when the rolling average is changed. This makes it hard to compare temperature differentials, needless to say.

July 7, 2009 6:52 pm

a few great temperature links:
review of all 4 main temps
http://www.climate4you.com/GlobalTemperatures.htm#Recent%20global%20satellite%20temperature
the measurement of global temperatures – National Climate Data Center
http://www.appinsys.com/GlobalWarming/GW_Part2_GlobalTempMeasure.htm
WoodForTrees -flat from 1997. go play.
http://www.woodfortrees.org/plot/rss/from:1997/plot/rss/from:1997/trend
Historical Climatology Network – check out where you live, but note these use suspect surface data
http://cdiac.ornl.gov/epubs/ndp/ushcn/usa_monthly.html#map

Manfred
July 7, 2009 7:02 pm

(14:30:01) :
Though satellite data does not fully support your list from the Hadley Centre, which still refuses to make its algorithms public, a temperature increase has taken place since the Little Ice Age.
The millennium-record stuff, however, is a wild guess, unsupported by science and in pure contradiction to various historical reports, especially higher historic sea levels and smaller glaciers.
For an introduction to this topic, I would recommend you to read this http://www.climateaudit.org/?p=2322 and this http://www.climateaudit.org/?p=4866 .
After that, how can you justify to your grandchildren that you helped waste their resources and prosperity on the basis of bad science?

Law of Nature
July 7, 2009 7:39 pm

Re: Gerry (15:17:25) :
Dear Gerry, David and of course Anthony 🙂
While some posts in this thread seem to refute David's statements about recent temperature measurements quite well, it should also be mentioned that his statement about the recent proxies is not correct either!
Steve McIntyre (& Ross McK.) and the Jeffs did a lot of work showing that the math used in some of the famous proxy studies is flawed, while C. Loehle has a peer-reviewed publication showing, with up-to-date proxies, the possibility of a medieval period globally warmer than today.
All the best,
LoN

Ron de Haan
July 7, 2009 7:44 pm
Bruce Richardson
July 7, 2009 7:45 pm

I wanted to compare several linear trend lines to a centered moving average based on the UAH data thru June 2009. The chart is available as a pdf file using this SendYourFiles link:
http://syfsr.com/?e=2C0FBDC0-8A7F-4071-AC6A-486C2DA2E273
It has:
UAH data
37-month Triple Centered Moving Average (Hybrid)
Linear regression from May 1998 until December 2007
Linear regression from January 2002 until June 2009 (the chart has a typo)
Linear regression from June 1999 until June 2009
Linear regression from January 2000 until June 2009
For full disclosure, the centered average is a 37-month triple centered average i.e. a centered average of a centered average of a centered average. I refer to this as a “hybrid” for want of another term. Normally a centered average stops as soon as the leading or trailing data stops. This one uses the data points that are available and keeps going to both ends. The last point on the right is an 18-month trailing average because there is no leading data. The first point on the left is an 18-month leading average because there is no trailing data.
The nice thing about linear regressions is that we can get whatever we want. We can “prove” that it’s warming, cooling, or staying the same. A centered moving average isn’t as versatile. 🙂
Bruce Richardson
Houston, TX
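Bruce's "hybrid" smooth, a 37-month centered average applied three times, with windows that truncate at the ends rather than stopping, might be sketched as follows. This is a minimal illustration of the description above, not his actual spreadsheet:

```python
import numpy as np

def centered_avg(x, window=37):
    """Centered moving average whose window truncates at the ends instead of
    stopping, so the final point is an 18-month trailing average."""
    half = window // 2
    return np.array([x[max(0, i - half): i + half + 1].mean()
                     for i in range(len(x))])

def triple_centered(x, window=37):
    """The 'hybrid' smooth: a centered average of a centered average of a
    centered average."""
    return centered_avg(centered_avg(centered_avg(x, window), window), window)
```

Unlike a linear regression, this smooth cannot be steered by the choice of endpoints, though its truncated end windows do make the first and last points effectively one-sided averages, and therefore noisier.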

Bruce Richardson
July 7, 2009 7:58 pm

Regarding the statement from the Hadley Centre: according to their data, it has been cooling since around 2004. But even with that cooling, we are still at the warm end of a long-term warming trend. It isn't a surprise that it is warmer now than it was when it was cooler. The last time I checked their site, they did not mention the approximately 5-year cooling trend that their own data is showing. Do they really consider that to be irrelevant?
Bruce Richardson
Houston, TX

Ron de Haan
July 7, 2009 7:58 pm

Adam from Kansas (16:30:36) :
You must be pulling my leg Ron de Haan, I do not doubt what that honest person says about what cooling is in store weather-wise, but please tell me it won’t mean an early freeze here in Wichita despite some sunflowers at my transition school blooming, the reason is we have a tomato and cucumber plant growing in two upside-down planters and I hope my parents and others can enjoy a bumper crop of them before the first freeze kills them, we’re just about to get a big wave of little tomatoes and cucumbers in addition to the 5 already seen, so it’d be nice if the first freeze waited until…………oh should I say the average date somewhere in October so we have time to actually pick a bunch.
Adam,
I really hope you will have bumper crops and sufficient time to harvest them.
I am not pulling your leg, these are serious observations by serious people.
If you are interested to read all the postings: http://www.seablogger.com/?p=15659
I wish you and your family all the best.

Don Shaw
July 7, 2009 11:50 pm

David (14:30:01) :
“Unbelievable denialism here. Those of you who are actually still considering the issue, rather than wedded to one side of it than the other, please consider the following paragraph from the Hadley Climate Research Unit for a moment before resuming this counter-constructive online banter:
The time series shows the combined global land and marine surface temperature record from 1850 to 2008. The year 2008 was tenth warmest on record, …..”
David, I find it interesting that the time period selected is 1850 to 2008. Do you think that going back to 1850, when it was mighty cold, might distort the story? This selection tells me that they are intentionally distorting the picture. Surprisingly, you missed one of the messages of this blog, namely the arbitrary nature of how one picks a timeframe to sell a story. Why 1850-2008? For those who don't know better, it fools them into believing global warming is a crisis.
If you want to live in the climate of the 1850s, I suggest you move north, possibly to Iceland; I don't want colder temperatures.
Finally I find it insulting to suggest that deniers are selfish and not thinking of their grandchildren. Quite the opposite, I am concerned that the current cap and trade fad will ruin the economy for our grandchildren and cause a significant deterioration of their lifestyle.

Doug
July 8, 2009 12:36 am

For what it is worth, Los Angeles has had below average temperatures since May 22, 2009, a streak of 47 days and counting.
http://www.wrh.noaa.gov/lox/scripts/getprodplus.php?wfo=lox&prod=laxpnslox
It’s been quite cool for June in So Cal, nice for those of us without AC but I hear the farmers in Canada seem to be losing summer crops due to the cold.
http://www.cwb.ca/public/en/newsroom/releases/2009/061109.jsp
Doug

rogerkni
July 8, 2009 12:47 am

David:
Here’s a link to a 52-page paper, “The Recovery from the Little Ice Age”:
http://people.iarc.uaf.edu/~sakasofu/pdf/recovery_little_ice_age.pdf
Here is its Abstract:
“Two natural components of the presently progressing climate change are identified.
The first one is an almost linear global temperature increase of about 0.5°C/100 years (~1°F/100 years), which seems to have started at least one hundred years before 1946 when manmade CO2 in the atmosphere began to increase rapidly. This value of 0.5°C/100 years may be compared with what the International Panel on Climate Change (IPCC) scientists consider to be the manmade greenhouse effect of 0.6°C/100 years. This 100-year long linear warming trend is likely to be a natural change. One possible cause of this linear increase may be Earth’s continuing recovery from the Little Ice Age (1400-1800). This trend (0.5°C/100 years) should be subtracted from the temperature data during the last 100 years when estimating the manmade contribution to the present global warming trend. As a result, there is a possibility that only a small fraction of the present warming trend is attributable to the greenhouse effect resulting from human activities. Note that both glaciers in many places in the world and sea ice in the Arctic Ocean that had developed during the Little Ice Age began to recede after 1800 and are still receding; their recession is thus not a recent phenomenon.
The second one is the multi-decadal oscillation, which is superposed on the linear change. One of them is the “multi-decadal oscillation,” which is a natural change. This particular change has a positive rate of change of about 0.15°C/10 years from about 1975, and is thought to be a sure sign of the greenhouse effect by the IPCC. But, this positive trend stopped after 2000 and now has a negative slope. As a result, the global warming trend stopped in about 2000-2001.
Therefore, it appears that the two natural changes have a greater effect on temperature changes than the greenhouse effects of CO2. These facts are contrary to the IPCC Report (2007, p.10), which states that “most” of the present warming is due “very likely” to be the manmade greenhouse effect. They predict that the warming trend continues after 2000. Contrary to their prediction, the warming halted after 2000.
There is an urgent need to correctly identify natural changes and remove them from the present global warming/cooling trend, in order to accurately identify the contribution of the manmade greenhouse effect. Only then can the contribution of CO2 be studied quantitatively.”
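The decomposition the abstract describes, a linear trend with a multi-decadal oscillation superposed, can be sketched as an ordinary least-squares fit. Everything below (the synthetic series, the noise level, the assumed 60-year period) is an illustrative assumption, not Akasofu's actual procedure or data:

```python
import numpy as np

years = np.arange(1880, 2009)
t = (years - 1880).astype(float)
# Synthetic series: a 0.5 C/century linear trend plus a 60-year oscillation.
rng = np.random.default_rng(1)
obs = 0.005 * t + 0.1 * np.sin(2 * np.pi * t / 60.0) + rng.normal(0.0, 0.03, t.size)

# Design matrix: intercept, linear term, and a 60-year sine/cosine pair.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t / 60.0),
                     np.cos(2 * np.pi * t / 60.0)])
coef, *_ = np.linalg.lstsq(X, obs, rcond=None)

trend_c_per_century = coef[1] * 100
oscillation_amp = np.hypot(coef[2], coef[3])
print(round(trend_c_per_century, 2), round(oscillation_amp, 2))
```

Of course, being able to fit such a decomposition does not by itself establish that the oscillation is physical: the fit only shows that the two components can be separated when the oscillation's period is assumed in advance.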

Hank Hancock
July 8, 2009 1:14 am

Luboš Motl's math underscores a well-understood concept in wave theory. How and when you measure a wave is of critical importance. If you measure from any point in the cycle to the same point in the next cycle, the positive component of the cycle will equal the negative, resulting in a net value of zero.
If, however, you measure only a partial cycle, say 3/4 of the cycle, you can arrive at a net value that is greater or less than zero, depending on what point on the wave form you begin your measurement. If your measurement begins at 1/4 wavelength past the peak of the waveform – at the point of zero crossing on the negative slope – the overall value derived will be positive by a value = 25% of the amplitude.
When an AGW alarmist says 30 years of satellite measurements is sufficient to prove global warming, it is a weak argument. One must ask: how do you know you aren't just measuring the net positive component of a cycle that is longer than 30 years?
It seems to me that picking 30 years as a measuring stick based solely on the availability of satellite data is so arbitrary in the bigger picture of climate cycles that it yields worthless conclusions if you don't know at what point of a cycle the measurement begins and haven't nailed down a complete Fourier analysis of the phases, amplitudes, and the constructive and destructive relationships of the other cyclical components of climate. To my understanding, climatology isn't there yet.
The AGW argument strongly rests on a specific length of measurement with a specific beginning point. Start changing the measurement properties and the case for AGW turns cold. That is why the warmists cry “cherry picking” any time you slide the measuring stick – you’re moving it out of that sweet “hot spot.”
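Hank's partial-cycle point is easy to check numerically: the mean of a pure sine over a full cycle is zero, but over a fraction of a cycle it generally is not, and its sign depends on where you start. A minimal sketch (the unit amplitude and the chosen fractions are arbitrary):

```python
import numpy as np

def mean_over_fraction(frac, phase=0.0, n=10000):
    """Mean of a unit-amplitude sine sampled over `frac` of one cycle,
    starting at `phase` (radians)."""
    t = np.linspace(phase, phase + frac * 2 * np.pi, n)
    return np.sin(t).mean()

print(round(mean_over_fraction(1.0), 3))  # full cycle: ~0
print(round(mean_over_fraction(0.5), 3))  # rising half-cycle: ~0.637 (2/pi)
```

Whether a partial-cycle mean comes out positive or negative, and by how much, depends entirely on the starting phase, which is exactly the point about sliding the measuring stick.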

Chris Schoneveld
July 8, 2009 2:03 am

Lubos wrote: “Global mean temperature according to UAH MSU for the first 8.5 years i.e. 102 months of this century. Linear regression gives a cooling trend by a hefty -1.45 °C per century in this interval. So if someone tells you that the trend is “of course” positive as long as we omit the year 1998, you may be very certain that he or she is not telling you the truth.”
Come on Lubos, we are skeptics alright, but we shouldn’t resort to the deceptive tactics of the AGW crowd. To cherry-pick 8.5 years to show a negative trend is also spurious. Add another 1.5 years and the trend is positive again. http://www.woodfortrees.org/plot/uah/from:1999/plot/uah/from:1999/trend
Even a starting date on the other side of 1998, for instance 1996 so including 1998, you get a positive trend: http://www.woodfortrees.org/plot/uah/from:1996/plot/uah/from:1996/trend
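Chris's objection, that the fitted trend changes sign with the chosen start date, can be checked on any anomaly series. The sketch below uses a fabricated monthly series (the underlying trend and noise level are assumptions), not the actual UAH file:

```python
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(360)  # 30 years of monthly anomalies, fabricated
anoms = 0.0012 * months + rng.normal(0.0, 0.15, months.size)  # ~1.44 C/century plus noise

def trend_per_century(series, last_n):
    """OLS slope over the last `last_n` months, in degrees C per century."""
    y = series[-last_n:]
    x = np.arange(y.size)
    slope_per_month = np.polyfit(x, y, 1)[0]
    return slope_per_month * 12 * 100

# Short windows swing wildly; long windows settle near the built-in trend.
for n in (50, 102, 150, 360):
    print(n, round(trend_per_century(anoms, n), 2))
```

With realistic month-to-month noise, windows of 50-100 months can easily produce trends of either sign even though this series was built with a single positive slope, which is why both sides can "prove" their case by choosing the interval.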

Frank Lansner
July 8, 2009 2:19 am

Lubos, this work is really useful.
I can only recommend that everyone read and understand this.
The major warming should happen in the mid troposphere.
There has been no such warming for 15 years!!!

And the warming since 1979 is only 0.4 K/century.
Applause, and I understand why Anthony uses this!

bluegrue
July 8, 2009 4:22 am

rogerkni (00:47:56)
Here’s an overlay of Akasofu’s plot with the GISTEMP data:
http://i44.tinypic.com/33pf41z.png
As you are citing Akasofu, could you please explain to me how his smoothing algorithm (he is smoothing the gray 5-year mean line of GISTEMP, not the annual data) resulted in a major deviation from the data in the period 1905 to 1935 and shifts in the maximum of the peaks at 1900 and 1945? Or how he interpolated the beginning and the end, where the GISTEMP 5-year mean has no data at all? Can you name a reasonable smoothing algorithm that will reproduce Akasofu’s smooth? Free hint: it’s not a running mean.
Oh, and please note, how his plot relies on the above distortions to give the impression of oscillations superimposed on a linear trend.

Stefan
July 8, 2009 4:27 am

Hank Hancock (01:14:47) : The AGW argument strongly rests on a specific length of measurement with a specific beginning point. Start changing the measurement properties and the case for AGW turns cold. That is why the warmists cry “cherry picking” any time you slide the measuring stick – you’re moving it out of that sweet “hot spot.”
Can anyone contradict this point, or is it simply true? Going from one question of definitions or starting points, to another, the strong precautionary principle is often invoked to claim that it is a moral duty to “do no harm” to the environment, and specifically it puts the onus on the polluter to prove that the substance is actually safe. Funny then, that when people try to demonstrate that CO2 is fairly safe (even necessary for plants), they are accused of denialism.
Environmentalists who follow the strong PP create the role of denialist, because environmentalists don't hold themselves to a high standard of proof. Remember, they don't need to prove the threat is real, only suggest there is a threat, so someone else has the burden of proving the threat is unlikely.
The question is, where do you set the threshold between a threat we must play safe about, and a threat that is too weak to be actionable? It seems we are back to basic questions–does 30 years even make sense for climate?–do we have any idea how to distinguish a weak threat from a severe threat? (particularly with the environment).
For example, a killer plague is always a threat. Containment might be the only way to handle it. And yet we allow rapid global air travel. PP says we shut down business and holiday air travel. No?

Chris Wright
July 8, 2009 4:29 am

@ David (14:30:01) :
“Unbelievable denialism here.”
I think a better word would be ‘scepticism’. I’m a sceptic and I’m proud of it. It means that I don’t accept something to be true simply because it comes from authority, and that I try to ask awkward questions. If a theory is sound then it should be able to answer awkward questions, and therefore honest scientists should have no fear of scepticism. In fact scepticism is – or should be – the very foundation of science. Galileo was one of the great sceptics, and there are many, many more such as Einstein and Wegener. Oh, yes, and Buzz Aldrin!
.
If some people believe that most of the warming is caused by CO2 and that we’re doomed unless we slash our emissions, I can respect that belief – provided, of course, that their data and arguments are sound and honest. Unfortunately, much of the ‘science’ that underpins the strong AGW theory is so bad that it verges on willful deception.
.
You present a statement from Hadley as if it were some profound revelation that will have all the sceptics instantly recanting their beliefs. Maybe you’re a newcomer to WUWT (and Climate Audit) but if not you will be aware of the enormous amount of discussion that goes on about the claims emanating from Hadley and all the other pro-AGW organisations. This is precisely what the debate is about, so to give that quote as if it might be news to us is a little odd. It’s also relying on authority, which can be very dangerous.
.
The Hadley statement effectively refers to the Hockey Stick, which claims that the 20th century warming is ‘unprecedented’. You must be aware that McIntyre and McKitrick published a devastating criticism of the original Mann paper (MBH98). Among other things they were able to demonstrate that Mann’s algorithm would generate hockey sticks even if you fed in red noise. Others have reproduced this claim and, as far as I’m aware, no one has been able to disprove it. If the claim stands then MBH98 is dead in the water and, worse, is close to scientific fraud.
.
Apart from bad science, MBH98 (and all the other copies of the Hockey Stick) is contradicted by a huge amount of ice core and proxy data and, perhaps just as important, by history itself. MBH98 is a particularly bad example, but sadly much of climate science has been corrupted by money and politics, and the behaviour of some prominent climate scientists reflects this. For example, they often fail to make public their data and methods, as required by the rules of virtually all scientific organisations, and people have been forced to use freedom of information laws to try to obtain this information. This isn’t how science is supposed to operate in the 21st century. Maybe these scientists are simply lazy or incompetent. But at the very least it gives grounds for suspicion that they have something to hide.
.
“We should all remember having read that, when we consider how we will justify our inaction to our grandchildren.”
The Hadley statement is so bogus that I doubt if it’s of any significance to anyone. But you were making a more profound statement. If we really are doomed by our emissions then, yes, we should take action. But that action will be unimaginably expensive and will actually make our grandchildren significantly poorer, and will be the cause of increased poverty in other parts of the world. If dangerous climate change does eventually happen (it certainly hasn’t yet) then we will have robbed them of the resources that could have helped them protect themselves against the climate change.
.
But if climate change is primarily natural, as the evidence increasingly indicates, then the warming will come to an end and we will enter an era of global cooling. We will then see politicians around the world throwing away trillions of dollars trying to make the world cooler when the world is actually becoming – umm, cooler. I hope you agree that this possibly hypothetical situation would be about as barking mad as it’s possible to imagine.
.
Very often it’s not floods or droughts or starvation that kill enormous numbers of people. It’s poverty that kills. The best thing we can do for our grandchildren is to drive up prosperity while protecting the environment in a sensible way. Through foreign aid and investment, prosperity in developed countries will – or certainly should – translate to increased prosperity in less developed regions such as Africa. The tragedy is that this obsession with CO2 may lead to a distinct fall in global prosperity. Yes, climate change may already be killing hundreds of thousands. But it’s not the climate itself that is causing the damage. It’s the almost demented belief in man-made and destructive climate change that is causing the damage. This belief has caused the US to switch a third of its grain production to biofuels. It caused a big spike in food prices which probably led to deaths from starvation. I regard this as obscene, almost as if filling up a gas tank is more important than filling stomachs. But this is an almost inevitable result of the AGW delusion, and things will get much worse if it continues unchecked. Of course, AGW is built on delusions within delusions. For example, there is good reason to believe that over the next few decades, a switch to biofuels will actually increase CO2 emissions, as a major cause of emissions is land use change.
.
Probably like many others here, I feel we have been very fortunate to have lived during a warming period. History repeatedly shows that mankind prospers when the world gets warmer. It’s when the world gets colder that people starve and civilisations fail. With this in mind, and after trying to understand the arguments on both sides, I have come to the conclusion that, probably due to the effects of dominant negative feedback mechanisms, carbon dioxide has a negligible effect on the global temperature, both in the 20th century and in the entire history of the earth. Given that, the present policies of politicians around the world can only be described as barking mad. Hopefully we are at the high tide of strong AGW, and good sense will eventually prevail. But there is one promising sign apart from science itself: opinion polls in the UK and US show that a healthy majority of people believe that the warming is natural, and there is also a strong trend of increasing scepticism.
.
With this in mind, I would say that the denialism here – or rather, scepticism – is totally believable. Because, among people who live in the real world, we are in the majority.
.
If you want to engage in reasoned debate here then I’m sure you’d be most welcome. But quoting Hadley like that wasn’t a good start, as you were simply relying on authority. For sceptics that’s a bit like showing a red rag to a bull. But if you quoted actual scientific research it would be another story. According to New Scientist there is ‘overwhelming’ proof of strong AGW, but strangely they resist the temptation of telling us what it actually is. If you know what this ‘overwhelming’ proof is I’m sure we’d like to hear!
Chris

Stefan
July 8, 2009 4:40 am

Sorry for all the typos… “treat” should be “threat”.

Dan
July 8, 2009 5:38 am

“If you eliminate 1998, the trend (in temperatures) is clearly upward.”
If you eliminate 1927, Babe Ruth would have hit 654 homers, rather than 714. Don’t even get me going on the subject of A-Rod.

Bob Kutz
July 8, 2009 5:58 am

bluegrue (15:54:10)
Here’s my point: they adjust the temps up by .6 degrees and then claim there’s a .6 degree increase in GST. Doesn’t that strike you as a bit obtuse?
Further, they’ve basically denied any UHI effect, allowing -.1 degree, and then offset that with a +.7 degree adjustment. Tell me, what do these ‘necessary’ adjustments represent?
It is fairly intuitive that there is a UHI effect. I was watching the weather just the other day. All of Iowa is in the low 70’s and upper 60’s. Des Moines is reporting 78 degrees. Not unusual, and makes a bit of sense, if you live in Iowa.
So, where is it that the thermometer reads 79, and you adjust that up to 80, to be more ‘accurate’?
Let me know.

Bob Kutz
July 8, 2009 6:17 am

Bluegrue;
Further; TOBS and SHAPS should tally to near zero in the aggregate. Period, end of story.
Unless one assumes they are adding an unaccounted UHI back into SHAPS, thereby maintaining a UHI effect they were not adjusting out in the first place. (That would be a spurious argument at best.)
Here’s a thought: if their sites are so great (despite documented evidence to the contrary), why are there ‘necessary’ adjustments? To think that there is anything other than deliberate fraud at this point is to ignore massive amounts of evidence to the contrary, and assume incompetence at a level unprecedented since Easter Island.
Since their funding doesn’t come from ‘big oil’, I guess their integrity is beyond reproach?
And the ‘we really don’t know everything we need to yet’ crowd is the side that’s in denial?
Beam me up Scotty, there’s no intelligent life down here.

July 8, 2009 6:30 am

Someone in another thread called WoodForTrees a great “cherry orchard”. I took that as a compliment because I like cherry trees 🙂 But too many cherries give you indigestion…
http://www.woodfortrees.org/plot/uah/last:120/plot/uah/last:120/trend/plot/uah/last:102/trend/plot/uah/last:60/trend
Which is more representative – maybe none of them?
See also: http://www.woodfortrees.org/notes#trends
But the graph of the change of trend over time is an interesting way to look at it. To make it really interesting you’d have the end year plotted on the Z axis – the curve to end year 1998, for example, is going to look quite different…
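The trend-over-the-last-N-months exercise discussed here is easy to reproduce. Below is a rough Python sketch with a synthetic anomaly series standing in for the UAH data; the function name and the toy numbers are illustrative only, not anything from WoodForTrees:

```python
import numpy as np

def trend_per_century(series, n_months):
    """Least-squares slope over the last n_months, expressed in degC per century."""
    y = np.asarray(series[-n_months:], dtype=float)
    t = np.arange(n_months) / 12.0            # time in years
    slope_per_year = np.polyfit(t, y, 1)[0]
    return slope_per_year * 100.0

# Synthetic 30-year monthly anomaly series: a 1.2 degC/century trend plus noise
rng = np.random.default_rng(0)
months = np.arange(360)
anoms = 0.012 * months / 12.0 + rng.normal(0.0, 0.1, 360)

# Short windows scatter widely around the long-run trend
for n in (60, 102, 120, 360):
    print(n, round(trend_per_century(anoms, n), 2))
```

As the WoodForTrees notes linked above stress, the short-window trends are dominated by noise, which is exactly why they disagree with one another.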

bluegrue
July 8, 2009 6:37 am

Hank Hancock (01:14:47) :

The AGW argument strongly rests on a specific length of measurement with a specific beginning point.

You have it the other way around: you need to cherrypick short periods to get negative trends. Here’s monthly GISTEMP data running mean over 1, 10, 20 and 30 years. Could you please point out where exactly there is a cherrypick in the length of the smoothing period? Take any 15 to 50-year average and you’ll get more or less the same result. Of course the longer averages will smooth out detail. No, it’s not the satellite record, but ground-based and satellite data agree reasonably well.
The 30 year average smooths out most of the weather but still allows one to observe changes in climate. It was adopted about 75 years ago!

By agreement adopted by the former International Meteorological Organization at its meeting in Warsaw in 1935 recent 30-year averages of weather observations are defined as climatic ‘normals’. The original standard period so adopted was 1901-30; […]

“Climate” by H. H. Lamb, Methuen (August 1977), ISBN 0064738817, Footnote 1 on page 684. A simple look at the MetOffice site or the IPCC glossary would have revealed to anyone interested that the 30 year period goes back to a definition by the WMO. I find it curious that most readers seem to prefer to speculate rather than investigate. So the “30 year weather average = climate” was around quite some time before Hansen.
For completeness’ sake: if you continue to read the footnote you will note that the author is not happy with this definition, as the normal period is updated every 10 years, and he further contends that 10 to 15 year averages are more useful to predict the next year or two.
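The running-mean comparison described here can be sketched in a few lines of numpy. The series below is synthetic, standing in for GISTEMP; the point is only that window lengths from 10 to 30 years recover much the same long-term change:

```python
import numpy as np

def running_mean(series, years, per_year=12):
    """Centered moving average over `years` of monthly data (valid region only)."""
    w = years * per_year
    return np.convolve(series, np.ones(w) / w, mode="valid")

# 130 "years" of monthly data: a 1 degC/century trend plus weather noise
rng = np.random.default_rng(1)
monthly = 0.01 * np.arange(1560) / 12.0 + rng.normal(0.0, 0.2, 1560)

# End-minus-start change of the smoothed series, for several window lengths
for yrs in (10, 20, 30):
    s = running_mean(monthly, yrs)
    print(yrs, round(float(s[-1] - s[0]), 2))
```

Longer windows lose a little of the change at each end (the smooth cannot extend past the data), but the recovered trend is essentially the same.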

J. Bob
July 8, 2009 7:59 am

Hank Hancock – A while back I looked at different methods for long-term temp analysis, using other than purely “statistical” methods, one of which, Fourier convolution, Leif uses in solar analysis. In addition, some standard signal conditioning methods to see what would shake out. Using the East English data from 1659 to 2008, it was “detrended” with a linear fit.
http://www.imagenerd.com/uploads/t_est_21-Gnm7m.gif
Next, three methods were used to remove the short term data, using 40 year “filters” (fc=0.025 cycles/year): Fourier convolution (FFT), Chebyshev 4-pole and moving average. The result was:
http://www.imagenerd.com/uploads/t_est_24-vco3s.gif
Since the 4-pole Cheb. introduces a “phase” delay of 180 deg., or ~20 years (for a 40 year period wave), I “backed” it off 20 years for the resultant graph:
http://www.imagenerd.com/uploads/t_est_25-avCpP.gif
So the FFT and “phase shifted” long term data look quite similar, especially at the end, using two different methods. Putting the “trend” back in resulted in:
http://www.imagenerd.com/uploads/t_est_26-kT1s8.gif
(“temp er” should read “temp”) and comparing it to the composite put out by climate4you, resulted in:
http://www.imagenerd.com/uploads/t_est_27-qvBaC.gif
So based on this ONE data series, it looks very likely that a downward trend may be in the making. In addition, looking at the error between the actual data and the trend, the resultant error, over the range, seems well bounded.
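For readers who want to try the Fourier-convolution step themselves, here is a minimal numpy sketch of a brick-wall low-pass at fc = 0.025 cycles/year. The Chebyshev and moving-average comparisons are omitted, and the series is synthetic rather than the actual record, so treat it only as an illustration of the method:

```python
import numpy as np

def fft_lowpass(y, fc, fs=1.0):
    """Brick-wall low-pass: zero every Fourier component above fc
    (frequencies in cycles per year; fs = samples per year)."""
    Y = np.fft.rfft(y)
    freqs = np.fft.rfftfreq(len(y), d=1.0 / fs)
    Y[freqs > fc] = 0.0
    return np.fft.irfft(Y, n=len(y))

# 350 "years" of annual data: a slow 60-year cycle buried in noise
rng = np.random.default_rng(2)
t = np.arange(350)
slow = 0.3 * np.sin(2 * np.pi * t / 60.0)
y = slow + rng.normal(0.0, 0.3, 350)

# fc = 0.025 cycles/year keeps periods longer than 40 years
smooth = fft_lowpass(y, fc=0.025)
print(round(float(np.std(smooth - slow)), 2), round(float(np.std(y - slow)), 2))
```

The usual caveat applies: the FFT treats the series as periodic, so the ends of the filtered curve are the least trustworthy part, which is one reason to cross-check against a recursive filter as described above.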

Mark Wagner
July 8, 2009 8:13 am

year 2008 was tenth warmest on record, exceeded by 1998, 2005, 2003, 2002, 2004, 2006, 2001, 2007 and 1997.
I also point out that “tenth warmest” out of ten can also be called “coldest.”
That the warm temperatures are clustered around the peak is no surprise. That’s why it’s called a “peak.”
It’s usually preceded by a run up of temperatures before the peak, as occurred from roughly 1978 to 1998.
It’s usually succeeded by a steady decrease of temperatures after the peak, as has occurred from 1998 until now.
These, too, would be defining characteristics of a “peak.”
The problem with its “peakiness” is that CO2 has not peaked; it continues to rise. That absolves CO2 as a driver of temperatures – at least for me.
Meanwhile, solar activity seems to be related to the “peakiness”; cycles 19, 21, 22 and 23 were among the most active ever recorded. That these occurred during the run-up to the peak makes the peak no surprise – at least for me.
Nor will cooling from a less active sun be a surprise to me. Although some AGW true believers will find their faith shattered…

July 8, 2009 8:45 am

J.Bob: Similar Fourier filtering (to harmonic 4 = roughly 40 years) on HadCrut data:
http://www.woodfortrees.org/plot/hadcrut3vgl/plot/hadcrut3vgl/detrend:0.7/fourier/low-pass:4/inverse-fourier/detrend:-0.7
Note I’ve done the same trick of detrending and then, er, dedetrending to avoid Fourier edge effects.
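The detrend-filter-retrend trick can be written out explicitly. A small numpy sketch, illustrative only and not the WoodForTrees implementation:

```python
import numpy as np

def lowpass_with_detrend(y, keep_harmonics):
    """Remove the linear trend, zero all Fourier harmonics above keep_harmonics,
    then add the trend back. Detrending first shrinks the mismatch between the
    two ends of the series, which the FFT would otherwise read as a step and
    smear across the whole spectrum (the edge effect mentioned above)."""
    t = np.arange(len(y))
    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (slope * t + intercept)
    Y = np.fft.rfft(resid)
    Y[keep_harmonics + 1:] = 0.0
    return np.fft.irfft(Y, n=len(y)) + slope * t + intercept

# 160 "years": trend + a 40-year wiggle + noise; harmonic 4 of a 160-year
# record corresponds to the 40-year period mentioned above
rng = np.random.default_rng(4)
t = np.arange(160)
series = 0.005 * t + 0.2 * np.sin(2 * np.pi * t / 40.0) + rng.normal(0.0, 0.1, 160)
smooth = lowpass_with_detrend(series, keep_harmonics=4)
print(round(float(smooth[-1] - smooth[0]), 2))
```

A pure linear series passes through unchanged, which is the sanity check that the trend really is being restored after filtering.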

bluegrue
July 8, 2009 9:24 am

Bob Kutz (05:58:46)
Are you even aware that you compare numbers in degrees Fahrenheit to others in degrees Celsius? Ever heard of conversion factors? Are you further aware that TOBS and SHAP are applied to US48 only? As US48 is just 2% of the Earth surface the global impact of these corrections is minimal. Furthermore, your assertion that “TOBS and SHAPS should tally to near zero in the aggregate” is simply that, an assertion, and a baseless one at that.

J. Bob
July 8, 2009 12:20 pm

Woodfortrees (Paul Clark) – What I did was just “chop” off freq. above 0.025 cycle/yr., close to the “cutoff” freq. as shown in the graph ref. below:
http://www.imagenerd.com/uploads/temp_est_12-GOpNo.gif
Note the spectral chart should read cycles/year instead of Hz., so I wasn’t looking at any particular harmonic.
What was interesting is how the recursive Cheb. filter correlated to the FFT filter, after the phase adjustment. This was one of the cross checks we did when running analysis studies. The phase delay of the Cheb. filter does not help in real time process control, but it does help verify the FFT filter, if that is used in real time data processing. Just one of the reasons why the FFT is incorporated into signal processing ICs.
Checked your site previously; now I see how you put in other types of filters – thanks for the example.
The people at RC had a real hard time with Fourier analysis and filtering. They even called the procedures outlined by Blackman and Tukey (co-developer of the FFT) “bungled”.

Don B
July 8, 2009 12:54 pm

David, speaking of unbelievable denialism…
In 2005 two Russian solar physicists bet James Annan $10,000 that the earth’s temperature would be cooler, not warmer, ten years from then. The Russians were probably thinking of their grandchildren. 🙂
But the ten year length of the bet, short by climate standards per above comments, indicates that both sides of the bet were confident. Nearly halfway through the bet, the Russians are looking good.

Bob Kutz
July 8, 2009 2:52 pm

bluegrue (09:24:29) :
Yes, I am aware that I have mixed C and F, not in any formulative fashion though. Simply in separate comments regarding temp. and I don’t usually use C when talking about real time, daily temps. That’s for science logs.
As to the TOBS and SHAPS adjustments, I don’t believe that it’s true that they only apply to US data, as there is little information regarding the adjustments made outside the US available. Common belief is that they closely reflect the USHCN adjustments. Never mind the issues with what happened to the representative sampling in the demise of the USSR.
Further, if you simply resite the surface stations, or adjust for the time of day, then that ought to be essentially a ‘random noise’ adjustment. If it’s not, you are manipulating the data. Or are you thinking that whatever the time, earlier or later, the temp adjustment ought to be positive, and whatever the relo, the temp adjustment likewise ought to be positive?
Statistically, if it’s not near zero, it’s manipulation. The graph is fairly transparent in that regard. Considering that the UHI adjustment is in the realm of .1C, it’s laughable that anyone would be willing to support a positive adjustment of over .5 for the combined TOBS and SHAPS. There’s just no damn way taking a temp 1 hour later or earlier is going to have the same effect as having the station in the middle of a parking lot, or even in the park in the middle of what has become a metro area. At least not over a longer period of time. Further, SHAPS should net to zero, unless you are what? Always moving the station to a cooler location, but not because of UHI? What kind of bs is that? There’s no way to make that claim, short of admitting you are allowing for a UHI effect to be adjusted out, that you weren’t adjusting for to begin with. If you don’t understand what that means, then that’s your problem. But I am having serious doubts about your understanding of any of this.
Further, in your other post, you note that in periods of 30 to 50 years, there’s only a warming trend. Take a look, and note too that the longer the period used, the less that trend is. It’s approaching zero, the same as it would for a random sample. Think about that. Meanwhile, the shorter term trend is negative, indicating that while it may have been warming, it is now cooling, and in the long term, these minor changes are just ‘noise’. If there were really a trend, it wouldn’t be very arguable at this point.

jorgekafkazar
July 8, 2009 4:21 pm

Steven Kopits (11:33:23) : “Not to belabor the point, but a squirrel to an elephant is about three orders of magnitude. A horse to an elephant would be about right.”
Well, with all this robustness in the climate models, I’d say there must be an elephant.

Pamela Gray
July 8, 2009 4:29 pm

Reminds me of the room filled with horse manure. A boy was discovered digging through the manure sure that there must be a pony in it somewhere. So too the AGW’ers. With all this cold weather, there must be an AGWing theory in the manure somewhere.

Ivan
July 8, 2009 5:04 pm

TJA,
“I don’t know what your point is”
Simple factual correction. I didn’t mean to suggest “he is not climate scientist or statistician”, but simply to correct the information Anthony offered. I think Lubos is one of the best and mathematically most capable commentators of climate change I am aware of. If you know something about string theory, you should also know those guys have to be damn good mathematicians in order to do that science. 🙂

Editor
July 8, 2009 5:09 pm

Don Shaw (23:50:44) :
David (14:30:01) : “The time series shows the combined global land and marine surface temperature record from 1850 to 2008. ”
David, I find it interesting that the time period selected is 1850 to 2008. Do you think that going back to 1850, when it was mighty cold, might distort the story? This tells me that they are intentionally distorting the picture with this selection.

As seen here:
http://chiefio.wordpress.com/2009/03/02/picking-cherries-in-sweden/
1850 is almost exactly the bottom of the LIA lows (look at the black average line). It is a most perfectly harvested cherry…
Notice that the 1720 era temperatures are just about exactly the same as now. We’ve had a 300 year cycle down into a LIA low and back out again, nothing more. Everything else is stuff and nonsense dancing in the error bands of fictional precision.

Editor
July 8, 2009 5:29 pm

bluegrue (06:37:55) : Here’s monthly GISTEMP data running mean
GIStemp? You still use GIStemp? That is Soooo last millennium…
A partial list of problems with GIStemp can be found here:
http://chiefio.wordpress.com/gistemp/
over 1, 10, 20 and 30 years. Could you please point out where exactly there is a cherrypick in the length of the smoothing period?
OK, how about the fact that there is a roughly 30 (ish) year PDO warm phase in that period? All your intervals are from inside one half of a longer cycle that has now flipped from what it was in the ’70s. That is inside of a longer rise from 1850 (the starting point of GIStemp’s cherry pick) at the very bottom of the Little Ice Age, inside of a roughly 300 year cycle. Oh, and just for grins, there is a 1500 year cycle of Bond Events, and we are also nearing the end of a warm phase of that cycle. (Though it is worryingly possible that we are now in the first stages of Bond Event Zero…
http://chiefio.wordpress.com/2009/04/06/bond-event-zero/
though I dearly hope not.)
Oh, and GIStemp is more of a “fiction creation program” than a temperature series. One example? They need to create lots of data over space and over time where there are none, and they rewrite the history in strange and wondrous ways. (OK, that’s two…)
http://chiefio.wordpress.com/2009/02/24/so_many_thermometers_so_little_time/
So as soon as you say “GIStemp” you are saying “Fictional pasteurized processed data food product”. No thank you. I like my data real, wholesome, and minimally processed.

Editor
July 8, 2009 5:47 pm

bluegrue (09:24:29) : As US48 is just 2% of the Earth surface the global impact of these corrections is minimal.
But the US is a much larger percentage of the THERMOMETERS… so your point about the surface area is a rather silly one…
http://chiefio.wordpress.com/2009/02/24/so_many_thermometers_so_little_time/
Take a look at the graphic in the top of that link. Notice that the USA dominates the thermometers x time product that is the “temperature record”. What happens (or happened) to the USA stations is very important.
Since GIStemp happily takes the stations it has and fictionalizes their readings over ever larger areas of time and space, well, let’s just say that a little data fudge here and there can flavor a very large space… about 1500 km in the later “anomaly” stages of the code.

David
July 8, 2009 6:51 pm

After reading much of the above, I guess there are two kinds of people: those who can believe in a worldwide decades-long conspiracy of postdocs, journalists, data collectors, and policymakers, and those who can’t. Oh, and I guess the ocean and the sun would have to be in on it, too, since they don’t seem to be cooperating.
As a published peer-reviewed scientist, I suppose I’ve been hoodwinked by my colleagues.
As for the fellow that asserted that the years Hadley names as the warmest are the product of regression, here are the raw 20 warmest years since 1850 in that series, with no adjustments:
1998
2005
2003
2002
2004
2006
2001
2007
1997
2008
1999
1995
2000
1990
1991
1988
1987
1983
1994
1996
Must be all of those crummy thermometers.
REPLY: Interesting thing about Hadley, they have steadfastly refused to share any techniques, formulae, or code used to come up with their temperature dataset. Science without transparency or replication opportunity is not science. Further, it only takes a couple of people and computer code to produce the dataset they offer, so your “decades-long conspiracy of postdocs, journalists, data collectors, and policymakers” really doesn’t apply when it is in the hands of a couple of people.
I challenge you. Ask Hadley to share the techniques, formulae, and code as many of us have. FOI requests have been made and ignored. If you are truly a man of science, it should be no problem whatsoever for you to obtain what nobody else has been able to.
Let us know when you have it. Or let us know when you are rebuked. I’ll be happy to provide the contact information for Dr. Philip Jones there if you wish, and I’ll even grant you a full-on guest post to share what you’ve learned. – Anthony Watts

Reply to  David
July 8, 2009 7:05 pm

David:
Confirmation bias and groupthink are all that is required. No conspiracy necessary.
BTW, let us know when Hadley releases the raw data (not the gridded data) and methods that created your list. Until then that list is nothing more than opinion.

Editor
July 8, 2009 8:36 pm

jorgekafkazar (16:21:20) :
Steven Kopits (11:33:23) : “Not to belabor the point, but a squirrel to an elephant is about three orders of magnitude. A horse to an elephant would be about right.”
Well, with all this robustness in the climate models, I’d say there must be an elephant.

Harvesting a silly nit: I think it depends on what you are comparing:
volume (as a proxy for mass), surface area, or length…

Steve Keohane
July 8, 2009 8:37 pm

Pamela Gray (16:29:42) I heard the story re: the boy shoveling the stall, as a light-hearted example of optimism. Seems hard to imagine such vitriolic folk as being optimists…

bluegrue
July 9, 2009 1:45 am

Bob Kutz (05:58:46)

Here’s my point; they adjust the temps up by .6 degrees and then claim there’s a .6 degree increase in GST.

Bob Kutz (14:52:22)

Yes, I am aware that I have mixed C and F, not in any formulative fashion though. Simply in separate comments regarding temp.

Yeah.

As to the TOBS and SHAPS adjustments, I don’t believe that it’s true that they only apply to US data, as there is little information regarding the adjustments made outside the US available.

Hansen et al. 2001 document the changes with regard to their 1999 algorithm and say otherwise.

or adjust for the time of day, then that ought to be essentially a ‘random noise’ adjustment.

Wrong, you are seriously uninformed: TR Karl et al., J. Appl. Met. 25(2), pp. 145–160, Feb. 1986. The time of observation has changed in the US in a systematic, not a random way. Read the paper, note its date.

Further, in your other post, you note that in periods of 30 to 50 years, there’s only a warming trend. Take a look, and note too that the longer the period used, the less that trend is. It’s approaching zero, the same as it would for a random sample.

Take a look yourself here. Take 10, 30 and 50 year averages, cut down to the period that is covered by all smooths, and apply linear regression. Care to repeat the bit about approaching zero?
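The procedure described here (smooth, trim to the common period, regress) can be sketched as follows; synthetic annual data stand in for the real record, and the exact numbers are illustrative only:

```python
import numpy as np

def running_mean(y, w):
    """Moving average with an integer window w (valid region only)."""
    return np.convolve(y, np.ones(w) / w, mode="valid")

def slope(y):
    """Least-squares slope per time step."""
    return np.polyfit(np.arange(len(y)), y, 1)[0]

# 150 "years" of annual anomalies: a 0.7 degC/century trend plus noise
rng = np.random.default_rng(3)
annual = 0.007 * np.arange(150) + rng.normal(0.0, 0.1, 150)

# Smooth with 10-, 30- and 50-year windows, trim each smooth to the
# central stretch covered by all three, then regress over that overlap
smooths = {w: running_mean(annual, w) for w in (10, 30, 50)}
n = min(len(s) for s in smooths.values())
for w, s in smooths.items():
    mid = (len(s) - n) // 2
    print(w, round(slope(s[mid:mid + n]) * 100, 2))   # degC per century
```

Because a running mean of a linear trend is the same linear trend, the fitted slopes barely depend on the window length, which is the point being made above.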

bluegrue
July 9, 2009 1:45 am

E.M. Smith

So as soon as you say “GIStemp” you are saying “Fictional pasteurized processed data food product”. No thank you. I like my data real, wholesome, and minimally processed.

Using Hadcrut does not change the point I made in any way. Have a look at this graph of Hadcrut data, giving linear trends for the period 1905 to 1985 after 10, 30 and 50 year smoothing. Care to explain why the trends are almost identical?

But the US is a much larger percentage of the THERMOMETERS… so your point about the surface area is a rather silly one…

Words fail me. The thermometer readings are used to determine a temperature for each point of the Earth’s surface, then you integrate over the surface. If you have two stretches of land of equal size, one with 10000 thermometers and one with 10 thermometers, the former having an average temperature of 20°C, the other of 22°C, then the average temperature over both is 21°C and not 20.002°C, as you imply.
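That arithmetic is worth spelling out. A toy example in plain Python, with made-up station counts, shows why the share of thermometers is not the same as the share of the average:

```python
# Two equal-area regions with wildly unequal station counts
region_a = [20.0] * 10000   # 10,000 thermometers all reading 20 degC
region_b = [22.0] * 10      # 10 thermometers all reading 22 degC

# Naive pooling lets region A swamp the mean by sheer station count
pooled = sum(region_a + region_b) / (len(region_a) + len(region_b))

# Gridding first: average within each region, then weight by (equal) area
area_weighted = (sum(region_a) / len(region_a) +
                 sum(region_b) / len(region_b)) / 2.0

print(round(pooled, 3), area_weighted)   # 20.002 vs 21.0
```

This is the reason gridded products average stations within a cell before combining cells by area, so the sheer number of US thermometers does not by itself set the global mean.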

July 9, 2009 2:05 am

David
Glad you are still around. Perhaps you can answer the question I posed a few days ago then we can have a sensible discussion on the weather stations and the data that emanates from them.
“Would you like to confirm your understanding of the number of accurate weather stations that were used at the start of the series in 1850, and also how the marine surface record back to then was compiled? Also you might like to find out how often the reporting stations have changed in number and location. Once you know that background we can all have a sensible discussion on the data you have referenced.”
Thanks. Look forward to your reply
Tonyb

Chris Wright
July 9, 2009 4:37 am

David (18:51:46) :
You don’t believe in a worldwide decades-long conspiracy. Neither do I. It’s far more subtle than that. Group think is great if you’re fighting a war, building the first atomic bomb or putting a man on the moon. But it’s a terrible way of trying to get to the truth. It becomes all too easy to ignore or rationalise away facts that contradict what the group thinks.
And of course there are many vested interests. Many jobs depend on AGW. Because of the scare, US spending on climate science has mushroomed from hundreds of millions to over a billion dollars, and that was under Bush. If I were a climate scientist I would have to be stupid to question the orthodoxy – assuming, of course, that I didn’t care about the integrity of science.
No, I don’t think it’s a conscious conspiracy. But it may be an unconscious one.
I’m not quite sure what listing the temperatures proves. Nobody is seriously saying there has been no global warming, although the amount is almost certainly exaggerated. You can see the trend simply by looking at the graph of HADCRUT3. But it also shows a very consistent cooling trend over almost the last ten years. The satellite record, which is almost certainly far more reliable than the ground record, is now almost precisely on its 30 year average. According to AGW, shouldn’t the satellite record actually be higher?
With this in mind, it’s not surprising people are becoming more sceptical. But the real problem is the lack of credible proof of AGW. Without that proof, and bearing in mind that the recent warming is well within natural variability, then AGW looks increasingly like a failed theory. But if you do know what that ‘overwhelming’ proof of AGW is, then please let us know!
Chris

Hank Hancock
July 9, 2009 5:12 am

bluegrue (06:37:55) – “You have it the other way around: you need to cherrypick short periods to get negative trends. Here’s monthly GISTEMP data running mean over 1, 10, 20 and 30 years. Could you please point out where exactly there is a cherrypick in the length of the smoothing period?”
I’m sorry but I don’t have it the other way around. I’m not talking about the mean samples (smoothing period) but rather the relationship of the measurement period to the lower frequency cycle. Looking at your running mean, I see a quarter cycle in its positive phase with 1880 appearing to be a transition from a negative to positive phase, which the graph provided by E.M.Smith (17:09:30) corroborates:
http://chiefio.wordpress.com/2009/03/02/picking-cherries-in-sweden/
There’s a zero crossing at roughly 1979, assuming 2002 to be the peak of the positive phase. What is most noteworthy is the trend from 1920 to 1940 is almost identical to that of 1979 to 2000. It is hard to correlate the CO2 trend to that little glitch that occurs from 1940 to 1979 and the one that occurs at 2002 to the present.
The fifth image generated by J. Bob (07:59:45):
http://www.imagenerd.com/uploads/t_est_27-qvBaC.gif
provides an excellent “drill down” to a transition ca. 2002 from the past positive phase to the present negative phase in the cycle.
I don’t see a “cherry pick” per se. I see a measurement period that, by way of its arbitrary length, optimizes on the positive phase of a longer climate cycle. The false premise of AGW is that this arbitrary measuring period is somehow of sufficient length to account for dominant climate cycles far longer than its statistical scope.
“By agreement adopted by the former International Meteorological Organization at its meeting in Warsaw in 1935 recent 30-year averages of weather observations are defined as climatic ‘normals’. The original standard period so adopted was 1901-30; […]”
If the AGW measurement period ties to the length of the satellite record, it is arbitrary. If it ties to the ground instrument record, it is arbitrary. If it ties to the number 30 pulled out of the air by well meaning scientists 75 years ago, it is arbitrary. Being arbitrary, it can only speak to the portion of the cycle it covers with no meaningful application beyond retrospection.

rtgr
July 9, 2009 5:36 am

@bluegrue: But then again you don’t really know the temperature if you have just 10 thermometers! Of the 1700 thermometers used in GISS, how many are in Africa? How many are in the Sahara desert (none)? Even the 1200 km smoothing doesn’t help there. Plus, most of them are poorly sited and all of them are corrected constantly.
By the way, according to satellite data (UAH), 2008 was the 17th warmest in 30 years:
1998 0.51 1
2005 0.34 2
2002 0.31 3
2007 0.28 4
2003 0.27 5
2006 0.26 6
2001 0.2 7
2004 0.19 8
1991 0.12 9
1987 0.11 10
1988 0.11 11
1995 0.11 12
1980 0.09 13
1997 0.08 14
1990 0.07 15
1981 0.05 16
2008 0.05 17
1983 0.04 18
1999 0.04 19
2000 0.03 20
1996 0.02 21
1994 -0.01 22
1979 -0.07 23
1989 -0.11 24
1982 -0.15 25
1986 -0.15 26
1993 -0.15 27
1992 -0.19 28
1985 -0.21 29
1984 -0.26 30

Steve Keohane
July 9, 2009 6:49 am

bluegrue (06:37:55): On the one hand we’re told temperature is very important; on the other, the monitoring system peaked in 1985 and is presently decimated: http://i44.tinypic.com/23vjjug.jpg
Hansen would have us believe that he can just average this and that with less real data and arrive at a good approximation of reality: http://i27.tinypic.com/14b6tqo.jpg
With a coincidental 2° temperature shift, up of course, it would appear not.

J. Bob
July 9, 2009 7:39 am

bluegrue (06:37:55) – Why not look at some of the longer term temps to get a better perspective, and look at other sources. Look at the Stockholm data STOCKH-GML 1755-2005 at
http://www.rimfrost.no/
where they have a lot of longer term temp series from other parts of the world. However looking at the two longest, East England and the Stockholm data, you will see that these cycles have happened before.

Editor
July 9, 2009 10:14 am

Steve Keohane (06:49:53) :
bluegrue (06:37:55): On the one hand we’re told temperature is very important; on the other, the monitoring system peaked in 1985 and is presently decimated: http://i44.tinypic.com/23vjjug.jpg

That is a fascinating “animation”! Any explanation for the huge drop-out of stations since 1985, even in places like the USA? The Soviet collapse drop-out I understand, but China, Africa, and the US? If it is just arbitrary deletion of stations from use in computation, then it is a severe “cherry pick”. If it is actual retirement of stations, then I’m left to wonder why, with AGW such a “world ending problem”, nobody is willing to look at the thermometers…
Hansen would have us believe that he can just average this and that with less real data and arrive at a good approximation of reality: http://i27.tinypic.com/14b6tqo.jpg
With a coincidental 2° temperature shift, up of course, it would appear not.

Another wonderful graph. You can see the collapse of the Soviet Union in the huge spike in temperature / drop in station count. Nothing like deleting most of Siberia to warm your averages…
And per Hansen’s magic hand waving data spread: I’ve read the code, I’ve written up my comments on much (though not all) of it. It makes no sense at all. The most obvious is that IF I have a change of equipment (or TOBS) in, say, 2005 then that is used to rewrite all the history of the station back to the beginning of time. Now just what does a change in 2005 have to do with 1890? If we had a change that lowered the record by 1 F in, oh, 1900, and then another change that raised it by 1F in 2000, then that change in 2000 would be used TO LOWER all the data from 1900 to 2000 (!) since only changes in the last few years are used / “corrected”. So you would get a “double dip” increase in slope of the temperature curve, not a correction.
Another? He happily fills in data for places that have none from up to 1500 km away (in the “anomaly” stages, STEP4_5). It’s “only” 1000 km in the actual temperature stages… and yes, these are applied sequentially, so a thermometer can get “filled in” from 1000 km away, then the anomaly can get spread another 1500 km. So if it’s only additive, it would be 2500 km for those two steps (though he does this trick in more than two places…)
So take just a moment and ask yourself if Phoenix is a good proxy thermometer for San Diego… I’ll wait… Then ask if San Francisco is a good proxy for Reno? Or Lodi? Or Marysville? (Now that Marysville is gone, its missing data going forward will be “filled in” from somewhere “nearby” up to 1000 km away…) The whole process stinks of error creation.
Then we are supposed to get excited about changes in the 1/10th and 1/100th degree C positions? When they don’t even have any accuracy in them and are totally a product of False Precision? Even if the data WERE real, which it isn’t, everything to the right of the decimal point is a FICTION. That it is a fiction based on a computed fabrication of synthetic station records makes it a Farce of a Fiction of a Fabrication …
http://chiefio.wordpress.com/2009/03/05/mr-mcguire-would-not-approve/
So I’m left to ask: “Will the real temperature record please stand up?”
Take the raw data, or take individual long life stations and look at that data. Do not, under any circumstances, use GIStemp. (And since Hadley works closely with Hansen, matches his series fairly well, and will not publish their methods: I’m left to assume they use some of the Hansen Magic Sauce and ought not to be trusted either.)
And for the apologists that say “but it matches the satellites” I would point out that rewriting the data prior to 1980 is a great way to fudge the past and increase the slope while staying in sync with the newer less fudge prone series… The method itself sets off my data forensics red flags…
So we have fairly reliable recent data that says the world is not changing much, and that is to the cooler side. We have recent news flow from all over the planet saying that the cooler phase is in many cases colder than has been seen in decades (or in the recorded weather history). We have a few very long time series reliable thermometers (such as the Swedish record) that show a drop into 1850 and a rise back to the prior normal temperature in a simple long term cycle.
The conclusion I’m left with is that the world is absolutely normal. We are fairly ignorant of its history. It is most probable that we are headed back into a cold phase for several decades. AGW is bunk, and Hansen with GIStemp is either horridly poorly done, or sinister. Malice or stupidity are the only two choices… pick one. Just don’t expect to use the GIStemp product for anything other than scaring the children…

Pamela Gray
July 9, 2009 10:49 am

Mr. Smith, you should go to Washington.

Bob Kutz
July 9, 2009 11:21 am

bluegrue (01:45:18) :
You clearly don’t understand the concepts in the graphs in this article, and the graph at woodfortrees shows that. It’s not the number of samples per year, it’s the length of the data you use; the longer the trail, the lower the trend line, until at some point it will reach zero.
http://woodfortrees.org/plot/gistemp/from:1880/mean:12/plot/gistemp/from:1880/mean:120/from:1880/to:2008/trend/plot/gistemp/from:1880/mean:600/from:1930/to:2008/trend/plot/gistemp/from:1880/mean:360/from:1980/to:2008/trend
This shows (as do the graphs in the article to which this thread refers) that over time, the trend is toward zero. (Notice: that’s toward, not ‘to’ zero.)
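The dependence of the fitted trend on the window length can be sketched numerically. This is a toy illustration with synthetic data; the series length, noise level, and trend value are invented for the example, not taken from UAH or GISS:

```python
import numpy as np

rng = np.random.default_rng(7)

# hypothetical monthly anomaly series: a small warming trend plus noise
months = np.arange(372)                    # 31 years of monthly values
anom = 0.0012 * months + 0.15 * rng.standard_normal(len(months))

def trend_last_n(series, n):
    """Least-squares trend over the last n months, in degrees C per century."""
    y = series[-n:]
    x = np.arange(n) / 12.0                # time in years
    return np.polyfit(x, y, 1)[0] * 100.0  # degC/yr -> degC/century

# short windows are dominated by noise; long windows recover the trend
for n in (60, 120, 240, 360):
    print(n, round(trend_last_n(anom, n), 2))
```

On the noiseless trend alone, `trend_last_n` returns the true 1.44 °C/century regardless of window; with noise added, the short windows scatter widely, which is exactly the window-length sensitivity Motl's plot illustrates.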
Now, if you take the .6 out of the data (That’s -.1 +.7) There is no trend at all, and, as Anthony has observed many times, there is no way you can justify that adjustment.
Here’s a fine reference for this ‘uncontroversial’ adjustment; http://atmoz.org/blog/2008/03/24/i-guess-i-dont-understand-the-time-of-observation-correction/
Let’s just find one thing agreeable, you and me: the science is far from fully understood, and even farther from ‘settled’. Only a simpleton like Gore would make such an unscientific assumption.
Can you at least concede that?

Bob Kutz
July 9, 2009 11:36 am

and yes, I realize I misspoke and didn’t correct myself at the end of para 1

bluegrue
July 9, 2009 1:11 pm

Hank Hancock (05:12:31)
You are simply eyeballing the data, which is not a reliable way to analyze it.
It is hard to correlate the CO2 trend to that little glitch that occurs from 1940 to 1979 and the one that occurs at 2002 to the present.

1940-1979: aerosols and solar variability; you’ll find it in the IPCC reports. The last 8 years is not long enough to determine a statistically significant trend.

The fifth image generated by J. Bob (07:59:45):
http://www.imagenerd.com/uploads/t_est_27-qvBaC.gif
provides an excellent “drill down” to a transition ca. 2002 from the past positive phase to the present negative phase in the cycle.

How about a bit of testing to see whether this downturn, on which you base your argument, is an artifact or not? Filtering can introduce artifacts, especially at the beginning and the end of the series. I’ve done the FFT low-pass filtering for the Hadcrut3 data using the full data set and the data set truncated at the end of 1996; consider it time travel into the past.
http://i28.tinypic.com/s3efj5.png
The smoothed truncated data set indicates a “downturn” in 1992. None of that has happened. J. Bob, if you read this, could you please repeat your smoothing with truncated data sets (anytime in the 90s, so we still have the recent upturn but avoid 1998) and post the results, too? I’m pretty sure you’ll see the same artifact.
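The truncation test described above can be sketched as follows. This is a hedged illustration with a synthetic series standing in for Hadcrut3; the slow cycle, the noise level, and the 13-bin cutoff are assumptions for the example:

```python
import numpy as np

def fft_lowpass(x, keep_bins):
    """Zero every FFT bin at or above keep_bins, then invert.
    The FFT implicitly treats the record as periodic, so the smoothed
    values near the ends are sensitive to where the record is truncated."""
    X = np.fft.rfft(x)
    X[keep_bins:] = 0.0
    return np.fft.irfft(X, n=len(x))

rng = np.random.default_rng(0)
t = np.arange(512)
# synthetic "anomaly" record: a slow cycle plus noise
x = 0.3 * np.sin(2 * np.pi * t / 200.0) + 0.1 * rng.standard_normal(512)

smooth_full = fft_lowpass(x, 13)          # smoothed with the full record
smooth_trunc = fft_lowpass(x[:400], 13)   # "time travel": truncated record

# the two smoothings disagree near the truncation point
print(np.abs(smooth_full[:400] - smooth_trunc).max())
```

Comparing `smooth_full` and `smooth_trunc` where they overlap shows the end-of-record artifact bluegrue is pointing at: the low-pass result changes when later data is removed, even though the earlier data itself did not change.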

Being arbitrary, it can only speak to the portion of the cycle it covers with no meaningful application beyond retrospection.

Nonsense, but if you prefer to read only part of my post I can’t help you.

bluegrue
July 9, 2009 1:11 pm

E.M.Smith (17:09:30) :
You are fond of Uppsala; have a look at this account of how the temperature was reconstructed for the first half of the 18th century. The sharp temperature change at the beginning is most likely a smoothing artifact.

bluegrue
July 9, 2009 1:13 pm

J. Bob (07:39:14) :
Please read my post to Hank Hancock, I’d like you to check something.
As for cycles, all you are showing are variations at a single location, not at the global level. Secondly, would you please quantify your periods, rather than saying “Hey, just look”?

bluegrue
July 9, 2009 1:14 pm

Steve Keohane (06:49:53) :
Please read here:
http://scienceblogs.com/deltoid/2004/04/mckitrick.php

bluegrue
July 9, 2009 2:00 pm

Now, if you take the .6 out of the data (That’s -.1 +.7) There is no trend at all, and,

Necessary correction of known systematic errors, US48 only, Celsius / Fahrenheit. As I told you before. This gets boring.

as Anthony has observed many times, there is no way you can justify that adjustment.

I’d like to hear that from the source.
@ Anthony: Do you think that TOBS and SHAP are necessary corrections to the US48 temperature data or not?

bluegrue
July 9, 2009 2:10 pm

The previous post was also on
Bob Kutz (11:21:58)

You clearly don’t understand the concepts in the graphs in this article,

The discussion had drifted. Here’s Motl’s second plot extended to the full Hadcrut3 data set.
http://i25.tinypic.com/wji05z.png
The pink curve is the same analysis for a simplistic model: temperature anomaly is zero until 1970, it rises linearly to +0.65°C today, add noise. Not too bad for such a simplistic model.
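The simplistic model described above can be sketched like this; the monthly resolution, start year, and noise amplitude are my assumptions for the illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# monthly anomalies: flat at zero until 1970, then a linear rise to +0.65 degC
years = np.arange(1850, 2009, 1.0 / 12.0)
ramp = np.where(years < 1970, 0.0, 0.65 * (years - 1970) / (2009 - 1970))
anom = ramp + 0.1 * rng.standard_normal(len(years))

def trailing_trend(series, t, n_months):
    """Linear-regression trend over the last n_months, in degC per century."""
    slope = np.polyfit(t[-n_months:], series[-n_months:], 1)[0]
    return slope * 100.0

print(round(trailing_trend(anom, years, 360), 2))  # ~30-year trend
```

Sweeping `n_months` over this series reproduces the qualitative shape of the trend-vs-interval plot: noisy short-window trends that settle toward the ramp's slope (about 1.7 °C/century over the last 30 years of the noiseless model) as the window grows.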
And per popular demand ( woodfortrees (Paul Clark) ), here is Motl’s first plot for 1998 as an end date, I consider this one to be nonsense:
http://i29.tinypic.com/wuglz7.png
Absolutely ridiculous warming rates.
P.S.: I’ll call this a day, CU.

July 9, 2009 2:25 pm

Bluegrue
I reread your reference to Uppsala; incidentally, none other than Arrhenius lived there, which is somewhat ironic.
The grand master of interpolation is James Hansen-who was a brilliant scientist in his day but made a lot of assumptions based on a small number of poorly distributed weather stations, many with chunks of data missing.
http://pubs.giss.nasa.gov/docs/1987/1987_Hansen_Lebedeff.pdf
This paper managed to make it to the infamous Congress hearing in 1988 and was a foundation stone for the first IPCC assessment. Count the number of ‘uncertainties’.
CET and Zurich Fluntern are both very old records, the two closely mirroring each other.
Tonyb

Hank Hancock
July 9, 2009 2:28 pm

J. Bob (07:59:45):
Excellent work, I might add. I was intrigued to see how the FFT compared to the Chev 4. The Chev 4 appears more reactive to the higher frequency components. Curiously, it pulls a slight negative phase shift around 1800, perhaps in response to the large excursion from 1770 through 1790. I’m curious if you used a four point transform in the FFT?
http://www.imagenerd.com/uploads/t_est_25-avCpP.gif
The moving mean plot was a real eye opener! At 1825, and more notably in the mid-1840s, it’s at a totally opposite phase from reality. It shows a marked warming trend leading into the 1840s, a time when unseasonably cold and rainy weather was bringing on the Irish potato blight and the Great Hunger. What is most interesting is that it’s running a 40-year spread, which is kind of close to the 30-year spread bluegrue alludes to as being just right to see the global warming. Now I get it!

Bob Kutz
July 9, 2009 2:40 pm

bluegrue (14:00:14)
As long as you’re going to check with Anthony, let’s back up a step and ask him to direct his attention not just to SHAPS and TOBS, but rather to the validity of the sum of these adjustments, as related by the graph which started this discussion;
http://www.ncdc.noaa.gov/img/climate/research/ushcn/ts.ushcn_anom25_diffs_urb-raw_pg.gif
which YOU claimed was attributable mostly to SHAPS and TOBS, and was incontrovertible, entirely justified, and necessary.
I indicated that the notion that anyone would adjust the most recent readings of objective instrumentation up by 0.6 °C, and then claim there’s been 0.6 °C of warming in the last 100 years and the science is settled, is rather a strange way to conduct science. That’s like starting with a hypothesis, and then adding fudge factors to the data until you get what you want.
The alternative, of course, would be to take the objective measurements, and after realizing there’s been a century of no change in the data, start to look at those measurements and see if there’s some change that’s been camouflaged by siting issues or time of observation issues. We’ve clearly gotten the cart before the horse on this.
Sincerely,
Bob Kutz

July 9, 2009 2:47 pm

Bluegrue
Sorry for the spelling mistakes-it ‘escaped’ before its time!
tonyb

J. Bob
July 9, 2009 5:06 pm

Bluegrue – OK, how about ending the sequence in 1997? Actually, looking at the phase-corrected Chev. filter will give you a pretty good idea of what happens if you truncate the run. Might take a day or two to look at the truncated data. I am also looking at the Stockholm-Uppsala data from 1772-2005 to see what shakes out.

J. Bob
July 9, 2009 7:34 pm

Couple of points:
1 – I realize that this is only one data set; I doubt if “global temps” existed back then, but it’s the best we have. However, one could easily compare it to the “global temps” computed from the mid-1800s.
2 – Hank, your comment on the famine in Ireland correlates with research I have been doing on our various families. The question that emerged was: how many population migrations were caused by weather/climate? There seems to be a pattern in the various family backgrounds. These are just pieces of info, without rigorous research, but a pattern might emerge.
Unfortunately we have very few chronicles to go on, but histories of Pomerania, Germany, etc. give glimpses.
~500 B.C. Gothic and other Scandinavian tribes head south into Pomerania (now northern Poland). These included the Goths, Vandals, etc. The cause of the migration seems to have been famine (poor growing weather).
~100 A.D. Romans occupy England, making wine from local grapes (warmer weather?).
~400 A.D. In what the Goths called “the great migration,” tribes move out of Pomerania, south, into the Roman empire. About that time the Rhine freezes over, allowing the tribes to cross into Roman lands, and the Huns burst out of Eurasia.
~900-1000 A.D. Vikings settle in Iceland and Greenland. Settlement in North America.
~ 1350 – Last Viking settlement in Greenland, beginning of long cold period.
Anyway, there seems to be some correlation between weather and the mass migrations of the past, which seem to follow an 800-1000 year period. That whole area would be worth a few papers.

George E. Smith
July 9, 2009 7:36 pm

“”” bluegrue (01:45:49) :
E.M. Smith
Words fail me. The thermometer readings are used to determine a temperature for each point of the Earth’s surface, then you integrate over the surface. If you have two streches of land of equal size, one with 10000 thermometers and one with 10 thermometers, the former having an average temperature of 20°C, the other of 22°C, then the average temperature over both is 21°C and not 20.002°C, as you imply. “””
Well, you are forgetting that there are approximately zero thermometers over 70 percent of the earth’s surface. It is my understanding that around 1850 there were precisely 12 thermometers in the Arctic (north of +60 Lat), and that number increased over the years to around 86 or so, and then in recent years decreased to about 72 or so; my guess being that this coincided with the collapse of the Soviet Union.
It is laughable to suggest that this hodgepodge of sampling methodology in any way conforms to the basic laws of sampled-data theory. Having 10,000 thermometers in one area and ten in another doesn’t improve the situation over having ten in each.
Nyquist does not require that the sampling intervals be equal; but the maximum sample spacing (anywhere) sets the band limit of the recoverable signal, not the minimum spacing. Ergo, GISStemp and HADcrut are both just sets of numbers bearing no relationship to the temperature of planet earth.
All of the wonderful manipulations of statistical mathematics can be applied to any arbitrary set of numbers; those numbers don’t actually have to have any linkage to each other; they are simply a set. Yet they have an average or mean, a median, a standard deviation, any highfalutin extraction you want to make; you just can’t extract any information from that set; there isn’t any to be had.
It can be argued that the maximum information density is carried by white Gaussian noise; because no matter how long a sample sequence you want to study, at no point can you predict the value of the next sample; you can’t even predict if it will be higher or lower than the latest value. So in that sense the signal is 100% information, having zero redundancy.
Conversely, GISStemp, or HADcrut have close to zero information content; they violate Nyquist both temporally, and in spades in spatial sampling.
You only need a violation by a factor of two to make the average unrecoverable (without aliasing noise).
And the Central Limit theorem cannot buy you a reprieve from a Nyquist violation.
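The folding argument above can be checked numerically with a toy example; the frequencies and sampling interval here are arbitrary illustrations, not drawn from any temperature record:

```python
import numpy as np

# Sampling every 2 time units gives fs = 0.5 and a Nyquist limit of 0.25
# cycles per unit. A 0.45 cycles/unit sine then folds down to
# |0.45 - 0.5| = 0.05 cycles/unit: at the sample points the two signals
# coincide up to a sign flip, so the fast one is unrecoverable.
fs = 0.5
t = np.arange(0, 100, 1.0 / fs)
x_fast = np.sin(2 * np.pi * 0.45 * t)
x_alias = np.sin(2 * np.pi * 0.05 * t)

print(np.allclose(x_fast, -x_alias))  # prints True
```

Once the samples are taken, no amount of averaging or statistics can tell the two frequencies apart, which is the sense in which a too-sparse station network sets a hard limit on what is recoverable.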

J. Bob
July 9, 2009 7:45 pm

Hank – got sidetracked on the above post. The Chev. filter was 2 cascaded 2-pole recursive filters, with a cut-off freq. of 0.025 cycles/year and 5% ripple. I used cascaded filters to avoid stability problems. The FFT and Chev. were completely different computational methods; I feel this gives a better cross check. I used the coefficients from “The Signal Processing Handbook”
at
http://www.dspguide.com/ch20/2.htm
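A SciPy sketch of a filter like the one J. Bob describes, under my own assumptions about the details (yearly sampling, Chebyshev type I design, 5% ripple expressed as about 0.45 dB, zero-phase behavior via forward-backward filtering); his actual coefficients may differ:

```python
import numpy as np
from scipy import signal

fs = 1.0                                  # one sample per year
rp_db = -20 * np.log10(1 - 0.05)          # 5% passband ripple, ~0.45 dB
# one 2-pole Chebyshev type-I low-pass section, cutoff 0.025 cycles/year
sos = signal.cheby1(2, rp_db, 0.025, btype='low', fs=fs, output='sos')
sos2 = np.vstack([sos, sos])              # cascade the 2-pole section twice

rng = np.random.default_rng(1)
years = np.arange(1755, 2006)
# synthetic long station record: a slow cycle plus year-to-year noise
temps = (0.5 * np.sin(2 * np.pi * (years - 1755) / 70.0)
         + 0.5 * rng.standard_normal(len(years)))

# forward-backward filtering cancels the phase shift of the cascade
smooth = signal.sosfiltfilt(sos2, temps)
```

Running the cascade forward and backward (`sosfiltfilt`) is one way to get the “phase corrected” behavior mentioned earlier in the thread; it also squares the magnitude response, so the effective roll-off is steeper than a single pass.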

July 10, 2009 12:53 am

J Bob
As you say, history tells us a great deal; there are many papers on all the events you mention, many posted by myself and others in this forum over the months.
The trouble is that these events are called ‘anecdotal’. A complete Roman army destroyed when their enemies crossed the frozen Rhine is anecdotal! Yet cherry-picking a highly dubious bristlecone pine proxy is the epitome of science! The Romans were great ones for recording weather, so we can reconstruct the climate of much of the known world from around 600AD to the fall of the Byzantine empire in 1453AD.
The Vikings of course are dismissed as being a very localised anomaly.
George E Smith
Did you read my link to the Hansen paper? This describes the number of thermometers he used from 1880 to construct his global temperature. The numbers worldwide in 1850 were about 20 (that could be considered reliable).
There is no doubt the climate irrationalists are scared of history. The story hinges on limited climate variation within natural variability during constant CO2 levels, which only in recent years goes beyond natural limits due to increased CO2 concentrations.
Tonyb

J. Bob
July 10, 2009 7:28 am

Hank Hancock (14:28:57) :
“I’m curious if you used a four point transform in the FFT?”
Hank, I’m not sure what you meant, but what I did use were the first 13 frequencies in the frequency domain. The frequency increment is given by 1/(number of samples [512] times the time increment [1 year]). So frequencies from 0.002 to 0.025 cycles per year were used to reconstruct the filtered input in time. Does that help?
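As a quick check of that bin arithmetic (assuming, as stated, 512 samples at 1-year spacing):

```python
import numpy as np

n, dt = 512, 1.0                  # number of samples, time step in years
df = 1.0 / (n * dt)               # spacing between FFT frequency bins
freqs = df * np.arange(1, 14)     # the first 13 nonzero frequencies
print(freqs[0], freqs[-1])        # prints 0.001953125 0.025390625
```

So keeping bins 1 through 13 passes roughly 0.002 to 0.025 cycles per year, matching the cutoff of the Chebyshev filter used as the cross check.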
Tonyb – The historical weather/climate observation was only that, and would be worth many threads, which I would love to get into. However, “given the time allotted to us” [Gandalf, Lord of the Rings], I guess I’d better stick to what I started out with, namely applying signal-conditioning methods to get better insight into long-term weather/climate patterns.

July 10, 2009 8:25 am

J Bob
‘Lord of the Rings’, ‘An Inconvenient Truth’, ‘IPCC Assessments 1-4’: six of my favourite science fantasy novels.
Incidentally, I meant the Romans from 600BC (not AD), so that gives us 2000 years of ‘anecdotal’ records.
Tonyb

David Ball
July 12, 2009 1:29 pm

Just curious if anyone has heard from the David (“I am a peer-reviewed scientist”) who was saying he was flabbergasted by the denialism here, then couldn’t respond to several queries about the source of his data. Any word on how he is making out with his mission to access the techniques, formulas, and code from Hadley? Perhaps he found out that his colleagues were actually lying to him.
REPLY: I sent him a personal email, he’s in a state of denial and refuses to look. Turns out he’s the curator of a prominent museum of science, and you’d think he’d have a better handle on dealing with the public aka “us” – Anthony

Bart Nielsen
July 12, 2009 6:18 pm

David:
We should all remember having read that, when we consider how we will justify our inaction to our grandchildren.
If you are a properly-brought-up-environmentally-conscious-global-citizen it’s a reasonable statistical inference that you will likely end up as grandfather to a child who has four grandparents who have no other grandchild, or perhaps you will have no grandchildren at all. It’s quite comical to hear the zero population enthusiasts try to justify their aims by appeals to future generations. Those who believe that the best thing we can do for our children is to not have them really don’t have any stake in the future.

kuhnkat
July 13, 2009 8:26 am

Rob,
“I’ll bet they are measurably higher. Average global temps for a decade around 1995 and 2005 and see what you get. Then try it for a few more decades back.”
Except since 2005 CO2 has continued to rise and temps have fallen precipitously. Doesn’t appear that CO2 has much of anything to do with weather OR climate!!
HAHAHAHAHAHAHAHA

Ron de Haan
August 31, 2009 7:18 am