No statistically significant warming since 1995: a quick mathematical proof

Physicist Luboš Motl of The Reference Frame demonstrates how easy it is to show that there has been no statistically significant warming since 1995.

First, since it wasn’t in his original post, here is the UAH data plotted:

[Figure: UAH_LT_1979_thru_Nov_09 – UAH lower-troposphere temperature anomalies, 1979 through November 2009]

By: Luboš Motl

Because there has been some confusion – and maybe deliberate confusion – among some (alarmist) commenters about the non-existence of a statistically significant warming trend since 1995, i.e. in the last fifteen years, let me dedicate a full article to this issue.

I will use the UAH temperatures, whose final 2009 figures are effectively known by now (to sufficient accuracy) because UAH publishes the daily temperatures, too:

Mathematica can calculate the confidence intervals for the slope (warming trend) with concise commands, but first I will calculate the standard error of the slope manually.

x = Table[i, {i, 1995, 2009}]

y = {0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.2, 0.31, 0.28, 0.19, 0.34, 0.26, 0.28, 0.05, 0.26};

data = Transpose[{x, y}]


n = 15

xAV = Total[x]/n

yAV = Total[y]/n

xmav = x - xAV;

ymav = y - yAV;

lmf = LinearModelFit[data, xvar, xvar];

Normal[lmf]


(* http://stattrek.com/AP-Statistics-4/Estimate-Slope.aspx?Tutorial=AP *)

res = lmf["FitResiduals"];

slopeError = Sqrt[Total[res^2]/(n - 2)]/Sqrt[Total[xmav^2]]

The UAH 1995-2009 slope was calculated to be 0.95 °C per century. The standard error of this figure, calculated via the standard formula on the page above (note that it uses the residuals of the fit, not the raw deviations from the mean), is 0.84 °C/century. So the positivity of the slope is just a 1-sigma result – noise. Can we be more rigorous about it? You bet.
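For readers without Mathematica, the same arithmetic is easy to check in plain Python (a minimal transcription of the steps above, using only the numbers printed in this post and no external libraries):

```python
# OLS slope and its standard error for the UAH 1995-2009 annual anomalies,
# transcribed from the Mathematica code in the post.
x = list(range(1995, 2010))
y = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.2, 0.31, 0.28, 0.19,
     0.34, 0.26, 0.28, 0.05, 0.26]
n = len(y)

x_av = sum(x) / n
y_av = sum(y) / n
xmav = [xi - x_av for xi in x]   # deviations of x from its mean
ymav = [yi - y_av for yi in y]   # deviations of y from its mean

# Least-squares slope, in deg C per year
slope = sum(a * b for a, b in zip(xmav, ymav)) / sum(a * a for a in xmav)

# Standard error of the slope: the textbook formula uses the residuals
# of the fit (with centered data the intercept of the fit is zero)
res = [b - slope * a for a, b in zip(xmav, ymav)]
slope_err = (sum(e * e for e in res) / (n - 2)) ** 0.5 \
    / sum(a * a for a in xmav) ** 0.5

print(round(100 * slope, 2), round(100 * slope_err, 2))  # 0.95 0.84 (deg C/century)
```

The printed values reproduce the slope of 0.95 °C/century and a standard error of roughly 0.84 °C/century, consistent with the confidence intervals Mathematica reports below.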

Mathematica actually has compact functions that can tell you the confidence intervals for the slope:

lmf = LinearModelFit[data, xvar, xvar, ConfidenceLevel -> .95];

lmf["ParameterConfidenceIntervals"]

The 99% confidence interval is (-1.59, +3.49) °C/century. Similarly, the 95% confidence interval for the slope is (-0.87, 2.8) °C/century, and the 90% confidence interval is (-0.54, +2.44) °C/century. All these intervals contain both negative and positive numbers, so no conclusion about the sign of the slope can be drawn at the 99%, 95%, or even the 90% confidence level.

Only the 72% confidence interval for the slope touches zero. That means the probability that the underlying slope is negative equals half of the remaining 28%, i.e. a substantial 14%.
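These intervals are just slope ± t·SE, with Student-t quantiles for n − 2 = 13 degrees of freedom. A short Python check (the t quantiles are standard table values, hardcoded here to avoid a SciPy dependency; the standard error is the residual-based 0.8437 °C/century that reproduces Mathematica's intervals):

```python
slope, se = 0.95, 0.8437   # deg C per century, from the 1995-2009 fit
# Two-sided Student-t quantiles for 13 degrees of freedom (table values)
t13 = {"99%": 3.012, "95%": 2.160, "90%": 1.771}

intervals = {}
for level, t in t13.items():
    intervals[level] = (slope - t * se, slope + t * se)
    print(level, "CI: (%.2f, %.2f) deg C/century" % intervals[level])
# Each interval straddles zero, matching the quoted intervals up to rounding,
# e.g. 90%: (-0.54, 2.44)
```

Every interval straddles zero, which is exactly the point: the sign of the trend cannot be pinned down at any conventional confidence level.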

We can only say that it is “somewhat more likely than not” that the underlying trend in 1995-2009 was a warming trend rather than a cooling trend. Saying that warming since 1995 was “very likely” is already more than the data can support.

December 26, 2009 1:01 pm

What’s nice about this post – both the data & methods are posted for everyone to see & verify – wouldn’t it be nice if the CRU & GISS would do the same – like real scientists doing real research would do so that it can be verified (& if decisions are made on the basis of the analysis, those decisions can be made with confidence)

C Shannon
December 26, 2009 1:03 pm

This *should* be filed under things everyone knows. Sadly, thanks to climate hysterics, very few do know of it, because it undermines the “right” policy conclusions.
What’s worse is that an article such as this, presenting undeniable facts in a concise and easy-to-follow manner, isn’t likely to make a dent in the problem.
Still, the effort is greatly appreciated all the same. The truth will eventually prevail.

Disputin
December 26, 2009 1:10 pm

I am far from expert in statistics, but is this really valid? By including the obvious outlier of 1998 (El Niño), the SD is increased, thus widening the confidence intervals. While I would agree that in the long run the 1998 jump is just a part of the variability, over this restricted timescale it is a major anomaly.
But then what do I know?

EdB
December 26, 2009 1:17 pm

Says nothing about what humans might have caused. Totally meaningless imo.
I am betting on galactic cosmic rays, thank you very much.

tallbloke
December 26, 2009 1:18 pm

Heh, love it. Nice one Luboš.

Icarus
December 26, 2009 1:33 pm

The long-term warming trend is around 0.13C per decade according to the entire UAH record. What you should be calculating is whether there is any statistically significant deviation from that warming trend – otherwise you’re just grasping at straws.
http://www.woodfortrees.org/plot/uah/from:1979/to:2009/plot/uah/from:1979/to:2009/trend

Richard M
December 26, 2009 1:46 pm

Seems like I heard it from the AGWers that 15 years without warming would falsify the hypothesis. Of course, they would claim that the greater probability of warming is enough.

Steve Goddard
December 26, 2009 1:48 pm

Whatever the direction, the magnitude is much lower than IPCC forecasts.
http://www.bbc.co.uk/sn/hottopics/climatechange/climate_challenge/aboutgame_2.shtml
If The Nobel Prize winners were correct, temperatures should have risen over half a degree since 1995.

December 26, 2009 1:49 pm

Why start at 1995 instead of 1979?
Why not use monthly data?

Richard
December 26, 2009 1:50 pm

Melly Kalikimaka and a hearty thank you to Anthony and all his merry elvises who do such a great job at this crossroads of knowledge. Thanks for bringing us such nice lumps of carbon rich reading. Wishing you a great and prosperous New Year. Keep up the good work.

Schrodinger's Cat
December 26, 2009 1:50 pm

Good

Henry chance
December 26, 2009 1:59 pm

You are so mean to use facts to go up against emotional arguments. The polar bear extinction alone is over the top for data.

December 26, 2009 2:04 pm

Ah yes, let’s predict the future based on past data.
Firstly you have to give us some reasons to believe that your data are meaningful. Where does your satellite data come from? What sensor does the satellite use? Have satellite readings been calibrated against earth-based instruments? Does the satellite read surface temp? Tropospheric temp? Stratospheric temp? Top-of-the-atmosphere temp? Temp at noon? Temp at midnight? Or just temp at any old time? How do you know? How do the satellite readings change as the satellite sensors age? How many satellites are in your data? One? More? If more than one, how closely do the readings from the various satellites agree with each other?
Then we want to know the average and standard deviation of your data. How much noise (standard deviation is a measure of noise) is in your data? Is the noise level higher than any trend you might be seeing?
Looking at your code, it seems you are asking Mathematica to do a least-squares straight-line fit to your data. If we believe a straight line is a good approximation, then we can take the slope of the best-fit straight line as a trend.
Then you have to explain why you picked 1979 as a start point. Suppose I pick 1998 as a start point? I can get any answer I like if I can pick my start year. If I pick 1998, then I can say temperature has been declining since then. You have to explain your choice of start year.

Eve
December 26, 2009 2:08 pm

My heating fuel usage shows that there has been cooling since 1997, which is the oldest data I have. I will show fuel usage in Litres per year. I do not heat the house when it is warm therefore each year after 1997 must have been cooler. The furnace has a scheduled maintenance each Nov, the same two people live in the same house and the thermostat settings have not changed.
1997-2767.20 Litres
1998-3057.50 Litres
1999-4009.30 Litres
2000-3874.70 Litres
2001-3586.70 Litres
2002-3752.20 Litres
2003-3634.50 Litres
2004-4072.50 Litres
2005-3293.50 Litres
2006-4276.70 Litres
2007-3700 Litres
2008-4476.20 Litres

niphredilflower
December 26, 2009 2:10 pm

How long a period are we blamed for effecting the climate? I heard recently that CO2 is only blamed for the last 20 years… and the 1998 peak was due to an El Nino… If this is the case then surely we are only being blamed for a tenth or so of warming, of which is showing signs of returning.
If we return to temperatures of pre-blame… does that prove the build up of heat is gone? Or can they still claim that we have reduced cooling that would have occurred to a greater degree? – What happens if the heat in the climate system is not measured or has reversed in the next few years?

Bill H
December 26, 2009 2:11 pm

Get rid of the Hadley, CRU, MET, NASA, GISS noise and voilà….. Static normal cycle……
Whoda thunk it…?

niphredilflower
December 26, 2009 2:11 pm

– addition: tenth of a degree

Adam from Kansas
December 26, 2009 2:13 pm

They will probably say that this decade is the warmest ever by a significant amount, that is if you cut off the data somewhere after the 1930’s.
Also, Piers Corbyn is talking about a dangerous system developing that could be even worse than the one still rolling in the United States on Dec. 28-30.
http://www.iceagenow.com/Notably%C2%AD_dangerous_warning_for_28-30_Dec.htm
Despite Okla. City getting slammed hard, Wichita got off easy on that system, Weather Underground already has two days in a row with snow chances before New Year’s in their forecast column, we can’t dodge the worst parts of these snow storms forever.

Bill H
December 26, 2009 2:15 pm

And a scientist who shares his methods and equations…
Haven’t seen that from any of those so called scientists who scream the earth is melting…..

KevinB
December 26, 2009 2:16 pm

Hey man, don’t you know that the issue is settled? Al Gore told me so, so it must be true. The fact that he owns a piece of the largest carbon trading firm, and is actively pushing for cap and trade is not a conflict of interest, because he’s thinking of the children (and the polar bears).

PJB
December 26, 2009 2:17 pm

I am constantly dismayed by the “use” of statistical significance and relationship correlations in the media and by the public.
Even a 90% significance is weak by statistical standards. (95% is just enough to be reasonable.)
When a coefficient of correlation (linear regression) is less than 0.9, there is barely more than a “trend” established. When they start to use log plots, almost anything starts to look like a straight line relationship.
Forget about causation. Correlation means that factors demonstrate similar effects. Not that one causes the other.
Forget about cropping and truncating the graphs or picking a range of data that only looks at a short (for the event) time-frame.
When you start to throw in “adjusted” data that mysteriously suits an agenda…
Samuel Clemens had it right. Lies, damn lies and statistics….
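[PJB's 0.9 threshold is easy to check against the series in the post. A minimal Python sketch (data copied from the article) computes the correlation coefficient of the 1995-2009 UAH anomalies against time:]

```python
# Pearson correlation of the UAH 1995-2009 annual anomalies against the year
x = list(range(1995, 2010))
y = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.2, 0.31, 0.28, 0.19,
     0.34, 0.26, 0.28, 0.05, 0.26]
n = len(y)
xm = [xi - sum(x) / n for xi in x]
ym = [yi - sum(y) / n for yi in y]

# r and the fraction of variance explained by the linear trend
r = sum(a * b for a, b in zip(xm, ym)) \
    / (sum(a * a for a in xm) * sum(b * b for b in ym)) ** 0.5
print(f"{r:.2f} {r * r:.2f}")  # r ~ 0.30, R^2 ~ 0.09
```

[The correlation is about 0.30, nowhere near 0.9: the linear trend explains less than a tenth of the year-to-year variance in this short series.]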

Graeme W
December 26, 2009 2:18 pm

Which is why people who want to spin the numbers are now saying that the last decade is the warmest on record, rather than temperatures are getting hotter.
You can spin almost anything using statistics. It’s nice to see them being used properly for a change, though the title of the article is spin because it implies there’s no warming trend. The closing statement of the article isn’t that definite. It merely says that we can’t say that a warming trend is “very likely”.

Kevin
December 26, 2009 2:20 pm

I believe the issue is clouded by earlier analysis that proved there _was_ statistically significant warming for a time interval ending in 2000. Menzie Chinn, an economist who publishes on the site Econbrowser, replicated that math. He’s something of a Democratic partisan, but surely has command of the math.
Like all of these experiments, the results depend on what data you choose to work with.

kadaka
December 26, 2009 2:21 pm

And what do things look like without the 1998 spike?
How much of the heating of the Pacific Ocean is due to underwater volcanic activity? After all Al Gore informed us it is millions of degrees just a few kilometers down so volcanic activity must be a potentially significant source of heat. The global warming theories are concerned with how CO2, with the positive feedback mechanisms, traps solar energy. Shouldn’t heating from below the surface then be discounted somehow when figuring warming based on atmosphere-based reasons? Thus if El Nino is tied to such from-the-Earth heating, shouldn’t 1998, involving a strong El Nino event, be discounted when calculating the trends?

Rob Vermeulen
December 26, 2009 2:22 pm

The trend is non-significant only because the poster used the average yearly anomaly. Taking every month into account, the trend becomes statistically more significant.

pwl
December 26, 2009 2:23 pm

It’s interesting to note that during 1998 the Sun was in an angry mood (high sun spots) as this video shows: http://pathstoknowledge.net/2009/12/20/the-constant-and-never-changing-sun-cant-influence-climate-on-earth
Given that the sun spot maximum occurred near the peak temperature in 1998 one wonders what influence it had (and has). Seems to be potentially correlated.
Would anyone care to expand upon this possible correlation? Luboš Motl? Anthony? Steve?

DirkH
December 26, 2009 2:28 pm

“Graeme W (14:18:37) :
Which is why people who want to spin the numbers are now saying that the last decade is the warmest on record, rather than temperatures are getting hotter.
[…]”
They can say that because the past (according to GISTEMP) is getting colder all the time. Proof here:
http://wattsupwiththat.com/2008/11/14/the-evolution-of-the-giss-temperature-product/

Scarlet Pumpernickel
December 26, 2009 2:31 pm

Is all that snow going to push the chart down, runaway cooling magnified with all that white ice in the northern hemisphere!

December 26, 2009 2:43 pm

But – but – Tamino just a few weeks back posted charts where his linear trend line technique definitely indicated ‘warming’ in the recent past; it seemed to leave ‘no doubt’ in the minds of those who posted that there was positively *no warming*.
He even received *rave* reviews on his site for doing so (albeit from an ever-so select group of sycophants known for their slavish praise) –
Was he just “hiding the decline” in his presentation?
.
.
.

Ano
December 26, 2009 2:44 pm

This kind of thing was inevitable from the moment the EPA was allowed, back in 1993, to get away with using a 90% confidence interval to claim a barely statistically significant risk of lung cancer in non-smokers from environmental tobacco smoke (based on a cherry-picked selection of meta-studies, no less).
When there was no outcry from the statistical or scientific communities, but instead vast attention from the media, the abuse and manipulation of statistics went into overdrive. Disraeli’s admonition that “There are three kinds of lies: lies, damned lies and statistics” is now cited as a slightly amusing saying, rather than the caution it was intended as.
I don’t mean to change the subject of this thread to second-hand smoke: just to indicate that this was the first example I can recall of widespread statistical abuse which attracted massive media attention, grants for its purveyors and a government response disproportionate to the problem. The thin edge of the wedge, so-to-speak, and a template followed closely by the AGWers.

December 26, 2009 2:45 pm

1st paragraph should have ended thusly: “that there positively was *warming*.”
.
.
My bad.
.

December 26, 2009 3:07 pm

Lubos,
I did a post that includes your review of “Red Hot Lies”: Prostitution Services.

December 26, 2009 3:12 pm

I’d like to know what happens when you factor in the significant autocorrelation in the residuals. That would make the sigmas much wider and the statistical significance smaller.

Editor
December 26, 2009 3:14 pm

Bret (13:49:22) :
> Why start at 1995 instead of 1979?
I suspect the short answer is that UAH temperature were going up during that warm PDO phase.
Hmm, tweaking the first few lines from a Robert Woods graph, I get no warming between 1979 and 1995, see http://www.woodfortrees.org/plot/uah/mean:4/plot/uah/from:1979/to:1995/trend/plot/uah/from:1995/trend/plot/uah/from:1997.5/trend/plot/uah/from:1992/to:1999/trend
Unfortunately, WFT doesn’t have a standard deviation function, but you can take its list of data points and put them into a spreadsheet. Time for me to make dinner.
Unfortunately, the temperature data stream is so noisy that just looking at temperature doesn’t provide much guidance.

rbateman
December 26, 2009 3:16 pm

0.5C might seem like a lot of change, but convert that to degrees K and what does that now represent in percentage of total?
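[As a rough answer to rbateman's question, assuming a conventional global mean surface temperature of about 288 K (the exact figure is not in the post):]

```python
delta = 0.5      # temperature change in deg C (the same size as 1 K)
t_mean = 288.0   # assumed global mean surface temperature, in kelvin
print(f"{100 * delta / t_mean:.2f}%")  # ~0.17% of the absolute temperature
```

[So a 0.5 °C change is under two parts per thousand of the absolute temperature.]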

u.k.(us)
December 26, 2009 3:18 pm

pwl (14:23:55) :
It’s interesting to note that during 1998 the Sun was in an angry mood (high sun spots) as this video shows: http://pathstoknowledge.net/2009/12/20/the-constant-and-never-changing-sun-cant-influence-climate-on-earth
Given that the sun spot maximum occurred near the peak temperature in 1998 one wonders what influence it had (and has). Seems to be potentially correlated.
Would anyone care to expand upon this possible correlation? Luboš Motl? Anthony? Steve?
=====================================
To me, it seems the correlation is clear (probably because I don’t know any better).
My question is: what’s causing the sun’s changes?

Tom in Florida
December 26, 2009 3:20 pm

Eve (14:08:45) : “My heating fuel usage shows that there has been cooling since 1997, which is the oldest data I have. I will show fuel usage in Litres per year. I do not heat the house when it is warm therefore each year after 1997 must have been cooler. The furnace has a scheduled maintenance each Nov, the same two people live in the same house and the thermostat settings have not changed.
1997-2767.20 Litres
1998-3057.50 Litres
1999-4009.30 Litres
2000-3874.70 Litres
2001-3586.70 Litres
2002-3752.20 Litres
2003-3634.50 Litres
2004-4072.50 Litres
2005-3293.50 Litres
2006-4276.70 Litres
2007-3700 Litres

As your house has aged have the windows been checked for air leakage?
How about the doors? What about insulation? What about burner efficiency declining with age? Perhaps the thermstat needs adjusting?
You see one can fail to take into account ALL possible reasons why something happens and come to potentially false conclusions.
Same with the simplistic approach that all warming is CO2 induced.

December 26, 2009 3:20 pm

Although 15 years sounds like an attractive interval, this is cherry picking. 1999 to 2009 will give you statistical warming at 90%, and 1979 onwards gives warming, I think at 95% confidence using UAH, if I remember a post by Lucia correctly. I confess I prefer Lucia’s 2001 start, which has the merit of being “when the IPCC made its forecast” if we are looking at recent trends. I can’t see the rationale for 1995.

Simon Platt
December 26, 2009 3:22 pm

Hey,
Excluding the El-Nino year the best estimate for the mean rate of temperature increase using Motl’s approach is 1.5 degrees per century, with a 95% confidence interval between 0.2 degrees per century and 2.9 degrees per century. But the linear model is not good, as reflected in the large uncertainty in the slope of the best fit, so probably none of these numbers could be trusted.
So it seems to me that these data tell us nothing – or at least nothing more than the fact that they can’t be modelled by a constant rate of increase. I suppose that’s almost what Motl is saying, although not quite.
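[Simon Platt's figures are straightforward to reproduce: a sketch that simply drops the 1998 point and reruns the same least-squares arithmetic as the post (the 95% t quantile for 12 degrees of freedom is a standard table value):]

```python
# Refit the UAH 1995-2009 trend with the 1998 El Nino year excluded
x = [yr for yr in range(1995, 2010) if yr != 1998]
y = [0.11, 0.02, 0.05, 0.04, 0.04, 0.2, 0.31, 0.28,
     0.19, 0.34, 0.26, 0.28, 0.05, 0.26]   # 1998 value (0.51) removed
n = len(y)
xm = [xi - sum(x) / n for xi in x]
ym = [yi - sum(y) / n for yi in y]

slope = sum(a * b for a, b in zip(xm, ym)) / sum(a * a for a in xm)
res = [b - slope * a for a, b in zip(xm, ym)]
se = (sum(e * e for e in res) / (n - 2)) ** 0.5 / sum(a * a for a in xm) ** 0.5

t12 = 2.179  # two-sided 95% Student-t quantile, 12 degrees of freedom
lo, hi = 100 * (slope - t12 * se), 100 * (slope + t12 * se)
print(f"slope {100 * slope:.1f}, 95% CI ({lo:.1f}, {hi:.1f}) deg C/century")
```

[This reproduces the quoted numbers: a slope of about 1.5 °C/century with a 95% interval of roughly (0.2, 2.9), which no longer includes zero, illustrating how much the single 1998 point drives the significance question.]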

Paul
December 26, 2009 3:39 pm

I believe that this posted analysis is flawed. You cannot apply this type of simple linear trend analysis to serially correlated data, since the precision of the parameter estimates is strongly a function of the autocorrelation in temperature data.

Tilo Reber
December 26, 2009 3:47 pm

Lubos should post his results at Tamino’s place, since Tamino is so much in love with statistical sophistry.

December 26, 2009 3:54 pm

Ric Werme (15:14:24) : “I get no warming between 1979 and 1995”
That’s because there is none, at least for the satellite based series.

photon without a Higgs
December 26, 2009 3:55 pm

…both the data & methods are posted for everyone to see & verify – wouldn’t it be nice if the CRU & GISS…
A whistleblower would be nice too.

December 26, 2009 4:17 pm

A great site to refer to anytime statistically significant trends is mentioned is
Cherry-Pickers Guide to Global Temperature Trends.
“..how global temperature trends look across different datasets and choice of starting dates..”
http://rogerpielkejr.blogspot.com/2009/10/cherry-pickers-guide-to-global.html

ShrNfr
December 26, 2009 4:26 pm

I’ll pick AMO for some warming through 2005 and then for cooling through around 2040. At least in the northern hemisphere.

ShrNfr
December 26, 2009 4:33 pm

Predicted lows for Boston on Tuesday are of the order of 8 degrees F. Hey Hansen, want to come up my way? It’s only a couple blocks from the Charles River and we could have a nice invigorating dip.

crosspatch
December 26, 2009 4:33 pm

The long-term warming trend is around 0.13C per decade according to the entire UAH record.

Yes, and plot the same since 2002 and you see a rather dramatic cooling trend since that time. If you go all the way back to 0 AD, we are in a cooling trend. Picking a year when an apparent cycle of natural warming started doesn’t really mean much. Check back in another 30 years when we have a full Pacific cycle in the records.

December 26, 2009 4:35 pm

I have more of an issue with the whole approach.
If at the end of 2009 there is less heat in the climate system (oceans, atmosphere, soil, etc) than at the end of 2008, then the earth has cooled in that year.
It’s not “noise”. It has cooled and there IS a reason. Trying to fit chaotic systems to linear trends or least squares regression or […] is something I don’t really understand.
Across 1,000,000 years the earth has cooled. In 20,000 years the earth has warmed. In 1,000 years possibly cooled (let’s not debate MBH 98 etc). In 150 years warmed. In 10 years cooled/warmed depending on dataset.
Some processes take place over millions of years – mountains being formed and continents moving. Some processes take place over 20,000/100,000 years – changes in the earth’s orbital eccentricity, precession etc. Some processes take place over 1000’s of years – ocean currents and salinity changes. Some take place over years, months and days.
Trying to form a pattern and make something of that pattern? Interesting intellectual exercise but why? And does it mean anything in physical terms? Finding patterns and trying to fit to physical processes makes sense only to help our scientific understanding.
So if the earth cools in a year it has cooled. It’s not a statistical blip. The question is why. “Natural variation?” – why? What is the process?
Check out: http://scienceofdoom.com/2009/12/19/is-climate-more-than-weather-is-weather-just-noise/

Frank K.
December 26, 2009 4:35 pm

One question that I haven’t heard a good answer for is this: Why do people like to attach ** linear ** trends to processes (like the behavior of the climate) that are very ** non-linear **?

docattheautopsy
December 26, 2009 4:41 pm

The reason I’m more convinced by the historical data derived from ice core samples in Greenland and Antarctica is the reach you have with the data. Having a “warming trend” that’s built on data from 1979-1998 is certainly alarming. Taking such a small sample from a geological time scale and drawing worldwide impact of industrial pollution from a mere 100 years of data while discarding data carefully gathered on the past 500,000 years is absurd.
What’s worse is taking computer models of a system that’s not completely understood and applying it to the system as a measure of accurate prediction. I liken it to a man believing Elvis will return in 2012 because four separate psychics agree that Elvis will return in 2012.
Climatology relies on observation, not experimentation, and while we know more about climate, nobody can say with scientific certainty that manmade CO2 emissions are the only cause of warming. Experimentation relies on controlling all variables in the system, and that’s something we can’t reliably replicate.
This statistical analysis is nice, but again, it only works over a small time range. The discussion on confidence intervals is certainly valid, and the conclusion is expected, but we should beware such small data sets in climatological data to prove our points.
Also, I’d like to see 1998 compared to a 1979-2009 range, an 1800-2009 range, and an 1800-2009 range to see if the abnormally warm year can be statistically excluded, as there were certainly warm and cold years in the past 200 years.

DirkH
December 26, 2009 4:42 pm

O/T Hansen has a book to sell.
http://www.latimes.com/entertainment/news/la-ca-james-hansen27-2009dec27,0,5460299.story
“”Storms of My Grandchildren: The Truth About the Coming Climate Catastrophe and Our Last Chance to Save Humanity” by James Hansen
Did he miss the Christmas gift market, or is the critic late? It says 27th Dec.

royfomr
December 26, 2009 4:43 pm

Think it’s about time that the precautionary principle was resurrected.
I and many others, in the uk, not only survived the searing summer of 76 but rather enjoyed it!
No doubt that, at that time, many were worried about Global Cooling. Forget the feeble history re-writers; the next ice age was the biggest threat to mankind ever. Those who say that this was not what science said at the time are as guilty of revisionist history as the current Iranian leader and his drinking buddies.
Climate confounds the senses, climate cabals confuse the science and politicians carry on doing what they’ve always done.
Adding to the paradox that gave us hot summers during the seventies, I or is that we, got cold, icy winters.
We/I/us survived! Open fireplaces upon which we combusted anything combustible gave us heat and light. And at prices we, the financially disadvantaged, could afford.
Not now. When the electricity and the gas stop, thousands will die. Not of drowning, heat prostration or computer predictions but of cold-induced metabolic disfunctioning!
To any politicos who may be out there, get a grip guys, a reducing population will diminish the tax returns, cut down deadly pollutants, provide more Lebensraum for Tuvaluans and Scuba- dressed Maldivians but is that how you want to be remembered?
Coming back to the precautionary principle, if the Scientific Consensualists turn out to have been wrong or frauds and we’re taking precisely the wrong measures then how are you going to explain that to your grandkids?

DocMartyn
December 26, 2009 4:45 pm

I have yet to be informed why an increase in CO2 should cause a zero-order (straight-line) increase in global temperature. The idea that one can just fit a straight line without having an underlying hypothesis is nonsense. You do not fit to the line which gives you the best R2 value; if that were the case, everything would be fitted to polynomials. No, you fit to a modelled relationship between X and Y. CO2 increases are not linear, therefore the line shape of temperature should not be linear; instead, plot temperature against [CO2], or log[CO2], or at least some function of [CO2]; anything else is bollocks. The whole point of the exercise is to find any relationship between [CO2] and temperature, so please, please, do not take your eye off the ball.

Basil
Editor
December 26, 2009 4:47 pm

Paul (15:39:43) :
I believe that this posted analysis is flawed. You cannot apply this type of simple linear trend analysis to serially correlated data, since the precision of the parameter estimates is strongly a function of the autocorrelation in temperature data.

In this case, not strongly enough to change the outcome. Using a robust method for computing standard errors, the confidence interval still includes 0: -0.55 to 2.45 °C per century. The standard error drops from 0.0084 to 0.0069, raising the t-ratio from 1.13 to 1.37, but still far from indicating a trend significantly different from zero.
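[One simple first-order version of such a correction (a sketch of the idea, not necessarily the exact robust estimator Basil used) rescales the OLS standard error by sqrt((1 + r1) / (1 - r1)), where r1 is the lag-1 autocorrelation of the fit residuals. For these particular residuals r1 comes out slightly negative, so the adjusted error actually shrinks, consistent with the numbers quoted:]

```python
# AR(1)-style adjustment of the slope's standard error for the
# UAH 1995-2009 annual anomalies from the post
x = list(range(1995, 2010))
y = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.2, 0.31, 0.28, 0.19,
     0.34, 0.26, 0.28, 0.05, 0.26]
n = len(y)
xm = [xi - sum(x) / n for xi in x]
ym = [yi - sum(y) / n for yi in y]
slope = sum(a * b for a, b in zip(xm, ym)) / sum(a * a for a in xm)
res = [b - slope * a for a, b in zip(xm, ym)]  # fit residuals

se = (sum(e * e for e in res) / (n - 2)) ** 0.5 / sum(a * a for a in xm) ** 0.5
# Lag-1 autocorrelation of the residuals
r1 = sum(res[i] * res[i + 1] for i in range(n - 1)) / sum(e * e for e in res)
se_adj = se * ((1 + r1) / (1 - r1)) ** 0.5

print(f"{r1:.2f}")                 # ~ -0.18, slightly negative
print(f"{se:.4f} {se_adj:.4f}")    # ~0.0084 vs ~0.0070 deg C/yr
```

[With r1 ≈ −0.18 the adjusted standard error falls from about 0.0084 to about 0.0070 °C/yr, close to Basil's 0.0069, and the trend remains statistically indistinguishable from zero either way.]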

DirkH
December 26, 2009 4:51 pm

Ok the critic was late. Now that’s dumb. Promoting the book after it’s intended market window. Maybe there were too many leftovers in the shops.

Curiousgeorge
December 26, 2009 4:52 pm

Anthony, as a statistician you should know better. With enough data points, anything can be “statistically significant”. Besides that, it’s a subjective comment anyway.

December 26, 2009 4:52 pm

I have a MS in stats. That doesn’t make me an expert or anything, but I do know a little about stat methodology. The analysis above is very simplistic for the following reasons:
1. It fails to consider the measurement error. Each data point is accepted as “true” and without error. That’s a giant leap of faith and unsupported by the evidence.
2. The analysis includes a calculation of parametric error based on the Normal (Gaussian) distribution. The data are not normal, not independent, and are not the product of infinite replications.
3. The analysis fails to consider probabilistic (Bayesian) error.
Since all those varieties of error exist and are not accounted for, the precision and accuracy are much, much less than assumed. Hence the real confidence intervals are much, much wider.
As Dr. Briggs [http://wmbriggs.com] likes to say, people are way too certain of themselves!

December 26, 2009 4:53 pm

In response to Icarus’s challenge to compare 1979-2009 and 1995-2009, I’ve plotted it.
1995-2009 has a lower slope. The rate of increase in temperature since 1979 is slowing.
http://www.woodfortrees.org/plot/uah/from:1979/to:2009/plot/uah/from:1979/to:2009/trend/plot/uah/from:1995/to:2009/trend

December 26, 2009 4:56 pm

Many Christmas thanks to Anthony Watts and Luboš Motl, Steve McIntyre and all other defiant free thinking global warming skeptics and global warming debunkers. You all can proudly tell your grandchildren that you made a difference. A very big difference
You have exposed a lie that’s more dangerous than the Lysenko lie because this one is global
Plus seasons greetings and eternal gratitude to the CRU leaker cum whistle blower

photon without a Higgs
December 26, 2009 4:57 pm

Even if trying to follow the math gives you a headache, then just watch the weather on TV. There have been longer winters the last 3 years around the world. These were never predicted by the global warming scientists.
For example: this past week from Dec 19–25 there were 846 snowfall records in the lower 48 of the U.S. And winter is just getting going.
Much ado about nothing from them scientists!

DRE
December 26, 2009 4:59 pm

Clear description of analysis — CHECK
Analysis code provided — CHECK
Data used adequately identified — CHECK
Clearly not the work of a Professional Climate Scientist so it can be safely ignored.

December 26, 2009 4:59 pm

Why not just graft this on to the GISS figures starting in 1979? The only problem is Hansen will keep adjusting the prior figures down to fabricate an upward trend.

December 26, 2009 5:00 pm


Paul (15:39:43) :
I believe that this posted analysis is flawed. You cannot apply this type of simple linear trend analysis to serially correlated data, since the precision of the parameter estimates is strongly a function of the autocorrelation in temperature data.

I believe Tamino did this exact ‘trick’ a few weeks ago (and received *praise* for it) –
– Does this/would this caveat apply to him as well?
i.e., his posted analysis was flawed: … cannot apply this type of simple linear trend analysis to serially correlated data, since the precision of the parameter estimates is strongly a function of the autocorrelation in temperature data.
Comment?
.
.
.

December 26, 2009 5:04 pm

niphredilflower: You wrote, “How long a period are we blamed for effecting the climate? I heard recently that CO2 is only blamed for the last 20 years…”
It really depends on the study and who’s trying to prove what. I’ve also recently run across studies citing anthropogenic influence since the mid-1800s. And I can recall one that attempted to show man’s influence for 1000s of years.

J.Hansford
December 26, 2009 5:16 pm

Icarus (13:33:14) :
The long-term warming trend is around 0.13C per decade according to the entire UAH record. What you should be calculating is whether there is any statistically significant deviation from that warming trend – otherwise you’re just grasping at straws.
——————————————————-
No Icarus, 15 years in which temperature has not responded to increasing levels of CO2 is contrary to the hypothesis of AGW…… The 0.13 °C per decade for the 30-year UAH satellite record is well within the range of natural temperature variation…… The AGW hypothesis is flawed.

photon without a Higgs
December 26, 2009 5:20 pm

johna1800 (16:53:04) :
In response to Icarus’s challenge to compare 1979…
—————————————-
Talk to Icarus about the Medieval Warm Period. He wants longer data sets. 1000 year data set should do the trick.

grumpy old man
December 26, 2009 5:21 pm

Rob Vermeulen (14:22:28) :The trend is non-significant only because the poster used the average yearly anomaly. Taking every month into account, the trend becomes statistically more significant.
Very right. Our goal here should be to understand, not try to advocate for one position or another.

Manfred
December 26, 2009 5:23 pm

David Starr (14:04:33) :
“Have satellite readings been calibrated against earth based instruments?”
That is an interesting question. Are satellite data in any way contaminated by a calibration to ground-based data?
Does poor ground-based data adjustment, station cherry-picking, possible fabrication, warming bias in data-set splicing, or lack of UHI adjustments also contaminate satellite readings?

December 26, 2009 5:23 pm

kadaka (14:21:52) : You asked, “And what do things look like without the 1998 spike?”
You can’t really remove it, though many have tried. There are multiyear aftereffects of the 1997/98 El Nino.
http://bobtisdale.blogspot.com/2009/11/more-detail-on-multiyear-aftereffects_26.html
and:
http://bobtisdale.blogspot.com/2009/12/more-detail-on-multiyear-aftereffects.html
You asked, “How much of the heating of the Pacific Ocean is due to underwater volcanic activity?”
Little. Refer to Emile-Geay and Madec (2009).
http://www.ocean-sci.net/5/203/2009/os-5-203-2009.pdf
They write, “Of course, the deep ocean is subjected to another heat source: the geothermal flux due to lithospheric cooling. Yet the latter is usually neglected in oceanographic studies, primarily because it amounts to less than 2% of surface heat fluxes (Huang, 1999) – a total power of 0.03 PW and a mean flux of ∼88 mW m−2 (Stein and Stein, 1992), while surface fluxes are around 30 to 250 W m−2, larger by three orders of magnitude.”

December 26, 2009 5:27 pm

Following on from DocMartyn..
Looking for statistical significance between CO2 and temperature according to say a log function (or whatever it exactly is) would have a point. But even to the modelling community with a huge confidence in their models (vast overconfidence I believe) it isn’t a simple relationship because of all the feedbacks in the climate system.
I agree with DocMartyn’s main point – you need a hypothesis to test against. Just plotting a graph and fitting a linear trend is a pointless exercise.
I believe this crazy concept has come about because of the idea that “natural variation” is “noise”. Therefore, we are trying to see the “signal” without the “noise”.
On that view – flawed beyond belief, in my opinion – it would make sense to try to find the real signal.
But where’s the body of evidence and testable theories that build the case that year to year changes in the earth’s temperature (and more importantly, heat) are “noise”?
I’m not a climate scientist so maybe I missed it.

Bernd Felsche
December 26, 2009 5:37 pm

Rule of thumb when spending your own money: Keep the money in your pocket unless there’s a 99% confidence in the desired outcome by spending the money.
Rule of thumb when spending somebody else’s money: 50:50 sounds like a good idea.

Michael Jankowski
December 26, 2009 5:47 pm

“Tilo Reber (15:47:08) :
Lubos should post his results at Tamino’s place, since Tamino is so much in love with statistical sophistry.”
——————————————————
Good one.
Tamino would either:
(a) not allow it to be posted
(b) allow it to be posted, come up with some reason to blindly dispute the results (e.g., suggest the period was cherry-picked), and keep Lubos or anyone from defending it all the while getting his chorus of flute blowers

Basil
Editor
December 26, 2009 5:50 pm

grumpy old man (17:21:10) :
Rob Vermeulen (14:22:28) :The trend is non-significant only because the poster used the average yearly anomaly. Taking every month into account, the trend becomes statistically more significant.
Very right. Our goal here should be to understand, not try to advocate for one position or another.

Luboš was probably just trying to be as simple and straightforward as possible. In any case, the argument that taking monthly data into account would make the trend become “statistically more significant” is without merit. It is still not significantly different from zero. Using monthly data, the trend is 0.93 (in °C per century), with a 95% confidence interval of -0.22 to 2.08.
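[Editor's note: the annual-data confidence interval quoted in the article can be checked without Mathematica. A minimal Python sketch, using the 15 annual UAH anomalies quoted in the post and the textbook residual-based standard error of the slope (which comes out ≈0.84 °C/century here, a touch below the 0.88 quoted in the article):]

```python
# Reproducing the annual 1995-2009 UAH trend and its 95% CI by hand,
# using the annual anomalies (°C) quoted in the article.
years = list(range(1995, 2010))
anoms = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.20, 0.31,
         0.28, 0.19, 0.34, 0.26, 0.28, 0.05, 0.26]

n = len(years)
x_av = sum(years) / n
y_av = sum(anoms) / n
sxx = sum((x - x_av) ** 2 for x in years)
sxy = sum((x - x_av) * (y - y_av) for x, y in zip(years, anoms))
slope = sxy / sxx                      # °C per year

# Residual sum of squares and the standard error of the slope
sse = sum((y - y_av - slope * (x - x_av)) ** 2
          for x, y in zip(years, anoms))
se = (sse / (n - 2)) ** 0.5 / sxx ** 0.5

t_crit = 2.1604                        # two-tailed 95% point of t, n-2 = 13 d.o.f.
lo, hi = slope - t_crit * se, slope + t_crit * se

print(f"trend:  {100 * slope:.2f} °C/century")    # ~0.95
print(f"95% CI: ({100 * lo:.2f}, {100 * hi:.2f})")  # ~(-0.87, 2.77), straddles zero
```

The interval matches the article's quoted 95% range of roughly (-0.87, 2.8) °C/century and contains zero, which is the whole point of the post.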

Gillian Lord
December 26, 2009 5:51 pm

Eve (14:08:45)
Maybe you are just getting older. I notice that I need more heating in my old age.

photon without a Higgs
December 26, 2009 5:53 pm

the 846 snowfall records referred to before:
http://mapcenter.hamweather.com/records/7day/us.html?c=snow

Tenuc
December 26, 2009 5:57 pm

Here’s some examples of trends from Roger Pielke Jr’s blog:-
“Knappenberger provides some robust conclusions as well as some examples of creative cherrypicking:”
• For the past 8 years (96 months), no global warming is indicated by any of the five datasets.
• For the past 5 years (60 months), there is a statistically significant global cooling in all datasets.
• There has been no (statistically significant) warming for the past 13 years. [Using the satellite records of the lower atmosphere.]
• The globe has been cooling rapidly for the past 8 years. [Using the CRU and satellite records.]
• Global warming did not ‘stop’ 10 years ago, in fact, it was pretty close to model projections. [Using the GISS and NCDC records beginning in 1998 and 1999]
• Global warming is proceeding faster than expected. [Using the GISS record starting in 1991 or 1992—the cool years just after the volcanic eruption of Mt. Pinatubo]
• For the past 15 years, global warming has been occurring at a rate that is below the average climate model expected warming
We know that temperature is the result of the large number of processes that drive our climate system. We also know that climate exhibits all the attributes of deterministic chaos.
So the real question is: why do climate scientists still try to find trends in non-linear data, when these produce answers which are meaningless?

Baa Humbug
December 26, 2009 6:19 pm

Why are posters referring to 1998? Wasn’t 1998 just another year in the variability of climate? The fact that it influences the “confidence interval” is immaterial. 1998 is what it is and can’t be changed.
As regards start years, the first question to answer is: what shape of slope do you want? Going back over time, anyone can produce any shape of “slope” graph they wish.
All this minute detail of the Mathematica work may be a fun exercise in itself, but it is irrelevant to climate, as shown in the above paragraph.
It became irrelevant once the infamous hockey stick was shown to be a lie. There is nothing unusual about the last 5-10-50-100 years.
By all means debate the maths, but don’t make any assumptions about the climate.

photon without a Higgs
December 26, 2009 6:22 pm

856 snowfall records, in The Lower 48, Dec 19–27
HAMweather:
http://mapcenter.hamweather.com/records/7day/us.html?c=snow
I wanted to put this in ‘Tips’ but it’s not available now
p.s. map is for previous 7 days and will change soon, so record total will change

photon without a Higgs
December 26, 2009 6:23 pm

photon without a Higgs (18:22:18)
oops, 854, not 856

Paul Vaughan
December 26, 2009 6:27 pm

Mike D. (16:52:48) is the only contributor who passes an in-page search for “assum” (assume, assumed, assumption, assumptions, etc.)
The assumption of randomness is absolutely ridiculous.
However, alarmists rely heavily on such unrealistic assumptions, so Luboš Motl certainly has good reason for firing such a shot.

Richard M
December 26, 2009 6:29 pm

Icarus, using 15 years is what was put forth by the AGW crowd. Lubos is using their definition. Why would he use anything but the last 15 years?
Rob, the reason we use years is because that is the only unit reasonable from a climate perspective. A year represents the smallest time period where inter-year variances cancel out. Even years are probably too fine (a better unit might be PDO full cycles).

Sydney Sceptic
December 26, 2009 6:32 pm

Hmm… you know that if you draw a trend line from the start of that graph to the end of the graph, you get 0.1 degree C per decade warming?
Quick, better break out the sunblock, we’re seeing a deviation from the ‘normal’ 1 degree per century or .1 degree per decade of.. oh gawsh.. 0.0 degrees?
AGW. Where is it?

DirkH
December 26, 2009 6:42 pm

“photon without a Higgs (18:22:18) :
856 snowfall records, in The Lower 48, Dec 19–27”
Compare Hansen’s weather forecast from 1988:

kdkd
December 26, 2009 6:44 pm

This analysis is flawed. Any estimate of short-term trends that relies on a single date as the start point is worthless. For this kind of thing to be valid, you’d want to bootstrap multiple start points and different time periods.
However, a recent paper from Energy and Environment (“TREND ANALYSIS OF SATELLITE GLOBAL TEMPERATURE DATA”, Loehle, C.) showed that the effect of 1998 is quite strong in terms of producing a short-term cooling trend, but as time moves on from 1998 this trend gets smaller, showing it’s merely an outlier.
The paper also shows that a meaningful timeframe for detecting statistically significant trends is in the region of 15 years plus. Of course, the authors use much more sophisticated statistical analysis than Motl’s, and being from Energy and Environment, the scientific conclusions are over-extended, but there is some useful stuff there. For Motl’s methodology, I’d be unhappy with anything less than 20-25 years.
[snip]
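[Editor's note: the simplest version of kdkd's start-point sensitivity check can be illustrated with the annual UAH anomalies quoted in the article — refit the OLS trend for every start year from 1995 to 2004, all windows ending in 2009. A real bootstrap would also resample the data; this sketch only shows how much the start date matters:]

```python
# Start-date sensitivity of the fitted trend, using the annual UAH
# anomalies (°C) quoted in the article. Each window ends in 2009.
anoms = {1995: 0.11, 1996: 0.02, 1997: 0.05, 1998: 0.51, 1999: 0.04,
         2000: 0.04, 2001: 0.20, 2002: 0.31, 2003: 0.28, 2004: 0.19,
         2005: 0.34, 2006: 0.26, 2007: 0.28, 2008: 0.05, 2009: 0.26}

def ols_slope(pairs):
    """Ordinary least-squares slope for (year, anomaly) pairs."""
    n = len(pairs)
    x_av = sum(x for x, _ in pairs) / n
    y_av = sum(y for _, y in pairs) / n
    sxx = sum((x - x_av) ** 2 for x, _ in pairs)
    sxy = sum((x - x_av) * (y - y_av) for x, y in pairs)
    return sxy / sxx

trends = {}   # start year -> trend in °C/century
for start in range(1995, 2005):
    window = [(yr, a) for yr, a in anoms.items() if yr >= start]
    trends[start] = 100 * ols_slope(window)
    print(start, f"{trends[start]:+.2f} °C/century")
```

Starting in 1998 the fitted trend comes out slightly negative; starting one year later it exceeds +1 °C/century. With only 15 points, the answer is largely a choice of start date, which is kdkd's point.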

Basil
Editor
December 26, 2009 6:44 pm

Richard M (18:29:19) :
Icarus, using 15 years is what was put forth by the AGW crowd. Lubos is using their definition. Why would he use anything but the last 15 years?
Rob, the reason we use years is because that is the only unit reasonable from a climate perspective. A year represents the smallest time period where inter-year variances cancel out.

There is some poor reasoning here. The comment might have merit, if the interannual variances were random, but they are not. There are systematic seasonal variations throughout the year, and those are not without some interest to the study of climate. Even if the interest is in yearly changes, we can get to those using monthly data (seasonal differencing, or moving averages, for example). No need to throw out monthly data. Truth is, canceling out the interannual variance makes climate seem a lot less variable than it really is.

Tom P
December 26, 2009 7:00 pm

Luboš Motl,
The monthly data from UAH is readily available and can be used to provide a much better estimate of the confidence limits – we now have 180 monthly points rather than 15 averaged annual points to look at the variability of the temperature measurements.
Following precisely your method, the trend now drops to 0.94C/century from your figure of 0.95C, not a significant difference. However, with the additional information from twelve times the data, the standard error of this slope drops to 0.31C, substantially less than your figure of 0.88C.
Hence the warming slope is more than three standard deviations above zero, which means we can say with 99.9% confidence, not your 86%, that there has been warming on the basis of the UAH data.
I have followed your algorithm exactly here, though as has been pointed out, for the correlated temperatures in a time series the confidence level will be even higher than 99.9%.
I would have thought you’d have been well aware that averaging measurements throws away important information about their variability. For some reason you must have forgotten this important mathematical property of measurement science when performing the calculation in your article.

Icarus
December 26, 2009 7:04 pm

Richard M (18:29:19) :
Icarus, using 15 years is what was put forth by the AGW crowd. Lubos is using their definition. Why would he use anything but the last 15 years?

Actually 30 years is what is ‘put forth’ by climate scientists, and with good reason – if average interannual variation is around 0.2C, and the observed warming trend is around 0.2C per decade, then obviously you need substantially more than 10 years to distinguish trend from stochastic variation. 15 years would be a bare minimum on that basis but it’s better to work with 30.
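[Editor's note: Icarus's 0.2 °C-noise versus 0.2 °C-per-decade-trend argument can be made quantitative. A sketch under the simplifying (and unrealistic) assumption of white interannual noise — autocorrelation would lengthen the wait further; the 0.2/0.2 figures are taken from the comment:]

```python
# How many years of annual data does a 0.2 °C/decade trend need to clear
# a 2-sigma detection threshold against 0.2 °C of white interannual noise?
trend = 0.02      # °C per year (0.2 °C per decade, from the comment)
sigma = 0.2       # °C interannual standard deviation (from the comment)

def slope_se_white_noise(n: int, sigma: float) -> float:
    # For x = 1..n, the sum of squared deviations about the mean
    # is n*(n**2 - 1)/12, so SE(slope) = sigma / sqrt(Sxx).
    sxx = n * (n ** 2 - 1) / 12
    return sigma / sxx ** 0.5

n = 3
while trend < 2 * slope_se_white_noise(n, sigma):
    n += 1
print(n, "years")   # 17 with these assumptions
```

So even with ideal white noise, roughly 17 years are needed for significance at 2 sigma, consistent with "15 years is a bare minimum, better to work with 30" once autocorrelation is allowed for.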

Kevin Kilty
December 26, 2009 7:12 pm

Disputin (13:10:16) :
I am far from expert in statistics, but is this really valid? By including the obvious outlier of 1998 (El Niño), the SD is increased, so widening the confidence intervals. While I should agree that in the long run the 1998 jump is just a part of the variability, over this restricted timescale it is a major anomaly.
But then what do I know?

1998 is within a standard deviation, so why consider it an outlier?
Paul (15:39:43) :
I believe that this posted analysis is flawed. You cannot apply this type of simple linear trend analysis to serially correlated data, since the precision of the parameter estimates is strongly a function of the autocorrelation in temperature data.

I don’t know that this is meaningful in this particular, short series, but autocorrelation reduces degrees of freedom. Applies to the suggested monthly time series even more so?
scienceofdoom (17:27:49) :
Following on from DocMartyn..
I agree with DocMartyn’s main point – you need a hypothesis to test against. Just plotting a graph and fitting a linear trend is a pointless exercise.

Isn’t the implicit null hypothesis a zero trend in this case? You will find even less significance testing against an assumed positive trend.
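[Editor's note: Kevin Kilty's point that autocorrelation reduces the degrees of freedom — "even more so" for monthly data — can be made concrete with the standard AR(1) (Quenouille/Santer-style) effective-sample-size adjustment. A sketch; the lag-1 autocorrelation r1 = 0.87 is an assumed, illustrative value for monthly lower-troposphere anomalies, not one computed from the actual UAH series:]

```python
# Quenouille-style AR(1) adjustment: with lag-1 autocorrelation r1, the
# effective number of independent points is n_eff = n*(1 - r1)/(1 + r1),
# and the slope's standard error inflates accordingly.
def effective_n(n: int, r1: float) -> float:
    return n * (1 - r1) / (1 + r1)

n_monthly = 180   # 15 years of monthly anomalies
se_naive = 0.31   # Tom P's °C/century figure, treating months as independent
r1 = 0.87         # ASSUMED lag-1 autocorrelation, for illustration only

n_eff = effective_n(n_monthly, r1)
# SE scales roughly like 1/sqrt(n), so inflate by sqrt(n / n_eff)
se_adjusted = se_naive * (n_monthly / n_eff) ** 0.5

print(f"n_eff ≈ {n_eff:.0f} of {n_monthly} months")
print(f"adjusted SE ≈ {se_adjusted:.2f} °C/century")
```

With these (assumed) numbers, the 180 months carry only about a dozen independent points, and the adjusted standard error lands back above 1 °C/century — i.e. most of the apparent extra information in the monthly series evaporates.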

December 26, 2009 7:15 pm

kdkd: The paper by Loehle costs $18.
You said “The paper also shows that a meaningful timeframe for detecting statistically significant trends is in the region of 15 years plus.”
Can you elaborate on that?
I suspect, from everything else that I have read, that the analysis will simply show that year to year variation is high and tells you nothing about “longer term trends”.
We may lack the data to prove or disprove the hypothesis that any particular period in question can be subject to exactly the same analysis.
My hypothesis: “The GMST variation over 30 years periods is statistically insignificant in the context of 200 year temperature trends and should be ignored.”
And my later hypothesis: “The GMST variation over 1000 year periods is simply noise in the context of longer term 50,000 year trends.”
I would be very interested to know a little more about what exactly Loehle demonstrates.

December 26, 2009 7:20 pm

Richard M wrote: “…the reason we use years is because that is the only unit reasonable from a climate perspective. A year represents the smallest time period where inter-year variances cancel out
Aren’t the inter-year variances already removed from temperature anomaly data?

December 26, 2009 7:46 pm

pwl (14:23:55)
Sunspot activity may have an influence on Earth, but it is not due to the Earth receiving more direct radiation from the Sun. The Sun’s variance is only 0.1%, of which we would receive only a quarter portion. So directly, the Sun’s influence would not be able to exert any real variation on the Earth. However, that does not mean there is not a myriad of other mechanisms that may occur because of the Sun in an indirect manner. Just as a volcano is not the cause of cooling, but the resultant aerosols do contribute something: this is not directly an effect of the eruption, but of the aerosols blanketing the Earth and reflecting radiation.
However this is simply pure conjecture of the same kind as CO2, correlation does not mean causation. Lest we forget.

Steve J
December 26, 2009 7:49 pm

Funny,
I thought the temp increase precedes the CO2 increase by 800 years.
Based upon what we know (thanks, Anthony) about the dubious quality of the data, why would any rational being attempt to develop any trends based upon such a small data slice?
2,500 years should be the min. or maybe 10,000 years.
The entire argument evaporates under those conditions.

December 26, 2009 7:52 pm

Bret (19:20:24) :
Aren’t the inter-year variances already removed from temperature anomaly data

Lol, welcome to the wonderful world of temperature anomalies. Let me do you one better: what is the likelihood that the data has been transformed by an algorithm in a way that renders its actual use irrelevant?
As for your question: if you are compressing a year down to a single data point, yes. However, the point is that a year is a simple method of creating a start and end point for the data to be correlated. In the grander scheme of things, based on data that I have seen, anything short of a couple of centuries is probably too short a time period to get an accurate gauge on temperature – perhaps as much as half a millennium…

December 26, 2009 8:03 pm

I just realized that after my last post someone might ask: what is the point of this article, if 15 years is too short a time period to know what is going on with temperature? I for one think that is exactly one of the points of this article… Plus it just shows that the confidence levels really don’t vary that much…
Let’s put temperature into perspective and get rid of this microcosm we call Celsius. Remember that 14 degrees Celsius is actually 14 degrees plus 273 degrees above absolute zero. So if someone were to come along and mention that the Earth has warmed approx. 0.5 degrees, taking you from 287 K to 287.5 K, would you really be that worried?
That means the Earth increased in temperature by a whopping 0.17%. Would you not place that within a normal cycle of variance, statistically? I mean, the Earth’s temperature has not even varied as much as a quarter of a percent.
Anyway… Statistically speaking, we have not managed to effect much change on the Earth with 150 years of spewing CO2 into the atmosphere. And if the Earth does not hit the tipping point – and I am fairly willing to wager pretty heavily that it won’t – before CO2 cannot help warm us anymore (CO2 is a gas that absorbs radiation, let’s face it), then the natural cycles of the planet hold far more sway than the paltry excuse for a greenhouse gas that CO2 is.
sorry for the rant, just tired of explaining to people over and over again and the blank look on their face as they go, oh I see…

kdkd
December 26, 2009 8:05 pm

scienceofdoom: I found the paper here http://icecap.us/images/uploads/05-loehleNEW.pdf (as my university has quite rightly stopped holding E&E since its move from a social science journal to a political rag). The interesting bit is Figure 2, which shows the confidence interval plotted along with the trend for a couple of satellite data sets. The conclusion that the paper supports evidence against anthropogenic global warming is extremely weak, but the regression techniques are interesting.

December 26, 2009 8:07 pm

Tom in Florida (15:20:10) (re: Eve (14:08:45))
What about burner efficiency declining with age?
Perhaps the thermostat needs adjusting?

Those issues were preemptively dealt with –
“Asked and answered.”

kdkd
December 26, 2009 8:12 pm

scienceofdoom: (again). If we’re talking about anthropogenic warming, the correct time frame to look at trends is in the 200 year period, although subsets of that time are interesting (e.g. in the early 20th century, co2 levels accounted for about 25% of observed warming whereas in the late 20th century/early 21st century, co2 accounts for ~ 80% of observed warming – now correlation is not causation, but it is rather suggestive given the cohesive body of scientific theory associated with this information).
The other error I’m seeing in the comments here is the “800 year lag” between temperature and co2. This is correct when we’re coming out of an ice age, because there is a lot of carbon locked up in the ice that takes a while to come out, and eventually the biogeochemically stored carbon will reach equilibrium for the current temperature. Perhaps we will see a similar phenomenon with the melting of the arctic permafrosts. Anyway this 800 year lag is conceptually quite distinct to the current warming which appears to be caused by the burning of fossil fuels.

cohenite
December 26, 2009 8:25 pm

Lubos has picked 1995 and most critics are referring to 1998 being an outlier; that’s ironic, because if 1998 is an outlier then by including it Lubos is actually tilting the scales towards a warming trend. But 1998 IS NOT an outlier; it is a legitimate phase change point verified by prominent and well documented climatic events;
http://arxiv.org/PS_cache/arxiv/pdf/0907/0907.1650v3.pdf

Michael Jankowski
December 26, 2009 8:27 pm

kdkd (20:12:57) :
So which temperature data set shall we use for the 200 year period?
And since CO2 has been increasing and is supposed to account for “80% of the observed warming in the late 20th century/early 21st century,” then I assume it should be easiest to spot even in a subset of, say, the most recent 15 years?

INGSOC
December 26, 2009 8:33 pm

Someone may already have said this, but it is “somewhat more likely than not” that none of this matters to the purveyors of doom. Recent events such as the CRU scandal and Copenhagen make it clear that science and data are irrelevant to the eco crusaders. In fact, it is plain that the environment doesn’t matter to them either. It should be clear to everyone by now.
Nice work though.

yonason
December 26, 2009 8:38 pm

From the looks of what I found over at GreenieWatch, the stooges at CRU make Moe, Larry and Curly look like rocket scientists.
http://69.84.25.250/blogger/post/ClimateGate-Data-Series-Part-5-A-break-down-of-large-data-file-for-manipulating-global-temperature-trends-from-2006-2009.aspx
Anyway, if the data’s all wrong to begin with, what’s the point of arguing over what it means, at least until we can get a correct data set, that is?

Tom in Florida
December 26, 2009 8:41 pm

Jim (20:07:59) : “Those issues were preemptively dealt with –”
I assume you are referring to the statement: “The furnace has a scheduled maintenance each Nov, the same two people live in the same house and the thermostat settings have not changed.”
The annual maintenance would only prove the furnace was running within certain parameters not necessarily at the exact same efficiency. Likewise with the thermometer. The point I was making was that a mix of many causes could be responsible for increased fuel usage, not just cooler weather. So the science wouldn’t be settled in that case just as it isn’t settled with climate change.

Dave F
December 26, 2009 8:44 pm

I read through this a couple of times, and I found myself wondering why you chose 1995. I think that it might be because that is when the oscillation makes its home, so to speak, above the line representing the mean, but I am not sure. Why 95?
Also, enjoyed the piece on Motl’s blog about the Wunderkind. If anyone hasn’t been, check it out.
http://motls.blogspot.com/

Mike G
December 26, 2009 8:47 pm

Since the current warming has taken us no farther than the peak of the last warming, 60 to 70 years ago – looking at the actual temperature data, rather than the value-added data – I don’t see what all the fuss is about.

jorgekafkazar
December 26, 2009 8:49 pm

Curiousgeorge (16:52:20) : “Anthony, as a statistician you should know better. With enough data points, anything can be “statistically significant”.”
Not really.

December 26, 2009 8:59 pm

kdkd – thanks for the paper!
One extract from the introduction:
“Linear trends are the simplest way to assess climate change, and are used in the IPCC reports and most of the trend studies cited in the introduction, among many others. Linear trends also have the advantage that confidence intervals are well defined, which aids in interpretation. Calculating such linear trends overcomes issues due to subjective interpretation of noisy data and the arbitrariness of various methods of smoothing the data, especially at the end points”
I’m sure there are lots of people smarter than me that know much more about climate science and statistics who think this approach is valid, but this paper tells me the same as all the other places I have seen these kind of analyses.
“It’s customary”
Maybe a different question to ask would be “Is climate chaotic?”
Or, “If climate is chaotic can we learn anything (useful) from linear trends and confidence intervals?”
Thanks again for the link to the paper.

December 26, 2009 9:07 pm

cohenite (20:25:57) :
Lubos has picked 1995 and most critics are referring to 1998 being an outlier; that’s ironic because if 1998 is an outlier than by including it Lubos is actually tilting the scales towards a warming trend.
No, since 1998 occurs before the halfway point it tilts away from a warming trend. Whether to include/exclude 1998 comes down to cherry picking.
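[Editor's note: Leif's point is easy to verify. A quick sketch using the annual anomalies quoted in the article — since the 1998 spike sits before the series midpoint, dropping it should raise, not lower, the fitted warming trend:]

```python
# Refit the 1995-2009 OLS trend with and without 1998, using the annual
# UAH anomalies (°C) quoted in the article.
data = {1995: 0.11, 1996: 0.02, 1997: 0.05, 1998: 0.51, 1999: 0.04,
        2000: 0.04, 2001: 0.20, 2002: 0.31, 2003: 0.28, 2004: 0.19,
        2005: 0.34, 2006: 0.26, 2007: 0.28, 2008: 0.05, 2009: 0.26}

def trend_century(pairs):
    """OLS slope in °C/century for (year, anomaly) pairs."""
    n = len(pairs)
    x_av = sum(x for x, _ in pairs) / n
    y_av = sum(y for _, y in pairs) / n
    sxx = sum((x - x_av) ** 2 for x, _ in pairs)
    sxy = sum((x - x_av) * (y - y_av) for x, y in pairs)
    return 100 * sxy / sxx

slope_with = trend_century(list(data.items()))
slope_without = trend_century([(x, y) for x, y in data.items() if x != 1998])
print(f"with 1998:    {slope_with:.2f} °C/century")     # ~0.95
print(f"without 1998: {slope_without:.2f} °C/century")  # ~1.52
```

Excluding 1998 raises the fitted trend from about 0.95 to about 1.52 °C/century, confirming that the spike's early position pulls the slope down rather than up.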

Manfred
December 26, 2009 9:08 pm

Richard M (18:29:19) :
“…Even years are probably too fine (a better unit might be PDO full cycles).”
Icarus (19:04:39) :
“Actually 30 years is what is ‘put forth’ by climate scientists, and with good reason…”
I strongly agree with Richard’s point that everything below a full 60-70 year ocean cycle is misleading.
On the other side, if realclimate “scientists” chose to put forth 30 years, i.e. exactly the warming half of a full natural cycle, their “good reasons” appear to be “good” in an odd and now familiar way.
The solution to this dilemma, and to the loss of trust in the CRU, NOAA and GISS temperature records, should be to correct UAH values for ocean current cycles, just as they are sometimes corrected for El Niños/La Niñas and volcanoes.
An ocean cycle correction should remove most if not all of the warming since 1979, and with respect to the warming due to land use changes, the question arises, if global cooling did not already start in the 1930s or 1940s.

Dave F
December 26, 2009 9:16 pm

Tenuc (17:57:39) :
That has always puzzled me a bit too. Why not just eliminate even carving out a mean for the data? Why not just throw it all against the wall and see what sticks? If it sticks, it is done! Seriously, though, why not just analyze the entire temperature set, every single point? Then you don’t have to deal with the logistical headaches of transporting a temperature to the next closest station. The way I understand radiative forcing, and anyone feel free to correct me if I am wrong, the temperatures should be Higher highs and higher lows, so the entire dataset should exhibit an upward trend, right? So, I say make pasta. Throw it against the wall and see what sticks.

kdkd
December 26, 2009 9:32 pm

scienceofdoom:
All non-linear regressions do is use an appropriate mathematical function to transform a non-straight line into a straight line. In the case of the paper I posted, I think (from memory) they used a sinusoidal function to straighten the line up. Other than that the techniques are the same. And yes, the climate system is chaotic, but this just means that there’s a lot of variability that we can’t properly account for among the significant amount of variability that we can account for.
Michael Jankowski:
[ So which temperature data set shall we use for the 200 year period? ]
Well to be honest the best approach is to use as many sources as possible into an aggregate data set. Relying on a single data source is generally to be avoided.
[ And since CO2 has been increasing and is supposed to account for “80% of the observed warming in the late 20th century/early 21st century,” then I assume it should be easiest to spot even in a subset of, say, the most recent 15 years? ]
No. This is because the magnitude of the decadal trend is about the same as the year-on-year variability, so the long-term trend doesn’t really become apparent until it exceeds the short-term variability by quite a lot. So the technique I used to ascribe the proportion of CO2 associated with the temperature anomaly only works well over time periods of at least 30-50 years.

Bulldust
December 26, 2009 9:35 pm

As a few others have mentioned, you do yourself no favours by taking yearly data when monthly are available. When the same analysis is done for the larger (monthly) data set, SDs fall and t-values (significance) increase.
Having said that, I tend to take an engineering approach to graphs like this before I apply stats. Step back, look at the graph, and ask yourself: what does the data appear to tell you?
IMHO I see data with longish fluctuations of 2-3 years, with enough variation from year to year that forcing a linear trend on the data would be folly. It tells you nothing of real significance.
As humans we are very adept at pattern recognition as we devote a major portion of the brain to it. In that sense everybody is equally capable of looking at the first graph and “seeing” patterns if there are any.
To my mind there is very little to see there other than the regular oscillations and one or two stand outs. Firstly El Nino of 1998 stands out because it is a larger oscillation than the rest, and secondly the period from 2001 to 2007 looks a little odd because of the lack of variation compared to the previous couple of decades.
However, the last two paragraphs are based on focusing on the 13-month moving average. Shortening the term of the average to 5 months or so would introduce more amplitude in the oscillations and change the impressions somewhat.
I believe Aesop is credited with saying something along the lines of:
“We can easily present things as we wish them to be.”
Statistics weren’t even around in his day… smart man, that.

Dave F
December 26, 2009 9:37 pm

kdkd (20:12:57) :
(e.g. in the early 20th century, co2 levels accounted for about 25% of observed warming whereas in the late 20th century/early 21st century, co2 accounts for ~ 80% of observed warming – now correlation is not causation, but it is rather suggestive given the cohesive body of scientific theory associated with this information).
How much does this scientific body of literature say the temperature will go up if there is an additional 34ppm of CO2 added to the atmosphere, under the same climatic conditions as this year, in the next year?

Bulldust
December 26, 2009 9:51 pm

Hi Leif … good to see you around 🙂

Dave F
December 26, 2009 9:55 pm

Rob Vermeulen (14:22:28) :
The trend is non-significant only because the poster used the average yearly anomaly. Taking every month into account, the trend becomes statistically more significant.
That really seems a sleight of hand, though. If the yearly anomaly is not a statistic composed of the monthly anomalies, what is it? So aren’t the monthly anomalies contained in the yearly anomalies? Hell, we could break it down to daily anomalies if you would like, but aren’t you really just adding data points for the sake of adding data points? The more I think about the anomaly as a means of measurement, the more I dislike it.

photon without a Higgs
December 26, 2009 9:56 pm

kdkd (20:12:57) :
in the late 20th century/early 21st century, co2 accounts for ~ 80% of observed warming
Without reference to studies please don’t make statements like this.
We’ve all been in the circles these type of statements make before.
So references please?

December 26, 2009 9:58 pm

The sun’s variance is only 0.1%, of which we would receive only a quarter portion of that.
Isn’t that at the surface of the Earth? What about UV variation at the top of the atmosphere, which I believe is far greater than the TSI variation? And the UV is mostly absorbed at the top of the atmosphere.
Last I heard the effect of UV in the models is parameterized. i.e. based on assumptions rather than explicit physics.

photon without a Higgs
December 26, 2009 10:00 pm

kdkd (20:12:57) :
the current warming which appears to be caused by the burning of fossil fuels.
Supply all the references for this statement. Otherwise please don’t play this game here.
‘Appears’ is a word that doesn’t count in science.
But it does count in sophistry and propaganda.

photon without a Higgs
December 26, 2009 10:01 pm

kdkd (20:12:57) :
the current warming which appears to be caused by the burning of fossil fuels.
——————————————–
Stamp prices have gone up at the same time as temperatures have gone up. So it appears stamp prices control temperature.

cohenite
December 26, 2009 10:06 pm

Leif: 2 points; as is indicated in the linked paper, 199[7] is NOT a cherry pick because it is supported by prominent climate events and a likely PDO phase shift; other break papers by such authors as Tsonis and Swanson;
https://pantherfile.uwm.edu/kswanson/www/publications/2008GL037022_all.pdf
argue for a break around this time, although S&T select 2002 as the regime shift date; other phase change or step papers are;
http://www.arl.noaa.gov/documents/JournalPDFs/Seidel&Lanzante.JGR2004.pdf
http://www-eaps.mit.edu/faculty/lindzen/203_2001GL014074.pdf
Secondly, in respect of 1998 adding to a warming trend, I guess I was thinking of a running mean.

samspade10
December 26, 2009 10:12 pm

David Starr, instead of taking the time to type out all those rhetorical questions, you could have just looked at the website for yourself:
http://discover.itsc.uah.edu/amsutemps/execute.csh?amsutemps
If I were less cordial, I could have referred you here

December 26, 2009 10:15 pm

This is because the magnitude of the decadal trend is about the same as the year-on-year variability, so the long-term trend doesn’t really become apparent until it exceeds the short-term variability by quite a lot.
Well then. Since the daily variation is on average in the 10 deg. C (or more) range…..
Or take the mid latitude Summer vs Winter variation of 20 deg C (or more)….

crosspatch
December 26, 2009 11:06 pm

We can’t know how much “observed warming” there is because the record is so bad. That 80% of “observed warming” might turn out to be very little at all.
How much warming has there been since the 1930’s? What about temperatures of the MWP? What is responsible for the cooling since then? Why are today’s temperatures cooler than the MWP in spite of more CO2? Why did the temperature rise from 1830 to 1930 outpace the later rise when CO2 was not rising nearly as fast?
Any linkage of climate warming and CO2 is pretty much a guess and nothing more.

Ron Broberg
December 26, 2009 11:08 pm

@ Jeff L (13:01:52) :
GISTEMP:
http://data.giss.nasa.gov/gistemp/sources/
CRU:
http://www.metoffice.gov.uk/climatechange/science/monitoring/subsets.html
It means that the probability that the underlying slope is negative equals 1/2 of the rest, i.e. a substantial 14%.
So those claiming “global cooling” are betting on the 14%.
Good to know.
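The quoted 14% can be roughly reproduced from the headpost's numbers. A hedged Python sketch, taking the slope (0.95 °C/century), its standard error (0.88 °C/century), and the Student-t distribution with n − 2 = 13 degrees of freedom straight from the headpost; the exact percentage depends entirely on those inputs:

```python
from scipy import stats

slope = 0.95   # UAH 1995-2009 trend, deg C per century (from the headpost)
stderr = 0.88  # standard error of the slope (from the headpost)
df = 15 - 2    # 15 annual points, two fitted parameters

# One-sided probability that the true slope is negative, under the usual
# Student-t sampling distribution of the OLS slope estimate.
p_negative = stats.t.cdf(-slope / stderr, df)
print(f"P(slope < 0) ~ {p_negative:.0%}")
```

This lands in the mid-teens, consistent with the 14% quoted above.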

Galen Haugh
December 26, 2009 11:25 pm

I appreciate several things in the above comments–First, Eve’s list of home fuel consumption by year and its indication of an increasing trend from 1997 to 2008. Of course there can be a number of things that might cause this, but most of the objections are minor compared to the actual heat balance indicated. (I mean really, how many of you have seen the windows of your house fall apart or the seal on the doors fall off within 10 years, or the insulation get eaten up by termites, or so on and so forth. Certainly these should be considered but in the houses I’ve lived in, they are very minor aspects.)
So I entered the data into Excel and did a simple plot and eye-balled it for trend line (no need going any further than that). It looked like some of the temperature graphs for the same time period–imagine that!
Where I do take great issue with these “climate scientists” was mentioned in a post above with this sentence:
“Then you don’t have to deal with the logistical headaches of transporting a temperature to the next closest station.”
My expertise happens to be in 3-dimensional statistics, something called kriging, an invention of the mining industry. In that science, the position of a data point has meaning as well as the value of the data point. Never, EVER could I justify moving a data point from one place to another, regardless of any adjustment factors–what’s required is getting that data point by sampling. Simply adjusting a data point so you can put it someplace else is absurd beyond imagination. It’s beyond a logistical headache–it’s simply a NO-NO!
Of course, where I’ve used three-dimensional statistics before is in reserve estimations and models of mineral deposits–particularly gold and silver. If part of the model has unacceptable estimation variances (based on what’s considered proven, probable, and possible), one simply drills and gets the data. And because nature is unpredictable, there is no justification under the sun for simply inventing data points (which is what moving a data point from one place to another actually is–FALSIFICATION OF DATA).
Admittedly, my example of obtaining data can’t be applied to climate science since temperatures are obviously time related. However, temperatures are also position related, so again, shifting data from one location to another simply boggles my mind. I am beginning to believe that “climate scientist” is another word for “charlatan” or “witch doctor” or “lazy programmer” or “political activist” and a dozen other unflattering terms. The deeper I dig into what these “climate scientists” do, the more disillusioned I become. Much of their methodology wouldn’t be acceptable where I’ve worked in the past, and certainly wouldn’t warrant investing significant capital based on their results.
I believe there is a very good reason they don’t want to show anybody what they do–we’d be having a pitch fork party with lots of tar and feathers. And until they come clean, show us all their data and methodology, I’m afraid we can’t make any significant conclusions about earth’s climate. Garbage in, garbage out.
I think our efforts should concentrate on pressing FOIA requests to successful conclusion, both in the US and in the UK. In the meantime, we can quell all arguments AGWers might make with the pointed fact that they are hiding the science.
Show us the data, discussions and methodologies. All of it!

Bulldust
December 26, 2009 11:44 pm

Rob Vermeulen (14:22:28) :
That really seems a sleight of hand, though. If the yearly anomaly is not a statistic composed of the monthly anomalies, what is it? So aren’t the monthly anomalies contained in the yearly anomalies? Hell, we could break it down to daily anomalies if you would like, but aren’t you really just adding data points for the sake of adding data points? The more I think about the anomaly as a means of measurement, the more I dislike it.
———————————
Yes and no. The yearly averages are certainly comprised of the monthly averages but you lose 11 pieces of information by using only the annual data.
For example, if I gave you the annual averages only and asked you to back out the data for the monthly temperature anomalies, you would look at me as if I had two heads. The information simply isn't there anymore.
For exactly this reason any statistic, like a linear trend, derived from annual averages over a time period is less significant than a trend calculated from monthly data over the same period, because the annual trend takes fewer pieces of information into account.
Statisticians will talk of degrees of freedom and the like, but I hope the above explains it well enough.
Another way of thinking about it would be like averaging colours across pixels in a computer image. Eventually, if enough pixels are averaged, the picture is no longer recognisable. Eventually you would have a single brown (or whatever colour) pixel… no way you could back out the image from that single average pixel.
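The "information isn't there anymore" point can be made concrete. A tiny Python sketch with invented numbers (the 0.20 anomaly and 0.30 spread are arbitrary): two very different monthly series can share exactly the same annual mean, so the annual figure alone cannot recover the monthly detail.

```python
import numpy as np

rng = np.random.default_rng(1)
annual_mean = 0.20  # an arbitrary annual anomaly, deg C

# Two hypothetical monthly sequences with identical annual means:
m1 = rng.normal(annual_mean, 0.30, 12)
m1 += annual_mean - m1.mean()      # force the same annual mean exactly
m2 = np.full(12, annual_mean)      # a perfectly flat year

print(f"annual means: {m1.mean():.4f} vs {m2.mean():.4f}")
print("monthly detail identical?", np.allclose(m1, m2))
```

Both collapse to the same single number, just as averaging pixels collapses an image.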
Statistics is a black art really… we try to separate out the noise to find a signal. The problem is that it is very easy to “perceive” things that are not there. Perhaps that is the downside to the human pattern recognition ability – we find it very easy to see patterns in random clusters. Think of spotting familiar shapes in clouds, for instance. The cloud certainly did not deliberately align to make that specific shape… you merely imagined it to be there.
How one looks at data depends on what one is looking for. It is always a balancing act between the fine detail (and increased noise) and the more general holistic picture. It's a question of whether we are interested in the wood or the trees.
Time to go ski on some of that global warming >.>
… delete if replicated

photon without a Higgs
December 26, 2009 11:54 pm

Dave F (20:44:25) :
Why 95?
I’m wondering if it’s because Richard Lindzen has spoken about global warming ending in 1995?

photon without a Higgs
December 27, 2009 12:02 am

crosspatch (23:06:29) :
The statement of 80% of “observed warming”
UHI is manmade so that is part of manmade global warming. So most of the commenters here believe in ‘manmade global warming’ since they know UHI is happening. 😉
What % of warming is UHI? And if UHI is taken out has there even been any warming in the last ~30 years??

Philip T. Downman
December 27, 2009 12:20 am

How about trying “Benford’s law” on the temperature data? Would it reveal manipulation?
Who me? Far too lazy.
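For what it's worth, a Benford first-digit check is only a few lines. A hedged Python sketch (the test sequences are illustrative; note that temperature anomalies, being small signed numbers in a narrow range, are actually a poor candidate for Benford's law, so this would prove nothing about the temperature data by itself):

```python
import math
from collections import Counter

def first_digit(x):
    """Leading significant digit of a nonzero number."""
    return int(f"{abs(x):e}"[0])  # scientific notation, e.g. '3.14e-02' -> 3

def benford_chisq(values):
    """Chi-square distance between observed first digits and Benford's law."""
    observed = Counter(first_digit(v) for v in values if v != 0)
    n = sum(observed.values())
    chi2 = 0.0
    for d in range(1, 10):
        expected = n * math.log10(1 + 1 / d)
        chi2 += (observed[d] - expected) ** 2 / expected
    return chi2

# Powers of 2 are a classic Benford-conforming sequence ...
conforming = [2.0 ** k for k in range(1, 200)]
# ... while a flat run of consecutive values is not.
flat = [100.0 + k for k in range(199)]
print(f"conforming: {benford_chisq(conforming):.1f}, flat: {benford_chisq(flat):.1f}")
```

A small statistic means the digits look Benford-like; a large one flags deviation.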

kdkd
December 27, 2009 1:10 am

photon without a Higgs
[ Stamp prices have gone up at the same time as temperatures have gone up. So it appears stamp prices control temperature. ]
This is the kind of solipsistic argument that gives climate change sceptics a bad name. The methodology used is a good bit more sophisticated than your comment acknowledges.
crosspatch:
[ Any linkage of climate warming and CO2 is pretty much a guess and nothing more. ]
No, this is incorrect. We predict that co2 will cause warming from scientific information derived from both classical mechanics and quantum theory. We can then do a range of lab experiments and planetary observations to see whether the theory fits the data. The correspondence is reasonably good. There’s a bunch of things in the rest of your post that are plain incorrect as well (plenty of warming since the 1930s etc).
M.Simon:
The standard deviation of the annual average is quite high; your comment about the 10ºC daily range is irrelevant to this – the standard deviation is not the same thing, and it’s important to measure it across a consistent time scale (for the purposes of the present example, that would be the standard deviation of the annual mean temperature).

Geoff Sherrington
December 27, 2009 1:21 am

The 1998 hot year features prominently in the top graph. It is not a global feature, as there are many stations where there is no hot 1998 – all 3 Australian Antarctic stations (Mawson, Casey, Davis) plus Macquarie Island, for example. I have been doing a simple analysis by subtracting each month in 1997 from each month in 1998. This result for Meekatharra Australia shows a typical result.
http://i260.photobucket.com/albums/ii14/sherro_2008/MeekaJ.jpg?t=1261905136
The 4-lobed difference graph is a feature of almost all I have studied. Is it a valid conclusion that global station data, in some stage of the gridding/interpolation procedure, are converted to quarterly inputs?
I’d be delighted to see readers post other examples of this simple 1998-1997 routine from all over the world. In particular, it does not seem to indicate a month or place where the warming started for the 1998 hot year. I’d do the differencing myself more often, but it usually takes local knowledge to choose data sets that have not been excessively “adjusted”.

December 27, 2009 1:24 am

Dear readers,
the reason for going back to 1995 is to see how long intervals can end up showing no statistically significant warming, assuming the annual white noise null hypothesis.
Of course, if one goes before 1995, the warming will become statistically significant with these choices. But 15 years is a pretty impressively long timescale over which the global warming can be seen to be “statistically non-existent” – which tells us something about the non-urgency of the problem, even if the problem exists at all.
By the way, UAH datasets show a cooling trend not only from 1998 to 2009, as all of us have heard many times, but also in the intervals 2001-2009, 2002-2009, 2003-2009, 2004-2009, 2005-2009, 2006-2009, and 2007-2009. That’s, in fact, most of the 14 intervals between 1995-2009 and 2008-2009. See
http://motls.blogspot.com/2009/12/no-statistically-significant-warming.html
But clearly, for longer periods than 15 years, one is gradually raising the confidence – the warming trend becomes statistically significant. It surely is statistically significant for the last 30 years.
Also, if you replace the annual data by the monthly data, you may restore the statistical confidence, even for the last 15 years. It means that the null hypothesis “the monthly data since 1995 are white noise with the appropriate parameters” may be robustly falsified. However, this hypothesis is invalid a priori – because the monthly data are continuous and their detailed behavior is more similar to red noise than white noise (the color never becomes white – the autocorrelation never disappears).
However, if you postulate that the monthly data are a random walk – red noise – you will be unable to falsify this null hypothesis by the observed data, too. There’s just too much noise in them. In fact, you will be surprised why the observed accumulated temperature change in 15 or 30 years is so tiny.
One must be careful about the null hypotheses. If a null hypothesis is falsified, it doesn’t yet prove a man-made or another, otherwise “unnatural” signal. It is usually because of the naiveté of the null hypothesis. See more comments about these and other issues at
http://motls.blogspot.com/2009/12/no-statistically-significant-warming.html
Best wishes
LM
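The random-walk point above can be checked by simulation. A minimal Monte Carlo sketch in Python, under an assumed red-noise null (the 0.1 °C annual step size is an illustrative assumption, not fitted to any data):

```python
import numpy as np

rng = np.random.default_rng(7)
n_years, n_trials = 15, 20000
step = 0.1  # assumed annual random-walk step, deg C (illustrative)

x = np.arange(n_years)
xm = x - x.mean()
sxx = (xm ** 2).sum()

# Random-walk null: each trial is a cumulative sum of white-noise steps.
walks = rng.normal(0, step, (n_trials, n_years)).cumsum(axis=1)
# OLS slope of each simulated 15-year series, in deg C per year.
slopes = (walks - walks.mean(axis=1, keepdims=True)) @ xm / sxx

observed = 0.0095  # 0.95 deg C/century, from the headpost
frac = (np.abs(slopes) >= observed).mean()
print(f"fraction of random walks with |slope| >= observed: {frac:.2f}")
```

With this step size, the majority of simulated walks produce a 15-year slope at least as large as the observed one, so the observed trend cannot falsify this null – which is the point being made.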

Geoff Sherrington
December 27, 2009 1:25 am

Re kdkd (01:10:30) : Stamp prices.
The price of a stamp saying “Organic” or “Green” has gone up, but I guess you mean postage stamps. I’m a keen collector with a modestly good Australia 1913-2009 mint unhinged collection (plus I make a CD that cross-references and illustrates about 4,000 stamps in that period). Feel free: sherro1 at optusnet dot com dot au. End of commercial.

Geoff Sherrington
December 27, 2009 1:33 am

Galen Haugh (23:25:31) :
Good on you. You might have been successful with a job application with our mining company. Alas, I’ve been saying the same as you for some years now. I just get called a denialist. But you can’t deny the presence or absence of an ore deposit outlined by drilling.
Ditto with you on the falsity of a change of collar coordinates. You can’t shift a hole and call it the same as before.

Tom P
December 27, 2009 1:40 am

Luboš Motl,
I see you have updated your original article with a much longer consideration of the monthly temperatures I raised in an attempt to try to salvage your conclusions:
http://motls.blogspot.com/2009/12/no-statistically-significant-warming.html
After much verbiage you now say:
“I am convinced that such an improved model could match the autocorrelation and the distribution of increments at all timescales and that the null hypothesis that the underlying trend is zero would statistically survive, too.”
without offering any such model. This is proof by assertion.
But you might want to ask Anthony to update your headpost here so that readers can enjoy your full updated analysis at this site.
You claim in the title that you have produced “a quick mathematical proof.” Your original post was indeed quick. What you have now produced deserves none of those three words.

NickB.
December 27, 2009 1:51 am

So I’ve been reading through this thread with a bit of a rubbernecking, passerby-of-an-accident sort of fascination, and I keep coming back to a couple of points raised regarding the utility of anomalies and the ridiculousness of some of the techniques used in the creation of the surface temperature models (my apologies to the posters I reference; working from a mobile makes things difficult sometimes). What I think is getting lost in all these conversations is this:
What is the problem we’re trying to solve… and what is the best way to do it?
It seems to me that we spend a whole lotta time and effort debating issues that are at best 2-3 steps removed from that core question and where there is lack of direction and clarity, confusion reigns.
What, exactly, are we trying to measure and represent by these reconstructions? Is it surface temperature, is it near-surface temperature, is it atmospheric temperature, or is it total heat content (which, correct me if I’m wrong here, would have to factor in humidity alongside temperature for atmospheric measurements… wouldn’t it)?
Maybe that’s a stupid question, let the flogging commence 😀

Mapou
December 27, 2009 2:02 am

kdkd:

No, this is incorrect. We predict that co2 will cause warming from scientific information derived form both classical mechanics and quantum theory. We can then do a range of lab experiments, and planetary observations to see whether the theory fits the data. The correspondence is reasonably good.

When was the last time you conducted a controlled experiment with a planet the size of the earth with a similar atmosphere, geography and geology? And, while you’re at it, explain why it was warmer in the middle ages.

CodeTech
December 27, 2009 2:03 am

Who is kdkd? And who is kdkd’s “we” ?
Either way, appeal to authority, baffle with BS (bad science), and dismissing a deliberately ludicrous demonstration that “correlation is not causation”… yeah, you’re not doing yourself any favors there, kdkd.
From what I can see here, you’ve got exactly nothing.

Roger Knights
December 27, 2009 2:17 am

Rather than comparing the fuel use at one home over time, a better check on temperature records kept by the gov’t. would be “degree-day” records kept by fuel companies. (And local weather bureaus?) These records could be a fruitful source of insight and add oomph to a critique of the official stats.

Roger Knights
December 27, 2009 2:34 am

If 2010 & 2011 are as cool as 2008, any uptrend will be so attenuated as to lose its rhetorical force, and cooler heads will get more of a hearing.

Veronica
December 27, 2009 3:01 am

DirkH
Thanks for posting the Hansen vid. Makes me feel so much better already – happy new year!

stephen richards
December 27, 2009 3:07 am

I like Lubos’s analysis, not because it shows anything significant and not because it shows something significant. You see, I believe what it shows is a simple piece of analysis as it should be done. Metadata, method, data and results, followed by his conclusions. Unlike the Tam et al, he makes no attempt to hide what he is doing.
A quick opinion on trends. The linear trend trick really has no value in climate analysis. Why? Because climate, as Arrhenius knew, has several ‘failure mechanisms’. If you linear-trend across those change points you eliminate the very thing for which you should be searching. In climate analysis, we need to know about the points of inflexion/change, like the cooling and warming points in the 20th cent. These inflexion points in climate can be at intervals from 30 to 100,000 to 500,000 years.

December 27, 2009 3:32 am

Icarus (13:33:14) :
The long-term warming trend is around 0.13C per decade according to the entire UAH record. What you should be calculating is whether there is any statistically significant deviation from that warming trend – otherwise you’re just grasping at straws.

I’m afraid I have to agree with this. We should be testing the hypothesis that the trend has changed since 1995 and it clearly hasn’t – not in any statistically significant sense at least.
One interesting point, though, is the fact that the years immediately before 1995 were affected by the 1991 Pinatubo eruption. If the data were adjusted to account for the Pinatubo effect (up to 0.5 deg) the non-significant trend could stretch back to the early 1990s.

Icarus
December 27, 2009 3:55 am

Steve J (19:49:16) :
Funny,
I thought the temp increase precedes the co2 increase by 800 years.
Based upon what we know (Thanks Anthony) about the dubious quality of the data.
Why would any rational being attempt to develop any trends based upon such a small data slice?
2,500 years should be the min. or maybe 10,000 years.
The entire argument evaporates under those conditions.

Think about how quickly the world’s climate responded to the Pinatubo eruption and I’m sorry to say that your argument evaporates. We see palaeoclimatic changes on scales of hundreds and thousands of years because the causes change at that pace, not the effects – changes in insolation due to periodic changes in the Earth’s orbit play out over many thousands of years. If the forcings change much faster, then the effects play out much faster too, and this is what we are seeing with anthropogenic forcings from CO2, methane, black carbon on snow and so on.

NickB.
December 27, 2009 4:04 am

OK, so I think I just answered my own question – h/t to photon without a Higgs in the thread here (http://wattsupwiththat.com/2009/12/26/satellite-measurements-prove-our-quiet-sun-is-cooling-the-upper-thermosphere/) with the link to this http://www.youtube.com/watch?v=Ykgg9m-7FK4
So, disregarding the rest of the stuff in the video on Miskolczi, what we’re trying to get to is the average *surface* temperature of the earth so we can plug it into the energy balance equations
So let me refine my question… is averaging of daily T-Min/T-Max of near surface air temps even a remotely good way of going about this?
Seems to me, with all the uncontrollable variables involved both real (instrument, UHI, other siting issues, etc) and Mann-made (ba-duh-dum – i.e. statistical) in land surface temperature records… couldn’t there be a better way to go about this?

December 27, 2009 4:23 am

An excellent post in the same vein using GISS data is at http://tamino.wordpress.com/2009/12/15/how-long/#more-2124
Of course, if you are a glacier, it is the cumulative trend of warm temperatures that will melt you all too quickly.
http://glacierchange.wordpress.com/2009/12/19/helm-glacier-melting-away/

December 27, 2009 4:53 am

So, disregarding the rest of the stuff in the video on Miskolczi, what we’re trying to get to is the average *surface* temperature of the earth so we can plug it into the energy balance equations
Since radiation goes as T^4, average temperature does you no good in an energy balance.
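The T^4 point is Jensen's inequality in action; a two-temperature toy example (the 250 K and 310 K values are arbitrary illustrative choices):

```python
# Radiated power goes as T^4 (Stefan-Boltzmann), so the mean of T^4
# is not the fourth power of the mean temperature.
sigma = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

t_cold, t_hot = 250.0, 310.0      # two illustrative surface temperatures, K
t_mean = (t_cold + t_hot) / 2     # 280 K

flux_of_mean = sigma * t_mean ** 4
mean_of_flux = sigma * (t_cold ** 4 + t_hot ** 4) / 2

print(f"flux at mean T : {flux_of_mean:.1f} W/m^2")
print(f"mean of fluxes : {mean_of_flux:.1f} W/m^2")
```

The average of the fluxes exceeds the flux at the average temperature, so an average temperature alone cannot close an energy balance.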

Tenuc
December 27, 2009 5:11 am

Dave F (21:16:46) :
[Tenuc (17:57:39) …]
“That has always puzzled me a bit too.
Why not just eliminate even carving out a mean for the data? Why not just throw it all against the wall and see what sticks? If it sticks, it is done! Seriously, though, why not just analyze the entire temperature set, every single point? Then you don’t have to deal with the logistical headaches of transporting a temperature to the next closest station. The way I understand radiative forcing, and anyone feel free to correct me if I am wrong, the temperatures should be Higher highs and higher lows, so the entire dataset should exhibit an upward trend, right? So, I say make pasta. Throw it against the wall and see what sticks.”
You are correct when you imply that averaging and homogenising the data is a fruitless exercise. In fact, because climate is driven by deterministic chaos (not randomness as is often stated), by doing this you are losing information content. The very noise which people try to remove so that a long-term climate signal can be seen, is actually the product of the whole intricate system in action.
Each mechanism affects how the others respond, and total system energy is constantly varying at any chosen moment in time. It is impossible to tease out the tiny bit of information about how much CO2 affects the overall system, as it is a micro-scale process easily swamped by macro-scale processes, for example the tilt in Earth’s axis of spin, or changes to total albedo.
However, if you just take the individual bits of temperature data and “throw it against the wall and see what sticks”, you will not get any insight. The data granularity is too low by several orders of magnitude for anything useful to be ascertained. The only honest answer that science can give when asked what the future will bring is ‘Climate will continue to change, but we don’t know at the moment the direction or magnitude of these quasi-cyclical changes’.

Basil
Editor
December 27, 2009 5:31 am

Tom P (19:00:56) :
Luboš Motl,
The monthly data from UAH is readily available and can be used to provide a much better estimate of the confidence limits – we now have 180 monthly points rather than 15 averaged annual points to look at the variability of the temperature measurements.
Following precisely your method, the trend now drops to 0.94C/century from your figure of 0.95C, not a significant difference. However, with the additional information from twelve times the data, the standard error of this slope drops to 0.31C, substantially less than your figure of 0.88C.
Hence the warming slope is more than three standard deviations above zero, which means we can say with 99.9% confidence, not your 86%, that there has been warming on the basis of the UAH data.
I have followed your algorithm exactly here, though as has been pointed out, for the correlated temperatures in a time series the confidence level will be even higher than 99.9%.

This is incorrect. The confidence interval is lower. The effect of autocorrelation is to make the standard deviation lower than it should be, and if corrected for serial correlation, the standard deviation will be higher, and the confidence level lower. In this case, using the monthly data, and no correction for serial correlation, the regression for the trend is:

        coefficient    std. error     t-ratio    p-value
---------------------------------------------------------
const   -0.0241097     0.0745042      -0.3236    0.7466
time     0.000776549   0.000258984     2.998     0.0031 ***

which indeed looks very significant. But corrected for serial correlation, it is:

        coefficient    std. error     t-ratio    p-value
---------------------------------------------------------
const   -0.0241097     0.146453       -0.1646    0.8694
time     0.000776549   0.000484935     1.601     0.1111

The standard error is now doubled, and the trend is no longer significant.
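One common way to apply the kind of correction described here (not necessarily the exact method used above) is to inflate the OLS standard error by the AR(1) factor sqrt((1 + ρ)/(1 − ρ)), where ρ is the lag-1 autocorrelation of the residuals. A Python sketch on a synthetic series (the AR coefficient 0.7 and noise level are assumptions, not fitted to the real data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic AR(1) "monthly anomaly" series standing in for the real data.
n, phi = 180, 0.7
e = rng.normal(0, 0.1, n)
y = np.empty(n)
y[0] = e[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t]

x = np.arange(n, dtype=float)
xm = x - x.mean()
slope = (xm @ (y - y.mean())) / (xm @ xm)
resid = (y - y.mean()) - slope * xm

# Naive OLS standard error of the slope (same formula as the headpost).
se_naive = np.sqrt((resid @ resid) / (n - 2)) / np.sqrt(xm @ xm)

# Lag-1 autocorrelation of the residuals.
rho = (resid[:-1] @ resid[1:]) / (resid @ resid)

# Effective-sample-size inflation for AR(1) errors.
se_adj = se_naive * np.sqrt((1 + rho) / (1 - rho))
print(f"naive SE {se_naive:.5f}, rho {rho:.2f}, adjusted SE {se_adj:.5f}")
```

With ρ in the 0.6-0.7 range the inflation factor is roughly 2, consistent with the doubling of the standard error reported above.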

Peter of Sydney
December 27, 2009 5:43 am

It is ludicrous to expect any prediction to be 100% valid using any length of historical data, be it 10, 100, 1,000, … years. It would be like predicting the future trend of a stock purely based on past data. It can’t be done. If it could, we’d all be trillionaires by now.

Richard M
December 27, 2009 5:56 am

For those of you who think monthly anomaly data is a valid unit … well, why not use daily data? Why not use hours, minutes, nanoseconds?
Clearly I could show any trend I wanted by making the units small enough and I could get great statistical verification.
And, for those who seem to have their heads buried in the sand, it was climate scientists responding to the 10 years of cooling that was getting lots of play in the press that stated unequivocally that it would have to be 15 years to be meaningful. Don’t complain to me, complain to those scientists. I already stated I thought a single unit should be a full PDO cycle.
But, now we have reason to ask these climate scientists if they have changed their position on AGW by their very own reasoning.

Jordan
December 27, 2009 6:03 am

kdkd (01:10:30) :
“We predict that co2 will cause warming from scientific information derived form both classical mechanics and quantum theory.”
True. But the relevance of the AGW hypothesis has more to do with predictions of catastrophe. And that requires amplification of the basic physical responses.
There is no physical basis or evidence for amplification whatsoever.

Tom P
December 27, 2009 6:10 am

mspelto (04:23:25) :
Thanks for pointing this out. Tamino’s post is indeed very good:
http://tamino.wordpress.com/2009/12/15/how-long/#more-2124
It shows very clearly the serious statistical shortcomings of Luboš Motl’s post and comes to a very different conclusion regarding the significance of recent temperature trends.

Richard M
December 27, 2009 6:15 am

Keep in mind. The reason the climate scientists state 15 years is because they didn’t believe the AGW signal could be hidden for that long. It has nothing to do with longer term warming or cooling. It is precisely the AGW signal itself that is important here.
While I’m not sure that was the point of this article, the analysis does make the following question valid … “where is the AGW signal?”.

Manfred
December 27, 2009 6:37 am

mspelto (04:23:25) :
tamino makes 2 choices: one is taking a 30-year period matching the warming half of a PDO cycle, and the other is choosing GISS instead of satellite data.
While taking only the warming half-cycle is about as misleading as you can get, GISS is only the second-worst data set, after Petersen/Karl’s NOAA data.
Among various issues, this alone is more than enough reason not to rely on GISS:
“…In the ROW (rest of the world), there are almost the same number of negative (urban) adjustments as positive adjustments…”
http://climateaudit.org/2008/03/01/positive-and-negative-urban-adjustments/

December 27, 2009 6:54 am

mspelto
Here is the original document concerning helm glacier
http://www.sfu.ca/~jkoch/gpc_2009.pdf
Much of its retreat was during the very warm first half of the 20th Century (especially around the 1930’s). It then advanced at various times.
To compare today to the LIA is meaningless. It was much colder then. If the glacier had been observed in the 1730’s or during the MWP it would have been very small.
Tonyb

View from the Solent
December 27, 2009 7:07 am

kdkd (01:10:30) :
“We predict that co2 will cause warming from scientific information derived form both classical mechanics and quantum theory.”
Please explain the justification from quantum theory. Show your workings.
(btw, you forgot to include string theory, and/or M-theory plus 11-dimensional analysis.)

DocMartyn
December 27, 2009 7:46 am

“Kevin Kilty (19:12:54) :
I agree with DocMartyn’s main point – you need a hypothesis to test against. Just plotting a graph and fitting a linear trend is a pointless exercise.
Isn’t the implicit null hypothesis a zero trend in this case? You will find even less significance testing against an assumed positive trend.”
Not at all, the implicit Null Hypothesis is that DeltaTemp=Function[CO2]. The data MUST be plotted with reference to [CO2], which we do know to be rising, in a non-linear manner.

JonesII
December 27, 2009 7:53 am

Satellites were adjusted according to cherry picked warmer surface stations. We do not really know now what the actual data was.

J. Bob
December 27, 2009 8:18 am

_Jim (14:43:02 – 12/26/09), here is something to compare Tamino’s “famous” E. English early data post of last year.
Here is a plot of the English 1659-2008 data, showing the downturn over the last few years.
http://www.imagenerd.com/uploads/t_est_28-bGGxs.gif
This used a 40 year Fourier convolution filter. The figure below shows the same 1659 data filtered with three different filters. These include 40 year Fourier, MOV and a “filtfilt” 2-pole recursive Chebyshev (runs forward in time, then reverses backward in time; defined better in the MATLAB Signal Processing Toolbox). The “filtfilt” and MOV have end-point problems the Fourier avoids.
http://www.imagenerd.com/uploads/ave1-raw-smoothed-5rhsb.gif
The next figure compares the Fourier and Empirical Mode Decomposition (EMD) filters with the CRU 1855-2003 data set. The EMD has the ability to use non-stationary and non-linear data sets.
http://www.imagenerd.com/uploads/cru-fig-6-NMyC0.gif
Note the comparison between the two, especially at the end points.
The last figure overlays these with a Climate4you ( http://www.climate4you.com/ ) global temperature composite. This compares the standard data sources with HadCET, E. England & Ave14. Ave14 is a composite of the 14 longest-running temperature records starting before 1800. These include E. England, Uppsala, DeBilt, Berlin, Paris, Geneva, etc., as found on the rimfrost site ( http://www.rimfrost.no/ ).
http://www.imagenerd.com/uploads/climate4u-lt-temps-Ljbug.gif
Seems all have a tendency to bend down over the past 8-9 years.

Don
December 27, 2009 8:26 am

If you take the blackbody radiation spectrum of the earth and overlay on it the absorption spectrum of CO2 you can see that if CO2 absorbs all the radiation in the bands of its absorption spectrum it will only be a very small part of the total radiation from the earth. This is the reason that the modelers need some “pushing” or amplification mechanism to achieve dramatic predictions such as Hansen’s “tipping”. Since for the last 10 years the CO2 has gone up but the temperature has remained about constant, perhaps the effect of CO2 on greenhouse warming has saturated and increasing concentrations will have no effect. Comments?

Nemesis
December 27, 2009 8:42 am

Put it this way: if the climate did not change, wouldn’t that be the time to worry?

December 27, 2009 10:24 am

cohenite (22:06:54) :
99[7] is NOT a cherry pick because it is supported by prominent climate events
It was 1995, not 1997. And it is still a cherry pick as the trend itself [or ‘climate events’] is used to select the starting point.

December 27, 2009 10:25 am

I did the same thing about a month and a half ago where I looked at four datasets (UAH,RSS,HADCRUT AND GISS) and got similar results.
http://noconsensus.wordpress.com/2009/11/12/no-warming-for-fifteen-years/

edward
December 27, 2009 10:46 am

Jeff L (the first post)
Have you ever been to the GISS website?
The data and methods are all published right on their website. Your comment is more an indictment of your making a claim on faith rather than checking the facts before making an accusation.
Thanks
Edward

peter_dtm
December 27, 2009 11:41 am

Galen Haugh (23:25:31) : (and others) seem to have the correct response to all the AGW claims:
Where is your original source data ?
What is your methodology ?
Where is your source programming (fully documented) ?
– what; you won’t release it ? then GO AWAY and don’t come back until you can provide the basic science behind your claims.
We should spend a considerable amount of time educating our peers & government officials in what this all means & why it is so important.
How dare they claim to be doing science when they fail to meet the most basic premise behind the Scientific Method ?
Another question – based on the stats :
the daily average temperature can NOT be understood by supplying Min/Max values as an illustration of the underlying tendency of the temperature over 24 hours. This is because the ‘signal’ is definitely not sinusoidal; and even if it were, Nyquist requires AT LEAST two samples; taking only two samples runs the major risk of aliasing the data.
I doubt you could get a decent reconstruction of the ‘signal’ even with four samples; since the actual ‘signal’ is a very skew sine (skew in time) modulated by all sorts of chaotic influences.
Consider a cloudy temperate summer’s day; the sky clears about 13h00 and gives a peak temp before clouding over at 14h00. The sky clears again at sunset, causing a protracted depression of the temperature overnight; the temperature spends most of the night hovering around the Min. Clearly the temperature will be lower for a considerably greater period than it is higher than the simple average created by just taking the Max/Min values.
– example running (10 minute sample rate) average temperature over the last 24hours at my house (UHI) 4.4C;
Max 6.4 Min 2.9
but the Min/Max average = 4.6 (rounding down to 1 sig place) – and that’s on an overcast winter’s day/night…
error is 0.2C (4.5%)
So the so called daily ‘average’ ((Min + max)/2) gives a false measure. And that false measure is then used to generate the monthly average (sum of daily average divided by days in the month).
Am I missing something ? Or is this GIGO ?
And that’s before we get into the whole adjustment and falsification of ‘missing’ data – sorry I believe that’s called ‘reconstruction’ – but if there was no data; how can you make it up – err reconstruct it ?
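The (Min+Max)/2 point can be sketched numerically in Python, using an invented skewed 24-hour profile (the numbers are purely illustrative):

```python
# Hypothetical 24-hour profile (hourly samples, deg C): a brief afternoon
# peak, then a long cool night, skewed like the cloudy-day example above.
temps = [3.0]*13 + [4.0, 6.4, 5.0] + [3.5]*3 + [2.9]*5  # 24 hourly values

true_mean = sum(temps) / len(temps)          # time-weighted daily mean
minmax_mean = (min(temps) + max(temps)) / 2  # the usual (Min+Max)/2

print(round(true_mean, 2), round(minmax_mean, 2))  # 3.31 4.65
```

Here the Min/Max midpoint overstates the time-weighted mean by more than a degree, simply because the warm spell is brief.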

Dave F
December 27, 2009 11:45 am

Leif Svalgaard (10:24:18) :
I think Motl’s explanation was acceptable. If I read it correctly, he is saying that he was trying to see how far back one had to go to find a statistically significant trend.
If you missed it, it was here:
Luboš Motl (01:24:45) :
Sounded like Motl just went as far back as he could in an attempt to see how long it would take to find statistically significant warming.

Alexej Buergin
December 27, 2009 11:59 am

” Eve (14:08:45) :
My heating fuel usage shows that there has been cooling since 1997, which is the oldest data I have. I will show fuel usage in Litres per year. I do not heat the house when it is warm therefore each year after 1997 must have been cooler. The furnace has a scheduled maintenance each Nov, the same two people live in the same house and the thermostat settings have not changed.
1997-2767.20 Litres
1998-3057.50 Litres
1999-4009.30 Litres
2000-3874.70 Litres
2001-3586.70 Litres
2002-3752.20 Litres
2003-3634.50 Litres
2004-4072.50 Litres
2005-3293.50 Litres
2006-4276.70 Litres
2007-3700 Litres
2008-4476.20 Litres”
The Swiss Homeowners association has published “Heating-Degree-Days” since 1981. I used the numbers for December/January (in Zurich) and got a very slight positive linear trend (+1 HDD/year for 62 days). That would indicate it got a little bit colder.
According to the official numbers Zurich-winters are 1°C warmer today than in 1980. These are homogenized data, of course.

December 27, 2009 12:35 pm

Dave F (11:45:32) :
I think Motl’s explanation was acceptable. If I read it correctly, he is saying that he was trying to see how far back one had to go to find a statistically significant trend.
If that were the case, one could quantify that [Motl didn’t] in this way: going x years back produces a trend with significance S; vary x from 1 through N [where N is perhaps 30, as is normally assumed to define ‘climate’] and plot S as a function of x to show that x = 15 was a ‘good’ choice.
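That suggestion is easy to sketch in Python using the 15 annual UAH values from the post above (a proper version would extend the series back ~30 years; slope ± 2 standard errors is used here as a rough 95% test):

```python
# Annual UAH anomalies 1995-2009 (deg C), as listed in the post above.
y_all = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.20, 0.31,
         0.28, 0.19, 0.34, 0.26, 0.28, 0.05, 0.26]
years_all = list(range(1995, 2010))

def trend_and_se(x, y):
    """OLS slope and its standard error (deg C per year)."""
    n = len(x)
    xm, ym = sum(x)/n, sum(y)/n
    sxx = sum((xi - xm)**2 for xi in x)
    b = sum((xi - xm)*(yi - ym) for xi, yi in zip(x, y)) / sxx
    resid = [yi - ym - b*(xi - xm) for xi, yi in zip(x, y)]
    se = (sum(r*r for r in resid)/(n - 2))**0.5 / sxx**0.5
    return b, se

# Significance of the trend as a function of how far back the window reaches.
for w in range(5, 16):
    b, se = trend_and_se(years_all[-w:], y_all[-w:])
    verdict = "significant" if abs(b) > 2*se else "not significant"
    print(f"{w:2d} yr window: {100*b:+.2f} deg C/century, {verdict}")
```

For the full 15-year window this reproduces the post’s ~0.95 °C/century slope, with a standard error of the same order, i.e. not significant.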

kadaka
December 27, 2009 12:40 pm

@ Bob Tisdale (17:23:49) :
Thanks for the replies.
What’s worse than not knowing the answers?
Not searching for them.

Pamela Gray
December 27, 2009 12:44 pm

I like the idea of “going back far enough to find a significant trend”, either positive or negative. This is a great way of finding significant temperature trends and then discovering whether oceanic trends correlate. My hypothesis is that significant temperature trends (of course one would need to define what makes a temperature trend significant) are nearly always correlated (at some correlation coefficient) with significant oceanic trends. I believe this has been done.

Editor
December 27, 2009 2:03 pm

To: kdkd
quoting crosspatch:
[ Any linkage of climate warming and CO2 is pretty much a guess and nothing more. ]
“No, this is incorrect. We predict that co2 will cause warming from scientific information derived from both classical mechanics and quantum theory. We can then do a range of lab experiments, and planetary observations to see whether the theory fits the data. The correspondence is reasonably good. There’s a bunch of things in the rest of your post that are plain incorrect as well (plenty of warming since the 1930s etc).”
No, this is incorrect
“You” are predicting (man-caused) Global Warming based on:
1) assuming no technological or engineering advances for 400 years in energy production or conversion or efficiency. (No fusion, no nuclear power production changes, no fast-flux or breeder reactors, no pebble bed or recycled waste used as fuel, no change in Uranium, thorium, or plutonium sources, no slow breeder development, no change in nuclear waste disposal methods (that is, continuing our present problem (er, solution) of using (er, ignoring or exaggerating) the issue to stop further nuclear power production, no additional hydro or pumped storage application, ….)
2) Assuming all CO2 production is from man-released causes, then accelerating that assumed increase and extrapolating that accelerated increase for 400 years into the future.
3) Ignoring increased plant life and productivity from the increased CO2 in the atmosphere, but then predicting increased famine because (somehow ????) more plants will die from the assumed global warming caused assumed heat increase and assumed lack of water while also ignoring the increased growing season from this assumed global warming “threat”
4) Assuming the assumed increase in CO2 concentration will cause an increased atmospheric heat retention, and then multiplying that assumed effect by 10 (sometimes 15) times arbitrarily – because otherwise there is no effect of CO2 on atmospheric heat retention.
5) THEN assuming that EVERYTHING ELSE in our current climate remains exactly the same by allowing NO feedbacks: assuming that the increase in temperature has no other effect on water vapor, clouds, net flows of air in the upper, middle or lower atmosphere, the PDO, the AMO, polar winds, net reflectivity, heat conditions or humidity, air flow from the jet streams, etc. (This is because the current crude finite-element GCMs you are exclusively relying on for 400-year-long predictions can’t model day or night, winters and summers, jet streams, or any region smaller than a single good-sized US state or average European country.)
6) Then you take these simplified extrapolations and merely multiply the effects by factors of hundreds and thousands: if temperatures increase (by 1.5 degrees) then the polar ice caps and all of Greenland will melt (even though average temperatures are colder than -30 degrees!) somehow the ice will melt, or a 1.5 degree increase in temperatures will kill 30,000 species (even though an actual 0.5 degree temperature increase has killed exactly …. NONE. Somehow, a 1.0 degree change will kill 29,999 more species across the world.)
You predict droughts and floods and hurricanes – with no evidence and no rational justification of ANY of your numbers.
That sir, is what you are predicting. Somehow.
You predict the climate 100, 200, and 400 years into the future, but fail at the first 15 years by multitudes.
And with the deliberate aid of the propagandist (er, press) you are only allowing one side to speak.

kdkd
December 27, 2009 2:12 pm

Solent:
The theory of chemical bonds is based on quantum theory. It’s rather well understood, and the basis of much of modern civilisation’s infrastructure. Increased co2 levels failing to warm the atmosphere (once short term variability has been accounted for) would be a major blow against this theory. However, there is no evidence for this.
RACookPE1978:
Classic “sceptic” technique of taking a bunch of information, taking bits out of context then extrapolating into areas totally irrelevant to the original premise, and a gish gallop of statements with scientifically dubious provenance.

Editor
December 27, 2009 2:14 pm

But Pamela, that is part of the problem:
These “linear” predictions are dead wrong. All of them. From any point back in time.
You have got to begin (as noted in part above) with a gut-level understanding of this world’s climate as a cycle: a series of summed sine and cosine waves of various amplitudes ADDING to each other sometimes and SUBTRACTING from each other at other times.
A volcano throws a single wave (a trough actually) into the mix of from 1/2 year to 1-1/4 years long.
An El Nino throws a single spike into the mix: 3/4 year to 1-1/2 years long.
The AMO/PDO winds add two separate 30 year cycles on top of three sunspot cycles of about 11 years each to a 65 year solar oscillation (the 1870’s peak, a dip in 1905-1910, the 1935-40 peak, the dip in the early 70’s, and back up to today’s peak).
All of these short term cycles are on top of an independent (?) 800-900 year Optimum Period Cycle. (Roman Warming Period-Medieval Warming Period-Modern Warming Period)
Every one of these pushes the temp’s up or down by a little bit.
Drawing any single line from any two points in the overall curve is simply dead wrong. Today’s temps have leveled off from a local 60 year peak.
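That hazard is easy to demonstrate with a toy Python sketch (assuming, purely for illustration, a single pure 10-year sine of amplitude 0.5 °C, the same shape as the series posted further down this thread): a line fitted to the rising half shows a strong “warming trend”, the falling half its mirror image, while the long-run mean never changes.

```python
import math

# Toy series: one pure 10-year cycle, amplitude 0.5 deg C, monthly samples.
t = [1907 + i/12 for i in range(132)]                  # 1907.0 .. 1917.92
y = [0.5*math.sin(2*math.pi*(ti - 1910)/10) for ti in t]

def ols_slope(x, y):
    n = len(x)
    xm, ym = sum(x)/n, sum(y)/n
    sxy = sum((xi - xm)*(yi - ym) for xi, yi in zip(x, y))
    sxx = sum((xi - xm)**2 for xi in x)
    return sxy/sxx

def window(lo, hi):
    pts = [(ti, yi) for ti, yi in zip(t, y) if lo <= ti <= hi]
    return ols_slope([p[0] for p in pts], [p[1] for p in pts])

rising  = window(1907.5, 1912.5)   # trough to peak: strong "warming"
falling = window(1912.5, 1917.5)   # peak to trough: equal "cooling"
print(round(rising*100, 1), round(falling*100, 1))  # deg C per century
```

The rising half fits to roughly +24 °C/century; the choice of start date, not the series, produces the “trend”.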

Editor
December 27, 2009 2:15 pm

RACookPE1978:
“Classic “sceptic” technique of taking a bunch of information, taking bits out of context then extrapolating into areas totally irrelevant to the original premise, and a gish gallop of statements with scientifically dubious provenance.”
Show me that any of those statements are wrong.

kdkd
December 27, 2009 2:15 pm

RACookPE1978:
You’ve committed the classic error of assuming that all the feedback mechanisms will go in the direction that you expect them to (i.e. error of wishful thinking). Based on no evidence we’d have to expect that feedback mechanisms will be neutral through random chance. However, the available evidence and underlying theory suggests that there are quite a few positive feedback mechanisms starting to operate.

Richard M
December 27, 2009 2:20 pm

kdkd (14:12:15), “Classic “sceptic” technique of taking a bunch of information, taking bits out of context then extrapolating into areas totally irrelevant to the original premise, and a gish gallop of statements with scientifically dubious provenance.”
Yet not a single fact to dispute any of the statements. In other words, a classic “warmist” argument.

Dave F
December 27, 2009 2:24 pm

kdkd (14:12:15) :
How much will the temperature go up if there is an additional 34ppm of CO2 added to the atmosphere, under the same climatic conditions as this year, in the next year?

Carlos GRANT
December 27, 2009 2:38 pm

Since July of this year we have had “El Niño” again. Here in Buenos Aires it is raining every two or three days and our summer is very cool. The same happened in 1998.
That is perhaps the reason why the global temperatures have increased during the last five months. When “El Niño” is over (April or May) we will be able to see the global temperatures without its effect.

Basil
Editor
December 27, 2009 2:50 pm

Richard M (05:56:25) :
For those of you who think monthly anomaly data is a valid unit … well, why not use daily data? Why not use hours, minutes, nanoseconds?
Clearly I could show any trend I wanted by making the units small enough and I could get great statistical verification.

Richard, we’re on the same side here (I think, e.g. skeptical of CO2 induced AGW) so I don’t want to see this turn argumentative. But there is no magic to any given unit of time for empirical analysis. You attempted to justify the use of annual vs. monthly, and I found the reasoning unpersuasive. Now you come back with “well, why not use daily data?”
Indeed, why not? There may well be some — many — research questions where daily data is just the ticket. Several immediately come to mind. It is not hard to imagine them.
It is not about “making the units small enough” to “get great statistical verification.” It is about using units that are appropriate for the task. For measuring temperature trends, I prefer monthly over annual because I think it depicts the full range of natural climate variation better than annual data does. And contrary to what several have been saying (some thought that monthly data would invalidate Luboš’s point, and I showed that it didn’t), it isn’t guaranteed that using monthly data will result in a better statistical result. Yes, monthly data increases the number of observations, and that’s a good thing that will improve the goodness of fit. But it also introduces a lot more volatility, and that makes it harder to find a statistically significant trend. If the monthly volatility is high enough, this could have the opposite effect of what some expect.
But the main point of my post is to emphasize that there is no magic unit of analysis for time series analysis. I think in most cases, monthly is preferable. At least for measuring temperature changes. But as I said, I can think of some cases where using daily data would be preferable. In other cases, it would be appropriate to use just portions of a year (say winter months, or a “heating season”, typically November through May, in utility usage analysis). Do not put all your eggs in the “annual data is best, and anything else must have a nefarious purpose” basket.
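The “more observations isn’t automatically better” caveat can be made concrete with a small deterministic Python sketch (it assumes, purely for illustration, independent monthly noise; real monthly anomalies are autocorrelated, which further complicates naive significance tests):

```python
import math

# Standard error of an OLS slope with independent noise: se = sigma/sqrt(Sxx).
def slope_se(times, sigma):
    tm = sum(times)/len(times)
    sxx = sum((t - tm)**2 for t in times)
    return sigma / math.sqrt(sxx)

years = 30
sigma_monthly = 0.2                       # assumed per-month noise, deg C
monthly_t = [i/12 for i in range(12*years)]
annual_t  = [i + 0.5 for i in range(years)]

se_monthly = slope_se(monthly_t, sigma_monthly)
# Annual means of 12 independent months have noise sigma/sqrt(12).
se_annual  = slope_se(annual_t, sigma_monthly/math.sqrt(12))
print(se_monthly, se_annual)  # nearly identical
```

Twelve times as many points, yet essentially the same slope uncertainty: annual averaging removes noise at the same rate that the monthly sample adds observations. Which unit wins in practice therefore depends on the noise structure, which is exactly the point above.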

tfp formerly bill
December 27, 2009 2:57 pm

Here’s an interesting plot from UAH – the globe-averaged monthly stratospheric temperatures:
All stratospheric results
http://img51.imageshack.us/img51/1592/uahstratospherictempsal.png
Just the global
http://img39.imageshack.us/img39/6902/uahstratospherictempsgl.png
Now, if the global data is not an artifact of the system, then there seems to be a dump of heat in 1983 and 1992.
pre 1983 temperature is basically flat
between 1983 and 1992 temperature is basically flat but 0.5 deg less than pre 1983.
after 1992 temperature is basically flat but 0.5 deg less than between 1983 and 1992.
What is this heat dump?
Answers on a postcard.

Editor
December 27, 2009 3:00 pm

@Luboš Motl (01:24:45) :
Luboš,
To me, one of the really fascinating statistical oddities in the UAH data is that the entire “warming trend” occurs between 1995 and 2000.
There’s no warming trend before January 1995 and no warming trend after June 2000…
UAH Lower Trop

Steve Oregon
December 27, 2009 3:01 pm

How does this guy get this bad?
Ross Gelbspan’s video on climate change and the fossil-fuel-funded disinformation campaign
http://link.brightcove.com/services/player/bcpid51061328001?bctid=52599643001

Editor
December 27, 2009 3:21 pm

kdkd
“Based on no evidence we’d have to expect that feedback mechanisms will be neutral through random chance. However, the available evidence and underlying theory suggests that there are quite a few positive feedback mechanisms starting to operate.”
Er, uhm, ….. No.
See, the only way Hansen and his ilk can get their global warming models to work is by back-adjusting their outputs from 1970 through 1998, ADDING in a fudge factor by artificially assigning a variable “soot level index” to the air (matching, globally, the (local) European and United States-led cleanups) and then arbitrarily extending this by assuming that India/China/Brazil/Mexico/South Africa will change their soot index values (somehow) in the future.
And by multiplying the (assumed) effect of CO2 on the atmosphere by a factor of ten (to account for CO2 increasing water vapor’s greenhouse effect), though there is no evidence that this will actually happen; though this effect is not in proportion to any physical amounts of CO2, water vapor, or clouds actually present; and though this (assumed) extra amount of water vapor is assumed to have no other effect on the world, like increasing clouds or upper-atmosphere absorption/reflection of heat/light/cosmic rays/etc.
You see, ALL of the assumed feedbacks are assumptions on your part (by Hansen and his cronies) to make the final effect what they want.
Your statement on GCM positive and negative calculation feedbacks is completely contradicted by what your people are actually doing with their calculations.
—…—
Further, there are NO observations of ANY type to suggest that ANY (assumed) positive AGW feedbacks of ANY types have been observed. And there have been several direct observations that negative feedbacks HAVE been observed (particularly in cloud reflectivity), and many calculations that show that negative AGW feedbacks SHOULD be used in any climate study.
Again, that part of your statement is completely false.

December 27, 2009 3:34 pm

TonyB note that Helm Glacier has lost at least 30% of its volume in the last 25 years, that is impressive. As you point out the glacier was advancing in the late 1960’s, so this is not just a continuation of a long term retreat, but response to recent warming. Also note that its mass balance history looks just like all of the other glaciers reported to the World Glacier Monitoring Service. We began reporting these before this hoopla began too.
http://glacierchange.wordpress.com/2009/12/19/helm-glacier-melting-away/

Neil Crafter
December 27, 2009 3:52 pm

RACookPE1978 (14:03:14)
Beautifully put questions. The proponents of the armageddon that AGW is supposed to present to the world have never, to my mind, answered these questions. Why is warming (if it is occurring) such a bad thing? As for “kdkd”, his/her insulting tone, saying things like “we predict” and “classic sceptic argument”, is to be expected, but I note no actual answers provided to RACookPE1978’s questions.

Duncan
December 27, 2009 4:20 pm

Running 13 month average somehow stops 7 months ago.
Might want to check the end-point handling on that, and/or change the caption.

michael
December 27, 2009 4:32 pm

If you take out the 1998 spike (that is, go to the Wood for Trees web site and first plot the data from 1978 to 1997) you get a graph whose least-squares linear fit has a slope of 0.0339 degrees/decade. If you then plot the data from 2000 to 2009, you get a trend line with a slope of 0.0389 degrees/decade. Obviously the spike centered on 1998 really changes the slope of a least-squares fit through the whole record. It would seem to me that if you treat that spike as a spurious event, then there is no real change in slope from 1978 to 2009, even though CO2 has been increasing the whole time.
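The leverage of a single spike is easy to quantify with a toy Python series (hypothetical numbers: a perfectly flat 1979-2009 record plus one +0.5 °C year at 1998):

```python
# Hypothetical flat anomaly record with a single one-year spike, to show
# how much a lone spike tilts a least-squares trend line.
years  = list(range(1979, 2010))                       # 31 annual values
flat   = [0.0]*len(years)
spiked = [0.5 if yr == 1998 else 0.0 for yr in years]

def ols_slope(x, y):
    n = len(x)
    xm, ym = sum(x)/n, sum(y)/n
    sxy = sum((xi - xm)*(yi - ym) for xi, yi in zip(x, y))
    sxx = sum((xi - xm)**2 for xi in x)
    return sxy/sxx

print(ols_slope(years, flat)*100)    # 0.0 deg C/century
print(ols_slope(years, spiked)*100)  # ~0.08 deg C/century from the spike alone
```

One warm year tilts an otherwise flat 31-year record by about 0.08 °C/century; a spike nearer either end of the record would tilt it more.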

May
December 27, 2009 4:50 pm

Er… I’m not a statistician (or a theoretical physicist) but I found this post quite compelling:
http://www.realclimate.org/index.php/archives/2009/10/a-warming-pause/
As it turns out, climatologists use better data and explain why 🙂

December 27, 2009 5:01 pm

Your understanding of what a confidence interval tells us is flawed. Don’t worry though. Most people, even those who run a lot of very advanced models using very advanced software do not understand the concept.

Only the 72% confidence interval for the slope touches zero. It means that the probability that the underlying slope is negative equals 1/2 of the rest, i.e. a substantial 14%.

That is not a valid interpretation of a confidence interval. The “true” slope of the line is just a number. As such, it is either negative or not. There is no “probability that the underlying slope is negative”.
You choose the appropriate confidence level before you run the regressions.
Confidence intervals with historical, rather than randomly selected, observations are hard to interpret.
Normally, having X% confidence means that for X% of randomly selected samples, the confidence interval calculated using this method would include the true slope.
The X% confidence is NOT the probability that the interval includes the “true” slope.
As before, the “true” slope is just a number (even though you do not know it). Therefore, it is either in the interval you calculated, or it is not.
Everything above should be interpreted within the confines of classical statistics. Bayesian stuff is slightly more interesting.
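The repeated-sampling interpretation can be checked by simulation, e.g. this Python sketch with invented numbers (true slope 0.01, Gaussian noise, n = 15 as in the post; 2.16 is the two-sided 95% Student-t multiplier for 13 degrees of freedom):

```python
import random

# Monte Carlo check of the frequentist reading of a confidence interval:
# with a known true slope, the interval slope +/- t*SE should contain
# that true slope in about 95% of repeated samples.
random.seed(0)
TRUE_SLOPE, NOISE_SD, N, TRIALS, T_CRIT = 0.01, 0.15, 15, 2000, 2.16

def fit(x, y):
    n = len(x)
    xm, ym = sum(x)/n, sum(y)/n
    sxx = sum((xi - xm)**2 for xi in x)
    b = sum((xi - xm)*(yi - ym) for xi, yi in zip(x, y)) / sxx
    resid = [yi - ym - b*(xi - xm) for xi, yi in zip(x, y)]
    se = (sum(r*r for r in resid)/(n - 2)/sxx)**0.5
    return b, se

x = list(range(N))
hits = 0
for _ in range(TRIALS):
    y = [TRUE_SLOPE*xi + random.gauss(0, NOISE_SD) for xi in x]
    b, se = fit(x, y)
    if b - T_CRIT*se <= TRUE_SLOPE <= b + T_CRIT*se:
        hits += 1
coverage = hits/TRIALS
print(coverage)  # close to 0.95
```

Across many re-drawn samples, about 95% of the intervals cover the fixed true slope; for any single computed interval, the slope is simply in it or not, which is the distinction drawn above.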

December 27, 2009 5:15 pm

Well, I’m no scientist, but people manipulating statistics to further their own agenda is not news to me. I’ve wondered for many years about global warming, all the dire predictions, and the alarmists coming out of the woodwork. Not surprised some manipulation of data has turned up there.

Dave F
December 27, 2009 5:25 pm

Basil (14:50:15) :
Aren’t you, at some point, just embedding the problems with averaging temperature with a simplistic calculation in the data?
(Tmin+Tmax)/2 is not the average temperature. It assumes that all temperatures exist for the same amount of time over the day. If Tmin and Tmax occurred for the same amount of time, and all the points in between did too, then that would be the case, but it almost never is. If the temperature is 40F for seven hours, 60F for four hours, and 35F for the last thirteen, do you say the average temperature of the day was (35+60)/2 = 47.5F? I would say it is ~41F = ((40*7) + (60*4) + (35*13))/24. Is there a problem with this?
Now, I realize that there are only two samples taken, but I can’t imagine it would be a great problem to collect a sample every hour. It would require more storage space, but it would be far more accurate.

Basil
Editor
December 27, 2009 6:03 pm

Duncan (16:20:36) :
Running 13 month average somehow stops 7 months ago.
Might want to check the end-point handling on that, and/or change the caption.

It is a centered moving average. It has to stop 7 months before the end (leaving 6 months blank at the end, both ends, actually).

Editor
December 27, 2009 6:03 pm

mspelto (15:34:20) :
TonyB note that Helm Glacier has lost at least 30% of its volume in the last 25 years, that is impressive. As you point out the glacier was advancing in the late 1960’s, so this is not just a continuation of a long term retreat, but response to recent warming. Also note that its mass balance history looks just like all of the other glaciers reported to the World Glacier Monitoring Service. We began reporting these before this hoopla began too.
—…—…
OK. Let us assume that glacier retreat is a “symptom” of rising global temperatures.
1) What is the actual, measured increase in (both global AND local) temperatures between (any) two dates where an AGW “expert” is claiming that “glaciers are retreating”?
2) Demonstrate conclusively – and exclusively! – that “The specific measured temperature increase would result in that much glacier ice melting over that period of time.”
This has never been done: for example, if you want to claim that “glacier retreat proves that global warming is real”, then you must show that “a half of one degree temperature increase (at the measured face of the glacier) will result in 4 million tons of glacier ice melting from the face and bottom of the glacier in 20 years.”
Further, you (the AGW-proponent) must show that glacier melt worldwide STOPPED in 2000: because temperatures clearly have not risen since the year 2000, and because you claim that ALL glacier melting/retreat is due to global warming (temperature increase), there can be NO glacier retreat anywhere since the year 2000.
If glaciers worldwide have continued to retreat since the year 2000 (or 1995, actually), then glacier advance and retreat must not be solely due to global warming – at the time of the retreat/advance, at least.
Now, obviously, over a long period of time and many degrees of change in temperature, glacier retreat IS directly related to overall global temperatures.
But the entire AGW premise is based on a 25 year change in temperatures (of less than 1/2 of one degree) that is claimed to be caused by man’s release of carbon dioxide.
If glacier retreat/advance is NOT related to short-term (30 year cycle) global warming of less than 1/2 degree, then glacier retreat cannot be used as a “proof” of man-caused global warming.
Global warming of greater than 30 years (the long-term warming that IS naturally caused and that CANNOT be controlled or affected by man) does occur but CANNOT be caused by mankind.

Basil
Editor
December 27, 2009 6:06 pm

Dave F (17:25:06) :
Well, you are raising quite a different set of questions now. Relevant and interesting, but really more of a tangent to the original discussion.

December 27, 2009 7:04 pm

Glaciers are the easiest things in the world to cherry pick, because there are about 160,000 of them world wide. Just pick the ones that are receding, then start arm-waving and pontificating about “climate change.” It’s an easy way to scare the kids.
In general, most glaciers are receding, as they have been since the LIA. But not all. And it has nothing to do with CO2, which is the central pillar of the climate catastrophe crowd.
The alarmists can’t even get their dates straight: click. Being several centuries off casts doubt on their little remaining credibility.
Some glaciers are advancing. If CO2 was the cause of glacier retreat, all glaciers would react to it.
The basic fact of the matter is that the raw global temperature record has been so manipulated, scrubbed, fabricated, tweaked, invented/filled in and massaged, that it can not be relied on to show anything conclusively.
The only honest course of action now is to start over with well sited, reliable, regularly calibrated instruments, maintained around the globe. But honesty is very low on the government’s list of priorities. And of course ‘UN honesty’ is an oxymoron.

kdkd
December 27, 2009 7:16 pm

RACookPE1978:
Here’s your classic skeptic technique of avoiding the stuff you don’t like, and making disproportionate claims about the stuff that you do like. Personally I trust the IPCC’s work as it’s based on a wide-ranging process of consensus that receives scientific then political review; its only flaw, as far as I can see, is that it tends to be more conservative than might be prudent because of the need for scientific and political consensus. I certainly don’t buy the conspiracy theories popular in these pages.
Anyway the IPCC covers a range of forcing mechanisms, but doesn’t look at feedback mechanisms that may arise from the increased forcing caused by co2.
Neil Crafter: Your tone is the insulting one here I’m afraid. I’m merely trying to use measured scientific language – modelling the kind of behaviour that I’d like to see from the “sceptic” community. Mostly I see the sceptic community as an interesting study in the group psychology of [snip].

NickB.
December 27, 2009 7:28 pm

Michael,
You might as well link to Al Gore’s new book as a reference – and if you think GISS and CRU are better measures than the satellite data, just think about it this way…
Imagine only being able to see a somewhat random subset of 1/100th, maybe even 1/1000th, of the pixels on your computer monitor, and then recreating/modeling the rest of your screen based on those few pixels. That is what the instrument surface temperature records are essentially doing. The satellite records are more akin to taking a digital snapshot.
The reason RC doesn’t reference any of the satellite data in this article is that their instrument-based reconstructions still show warming where the satellite data from RSS and UAH do not. The combination of found warming and use of only their own reconstruction (Gavin is more or less in charge of GISS) as the de facto standard really is quite convenient, or quite suspicious… it’s all in how you look at it, I guess.

December 27, 2009 7:32 pm

kdkd,
You can ‘trust’ the IPCC all you want. But it’s like trusting a snake to not bite you. Look at this chart: click
Literally dozens of peer reviewed studies show that CO2 persistence is very short; ten years or less. But the 100% political appointees who run the IPCC don’t like CO2’s short persistence time, because it completely debunks their CO2=AGW conjecture.
To avoid that problem, the UN arbitrarily set the CO2 residence time at 100 years. And you say you trust the IPCC over all those peer reviewed studies. Why would you?

Editor
December 27, 2009 7:34 pm

May (16:50:02) :
Er… I’m not a statistician (or a theoretical physicist) but I found this post quite compelling:
http://www.realclimate.org/index.php/archives/2009/10/a-warming-pause/
As it turns out, climatologists use better data and explain why 🙂

If I gave the statistical climatologists at RC this series of numbers, they would find a warming trend…
1907 -0.475528
1907.08 -0.482963
1907.17 -0.489074
[… 192 monthly values in all, omitted here for brevity: a pure sine wave of amplitude 0.5 and 10-year period, running from 1907 through 1922.92 …]
1922.83 0.489074
1922.92 0.482963
#Data ends
#Number of samples: 192
#Mean: -0.00247671
WFT
Unfortunately, the number series is a harmonic function – a sine wave. It has no underlying trend.
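[For anyone who wants to check this, here is an illustrative sketch in Python, regenerating the series from its sine formula rather than re-pasting the 192 values. Ordinary least squares happily reports a positive "trend" from a pure oscillation with zero underlying trend, because the sample window covers 1.6 cycles rather than a whole number of them:]

```python
import math

# Illustration of the point above: ordinary least squares fitted to a pure
# sine wave still reports a "trend" when the sample window does not cover
# a whole number of cycles. The series above is 192 monthly samples of a
# 10-year sine of amplitude 0.5, starting in 1907.
n = 192
x = [1907 + i / 12.0 for i in range(n)]
y = [0.5 * math.sin(2 * math.pi * (xi - 1910) / 10.0) for xi in x]

x_mean = sum(x) / n
y_mean = sum(y) / n
sxx = sum((xi - x_mean) ** 2 for xi in x)
sxy = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
slope = sxy / sxx  # degrees per year

print(f"mean = {y_mean:.5f}, OLS 'trend' = {slope * 100:+.2f} deg/century")
```

[The window spans 1.6 cycles, so the fit reports roughly +1.9 degrees per century of "warming" from a series with no underlying trend at all.]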
Here’s the Earth’s climate over the last 2,000 years…
Moberg & UAH
If I gave a statistician the raw numbers behind the graph above, they would say that there is neither warming nor cooling… because the linear regression is flat.
The Earth is always warming or cooling. It warmed from 1980-1942… Cooled from 1943-1976… Warmed from 1977-2003… It’s been cooling since 2003.
The episodes of warming and cooling occur along multiple cycles of varying frequency and amplitude…
UAH Spectral Decomp
All of the warming during the satellite record occurred in one 63-month period…
UAH Jan 1995
Temperatures were flat before and after that 63-month period…
UAH Dec 1978
UAH Apr 2000

Editor
December 27, 2009 7:37 pm

Correction to: David Middleton (19:34:32)
“1980-1942” should be “1908-1942″…

The Earth is always warming or cooling. It warmed from 1908-1942… Cooled from 1943-1976… Warmed from 1977-2003… It’s been cooling since 2003.

Editor
December 27, 2009 8:02 pm

kdkd:
“Personally I trust the IPCC’s work as it’s based on a wide-ranging process of consensus that receives scientific then political review – its only flaws as far as I can see are that it tends to be more conservative than might be prudent because of the need for scientific and political consensus. I certainly don’t buy the conspiracy theories popular in these pages.”
—…—
No.
You are dead wrong.
The IPCC is based on self-serving politicians from dictator-ruled second- and third-world countries who are working (successfully) to take money from the US and Western Europe and spend it on their own countries’ “projects”.
The “science review” has proved to be dead wrong. The “science” has never been shown to be “right”, and all parts of it have been individually proved to be (if not corrupt or dead wrong) exaggerated, controversial, or legitimately disputed. There is no single piece of the supposed AGW theory that can stand – individually or collectively – against debate. None. No part of it can be defended absolutely as “true data.”
And now we find the AGW “scientific community” cannot even provide the raw data they claim they are using. Their computer programs have no inspection or quality controls. Their data storage has proved to be paper (thrown away), tapes, and PC printouts – with the source code destroyed, backups never made or made on obsolete systems and themselves destroyed. The source code for basic programs is laughable – a sad reflection of amateurish methods and sloppy programming in obsolete languages, for programs that are neither commented nor verifiable.
The “political review” has been shown to be one person re-writing the “Summary for Policymakers” to DELIBERATELY create the current programs of economic destruction and taxation wanted by the UN’s politicians.
To do this, they are spending 79 billion of US taxpayer money to fund those “52 scientists” who re-wrote it (deliberately ignoring and abusing the 33,000 other scientists and climate specialists who disagree with their UN-funded and US-funded theories). As the leaked programs prove, 26 of these 52 “scientists” conspired (despite your denial!) to actually corrupt their data and pollute the UN’s process – all for the POWER, MONEY, and INFLUENCE they can get by supporting their AGW propaganda.
“Conservative” predictions? Even Hansen’s ten-year predictions are dead wrong! What’s “conservative” about re-writing political papers in isolation to exaggerate effects 400 years into the future? What’s conservative about not getting even ONE prediction correct within a factor of 2 of the real world?
What? Do you think Al Gore and the UN actually “earned’ their Nobel Peace Prize for their roles in studying the supposed “science” behind AGW theories?
Follow the political power that comes from controlling AGW policies.
Follow the economic power that comes from controlling AGW policies.
Follow the power that comes from influencing the world’s energy supplies.
Follow the money that funds corruption:
If you claim that “Big Oil” is behind any skeptic’s position, or you claim that “Big Oil” has corrupted the science of global warming, then you are only proving that money HAS corrupted the liars and propagandists who are promoting AGW for their own political and economic benefit.
I am abusing your grade-school inspired “trust” in the UN.
But a UN commission that spent two years spending money on a multi-billion-dollar international conference in Copenhagen for 45,000 accredited delegates, but only reserved room for 15,000 to get inside, is too stupid to plan its way out of a snowdrift when a shovel is handed to it.
The UN is corrupt, stupid, and is run to benefit dictators and governments who take their country’s money and spend it on their own mansions and their bank accounts. (Which, by the way, is what Gore and the UN-appointed head of the IPCC do too.)

savethesharks
December 27, 2009 8:03 pm

kdkd: “Here’s your classic skeptic technique of avoiding the stuff you don’t like, and making disproportionate claims about the stuff that you do like.”
[Textbook Projection, folks].
Are you, kdkd, not describing your OWN religion with this statement??
And I’ll take the term “skeptic” as a compliment…even if you don’t mean it as one.
Why??? Because it lays out the basic BASIC skeptical mode of science….
….as opposed to the dogma of the AGW faith.
Chris
Norfolk, VA, USA

savethesharks
December 27, 2009 8:07 pm

kdkd: “Personally I trust the IPCC’s work as it’s based on a wide ranging process of consensus that receives scientific then political review.”
You *TRUST* the IPCC’s work?
Sounds a bit more like faith.
And then….”scientific then POLITICAL review.”?
You cannot be serious here?
“Trust”…..”political” review???
On which planet do you live??
Chris
Norfolk, VA, USA

Editor
December 27, 2009 8:07 pm

kdkd:
“Here’s your classic skeptic technique of avoiding the stuff you don’t like, and making disproportionate claims about the stuff that you do like.”
—…—
Back up your claim: What part of the “stuff that I don’t like” am I avoiding?
Be specific. I want to know what I am avoiding (in your eyes) so I can address it.
[Since I know I have deliberately NOT avoided “anything”, you must know something I don’t know I am avoiding.]

Evan Jones
Editor
December 27, 2009 8:12 pm

kdkd:
The IPCC will not “show its work” when requested. That alone puts it out of court.
The one most important thing we can take away from climategate is that all methods and data must be provided on request. What is shocking is that this lesson should have been necessary in the first place.

savethesharks
December 27, 2009 8:23 pm

RACookPE1978 (20:07:29) writing to kdkd: “Back up your claim: What part of the “stuff that I don’t like” am I avoiding?
Be specific. I want to know what I am avoiding (in your eyes) so I can address it.”

[Chuckle] That is what happens when you take on an exacting engineer, kdkd.
I could have warned you….but you would have never listened anyway.
Slice slice slice.
Chris
Norfolk, VA, USA

kdkd
December 27, 2009 8:40 pm

RACookPE1978:
Well we’re at a fundamental ideological impasse. The IPCC are a diverse international group, and there are multiple lines of evidence from various sources that show the overwhelmingly most likely source of the post-industrial global warming is greenhouse gas emissions. The only sceptic arguments I’ve seen fiddle around the edges, and ignore a variety of well understood scientific theory to provide a rather incoherent argument often backed by rather strange conspiracy theories.
The fundamental problem with your argument is that you’re overstating the importance of clouds, and making unjustified long term extrapolations, while conveniently discounting the soot feedback, and ignoring the decrease of the arctic albedo which is a clear long term positive feedback.
As far as the models are concerned they have predicted arctic amplification of warming, as long as 30 years ago. This is not predicted by models where greenhouse gasses are not the forcing drivers.
Anyway, I’m gone now. My last comment is that the title of this post is incorrect. This is not a mathematical proof; it’s an incomplete argument derived from statistical reasoning. If the statistical argument were completed, it would not reach the same conclusion. Like I said, I find the sceptics interesting from the point of view of the group psychology of denial, but little else.

savethesharks
December 27, 2009 9:03 pm

kdkd: “The IPCC are a diverse international group, and there are multiple lines of evidence from various sources that show the overwhelming likeliest source of the post-industrial global warming is greenhouse gas emissions.”
Show forth the “multiple lines of evidence.”
Where are they? Show some direct observations that aren’t simply climate model extrapolations.
You won’t because you CAN’T.
And that is the reason you are fleeing this argument….
Chris
Norfolk, VA, USA

savethesharks
December 27, 2009 9:12 pm

“As far as the models are concerned they have predicted arctic amplification of warming, as long as 30 years ago. This is not predicted by models where greenhouse gasses are not the forcing drivers.”
HUH?? What code language are you speaking here? Speak clearly, please.
“Anyway, I’m gone now.”
If you can’t stand the heat…..
“Like I said, I find the sceptics interesting from the point of view of the group psychology of denial, but little else.”
And we find you less than interesting, except from the point of view of the group psychology of GROUPTHINK and cognitive dissonance.
You are fleeing because you are afraid of what you might have to recant if you learned the truth….
Cognitive dissonance at its best. The question is….will you ever man up and recognize your error?
The truth hurts.
But it also sets you free…
Chris
Norfolk, VA, USA

kdkd
December 27, 2009 9:33 pm

savethesharks:
The climate models have shown for a long time that early greenhouse gas forced global warming will result in amplified warming in the arctic. It’s very powerful evidence that we have a serious problem, and the repeated challenges by sceptics have been unable to undermine this evidence.
Anyway, I’m out of here because I’m off on holiday and don’t have further time to waste on this. An impartial view of the data, and of the challenges to the science, shows the sceptic position severely wanting. But there’s quite a lot of this, and it’s not trivial to explain. Sitting in the sceptic echo chamber is an amusing brief diversion, but the scientific credibility of the arguments in this thread is generally pretty poor, with the most common three points being strange conspiracy theory, overstatement of weak conclusions to try to magnify the sceptic case in a way the evidence doesn’t warrant, and finally irrelevant or scientifically wrong statements that have no basis in fact.

Editor
December 27, 2009 10:07 pm

kdkd:
“Anyway, I’m out of here because I’m off on holiday and don’t have further time to waste on this. An impartial view of the data, and of the challenges to the science, shows the sceptic position severely wanting. But there’s quite a lot of this, and it’s not trivial to explain. Sitting in the sceptic echo chamber is an amusing brief diversion, but the scientific credibility of the arguments in this thread is generally pretty poor, with the most common three points being strange conspiracy theory, overstatement of weak conclusions to try to magnify the sceptic case in a way the evidence doesn’t warrant, and finally irrelevant or scientifically wrong statements that have no basis in fact.”
—…—
A “waste” of time?
When your AGW propaganda is going to destroy the world’s economy for ….. nothing? When tens of millions have already been killed by false enviro theories, you consider energy and food and power and transportation for the world’s desperate poor a “waste of time”??? I pity your lack of morals.
Again: Give me specific proof of any of your generic (general) statements. I’m working 80+ hour weeks trying to improve real power plant efficiencies to deliver real power to real people so real people don’t freeze in the dark and starve. I’ve been receiving environmental, efficiency, and scientific awards since 1972.
There is no part of the physics, chemistry, chemical bonding, gas theories, heat exchange theory, nuclear bonding, cosmic ray and cloud interactions (yes, I’ve seen clouds produced by nuclear radiation – measured it too), radiative and conductive heat transfer, finite element analysis, computer programming, computer simulations, winds, steam, or political, economic, and statistical analysis that I have not formally studied and applied and used. Don’t use your deliberately vague words – I know you are throwing press-release propaganda, because the basic words you are repeating are false.
They are lies, in other words. If you believe them based on your faith in the UN and the world’s greedy socialist/liberals/environmental theists, and you repeat them, then you are a dupe.
If you are a scientist, then you can’t believe your own words and you are a simple propagandist.
I really don’t care about your holiday – only about your biased “faith” in a fruitless exaggeration of extrapolations from politicians and “scientists” who have produced no evidence in the past – nor can they produce any evidence now – that supports any of your conclusions.
EVERY statement you have made is no more true than a generic press release. NONE can be traced back to facts.

savethesharks
December 27, 2009 10:10 pm

kdkd: “The climate models have shown for a long time that early greenhouse gas forced global warming will result in amplified warming in the arctic.”
Ahhh…the climate models.
Show the actual evidence, not the models, kdkd. Can you?
Cite it please, before you “flee” on your holiday.
“An impartial view of the data, and the challenges to the science show the skeptic position severely wanting.”
What impartial view?? Yours??? Hardly impartial, bro.
“Sitting in the sceptic echo chamber is an amusing brief diversion, but the scientific credibility of the arguments in this thread is generally pretty poor, with the most common three points being strange conspiracy theory, overstatement of weak conclusions to try to magnify the sceptic case in a way the evidence doesn’t warrant, and finally irrelevant or scientifically wrong statements that have no basis in fact.”
Uh huh….THIS folks….is sophistry at its best. Attempting to weave a clever argument….devoid of substance. A true smokescreen.
What is your agenda, kdkd???
Is AGW your personal religion, and is that why you feel you need to defend it?
Anyways….the burden of proof is on YOU. If you espouse a fantastic theory, then you have to prove it.
Thanks for the entertainment…..
Enjoy your holiday.
[Or I thought you were GONE….a couple of posts ago because it was a waste of your time….]
Chris
Norfolk, VA, USA

Editor
December 27, 2009 10:17 pm

kdkd:
savethesharks:
“The climate models have shown for a long time that early greenhouse gas forced global warming will result in amplified warming in the arctic. It’s very powerful evidence that we have a serious problem, and the repeated challenges by sceptics have been unable to undermine this evidence.”
no. Again, no.
The only reason the Arctic is expected to show the effects of AGW theories is that it is so cold (on average) that the water vapor is very low (low absolute humidity, in other words).
With low water vapor, CO2 makes up a larger share of the total greenhouse gases, and thus the theoretical effect of the 0.039-odd percent of CO2 in the air is expected to be higher than in the mid-latitudes, where water vapor makes up an even larger share of the total greenhouse gases.
But the ONLY parts of the Arctic that indicate rising temperatures are those parts of the former USSR where, we now find (through formal Russian complaints to the IPCC, GISS, and HadCRUT offices), warming was created by selectively eliminating MOST of the Russian weather stations to deliberately create hot spots in the Arctic. By the way, it was your “trusted” AGW experts and scientists who were corrupting the data from Russia.
If you claim melted ice across the Arctic is a “positive feedback” for AGW, then I challenge you to show me why your vaunted theoretical hand-waving (about ice melting, by politicians who failed divinity school) yielded HIGHER ice extent in 2008 and 2009 than in 2007. If low ice is to create a positive feedback, somebody forgot to tell nature, because the ice coverage in April and May 2009 set all-time highs for coverage (when using the same instruments from the same platform – do you want me to go into how previous ice calculations weren’t based on the same data elements?).

Evan Jones
Editor
December 27, 2009 10:19 pm

The problem with measuring sea level is that many land masses are subsiding and many are uplifting. There is also sedimentation and erosion. None of this is sea level “rise” or “fall”, per se.
This makes things very hard to judge. Especially when we are talking in terms of a couple of mm per year! (I wouldn’t be surprised if the margin of error alone were a lot larger than that.)

savethesharks
December 27, 2009 10:31 pm

RACookPE1978 (22:07:27)
WELL said, mate. Way to sock it to ’em as I completely agree with your assessment.
And thanks for those 80 plus hours per week. Much appreciated.
Where….just WHERE would we be without our engineers??
In the f****** stone age, that’s where.
Cheers.
Chris
Norfolk, VA, USA

Bulldust
December 27, 2009 10:35 pm

No point engaging with kdkd… as proudly linked from his “site”:
http://blogs.crikey.com.au/rooted/2009/04/08/climate-change-cage-match-a-fight-to-the-death/
Preferred combat appears to be referring to opposing views as delusional and psychotic. Of course, at a site like Crikey this is acceptable discourse. Don’t feed the troll.

kdkd
December 27, 2009 11:30 pm

RACookPE1978
No. In the absence of water vapour we’d expect less warming as it’s a powerful (equilibrium) greenhouse gas. And the satellite observations also confirm the arctic amplification as well. You’re too swayed by the conspiracy theories.
Bulldust: No surprise, I don’t respect the sceptic argument. The current post is a good example of overextended conclusions and poor use of scientific and statistical techniques. I understand here that if I lose patience, I get deleted. Over there there’s more vigorous (and entertaining) conversation.
Now really gone.
[Reply: kdkd has never had a post deleted from WUWT. ~dbs, mod.]

Norm in Calgary
December 28, 2009 12:03 am

“We can only say that it is ‘somewhat more likely than not’ that the underlying trend in 1995-2009 was a warming trend rather than a cooling trend.”
Unfortunately, the AGW’ers also know this to be true, but that hasn’t stopped them, has it?
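[For anyone who wants to check the head post’s 1-sigma result without Mathematica, here is a minimal sketch in plain Python, using the UAH figures listed above. Note that the residual-based (textbook) standard-error formula gives ≈0.84 °C/century, slightly below the 0.88 quoted above, which used deviations from the mean rather than residuals; either way the t-statistic of ≈1.1 falls well short of the ≈2.16 needed for 95% significance with 13 degrees of freedom:]

```python
import math

# Least-squares slope of the UAH 1995-2009 annual anomalies (from the head
# post), plus the residual-based standard error of the slope.
years = list(range(1995, 2010))
y = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.20, 0.31, 0.28,
     0.19, 0.34, 0.26, 0.28, 0.05, 0.26]
n = len(y)

x_mean = sum(years) / n
y_mean = sum(y) / n
sxx = sum((x - x_mean) ** 2 for x in years)
sxy = sum((x - x_mean) * (yi - y_mean) for x, yi in zip(years, y))
slope = sxy / sxx  # deg C per year

# Standard error of the slope, computed from the regression residuals
residuals = [yi - (y_mean + slope * (x - x_mean)) for x, yi in zip(years, y)]
sse = sum(r * r for r in residuals)
se_slope = math.sqrt(sse / (n - 2)) / math.sqrt(sxx)

t_stat = slope / se_slope
print(f"slope = {slope * 100:.2f} C/century, "
      f"s.e. = {se_slope * 100:.2f} C/century, t = {t_stat:.2f}")
```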

savethesharks
December 28, 2009 12:14 am

Bulldust (22:35:56) :
Right. Agreed.
kdkd: “Now really gone.”
Good.
Ironically, you walked out in a huffyhuff about three posts ago saying you would not be coming back….yet you keep coming back.
Thanks for finally vocalizing your lack of respect for the “sceptics’ argument”.
Because of that lack of respect, you have really disqualified yourself from debating man to man here.
Go back “over there” (I am assuming you mean the Peoples Republic of RC) where you probably will feel more at home.
Chris
Norfolk, VA, USA

Bulldust
December 28, 2009 3:15 am

I tried having civilised discourse at Crikey’s Pure Poison site when they discussed the “obvious” warming trend in the above graph a few weeks ago. Needless to say all my rational arguments were gunned down by people with a similar slant and approach to kdkd. I gave up trying to have any meaningful debate there. Every now and then I toss a good story their way when it is obvious that AGW has completely missed the mark.
Interestingly I was making the same gut-feel arguments based upon my statistics background over there as here. It is pretty obvious from the more recent temperature data that neither a positive nor a negative trend is significant for the last 10 years or so. The Air Vent analysis linked above vindicates that gut feel perfectly.
But people see what they want to see…

Bulldust
December 28, 2009 4:32 am

I almost feel I should apologise to kdkd now… it appears that when he assesses people as being delusional or psychotic based upon their blog postings, he is qualified to make such statements by virtue of his background:
http://www.uow.edu.au/~kd21/
He was a research neuropsychologist ya know.

December 28, 2009 6:02 am

Ric and others,
I’ve been meaning to add confidence intervals to WFT for a while, but it’s rather tricky to do this in bare-metal code – for one, IIRC it seems I need to calculate an inverse t-function. I have the mathematical description of the process, but anyone who can point me at any code (in any low-level language) for doing this simply given a basic time-series array, I’d be most grateful…
At least adding the SD of the data and R^2 of the trend to the textual data output should be fairly easy – I’m aiming to do an update over the holiday, so watch this space…
Merry Christmas and Happy New Year!
Paul
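[For what it’s worth, here is one dependency-free route to the inverse t-function Paul asks about, sketched in Python for readability but built entirely from elementary operations that port directly to low-level code: integrate the Student-t density numerically to get the CDF, then invert by bisection. It is numerically crude next to the usual inverse-incomplete-beta approach, but self-contained and hard to get wrong:]

```python
import math

def t_pdf(x, df):
    # Student-t density, via log-gamma for numerical stability
    c = (math.lgamma((df + 1) / 2.0) - math.lgamma(df / 2.0)
         - 0.5 * math.log(df * math.pi))
    return math.exp(c - ((df + 1) / 2.0) * math.log(1.0 + x * x / df))

def t_cdf(x, df, steps=2000):
    # CDF by trapezoidal integration from 0 to |x|, using symmetry about 0
    if x < 0:
        return 1.0 - t_cdf(-x, df, steps)
    if x == 0:
        return 0.5
    h = x / steps
    area = 0.5 * (t_pdf(0.0, df) + t_pdf(x, df))
    for i in range(1, steps):
        area += t_pdf(i * h, df)
    return 0.5 + area * h

def t_inv(p, df):
    # Invert the CDF by bisection -- slow but dependency-free
    lo, hi = -50.0, 50.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if t_cdf(mid, df) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

[For WFT-scale use, Hill’s Algorithm 396 or an inverse-incomplete-beta routine would be much faster. As a check, `t_inv(0.975, 13)` gives the two-sided 95% critical value (≈2.16) for a 15-point regression like the one in the head post.]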

Leone
December 28, 2009 6:12 am

If Pinatubo is removed from UAH with correction values +0.5, +0.4, +0.3, +0.2, +0.1 beginning from year 1992, the trend 1990-2009 is +0.04.
So there actually is no warming during the past 20 years, which is what one would expect given solar activity.
Currently NH land areas are cooling remarkably fast: if the trend of the past 4 years continues for about 8 more years, NH land is back to 1970s temperatures. It is the oceans which are still keeping global temperatures at a relatively high level, but after a few years it is expected that the oceans will also begin to cool faster.

Roger Knights
December 28, 2009 9:53 am

Bulldust wrote:
It is pretty obvious from the more recent temperature data that neither a positive nor a negative trend is significant for the last 10 years or so.

Make that nine years and you’ll deprive the warmists of the comeback that you’ve included a low year (2000) that lowers your average level. Looking at the bar chart (the 2nd one), it’s obvious that temperatures have plateaued for the past nine years. That may not yet be statistically significant, but it’s surely not what the warmists would have predicted or bet on nine years ago. That should lower our confidence in their current predictions.
This modest claim (instead of saying “it’s cooling”) stands up better in the long run, and thus makes more of an impression on bystanders.
=========

KDKD wrote:
“Sitting in the sceptic echo chamber is an amusing brief diversion, but the scientific credibility of the arguments in this thread are generally pretty poor,”

How can he tell what this place is like from a visit to one thread? I’ve noticed several other drive-by warmists make similar sweeping and condescending dismissals after only a brief (single-thread) apparent acquaintance with the site. Further, their judgments that our side is wrong, etc., appear to be based on seeing claims made here that contradict what they’ve read elsewhere, or that they believe have been refuted elsewhere – without trying to get deeper into the guts of the matter, or being self-critical of their own baggage.

Roger Knights
December 28, 2009 9:55 am

Oops — insert “with the exception of the down-year in 2008” after “it’s obvious that temperatures have plateuaed for the past nine years.”

Roger Knights
December 28, 2009 9:56 am

2nd oops — insert “wrong” after “their judgments that our side is “

tfp formerly bill
December 28, 2009 10:54 am

RACookPE1978 (22:07:27) :
When your AGW propaganda is going to destroy the world’s economy for ….. nothing? When tens of millions have already been killed by false enviro theories, you consider energy and food and power and transporation for the world’s deperate poor a “waste of time” ??? I pity your lack of morals.

You have evidence for your comments that you would care to share?
destroy the world’s economy ???
tens of millions have already been killed by false enviro theories???
Again: Give me specific proof of any of your generic (general) statements. I’m working 80+ hour weeks trying to improve real power plant efficiencies to deliver real power to realpeople so real people don’t freeze in the dark and starve.
I would be interested in this (seriously!) as I thought the only significant improvement used in power generation was CCGTs.
I assume you have proof you can offer for your “there is no (or insignificant) AGW” statements?
I … care about your biased “faith” in … “scientists” who have produced no evidence in the past – nor can they produce any evidence now – that supports any of your conclusions.
So you say the scientists are all part of a massive conspiracy to do (something?). Again I challenge you to produce your evidence.
Remember you may be asked to produce this in court should one of these scientists you have defamed go in that direction.
savethesharks (22:10:46) :
…Ahhh…the climate models. …
What is your agenda, kdkd???
Is AGW your personal religion, and is that why you feel you need to defend it?
Anyways….the burden of proof is on YOU. You espouse a fantastic theory, then you have to prove it.

STS The same can be said of your stance as a AGW denier (you certainly are not a skeptic).
If I say that the temperature is increasing by 10C per decade, you have denied yourselves (and me) the tools to prove that the statement is wrong/right:
Ice cap -its cyclical
Glaciers – it’s cyclical and cherry picked
GISS CRU HADCRUT GHCN etc temps – they are invalid
Tree rings – they are invalid, cherry-picked, with conclusions drawn up by fraudsters (not my words!!)
Ice-off days – natural cycles
The weather – well it’s just weather not climate.
Sea levels – invalid
Sea temperatures – invalid
etc.
Ah! you say – there is the satellite record since 1979. However you conveniently forget that the temperatures are derived from mathematical models. So that falls to scepticism too.
Now prove to me that there is no 10C or 0.1C per decade warming.
In your view, all climate scientists who suggest that AGW is possible are liars with an agenda, all part of some world-domination conspiracy. The sceptical scientists are, in your view, whiter than white.
RACookPE1978 (22:17:51) :
With low water vapor, the CO2 concentration is higher, and thus the theoretical effect of 0.003 some-odd percent of CO2 in the air is expected to be higher than in the mid-latitudes, where water vapor is even higher a percent of total greenhouse gases.
CO2 concentration of dry air is constant over the globe:
http://img175.imageshack.us/img175/9698/manyco219992001.jpg
but the ONLY parts of the Arctic that indicate rising temperatures are those parts of the former USSR which we now find (through the Soviet Union’s formal complaints to the IPCC and GISS and Hadcrut offices,
I think you may find that the complaints had nothing to do with the Russian Met office, but were from a right-wing “think tank”.
I challenge you to show me why your vaunted (theoretical hand-waving about ice melting by politicians who failed divinity school) yielded HIGHER ice regions in 2008 and 2009 than 2007.
I believe that no one believes 2007 to be a “normal” year. It was an outlier on a steady downward trend.
If low ice is to create a positive feedback, somebody forgot to tell nature. Because the ice coverage in April and May 2009 set all-time highs for coverage. (When using the same instruments form the same platform – do you want me to go into how previous ice calculations weren’t based on same data elements?
Oh dear – cherry picking or what!!!!!!!!!
From JAXA site the AMSRE record
http://www.ijis.iarc.uaf.edu/en/home/seaice_extent.htm
http://img137.imageshack.us/img137/6582/amsreseaiceextent.png
If cherry picking is now valid, then I would choose 13th November, 12th December, and 23rd December as dates to prove that ice coverage is less than in any other year (including 2007) in the AMSR-E record!
Choosing anything but minimum ice extent in the NH and maximum ice extent in the SH is very dubious. Arctic ice maximum is limited by being unable to freeze land, and Antarctic ice minimum is limited by not being able to unfreeze land.

Joel Shore
December 28, 2009 11:09 am

Smokey says:

Literally dozens of peer reviewed studies show that CO2 persistence is very short; ten years or less. But the 100% political appointees who run the IPCC don’t like CO2’s short persistence time, because it completely debunks their CO2=AGW conjecture.
To avoid that problem, the UN arbitrarily set the CO2 residence time at 100 years. And you say you trust the IPCC over all those peer reviewed studies. Why would you?

Complete and pernicious nonsense. The chart you show compares apples to oranges. While the exchange between the atmosphere and the upper ocean is relatively fast, that is not what controls the decay of most of an amount of CO2 released into the atmosphere (from a source outside the biosphere / atmosphere / upper oceans). Instead, the major rate-limiting step is the transfer of that excess CO2 into the deeper ocean. As a result, the decay of an excess amount of CO2 is highly non-exponential, which means it can’t be properly modeled with one single decay time (and, in fact, the carbon cycle models that the IPCC considers don’t contain only a single decay time, despite what Lawrence Solomon says). 100 years is a reasonable middle-of-the-road estimate if one is forced to give a single time. But the fact is that such a number will underestimate the decay in excess CO2 at short times and considerably overestimate the decay at long times. See here for a discussion of the actual science http://www.realclimate.org/index.php/archives/2005/03/how-long-will-global-warming-last/ or read Archer’s book “The Long Thaw” or any introductory climate book, such as L.D. Danny Harvey’s “Global Warming: The Hard Science”, for details.
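A multi-timescale decay of this sort can be sketched numerically. The weights and time constants below are invented for illustration (they are not the IPCC's or Archer's fitted values); the point is only that no single exponential time constant matches a curve like this:

```python
# Illustrative sketch: decay of an excess CO2 pulse modeled as a sum of
# exponentials with several timescales. The (weight, tau) pairs and the
# "permanent" slice are made-up example values, not actual carbon-cycle fits.
import math

def excess_co2_fraction(t_years, terms=((0.2, 2.0), (0.3, 20.0), (0.3, 150.0)),
                        permanent=0.2):
    """Fraction of an initial CO2 pulse remaining after t years.

    terms: (weight, tau) pairs; 'permanent' is the slice that only leaves
    via very slow (multi-millennial) processes, treated as constant here.
    """
    return permanent + sum(w * math.exp(-t_years / tau) for w, tau in terms)

# Fast early decay, slow late decay - a single decay time can't match both:
for t in (1, 10, 100, 1000):
    print(t, round(excess_co2_fraction(t), 3))
```

Early on this curve falls faster than a single 100-year exponential would predict, and at long times it falls far more slowly, which is the apples-to-oranges point being made above.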

John Sims
December 28, 2009 11:13 am

It’s about time youse guys following this blog who have significant statistical and climate science skills came up with a “correct” or “best” way to analyse the available global temperature record. Hey, then you could build a consensus (on the way that the science *should* be done). Btw, you can have several “best” ways, as long as the differences between them are clearly explained and justified.
I have to count myself out – on the stats, I would have been up to it forty or so years ago, but now…. As to climate science, I’m still using seaweed (I’m a believer in the notion that while climate and weather are different, the former has some implications for the latter – and probably vice-versa given the nature of the climate system – a complex non-deterministic system if ever there was one).

Editor
December 28, 2009 12:53 pm

tfp formerly bill (10:54:18) :
“Again: Give me specific proof of any of your generic (general) statements. I’m working 80+ hour weeks trying to improve real power plant efficiencies to deliver real power to real people so real people don’t freeze in the dark and starve.
I would be interested in this (seriously!) as I thought the only significant improvement used in power generation was CCGTs.”
—…—…—
Combined cycle plants are good; clean coal and coal gasification with secondary heat recovery are even better, because you recover more of the heat from tars and residue than you can in conventional coal burners. But that would involve burning “evil” coal, and so the enviros prohibited building the plant we designed a few years ago.
Rewiring a generator can raise its output 1.5–2%. On a 1200 MW nuclear or fossil plant, that’s a lot of new “free” energy: more than an entire wind farm, for example. And the output is available 92 percent (service factor) of the time, versus a wind farm’s average output availability of 21% to 23%. Changing transformers can gain another half of one percent in efficiency. Too often, though, the “evil” trace PCBs prevent transformer replacement – again, we could improve efficiency, but enviro extremism prevents it.
Changing blade design can add 3–4 percent gains. It’s more expensive, and the utilities don’t get money for improving efficiency (in fact they have to PAY to improve efficiency, and pay a LOT for the new enviro reviews and paperwork); only the enviros get green money for their projects and for delaying and preventing new energy projects.
Changing pipes in a cooling system, changing and cleaning oil and steam heat exchangers, replacing gland seals, steam reducers, and steam traps, and fixing insulation can all help. (But that’s illegal too in many areas because of disposal costs. Again, enviro rules prevent real improvements.)
Replacing burners and heat-treatment ovens in an aluminum forge plant helps as well. But treating that aluminum (or steel) in China or India (at the UN IPCC leader’s steel mills and refineries, for example) means using older, less effective burners with very, very dirty, life-threatening emissions. Guess the enviros don’t mind killing Chinese coal miners – at an average rate of 6 per day, 365 days a year.
Changing fuel and gas mixes can be very effective. Fixing simple leaks? But again, that costs money – money that is instead wasted on enviro research and “green projects” in universities. Did I mention that the enviros are in it for THEIR money?
Solar? Only about 6 hours per day of useful output in spring and fall, and perhaps 4-1/2 per day in winter – IF the sun is shining. If you have clouds/fog/snow/rain/drizzle, you get nothing from your millions of wasted money except clouds/fog/snow/rain/drizzle for the farmers. (Which isn’t bad of course, but they can’t farm, because you took their flat land for the solar collectors, and few things can grow under the shade of a solar collector.)
It’s better in the summer (from a Northern Hemisphere point of view): some solar collectors can actually get almost 7 hours of productive power a day. The rest of the time you need a regular plant. Again, nothing if there are clouds, rain, or fog.
So you end up with a muddy parking lot full of dead plants under your solar collectors. Or you can pave the farmland with rocks and asphalt. I’ve seen that many times as well. At least under a wind farm the cows can live on the sunshine-fed grass, and they don’t mind the noise and dead birds.
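The generator-rewiring arithmetic earlier in this comment (1.5–2% of a 1200 MW plant, weighted by availability, versus a wind farm) can be checked directly. The plant size, gain, and availability figures are the ones stated above; the 50 MW wind-farm nameplate capacity is a made-up comparison figure:

```python
# Effective (availability-weighted) output gained by rewiring a generator,
# using the figures quoted in the comment above. The 50 MW wind-farm size
# is a hypothetical example for comparison, not a number from the text.
plant_mw = 1200.0
rewire_gain = 0.015          # 1.5% output gain (low end of the quoted range)
plant_service_factor = 0.92  # plant available 92% of the time
wind_farm_mw = 50.0          # hypothetical nameplate capacity
wind_availability = 0.22     # midpoint of the quoted 21-23% range

gain_effective = plant_mw * rewire_gain * plant_service_factor
wind_effective = wind_farm_mw * wind_availability
print(round(gain_effective, 1), "MW vs", round(wind_effective, 1), "MW")
```

Under these assumptions the rewiring gain (about 16.6 MW effective) exceeds the hypothetical wind farm's effective output (about 11 MW), which is the comparison the comment is making.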

Yes, we are told that CO2 levels are constant worldwide – at least after factoring in the seasonal up-and-down change in the hemispheres as plants grow in the spring and summer, then die off and release CO2 in the fall and winter. (But I thought we were told by the IPCC that CO2 has a hundred-year half-life, so how come it really changes significantly every six months? Funny, the half-life experiments “I” ran never behaved that way when “I” ran them….)
But in the Arctic, lower water vapor plus a constant CO2 means the relative percentage of CO2 in the total water-vapor-dominated greenhouse gas mix rises. Therefore, the mid-troposphere temperature in the Arctic “should be” rising earlier than elsewhere on the planet according to AGW theory, since the relative concentration of CO2 is higher there.
But it isn’t. And ground-level temperatures have been steady since they began measuring them in the mid-1970s.

tfp
December 28, 2009 3:02 pm

As to the CO2 breathing in an annual cycle.
http://img175.imageshack.us/img175/9698/manyco219992001.jpg
1. The maximum breath is taken in the north. The further south you go, the smaller the dip. Only the South Pole station shows a southern dip six months in anti-phase with the NH.
2. It is not dissolution in the ocean, as that would be greater in winter, not summer, as is shown in my plot.
3. Sary Taukum is as far from the sea as is possible and shows a similar dip, although the fall is slower and the rise faster than at Pt Barrow.
4. It is not plant growth, as the autumn die-back in winter would be slow and unlikely to complete before the freeze.
Is it plant respiration – O2 to CO2 at night, CO2 to growth during daylight? In the polar regions, long nights = high CO2.
Is it summer rain absorbing CO2 (snow will not absorb as much?)?
What is obvious is that the amount of CO2 absorbed during June to August is, as near as can be eyeballed, returned in August to January.
There is an overall effect on atmospheric CO2 of zero over the 8 months from June to January, i.e. there is no effect on the half-life in terms of ppm.
All the dip shows is that there is some mechanism that could remove 12 years of CO2 increase in a matter of 3 months (if you could stop it being released again!).
————
I will ignore your jaundiced view of environmental protection, a rather essential part of life, which is a sad reflection on your supposed care for humanity.

December 28, 2009 4:35 pm

Smokey says the glaciers can be cherry picked. This is true, but there are three problems with this as applied to the key data sets of the World Glacier Monitoring Service.
1) Most of the long-term glaciers in the mass balance data set were selected during the IGY in the late 1950s – not cherry-picked after climate change became an issue.
http://www.geo.unizh.ch/wgms/mbb/mbb10/sum07.html
2) The terminus retreat records from places such as the North Cascades, Switzerland, Austria and Norway were begun long before we could have thought to cherry-pick them. In the North Cascades, all 47 glaciers we examine were retreating in 2009, but when we began in 1984 only 32 were.
http://www.nichols.edu/departments/Glacier/Bill.htm
http://glacierchange.wordpress.com/2009/11/17/hinman-glacier-disappers/
In Switzerland, of course, 2003 was the first year of the century with no glacier advancing; these were not cherry-picked.
http://glaciology.ethz.ch/messnetz/lengthvariation.html
3) Glacier termini respond to longer-term climate change; temperature is not all that matters, and depending on their size there is a lag. However, the smaller glaciers, such as those in the Alps or North Cascades, follow the temperature trends very nicely, indicating that they do reflect temperature – note the Swiss record linked above. This is why they are used as a climate indicator; they do not need an urban heat island adjustment either.

December 28, 2009 5:11 pm

mspelto:

“In Switzerland, of course, 2003 was the first year of the century with no glacier advancing; these were not cherry-picked.”

Despite the denial, here is a good lesson in how the alarmist crowd cherry-picks glaciers. Note that mspelto carefully selected 2003, implying that all glacier advances have stopped [“in the century” – which covers only the past 9 years].
Why try to make a natural event into an alarming-sounding crisis? Looking at mspelto’s Swiss link, the cyclic rise and fall of glaciers as a function of warming from the LIA and changes in precipitation at higher altitudes is evident.
And picking 2003 as an indicative year is surely cherry-picking. Why? Because 2008, for example, had five advancing glaciers: click. Maybe glaciers are beginning to advance again, no?
You can go to any year in that link and count the advancing glaciers [note that many glaciers are not monitored]. Also keep in mind that Switzerland is a very small country. Most of the planet’s glaciers are on the Antarctic continent, which has been cooling for the past fifty years.
When the warmists come up with empirical evidence connecting atmospheric CO2 with glacier retreat, I will sit up straight and pay attention. Until then, pictures of retreating glaciers are propaganda, just like pictures of polar bears.

Sharpshooter
December 28, 2009 6:01 pm

We’re setting records for cold but the Temp Anomaly is up 0.5C?
Am I missing something?

Jason S
December 28, 2009 6:04 pm

The AMS, in November 1980, says:
“Evidence has been presented and discussed to show a cooling trend over the Northern Hemisphere since around 1940, amounting to over 0.5°C, due primarily to cooling at mid- and high latitudes – Bulletin of the American Meteorological Society – November 1980”
So North America has now come back 0.5°C. Hey, we’re right back where we started.

Editor
December 28, 2009 6:25 pm

To address Rachel Carson’s “legacy” of death:
See:
Rachel Carson’s Ecological Genocide By: Lisa Makson
FrontPageMagazine.com | Thursday, July 31, 2003
A pandemic is slaughtering millions, mostly children and pregnant women – one child every 15 seconds; 3 million people annually; and over 100 million people since 1972 – but there are no protestors clogging the streets or media stories about this tragedy. These deaths can be laid at the doorstep of author Rachel Carson. Her 1962 bestselling book Silent Spring detailed the alleged “dangers” of the pesticide DDT, which had practically eliminated malaria. Within ten years, the environmentalist movement had convinced the powers that be to outlaw DDT. Denied the use of this cheap, safe and effective pesticide, millions of people – mostly poor Africans – have died due to the environmentalist dogma propounded by Carson’s book. Her coterie of admirers at the U.N. and environmental groups such as Greenpeace, the Sierra Club, the World Wildlife Fund and the Environmental Defense Fund have managed to bring malaria and typhus back to sub-Saharan Africa with a vengeance.

Carson = 100 million dead worldwide. Her facts were wrong, by the way. The EPA ruling against DDT (in the US) was a political decision, by a politician who subsequently provided for, was promoted by, and was funded by the environmental groups he ruled in favor of – against the prevailing science and the scientific data. Then as now, the enviro groupthink does not permit facts to get in the way of their agenda.
Mao = 63 million Chinese killed
Stalin = 23 million killed
That leaves the rest of the world’s evil socialist/communist rulers far behind.

Editor
December 28, 2009 6:32 pm

To show that glaciers are retreating due to global warming – you must show that the increase in temperature of 1/2 of one degree since 1970 caused the retreat: That is, how many million tons of ice are melted in a glacier by a 1/2 degree rise in temperature, if the average temperature of the ice at that elevation is xxx degrees?
It is the AGW crowd who are using localized glacier retreat to “prove” there is global warming everywhere… but they have never established that the only temperature change that has actually happened is enough to cause the effect we are supposedly seeing.
At no location anywhere have temps gone higher than 1/2 of one degree. The glaciers in Europe that are retreating are exposing buildings, icemen, trees, and grassy areas that – by definition! – have been exposed to the air before.
So why should we believe that “man-caused” global warming did not happen before at even higher temperatures?

December 28, 2009 6:47 pm

Joel Shore (11:09:13):

“100 years is a reasonable middle-of-the-road estimate if one is forced to give a single time.”

Wrong again, but what’s new about that?
Here’s Prof Freeman Dyson, referring to the Keeling Mauna Loa chart and explaining exactly why a century-long CO2 residence time assumption is wrong:

The only plausible explanation of the annual wiggle and its variation with latitude is that it is due to the seasonal growth and decay of annual vegetation, especially deciduous forests, in temperate latitudes north and south. The asymmetry of the wiggle between north and south is caused by the fact that the Northern Hemisphere has most of the land area and most of the deciduous forests. The wiggle is giving us a direct measurement of the quantity of carbon that is absorbed from the atmosphere each summer north and south by growing vegetation, and returned each winter to the atmosphere by dying and decaying vegetation.
…vegetation in the Northern Hemisphere summer absorbs about 4 percent of the total carbon dioxide in the high-latitude atmosphere each year. The total absorption must be larger than the net growth, because the vegetation continues to respire during the summer, and the net growth is equal to total absorption minus respiration. The tropical forests at low latitudes are also absorbing and respiring a large quantity of carbon dioxide, which does not vary much with the season and does not contribute much to the annual wiggle.
When we put together the evidence from the wiggles and the distribution of vegetation over the earth, it turns out that about 8 percent of the carbon dioxide in the atmosphere is absorbed by vegetation and returned to the atmosphere every year. This means that the average lifetime of a molecule of carbon dioxide in the atmosphere, before it is captured by vegetation and afterward released, is about twelve years. [source]
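Dyson's "about twelve years" is just the reciprocal of the annual uptake fraction he quotes; a one-line check (the 8 percent figure is his, the code merely reproduces the division):

```python
# Dyson's estimate: if vegetation absorbs and returns ~8% of atmospheric CO2
# each year, the mean time before a given molecule is captured is ~1 / 0.08.
annual_uptake_fraction = 0.08
mean_lifetime_years = 1.0 / annual_uptake_fraction
print(mean_lifetime_years)  # 12.5 - "about twelve years"
```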

As always, the question comes down to who we should believe. In this case, is it Prof Freeman Dyson? Or Joel Shore?
There are other papers showing that a relatively short residence time for CO2 means a low climate sensitivity number, which in turn means there is nothing alarming about an increase in atmospheric CO2. Its climatic effects are so insignificant that any changes in CO2 can be safely disregarded — as planet Earth is clearly showing us.
Finally, enough of the endless realclimate promotions. [Link to a book through Amazon if it’s supposedly that great.] I understand that realclimate needs all the help it can get. The problem is that the people running realclimate are the same crooked clique revealed by the CRU emails, and like their petty tyrant pals at Wikipedia, they heavily censor scientifically skeptical comments debunking CO2=AGW — and they do it on government [taxpayer-paid] time. How does RC’s constant censorship of opposing views fit in with the 1st Amendment? Or is the Constitution just too old timey for these rent-seeking scientists on the grant gravy train?
Why do alarmist sites censor skeptical comments? Because in a free marketplace of ideas, like WUWT, their repeatedly falsified conjectures – like their preposterous, evidence-challenged belief that a very minor trace gas controls the entire Earth’s climate – are routinely destroyed in a free debate. Joel Shore couldn’t ever get a comment posted at realclimate, if the tables were turned and he was a skeptic. Here, he is free to post. [But Joel isn’t getting any converts, so the marketplace of ideas is obviously favoring the skeptics’ reasoning; people regularly comment that when they looked closely into the AGW claims, they’ve ended up accepting the scientific skeptics’ arguments.]
So enough of the unmerited free promotion of the odious realclimate crew. Until they consistently allow opposing views – and put WUWT on their blogroll, like Anthony does for them – they lack credibility.

December 28, 2009 8:11 pm

Joel Shore (11:09:13) :
As a result, the decay of an excess amount of CO2 is highly non-exponential, which means it can’t be properly modeled with one single decay time (and, in fact, the carbon cycle models that the IPCC considers don’t contain only a single decay time, despite what Lawrence Solomon says). 100 years is a reasonable middle-of-the-road estimate if one is forced to give a single time. But the fact is that such a number will underestimate the decay in excess CO2 at short times and considerably overestimate the decay at long times.
—————————–
Since every carbon atom on the planet has most likely been in the atmosphere at some point, why don’t you propose a decay time in the billions of years, Joel?
What counts for alleged AGW, is the decay time above 280 ppm – you know, that legendary number when the climate was perfect. Young politicians reveling in their paradise of swimming in the sea in summer, auburn and red autumn/fall leaves dropping during the scheduled week, and snowball fights in winter.
So what’s that number Joel, besides single digit years ??
Any decay time below that idyllic number is a veritable “Russian nested doll” set of strawmen as it relates to purported AGW, and the people who imply otherwise are disingenuous snips.
The math here is no different from the pharmacokinetics of human therapeutics that have multi-phasic clearance. The administered dose is based primarily on the short half-life, to get the target therapeutic levels. The residual longer half-lives of lower drug concentrations, while noted and quantified, play a small part in the choice of dose (in life or death decisions sometimes).
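The pharmacokinetic analogy in the last paragraph is a biexponential (two-compartment) clearance; a sketch with invented half-lives shows how the fast phase dominates the dosing-relevant early concentrations while a small slow tail lingers:

```python
# Two-compartment (biexponential) clearance sketch. The weights and
# half-lives below are arbitrary illustrative values, not real drug data.
import math

def biexponential(t, a=0.8, t_half_fast=2.0, b=0.2, t_half_slow=50.0):
    """Fraction of dose remaining at time t: fast phase (weight a)
    plus slow phase (weight b), each decaying with its own half-life."""
    k_fast = math.log(2) / t_half_fast
    k_slow = math.log(2) / t_half_slow
    return a * math.exp(-k_fast * t) + b * math.exp(-k_slow * t)

# Early on, the fast half-life controls the level; much later, only the
# small slow-phase tail remains.
print(round(biexponential(2), 3))    # after one fast half-life -> 0.595
print(round(biexponential(100), 3))  # tail only -> 0.05
```

As with dosing decisions, the short half-life sets the scale of the early behavior even though the long half-life formally dominates at late times.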

Tom P
December 29, 2009 6:09 am

Smokey (18:47:26) :
“As always, the question comes down to who we should believe. In this case, is it Prof Freeman Dyson?”
Actually, as Dyson contradicts himself in a clarification of this article, the question is which of his statements should we take to be correct.
In the original article your bolded text is critical. Dyson writes:
“This means that the average lifetime of a molecule of carbon dioxide in the atmosphere, before it is captured by vegetation and afterward released, is about twelve years.”
He writes in response to Professor May who questions Dyson on this point:
“My residence time is the time that an average carbon dioxide molecule stays in the atmosphere before being absorbed by a plant. He is talking about residence with replacement. His residence time [of 100 years] is the average time that a carbon dioxide molecule and its replacements stay in the atmosphere when, as usually happens, a molecule that is absorbed is replaced by another molecule emitted from another plant.”
See: http://www.nybooks.com/articles/21882
But Dyson’s residence time as he originally stated was the “average lifetime of a molecule of carbon dioxide in the atmosphere, before it is captured by vegetation and afterward released” not just the capture time. He has contradicted himself.
In any event, the residence time Dyson is talking about is not the residence time of CO2 at present, but a hypothetical time “if we replaced all plants by carbon-eaters which do not reemit the carbon dioxide that they absorb.” Such plants do not yet exist.
The currently applicable 100-year residence time that “measures the rate at which the total amount would diminish if we stopped burning fossil fuels” is not disputed by Dyson.
You have completely misunderstood Dyson’s argument, if you read it at all in the first place.
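The two quantities being argued over here can be separated in a toy sketch: the time before an individual molecule is swapped out (fast gross exchange) versus the time for the total excess to decline (slow net removal). Both rates below are invented illustrations, not measured values:

```python
# Toy distinction: gross atmosphere<->biosphere exchange swaps molecules
# one-for-one and does not shrink the total; only a slow net "leak"
# (e.g. into the deep ocean) removes the excess. Rates are made up.
molecule_capture_rate = 1.0 / 12.0   # per year (fast, per-molecule exchange)
net_removal_rate = 1.0 / 100.0       # per year (slow, whole-excess removal)

molecule_residence_years = 1.0 / molecule_capture_rate   # ~12 years
excess_adjustment_years = 1.0 / net_removal_rate         # ~100 years
print(molecule_residence_years, excess_adjustment_years)
```

Both numbers can be true at once because they answer different questions, which is the crux of the Dyson/May exchange quoted above.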

December 29, 2009 6:41 am

Tom P,
Your conclusions are questionable: “Dyson is talking about is not the residence time of CO2 at present, but a hypothetical time ‘if we replaced all plants by carbon-eaters which do not reemit the carbon dioxide that they absorb.’ Such plants do not yet exist.” A 2,000-year-old redwood tree or bristlecone pine is a carbon-eater that for all practical purposes does not re-emit the CO2 it absorbs, and carbon-rich organisms that fall to the ocean floor are sequestered. And there are numerous similar examples.
Making a major issue out of Lord May’s quibble while glossing over Prof Dyson’s reply avoids what Dyson clearly stated:

Since we are discussing the effect of carbon-eating plants, my use of the short residence time without replacement is correct, and his use of the long residence time with replacement in that situation is wrong.

Not only is Lord May wrong, but he failed to explain how the UN/IPCC arrived at its politically motivated assumption that CO2 remains in the atmosphere for a century on average. The South Pacific above ground atomic bomb tests in the 1950’s showed conclusively using carbon isotopes that CO2 is re-absorbed by the oceans/biosphere in a relatively short time. That fast absorption rate answers the central question in the AGW debate. AGW alarmism fails without a century long CO2 persistence time, so the IPCC has no choice but to claim a false century long CO2 residence time.

Tom P
December 29, 2009 8:40 am

Smokey (06:41:38) :
Dyson clearly states in his article that new, genetically engineered plants would be required for additional sequestration:
“one quarter of the world’s forests were replanted with carbon-eating varieties of the same species…”
The current residence times are the result of all the existing sequestration that takes place – your “numerous examples” are irrelevant to any improvement here. As Dyson himself wrote:
“Natural plant communities fail by a large factor to sequestrate as much carbon as they absorb.”
You later say:
“Not only is Lord May wrong…”
Dyson does not dispute May’s figure, only the relevance of it in his context of a hypothetical genetically engineered solution. There is indeed no contradiction between Dyson’s figure of about 12 years for the biospheric uptake of CO2 and a much longer average lifetime for the effect of the introduction of a CO2 molecule from the burning of fossil fuels.

DirkH
December 29, 2009 9:20 am

“Tom P (08:40:12) :
[…]There is indeed no contradiction between Dyson’s figure of about 12 years for the biospheric uptake of CO2 and a much longer average lifetime for the effect of the introduction of a CO2 molecule from the burning of fossil fuels.”
I looked into this David Archer paper here
http://geosci.uchicago.edu/~archer/reprints/archer.2009.ann_rev_tail.pdf
wondering how they come to the conclusion of that near-eternal residence time of man-made emissions in the atmosphere. Their trick seems to be this (I’m using “trick” here in a colloquial way, as in “a smart way to do it”, you know, not as in… aeh… “tricking you”):
They assume a yearly uptake rate of 2 Pg/year by oceans and 2 Pg/year by terrestrial plants.
Now these are NET numbers. The gross carbon exchange between oceans and atmosphere is, like, way bigger. NASA should know, shouldn’t it:
http://nasascience.nasa.gov/earth-science/oceanography/ocean-earth-system/ocean-carbon-cycle
And given that even the realclimate goons talk about the stored heat in the ocean, we can deduce the ocean gets slightly warmer, expands, rises a little, and releases more CO2 – shouldn’t that be the case? And from there, Archer’s trickery (“trick” as in, you know, see above) starts taking a nosedive, in my opinion. But what do I know – I’m not a physicist; this guy is, and he can explain it better:
http://cdsweb.cern.ch/record/1181073/
Somewhere in the presentation he shows a differential of the Keeling curve and correlates it with something else that is not the CO2 emissions of humanity. And finds a way better match.

Tom P
December 29, 2009 10:30 am

DirkH (09:20:33) :
I’m afraid I can’t follow your explanation.
“…but what do i know, i’m not a physicist, this guy is and he can explain it better…”
Where does Kirby talk about CO2 lifetimes? You might like the video, but it’s irrelevant to this discussion.

Joel Shore
December 29, 2009 10:51 am

RACookPE1978 says:

Carson = 100 million dead worldwide. Her facts were wrong by the way. The EPA ruling against DDT (in the US) was a political decision, by a politician who subsequently provided for, was promoted by, and was funded by the environmental groups he ruled in favor of. Against the prevaling science and the scientific data – then as now, the enviro groupthink does not permit facts to get in the way of their agenda.
Mao = 63 Chinese killed
Stalin = 23 million killed

I am surprised that you have so uncritically accepted such lies and nonsense. Perhaps because they reinforce what you want to believe? The actual facts are these:
(1) DDT was never banned worldwide. The U.S. EPA banned it in the U.S. where malaria was eradicated already.
(2) The campaign against malaria failed for a variety of reasons. One of the major reasons was the development of resistance by mosquitoes to DDT and other insecticides. For example, the explosion of malaria deaths in India in the 1970s occurred at a time when DDT use was high and continuing to increase. Such resistance occurs to the greatest extent when DDT is used extensively outdoors as in agriculture, and in fact, this is precisely one of the things that Rachel Carson warned about in “Silent Spring”. If her warnings had been better heeded, perhaps many lives could have been saved.
(3) There is general agreement that the use of DDT for indoor spraying in areas where there is not resistance is sometimes advisable, although there is disagreement on exactly when and where the benefits (and, specifically, the benefits relative to other options) outweigh the risks.
(4) The fact that outdoor spraying of DDT had very detrimental effects on some wildlife is not disputed even by the organization Malaria Foundation International, which campaigned (successfully) against a “ban” (actually phase-out) of DDT in the International Treaty on Persistent Organic Pollutants (POP’s) negotiated in 2000. They say, “It cannot be seriously disputed that DDT has devastated some wildlife populations, such as birds of prey.” ( http://www.malaria.org/DDT_open.html ) They also do not dispute that DDT has health risks to humans, saying “There is no doubt that there are health risks associated with DDT use”, although they argue that the benefits of its use against malaria where it is effective outweigh those risks. They also argue that the restriction on DDT in the POP’s treaty to use only against disease “is arguably better than the status quo going into the negotiations over two years ago. For the first time, there is now an insecticide which is restricted to vector control only, meaning that the selection of resistant mosquitoes will be slower than before.” ( http://www.malaria.org/DDTpage.html )
(5) The whole DDT saga serves as a telling example of how there is a strong anti-environmental movement that is willing to lie and rewrite history in the service of their ideology and that there are plenty of people who will uncritically accept their lies. A good dissection of the lies is here: http://info-pollution.com/ddtban.htm and http://scienceblogs.com/deltoid/ddt/

Editor
December 29, 2009 11:17 am

CO2 residence time…
Essenhigh (2009): 12CO2 ~5 years, 14CO2 ~16 years.
Segalstad (1992): ~5 years.
Murray (1992): ~5 years.
Stumm & Morgan (1970): ~5 years.
IPCC (Houghton et al., 1990): 50 to 200 years.
Source: Correct Timing is Everything – Also for CO2 in the Air by Tom Segalstad

To calculate the RT of the bulk atmospheric CO2 molecule 12CO2, Essenhigh (2009) uses the IPCC data of 1990 with a total mass of carbon of 750 gigatons in the atmospheric CO2 and a natural input/output exchange rate of 150 gigatons of carbon per year (Houghton et al., 1990). The characteristic decay time (denoted by the Greek letter tau) is simply the former value divided by the latter value: 750 / 150 = 5 years. This is a similar value to the ~5 years found from 13C/12C carbon isotope mass balance calculations of measured atmospheric CO2 13C/12C carbon isotope data by Segalstad (1992); the ~5 years obtained from CO2 solubility data by Murray (1992); and the ~5 years derived from CO2 chemical kinetic data by Stumm & Morgan (1970).
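The residence-time arithmetic in this paragraph is simply stock divided by throughput; spelled out below (the figures are the Houghton et al., 1990 values quoted above):

```python
# Turnover (residence) time tau = atmospheric carbon stock / exchange flux.
stock_gt_c = 750.0            # Gt C in atmospheric CO2 (IPCC 1990 figure above)
exchange_gt_c_per_yr = 150.0  # Gt C/yr natural input/output exchange rate
tau_years = stock_gt_c / exchange_gt_c_per_yr
print(tau_years)  # 5.0
```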

As is so often the case, the IPCC’s preferred number sticks out like a sore thumb and must be correct because the IPCC said so.

Joel Shore
December 29, 2009 1:27 pm

David Middleton:
There seems to be a propensity on this site to repeat nonsense long after it has been corrected. As both I and Tom P have explained, the number you quote from the IPCC is NOT a residence time. It is a rough estimate of the decay time for the CO2 concentration given an excess amount of CO2 released into the atmosphere, which is ultimately determined not by exchange between the atmosphere and the upper ocean but by slower processes like exchange with the deep ocean. (And it is really only a rough estimate, because the decay is actually not exponential; hence, a single decay time will overestimate the CO2 concentration at short times and will underestimate the CO2 concentration at long times, which is why Solomon et al. …)

Editor
December 29, 2009 3:00 pm

Shore (13:27:03) :
I believe that the IPCC number is a residence time. CDIAC ( http://www.ecd.bnl.gov/steve/CDIAC_94/CDIAC.html ) also says that CO2 has a residence time measured in decades to centuries…

A major difference between CO2 and sulfate aerosols is the residence time of the materials in the atmosphere: decades to centuries for CO2, days to weeks for sulfates. Another key parameter is the sulfur content in the fuel. The greater the sulfur content, the greater the climatic effect.

The atmosphere contains ~388 ppmv CO2 (~820 Gt C).
Total annual carbon → atmosphere sources:

Terrestrial vegetation 60 Gt C
Soils & detritus 60 Gt C
Surface ocean 90 Gt C
Anthropogenic 8 Gt C
Total Sources 218 Gt C/yr

820/218 = 3.76
If CO2 had a “decades to centuries” residence time in the atmosphere, CO2 levels would be rising a heck of a lot faster than they have been over the last 150 years.
The Earth started to warm up out of the Little Ice Age 260 years before CO2 levels started to rise…
Moberg & CO2
The really funny thing is that before 1960, CO2 levels were rising at a faster rate than anthropogenic emissions were rising…
Emissions vs Atmos. Conc.
The warming started before the CO2 started to rise.
The CO2 started to rise before anthropogenic emissions started to rise.
That’s because the atmospheric CO2 has been rising mostly because the Earth has been slowly warming since the 1600’s.
Atmospheric CO2 would be somewhere between 330 and 370 ppmv (or higher) right now if man had never discovered fire.
The plant stomatal data clearly show that CO2 levels have routinely risen to >360 ppmv in response to warming episodes over the last 10,000 years.

P Wilson
December 29, 2009 4:40 pm

There seems to be some confusion over CO2 residence time: the 100-year figure refers to atmospheric half-life, derived from mathematical rates, and doesn’t necessarily apply to the nature of climate as a constant, since the CO2 molecules constantly exchange. It is not to be confused with aerial residence, which is diurnal-annual-decadal-centennial. However, the anthropogenic CO2 airborne percentage hasn’t changed since 1850.

P Wilson
December 29, 2009 4:44 pm

Joel Shore (13:27:03)
“There seems to be a propensity on this site to repeat nonsense long after it has been corrected. ”
Alas not always an easy task

P Wilson
December 29, 2009 4:54 pm

philincalifornia (20:11:00) :
the IPCC makes the simple error of assuming 100 years is an absolute, and fails to distinguish between atmospheric half-life and aerial residence, hence the problem of the “missing sink”, though that was invalidated by Segalstad in 1992 and 1998. Essenhigh (2009) gave us a clearer equation, explaining that CO2 is part of a non-static exchange in which 1/5th of CO2 is exchanged between various pools, depending on its temperature dependence and relatively quick equilibrium.
What are we talking about CO2 for anyway? It has F*** all to do with the climate.

P Wilson
December 29, 2009 5:11 pm

Tom P (06:09:54) :
Quite honestly, no-one knows how long a CO2 molecule stays in the air. It could be anywhere from a day or less to 7 years. The rest is just speculation. Not sure where this talk of decay comes from – surely we mean rate of increase? There are 38,000 Gt of CO2 in the oceans, after all, although oceans regulate aerial CO2 content.

Joel Shore
December 29, 2009 5:47 pm

David Middleton says:

I believe that the IPCC number is a residence time. CDIAC (http://www.ecd.bnl.gov/steve/CDIAC_94/CDIAC.html) also says that CO2 has a residence time measured in decades to centuries…

The terms are sometimes used a little carelessly. Of the major gases that this is computed for, I think it is only for CO2 that this distinction is important because most of the other gases are removed from the atmosphere by chemical reactions (or deposition). CO2 is fairly unique in having large exchanges between the atmosphere and other reservoirs.

If CO2 had a “decades to centuries” residence time in the atmosphere, CO2 levels would be rising a heck of a lot faster than they have been over the last 150 years.

And if a perturbation of CO2 really had a half-life of only a few years, CO2 levels would be a heck of a lot lower. That is why you have to understand the full nature of the carbon cycle, which is not characterized by a single decay time.

The really funny thing is that before 1960, CO2 levels were rising at a faster rate than anthropogenic emissions were rising…
Emissions vs Atmos. Conc.

Land use factors need to be included in cumulative emissions too. Besides which, that result is very sensitive to exactly where you start the cumulative emissions curve. Since good modern measurements have been available, the rise in the atmospheric levels as a fraction of emissions has been quite constant with around 40-50% of the excess CO2 remaining in the atmosphere.

That’s because the atmospheric CO2 has been rising mostly because the Earth has been slowly warming since the 1600’s.
Atmospheric CO2 would be somewhere between 330 and 370 ppmv (or higher) right now if man had never discovered fire.
The plant stomatal data clearly show that CO2 levels have routinely risen to >360 ppmv in response to warming episodes over the last 10,000 years.

Completely ludicrous. You selectively believe data that tells you what you want to believe and ignore the wealth of data and theory that contradict it.

December 29, 2009 6:20 pm

P Wilson (16:54:28) :
philincalifornia (20:11:00) :
the IPCC makes the simple error of assuming 100 years is an absolute, etc. etc.
Yes, I have noticed this disgusting semantic argument that has resulted in this strange set of warmist comments. This has nothing to do with science; rather, it is related to the mechanism used to pervert the scientific literature in order to make fake headlines that trick people of voting age into believing their s#!t.
The argument (one of them) is that a carbon atom from burned oil or coal that is sequestered into a xylan or cellulose molecule in a big fat log and sits on, or under the ground for ten to a hundred years is an anthropogenic carbon atom, so that represents the [insert believable-to-dupe phrase] lifetime of an anthropogenic carbon atom in the atmosphere (as nasty carbon dioxide, of course).
So why, when I note that the coal and oil carbon atoms were already in the atmosphere once, and ask why they don’t quote a lifetime/residence time/half-life of tens of millions of years, or even billions of years, do I get no answer?
Here’s the answer – 100 years is the maximum number that the sheeple will believe. 1000 years has been tried and accepted in the “peer”-reviewed literature, but nobody in the real world is that fricking stupid (well they are actually, but is it a voting majority ??).
I’ve already proposed the Shore Uncertainty Principle, which is that if some of the things that Joel argues for were true (e.g. 100- 1000 year “residence time” of CO2, in the way scientists think of this, in the atmosphere), he would not be here to post about it.
See also David Middleton (15:00:57) :

December 29, 2009 7:08 pm

I nominate this statement for Winner of the Psychological Projection Comment of the Decade:

You selectively believe data that tells you what you want to believe and ignore the wealth of data and theory that contradict it.

The endless hairsplitting arguments by the alarmist crowd over what the residence time and persistence of CO2 means is simply an attempt to wiggle out of the fact that with short CO2 persistence times, the entire CO2=CAGW conjecture implodes.
But there isn’t much wiggle room. The above ground atomic bomb tests in the South Pacific in the 1950’s created new carbon isotopes that were easily identifiable. This allowed tracking of CO2, from the detonation until the CO2 was re-absorbed by the ocean and biosphere. The CO2 persistence was found to be on the order of ≈10 years.
A century-scale CO2 persistence would allow for a high climate sensitivity number, meaning that a fast rise in CO2 would result in a fast rise in global temperature. In fact, that is the IPCC’s computer modeled conclusion. Unfortunately for the IPCC’s politicians, the planet disagrees.
But a short, decade-scale CO2 persistence requires a much lower climate sensitivity number. That means that the effect of CO2 on the climate can be disregarded for all practical purposes; its tiny to non-existent effect is inconsequential, and can safely be ignored.
The IPCC’s political appointees have no choice in the matter: they must claim a century-scale CO2 persistence. With a short CO2 residence time, they have no reasonable argument why they should not be disbanded. So against all the empirical evidence, the IPCC continues to falsely claim that the average CO2 molecule persists in the atmosphere for a century or more. Conclusion: sucks to be them.

December 29, 2009 7:36 pm


Smokey (19:08:31) :

A century-scale CO2 persistence would allow for a high climate sensitivity number,

WHEN we see seasonal variations (in each hemisphere) of measurable CO2 percentage changes (due to uptake by plant biota)? And the lifetime is ‘quoted’ as a century?
Horse hockey! Not all of it, CERTAINLY!

P Wilson
December 29, 2009 8:58 pm

Joel Shore (17:47:28)
“The terms are sometimes used a little carelessly”
that ranks as the understatement of the year. Incompetence is the correct word. However, given that no-one is measuring CO2 exchanges, or rates of exchange, which is well-nigh impossible, it is surprising, given that oceans have been very warm over the 20th-21st centuries, that there isn’t well over 400 ppm of CO2. It proves that nature, specifically the oceans, is very adept at maintaining aerial CO2 at around 0.04%. The carbon cycle – as a complete cycle – takes several hundred years at a time.
Phil in California: I’m baffled how the concept of decay came into the CO2 ratio. Anyway, even taking anthro CO2 as a given, what is often not understood is that if 50% of it stays in the air (from a day to several years), it is less than 1.5% of total CO2. As the ratio of anthro CO2 to natural CO2 has been measurably the same since 1850, a prior year’s CO2 would dissolve against the ratio of oceanic natural emission – yet nature always does the balancing trick. Atmospheric scientists don’t know what happens to CO2, or why, with any certainty. To say it is removed from the atmosphere is a careless comment – it fluctuates diurnally and seasonally at the very least, particularly during winter, when it is slightly elevated as nature puts billions of tons into the atmosphere through decay.
Quite what any of this has to do with climate cooling and warming (largely regional and irregular) remains a stubborn mystery.

P Wilson
December 29, 2009 9:10 pm

kdkd (21:33:36) :
savethesharks:
“The climate models have shown for a long time that early greenhouse gas forced global warming will result in amplified warming in the arctic. It’s very powerful evidence that we have a serious problem, and the repeated challenges by sceptics have been unable to undermine this evidence.”
So CO2 during the ’30s must have been well into the 500 ppm mark – a Russian icebreaker was found floating in free waters 300 miles from the North Pole, and scientists then were worried about global warming. However, since it has only been measured since 1979, nothing prior to this can be vouched for with certainty – even during the Holocene Optimum or the Medieval Warm Period – from which we can infer that it was very probably warmer in the Arctic then than in the period from 1979 to the present, given the nature of Greenland ice cores as proxies. 1875 marked the coldest/lowest period of the entire Holocene, so it is an inappropriate starting point in any case.

Tom P
December 30, 2009 2:15 am

Smokey (19:08:31) :
“The above ground atomic bomb tests in the South Pacific in the 1950’s created new carbon isotopes that were easily identifiable. This allowed tracking of CO2, from the detonation until the CO2 was re-absorbed by the ocean and biosphere. The CO2 persistence was found to be on the order of ≈10 years.”
… and measurements of the increase in the same isotopes in the ocean have been used to determine that the time taken for absorption and sequestration in the marine environment is many decades and more.
All the available data have been used to validate the analysis that CO2 has a range of sequestration times, from a decade or so to many centuries and more, depending on all the available processes. Your stubborn insistence on concentrating on just one value is like the sunbather who thinks that because his sunburn improves after a day or so, he can safely ignore any long-term skin damage.

P Wilson
December 30, 2009 4:24 am

Tom P
“All the available data has been used to validate the analysis that CO2 has a range of sequestration times ranging from a decade or so to many centuries and more depending on all the available processes”
More correctly, it’s circulation, like money circulation. The chances are that if you spend a $1 coin in your local community, the same coin might just come back to you as change somewhere else – but that doesn’t mean it was sitting in the bank accumulating the whole time it was absent. It could have gone via the hotel, the cinema, the department store, the bank and so forth until it ended up in your pocket again, which could have been days or years later. The greatest reservoir is the bank, of course – though in CO2 terms, the oceans contain up to 40,000 Gt, and oceans are capable of absorbing any amount of CO2 – and this process depends on ocean temperatures.
So it’s uncertain where the long-term “skin damage” comes from. If it’s the radiative effect of CO2, then it is so much factor-3 sunblock in the Sahara, and a 2% increase of this factor-3 lotion won’t do anything to stop sunburn. More of the same sunblock doesn’t increase its effect – you’d need a higher-factor lotion. (Actually, climatologists know more CO2 doesn’t mean more heat, so they contrive fake equations to make it look like it does.)

Tom P
December 30, 2009 5:08 am

P Wilson (04:24:03) :
“…oceans are capable of absorbing any amount of CO2 – and this process depends on ocean temperatures.”
The ability of the oceans to absorb carbon dioxide is far from infinite – and decreases with temperature.
Your post is certainly on a par with Luboš Motl’s head article in terms of its scientific worth. Maybe you and he could work up something together here to brighten these dark months?

P Wilson
December 30, 2009 6:55 am

That’s what I said: oceans can absorb any amount of CO2, but it depends on temperature. Despite very warm oceans over the 20th-21st centuries, the surprise is that there isn’t even more CO2 in the atmosphere.

December 30, 2009 8:39 am

Joel Shore (17:47:28) :

The terms are sometimes used a little carelessly. Of the major gases that this is computed for, I think it is only for CO2 that this distinction is important because most of the other gases are removed from the atmosphere by chemical reactions (or deposition). CO2 is fairly unique in having large exchanges between the atmosphere and other reservoirs.

I agree. The terms are used carelessly. The RT of CO2 has almost nothing to do with its chemical properties. The RT is not really like a half-life; it’s dictated by the ratio of the sourcing-to-atmosphere rate to the sinking-from-atmosphere rate. It can fairly easily be estimated by dividing the Gt C in the atmosphere by the total annual Gt C emissions to the atmosphere. If the source rate and sink rate were equal, there would be no CO2 in the atmosphere… The RT would be zero.

And, a perturbation of CO2 really had a halflife of only a few years, CO2 levels would be a heck of a lot lower. That is why you have to understand the full nature of the carbon cycle, which is not characterized by a single decay time.

The RT will be equal to the Gt C Atmosphere / Gt C Source Rate. If CO2 had an RT of 20 years, there would be 20 years’ worth of CO2 emissions in the atmosphere (~2160 ppmv). The RT is a function of source rate / sink rate.
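A rough check of this arithmetic, as a sketch only: the Gt-C-per-ppmv conversion below is an assumption derived from the 820 Gt C ≈ 388 ppmv figures quoted earlier in the thread, and the exact ppmv result shifts with the assumed totals.

```python
# If the RT were 20 years with sources at ~218 Gt C/yr, the implied airborne
# mass is RT * source rate. Converting with ~2.11 Gt C per ppmv (assumed, from
# the 820 Gt C ~ 388 ppmv figures quoted upthread) lands in the same ballpark
# as the ~2160 ppmv quoted above.
gt_c_per_ppmv = 820.0 / 388.0      # ≈ 2.11 Gt C per ppmv (assumed conversion)
source_rate = 218.0                # Gt C/yr, total sources quoted upthread
rt_years = 20.0

implied_mass = rt_years * source_rate        # 4360 Gt C in the atmosphere
implied_ppmv = implied_mass / gt_c_per_ppmv  # roughly 2000-2100 ppmv
print(round(implied_ppmv))
```

Small differences in the assumed reservoir and source totals account for the spread between this estimate and the ~2160 ppmv figure in the comment.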

Land use factors need to be included in cumulative emissions too. Besides which, that result is very sensitive to exactly where you start the cumulative emissions curve. Since good modern measurements have been available, the rise in the atmospheric levels as a fraction of emissions has been quite constant with around 40-50% of the excess CO2 remaining in the atmosphere.

I am using the land-use factor of about 1.5 Gt C/yr and fossil fuel factor of about 6.5 Gt C/yr.
I am using the cumulative emissions that CDIAC (ORNL) publishes. CDIAC’s emissions record starts in 1750. I converted the annual emissions in Gt to ppmv and used 277 ppmv (Law Dome) as a starting point for anthropogenic emissions.
When I back 45% of the anthropogenic emissions off the annual number, I wind up with about 370 ppmv for a modern value.
If I back all of the anthropogenic emissions off, I wind up with about 338 ppmv and I get a CO2 curve that mimics Moberg’s temperature reconstruction.

Completely ludicrous. You selectively believe data that tells you what you want to believe and ignore the wealth of data and theory that contradict it.

Actually, the only contradictory data are from the ice cores. The plant stomata data can be empirically tested and directly calibrated to modern instrumental CO2 records. The ice core data cannot be empirically tested; nor can they be calibrated to modern instrumental data yet. Furthermore, the AIRS data pretty clearly show that low- to mid-latitude atmospheric CO2 is 20 to 30 ppmv higher than over Antarctica.
The ice core CO2 data are great because they do give us a very long time series. But they do not accurately image decadal and century scale changes in CO2. They are like a long-period moving average.
The Pleistocene ice cores show that CO2 lagged behind temperature changes.
Kouwenberg’s SI CO2 reconstruction shows two CO2 maxima: ~390 ppmv @ 400 AD and ~320 ppmv @ 1300 AD. Those maxima occur ~150 and ~350 years after warming peaks in the GISP2 ice core temperature reconstruction (Alley, 2004).
The instrumental and ice core data show that CO2 lagged behind the warm-up from the Little Ice Age.
Post hoc ergo propter hoc is a logical fallacy. “A” did not necessarily cause “B” just because “B” followed “A”… But… if CO2 consistently follows behind temperature changes… it’s kind of impossible for CO2 to be causing those temperature changes.

Tom P
December 30, 2009 9:59 am

Dave Middleton (08:39:51) :
Maybe you should join Wilson and Motl in this proposed article. Your science is at a comparable level.
The residence time is not dependent on the ratio of the source to sink rates, but the ratio of the mass in the system to the sum of the source and sink terms in mass/time. Hence your assertion:
“If the source rate and sink rate were equal, there would be no CO2 in the atmosphere… The RT would be zero”
is nonsense, as a moment’s consideration of the time a water molecule spends in a pipe between your hot-water tank and bath should demonstrate.
Your last paragraph does severe violence to any form of logic. You might assert that because a certain human-induced causality has not been seen in previous natural responses, it’s “kind of impossible”.
But on the same basis you would have to claim that humans are incapable of burning down forests because such causality had indeed not been seen until humans discovered fire.

P Wilson
December 30, 2009 10:28 am

Tom P
Since the anthropogenic atmospheric proportion remains constant, it is no more than 1.5% (a constant is a constant), if annual emissions are 3% and 50% of this dissolves in reservoirs. The other 50% doesn’t just stay in the atmosphere indefinitely – no-one knows what happens to it, and for all we know it could be 0.5% at this moment. No one is measuring it, so this atmospheric half-life notion – the time required to eliminate CO2 down to a theoretically unknown CO2 level (how’s that for a mathematical paradox) – is an arbitrary idea. Nature doesn’t follow these equations, although climatologists do – maybe because they formulate them, or borrow them from physics and mathematics, then quickly fall into fallacy on that basis, quite aside from the fallacy of using isotope tags to determine a ratio. That is like taking a drop of water from the ocean to determine how much ocean there is, down to the last litre. All we really know is that nature gives an equilibrium of around 0.04% atmospheric CO2.
It might be apt to put the horse before the cart before jeering at the passenger’s logic; otherwise one has random extrapolation running riot over reason.
What does this have to do with climate cooling and/or climate warming by the way?

Tom P
December 30, 2009 10:50 am

P Wilson (10:28:09):
Your reasoning came crashing down in the first sentence. Of course the “anthropogenic atmospheric proportion” is not a constant – it was much lower before industrialisation.
“It might be apt to put the horse before the cart before jeering at the passenger’s logic, otherwise one has random extrapolation running riot over reason.”
This paints quite a picture! – and maybe gives some insight into your thought processes.

Editor
December 30, 2009 11:11 am

Tom P (09:59:08) :
My “bath” and the “hot water pipe” aren’t continuously exchanging water molecules… It’s a one-way trip for the water molecule from the pipe to the bath and then out the drain.
CO2 is continuously being cycled between the biosphere, atmosphere and oceans. Its residence time in the atmosphere is entirely dependent on the ratio of source rate to sink rate. The atmosphere didn’t just come pre-equipped with a standard CO2 level. There is some CO2 in the atmosphere because the source rate has generally exceeded the sink rate by a small amount over the last 600 million years or so. Atmospheric CO2 has fluctuated between ~180 ppmv and ~5000 ppmv during the Phanerozoic Eon because the ratio of source rate to sink rate has varied.
The RT will always be a function of the Gt C (Atmosphere) / Gt C (Source Rate). If there are 800 Gt C in the atmosphere and the source rate is 200 Gt C/ yr… The RT is 4 years. If the C sources suddenly shut off, the 800 Gt C in the atmosphere would be consumed by the sinks in a bit over 4 years.
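A small sketch of the drain-down described above, using the round figures from that paragraph. One caveat worth flagging: if the sinks kept running at the initial 200 Gt C/yr, the reservoir would empty linearly in exactly 4 years; if removal were instead first-order (proportional to the remaining mass, the picture usually behind talk of "decay"), 4 years would be the e-folding time, with ~37% still airborne at that point.

```python
import math

# Two drawdown pictures for 800 Gt C with sinks initially at 200 Gt C/yr.
mass0, sink0 = 800.0, 200.0
tau = mass0 / sink0                              # turnover time: 4.0 years

linear_empty_time = tau                          # constant-rate sinks drain it in tau years
remaining_first_order = mass0 * math.exp(-1.0)   # first-order decay: ~294 Gt C left at t = tau
print(tau, round(remaining_first_order))
```

The two pictures agree on the turnover time but differ on how long a tail the drawdown has, which is essentially the point being argued back and forth in this thread.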
You are correct. I did make a mistake when I said, “If the source rate and sink rate were equal, there would be no CO2 in the atmosphere… The RT would be zero.” The RT could not become zero unless the sink rate exceeded the source rate for a period of time. If the source rate and sink rate were equal, the RT would stabilize at Gt C (Atmosphere) / Gt C (Source Rate).
Your analysis of my logic is just about as bass-ackwards as it could be. If “B” happens after “A;” “A” might have caused “B.” Post hoc ergo propter hoc dictates that it is a logical fallacy to assume that “A” caused “B”.
The lag time of CO2 behind temperature in the ice cores and plant SI data does not mean that the temperature changes did cause the CO2 changes. The lag time simply opens up the possibility that temperature changes drive CO2 changes. The lag time does make it rather difficult, if not impossible, for the CO2 changes in the Pleistocene ice cores and Holocene SI data to have driven the temperature changes during those time periods.
Your forest fire example is simply a false analogy. No forest has ever burned down prior to a fire starting. Forests do not cause fires to start, irrespective of an anthropogenic cause of the fire itself.
In the Pleistocene, ice core data indicate that CO2 levels appear to have risen and fallen along with temperature changes of glacial-interglacial cycles. The CO2 changes lagged behind the temperature changes. Mankind did not participate in those changes of CO2 concentrations. In the Holocene, plant SI data show that CO2 also rose and fell in a manner that lagged temperature changes. Mankind’s role in the Holocene CO2 changes was also pretty well nonexistent.
About 400 years ago, the Earth started to warm up from the Little Ice Age. About 150 years ago, CO2 levels started to rise significantly. Mankind has had a role in the current CO2 rise. Our annual contribution of CO2 has risen from 0% to about 4% of the total source of CO2. But the warming still started more than 250 years before the CO2 started to rise.
As far as the personal insults go… What is it about you guys? Is there an insult course over at RC? I think I’ve been extremely civil to all but one warmist on this blog. And I only got snarky with him after about four days of steady insults from him.
It seems that for most of you folks, the play book consists of two logical fallacies and then a tirade of insults…
1) Appeal to authority.
2) Argument ad hominem.
3) Vile personal insults.
Apart from Joel Shore, Ferdinand Engelbeen and Leif Svalgaard (although he’s not really a warmist), I’m hard-pressed to name any warmists who actually argue the science in a civil and thoughtful manner.

December 30, 2009 11:12 am

Tom P (02:15:55) :
… and measurements of the increase in the same isotopes in the ocean have been used to determine the time taken for adsorption and sequestration in the marine environment of many decades and more.
———————
… and while there, it causes global warming …. how ??
Or were you switching “catastrophes” ??

Tom P
December 30, 2009 2:52 pm

David Middleton (11:11:12) :
I said your statement about residence times was nonsense, as you now agree. There’s no need to take this as a “vile personal insult”.
As for the analogy of the forest fire, it has more relevance than you admit. In the past it was almost certainly Milankovitch cycles that initiated the increase in global temperatures, while for forest fires it was generally lightning, both natural causes. The causality is not now reversed, as you suggest, but rather human activity is now an additional option in both cases, whether by increased greenhouse-gas emissions or careless campfires.
It is illogical to reject a human cause for a broad range of phenomena on the basis that such a causality was not exhibited when humans were not present.
As for the current relationship between CO2 and temperature: if the causality is as you suggest, why would smoothly increasing annual CO2 concentrations be the causal result of temperatures that have not been rising steadily year on year? Or to put it another way, as temperatures have indeed stalled for the last eight years, why have CO2 concentrations continued their steady climb?

Don Hamlin
December 30, 2009 3:47 pm

What creates more CO2: the constant decay of plant and animal matter, or the burning of it?
And in the time period we’re discussing, man has been preventing more forest fires than he has been creating, so the careless campfire analogy is a little off base.

P Wilson
December 30, 2009 4:43 pm

Tom P (10:50:29) :
Perhaps you’re missing the point. The anthropogenic aerial fraction of CO2 has not increased to 30% since 1850 but has remained constant, so today it is 3-4% halved, since 50% remains airborne whilst 50% doesn’t.
http://wattsupwiththat.com/2009/11/10/bombshell-from-bristol-is-the-airborne-fraction-of-anthropogenic-co2-emissions-increasing-study-says-no/
In other words, the increase of CO2 since 1850 has been almost entirely natural, and the anthropogenic part is some 12 ppm of the total, as we’re at a multicentennial natural high point of CO2.

P Wilson
December 30, 2009 4:57 pm

Tom P
“It is illogical to reject a human cause for a broad range of phenomena on the basis that such a causality was not exhibited when humans were not present.”
Land use changes, deforestation, cities and urban areas all have their effect on local climates and temperatures. They’re not statistically significant enough globally, however, and none of these factors are drivers of global climate change in one direction or the other.
In reply to the question of increasing CO2 whilst temperatures level off or decrease: the warming of oceans at the equator leads to higher CO2, as oceans expel more vapour, heat and CO2 – vegetation adds its source, and some is absorbed elsewhere back into oceans and other reservoirs, so that the aerial level of CO2 stays at around 0.04%. The lag between temperature and CO2 is reputed to be around 800-1500 years – which might be due to the fact that a complete ocean cycle is 800 years, or that the MWP was between 800-1000 years ago. It’s common, in those ancient ice lollies, for temperature to fall whilst CO2 rises for a while, and vice versa. What is clear is that temperature leads the way whilst CO2 follows the temperature trend later.

Tom P
December 30, 2009 6:06 pm

P Wilson (16:43:12) :
“In other words, the increase of CO2 since 1850 has been almost entirely natural, and the anthropogenic part is some 12 ppm of the total, as we’re at a multicentennial natural high point of CO2.”
You’ve completely misread the Knorr paper that this article discusses:
http://wattsupwiththat.files.wordpress.com/2009/11/knorr2009_co2_sequestration.pdf
The annual increase in CO2 in the atmosphere is about half the annual rate of emissions. The proportionality between the two is excellent evidence that the increase of CO2 can be entirely attributed to anthropogenic emissions – that’s an increase from 280 ppm to nearly 400 ppm CO2 in the atmosphere.
Your subsequent notion that this more-than-a-quarter increase over a few decades is due to a delayed reaction to the MWP a thousand years ago is fanciful, to say the least, and ignores the very correlation between emissions and atmospheric CO2 that you first pointed out.
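The proportionality being discussed here can be sketched as simple bookkeeping. This is illustrative only: the 0.45 airborne fraction and the 2.13 Gt C per ppmv conversion are assumed round values, not figures from the Knorr paper.

```python
# Annual atmospheric CO2 rise as a fixed airborne fraction of annual emissions.
GT_C_PER_PPMV = 2.13  # assumed conversion factor, Gt C per ppmv of CO2

def annual_rise_ppmv(emissions_gt_c: float, airborne_fraction: float = 0.45) -> float:
    """ppmv added to the atmosphere by one year's emissions."""
    return airborne_fraction * emissions_gt_c / GT_C_PER_PPMV

# For ~8 Gt C/yr of emissions (the anthropogenic figure quoted earlier in the thread):
print(round(annual_rise_ppmv(8.0), 2))  # ≈ 1.69 ppmv/yr
```

A roughly constant airborne fraction is what makes the atmospheric rise track the emissions curve so closely, which is the proportionality both sides of this exchange are arguing over.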

P Wilson
December 30, 2009 6:42 pm

The annual rate is 4%, so 2% stays airborne, and a prior year’s CO2 might be absorbed or taken up by reservoirs – over 5 years, certainly so. So it’s safe to assume that if there is no significant increase in the airborne fraction, then anthropogenic CO2 doesn’t accumulate whilst natural CO2 does. It quite clearly means that 45% (some 1.8% of all CO2) doesn’t accumulate over the years, but that most of the previous year’s aerial anthropogenic CO2 circulates between oceans, air and land, such that there is never more than 3-4% anthropogenic CO2 in the atmosphere. It’s reasonable to assume that oceans have emitted a lot of CO2 since the Pacific climate shift of the late ’70s, and if the ice cores that show a lag of 800-2000 years are a reliable trend, then the same process is happening today. The important question then is: what happened 800-2000 years ago to give this multicentennial increase in CO2 percentages?

P Wilson
December 30, 2009 6:49 pm

Certainly there has been a large output of CO2 since 1850, though proxies from ice cores are not accurate – they show trends but not exact data, and only vague data from ground level that was absorbed in ice sheets. What the actual aerial CO2 was in France, the USA or China, indeed Antarctica, 200-100,000 years ago is anyone’s guess. Why is it not possible for ice to absorb what shows up years later as 180 ppm, whilst aerial CO2 up to 100 metres aloft was 350 ppm?

Tom P
December 30, 2009 7:09 pm

P Wilson (18:42:04) :
“… it’s safe to assume that if there is no significant increase in the airborne fraction, then anthropogenic CO2 doesn’t accumulate whilst natural CO2 does.”
I think you’ve lost everyone now. How does a CO2 molecule know whether it’s anthropogenic or natural? Do they come with little labels?

P Wilson
December 31, 2009 1:46 am

Tom P (19:09:55)
According to present propaganda they certainly do come with labels. The 97% natural CO2 is quite benign, whilst the 3% anthropogenic has all the force of doom that makes the vast difference between cataclysm and perfection, such is its potency.

Editor
December 31, 2009 7:03 am

Tom P (14:52:56) :
David Middleton (11:11:12) :
I said your statement about residence times was nonsense, as you now agree. There’s no need to take this as a “vile personal insult”.

I don’t agree that my statement was nonsense. I did agree that it was inelegantly worded and over-simplified. But… Fair enough.

As for the analogy of the forest fire, it has more relevance than you admit. In the past it was almost certainly Milankovitch cycles that initiated the increase in global temperatures, while for forest fires it was generally lightning, both natural causes. The causality is not now reversed, as you suggest, but rather human activity is now an additional option in both cases, whether by increased green-house-gas emissions or careless campfires.

The falseness of your analogy is this simple:
The delta-CO2 still lags behind the delta-T.
The source of ignition always precedes the fire.

It is illogical to reject a human cause for a broad range of phenomena on the basis that such a causality was not exhibited when humans were not present.

It is impossible for “B” to cause “A” if “A” always occurs first.

As for the current relationship between CO2 and temperature: if the causality is as you suggest, why would smoothly increasing annual CO2 concentrations be the causal result of temperatures that have not been rising steadily year on year? Or to put it another way, as temperatures have indeed stalled for the last eight years, why have CO2 concentrations continued their steady climb?

The only “smoothly increasing” CO2 curve is in the instrumental record (e.g. Mauna Loa Observatory), which only dates back to 1960. It’s quite possible, maybe even probable, that a significant part of the increase in the MLO data is anthropogenic. Mankind’s CO2 emissions
The ice core CO2 data represent a moving average of CO2 with a century-scale period. So… At decadal scales, the curve should be very smooth.
The CO2 concentrations derived from plant stomatal index (SI) data (e.g. Jay Bath) have nearly annual resolution extending back almost 2,000 years. These curves are anything but smooth (as one would expect in a stochastic system). And there is a very consistent pattern of the temperature peaks and troughs preceding the CO2 peaks and troughs by anywhere from 100 to 400 years. Now this doesn’t prove that temperature changes have been driving CO2 changes… But the correlation certainly forms the basis of a hypothesis worthy of investigation. The CO2 rise since the mid-1800’s could largely be the result of oceanic warming that began at the nadir of the Little Ice Age in the 1600’s.
The lack of a correlation between CO2 and temperature that would indicate enhanced greenhouse warming at the Phanerozoic scale and the presence of a correlation that supports temperature-driven changes in atmospheric CO2 at the glacial-interglacial and Dansgaard-Oeschger/Heinrich/Bond cycle (~1,470-yr) scales ought to be a clue that temperature changes have far more impact on atmospheric CO2 than relatively minor changes in CO2 have on temperature.

Editor
December 31, 2009 7:11 am

Correction to: David Middleton (07:03:24)
I forgot to finish a sentence…
Mankind’s CO2 emissions accelerated after 1950, a few years before Keeling started recording CO2 levels at MLO; so the secular trend in the MLO curve probably does carry an anthropogenic signal.

Tom P
December 31, 2009 12:05 pm

David Middleton (07:03:24) :
“It is impossible for “B” to cause “A” if “A” always occurs first.”
Not true – a bachelor party always precedes the marriage, but it is the marriage which is the cause of the party, not the other way round.
Anyway, this is not the relevant statement. What we actually have is:
“It is impossible for “B” to cause “A” if “A” has up to now always occurred first.”
which is a failure of inductive reasoning and logically false.
Happy New Year/Decade!

December 31, 2009 12:22 pm

“Not true – a bachelor party always precedes the marriage, but it is the marriage which is the cause of the party, not the other way round.”
I know this is how you guys think. But the fact is that a bachelor party is a celebration of the end of bachelorhood, not a celebration of marriage.
.
“Marriage isn’t a word, it’s a sentence.”
~K. Vidor

…as I sometimes tell Mrs. Smokey when I’m out of range.
Happy New Year folks! And may AGW suffer the same ignominious fate as Y2K, Alar, and killer bees.

P Wilson
December 31, 2009 1:07 pm

Correlating social convention with the laws of physical causality might be entertaining, though David Hume wrote a treatise on this very subject, the pitfalls of inductive reasoning, during the 18th century: we cannot trace a contiguity between a cause and an effect, or their respective origins. Fortunately for us, it’s even online.
http://www.class.uidaho.edu/mickelsen/ToC/hume%20treatise%20ToC.htm
The arguments he raised are still in use today in the latest scientific discoveries.
However, here’s a test of such a relation: Miss Jones is single and teaches mathematics. Mrs Smith is married and teaches mathematics in the same school and a different class, to the same proficiency as Mrs Smith. Miss Jones’s pupils achieved better results than Mrs Smith’s, therefore all mathematics teachers who are single will achieve better results than those who are married, since marital status seems to be the only discernible causal affinity.
Oh – I forgot – even if a bachelor’s party precedes the marriage, the cause was still in advance of the effect, since the marriage arrangements came prior. But to use the former logic, we’d say that eggs cause water to boil in the pan, since cooked eggs were a greater necessity than boiling water, and must therefore have been the cause.
Climate science is indeed similarly metaphysical.
Happy 2010

P Wilson
December 31, 2009 2:55 pm

Icarus (13:33:14) :
“The long-term warming trend is around 0.13C per decade according to the entire UAH record. What you should be calculating is whether there is any statistically significant deviation from that warming trend – otherwise you’re just grasping at straws.”
Is that figure before adjustments a la Darwin airport, Australia?
http://blogs.telegraph.co.uk/news/jamesdelingpole/100019301/climategate-another-smoking-gun/
investigation courtesy of Willis Eschenbach
and
http://www.surfacestations.org/
scroll down to the title: “Here is a well maintained and well sited USHCN station”
and imagine that a lot of world weather stations give anomalous readings due to their maintenance.
I’m curious as to why less attention is paid to Eschenbach and Anthony Watts when they seem more meticulous than NASA or CRU!
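
Icarus’s suggestion about testing significance can be checked directly with the same least-squares fit used in the post above. Here is a minimal Python sketch (assumptions: the fifteen annual anomalies are the UAH values listed in the post, and the slope’s standard error uses the standard residual-based formula rather than the shortcut in the post’s Mathematica snippet):

```python
# Ordinary least-squares trend for the 1995-2009 UAH annual anomalies,
# with the standard error of the slope and a t-statistic against zero.
import math

years = list(range(1995, 2010))
anoms = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.20, 0.31,
         0.28, 0.19, 0.34, 0.26, 0.28, 0.05, 0.26]

n = len(years)
xm = sum(years) / n
ym = sum(anoms) / n
sxx = sum((x - xm) ** 2 for x in years)
sxy = sum((x - xm) * (y - ym) for x, y in zip(years, anoms))

slope = sxy / sxx                       # deg C per year
intercept = ym - slope * xm
residuals = [y - (intercept + slope * x) for x, y in zip(years, anoms)]
sse = sum(r * r for r in residuals)
se_slope = math.sqrt(sse / (n - 2)) / math.sqrt(sxx)  # residual-based std error

print(f"slope   = {slope * 100:.2f} C/century")   # ~0.95, as in the post
print(f"std err = {se_slope * 100:.2f} C/century")
print(f"t-stat  = {slope / se_slope:.2f}")        # ~1.1, well short of the
                                                  # ~2.16 needed at 95% (13 d.o.f.)
```

The t-statistic of roughly 1.1 is what the confidence intervals in the post express: the slope is about one standard error above zero, so neither a positive nor a negative trend can be ruled out at the 90%, 95%, or 99% level.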

P Wilson
December 31, 2009 3:04 pm

Incidentally (Icarus) the satellite data were adjusted in various ways when they were found to deviate from weather station temperatures, since they showed no warming. One of the chief reasons was orbital drift. The logic of satellites is that if they lose velocity and drift they lose altitude, and so it would be appropriate to adjust temperatures inferred from satellites downwards, since they’re recording more than they should for the original altitude. What did they do? They did the opposite and adjusted them upwards, which defies all logic. They adjusted them upwards to obtain the 0.6C that they were looking for – that is, adjustment in line with *adjusted* surface temperatures.

Roger Knights
December 31, 2009 4:17 pm

@ P Wilson:
“Mrs Smith is married and teaches mathematics in the same school and a different class, to the same proficiency as Mrs Smith [read: Miss Jones].”
“therefore all mathematics teachers who are single will achieve better results than those who are married, since marital status seems to be the only discernible causal affinity.”
This is formalized as the “common factor fallacy” or the “whiskey-and-soda fallacy.” Here’s one version of it that I just picked up by googling:
“This logic reminds me of one of Husserl’s favourite anecdotes: There is this man who drinks whisky and soda and it makes him sick, then he takes gin and soda and he gets sick; then he takes vodka and soda and he is sick, and he concludes that soda makes him sick.”
Other versions have it that the drinks make him drunk, not sick.

Joel Shore
December 31, 2009 6:12 pm

P Wilson:

The logic of satellites is that if they lose velocity and drift they lose altitude, and so it would be appropriate to adjust temperatures inferred from satellites downwards, since they’re recording more than they should for the original altitude. What did they do? They did the opposite and adjusted them upwards, which defies all logic. They adjusted them upwards to obtain the 0.6C that they were looking for – that is, adjustment in line with *adjusted* surface temperatures.

So, you are suggesting that Roy Spencer and John Christy have purposely or accidentally adjusted the satellite data the wrong way despite the fact that both of them clearly have their allegiance on the side of believing that man-made global warming is no big deal? What a novel idea!
REPLY: yes, I’ll have to agree with Joel, this is a pretty screwy idea. – Anthony

Joel Shore
December 31, 2009 6:21 pm

P Wilson:
Oh, and by the way, the effect of the satellite decay is not obvious, as Christy and Spencer difference two different measurements, and the issue was “the presence of a spurious cooling trend introduced into the MSU2LT data by neglect of the differential effects of satellite orbit decay on the near-limb and near-nadir observations” (quoting from http://www.ssmi.com/papers/msu/A_Reanalysis_of_the_MSU_Channel_2_Tropospheric_Temperature_Record.pdf )

Joel Shore
December 31, 2009 6:47 pm

David Middleton says:

It is impossible for “B” to cause “A” if “A” always occurs first.

However, if by “A” you mean a temperature change, then it does not always occur first. It may tend to start first…but much of the temperature change occurs after the CO2 levels have already been changing.
You might also want to think about chickens and eggs. Certainly, we have never seen a (chicken) egg that hasn’t first been laid by a chicken. And, it turns out to be correct to conclude that chickens “cause” chicken eggs. However, far from being logically impossible that chicken eggs “cause” chickens, it in fact turns out also to be the case.

The CO2 rise since the mid-1800’s could largely be the result of oceanic warming that began at the nadir of the Little Ice Age in the 1600’s.

No… it is not. For one thing, the relationship between CO2 and temperature seen in the ice core record is not compatible with so large a change in CO2 arising from the global temperature change. For a second thing, evidence shows that the oceans are actually absorbing CO2, not emitting it. It also leaves open the question of where the CO2 that we know we are producing by burning fossil fuels is going.

Tom P
January 1, 2010 3:09 am

Smokey (12:22:36) :
Of course the engagement is the original cause of both the party and wedding just as the Milankovitch cycle was the original cause of the past excursions in temperature and CO2. These cycles produced an increase in the input of energy into the global climate.
But this time Milankovitch forcing is not driving the change – we’re at the peak of an interglacial cycle. Hence we need to look for a different underlying cause than has been seen in the climate record to date.

P Wilson
January 1, 2010 4:11 am

Joel Shore (18:12:19)
If satellites lose velocity and altitude their readings should surely be adjusted downwards, wouldn’t you agree?

P Wilson
January 1, 2010 4:50 am

Joel Shore (18:12:19)
Fair enough: GISS/Schmidt were the arbiters of this temperature correction of satellite data.

P Wilson
January 1, 2010 4:54 am

Roger Knights (16:17:36)
Thanks for the correction!

P Wilson
January 1, 2010 5:14 am

Joel Shore (18:47:50)
The salient characteristic that traps heat is air pressure rather than CO2 or water vapour concentrations, and air pressure changes constantly, as do temperatures. Over the longer term it’s ocean heat content. However, the problem with deciding whether oceans are net emitters or net absorbers of CO2 is that no one is measuring such exchanges, since it’s well-nigh impossible to. I’d suggest, given SSTs over the course of the 20th and 21st centuries, that oceans have been net emitters of CO2, particularly since the late 1970’s.

Joel Shore
January 1, 2010 7:50 am

P Wilson: I have explained to you both why it is ludicrous on the face of it to make the claim that Christy and Spencer are adjusting the satellites in the wrong way for this effect (and even Anthony agrees with me on this) and also given you an explanation of why it is more complicated than you envision. I think there is really nothing more that need be said.

P Wilson
January 1, 2010 10:48 am

It was Wentz et al. who proposed that an artificial cooling trend occurs when satellites lose their orbit and fall closer to the earth’s surface, and I’m not sure that Spencer and Christy agree with his theory.
http://spacescience.spaceref.com/newhome/headlines/notebook/essd13aug98_1.htm
The idea that the atmosphere cools as you get closer to the surface of the earth and lose altitude is certainly a novelty to me. I can see the case for diurnal adjustments that correct an artificial warming or cooling trend.

P Wilson
January 1, 2010 10:51 am

OK, I can’t make it much clearer. If something in orbit slows down and moves closer to the earth, it’s going to give a higher reading, is it not?

Joel Shore
January 2, 2010 1:44 pm

On the issue of residence times, I was just re-reading the relevant section of “Global Warming: The Hard Science” by L.D. Danny Harvey ( http://books.google.com/books?id=8zBRAAAAMAAJ ). He calculates a residence time on the order of 5-6 years for the atmospheric reservoir. Here is what he then has to say (pp. 20-21):

The atmosphere + biota + soils + mixed layer [of the ocean] components thus form a tightly coupled subsystem which slowly exchanges carbon with the deep ocean.
When fossil fuel carbon is added to the atmosphere, the relevant response time scale is not given by the residence time of atmospheric carbon based on the exchange with the other reservoirs with which the atmosphere rapidly interacts. Rather the response to fossil fuel carbon is given by the residence time of carbon in the coupled atmosphere-biosphere-mixed layer subsystem. This is because the rapid transfer of carbon from the atmosphere to the biota or mixed layer is quickly followed by the return flow to the atmosphere. The residence time of the coupled atmosphere + biota + soils + mixed layer subsystem is given by the total mass of carbon in the subsystem (about 3100 Gt) divided by the rate of exchange with the deep ocean (about 10 Gt C but highly uncertain). The result is a residence time of about 300 years. This is a very crude representation of highly complex processes, which are discussed in more detail in Chapter 8, but serves to illustrate in an intuitively simple manner how two very different response time scales for atmospheric CO2 (5-6 years and 300 years) can arise and how we can get a feeling for what the magnitude of the response time scales should be.
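
The two timescales Harvey quotes follow from the same mass-over-flux arithmetic. A minimal sketch (the ~750 GtC atmospheric inventory and ~140 GtC/yr gross exchange flux below are assumed round textbook figures, not taken from the quoted passage; the 3100 GtC and ~10 GtC/yr values are from the quote):

```python
# Back-of-envelope check of the two residence times in Harvey's passage.
# residence time = reservoir mass / gross exchange flux out of the reservoir

def residence_time(mass_gtc: float, flux_gtc_per_yr: float) -> float:
    """Mean time a carbon atom spends in the reservoir, in years."""
    return mass_gtc / flux_gtc_per_yr

# Atmosphere alone, exchanging rapidly with biota/soils/mixed layer
# (assumed round figures: ~750 GtC inventory, ~140 GtC/yr gross exchange).
atmosphere_only = residence_time(750.0, 140.0)

# Coupled atmosphere + biota + soils + mixed layer subsystem, exchanging
# slowly with the deep ocean (figures from the quoted passage).
coupled_subsystem = residence_time(3100.0, 10.0)

print(f"atmosphere alone:  ~{atmosphere_only:.0f} years")    # ~5 years
print(f"coupled subsystem: ~{coupled_subsystem:.0f} years")  # ~310 years
```

Dividing a reservoir’s mass by its gross exchange flux gives the mean residence time of a molecule in that reservoir, which is why a fast atmospheric turnover (~5 years) and a slow adjustment of the coupled subsystem (~300 years) can coexist without contradiction.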

Adam
February 19, 2010 7:31 am

Interesting. You calculate a warming trend of 0.95 deg C/century based on the raw UAH data. And then you set about some rather Jonesian “tricks” and, voila, the trend goes away.